C++ Boost thread library pulls in the whole development environment

I am using boost-thread in my application. When I deploy this application on a client machine (running Ubuntu 11.10), I need to make sure that libboost_thread.so is available on the machine. However, when I run "apt-get install libboost-thread1.46", it seems to pull in the whole development environment (libgcc, libboost1.46-dev, etc.). This machine needs just the runtime environment, not the development environment. I am wondering if there is a better way to handle this.

There is no package named "libboost-thread1.46" on Ubuntu; apt-get treats the name as a regular expression, and the development package also matches that expression. The two candidate packages are libboost-thread1.46-dev and libboost-thread1.46.1, and the latter is the one you want. It depends only on three libraries (libgcc, libc, libstdc++), all of which you need to deploy anyway because your program and libboost-thread link against them.
So, deploy by installing libboost-thread1.46.1 and everything should be fine.
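In other words, the exact command (using the package name given above):

sudo apt-get install libboost-thread1.46.1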

You can build the individual libraries you need yourself by downloading the Boost source tarball and using the bjam build tool.
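For example, a minimal sketch of building just Boost.Thread from an unpacked source tree (the version number and paths are illustrative):

# assuming the Boost sources are unpacked in boost_1_46_1/
cd boost_1_46_1
./bootstrap.sh --with-libraries=thread
./bjam link=shared stage
# the resulting libboost_thread.so* files end up in stage/lib/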

You could link statically against boost.
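For instance, passing the static archive directly to the linker means no libboost_thread.so is needed at run time (a sketch; the archive path depends on where Boost is installed on your build machine):

# link Boost.Thread statically, everything else dynamically
g++ -o myapp main.cpp /usr/lib/libboost_thread.a -lpthread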

You can also use bcp and copy the necessary files into your own source tree. I personally have the headers installed on my system and just added the source files to my project (once.cpp, thread.cpp, timeconv.inl, tss_null.cpp on Linux).
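A sketch of the bcp step itself (paths are illustrative):

# copy the thread module and everything it depends on into your own tree
bcp --boost=/path/to/boost_1_46_1 thread /path/to/myproject/external/boost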

Related

How do you package GCC for distribution?

I am making a modified C++ compiler and I have it built and tested locally. However, I would like to be able to package my build for Windows, Linux (Debian), and Mac OSX.
All of the instructions I can find online deal with building gcc but have no regard for making something distributable (or perhaps I am missing something?). I know for Windows I will need to bundle MinGW somehow, but that only further confuses me - and I have no idea how well Mac works with GCC these days.
Can anyone layout a set of discrete high-level steps I could try on each system so I can allow people to install my modified compiler easily?
First make sure your project installs cleanly, including executables, headers, and runtime dependencies. If you're using something like CMake, this is a matter of installing things relative to CMAKE_INSTALL_PREFIX, possibly with the GNUInstallDirs module for standard directory names. If you are using plain make, then you need to ensure that something like make prefix=... install works well.
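A quick way to sanity-check this (a sketch, assuming a CMake-based project; the staging path is arbitrary):

mkdir build && cd build
cmake -DCMAKE_INSTALL_PREFIX=/tmp/stage ..
make
make install
# then inspect /tmp/stage to confirm binaries, headers and libraries land where expected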
From here, you can target each platform independently, and treat the packaging independently from your project. Like Chipster mentioned, making rpm files isn't so tough; deb files for Debian-based OSs and tar.xz files for Arch-based OSs are similar. The rules for creating these packages can use your install rules to build the package. You mentioned MinGW: if you're targeting an MSYS distribution of MinGW for Windows deployment, then the Arch-style pacman packaging works there as well. You can slowly work on supporting one platform at a time with almost no changes to your actual project.
Typically in the open-source world, people will release a tar.gz file supporting ./configure && make && make install or similar. Then someone associated with the platform (like a Debian developer) will find your project, make some packaging rules for it, and release it into their distribution. That means your project can be totally agnostic to where it's being released. It also means you don't really need to worry about MacOS yet; you can wait until you have someone who wants it there, or some hardware to test it on.
If you really want to be in control of how things are packaged for each platform from inside your project, and you are already using CMake, CPack is a great tool that helps out. After writing CPack rules for your project, you can simply type cpack to generate many types of deployable archives. You won't get the resulting *.deb files into the official Debian or Ubuntu archives, but at least people using those formats can install your package.
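For example (a sketch; it assumes CPack rules have already been written for the project):

# run from the build directory after a successful build
cpack -G DEB    # Debian package
cpack -G TGZ    # plain tar.gz archive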
Also, consider releasing one package with the runtime libraries and one with the development content (headers, compiler, static libraries). This way, if someone uses your compiler, they can redistribute just the runtime libraries, which is probably going to be a much simpler install.

How should I provide library binaries to developers?

I want to make it easy for others to work on my repository. However, since some of the compiled dependencies are over 100 MB in size, I cannot include them in the repository; GitHub rejects files that large.
What is the best way to handle large binaries of dependencies? Building the libraries from source is not easy under Windows and takes hours. I don't want every developer to struggle with this process.
I've recently been working on using Ivy (http://ant.apache.org/ivy/) with C++ binaries. The basic idea is that you build the binaries for every build combination and zip each combination into a file with a name like mypackage-windows-vs12-x86-debug.zip. In your ivy.xml, you associate each zip file with exactly one configuration (e.g. windows-vs12-x86-debug). Then you publish this package of zip files to an Ivy repository; you can either host the repo yourself or try to upload to an existing Ivy repo. You create such a package of zip files for each dependency, and the ivy.xml files describe the dependency chain among all the packages.
Then, your developers must set up Ivy. In their ivy.xml files, they list your package as a dependency, along with the configuration they need (e.g. windows-vs12-x86-debug). They also need to add an Ivy resolve/retrieve step to their build; Ivy will then download the zip files for your package and everything your package depends on. Finally, they need unzip and move steps in their builds to extract the binaries you are providing and put them where their build expects them.
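As a rough sketch, the retrieve step can also be run with Ivy's standalone mode instead of from Ant (the file names and retrieve pattern are illustrative):

# resolve the dependencies declared in ivy.xml and copy the zips into lib/
java -jar ivy.jar -settings ivysettings.xml -ivy ivy.xml -retrieve "lib/[conf]/[artifact]-[revision].[ext]"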
Ivy's a cool tool, but it is definitely streamlined for Java rather than C++. When it's all set up, it's pretty great. However, in my experience as someone not really familiar with DevOps, integrating it into a C++ build has been challenging. I found it easiest to create simple Ant tasks that do the required Ivy actions, then have my "regular" build system (make) call those Ant tasks when needed.
I should also mention that the reason I looked into Ivy was that I was implementing this in a corporate environment where I couldn't change system files. If you and your developers can do that, you may be better off with an RPM/APT system: you'd set up a repository and have your developers add it to the appropriate RPM/APT configuration. Then they would run commands like sudo apt-get install mypackage, and apt-get would do all the work of downloading and installing the right files in the right places. I don't know how this would work on Windows; maybe someone has created a Windows RPM/APT client.
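On the APT side, the developer-facing steps might look like this (a sketch; the repository URL is a placeholder and the package name comes from the example above):

# add the repository, then install the prebuilt binaries
echo "deb http://packages.example.com/apt stable main" | sudo tee /etc/apt/sources.list.d/mypackage.list
sudo apt-get update
sudo apt-get install mypackage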

Python: How to package all necessary modules/libs into one single executable file?

The running environment is Ubuntu 12.04. Most of the time my Python scripts have to import some external libraries or modules before they run. When I distribute a script to other Linux machines, I have to install the necessary modules and libraries again.
Is there some way to package all the necessary modules into one single Python file so that it runs without installing any modules? Thanks
You could just combine your files into one file, but that is a bad approach. Better options:
Create a deb package that declares all of its dependencies. From then on the system will install the required libraries automatically, keep the package in a consistent state, and cleanly remove your files (a minimal sketch follows this answer).
Use rsync to copy the files to the target machines.
Check out the current version from your version control system.
I have written a script that generates a deb package after each commit to our version control system.
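A minimal sketch of building such a deb package by hand (all names, paths and dependencies are illustrative):

# layout: mypackage/DEBIAN/control plus the files at their install paths
mkdir -p mypackage/DEBIAN mypackage/usr/lib/mypackage
cat > mypackage/DEBIAN/control <<EOF
Package: mypackage
Version: 1.0
Architecture: all
Maintainer: Your Name <you@example.com>
Depends: python, python-requests
Description: My script together with its dependency declarations
EOF
cp myscript.py mypackage/usr/lib/mypackage/
dpkg-deb --build mypackage    # produces mypackage.deb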

Ncurses static libraries to include with a C++ project

I have installed the latest ncurses library, which my project uses. Now I want to check the ncurses static libraries into SVN so that I can check out the project on a different machine and compile it without having to install ncurses on that system again.
So the question is what is the difference between libncurses.a, libncurses++.a and libncurses_g.a files? And do I need all of them for my C++ project?
Thanks!
libncurses.a - This is the C compatible library.
libncurses++.a - This is the C++ compatible library.
libncurses_g.a - This is the debug library.
libncurses_p.a - This is the profiling library.
If you want to find out if you can get by without using libncurses.a, you can rename the library and run a build of your application.
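For example (a sketch):

# temporarily hide the C library and see whether your build still links
mv libncurses.a libncurses.a.bak
make clean && make
mv libncurses.a.bak libncurses.a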
My answer comes a little late [ :-) ] since you posted your question more than 4 years ago. But:
Archiving the pre-compiled library in your SVN means that your built application may fail if the target machine differs in some critical aspect.
Yes, you can safely run the application on other machines that are configured in exactly the same way (e.g., on a fully homogeneous computation cluster). However, if the machines differ (e.g., because one machine had a system upgrade and the other did not), it may stop working. This is not very likely, so the risk may be acceptable for what you'd like to do.
I would suggest another solution: commit a recent, stable version of the libncurses sources (tarball) to your SVN repo and add a little script (or make target) that runs the libncurses build and installs the built library into some project directory (not the system directory, but next to your application build directories, and not committed to SVN). This build step only needs to be repeated if the library is to be upgraded or if you want to build/run on another machine.
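A sketch of such a build step (the tarball name and directory layout are illustrative):

# run from the project root; installs ncurses into ./external/ncurses
PREFIX=$(pwd)/external/ncurses
tar xzf third_party/ncurses-5.9.tar.gz
cd ncurses-5.9
./configure --prefix="$PREFIX"
make
make install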
This does not apply to the ncurses library in particular but to any library.
Depending on your project target, consider further reading about
package management
cross compilation

Install gcc on linux with no root privilege

I have access to computer in a public library and I want to try out some C++ and maybe other code. Problem is that there is no g++ installed and I can't install it using packages because I have no root access. Is there a "smart" way to make a full environment for programming in a home folder?
I have gcc installed (I can compile C code). Also, I have a home folder that is consistent. I don't know where to find a precompiled g++; I found only the source, but I don't know what to do with it. I tried asking them to install it, but that didn't work :)
If you want to install it as a local user, GNU GSRC provides an easy way to do so.
Link: http://www.gnu.org/software/gsrc/
After configuration, simply run the following commands:
cd gsrc
make -C pkg/gnu/gcc
make -C pkg/gnu/gcc install
The second step can also be adjusted to speed up the build on an N-core system:
make -C pkg/gnu/gcc MAKE_ARGS_PARALLEL="-jN"
You can run the configure script with the --prefix parameter: ../gcc-4.5.0/configure --prefix=/home/foo/bar. Since it is very likely that the C++ standard library differs from the one on your system, you have to set export LD_LIBRARY_PATH=/home/foo/bar/lib before you can start a program compiled by this compiler.
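Using the result might then look like this (a sketch, reusing the prefix from the answer above):

# compile with the locally installed g++ and point the loader at its libstdc++
/home/foo/bar/bin/g++ -o hello hello.cpp
export LD_LIBRARY_PATH=/home/foo/bar/lib    # or lib64 on some systems
./hello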
Once, a long time ago (1992 or so), I went through something similar to this when I bought a SCO system with no development environment. Bootstrapping it up to having a full development environment was a gigantic pain, and not at all easy. Having library header files or gcc on a system would make your job a whole lot easier.
It depends a lot on just how obnoxious the library has been about what kinds of things are installed. If there is no gcc there, your job becomes a bit harder. If there are no header files for glibc there, your job is a LOT harder.
Also, do you get an account on the system so you have a home folder that's consistent from login to login?
If you have no gcc there, you need to find a pre-compiled binary of gcc/g++ and install it somewhere. If you have no header files there, you need to find copies of those and put them on the system.
There is no 'standard' way of installing gcc in your home folder though. All of the solutions are going to have some manner of hand-rolling involved.
Have you asked the librarians if they can change what's installed because you want to learn a bit of programming and only have access to their computers to do it with? That might well be the easiest solution.
From your comment it seems that you do have gcc, and if you can compile C code, you have the library header files. So now it's a matter of actually compiling your own version of g++. You could probably find a way to entice the package manager on the system into installing a binary package somewhere other than in a system folder, but I think that solution is less fun than compiling your own, and there may also be subtle problems, because the installed package may expect to find things in particular places and not find them there.
The first thing to do is to make sure you've downloaded the right source for the gcc package. The place to find that is the GNU United States mirror page. You want the gcc-4.5.0.tar.bz2 or gcc-4.5.0.tar.gz file on the mirror site you choose; it will likely be in a gcc directory, under a gcc-4.5.0 sub-folder.
After you have that downloaded, you should untar it. In general you shouldn't build gcc in the folder you untar it into, so create a sibling folder to build in, for example gcc-build. Then the command you want is ../gcc-4.5.0/configure --prefix=$HOME/.local --enable-languages='c c++'.
gcc requires some other packages to be installed in order to compile itself. You can use the same --prefix for those packages to install them in the same place. The gcc website has a list of the prerequisite packages.
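Putting those steps together (a sketch; the version and flags follow the description above):

# GMP, MPFR and MPC must be available first (see gcc's prerequisites list)
tar xjf gcc-4.5.0.tar.bz2
mkdir gcc-build && cd gcc-build
../gcc-4.5.0/configure --prefix=$HOME/.local --enable-languages='c c++'
make
make install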
$HOME/.local is sort of the standard accepted place for per-user installs of things.
If you have fakeroot, you can use it to treat ~/some-path as the root directory to install the packages into. Alternatively, you can set up a chroot environment to do the same.
Given this, you can then use dpkg -i package.deb to install the gcc package(s) on your system. You will need to download and install each package individually (e.g. from the Debian website): at least binutils, glibc, linux-headers and gcc.
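If setting up fakeroot or a chroot is too heavy, a related trick (not exactly what this answer describes) is to just extract the package contents into your home directory, which needs no root at all; the file name below is a placeholder:

# unpack the .deb's files under $HOME/local without installing it
dpkg -x some-package.deb $HOME/local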
If you are on another system, you will need to get the right packages for that system and install them using the corresponding package manager.