I'm working on a C++ Linux application that uses wxWidgets and needs to be distributed as a compiled binary. The project lead has specified that we are to include all dependencies with the application so that the end user does not need to install anything to run it, provided they already have standard system components (libc, etc.) installed. I believe this requirement is something the end user asked for. I know that this is not what you might consider a "normal" distribution process for Linux applications.
For simple libraries that don't have many dependencies themselves, this is not an issue. But for wxWidgets I'm running into issues with webkitgtk, which is required for the WebView class used in the application. webkitgtk has a number of dependencies itself, which may have their own dependencies, and so on. Basically, it looks like I'd be opening a real can of worms by trying to include everything in the application, and the more senior developer on the project seems to agree.
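To get a feel for the scale of the problem, you can walk the dependency chain yourself with ldd. A minimal sketch; the library path below is an example from a Debian-style system, and the exact soname and path will vary with your distro and webkitgtk version:

# Count the direct shared-library dependencies of webkitgtk
# (example path from a Debian-style system; adjust for your distro)
ldd /usr/lib/x86_64-linux-gnu/libwebkit2gtk-4.0.so.37 | wc -l
# List them, to see what you would have to bundle recursively
ldd /usr/lib/x86_64-linux-gnu/libwebkit2gtk-4.0.so.37 | awk '{print $1}' | sort

Each of those libraries has its own ldd output, which is exactly the can of worms described above.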
So I'm wondering, what are my options for distributing such an application? I've tried searching for information about this, and the prevailing opinion seems to be to have the end user install wxWidgets. These are the options that I've come across:
1) Compile all dependencies as shared libraries and ship them, as the project lead wants. The downside is that there are many libraries to worry about, and this will lead to significant bloat.
2) Require that the end user install wxWidgets (on top of GTK and webkitgtk). The downside here is that the user would have to install multiple dependencies, and if they aren't on a distribution with appropriate versions of the above in their package manager, this could be a real hassle for them. It also means we couldn't provide something that was specifically asked for.
3) Require that the end user have GTK and webkitgtk installed, but not wxWidgets. Same downsides as above, but with fewer dependencies. An additional downside is that there may be version compatibility issues if the installed versions of the dependencies differ from the ones the packaged wxWidgets library was built against.
Am I correct in my assessment of the pros and cons of these various options? Are there any options that I'm missing?
Thanks!
David,
The best possible solution is probably to ask the user to install X11, GTK+ {2,3} and WebKit-GTK.
wxWidgets can be statically linked with the application.
You can ask your users to have WebKit-GTK at version X.Y.Z or later, and that should satisfy the requirements. Integrating WebKit-GTK with all its dependencies, especially since it depends on GTK+ itself, will be very hard. So if you go this route you will be screwed.
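For what it's worth, a rough sketch of the static-wxWidgets route; the configure flags and wx-config options are from memory and worth double-checking against your wxWidgets version:

# Build wxWidgets as static archives instead of shared objects
cd wxWidgets-src
./configure --disable-shared --with-gtk=3 --enable-webview
make
# Link the application against the static wx libs; GTK+ and
# WebKit-GTK themselves remain shared system dependencies
g++ -o myapp main.cpp `wx-config --static=yes --cxxflags --libs std,webview`

The resulting binary has no runtime dependency on wxWidgets at all, only on the GTK+/WebKit-GTK stack it asks the user to install.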
As a Linux user, I vote for manual dependency installation via the package manager. It's not that hard, and it could even be done automatically if you provide a package (not just a binary). Bundling the runtime may cause problems (e.g. Steam on Debian). Another option is to provide two flavours: all-inclusive and dependency-requiring.
Related
I am interested in building a cross-platform C++ Library and distributing it in source form. I want the consumers of this library to be able to acquire it, build it and consume it inside their software very easily on whatever platform they are working on and for whatever platform they are targeting. At the same time while building my library, I also want to be able to consume other popular OSS libraries through a similar mechanism.
I see that CMake and Ryppl were created with these intentions in mind and to some extent they do solve some of these problems, especially the build problem. But I don't quite know how exactly to go about achieving the above mentioned goals. Is it OK to settle on CMake as the build solution? How do I solve the library acquisition and distribution problem? Simply host the sources somewhere and let people discover, download and build them? Or is there a better way?
At the time of writing there is no accepted solution that handles everything you want. CMake gives you cross-platform builds, and git (with submodules) gives you a way to manage source-level dependencies if all the other projects are using CMake. But, in practice, many common dependencies your project will need don't use CMake, or even Git.
Ryppl is meant to solve this, but progress is slow because the challenge is such a hard one.
Arguably, the most successful solution so far is to write header-only libraries. This is common practice and means your users just include your files, and their build system of choice takes care of everything. There are no binary dependencies to manage.
TheHouse's answer is still essentially true. Also there don't seem to have been any updates to ryppl itself for a while (3 years) and the ryppl.org domain has expired.
There are some new projects aiming to solve the packaging issue.
Both build2 and wrap from mesonbuild have that goal in mind.
A proposal was made recently to add packages to the C++ standard, which may open up the debate (reddit discussion here).
Wrap looks promising as meson's author has learned from cmake.
There is a good video of its author discussing this here.
build2 seems more oblivious to what came before (and therefore condemned to reinvent it). However, both suffer from trying to solve the external-dependency issue simultaneously with providing a complete build system.
conan.io is another recent attempt which doesn't try to provide the build system as well. Time will tell if any of these gain any traction.
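For completeness, the conan workflow looks roughly like this. A sketch only: the package reference and generator name below are examples from its early conventions and may well have changed since:

# Declare dependencies in a conanfile.txt next to your sources
cat > conanfile.txt <<EOF
[requires]
zlib/1.2.11

[generators]
cmake
EOF
# Fetch prebuilt binaries where available, build the rest locally
conan install . --build=missing

The point is that it manages packages only and hands the generated flags to whatever build system you already use.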
The accepted standard for packaging C and C++ projects on Unix was always a source tarball + a configure script (autotools) + make.
cmake is now beginning to replace autotools as your first choice.
It is able to create RPMs and tarballs for distribution purposes.
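Concretely, if your CMakeLists.txt already has install() rules, CPack can do this in one step. A sketch; the RPM generator additionally needs rpmbuild installed on the build machine:

mkdir build && cd build
cmake ..
make
# Produce a binary tarball and an RPM from the same install rules
cpack -G "TGZ;RPM"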
It's also worth considering the package managers built into the various flavours of Linux. The easiest projects to build and install are those where most of the dependencies can be pulled in via yum or apt. This won't help you on Windows, of course. While there is a high barrier to entry for getting your own projects added to the main Linux repositories (e.g. Red Hat, Debian), there is nothing to stop you maintaining your own satellite repo.
The difference between that and just hosting your project on github or similar is you can provide pre-built binaries for a number of popular systems.
You might also consider that configure-time checks (e.g. from CMake's find_library()) and your own documentation will tell people what needs to be installed as prerequisites, and provided you don't make the list too onerous, that might be enough.
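As a sketch of that idea, a trivial pre-flight script mentioned in the README can report missing prerequisites up front; the pkg-config module names here are only examples and depend on your actual dependencies:

#!/bin/sh
# Report any missing development packages before the user tries to build
for mod in gtk+-3.0 webkit2gtk-4.0 sdl2; do
    pkg-config --exists "$mod" || echo "missing prerequisite: $mod"
done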
I have written a portable C++ application using Qt libraries. This means that I cannot compile with the /MT (static runtime) flag without risking memory issues.
This leaves me with two options:
1) Deploy the portable application with an installer.
2) Package the C++ dependencies within the same folder or use private assemblies.
Both 1 and 2 defeat the idea of portable software, so I was thinking of a third option:
3) Use IExpress to drop the C++ dependencies before launching the application. On exit, delete the C++ dependencies.
Unfortunately, option 3 has received some flak from some Stack Overflow members. They even dislike option 2, which leaves me with only option 1. I can see option 1 as doable if I use a portable installer.
Is there such a thing as a portable installer? Essentially, I want the installer to check whether the needed dependencies are installed before running my application (just like a regular installer would) and, if they are, just continue running it. Otherwise, give the user a message box saying they can download them, with a link to the URL. I am aware I can write my own installer that does this in C++, but I was wondering if there are any installers that already offer this specific functionality.
http://qt-project.org/doc/qt-4.8/deployment.html
The DLLs for Qt on Windows are so small that deploying them with the application isn't an issue, in my opinion.
There aren't any programs out there that I know of that place the Qt DLLs on Windows somewhere another program would later find them (like C:\Windows\System32).
I think the only place where you could expect reuse of the libraries is on Linux or a mobile device that has a lot of Qt apps. But even then you have to make sure that the versions of the libraries are high enough to support all the functionality that you are using.
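If you do bundle them, newer Qt releases ship a windeployqt tool that collects the right DLLs next to your executable (it didn't exist yet in the Qt 4.8 days the link above covers, where you copied QtCore4.dll and friends by hand). A sketch, run from a Qt command prompt so the tool finds the right Qt build; the path is an example:

# Copy the Qt DLLs and plugins your executable needs into its folder
windeployqt --release C:\path\to\myapp.exe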
Hope that helps.
Possible Duplicate:
What’s the best way to distribute a binary application for Linux?
We would like to publish a proprietary game on Linux. The game is relatively simple and has a custom engine written in C++. We have been using the MinGW compiler on Windows, and one person does programming and testing on Linux with the g++ compiler.
The game is written exclusively using open source libraries, which are all cross-platform.
What is the recommended way to make a program for Linux work across all Linux distributions without requiring packaging for any of them? The ideal way for me would be to package it as a .zip or tarball and provide that as a download, just like good ol' .exe files under Windows that work with system libraries.
Most games I've played under Linux that provide such downloads offer a shell script (which is a compatibility problem in itself!) that launches them. I have no problem with that, but it reeks of user-unfriendliness. Additionally, a lot of games require additional libraries. For instance, SDL 2.0, the hardware-accelerated new version of the popular library, is not yet available in most distributions' repositories, yet several games are known to use it, and I had to compile it myself. This is even worse.
I would like a solution whereby a customer can click on the binary file in their file manager of choice, and it would run, no 'apt-getting' or 'yumming' of the necessary libraries. I don't mind breaking standards, and if it has to go in /opt, it will.
I would like a solution whereby a customer can click on the binary file in their file manager of choice, and it would run, no 'apt-getting' or 'yumming'. I don't mind breaking standards, and if it has to go in /opt, it will.
First, you have to understand that "yumming" and "apt-getting" are not really the actual installers of the applications (packages); they are simply the front-end programs used to look up / download / update / trace packages in the repositories (from the distro, plus any you manually add). So, when you say "no 'apt-getting' or 'yumming'", we have to assume you mean that you don't want to put your game up on a repository, which makes sense if you want people to pay to get your game (as opposed to other proprietary but free software like Flash, graphics drivers, video codecs, and other things you typically find in repositories).
So, there are really just two types of package management systems, RPM and DEB, which use a command-line program, rpm and dpkg, respectively, to actually do the installation. Most distributions also come with a GUI front-end for those programs too (not the Synaptic-style package management software (which is a GUI front-end to apt-get or yum), but something simpler). When you double-click on a .deb or .rpm file, on most distros, you get this GUI front-end to pop up asking you for admin credentials and telling you about dependencies that are required, and, obviously, that you are about to install this package onto your system. From what I can tell, this is exactly what you want. And so, what you need to provide is a .deb file (for Debian distros) and a .rpm (for Red Hat distros), for both 32bit and 64bit versions of your game, just like I would assume you provide a .msi file for your Windows versions.
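A minimal .deb along those lines can be produced with dpkg-deb alone. A sketch; the package name, paths and especially the package names in Depends are examples and need to match the target distro release:

# Lay out the package contents and metadata
mkdir -p pkg/DEBIAN pkg/opt/mygame
cp -r game-files/* pkg/opt/mygame/
cat > pkg/DEBIAN/control <<EOF
Package: mygame
Version: 1.0
Architecture: amd64
Maintainer: You <you@example.com>
Depends: libsdl2-2.0-0, libc6 (>= 2.15)
Description: My proprietary game
EOF
# Build the installable package
dpkg-deb --build pkg mygame_1.0_amd64.deb

Double-clicking the resulting file on most Debian-family distros brings up exactly the GUI installer described above, which resolves the Depends line for the user.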
As for dependencies that might be hard for users to locate: what you should do is include, in some directory of your installer, a number of additional ('recommended'-version) packages for these esoteric dependencies, so that they can be installed from those offline packages if a newer version cannot be fetched from the distro's repositories. And that's about it.
And you can either make people pay to get the deb or rpm installers for your game, or include some kind of license-key system to unlock the game (and thus, make the deb/rpm files available for download, and charge for the key / code to unlock it).
The ideal way for me would be to package it as a .zip or tarball and provide that a download, just like good ol' .exe files under Windows that work with system libraries.
Ideal? Really!?! Yeah, if all you use is system libraries, then it will work. But if there is anything more, it will be a nightmare (nearly as bad as it is under Windows if you don't rely on installers).
The game is written exclusively using open source libraries, which are all cross-platform.
Make sure none of those open source libraries are GPL-licensed, because if that's the case, you can't make your game proprietary. Your dependencies must be licensed under LGPL or BSD, or similar licenses, so watch out for that.
What is the recommended way to make a program for Linux work across all Linux distributions without requiring packaging for any of them?
That is not recommended to begin with, so you need the least unrecommended way of doing it. I guess that would mean producing a statically linked binary (and you would need a 32bit and a 64bit version anyway).
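A sketch of what that might look like; which libraries you can safely pull in statically (and which, like libGL or glibc itself, you really shouldn't) depends entirely on your dependency list, and the SDL2 example here is an assumption:

# Statically link the C++ runtime and selected game libraries,
# while leaving system-level libraries (X11, OpenGL, libc) dynamic
g++ -o mygame main.o engine.o \
    -static-libgcc -static-libstdc++ \
    -Wl,-Bstatic -lSDL2 -Wl,-Bdynamic \
    -lX11 -lGL -lpthread -ldl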
The recommended way would be to decide a distribution system (RPM, DEB, ...) and verify dependencies on the various target platforms. Then the user could click on the installer package - much like he/she would do with a Windows MSI file - and be also able to uninstall/upgrade the program later on.
Note that in practice you would have to provide testbed environments for the target platforms even if you distributed the static binary, since you can't avoid doing tests. At that point, packaging the RPM/DEB/etc is not really a significant increase in time expenditure; and on the other hand, it would make the package much tighter and easier to download and install.
We are building a program under Linux which works within a specific Ubuntu version just fine. But we would like to have the same binary running on Ubuntu 10.04 and 11.10.
It would be completely OK to build the application on the 10.04 platform. But when I do this, I have dependencies on specific library versions (e.g. libboost_thread.so.1.40.0) which are not available on 11.10 because it uses newer versions. The system is built using QMake.
I am looking for a tutorial or starting point how to solve these dependency conflicts for multiple Ubuntu platforms.
If nobody else feels like taking a swing at this I may as well inject something.
I am going to make a few assumptions.
You are distributing a binary/closed source application
You want to distribute it yourself
Thus ruling out the whole "just let the distro/users build it for their setup themselves".
Looking at how others have resolved similar issues, I can see that it is common to include the shared libraries with your application and then use a loader/wrapper script (call it what you will) that modifies the environment before launching the application. Specifically, they modify LD_LIBRARY_PATH to include the lib/ folder shipped with the application.
The script could be as simple as:
#!/bin/sh
# Resolve our own directory so the app also starts from another working directory
HERE="$(dirname "$(readlink -f "$0")")"
LD_LIBRARY_PATH="$HERE/lib${LD_LIBRARY_PATH:+:$LD_LIBRARY_PATH}" exec "$HERE/myAppReal" "$@"
That is how I solved distributing a Qt 4 application to users whose distributions shipped nothing newer than Qt 3.3.6 (in 2009... seriously). Edit: I might also say that by "users" I mean the five-ish people at the company paying for development; a spec failure on our part for not asking them to be more specific when they said cross-platform on modern operating systems.
Now someone will probably find about a dozen things wrong with this, but that's good, I can update and learn as we go.
EDIT: As JimR said, this comes with security implications: if you leave your lib folder world-writable, someone may use it to inject malicious code into your application. Depending on how you plan on deploying, it may or may not be a real issue, but you should be aware of it.
A little background: we have a fairly large code base which builds into a set of libraries, which are then distributed for internal use in various binaries. At the moment, the build process for this is haphazard and everything is built off the trunk.
We would like to explore whether there is a build system which will allow us to manage releases and automatically pull in dependencies. Such a tool exists for Java: Maven. I like its package, repository and dependency mechanisms, and I know that with either the maven-native or maven-nar plugin we could get this. However, the problem is that we cannot fit the source trees to the "maven way", and unfortunately the plugins (at least maven-nar) don't seem to like code that is not structured this way...
So my question is, is there a tool which satisfies the following for C++
build
package (for example libraries with all headers, something like the .nar)
upload package to a "repository"
automatically pull in the required dependencies from said repository, extract headers and include them in the build, extract libraries and link against them. The dependencies would be described in the "release" for that binary, so if we were to use a CI server to build that "release", the build script has the necessary dependencies listed (like the pom.xml files).
I could roll my own by modifying either make+shell scripts or waf/scons with extra python modules for the packaging and dependency management - however I would have thought that this is a common problem and someone somewhere has a tool for this? Or does everyone roll their own? Or have I missed a significant feature of waf/scons or CMake?
EDIT: I should add, open source is preferred, and non-MS...
Most of the Linux distributions, for example, contain dependency tracking for their packages. Everything I've tried to cobble together myself to take on your problem has ended up "not quite perfect". The best thing to do, IMHO, is to create a local yum/deb repository or something (continuing my Linux example) and then pull stuff from there as needed.
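Setting up the yum side of that is pretty light. A sketch; createrepo comes from the distro's own repositories, the paths are arbitrary, and writing under /etc needs root:

# Publish your RPMs as a local repository
mkdir -p /srv/internal-repo
cp *.rpm /srv/internal-repo/
createrepo /srv/internal-repo
# Point yum at it with a .repo file
cat > /etc/yum.repos.d/internal.repo <<EOF
[internal]
name=Internal packages
baseurl=file:///srv/internal-repo
enabled=1
gpgcheck=0
EOF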
Many of the source-packages also quickly tell you the minimum components that must be installed to do a self-build (as opposed to installing a binary pre-compiled package).
Unfortunately, none of these methods is that much easier, though it's still better than trying to do everything yourself. In the end, to be cross-platform, you need one of these systems per OS as well. Fun!
I am not sure if I understand correctly what you want to do, but I will tell you what we use and hope it helps.
We use CMake for our build. It has to be noted that CMake is quite powerful. Among other things, you can "make install" into custom directories to collect headers and binaries there to build your release. We combine this with some Python scripting to build our releases. YMMV, but some things might just be too specific for a generic tool, and a custom script may be the simpler solution.
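The custom-directory install looks roughly like this (a sketch; the paths and project name are just examples, and it assumes your CMakeLists.txt declares install() rules):

# Stage headers and binaries into a release directory
mkdir build && cd build
cmake -DCMAKE_INSTALL_PREFIX=/tmp/myproject-1.2 ..
make
make install    # everything lands under /tmp/myproject-1.2
tar czf myproject-1.2.tar.gz -C /tmp myproject-1.2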
Our build tool builds releases directly from an SVN repository (checkout, build, ...), which I can really recommend to avoid local state polluting the release in some unforeseen way. It also enforces reproducibility.
It depends a lot on the platforms you're targeting. I can only really speak for Linux, but there it also depends on the distributions you're targeting, packages being a distribution-level concept. To make things a bit simpler, there are families of distributions using similar packaging mechanisms and package names, meaning that the same recipe for making a Debian package will probably make an Ubuntu package too.
I'd definitely say that if you're willing to target a subset of all known Linux distros using a manageable set of packaging mechanisms, you will benefit in the long run from not rolling your own and building packages the way the distribution creators intended. These systems allow you to specify run- and build-time dependencies, and automatic CI environments also exist (like OBS for rpm-based distros).