I made a little app and built a release version. Now I want to upload it to my site. I have never done this before with Qt, so I'm unsure as to what I should include along with the binary.
How do I figure out which DLLs should be included with my app? And where do I get them? I'm running Windows, but I'd also like to know what I should do in case I want to release a Linux version.
For Windows:
You can use Dependency Walker to see which Qt libraries (or others) you need to ship. This is the depends.exe executable that is included with Visual Studio, but you can also download it separately from http://www.dependencywalker.com/
Load your app into it and it will list all the modules it expects at runtime. You might also have to ship a Visual C++ Runtime Redistributable matching the compiler that you built the executable with (if it's VC++).
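If you have Visual Studio's command-line tools available, dumpbin is a quick alternative worth knowing about (the executable name below is a placeholder):

```
:: From a Visual Studio command prompt: list the DLLs the binary links
:: against directly (plugins loaded at runtime will not show up)
dumpbin /DEPENDENTS myapp.exe
```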
Do note that Dependency Walker does not account for things like Qt's plugins, which are loaded at runtime. An example of this is the Qt Assistant system (for help-menu-type functionality), which as of Qt 4 relies on Qt's SQLite functionality, typically built as a plugin (qsqlite4.dll, if I remember correctly).
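As a concrete sketch (the Qt install location and plugin name are assumptions; check your own Qt tree), deploying that plugin on Windows means recreating the plugin subfolder next to your binary:

```
:: Qt 4 looks for SQL driver plugins in a "sqldrivers" folder
:: located next to the executable
mkdir sqldrivers
copy "%QTDIR%\plugins\sqldrivers\qsqlite4.dll" sqldrivers\
```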
For Linux:
This is trickier because of the wide variation among Linux distributions. You can of course use the GNU build system if you want to ship source, but if you're shipping binaries and want to support a variety of distros, you might do best to build packages for each platform you want to release on.
At a company I worked for in the past, we switched to using CMake and, after setting up all the project and build files, used it to generate builds and packages for different OSes. On Windows this meant hooking in with Inno Setup, and for Unix-like systems CMake knows how to generate things like self-installing shell scripts. It definitely made life much easier.
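For a rough idea of what this looks like, here is a hedged sketch using CPack (CMake's packaging companion); generator availability varies by platform and CMake version:

```
# Assuming the project already includes CPack rules:
cmake -S . -B build && cmake --build build
cd build
cpack -G NSIS   # a Windows installer (Inno Setup needs its own script)
cpack -G STGZ   # a self-extracting shell-script installer for Unix-likes
```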
Our QA department would then test our software in completely clean virtual machine instances of each supported platform and see if anything was missing.
If you're talking about DLLs, I assume it is about Windows.
Use Dependency Walker to see the DLL dependencies.
Or... take a clean system with no dev tools installed, put your executable on it, try to run it, and see which DLLs are reported as missing. Put those DLLs next to the executable.
For a Linux version, you can either create platform-targeted installer releases for each Linux distribution, or let people compile from source. If your app is new, the only way to get exposure is to supply people with ready-made, targeted installers. New users loathe compiling packages from source.
You can try Debian (.deb) and Red Hat (.rpm) packages first. These two families are extremely popular and will give you a taste of how it all works.
Related
I am making a modified C++ compiler and I have it built and tested locally. However, I would like to be able to package my build for Windows, Linux (Debian), and Mac OSX.
All of the instructions I can find online deal with building GCC but have no regard for making something distributable (or perhaps I am missing something?). I know that for Windows I will need to bundle MinGW somehow, but that only confuses me further, and I have no idea how well Mac works with GCC these days.
Can anyone layout a set of discrete high-level steps I could try on each system so I can allow people to install my modified compiler easily?
First make sure your project installs well, including executables, headers, and runtime dependencies. If you're using something like CMake, this is a matter of installing things to CMAKE_INSTALL_PREFIX, possibly with the help of the GNUInstallDirs module. If you are using plain make, you need to ensure that something like make install prefix=... works well.
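A minimal way to sanity-check the install step might look like this (the staging path is just an example; cmake --install needs CMake 3.15+, older versions use the install target instead):

```
cmake -S . -B build -DCMAKE_INSTALL_PREFIX=/tmp/stage
cmake --build build
cmake --install build
ls -R /tmp/stage   # executables, headers, and runtime files should all land here
```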
From here, you can target each platform independently, and treat the packaging independently from your project. Like Chipster mentioned, making rpm files isn't so tough; .deb files for Debian-based OSs and tar.xz packages for Arch-based OSs are similar. The rules for creating these packages can use your install rules to assemble the package. You mentioned MinGW: if you're targeting an MSYS distribution of MinGW for Windows deployment, then Arch-style pacman packaging works on MSYS as well. You can work on supporting one platform at a time with almost no changes to your actual project.
Typically in the open-source world, people release a tar.gz file supporting ./configure && make && make install or similar. Then someone associated with a platform (like a Debian developer) will find your project, write some packaging rules for it, and release it into their distribution. That means your project can be totally agnostic about where it's being released. It also means you don't really need to worry about MacOS yet; you can wait until someone wants it there, or until you have some hardware to test it on.
If you really want to control how things are packaged for each platform from inside your project, and you are already using CMake, CPack is a great tool that helps out. After writing CPack rules for your project, you can simply type cpack to generate many types of deployable archives. You won't get the resulting *.deb file into the official Debian or Ubuntu archives, but at least people using those formats can install your package.
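As a sketch, assuming you have added include(CPack) and the usual CPACK_* variables to your CMakeLists.txt, a single command can then emit several formats at once:

```
cd build
cpack -G "DEB;RPM;TGZ"   # produces .deb, .rpm, and .tar.gz packages
```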
Also, consider releasing one package with the runtime libraries and one with the development content (headers, compiler, static libraries). That way, if someone uses your compiler, they can redistribute just the runtime libraries, which is probably a much simpler install.
How do I deal with the situation where the user's machine may not have DLLs like msvcp100.dll or msvcr100.dll? I don't want my software to fail to install on a user's machine because of this kind of error. I've been thinking of either finding a tool and copying every single DLL the executable needs next to it, or building a static version of Qt (which may amount to much the same thing in the end). I've seen applications that don't provide those DLLs and leave it up to the user to get and install them. That can be an issue for those users, finding the right DLL, matching versions and all; I'd like to avoid it.
How is this usually solved in a real world application?
You redistribute them in your installer.
Deploying Qt apps is not an easy topic. You should understand that you need to redistribute your compiler's DLLs as well as the Qt DLLs. For the MSVC 2010 compiler, you can simply put msvcp100.dll and msvcr100.dll next to your executable file.
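A sketch of what that can look like (the redist path varies by Visual Studio edition and install location, so treat it as an assumption to verify):

```
:: From a VS2010 command prompt: copy the CRT DLLs next to the binary
copy "%VCINSTALLDIR%\redist\x86\Microsoft.VC100.CRT\msvcp100.dll" .
copy "%VCINSTALLDIR%\redist\x86\Microsoft.VC100.CRT\msvcr100.dll" .
```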
As for Qt, the easiest way to handle Windows deployment is the windeployqt command-line utility.
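For example (the path is a placeholder; run it from a Qt command prompt so the tool is on your PATH):

```
:: Copies the Qt DLLs and plugins the binary needs into its folder
windeployqt --release path\to\myapp.exe
```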
Update. How is this usually solved in real applications?
There are no fully automatic tools that could determine all the dependencies, because application deployment is a complex task. For example, your application can depend on many things, such as library DLLs, compiler DLLs, registry keys, drivers, environment variables, a computer-reboot requirement, ActiveX/COM components, other installed applications, etc.
But there are tools that can help you in this matter. They can be configured once for a target project and then do the work every time; this is called build automation, and a closely related concept is continuous integration. As for creating installation packages for end users, there are plenty of tools, such as Nullsoft Scriptable Install System (NSIS), Inno Setup, Qt Installer Framework, WiX, etc.; see the list of installation software.
In the installer script, put a command to download the DLLs that the application needs to run, and in the application source code, make it load the DLLs ONLY from the folder it is running in.
If you are using the NSIS installer, use this example
I'm responsible for developing a set of C++ libraries and programs. We currently build on Linux and MacOS, but Windows support is also a requirement. We will need to support VS2010 and VS2012, and in the future will also include VS2013 and maybe MinGW. We're using cmake for building, so our code should build on all the platforms without issues; my problem is how to manage all the dependencies on Windows in order to be able to build in the first place, and how to keep them up to date over time. At the moment we have one virtual machine per Visual Studio version as a Jenkins slave, so parallel builds of all the variants are fairly easy, but managing it all is not.
The problem is the number of variants this requires building. If we consider only VS2010 and VS2012, with debug/release and i386/x64 builds, that's already 8 copies of each library; 16 if we include the other compilers. We will need all the libraries our code depends on, which will include at a minimum boost, qt, xerces+xalan, zlib, icu, libpng/tiff/jpeg, hdf5 and more, plus python, and all their dependencies. And as new upstream releases are made, we'll need to keep the entire collection up-to-date and consistent for all the build/arch/compiler variants.
I don't want to do this by hand, since this really needs automating. However, I'm unaware of any good solution for doing this on Windows. The Windows building guides I've seen for other projects often involve hand-building all the dependencies, and only build for a single variant. On Linux, it's already packaged, you don't need separate debug builds, and the arch variants can be catered for with chroots; on MacOS there's homebrew, macports etc., and it's also fairly simple to automate stuff there as well. Is there any equivalent for Windows? I've looked at stuff like chocolatey, but it's entirely unsuited to handling libraries, and is pretty poor as a package manager.
This seems like it should be a common problem for anyone doing C++ development on Windows? Are there any common solutions, tools or methodologies for managing a complex set of libraries and tools for development? How do other developers manage this?
NB. Just for the record, we are not using the visual studio application; we're doing all builds non-interactively via scripts driving the compilers directly with cmake and/or msbuild.
Many thanks,
Roger
I worked on a large Windows C++ project that delivered x86 Release, x86 Debug, x64 Release, and x64 Debug builds. Very similarly, we used a build system that did parallel builds for all target platforms using a custom script.
We managed all third-party dependency libraries in organized folders, for example: x86\Release\zlib.dll, x86\Debug\zlib.dll, x64\Release\zlib.dll, x64\Debug\zlib.dll.
A custom script picked up these libraries and the project source code from the configuration management tool, which allowed the relevant target binaries to be built automatically as needed.
Any third-party library change was committed to the configuration management tool and then picked up by the script for the next build.
As for your question about supporting both VS2010 and VS2012, I don't understand its importance. Isn't one version of VS enough for the project?
You may take a look at GISInternals and their build system: https://github.com/gisinternals/buildsystem
It's basically a set of batch and make files calling each other. You still need to keep track of library updates manually.
Possible Duplicate:
What’s the best way to distribute a binary application for Linux?
We would like to publish a proprietary game for Linux. The game is relatively simple and has a custom engine written in C++. We have been using the MinGW compiler on Windows, and one person does programming and testing on Linux with the g++ compiler.
The game is written exclusively using open source libraries, which are all cross-platform.
What is the recommended way to make a program for Linux work across all Linux distributions without requiring packaging for any of them? The ideal way for me would be to package it as a .zip or tarball and provide that as a download, just like good ol' .exe files under Windows that work with system libraries.
Most games I've played under Linux that provide such downloads offer a shell script (which is a compatibility problem in itself!) that launches them. I have no problem with that, but it reeks of user-unfriendliness. Additionally, a lot of games require extra libraries. For instance, SDL 2.0, the hardware-accelerated new version of the popular library, is not yet available in most distributions' repositories, yet several games are known to use it, and I had to compile it myself. This is even worse.
I would like a solution whereby a customer can click on the binary file in their file manager of choice, and it would run, no 'apt-getting' or 'yumming' of the necessary libraries. I don't mind breaking standards, and if it has to go in /opt, it will.
I would like a solution whereby a customer can click on the binary file in their file manager of choice, and it would run, no 'apt-getting' or 'yumming'. I don't mind breaking standards, and if it has to go in /opt, it will.
First, you have to understand that "yumming" and "apt-getting" are not really the actual installers of the applications (packages), they are simply the front-end programs used to look up / download / update / trace packages on the repositories (from the distro and others that you manually add). So, when you say "no 'apt-getting' or 'yumming'", we have to assume that you mean that you don't want to put up your game on a repository, which makes sense if you want people to pay to get your game (as opposed to other proprietary but free software like flash, graphics drivers, video codecs, and other things you typically find in repositories).
So, there are really just two types of package management systems, RPM and DEB, which use a command-line program, rpm and dpkg respectively, to actually do the installation. Most distributions also come with a GUI front-end for those programs too (not the Synaptic-style package management software, which is a front-end to apt-get or yum, but something simpler). When you double-click a .deb or .rpm file on most distros, you get this GUI front-end popping up, asking you for admin credentials, telling you about the dependencies that are required, and, obviously, that you are about to install this package onto your system. From what I can tell, this is exactly what you want. And so, what you need to provide is a .deb file (for Debian distros) and a .rpm file (for Red Hat distros), in both 32-bit and 64-bit versions of your game, just like I assume you provide a .msi file for your Windows versions.
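To give an idea of what building one of these by hand involves, here is a minimal .deb sketch; every name, version, and dependency below is a placeholder:

```
mkdir -p mygame_1.0-1_amd64/DEBIAN mygame_1.0-1_amd64/opt/mygame
cat > mygame_1.0-1_amd64/DEBIAN/control <<'EOF'
Package: mygame
Version: 1.0-1
Architecture: amd64
Maintainer: You <you@example.com>
Depends: libsdl2-2.0-0
Description: My proprietary game
EOF
cp mygame mygame_1.0-1_amd64/opt/mygame/
dpkg-deb --build mygame_1.0-1_amd64   # emits mygame_1.0-1_amd64.deb
```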
As for dependencies that might be hard for users to locate: what you should do is include, in some directory of your installer, a number of additional ('recommended'-version) packages for these esoteric dependencies, so that they can be installed from those offline packages if a newer version cannot be fetched from the distro's repositories. And that's about it.
And you can either make people pay to get the deb or rpm installers for your game, or include some kind of license-key system to unlock the game (and thus, make the deb/rpm files available for download, and charge for the key / code to unlock it).
The ideal way for me would be to package it as a .zip or tarball and provide that a download, just like good ol' .exe files under Windows that work with system libraries.
Ideal? Really!?! Yeah, if all you do is use system libraries, then it will work. But if there is anything more, it will be a nightmare (nearly as bad as it is under Windows if you don't rely on installers).
The game is written exclusively using open source libraries, which are all cross-platform.
Make sure none of those open source libraries are GPL-licensed, because if that's the case, you can't make your game proprietary. Your dependencies must be licensed under LGPL or BSD, or similar licenses, so watch out for that.
What is the recommended way to make a program for Linux work across all Linux distributions without requiring packaging for any of them?
That is not recommended to begin with, so you need the least unrecommended way of doing it. I guess that would mean producing a statically linked binary (and you would need a 32-bit and a 64-bit version anyway).
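A hedged sketch of the static-linking route with g++ (these are real GCC flags, but whether your dependencies ship usable static variants is another matter):

```
# Statically link the GCC runtimes so the binary does not depend on the
# user's libstdc++/libgcc versions
g++ -o mygame main.cpp -static-libgcc -static-libstdc++
# A fully static binary would use -static instead, with the usual
# glibc caveats (NSS, dlopen, etc.)
```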
The recommended way would be to decide a distribution system (RPM, DEB, ...) and verify dependencies on the various target platforms. Then the user could click on the installer package - much like he/she would do with a Windows MSI file - and be also able to uninstall/upgrade the program later on.
Note that in practice you would have to provide testbed environments for the target platforms even if you distributed the static binary, since you can't avoid doing tests. At that point, packaging the RPM/DEB/etc is not really a significant increase in time expenditure; and on the other hand, it would make the package much tighter and easier to download and install.
I'm developing software using ITK and VTK, and it's all going very well. However, I would like to deploy the software onto end-user machines that do not have ITK or VTK installed. When I build the software and run it on a machine without ITK or VTK, I get errors such as "Cannot load libItkBasicFilters.dll". This makes perfect sense: there are no such DLLs on the machine.
I can't expect all my users to install ITK just so they can run my software. How can I make it so that they only have to run my executable and can enjoy the software? I'm afraid I don't understand these libraries well enough, so if you can give me any ideas, that would be very helpful.
You have two options:
Bundle the installers of those libraries into your own installer and run them automatically if the libraries are not found on the system. This is how it's usually done if you have an installer, and it is the preferred option in that case.
Use static versions of the libraries. Since the libraries use the BSD license, you are allowed to do this (just recompile them as static libraries if you don't have them already, as sketched below). This will take more space on the target computer and require reinstalling your software whenever the libraries need updating (DLLs can be updated to fix bugs or security issues without recompiling the dependent applications), but if you want the user to just fetch an .exe and start it, it's the only option.
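A minimal sketch of that rebuild, assuming the libraries follow the standard CMake convention (double-check the option names against the ITK/VTK versions you use):

```
# BUILD_SHARED_LIBS=OFF is the conventional CMake switch for static builds
cmake -S ITK -B itk-build -DBUILD_SHARED_LIBS=OFF -DCMAKE_BUILD_TYPE=Release
cmake --build itk-build
```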