I have written a portable C++ application using Qt libraries. This means that I cannot use the MT flag for compiling without risking memory issues.
This leaves me with two options:
1) Deploy the portable application with an installer.
2) Package the C++ dependencies within the same folder or use private assemblies.
Both 1 and 2 defeat the idea of portable software, so I was thinking of a third option:
3) Use IExpress to drop the C++ dependencies before launching the application. On exit, delete the C++ dependencies.
Unfortunately, option 3 has received some flak from Stack Overflow members. They even dislike option 2, which leaves me with only option 1. I can see option 1 being doable if I use a portable installer.
Is there such a thing as a portable installer? Essentially, I want the installer to check whether the needed dependencies are installed before running my application (just like a regular installer would) and, if they are, simply continue launching my application. Otherwise, show the user a message box with a link to the URL where they can download the dependencies. I am aware that I could write my own installer in C++ that does this, but I was wondering whether any existing installers already offer this specific functionality.
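For illustration, here is a minimal sketch of what such a launcher stub could look like in Win32 C++. This is an assumption about how one might implement it, not an existing tool; the DLL names, the download URL and the application name are placeholders, not a definitive list for any particular Qt or VC++ version:

```cpp
// Hypothetical launcher stub (Win32 C++). The DLL names and the download URL
// are placeholders; substitute the runtime modules your build actually needs.
#include <windows.h>

static bool dependenciesPresent()
{
    // LoadLibrary searches the exe directory first, then the system paths.
    const wchar_t* required[] = { L"QtCore4.dll", L"QtGui4.dll", L"msvcp90.dll" };
    for (size_t i = 0; i < sizeof(required) / sizeof(required[0]); ++i) {
        HMODULE h = LoadLibraryW(required[i]);
        if (!h)
            return false;   // at least one dependency is missing
        FreeLibrary(h);
    }
    return true;
}

int WINAPI WinMain(HINSTANCE, HINSTANCE, LPSTR, int)
{
    if (!dependenciesPresent()) {
        MessageBoxW(NULL,
            L"Required runtime libraries are missing.\n"
            L"Please download them from http://example.com/runtime",
            L"MyApp", MB_OK | MB_ICONINFORMATION);
        return 1;
    }
    // Dependencies found: hand over to the real application next to this stub.
    STARTUPINFOW si = { sizeof(si) };
    PROCESS_INFORMATION pi = { 0 };
    if (CreateProcessW(L"MyApp.exe", NULL, NULL, NULL, FALSE,
                       0, NULL, NULL, &si, &pi)) {
        CloseHandle(pi.hThread);
        CloseHandle(pi.hProcess);
    }
    return 0;
}
```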
http://qt-project.org/doc/qt-4.8/deployment.html
The DLLs for Qt on Windows are so small that deploying them with the application isn't an issue, in my opinion.
I don't know of any programs that place the Qt DLLs on Windows somewhere another program would later find them (like C:/Windows/system32).
I think the only places where you could expect reuse of the libraries are Linux or a mobile device with a lot of Qt apps. But even then you have to make sure that the versions of the libraries are high enough to support all the functionality you are using.
Hope that helps.
Related
I'm working on a C++ Linux application that uses wxWidgets, and needs to be distributed as a compiled binary application. The project lead has specified that we are to include all dependencies for the application so that the end user does not need to install anything to run the application, provided they have standard system components installed already (libc, etc). I think this requirement is something that the end user asked for. I know that this is not what you might consider to be a "normal" distribution process for Linux applications.
For simple libraries that don't have many dependencies themselves, this is not an issue. But for wxWidgets I'm running into issues with webkitgtk which is required for the WebView class (which is used in the application). webkitgtk has a number of dependencies itself, which may have their own dependencies, and so on. Basically, it looks like I'd be opening a real can of worms by trying to include everything in the application, and the more senior developer on the project seems to agree.
So I'm wondering, what are my options for distributing such an application? I've tried searching for information about this, and the prevailing opinion seems to be to have the end user install wxWidgets. These are the options that I've come across:
1) Compile all dependencies as shared libraries as the project lead wants. The downside to this is that there are many libraries to worry about and this will lead to significant bloat.
2) Require that the end user install wxWidgets (on top of GTK and webkitgtk). The downside here is that the user would have to install multiple dependencies, and if they aren't on a distribution with appropriate versions of the above in their package manager, this could be a real hassle for them. It also means we couldn't provide something that was specifically asked for.
3) Require that the end user have GTK and webkitgtk installed, but not wxWidgets. Same downsides as above, but with fewer dependencies. An additional downside is that there may be version compatibility issues if different versions of the dependencies are installed than were used to build the packaged wxWidgets library.
Am I correct in my assessment of the pros and cons of these various options? Are there any options that I'm missing?
Thanks!
David,
The best possible solution is probably to ask the user to install X11, GTK+ {2,3} and WebKit-GTK.
wxWidgets can be statically linked with the application.
You can ask your user to have WebKit-GTK at version X.Y.Z or later, and that should satisfy the requirements. Bundling WebKit-GTK with all its dependencies will be very hard, especially since it depends on GTK+ itself. So if you go this route you will be screwed.
As a Linux user, I vote for manual dependency installation via the package manager. It's not that hard, and it could even be done automatically if you provide a package (not just a binary). Carrying your own runtime may cause problems (e.g. Steam on Debian). Another option is to provide two flavors: an all-inclusive one and one that requires the dependencies.
I developed a Qt application on a MacBook (El Capitan 10.11.2) and it is now ready to be released.
What I want now is to create a standalone executable file for both Mac and Windows.
But I don't know how!
I found this link but I am unable to follow its guidance; it looks different from what my system is showing me.
If you have any idea, please help me.
Thank you
Well, to compile an application for Windows, you will need a Windows machine (or at least a virtual machine). You can't compile for Windows on a Mac.
Regarding the "standalone": The easy way is to deploy your application together with all the required dlls/frameworks and ship them as one "package". To to this, there are the tools windeployqt and macdeployqt. However, those will not be "single file" applications, but rather a collection of files.
If you want to have one single file, you will have to build Qt statically! You can do this, but you will have to do it on your own. And if you do, please note that the LGPL license (the one for the free version of Qt) then requires you to give users a way to relink your program against a modified Qt, for example by releasing your object files or source code! That's not an issue if you just link to the dynamic libraries.
EDIT:
Deployment
Deployment can be really hard, because you have to do it differently for each platform. Most of the time there are three steps:
Dependency resolving: In this step, you gather all the executables/libraries/translations/... your application requires and put them somewhere they can find each other. For Windows and Mac, this can be done using the tools I mentioned above.
Installation: Here you will have to create some kind of "installer". The easiest way is to create a zip file that contains everything you need. But if you want a "nice" installation, you will have to create a proper installer for each platform. (One of many possibilities is the Qt Installer Framework. The best thing about it: it's cross-platform.)
Distribution: Distribution is how you get your program to the user. On Mac you have the App Store; on Windows you don't. The best way is to provide the download on a website made for this (like SourceForge, GitHub, ...).
I can help you with the first step, but for the other steps you will have to research the possibilities and decide on a way to do it.
Dependencies
Resolving the dependencies can be done by either building Qt statically (this way you will have only one single file, but additional work, because you will have to compile Qt yourself) or using the dynamic build. For the dynamic build, Qt provides tools to resolve the dependencies:
macdeployqt is rather easy to use. Compile your app in release mode and call <qt_install_dir>/bin/macdeployqt <path_to_your_bundle>/<bundle>.app. After that's done, all Qt libraries are stored inside the <bundle>.app folder.
windeployqt is basically the same: <qt_install_dir>\bin\windeployqt --release <path_to_your_build>\<application>.exe. All dependencies will end up inside the build folder. (Hint: copy <application>.exe into an empty directory and run windeployqt on that path instead. This way you get rid of all the build files.)
Regarding the static build: just google it; you will find hundreds of explanations for any platform. But unless you have no choice but to use one single file (for whatever reason), I would recommend dynamic builds. And regarding the user experience: on Mac, users won't notice a difference, since in both cases everything is hidden inside the app bundle. On Windows, it's normal to have multiple files, so no one will mind. (And if you create an installer for Windows, just make sure to add a desktop shortcut. That way the user still has "a single file" to click.)
Possible Duplicate:
What’s the best way to distribute a binary application for Linux?
We would like publish a proprietary game in Linux. The game is relatively simple, has a custom engine written in C++. We have been using the MinGW compiler on Windows and one person does programming and testing in Linux with the g++ compiler.
The game is written exclusively using open source libraries, which are all cross-platform.
What is the recommended way to make a program for Linux work across all Linux distributions without requiring packaging for any of them? The ideal way for me would be to package it as a .zip or tarball and provide that as a download, just like good ol' .exe files under Windows that work with system libraries.
Most games I've played under Linux that provide such downloads offer a shell script (which is a compatibility problem in itself!) that launches them. I have no problem with that, but it reeks of user-unfriendliness. Additionally, a lot of games require extra libraries. For instance, SDL 2.0, the hardware-accelerated new version of the popular library, is not yet available in most distributions' repositories, yet several games are known to use it, and I had to compile it myself. This is even worse.
I would like a solution whereby a customer can click on the binary file in their file manager of choice, and it would run, no 'apt-getting' or 'yumming' of the necessary libraries. I don't mind breaking standards, and if it has to go in /opt, it will.
I would like a solution whereby a customer can click on the binary file in their file manager of choice, and it would run, no 'apt-getting' or 'yumming'. I don't mind breaking standards, and if it has to go in /opt, it will.
First, you have to understand that "yumming" and "apt-getting" are not really the actual installers of the applications (packages), they are simply the front-end programs used to look up / download / update / trace packages on the repositories (from the distro and others that you manually add). So, when you say "no 'apt-getting' or 'yumming'", we have to assume that you mean that you don't want to put up your game on a repository, which makes sense if you want people to pay to get your game (as opposed to other proprietary but free software like flash, graphics drivers, video codecs, and other things you typically find in repositories).
So, there are really just two main types of package management systems, RPM and DEB, which use a command-line program, rpm and dpkg respectively, to actually do the installation. Most distributions also come with a GUI front-end for those programs (not the Synaptic-style package management software, which is itself a GUI front-end to apt-get or yum, but something simpler). When you double-click a .deb or .rpm file, on most distros you get this GUI front-end popping up, asking you for admin credentials and telling you about dependencies that are required and, obviously, that you are about to install this package onto your system. From what I can tell, this is exactly what you want. And so, what you need to provide is a .deb file (for Debian-based distros) and a .rpm (for Red Hat-based distros), for both 32-bit and 64-bit versions of your game, just like I would assume you provide a .msi file for your Windows versions.
As for dependencies that might be hard for users to locate: what you should do is include, in some directory of your installer, a number of additional ('recommended' version) packages for these esoteric dependencies, so that they can be installed from those offline packages if a newer version cannot be fetched from the distro's repositories. And that's about it.
And you can either make people pay to get the deb or rpm installers for your game, or include some kind of license-key system to unlock the game (and thus, make the deb/rpm files available for download, and charge for the key / code to unlock it).
The ideal way for me would be to package it as a .zip or tarball and provide that as a download, just like good ol' .exe files under Windows that work with system libraries.
Ideal? Really!?! Yeah, if all you do is use system libraries, then it will work. But if there is anything more, it will be a nightmare (nearly as bad as it is under Windows if you don't rely on installers).
The game is written exclusively using open source libraries, which are all cross-platform.
Make sure none of those open source libraries are GPL-licensed, because if that's the case, you can't make your game proprietary. Your dependencies must be licensed under LGPL or BSD, or similar licenses, so watch out for that.
What is the recommended way to make a program for Linux work across all Linux distributions without requiring packaging for any of them?
That is not recommended to begin with, so you need the least-unrecommended way of doing it. I guess that would mean producing a statically linked binary (and you would need a 32-bit and a 64-bit version anyway).
The recommended way would be to decide a distribution system (RPM, DEB, ...) and verify dependencies on the various target platforms. Then the user could click on the installer package - much like he/she would do with a Windows MSI file - and be also able to uninstall/upgrade the program later on.
Note that in practice you would have to provide testbed environments for the target platforms even if you distributed the static binary, since you can't avoid doing tests. At that point, packaging the RPM/DEB/etc is not really a significant increase in time expenditure; and on the other hand, it would make the package much tighter and easier to download and install.
I have read these two SO questions: Which runtime libraries to ship? and License of runtime libraries included in GCC? - both were very helpful but not quite what I was looking for.
I have always just written programs for use on my own machine, which has never caused me any problems, but now I want to start running software on other machines and I'm wary of the runtime requirements.
EDIT: See the example below instead; this was misleading.
Specifically, if I write a C++ program on a Windows machine, compiled with gcc through MinGW, and want to run it on another machine:
Do I have to send the libstdc++.dll with my program?
Is this single file (I assume placed in the executable's directory) sufficient to allow the program to run?
Also, an identical example, except this time it is an Objective-C program. Is sending the libobjc.dll file to the other machine sufficient to allow the program to execute properly?
I am used to running programs on machines which have developer tools, etc, installed, but now I'm looking to run them on general purpose machines (friends', colleagues' etc), and I'm not quite sure what to do!
EDIT: In response to edifice's answer, I feel I should clarify what it is I'm looking for. I know how to identify the necessary DLL(s) (/dylibs, etc) that my programs use, (although I am accustomed to doing that work manually; I had not heard of any of the tools). My question was more "What do I do now?"
A more general example is probably needed:
Let's say I have written a program which has object files derived from C++, C and/or Objective-C(2) code. I have used some Windows API code which compiled successfully using MinGW's gcc. I also have a custom DLL I wrote in Visual Studio (C++).
I have identified which DLLs my program will use at runtime (one of which may be GCC's libobjc.dll; I'm not sure whether this would or should make a difference on a Windows machine, but I want to keep this as general as possible) - the "prerequisite DLLs".
I would like to run it on my colleagues' computers, most of which run Windows 7, though some now run Windows 8. Starting from the very beginning for the sake of completeness:
Do I need to transfer the prerequisite DLLs to my colleagues' computers?
What directory should I place them in? (exe directory / a system directory?)
Once in place, will the presence of these DLLs allow the program to execute correctly? (Assuming it knows where to find them)
Are there any other files that should be transferred with the DLLs?
Basically I'm trying to determine the entire thought-process for developing and running an application on another machine in terms of system runtime requirements.
When loading DLLs, the first place Windows looks is the directory that the exe is in. So it will probably work just fine to put the DLLs there.
For the Microsoft DLLs though, I think it makes more sense to ask your colleague to install the Visual C++ runtime, which is a redistributable package from Microsoft. Ideally you would make an installer using something like WiX and it would install that prerequisite for you, but it is OK to just tell your colleague to do it.
Be sure to include a license file with your software if you include DLLs from gcc, because the GPL requires it.
libstdc++ isn't necessarily sufficient. You almost certainly need libgcc too, but actual dependencies are liable to vary with your particular application.
The best way to determine what you need to ship with your application is to load your EXE into a program like Dependency Walker.
Just as an example, I've compiled a test C++ program which simply prints a std::string. In Dependency Walker, you can see that it depends directly on two modules other than those that come with Windows: libgcc_s_dw2-1.dll in addition to libstdc++-6.dll.
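For reference, a minimal sketch of such a test program (nothing more than printing a std::string, which with a MinGW build is already enough to pull in the two DLLs above):

```cpp
// Minimal MinGW test program: printing a std::string is enough to make the
// executable depend on libstdc++-6.dll and libgcc_s_dw2-1.dll.
#include <iostream>
#include <string>

int main()
{
    std::string message = "hello, runtime dependencies";
    std::cout << message << std::endl;
    return 0;
}
```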
You should remember to expand the tree under each DLL to make sure that it itself doesn't have any other dependencies (if A depends on B, B might depend on C even if A doesn't directly depend on C).
If you're worried and want the strongest assurances, you could install Windows into a virtual machine (VirtualBox is free) and test your application inside it. If you use Microsoft APIs, you may wish to check the MSDN documentation to see with what version of Windows they were introduced and ensure that it aligns with your target minimum Windows version.
Update: As xtofl points out this won't cover libraries loaded dynamically using LoadLibrary. If you want to cover this base, use Process Monitor to examine what DLL files are touched when you run the application. (Add an 'Image Path' criterion with the path to your EXE in order not to get flooded.) This has the added advantage that it covers all files, registry entries, etc. that your application depends on, not just DLLs.
I'm implementing an installer in Java that is supposed to download and install an application for non-privileged users on Windows (XP and up). The application is written in C++ and depends on the usual VC runtime libraries (msvcm90.dll and friends). In order to save bandwidth, I want to avoid downloading the VC redistributables if they are already available for the user. I do, however, have a problem finding a reliable method to detect whether an assembly is installed.
If the assembly is missing, I will deploy it as described here:
http://msdn.microsoft.com/en-us/library/ms235291%28VS.80%29.aspx
So the question is simply how to detect whether a (any) assembly is installed on the machine. It is not a requirement that this be done from Java. I can easily write a small probe in C++ and link it statically for the task.
jgaa
If you are willing to write a small test program, then rather than writing one that looks for your dependencies, write one that has the same dependencies as your application. Try to run it. If it runs, the dependencies are in place. If it fails, the probable reason is that the dependencies are missing.
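A minimal sketch of such a probe, assuming the application depends on the VC9 C++ runtime; the idea is to compile it with the same compiler settings and manifest as the real application, so that it requires the same runtime assemblies (the details below are illustrative, not a definitive recipe):

```cpp
// Hypothetical runtime probe: build this with the same /MD flag, VC version
// and manifest as the real application, so it requires the same runtime
// assemblies. If it starts and exits with 0, the runtime is installed; if the
// loader refuses to start it, the installer should deploy the redistributable.
#include <string>
#include <vector>

int main()
{
    // Touch the C++ standard library so msvcp90.dll is actually required.
    std::vector<std::string> v;
    v.push_back("runtime probe");
    return v.empty() ? 1 : 0;
}
```

The Java installer can then launch the probe and treat a failure to start (or a non-zero exit code) as "redistributable missing".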
This seems a fairly complicated thing to check, really, as depending on the setup these DLLs may already be located in several possible places. Perhaps your best bet would be testing for the existence of these DLLs using the WinAPI LoadLibrary - this should automatically find any DLL that is shared and appropriate to the build.
Even better, LoadLibrary a DLL that requires them, as Ben suggests.