Releasing a program - C++

So I made a C++ console game. Now I'd like to "release" the game. I want to only give out the .exe file and not the code. How do I go about this? I'd like to make sure it will run on all Windows devices.
I used the following headers:
iostream
windows.h
MMSystem.h
conio.h
fstream
ctime
string
string.h
*I used namespace std
*I used Code::Blocks 13.12 with MinGW
and I used the following library:
libwinmm.a
Thank you in advance

There are many different ways of installing applications. You could go with an installer like Inno Setup or just go with a regular ZIP file. Some programs can even be standalone by packaging all resources within the executable, but to my knowledge this is not an easy option for C++.
I suppose the most basic way is to create different builds for different architectures with static libraries, then find any other DLLs specific to that architecture and bundle everything together in one folder. Supporting x86/x86-64/ARM should be enough for most purposes. I do know that LLVM/Clang and GCC have extensive support for many architectures, and if need be, you should be able to download the source code of the libraries you use and compile them for each architecture you plan to support, with whatever compilation options each target needs.
A virtual machine can also be helpful for this cross-compilation and compatibility testing.
tl;dr: Get all the libraries you need in either static or dynamic (DLL) form. Check that they are of the right architecture (x86 code will not run on MIPS and vice versa). Get all your resources. Get a virtual machine, and then test your program on it. Keep testing until all the dependency problems go away.
Note: when I did this, I actually had some compatibility issues with, of all things, MinGW-w64. You may need some DLLs from MinGW, or, if you're using Cygwin, you will of course need the Cygwin DLL. I don't know much about MSVC, but I would assume it also needs runtime DLLs at some level if you decide to support an outdated Windows OS.

Related

C++ library compiles but does not work like previous version

There is this library which is used as a reference by other programs: https://github.com/RetroAchievements/RASuite/tree/master/RA_Integration
I have downloaded the compiled programs (that come with the compiled library) and they work fine. My goal is to make a change in the library code, re-compile it and replace the DLL of the compiled programs I have downloaded with my own compiled DLL. Like so:
ProgramA.exe
|_ RA_Integration.dll < replace with my own (built)
Before even changing the code, I am just trying to compile the DLL and use it alongside the compiled programs I have downloaded. I am not willing to re-compile the programs themselves because it would be too much work due to dependencies, etc. I also would like to be able to just "ship" the DLL to whoever wants my fix.
So I have downloaded the source code of that library and re-compiled it myself successfully, but when I use it instead of the one that comes with the programs, they do not start up (Windows Event Viewer says that there was a problem loading my DLL).
I am assuming that my system has differences from the system that built the original DLL and that this is why it fails. My question is: can I find those differences? Although I am a professional .NET programmer (as in, it's my job), I am a C++ newbie and I am having trouble understanding all the linker/precompiler/dependency/C++ details that seem to give different builds/results from one machine to another.
All I have been able to find is that in the project properties the "Platform Toolset" is "Visual Studio 2013 - Windows XP (v120_xp)", therefore I have installed Visual Studio 2013 (with Update 5, since it seems Windows XP support was not present in base VS2013), but that does not seem to be enough. I am running Windows 10, which was surely not the OS the original programmer used when they compiled the DLL a couple of years ago, but I am not sure whether that matters.
Is there anything that could be found from the DLL itself, or from the project, that would hint at what I need on my system?
Hope that makes sense.
Thanks
Before even changing the code, I am just trying to compile the DLL and use it alongside the compiled programs I have downloaded. I am not willing to re-compile the programs themselves because it would be too much work due to dependencies, etc. I also would like to be able to just "ship" the DLL to whoever wants my fix.
Here's your fallacy: your DLL is a linking dependency. You must re-build your application because, obviously, the ABI of the library changed, rendering it incompatible with the functionality your program expects to find in the DLL.
There's no way around that short of building an ABI-compatible wrapper DLL using your precious programming knowledge :) Finding these differences is hard, because while you could, for example, export a symbol list from your DLL, which will basically contain all the functions that DLL "offers", some aspects of how these functions need to be called aren't actually part of that list and can only be deduced by a linker (or a skilled person with too much time on their hands and an unhealthy obsession with parsing things in their head) from the C++ source code.
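To make the symbol side of this concrete, here is a minimal, hypothetical illustration (the function name is invented, not taken from RA_Integration): in C++ the decorated export name encodes the signature, so the import recorded in the executable stops resolving as soon as the declaration or the toolset's decoration conventions change.
// What the original DLL exported and what ProgramA.exe's import table expects;
// MSVC decorates it roughly as ?UpdateScore@@YAXH@Z:
__declspec(dllexport) void UpdateScore(int value);

// What a rebuilt DLL might export after a seemingly harmless change;
// now decorated as ?UpdateScore@@YAXI@Z, a different symbol entirely:
__declspec(dllexport) void UpdateScore(unsigned int value);

// The loader looks for the old decorated name, cannot find it in the new DLL,
// and the program fails to start.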
In other words: you changed what you're run-time linking your program against. You must now rebuild your program. End of options!

Is it possible to use same DLL for clients using both Windows and Linux

I am looking to create a C++ library that can be used by both Linux and Windows clients. The OS specific functionality will be hooked up by the client by implementing the interfaces provided by the library.
Is this possible to achieve? Do I need to recompile the C++ project again on Linux?
P.S: I am using CodeBlocks IDE
The short answer is no, you still need to compile your library for each targeted platform -- however, assuming your code is written such that it is cross-platform, you can set up your build to target both Windows and Linux environments with little fuss. I do this now, using CMake to generate both Visual Studio projects for Windows environments and Makefiles for Linux environments.
I'm pretty confident that Linux will not accept a .dll :) And yes, you will need to recompile -- unless you run Windows as a virtual machine under Linux, which rather sidesteps the question.
It certainly cannot be the same binary file: shared objects use the ELF format on Linux, while DLLs use the PE format on Windows. And dynamic loading has different semantics on the two systems. See Levine's Linkers and Loaders book for details.
You could, if done carefully, have the same source code giving the two different files (the DLL on Windows, the dynamic shared object on Linux).
But you would probably need some conditional compilation tricks like #ifdef _WIN32, etc.
You might use libraries providing a common abstraction for such things. For instance, both GTK/GLib and Qt have some mechanism giving a common abstraction of dynamically linked (or dynamically loaded, i.e. dlopen-ed) libraries.
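As a minimal sketch of that kind of conditional compilation, here is one C++ source file that loads a hypothetical plugin with LoadLibrary on Windows and dlopen on Linux (the file names and the exported function plugin_version are made up for illustration):
#ifdef _WIN32
  #include <windows.h>
#else
  #include <dlfcn.h>   // on older glibc you may also need to link with -ldl
#endif
#include <cstdio>

typedef int (*plugin_version_fn)();   // signature of the hypothetical exported function

int main() {
#ifdef _WIN32
    HMODULE lib = LoadLibraryA("plugin.dll");
    if (!lib) { std::puts("could not load plugin.dll"); return 1; }
    plugin_version_fn fn =
        reinterpret_cast<plugin_version_fn>(GetProcAddress(lib, "plugin_version"));
#else
    void* lib = dlopen("./libplugin.so", RTLD_NOW);
    if (!lib) { std::puts("could not load libplugin.so"); return 1; }
    plugin_version_fn fn =
        reinterpret_cast<plugin_version_fn>(dlsym(lib, "plugin_version"));
#endif
    if (fn) std::printf("plugin reports version %d\n", fn());
#ifdef _WIN32
    FreeLibrary(lib);
#else
    dlclose(lib);
#endif
    return 0;
}
The same source builds on both platforms, but it still has to be compiled once per platform, producing an .exe plus .dll on Windows and an ELF executable plus .so on Linux.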
You probably want to read the Program Library Howto (at least for Linux).

Compiling a Linux library for MinGW

I have been using a socket library for C++. Some other info: 32-bit Linux, CodeLite and the GCC toolset. I want to be able to compile my program for Windows using the Windows edition of CodeLite. The socket library I have been using doesn’t have a mingw32 build, but it’s open source. So how can I make a mingw32 build of the socket library so I can make a Windows build using the source provided?
Most open source Linux libraries are built with the make build system (although there are others, like jam, as well as custom-written build scripts). MinGW comes with the make utility; it's mingw32-make.exe. It may be possible (if you're lucky) to rebuild your library simply by running make on Windows.
The more usual scenario is that you will need to configure the project before you can build it. The Windows shell doesn't support the scripting required by configure, but there's another part of the MinGW project that does, called MSYS. If you install MSYS and all the required tools, you'll be able to ./configure your project before running make.
Of course, the above will only work if the library is written to be portable. There are some breaking differences between the Linux socket implementation (sys/socket.h) and the Windows implementation (winsock2.h). You may be forced to edit chunks of the code to ensure that it is versioned correctly for the platform (or that any required dependencies are also built for Windows).
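To make those differences concrete, here is a hedged sketch of the kind of shim such a library typically carries (the helper names are invented, not the library's own):
#ifdef _WIN32
  #include <winsock2.h>          // Windows socket API
  // MSVC: #pragma comment(lib, "ws2_32.lib"); with MinGW, link -lws2_32 instead
  typedef SOCKET socket_t;
#else
  #include <sys/socket.h>        // BSD socket API
  #include <unistd.h>
  typedef int socket_t;
#endif

bool init_sockets() {
#ifdef _WIN32
    WSADATA wsa;                  // Winsock needs explicit startup; BSD sockets do not
    return WSAStartup(MAKEWORD(2, 2), &wsa) == 0;
#else
    return true;
#endif
}

void close_socket(socket_t s) {
#ifdef _WIN32
    closesocket(s);               // Windows sockets are not plain file descriptors
#else
    close(s);
#endif
}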
Also, there is the chance that the library may already be built for Windows, but using a different compiler like MSVC, which produces .lib and .dll files. MinGW uses .a files for libraries, but a clever feature is the ability to link directly against a .dll without the need for an import library, so you can often use an existing Windows library that was not built with MinGW (although this won't help for static linking). There is also a tool, dlltool, which can convert .lib to .a.
If you give details on the specific library you're working with, I may be able to pick out for you what needs to be done to get it running on Windows.
You port it to the new platform. :)
You're fortunate that it is open source; otherwise it would be practically impossible to port (you'd have to pay $$$'s to get a copy of the code under a particular license, or rewrite the entire product).
Enjoy.
Alternatively, they may well already have a port... Check the documentation for the library you are using.
First off, you're going to need to make sure that you aren't including any Linux-specific libraries.

Moving from Windows to Ubuntu [closed]

I used to program on Windows with Microsoft Visual C++, and I need to make some of my programs (written in portable C++) cross-platform, or at least release a working version of my program for both Linux and Windows.
I am a total newcomer to Linux application development (and rarely use the OS itself).
So, today, I installed Ubuntu 10.04 LTS (through Wubi) and equipped Code::Blocks with the g++ compiler as my main weapon. Then I compiled my very first Hello World Linux program, and I was confused by the output program.
I can run my program through the "Build and Run" menu option in Code::Blocks, but when I tried to launch the compiled application externally through a File Browser (in /media/MyNTFSPartition/MyProject/bin/Release; yes, I saved it in my NTFS partition), the program didn't show up.
Why? I ran out of ideas.
I need to change my Windows and Microsoft Visual Studio mindset to Linux and Code::Blocks mindset.
So I came up with these questions:
How can I execute my compiled linux programs externally (outside IDE)?
In Windows, I simply run the generated executable (.exe) file
How can I distribute my linux application?
In Windows, I simply distribute the executable files with the corresponding DLL files (if any)
What is the equivalent of LIBs (static library) and DLLs (dynamic library) in linux and how to use them?
In Windows/Visual Studio, I simply add the required libraries to the Additional Dependencies in the Project Settings, and my program will automatically link with the required static library(-ies)/DLLs.
Is it possible to use the "binary form" of a C++ library (if provided) so that I wouldn't need to recompile the entire library source code?
In Windows, yes. Sometimes precompiled *.lib files are provided.
If I want to create a wxWidgets application in Linux, which package should I pick for Ubuntu? wxGTK or wxX11? Can I run wxGTK program under X11?
In Windows, I use wxMSW, of course.
If the answer to question 4 is yes, do precompiled wxX11/wxGTK libraries exist out there? I haven't tried a deep Google search.
In Windows, there is a project called "wxPack" (http://wxpack.sourceforge.net/) that saves me a lot of time.
Sorry for asking so many questions, but I am really confused about these Linux development fundamentals.
Any kind of help would be appreciated =)
Thanks.
How can I execute my compiled linux programs externally (outside IDE)? In Windows, I simply run the generated executable (.exe) file
On Linux you do the same. The only difference is that on Linux the current directory is by default not in PATH, so typically you do:
./myapp
If you add the current directory to the path,
PATH=".:$PATH"
then the Windows-like way,
myapp
will work, but this is not recommended due to security risks, at least in shared environments (you don't want to run a /tmp/ls left behind by somebody).
How can I distribute my linux application?
In Windows, I simply distribute the executable files with the corresponding DLL files (if any)
If you are serious about distributing, you should probably learn about .deb (Ubuntu, Debian) and .rpm (Red Hat, CentOS, SUSE). Those are "packages" which make it easy for the user to install the application in the distribution-specific way.
There are also a few installer projects which work similarly to Windows installer generators, but I recommend studying the former path first.
What is the equivalent of LIBs (static library) and DLLs (dynamic library) in linux and how to use them?
.a (static) and .so (dynamic). You use them in more or less the same way as on Windows, of course using gcc-specific compilation options. I don't use Code::Blocks, so I don't know what its dialogs look like; in the end it is about adding -llibrary to the linking options (on Windows the equivalent is adding library.lib to the linker inputs).
Is it possible to use the "binary form" of a C++ library (if provided) so that I wouldn't need to recompile the entire library source code?
Yes. And plenty of libraries are already present in distributions.
Note also that if you use .deb's and .rpm's for distribution, you can say "my app needs such-and-such libraries installed" and they will be installed from the distribution archives. This is the recommended way; in general you should NOT distribute your own copy of the libraries.
If I want to create a wxWidgets application in Linux, which package should I pick for Ubuntu? wxGTK or wxX11? Can I run wxGTK program under X11?
Try wxGTK first; dialogs may look better, GNOME themes will be used, etc.
If the answer to question 4 is yes, do precompiled wxX11/wxGTK libraries exist out there? I haven't tried a deep Google search.
Try
apt-cache search wx
(or open the Ubuntu Software Center and search for wx)
In short: you will find everything you need in the distribution archives.
Navigate to the folder with your compiled program and execute ./program
Send the program, plus any .so files
.a is a static library, .so is a shared library.
Yes, but often you need to compile it yourself first.
Not sure about wxWidgets distributions, though.
Since Ubuntu comes with wxGTK packages you should definitely build against them. For development you should use a debug version, though, so it might be good to build it yourself; but for deployment, building against the packages the system provides seems better.
wxX11 is a worse choice than wxGTK, use it only for systems where wxGTK doesn't exist or requires newer GTK libraries than are available.
Why not just stick with what you know and develop in .NET? Ubuntu comes native with Mono. You could keep using Visual C++ or step up to C# and make your life a whole lot easier.
A piece of general advice to Linux newcomers who are technically minded to begin with: you should learn to use your chosen distribution properly.
In your case, that means learning how to acquire the right development packages provided by Ubuntu. For instance, some other people are advising you to download the source for libraries you are going to use, but the better way is to use Ubuntu's package system to download the libraries you want to program against, together with the headers for that library (often put in a separate package) as well as the debug symbols for the library (also often in a separate package).
Look in the System->Administration menu in Ubuntu for the Synaptic tool, which allows you to search the package repositories on the Internet. You'll almost certainly find packages for the libraries you need, as well as all tools.
1. Unix generally doesn't have a particular extension for an executable - so myprog.exe would just be myprog.
You might have to set it to be executable if the IDE doesn't do this automatically; type "chmod +x myprog".
5. For wxWindows I would download the source and build it; check the build instructions, but it's probably just a matter of "configure; make; make install". Generally on Unix you build libs from source so that they can correctly find all the components on your machine - you also get the source of examples etc.
I just added some information to rlbond's answer.
It depends on the Linux version. If you use Ubuntu, create a .deb package. (http://ubuntuforums.org/showthread.php?t=51003)
Can I run wxGTK program under X11?
Yes, if you have wxGTK package installed :)
This is not really going to answer your questions, but I think is a valid recommendation.
You have two issues you are trying to deal with:
The Linux environment.
Making sure your program is portable.
If I were you I would load Code::Blocks on Windows and build against either Cygwin or MinGW; that will help you make sure your code is portable across platforms. You are familiar with that environment, so you keep maximum productivity while getting over the OS hurdle.
Once you are satisfied with the above then take your code and move it to Linux. At that point any porting effort should be trivial.
When you say your program didn't show up I assume you mean that it was there in the file browser but when you double clicked it you got a busy cursor for a moment and then nothing happened?
If so then it means that the program failed to run, probably because it couldn't find the dynamic libraries it's linked against. To diagnose the problem you can run it from a terminal and then you'll be told what the problem is.
You might want to read the manual page for ld.so i.e. type
man ld.so
into a terminal. This tells you where the Linux dynamic linker looks for libraries at run-time. It also refers you to another useful tool called ldd, which I recommend becoming familiar with if you are doing Linux development.

Working with Visual Studio C++ manifest files

I have written some code that makes use of an open source library to do some of the heavy lifting. This work was done in Linux, with unit tests and CMake to help with porting it to Windows. There is a requirement to have it run on both platforms.
I like Linux, I like CMake, and I like that I can get Visual Studio files generated automatically. As it is now, on Windows everything will compile, it will link, and it will generate the test executables.
However, to get to this point I had to fight with Windows for several days, learning all about manifest files and redistributable packages.
As far as my understanding goes:
With VS 2005, Microsoft introduced side-by-side DLLs. The motivation for this is that before, multiple applications would install different versions of the same DLL, causing previously installed and working applications to crash (i.e. "DLL Hell"). Side-by-side DLLs fix this, as there is now a "manifest file" attached to each executable/DLL that specifies which version should be loaded.
This is all well and good. Applications should no longer crash mysteriously. However...
Microsoft seems to release a new set of system DLLs with every release of Visual Studio. Also, as I mentioned earlier, I am a developer trying to link to a third-party library. Often, these things come distributed as a "precompiled DLL". Now, what happens when a precompiled DLL compiled with one version of Visual Studio is linked to an application using another version of Visual Studio?
From what I have read on the internet, bad stuff happens. Luckily, I never got that far - I kept running into the "MSVCR80.dll not found" problem when running the executable and thus began my foray into this whole manifest issue.
I finally came to the conclusion that the only way to get this to work (besides statically linking everything) is that all third-party libraries must be compiled using the same version of Visual Studio - i.e., don't use precompiled DLLs: download the source, build a new DLL and use that instead.
Is this in fact true? Did I miss something?
Furthermore, if this seems to be the case, then I can't help but think that Microsoft did this on purpose for nefarious reasons.
Not only does it make precompiled binaries unnecessarily difficult to use; if you happen to work for a software company that makes use of third-party proprietary libraries, then whenever those vendors upgrade to the latest version of Visual Studio, your company must do the same or the code will no longer run.
As an aside, how does Linux avoid this? Although I said I preferred developing on it and I understand the mechanics of linking, I haven't maintained any application long enough to run into this sort of low-level shared-library versioning problem.
Finally, to sum up: Is it possible to use precompiled binaries with this new manifest scheme? If it is, what was my mistake? If it isn't, does Microsoft honestly think this makes application development easier?
Update - A more concise question: How does Linux avoid the use of Manifest files?
All components in your application must share the same runtime. When this is not the case, you run into strange problems like asserting on delete statements.
This is the same on all platforms. It is not something Microsoft invented.
You may get around this 'only one runtime' problem by being aware of where the runtimes may bite back.
This is mostly in cases where you allocate memory in one module, and free it in another.
a.dll
extern "C" __declspec(dllexport) void* createBla() { return malloc(100); }   // allocated on a.dll's runtime heap
b.dll
extern "C" __declspec(dllimport) void* createBla();
void consumeBla() { void* p = createBla(); free(p); }   // freed on b.dll's runtime heap
When a.dll and b.dll are linked against different runtimes, this crashes, because each runtime implements its own heap.
You can easily avoid this problem by providing a destroyBla function which must be called to free the memory.
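A hedged sketch of that fix, continuing the example above:
// a.dll -- the module that allocates is also the one that frees
#include <cstdlib>
extern "C" __declspec(dllexport) void* createBla()         { return std::malloc(100); }
extern "C" __declspec(dllexport) void  destroyBla(void* p) { std::free(p); }

// b.dll -- never calls free() on memory it did not allocate
extern "C" __declspec(dllimport) void* createBla();
extern "C" __declspec(dllimport) void  destroyBla(void*);
void consumeBla() {
    void* p = createBla();
    // ... use the buffer ...
    destroyBla(p);   // handed back to a.dll, so it is released on a.dll's own heap
}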
There are several points where you may run into problems with the runtime, but most can be avoided by wrapping these constructs.
For reference :
don't allocate/free memory/objects across module boundaries
don't use complex objects in your dll interface. (e.g. std::string, ...)
don't use elaborate C++ mechanisms across dll boundaries. (typeinfo, C++ exceptions, ...)
...
But this is not a problem with manifests.
A manifest contains the version info of the runtime used by the module and gets embedded into the binary (exe/dll) by the linker. When an application is loaded and its dependencies are to be resolved, the loader looks at the manifest information embedded in the exe file and uses the corresponding version of the runtime DLLs from the WinSxS folder. You cannot just copy the runtime or other modules to the WinSxS folder; you have to install the runtime offered by Microsoft. There are MSI packages supplied by Microsoft which can be executed when you install your software on a test/end-user machine.
So install your runtime before using your application, and you won't get a 'missing dependency' error.
(Updated to the "How does Linux avoid the use of Manifest files" question)
What is a manifest file?
Manifest files were introduced to place disambiguation information next to an existing executable/dynamic link library, or to embed it directly into that file.
This is done by specifying the specific version of dlls which are to be loaded when starting the app/loading dependencies.
(There are several other things you can do with manifest files, e.g. some meta-data may be put here)
Why is this done?
The version is not part of the DLL name for historical reasons, so "comctl32.dll" is named this way in all versions of it (the comctl32 under Win2k is thus different from the one in XP or Vista). To specify which version you really want (and have tested against), you place the version information in the "appname.exe.manifest" file (or embed this file/information in the binary).
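For instance, with MSVC the usual way to request version 6 of the common controls is the stock linker pragma below, which embeds that dependency into the manifest straight from source (shown only as an illustration of how a specific assembly version gets requested):
#pragma comment(linker, "\"/manifestdependency:type='win32' name='Microsoft.Windows.Common-Controls' version='6.0.0.0' processorArchitecture='*' publicKeyToken='6595b64144ccf1df' language='*'\"")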
Why was it done this way?
Many programs installed their DLLs into the system32 directory under the system root. This was done to allow bug fixes to shared libraries to be deployed easily for all dependent applications. And in the days of limited memory, shared libraries reduced the memory footprint when several applications used the same libraries.
This concept was abused by many programmers, who installed all their DLLs into this directory, sometimes overwriting newer versions of shared libraries with older ones. Sometimes libraries silently changed their behaviour, so that dependent applications crashed.
This led to the approach of "distribute all DLLs in the application directory".
Why was this bad?
When bugs appeared, all the DLLs scattered across several directories had to be updated (e.g. gdiplus.dll). In other cases this was not even possible (Windows components).
The manifest approach
This approach solves all the problems above. You can install the DLLs in a central place, where the programmer may not interfere. Here the DLLs can be updated (by updating the DLL in the WinSxS folder) and the loader loads the 'right' DLL (version matching is done by the DLL loader).
Why doesn't Linux have this mechanism?
I have several guesses. (This is really just guessing ...)
Most things are open-source, so recompiling for a bugfix is a non-issue for the target audience
Because there is only one 'runtime' (the gcc runtime), the problem with runtime sharing/library boundaries does not occur so often
Many components use C at the interface level, where these problems just don't occur if done right
The versions of libraries are in most cases embedded in their file names.
Some applications are statically bound to their libraries, so no DLL hell can occur.
The GCC runtime was kept very ABI stable so that these problems could not occur.
If a third-party DLL allocates memory and you need to free it, you need the same run-time libraries. If the DLL has its own allocate and deallocate functions, it can be OK.
If the third-party DLL uses std containers, such as vector, you could have issues, as the layout of the objects may be completely different.
It is possible to get things to work, but there are some limitations. I've run into both of the problems I've listed above.
If a third-party DLL allocates memory that you need to free, then the DLL has broken one of the major rules of shipping precompiled DLLs, exactly for this reason.
If a DLL ships in binary form only, then it should also ship all of the redistributable components that it is linked against and its entry points should isolate the caller from any potential runtime library version issues, such as different allocators. If they follow those rules then you shouldn't suffer. If they don't then you are either going to have pain and suffering or you need to complain to the third party authors.
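A hedged sketch of what such an isolating entry point can look like (the function names are invented for illustration): keep C++ objects out of the exported surface and let the DLL own its allocations.
// Risky: the std::string is built by the DLL's runtime/STL but destroyed by the
// caller's, so object layout and heap would have to match exactly.
//   __declspec(dllexport) std::string GetEngineName();

// Safer: plain C types only, with the DLL keeping ownership of the storage.
extern "C" __declspec(dllexport) const char* GetEngineName();         // points at DLL-owned storage
extern "C" __declspec(dllexport) void ReleaseName(const char* name);  // only needed if the DLL allocates per call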
I finally came to the conclusion that the only way to get this to work (besides statically linking everything) is that all third-party libraries must be compiled using the same version of Visual Studio - i.e., don't use precompiled DLLs: download the source, build a new DLL and use that instead.
Alternatively (and this is the solution we have to use where I work), if the third-party libraries that you need are all built (or available as built) with the same compiler version, you can "just" use that version. It can be a drag to "have to" use VC6, for example, but if there's a library you must use and its source is not available and that's how it comes, your options are sadly limited otherwise.
...as I understand it. :)
(My line of work is not in Windows, although we do battle with DLLs on Windows from a user perspective from time to time; however, we do have to use specific versions of compilers and get versions of third-party software that are all built with the same compiler. Thankfully all of the vendors tend to stay fairly up to date, since they've been doing this sort of support for many years.)