What would be a practical way to test a C++ program on various platforms? Even within an operating system, I would like to test it on different versions (e.g., OS X 10.7, 10.8, 10.9, etc). I have access to a handful of machines that run on either Windows, Linux, or Mac OS X, but have a specific version installed. Assume I can compile the program on that platform.
Create virtual machines (VirtualBox, VMWare Player) for those systems and different versions.
Of course this may depend on specific hardware requested by your software.
The situation is this:
I do not have access to a machine running Linux, only a small embedded platform on which I cannot install an IDE (it runs Linux and is my target), so I have to develop the app on my Windows PC.
The question is: should I use Microsoft libraries, because I am developing in a Windows environment, or should I use Linux libraries, because my target is Linux?
Applications targeting Windows do not work out-of-the-box in a Linux system (see some discussion here https://superuser.com/a/209736).
You could, however, use a Linux guest from the Windows host, through a virtual machine or even docker.
Also, your "little chip target on which you cannot install an IDE" sounds like an embedded platform. Make sure the architecture of the target is the same as your Windows PC's (x86-64). Many embedded platforms use a different architecture (e.g., ARM's AArch64). In that case, use an appropriate cross-compiler that generates code for the target.
I'm working on a game using C++ and relying on legacy features of OpenGL.
I've mostly been doing programming on Windows machines and am now looking into expanding into Linux and Mac OS. As a personal challenge, I'd like to keep my software as backwards-compatible as possible and support as many OSes as possible. With Windows, I've been able to upkeep support for as low as Windows 95 (with Visual Studio giving me some trouble in the process but working it out nevertheless). However, I have next to no knowledge of Mac OS, having never used it myself until just now. Considering my software runs on Windows 95, it certainly does not require any advanced features on functions.
The lowest version of Mac OS X that old versions of Xcode seem to support is 10.1.x. If I were to write software for this version, would it run on more modern versions of macOS? I've read that there are software patches that enable support for some older macOS software on newer versions, but I would rather not inconvenience the end user like that.
I would not mind building two different versions of the software (say, a legacy and modern launcher) but having to build five or ten different versions of the software would be a massive pain and I'd rather avoid that if at all possible.
I apologize for my lack of knowledge in the field, my own research has only yielded very limited information and I would rather not waste weeks on a fruitless endeavor.
TL;DR: I want to write software in Xcode (C++) for macOS that supports as many versions of the OS as possible. If I were to target Mac OS X 10.1, can I expect it to run on modern versions of macOS? If not, how much effort would it take to support as many versions of macOS as possible?
You're going to have to make a decision as to where you're going to cut off support, or you're going to have to build multiple versions of the app in different specially-built build environments.
Modern macOS (10.15) is strictly 64-bit x86-64 only, with no support for 32-bit apps. If you build on 10.15 you can't even produce a 32-bit binary, and anything you do build will be linked against libraries that don't exist on older versions of the OS.
Mac OS X 10.6 was the last version that ran on 32-bit Intel hardware, and it marked the transition to 64-bit. PowerPC support ended even earlier: 10.5 was the last version to run on PowerPC machines (10.6 could still run PowerPC applications through Rosetta).
If you want to support PowerPC machines, which was the only option in the 10.1 days, you'll need a compatible system to build it.
Ecosystem-wise, most Mac users are able to run current versions of the OS. Anything 2012+ is still able to get current OS releases, which is a lot of hardware. Anything before that is a mix of 32-bit and PowerPC hardware, though you won't see a lot of people with systems like that in the wild.
In short, 10.15 is easy, and 10.14 and 10.13 aren't going to be hard. Anything older will be challenging to various degrees, and before 10.6 things get very complicated.
One way to estimate popularity is things like the Steam Hardware Survey where the breakdown looks like this:
10.15: 31.6%
10.14: 34.1%
10.13: 14.8%
10.12: 5.7%
10.11: 3.7%
So the good news is that things drop off pretty steeply after 10.13, and by 10.11 there aren't many users running an OS that old.
I am developing a cross-platform application and I need to determine whether machine B will be able to run the application that is compiled on machine A.
I am using Qt and I already understand that I need to either package the Qt libraries with the application or statically link against Qt itself.
I also understand that something compiled on Windows can't run on Linux.
However, there are still some other vectors that I'm not sure to what extent they matter. Here is the summary of my current understanding:
Affects Portability
Operating System (Windows, Mac, Linux)
Availability of third party libraries (Qt, static vs dynamic linking, etc)
May Affect Portability
Flavor of Linux (Ubuntu, Red Hat, Fedora)
Architecture (32 or 64-bit)
Version of Operating System (Windows 7 vs Windows XP, Rhel5 vs Rhel6)
Instruction type (i386, x64)
Of the May Affect Portability items, which ones actually do? Are there any that I am missing?
All. At least potentially.
If two machines have no binary compatibility (e.g. they run on different architectures, or interface with incompatible systems), then it is impossible to create a single binary that runs on both. (Or... does running a Windows program under Wine on Linux count?)

Otherwise, it depends. You mention third-party libraries: if they're dynamically loaded, they have to be present on the target, but there's always static linking, and there may be ways of deploying the dynamic library alongside your application so that it will be there.

32-bit vs. 64-bit is a difference in architectures: a 32-bit program will not run in a pure 64-bit environment and vice versa. But most modern 64-bit systems make both environments available.

Issues like the flavor and version of the OS are more complex. If you use any functions recently added to the OS, you obviously won't be able to run on machines whose OS predates them. Otherwise, the main reason the low-level system libraries are dynamically loaded is to support forwards and backwards compatibility; I've heard that it doesn't always work, but I suspect that any problems involve some of the rarer functions. (There are limits to this: modern Windows programs will not run under Windows 95.)

There is also the issue of whether various optional packages are installed. Qt requires the X Window System under Linux or Solaris; I've worked on a lot of Linux and Solaris boxes where it wasn't installed (and where there wasn't even a display device).

And there is the question of whether it will run acceptably. It may run on a smaller, older machine than the one on which you tested it, but end up paging like crazy, to the point where it becomes unusable.
If you compile an application on a 64-bit processor, it won't run on a 32-bit processor by default. However, you can pass options to the compiler to generate code for a 32-bit processor. For example, with GCC on a 64-bit machine, passing -m32 produces 32-bit code, and 32-bit code can generally run on a 64-bit machine.
Sources
https://stackoverflow.com/a/3501880/193973
Different flavors of Linux or versions of operating systems may have different system libraries. For example, the GetModuleFileNameEx function is only available in Windows XP and up. As long as you watch what functions you use though, it shouldn't be too much of a problem.
The x64 architecture is backwards compatible with x86 ("32-bit"), so programs compiled for the x86 will run on x64 machines, but not vice versa. Note that there are other, less common architectures too, such as the ARM and PowerPC.
I can immediately think of three things that interfere with portability. But if your code is in a file format understood by both systems, an instruction set understood by both systems, and only makes system calls understood by both systems, then your code should probably run on both systems.
Executable File format
Windows understands PE, COFF, COM, .NET assemblies, and others
Linux by default understands ELF, a.out, and others
Mac OS X uses Mach-O
Linux has extensions to run the Windows/Mac formats too (Wine, Mono...)
Instruction set
Each processor is a little different, but
There is usually a "lowest common denominator" which can be targeted
Sometimes compilers will emit the code for a function twice:
one version with the LCD instruction set, one with a "faster" instruction set,
and the code will pick the right one at runtime.
x64 can run 64-bit and 32-bit code, but not 16-bit.
x86 can run 32-bit and 16-bit code, but not 64-bit.
Operating System calls
Each OS usually has different calls
This can be avoided by using a dynamic library, instead of direct OS calls
Such as... the C runtime library.
Or the CLR for .Net.
The doubt
I have written some code in Microsoft Visual C++ 2010 Express as so:
#include <iostream>
#include <cstdlib>   // for system()
using namespace std;

int main()
{
    system("cls");
    char name[20];
    cout << "\nEnter your name:";
    cin.getline(name, 20);
    system("pause");
    cout << "\nYour name is:" << name;
    system("pause");
    return 0;
}
And now I have compiled it and sent it to a friend on a Linux machine. He downloads the DOSBox software and then runs this program.
THE DOUBT
Will it run as it does on my machine or will this create any problem?
why I am asking this?
I recently downloaded a Linux live CD and ran it on my machine. I can't install it on this machine as it is a shared PC. Anyway, I typed cls into the terminal and there was no response. I typed pause; again there was no response. So it set me wondering whether the command "cls" that I am passing to system in the above code will really have any effect on a Linux machine.
There are a few reasons why this program won't work on other machines - I will summarise the two main ones:
You use system commands (cls, pause) which are not supported by other operating systems. If you attempt to run these commands on a different OS, the OS will complain that it doesn't understand them and the calls will fail.
(And probably more importantly,) the Windows executable you have created is in Microsoft's Portable Executable (PE) format. Linux can only read executables in ELF format, and Mac OS X uses the Mach-O format.
These two points are worth discussion in their own right, and as Joachim pointed out in the comments, Wine is quite good at providing a Windows environment on Linux, so this may be an option for program compatibility.
EDIT: I should add here that Point 1 assumes that Point 2 has been overcome. Point 2 is the reason executables on one OS just plain "don't work" on other operating systems.
Response to comment:
Generally, yes, ELF files are the standard for all Linux distros (there may be a few rare exceptions). Similarly, PE files are the standard for all Windows versions. Provided you have a relatively up to date CPU, then if you compile an executable on one Linux distro, then it should work on others.
The exception here is, if you compile the program on a machine with a recent CPU, and wish to run it on a machine with a very old CPU, the old CPU may not support some of the instructions that the compiler creates. However, these days just compiling a program with the default settings generally works on all (Intel) CPUs. If you know for a fact that your target machine uses a very different or older CPU, you can add the -march=... compiler option so the compiler generates instructions that will definitely work on the target machine.
Finally, DOSBox is not a Windows Emulator, it is a DOS emulator. The two systems, despite their history, are quite different. DOSBox is not designed to run native Windows applications, it is designed to run native DOS applications (most of which are abandonware these days). If you'd like to run DOS programs on Linux such as Dangerous Dave (one of my nostalgic favourites), then you can. However, if you wish to run Windows applications, you will need an emulator designed for this purpose, such as WINE.
For reference, DOS uses the obsolete MZ Executable format.
pause and cls most likely will not work directly in other OSes because these are Windows/DOS-specific commands.
If you remove the DOS-specific commands and make the program generic, then the EXE file built in Windows can most probably be executed in Linux or MacOS through Wine. Please see http://www.winehq.org/about/ and http://wiki.winehq.org/MacOSX . I'm saying "most probably" because you still need to try it out to see if there are problems.
If you run your EXE executable inside a virtualized environment running Windows in it like Virtual Box, then it will work.
On Linux, the command to clear the screen is clear. Is that what you're really intending to do?
I'm developing an SDL application in C++, and some of my consumers have asked for a version that runs on Mac OS X. I am wondering if anyone knows of a good cross-compiler for Mac OS X targets, and maybe a Mac OS X emulator (maybe a virtual HDD for VirtualBox?) so that I can actually test it myself. The emulator is not 100% necessary, though, as it's probably illegal and I can understand if nobody's willing.
I'm using a PC (Windows XP) for my host machine, and I don't have the funding to go and purchase a Mac, sadly.
The easiest, and most common, solution is the other way: use a Mac platform with Windows installed either as dual boot or in a virtual machine.
That way you will benefit from 100% of both worlds and never be bothered whenever a Mac system update is delivered.
Bonus: You can install Linux as well.