I'm new to compiling code for Linux. The target is probably Debian 5.0. I need to compile my C++ code for it so that it is ready to run, meaning the other person can easily run the program just by clicking on it, like in Windows.
Can anybody help?
I use VirtualBox for this. It's easy and convenient. You can run multiple Linux distros and multiple versions of Windows, provided you have the proper licenses. You can also run Subversion, etc., on each virtual machine so that you can sync your changes across all of them when building.
Assuming you want to be able to compile something on Windows and have it work on any Linux machine, that's simply not possible. Debian and Ubuntu both support many architectures, many of which have absolutely no binary compatibility. If you know what type of hardware your friend has you can build a binary targeted to that architecture.
If you want a quick and dirty answer, you can build for i386, since a 64-bit machine can probably still run it fine (not guaranteed, though; the 32-bit runtime libraries have to be present).
Once you compile it, you can easily create a shortcut on the desktop (or add an entry to a menu) to launch your program via a script; something like:
#!/bin/bash
/path/to/your/program
Save it as launch.sh, for example, and give it ugo+x permissions like so:
chmod ugo+x launch.sh
When you create the shortcut, you can associate an icon with your script exactly the same way you would in Windows.
UPDATE
If you are sending the compiled program to your friend (let's assume via email), you can simply instruct your friend to open a terminal window in the directory where he downloaded your file and run the following:
chmod ugo+x your_program
./your_program
Or you can send him 2 files: one with your program and one with a "launch" script as I described above. Since both files will be downloaded to the same directory, you can change your launch script to:
#!/bin/bash
./your_program
When he clicks on launch.sh, your program will be executed.
Is there any way to compile programs remotely on Linux with PuTTY and Sublime? I'm running Windows and my C++ compiler is different, but my programs are graded on how they run when compiled on the server.
I use WinSCP to copy files to a Unix box to compile/run if I am editing on Windows. Or you could use pscp (part of PuTTY, but not graphical):
https://winscp.net/eng/download.php
This may be a long way of doing it, but it is what I have always done when faced with this problem. (This assumes that your code is saved on your Windows machine and you want to compile it on your Linux machine.)
In the shell, use the touch command to create a new file:
$ touch newFile.cpp
Then use nano to open the file and copy and paste your code into the shell.
$ nano newFile.cpp
Then just compile it (for example with g++) and run the resulting executable.
There's probably a faster way to do it, but this works.
I'm a beginner with programming, and I've been doing work in C/C++ in Ubuntu. When I send something to cin/cout/cerr or printf/scanf, or take arguments from the command line, this all happens in the Linux terminal in Ubuntu.
Now if I want to take these same programs (very simple, beginner-level programs) and run them in Windows, how do I run them from the Windows command line? A previous course I took had us download Cygwin to simulate the Linux command line in Windows, but what if I want to just run the program from the ordinary Windows command line? Is that possible, and does it require modification of the software?
You can cross-compile the program for Windows from Linux.
On Ubuntu, the process is basically this:
sudo apt-get install wine mingw32 mingw32-binutils mingw32-runtime
...
i586-mingw32msvc-g++ -o myProgram.exe myProgram.cpp
Easy, right? Google for "ubuntu cross-compile windows," there's a ton of information out there.
It's exactly the same. You run cmd and write the command (almost) exactly as you would in Linux.
For example, if you build your program into an executable named program, you would run it in Linux like this:
./program --option1 -o2 file1 file2
On Windows, you first have to give the output a .exe suffix, and then in cmd you would write:
program.exe --option1 -o2 file1 file2
Basically, cmd is Windows' terminal. It's nowhere near as good as the Linux terminal, but it's all you get without installing additional software.
cin/cout/cerr and printf/scanf/fprintf(stderr, ...) use the standard C pre-opened streams stdin, stdout and stderr, which are defined on both Linux and Windows. Once you run the application from Windows' terminal (cmd), you see the input/output exactly as you would in the Linux terminal. I/O redirection is also very similar.
cin and cout, and printf and scanf, work much the same in Windows as they do in Linux. (I'm pretty sure cerr does too, but that one I'm not 100% sure about. At the very least, though, it's there and works.) The biggest difference is that Windows typically won't expand wildcards (stuff like *.txt) before running your program; you have to do that yourself in most cases.
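To see that difference for yourself, here is a minimal sketch (the file name is just illustrative) that echoes its arguments to stdout and a count to stderr; run it as prog *.txt on both systems and compare what argv actually contains:

// argecho.cpp -- prints each command-line argument on its own line (stdout)
// and a count on stderr. On Linux the shell expands *.txt before the program
// starts; on Windows cmd passes the literal string "*.txt" through unchanged.
#include <iostream>

int main(int argc, char* argv[])
{
    for (int i = 0; i < argc; ++i)
        std::cout << "argv[" << i << "] = " << argv[i] << "\n";
    std::cerr << argc << " argument(s) received\n";   // stderr works the same on both
    return 0;
}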
Basically, as long as the app doesn't use anything specific to Linux or GCC, you can just recompile it on the target machine with whatever compiler you like in order to test it.
If you don't want to recompile...well...good luck with that. Even Cygwin won't run native Linux binaries. You'd need a virtual machine with Linux on it.
Well, if your program is portable and not using any features specific to Linux, you would have to compile it from source on Windows to make it work on Windows.
You would need the GCC toolchain for Windows to do that, which you can get from the TDM-GCC homepage. It's MinGW internally, and the installer allows you to choose the features you want to install as well as the target directory for installation. It also adds itself to the Windows PATH so that the compiler commands are available from the shell prompt.
I have to do this kind of compilation regularly and it works without any issues for me. There is one change you must make if your project uses Makefiles: for the target binary, such as <target>.out on Linux, you would have to edit your Makefile and rename it to <target>.exe so that it runs on the command line. If you are not using Makefiles and are just doing gcc <file.c>, an a.exe is produced by default (similar to a.out on Linux).
Say you have this program code you want to run on UNIX and Windows:
#include <stdio.h>

int main()
{
    printf("Hi\n");
    return 0;
}
When you type a command in a UNIX shell, it will look something like this:
/usr/home/bobby# gcc main.c
/usr/home/bobby# ./a.out
Hi
/usr/home/bobby#
On Windows you'd have to first choose your development environment/compiler. Without going to something like Cygwin, you could install the Windows SDK or Visual Studio (although with the latter you might just want to develop in the GUI).
Start -> Run -> cmd /k ""C:\Program Files (x86)\Microsoft Visual Studio 10.0\VC\vcvarsall.bat"" x86
C:\Windows\system32>cd c:\bobby
C:\bobby>cl main.c
C:\bobby>main.exe
Hi
C:\bobby>
When a C program is compiled into an executable, this is done in a system-dependent way. On Ubuntu the ELF format is used, and on Windows we have PE.
When you start a process, the ELF or PE header is read, giving a map of how to allocate memory and where to put the various pieces of the process in the virtual memory table. The process is then linked up with the dynamically loaded libraries it shares with other processes using the same libraries (Linux .so, Windows .dll); if those libraries are already in physical memory they are reused, otherwise they are loaded. Static libraries (Linux .a, Windows .lib) were already linked into the executable when it was built. This is very simplified.
Memory restrictions and the like are inherited from the parent process. Environment variables are put into the running environment for the process: paths, arguments, etc. Then main() is placed on the stack and called.
Everything that happens before main is called, how linkage is resolved, and many other things depend on the system. This is why you simply can't run an executable compiled on Linux on Windows.
Using Cygwin you are simply creating an environment where those POSIX interfaces and linkages are emulated on top of Windows, so the same source can be rebuilt and will work; the result is still a native PE executable, not ELF. To get something linked for the native Windows command line you have to compile it for Windows; on that matter I see there are lots of answers already.
ELF and PE systems also have different ways of handling environment variables, and what those variables are differs as well; file (wildcard) expansion, for example, is handled differently. However, both running processes have the default streams stdin, stdout and stderr, even though beneath your C code they are not the same.
It is like driving a diesel versus a petrol car: much is the same, but under the hood quite a few things are different. Be aware that, for example, signals are handled differently on Windows.
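For instance, the portable part is small: signal() exists on both, but Windows only emulates a handful of signals through its C runtime (there is no sigaction(), kill(), or SIGUSR1). A minimal sketch, assuming you stick to that common subset (the file name is illustrative):

// sig_demo.cpp -- registers a SIGINT handler using only the portable <csignal> API.
// Ctrl+C raises SIGINT in the console on both Linux and Windows.
#include <csignal>
#include <cstdio>

volatile std::sig_atomic_t got_sigint = 0;

void on_sigint(int)
{
    got_sigint = 1;    // handlers should only set a flag like this
}

int main()
{
    std::signal(SIGINT, on_sigint);
    while (!got_sigint) { /* busy-wait, just to keep the demo short */ }
    std::puts("caught SIGINT, exiting cleanly");
    return 0;
}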
I asked this question in a more general design context before. Now, I'd like to talk about the specifics.
Imagine that I have app.exe running. It downloads update.exe into the same folder. How would app.exe copy update.exe over the contents of app.exe? I am asking specifically in a C++ context. Do I need some kind of 3rd mediator app? Do I need to worry about file-locking? What is the most robust approach to a binary updating itself (barring obnoxious IT staff having extreme file permissions)? Ideally, I'd like to see portable solutions (Linux + OSX), but Windows is the primary target.
Move/Rename your running app.exe to app_old.exe
Move/Rename your downloaded update.exe to app.exe
With the next start of your application the update will be used
Renaming a running (i.e. locked) DLL/EXE is not a problem under Windows.
On Linux it is possible to remove the executable of a running program, hence:
download app.exe~
delete running app.exe
rename app.exe~ to app.exe
On Windows it is not possible to remove the executable of a running program, but it is possible to rename it (see the sketch after these steps):
download app.exe~
rename running app.exe to app.exe.old
rename app.exe~ to app.exe
when restarting remove app.exe.old
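A minimal C++ sketch of those Windows steps, assuming the update has already been downloaded as app.exe~ next to the running app.exe (the file names follow the steps above; the function names are just illustrative):

// update_swap.cpp -- rename-based self-update on Windows. A running executable
// can be renamed but not deleted, so we move it aside and slot the new file in.
#include <windows.h>
#include <cstdio>

bool swap_in_update()
{
    // Step 1: rename the running app.exe to app.exe.old (allowed even while locked)
    if (!MoveFileExA("app.exe", "app.exe.old", MOVEFILE_REPLACE_EXISTING)) {
        std::fprintf(stderr, "renaming the running exe failed: %lu\n", GetLastError());
        return false;
    }
    // Step 2: rename the downloaded app.exe~ to app.exe
    if (!MoveFileExA("app.exe~", "app.exe", MOVEFILE_REPLACE_EXISTING)) {
        std::fprintf(stderr, "moving the update into place failed: %lu\n", GetLastError());
        return false;
    }
    return true;   // the new binary is picked up on the next start
}

void cleanup_old_version()
{
    // Step 3, run at the next start: remove the leftover old binary, if any
    DeleteFileA("app.exe.old");
}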
It's an operating system feature - not a C++ one.
What OS are you on?
On Windows, see the MoveFileEx() function; on Linux, simply overwrite the running app (see "Replacing a running executable in Linux").
On Windows, at least, a running application locks its own .exe file and all statically linked .dll files. This prevents an application from updating itself directly, at least if it wants to avoid a reboot (if a reboot is OK, the app can pass the MOVEFILE_DELAY_UNTIL_REBOOT flag to MoveFileEx and is free to 'overwrite' its own .exe, as the move is delayed anyway). This is why applications typically don't check for updates in their own .exe; instead they start up a shim that checks for updates and then launches the 'real' application. In fact, the 'shim' can even be provided by the OS itself, by virtue of a properly configured manifest file. Visual Studio-built applications get this as a prefab, wizard-packaged tool; see ClickOnce Deployment for Visual C++ Applications.
The typical Linux app doesn't update itself because of the many, many flavors of the OS. Most apps are distributed as source, run through some version of auto-hell to configure and build themselves, and then install themselves via make install (all of which can be automated behind a package). Even apps that are distributed as binaries for a specific flavor of Linux don't copy themselves over; instead they install the new version side by side and then update a symbolic link to 'activate' the new version (again, package management software may hide this).
OS X apps either fall into the Linux bucket if they are of the POSIX flavor, or nowadays fall into the Mac App Store bucket, which handles updates for you.
I would say that rolling your own self-update will never reach the sophistication of these technologies (ClickOnce, RPMs, the App Store) or offer the user the expected behavior with respect to discovery, upgrade and uninstall. I would go with the flow and use these technologies on their respective platforms.
Just an idea to overcome the "restart" problem: how about making a program that does not need to be updated? Implement it with a plugin structure, so it is only an update host which itself loads a .dll file containing all the functionality your program needs and calls the main function there. When it detects an update (possibly in a separate thread), it closes the DLL handle, replaces the file and loads the new one.
This way your application keeps running while it updates itself; only the DLL file is reloaded.
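A rough C++ sketch of such a host, assuming a hypothetical core.dll that exports a run_app() function and returns a special code when it wants to be reloaded (the DLL name, export name and return-code convention are all made up for illustration):

// plugin_host.cpp -- tiny update host: loads the real application from a DLL,
// and reloads it when the DLL asks for it, so the DLL file can be replaced.
#include <windows.h>
#include <cstdio>

typedef int (*run_fn)(void);

int main()
{
    for (;;) {
        HMODULE core = LoadLibraryA("core.dll");
        if (!core) {
            std::fprintf(stderr, "cannot load core.dll\n");
            return 1;
        }

        run_fn run = (run_fn)GetProcAddress(core, "run_app");
        int rc = run ? run() : 1;   // by convention here, 2 means "please reload me"
        FreeLibrary(core);          // the DLL file is now unlocked and can be overwritten

        if (rc != 2)
            return rc;              // anything other than a reload request: exit
        // At this point an updater can replace core.dll before the next iteration.
    }
}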
Use a separate updater executable, like many other apps do (a sketch of the hand-off follows these steps):
Download new version.
Schedule your updater to replace the app with the new version.
Close main app.
Updater runs and does the work.
Updater runs new version of your app.
Updater quits.
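A minimal Windows-only sketch of steps 3 and 4, assuming a hypothetical updater.exe that takes the main app's process ID, waits for it to exit, swaps the files and relaunches it (the name and argument convention are illustrative):

// launch_updater.cpp -- hands control over to a separate updater process.
#include <windows.h>
#include <string>

bool launch_updater()
{
    // Pass our own PID so the updater can wait for us to exit before swapping files.
    std::string cmd = "updater.exe " + std::to_string(GetCurrentProcessId());

    STARTUPINFOA si = { sizeof(si) };
    PROCESS_INFORMATION pi = {};
    // CreateProcess may modify the command-line buffer, so pass a writable one.
    if (!CreateProcessA(NULL, &cmd[0], NULL, NULL, FALSE, 0, NULL, NULL, &si, &pi))
        return false;

    CloseHandle(pi.hThread);
    CloseHandle(pi.hProcess);
    return true;   // the caller should now exit so the updater can replace the exe
}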
I come from the GNU/Linux world, but recently I have had to work on a Windows system, and I want to be able to compile my C/C++ console programs on it. The problem is I don't have administrative privileges to install anything.
I looked for portable apps. I found gVim and MinGW, but I don't know how to make them work together on a flash drive. I also found a Vim plugin called msysportable that's supposed to do the job, but I don't know how.
So my question is: how can I make a portable Windows C/C++ development environment using gVim?
(Don't tell me to use Code::Blocks or Visual Studio; I have those installed, but I want Vim.)
Copy your development environment to your flash drive. Create a batch file to set up environment variables for the paths to include dirs, lib dirs, executables, etc. Then use this environment in a cmd session with console commands. If you have MSVC installed on your own system, there is a batch file called vcdirs.bat that does this on your system; take it as an example of how to make a portable environment. By the way, INSTALLING your tools at a client site may be a license violation; the portable environment is not, as long as you do not install it.
I am very disappointed with my school's Linux server when doing my homework on it.
The reason is that my homework requires making a GUI application.
All the tools that I have are:
- ssh from my local machine to school machine
- gcc/g++ in my school machine
I have been thinking about and trying out different solutions for a week.
I still can't figure out how to add a GUI to my application.
Here are some solutions I tried:
- Install some graphics library (SDL, ncurses, ...), but the school computer does not allow me to install anything because I'm not the root user
- Compile against X11 to produce an X GUI application, then run it through ssh (tunneling). This does not work either, because the school computer does not have the X11 header files
So, what can I do? Does anybody have a suggestion?
I will thank you a million times if you can help with a solution.
Thank you very much.
tsubasa
It should be possible to install most things, like ncurses or even X11, in user space (in your home directory), if you install them from source. With a GNU package, you just use --prefix= as an argument to configure, like this:
./configure --prefix=/name/of/directory/to/install/into
I'm not sure about the other packages.
Without a GUI library to link against, you won't be able to develop a C/C++ app on that server. It seems to me that you have a few options:
1) Develop this GUI app someplace else. If it has to be in Linux, and you're a Windows/Mac user, you can install Ubuntu (or some other Linux Distro) on a Virtual Machine and get a full featured environment.
2) Contact the Linux administrator, explain the homework assignment, and convince them to install a GUI package for you. (It may help to have your professor also contact the Linux administrator.) (If you don't know who the Linux admin is, try emailing root@linuxbox.)
3) Bend the rules on what a "GUI" environment is. For example, can your C/C++ app output HTML files for a GUI-like experience through a web-browser?
4) Try to install some sort of GUI package inside your account on the server. This will likely fail unless you are very, very good at administering a Linux box and have hand-built packages before.
You could do it with ncurses.
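For a taste of what that looks like, here is a minimal curses sketch (the file name is illustrative); it assumes ncurses is installed somewhere you can compile against, e.g. a home-directory prefix whose include and lib directories you pass to the compiler, linking with -lncurses:

/* hello_curses.cpp -- draws one line in the terminal and waits for a key press */
#include <ncurses.h>

int main()
{
    initscr();                       /* start curses mode and take over the terminal */
    printw("Hello from ncurses!");   /* write into the curses screen buffer */
    refresh();                       /* push the buffer to the real screen */
    getch();                         /* wait for a single key press */
    endwin();                        /* restore the normal terminal */
    return 0;
}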
Perhaps you could ditch the school server and use VirtualBox to run a Linux VM locally on your machine and develop on that. It's free.
From "INSTALL" file in ncurses source archive:
The package gets installed beneath the --prefix directory as follows:
In $(prefix)/bin: tic, infocmp, captoinfo, tset, reset, clear, tput, toe
In $(prefix)/lib: libncurses*.* libcurses.a
In $(prefix)/share/terminfo: compiled terminal descriptions
In $(prefix)/include: C header files
Under $(prefix)/man: the manual pages
Note that the configure script attempts to locate a previous installation of ncurses, and will set the default prefix according to where it finds the ncurses headers.
Do not use commands such as
make install prefix=XXX
to change the prefix after configuration, since the prefix value is used for some absolute pathnames such as TERMINFO. Instead do this:
make install DESTDIR=XXX
So I'd recommend using "make install DESTDIR=XXX", where XXX is a location where you have write permissions.
HTH