I work on a legacy system with old roots. We're moving from Red Hat Enterprise Linux 6 to 8, and from GCC 4.4.7 to 8.3.1. I'm currently fixing up compilation problems and have hit a roadblock.
The problem I've run into is that the sigvec struct and function are no longer defined in signal.h on the newer system, and I can't find definitions for them in any other include file. The man pages on the new system indicate that signal.h is where they should be.
The man page also recommends using the POSIX signal API, but given the legacy code and my lack of experience in this area of a fairly large product code base, I'd prefer to keep the status quo. This is low-level stuff; I'll dig into it if I have to, but if there's an easy workaround, I'd much prefer that.
Is there a package that we should install, or are these buried somewhere in an include file that I'm just not finding? I've searched Google a bit, but haven't found anything on this.
We received word back from Red Hat Support that sigvec support has been removed recently. Rewriting the code appears to be the only option.
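For anyone facing the same port, the mechanical part of the mapping is small. Here is a minimal sketch, not our actual code; on_signal and install_handler are made-up names, and the sv_handler/sv_mask/sv_flags fields of the old struct sigvec correspond roughly to sa_handler/sa_mask/sa_flags in struct sigaction.

    #include <signal.h>
    #include <string.h>

    /* Hypothetical handler, standing in for whatever the legacy code installs. */
    void on_signal(int signo)
    {
        (void)signo;  /* ... legacy handler body ... */
    }

    /* Old BSD-style installation that no longer compiles on RHEL 8:
     *
     *     struct sigvec vec;
     *     vec.sv_handler = on_signal;
     *     vec.sv_mask    = 0;
     *     vec.sv_flags   = 0;
     *     sigvec(SIGTERM, &vec, NULL);
     *
     * The rough POSIX equivalent: */
    int install_handler(void)
    {
        struct sigaction sa;
        memset(&sa, 0, sizeof sa);
        sa.sa_handler = on_signal;  /* sv_handler -> sa_handler */
        sigemptyset(&sa.sa_mask);   /* sv_mask of 0 -> empty blocked-signal set */
        sa.sa_flags = 0;            /* sv_flags -> sa_flags; the flag values differ */
        return sigaction(SIGTERM, &sa, NULL);
    }

The non-mechanical part is reviewing the flags at each call site, since sigvec and sigaction flag values and semantics (e.g. restarting of interrupted system calls) don't map one-to-one.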
Around 2006-7, when I was quite young, I had a Windows XP machine with a legacy C++ compiler and IDE installed. I am now trying to remember what the product was called.
All I remember is that the editor opened in text mode and was obviously designed for DOS. The name "PGCPP" rings a bell, but all my googling turned up was PGI Compilers & Tools, who specialise in compilers for parallel computing. I downloaded an old release of theirs from 2000, and it contains their proprietary compiler alongside a Cygwin environment, including Vim and Emacs. This is clearly not the IDE I remember.
I remember downloading it for free, legally. It may have been a trial version, I suppose, but I do not remember.
Does anyone with C++ experience spanning around 15-20 years have any idea what product I used back then?
EDIT: The software was called DJGPP and it seems to still be maintained.
I asked on BetaArchive and somebody, somehow, had used that software before!
The software I remember is called DJGPP. It is a suite of GNU programs, including GCC, ported to DOS.
First off, I am very new to programming but have taken an interest in it. I have successfully built a simple C++ console database program for Windows, which can input, edit, and delete entries.
I am relying less and less on Windows for day-to-day stuff. I had an old HP netbook which was impossible to use with Windows, but I put a Linux distro on it and it works like a charm.
Since I still use Windows sometimes, and the program was built for Windows, I am wondering whether the same code can be used to compile a Linux program. I could use WINE to run it, but I would prefer running something native to Linux. Is this possible with the same code, or would I have to make a separate Linux version of it?
I would assume that, since you are new to programming, you did not make the extraordinary effort to make your code portable across platforms. That takes a significant skill set, especially if you are accessing external resources such as a database. So the answer is that you will probably have to rewrite parts of it for Linux, specifically the database interface.
I guess that you want your C++ code to be compilable both on Linux and on Windows. You'll need operating-system-specific compilers for that (a different one on Linux than on Windows).
I am wondering if the same code can be used to compile a Linux program?
The program that compiles your C++ code is called a compiler. On Linux you will use GCC as the g++ command (which you could even customize with MELT, but that is not for newbies), or Clang/LLVM as the clang++ command. You'll also use a build tool like make. Be sure to install a recent version (at least GCC 4.9 or Clang 3.5 as of the start of 2015) to get good C++11 support. Focus on learning C++11 (or C++14), not some earlier variant, so use a C++11 compiler.
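To make that concrete, here is a minimal, hypothetical hello.cpp you can use to check your toolchain, with the compile commands as comments (both compilers accept the same flags):

    // hello.cpp - a minimal C++11 toolchain check.
    // Build with either compiler:
    //     g++     -std=c++11 -Wall -Wextra -o hello hello.cpp
    //     clang++ -std=c++11 -Wall -Wextra -o hello hello.cpp
    #include <iostream>
    #include <vector>

    int main()
    {
        std::vector<int> values{1, 2, 3};   // brace initialization: C++11
        for (int v : values)                // range-based for loop: C++11
            std::cout << v << '\n';
        return 0;
    }

If this compiles and prints 1, 2, and 3 on separate lines, your compiler's C++11 support is enabled correctly.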
I don't know Windows, so I cannot recommend a good C++ compiler for it (but I have heard of MinGW, Cygwin, and Microsoft Visual C++; look also into recent Clang...).
You might be interested in cross-platform C++ framework libraries like Qt or POCO (and perhaps also SQLite for the database-related parts). They will help you write C++ that is usable on both systems (after recompilation).
BTW, you can always encapsulate your system-specific code with preprocessor directives à la #if LINUX; take care to put all the OS-specific (or OS-related) code in a few source files, as in the sketch below.
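Here is a minimal sketch of that encapsulation. Note that the macros the compilers actually predefine are __linux__ and _WIN32; a bare LINUX macro like the one above is something you would have to define yourself in the build system. The function and paths are made up for the example:

    #include <string>

    // One small function hides the OS difference; everything else stays portable.
    std::string default_config_path()
    {
    #if defined(__linux__)
        return "/etc/myapp.conf";                   // hypothetical Linux location
    #elif defined(_WIN32)
        return "C:\\ProgramData\\myapp\\app.conf";  // hypothetical Windows location
    #else
    #   error "unsupported platform"
    #endif
    }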
It could happen (and I wish it for you) that you grow fond of Linux and will, in a few months, prefer to code for Linux only (you'll then install Linux on all your machines). BTW, study the source code of existing free software you like and use on Linux; that will teach you a lot.
The advice I have given in earlier answers is still relevant today when coding on Linux. Read also something about porting & portability, and Advanced Linux Programming.
Firstly, please forgive my ignorance regarding these matters; I have searched and not yet found a comprehensive answer.
I plan on learning how to develop for Windows, but I am very fond of the GNU toolchain and don't really want to move on to big environments like Visual Studio until I feel more comfortable with the underlying basics.
From what I understand, one can download the Windows SDK, which contains the headers and libraries needed to build native Windows applications.
Is the SDK literally just a collection of libraries and headers? If so, as my logic goes, it should be possible to point MinGW towards these libraries/headers, and simply build as normal.
When I build using Visual Studio, I can't see what preprocessor directives are being defined, what is being linked in, and so on. As I am still learning, I like to know exactly what is going on, preferably by having to define and link things manually myself. Hence the question.
So, what I want to know: is my logic correct?
Again, apologies if the question is rudimentary; I am still learning.
P.S. I am planning to develop Windows applications in a Windows environment, so this is not a question about cross-compilation.
Thanks!
MinGW is not compatible with the official Windows SDK, one of the reasons being that the SDK contains many VS-specific things (as opposed to the GCC base of MinGW). MinGW has adapted many of the necessary files, and for many programs this is enough.
You don't need to know the VS project settings for a program; MinGW is still GCC at its core and is used as such. If you can compile programs with GCC on Linux, learning how to use MinGW won't be hard.
If you need functions/structures/etc. which are not yet part of MinGW's adapted headers, you're out of luck, short of doing the adaptation yourself, which can be anything from very easy to very hard, depending on the case. Additionally, proper thread usage is a bit quirky (it has some "hidden" pitfalls which could go unnoticed in an actual program for years, but then...).
(While this is a disadvantage compared to VS, you'll get C++11/14 support, while VS hasn't even finished implementing C++11, plus better optimization in many cases, etc.)
If you're choosing what exactly to download, look at MinGW-w64 instead of the "original" old project. The original project has more or less stalled, has poor library support compared to the w64 fork, and has no 64-bit compiler. (And don't misunderstand the "w64": it can be used for 32-bit programs too.)
After reading many questions on here, I decided to give Clang a go, and installed the SVN version on Ubuntu 12.04 (64-bit). I was expecting issues, but it all compiled smoothly with no warnings.
I noticed, though, that when re-running the configure script, if clang/clang++ is in your path it will be chosen over gcc/g++ for Clang's own compilation. Is it a good idea to recompile LLVM/Clang with itself? I know this is absolutely standard with GCC, but I've read that Clang's C++ implementation isn't quite good enough yet (maybe this is out-of-date info...).
Clang has been self hosting for a few years. Losing that ability would be a serious regression.
Clang's current C++ support is quite good. Even much of C++11 is already available for your use.
If you want to be safe, stay on a stable branch.
I was recently tasked with performing a feasibility study on switching from DOS to Linux as the OS for running our industrial control software (developed internally). In a nutshell, I have been restricted to using Ubuntu 8.04 (with a vendor-supplied kernel upgrade providing drivers for the hardware on the board). As this release is no longer supported, I am unable to update or install software, meaning that I am stuck using GCC 4.2. I want to be able to use C++ and preferably the Boost libraries, but currently it seems I will not be able to.
Basically, I am asking how companies/professionals go about using Linux as a development environment. Is what I described above a common occurrence? Do you simply pick a version and a compiler and stick with them throughout the product lifetime to ensure that the development environment doesn't change too much, or can you freely upgrade the kernel, compiler, etc. as you go along? Is it common to be constrained by what a particular vendor can provide? Would anyone be prepared to give their opinion as to whether Ubuntu 8.04 is a suitable choice of OS for development of industrial control software?
I am not a Linux expert at all, but my research and experimentation so far are leading me to the conclusion that I should abandon the Linux approach and use DOS. Our company is very small and has no Linux knowledge, yet for personal career reasons I have no interest in learning a redundant technology like DOS.
I realise this is not exactly a yes/no type question, but any responses will be gratefully received.
GCC 4.2 has no C++11 support, but its C++03 support should be good, and you should be able to find a version of Boost that works with it quite easily.
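As a small illustration of how far C++03 plus Boost gets you on that compiler, here is a sketch using boost::shared_ptr, which predates std::shared_ptr (the Sensor class is made up, and the exact minimum Boost version is worth checking against the release notes):

    // C++03 code that GCC 4.2 accepts; Boost supplies the smart pointer
    // that the pre-C++11 standard library lacks.
    #include <boost/shared_ptr.hpp>
    #include <iostream>

    struct Sensor {
        void poll() const { std::cout << "polling\n"; }
    };

    int main()
    {
        boost::shared_ptr<Sensor> s(new Sensor);  // no std::make_shared in C++03
        s->poll();
        return 0;  // the Sensor is released automatically when s goes out of scope
    }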
Ultimately, Linux has many upsides you won't find in DOS, for example a flat address space instead of real-mode segmentation, virtual memory, and other things that will make it easier and faster to develop software, not to mention the additional libraries you might need, since absolutely nobody will support DOS today.
With Linux-based systems there's not much reason to stick with a fixed OS + toolchain version, because backwards compatibility is taken very seriously in the Unix world. Sometimes it is good to target a certain fixed system, but frankly such cases are rather rare, and even then development can be done on up-to-date systems as long as testing is done on the target machine/platform.
Basically, you could just upgrade to, for example, Ubuntu 12.04 LTS (long-term support) for development and stick with it; it is very unlikely that there would be any sort of incompatibility problem on the target platform/machine.
Libraries and such tend to change between Linux distros, between new versions of the same distro, and across other *nix OSes.
I once worked on a C++ application that had to run on both Windows and RHEL. I was the 'Linux guy' on the team, so I got to deal with coaxing all the open-source Linux libraries we were using to build and work on Windows (using Cygwin), and with getting the latest changes made by the devs working on Windows to work on Linux.
Midway through development, we upgraded to a newer version of RHEL. It was not a fun experience. Library versions had changed, some had been removed in favor of other 'equivalent' libs, etc. Shaking out all the problems caused by the change in GCC versions took a little while too (granted, the newer GCC was a bit less forgiving and exposed some things in our code that probably weren't quite right anyway).
A couple of days before a big demo, management informed us that the app needed to run on Solaris as well. That was not a trivial task -- Solaris is NOT Linux. At one point they hinted at wanting it to run on IRIX. Glad that didn't happen.
I would recommend that you pick a specific version of a Linux distro, GCC, etc. and stick with it throughout development. Upgrading that stuff can happen later, when the software is in maintenance. RHEL offers long-term support, at a cost. You might also consider the newly released Ubuntu 12.04 LTS.