Looking for a small C/C++ compiler available for OS X Catalina,
without the need to install both Xcode and the Command Line Tools.
The CLT alone takes over a gigabyte; Xcode takes even more.
Preferably around or below 100 MB if possible.
Please kindly advise.
P.S. Writing scripts for fluid dynamics, not even in need of OOP;
just wondering why compilers nowadays weigh more than USB-stick capacities from the 2000s :(
P.P.S. Also considering installing a server Linux distribution just for C's sake.
Happy to listen to different opinions.
Preferably around or below 100 MB if possible.
Why do 1 or 5 GB bother you? For complex math calculations you need a very powerful computer. Storage is cheap (a 1 TB SSD is ~$120). I rather think you will need to invest in a decent NVIDIA GPU and do the calculations there. We (a small business running wave simulations) have a $100k server with plenty of Teslas, and it is still not fast enough :).
Forget about program sizes; they are the least important factor, and no one cares about them.
You need a modern computer, lots of RAM, and plenty of fast storage. Start with that. Compiler size does not matter.
Looking for a small C/C++ compiler available for OS X Catalina
C and C++ are different languages. Read and compare both n1570 (the C11 standard) and n3337 (the C++11 standard).
P.S. Writing scripts for fluid dynamics, not even in need of OOP; just wondering why compilers nowadays weigh more than USB-stick capacities from the 2000s :(
Because recent C or C++ compilers are capable of very tricky optimizations, which fluid dynamics programs practically need (be aware of OpenACC, OpenMP, and OpenCL; you probably need one of them). See this draft report explaining them in more detail.
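To make the OpenMP point concrete, here is a minimal, hedged sketch (the update rule, sizes, and file name are invented for illustration) of the kind of loop an optimizing compiler can spread across cores when built with -fopenmp:

#include <cstdio>
#include <vector>

int main() {
    const int n = 1000000;
    std::vector<double> u(n, 1.0), u_new(n, 0.0);

    // Toy 1-D diffusion-style update. With OpenMP enabled (g++ -fopenmp or
    // clang++ -fopenmp) the iterations of this loop are split across threads;
    // without it, the pragma is simply ignored.
    #pragma omp parallel for
    for (int i = 1; i < n - 1; ++i)
        u_new[i] = 0.5 * u[i] + 0.25 * (u[i - 1] + u[i + 1]);

    std::printf("%f\n", u_new[n / 2]);
    return 0;
}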
If you need a non-optimizing C compiler, consider using tinycc or nwcc (and perhaps port them to macOS). Both are capable of compiling C code on macOS or Linux. Both are open source and coded in C.
You could use vim or GNU emacs as your source code editor, or whatever Apple provides on your MacBook. Also choose a good build automation tool (e.g. GNU make or ninja) to drive your C or C++ compiler, and of course compile on the command line ...
But you probably could take advantage, in your field, of the many optimizations that a recent GCC (i.e. g++ for C++, gcc for C) or a recent Clang (i.e. clang++ for C++, clang for C) is capable of. And both compilers have dozens of millions of lines of source code.
If you want a scripting language to drive fluid dynamics libraries, consider using an existing one: Lua, Python, Guile, OCaml ... come to mind, and they can embed other huge libraries.
See also LinuxFromScratch
If you have lots of time to spend (and a few gigabytes of disk space), consider the following route: download some old C compiler and use it to compile nwcc from source code. Download the source code of GCC 4.5 (it is coded in C). Compile it. You now have a C++ compiler, g++-4.5. Download the source code of GCC 9. Compile it with g++-4.5. You now have an optimizing C++11 compiler, g++-9. That could take a week of your time.
Also considering installing a server Linux distribution just for C's sake.
That choice is large, and a matter of opinion. I would recommend a recent Debian or Ubuntu.
I am currently updating my knowledge on C++ to the new standard. It makes me feel like a little kid that just got the awesomest toy: I want to play with it all the time but I don't want to lose my friends because of it.
I am involved in a few open source projects for which some of the new features would be extremely useful, so I am quite keen on using them. My question is how many users can compile C++11 code, e.g. what's the adoption rate of C++11-complete compilers in the general public? Does anyone have related information?
I know that gcc 4.8.1 and clang 3.3 are C++11 feature complete, but I have no idea how many people actually use compilers that are up to date. I know most codemonkeys surely do, but what about the average open source user? Slapping potential users in the face and telling them to update their compilers is not really an option.
I am aware that this question may be criticised/closed for being similar to these questions:
How are you using C++11 today?
To use or not to use C++0x features
I would like to point out that things are different now, since we are talking about an actual approved standard. I believe that being aware of its adoption rate is important in practice for programming.
You should probably first decide which C++11 features you absolutely want to be able to use, and then look up the lowest compiler version that supports them on the platforms that you want to support. Apache has a nice survey of the earliest version of each major compiler (gcc, clang, visual c++, intel, etc.) that supported the various C++11 features.
In my experience, gcc 4.7 and Clang 3.2 are almost feature complete (except for things like inheriting constructors, which are useful but not game changers). You could get a lot of useful features with gcc 4.6 (but take the 4.6.3 version to avoid many bugs) or Clang 3.1, which is nice since gcc 4.6 is also the official Android NDK compiler (if you are looking to support that).
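In case the term is unfamiliar, inheriting constructors look like the sketch below (the type names are mine, purely for illustration); this is the kind of C++11 feature those versions still lacked:

#include <string>

struct Base {
    explicit Base(int id) : id(id) {}
    int id;
};

struct Derived : Base {
    using Base::Base;      // C++11 inheriting constructor: Derived gets Base(int)
    std::string note;      // extra members are default-initialized
};

int main() {
    Derived d(42);         // compiles only where inheriting constructors are implemented
    return d.id == 42 ? 0 : 1;
}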
If you are looking to support Linux, you can take a look at DistroWatch, where you can see which gcc versions were installed for each distro version. E.g. many popular distributions based on Ubuntu have been on gcc 4.7 for almost a year now, and are going to upgrade to gcc 4.8.1 (feature complete) in their next releases.
On Windows, there is the Nuwen Distro currently running MinGW 4.8.1 (only 32-bit and no threading). Visual C++ is not up to the job and will take a while (year or more?) to get where gcc 4.8 and Clang 3.3 are.
Even if the distros don't officially support a recent version, there are private package repositories (often maintained by the same people who do the official packaging) that provide cutting-edge versions. The LLVM project even provides pre-built nightly SVN snapshots that enable many of the C++14 features (in -std=c++1y mode). For gcc there are no nightly packages, AFAIK.
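As a small, hedged illustration of what those -std=c++1y snapshots enable, a C++14 generic lambda looks like this (the example itself is mine):

#include <iostream>

int main() {
    // C++14 generic lambda: 'auto' parameters make it work for any type
    // that supports operator+. Compile with e.g. clang++ -std=c++1y.
    auto twice = [](auto x) { return x + x; };
    std::cout << twice(21) << ' ' << twice(1.5) << '\n';
    return 0;
}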
About forcing developers to upgrade compilers / distros: I don't think it is such a big deal (but the point by @ArneMertz about consulting with them first is very good here). Virtual machines are a breeze to install (~45 minutes end-to-end), so if you only want to release a binary-only product, then go ahead. For users that's another matter, so if you are providing a header-only template library that all regular users need to compile, that should make you a lot more conservative in your transition pace.
I think this is a hard one to answer, since it's a somewhat broad question. You are asking about "adoption in the general public", and that is quite dependent on how you define that.
I'd say in the majority of companies adoption of new compilers is slow, because for larger projects changing parts of the toolchain comes with some costs and risks. This is true especially for bigger and "older" companies. Smaller startups often are more likely to embrace new technology.
Open source projects, on the other hand, are often composed of people who program for fun and are keen to adopt promising new things. I am sure that many of your fellow contributors will feel the same as you. How your user community will adopt the new compilers cannot be said without knowing more about your projects. There are projects and communities that just want the program to work and don't care about newer compilers, and there are communities that will want you to use the newest technology available, because it's cool, faster, better, whatever.
Bottom line: ask the other contributors of your projects what they think about adopting the new standard, as well as the user community of your projects.
In some post on Stack Overflow it was recommended to support multiple (in this case C/C++) compilers if feasible, since this forces you to write more standard-compliant code and helps with finding bugs.
So I was looking for additional free C/C++ compilers I could add support for in my project (it is written in a mix of C and C++). I found Open Watcom to be an interesting candidate.
So my question is: what are the advantages and disadvantages of Open Watcom C/C++ compiler in comparison to other ones (for example gcc/g++, Visual C++ etc.)?
There are probably no particular advantages since if portable code is your aim you would generally try to restrict your code to the standard subset implemented by all compilers. I would say lowest common denominator but that may seem somewhat derogatory.
The advantages of one compiler over another generally lie in either the extensions it provides, the libraries it includes, or the performance of the generated code. If portability is your aim, you are probably interested in none of these. It is not the advantages of one compiler over another that should interest you in this case, but rather its adherence to and compliance with the ISO standards.
In its earlier commercial incarnation, Watcom was famously one of the best optimising compilers available; I doubt, however, whether it has kept pace with processor development since then (or even with the transition from 16-bit to 32-bit x86!).
One feature that may be seen as an advantage in some cases is that it supports DOS, OS/2 and Windows, but that is probably only an advantage if legacy systems maintenance is your aim. Efforts to port it to Linux and BSD and to processors other than x86 exist but are not complete, while GCC is already there and has been for years.
I would suggest that if you can support GCC and VC++ you probably have sufficient compiler independence (but I recommend you compile with high warning level settings: -Wall -Werror in GCC and /W4 /WX in VC++). I think that compiler portability is a trivial issue compared with OS portability, and what you really need to consider is cross-platform library support rather than compiler-independent code support.
If however playing with compilers is your thing, also consider the Digital Mars compiler. Like Watcom, this also has commercial compiler heritage, having been the Zortech/Symantec C/C++ compiler in a previous life.
Something Watcom has in its favor, if you're a 'haxxor', is the fact that you can define out-of-the-ordinary calling conventions using #pragma aux. Other than that, I see no reason to even attempt to use such a dated compiler unless you had horrible hardware restrictions. IMO, there are only three to worry about: GCC, ICC and MSVC.
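For the curious, an Open Watcom #pragma aux declaration looks roughly like the sketch below (the function name and register choices are invented; check the Open Watcom documentation for the exact syntax, as other compilers will just ignore or warn about the pragma):

/* Open Watcom only: ask the compiler to pass the arguments in EAX and EDX,
   return the result in EAX, and treat ECX as clobbered by the callee. */
int add_fast(int a, int b);
#pragma aux add_fast parm [eax] [edx] value [eax] modify [ecx];

int add_fast(int a, int b)
{
    return a + b;
}

int main(void)
{
    return add_fast(2, 3) == 5 ? 0 : 1;
}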
Some people here use expressions having to do with the Watcom (actually OpenWatcom) compiler being "dated." So what does it mean?
It could mean that it doesn't implement the latest C standard. How many "non-dated" compilers do?
It could mean that it doesn't provide frameworks, as it is primarily an environment for C and Fortran, and somewhere far after that comes a C++ implementation which I cannot judge.
It could mean that it cannot generate excellent assembly code from garbage C code.
It could mean that it doesn't support x64 development.
It could mean that the debugger is rudimentary and only supports assembly debugging.
Now to what it does do - in addition to supporting 16-bit real and protected mode code:
It produces excellent 32-bit protected mode code in the flat memory model everyone uses for the Win32 environment.
Its code generating capabilities are excellent and it's right up there at the top with more "non-dated" compilers.
It's easy to tune multi-threaded code using its profiler.
How do you "feel" a compiler? I for one don't know how to do that. Is it how the error messages are written? Is it in the messages on the console log?
The world's greatest network operating system - Novell Netware - had Watcom as its development environment. That says a great deal about Watcom. And lest anyone forget: Netware died due to poor marketing management combined with Redmond foul play. It did not die from lack of technological excellence.
I guess what I'm trying to say is that you guys that don't know what you're talking about should perhaps be a little less eager to write answers.
I know I know it's all about getting those coveted points and badges and what have you. And how you get them is irrelevant, right?
The Open Watcom compiler is somewhat outdated, and it feels like it. It is based on what was, a long time ago, a good compiler for making MS-DOS games. Currently it is not very standard-compliant and its standard library is in an immature state.
I would prefer more modern and popular compilers like Intel cc, g++, VC++ or Clang. Not sure about Borland C; I haven't tried it in a long time.
Advantages:
it's free
it's open source. You can alter it and its runtime libraries any way you like
it is cross-platform. You can run it, among other platforms, on Windows and Linux. Moreover, you can build programs with it for different target platforms from a single host platform
Disadvantages:
it is a bit outdated, but not as much as in the past
Positive (2)
The code and projects are not bloated like the projects in Microsoft Visual Studio/C++ (not hundreds of vcproj and other files and folders). You can just generate a makefile as with GCC (which is easier to understand than the Visual Studio project makefiles...)
Even the installation does not take long (on x64 Windows 7), in comparison to the 2+ GB Visual Studio installation...
Compared to GCC, it may seem easier to handle
Negative
Clib is missing the strn... functions (strndup, strncmpi, etc.) and getopt_long (a portable strndup fallback is sketched after this list)
No ARM support (as of 1 July 2015)
As an editor you should really use Notepad++, not the internal editor
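As an example of working around the missing strndup mentioned in the list above, a project can carry a small portable fallback along these lines (the my_ prefix is mine, to avoid clashing with toolchains that do provide the function; real projects usually gate this behind a build-system check):

#include <stdio.h>
#include <stdlib.h>
#include <string.h>

/* Portable stand-in for strndup: copy at most n characters of s. */
static char *my_strndup(const char *s, size_t n)
{
    size_t len = 0;
    char *copy;

    while (len < n && s[len] != '\0')   /* manual strnlen, also often missing */
        ++len;
    copy = (char *)malloc(len + 1);
    if (copy != NULL) {
        memcpy(copy, s, len);
        copy[len] = '\0';
    }
    return copy;
}

int main(void)
{
    char *p = my_strndup("fluid dynamics", 5);
    if (p != NULL) {
        puts(p);    /* prints "fluid" */
        free(p);
    }
    return 0;
}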
Which tools would be most useful for analyzing a C++ codebase?
What do they cost?
Can we manage with free and trial software, or is there commercial software that is good and that we really should pay for?
The main objective would be to get an understanding of quality (memory issues etc.), also to understand the code (for spotting architectural problems, for example), and perhaps to check coding standards.
Primarily static analysis, but we are hoping to be able to run the code.
I think it needs to be "robust" in the sense that it should work with code for arcane compilers.
The best free tool is your compiler's warnings; I always use them at the maximum level. The first goal should be a clean build without any cheating (e.g. disabling or casting away warnings you don't understand).
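As a tiny illustration (the example and file name are mine), the following builds silently with default flags but is flagged at a higher warning level, and -Werror turns the warning into a build failure:

#include <cstdio>

int main() {
    int ready = 0;
    // Classic typo: assignment instead of comparison. g++/clang++ report it
    // under -Wall (-Wparentheses); adding -Werror makes the build fail.
    if (ready = 1)
        std::puts("looks ready");
    return 0;
}
// e.g. g++ -Wall -Wextra -Werror warn_demo.cpp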
Visual C++ has built in Code Analysis which is good for catching some bugs and Win32 API misuse, but it's not included in the free version and is (obviously) Windows-specific. This used to be an internal Microsoft tool called Prefast - analogous to FxCop in .Net.
PC-Lint is good, but verbose and not free. If you can get a config file to trap 'useful things' and ignore the noise, that would be a big plus. Again this is for Windows, but I know there are versions for other platforms.
Take a look at:
http://www.cppdepend.com/
and a good many others:
http://www.chris-lott.org/resources/cmetrics/
http://www.locmetrics.com/alternatives.html
I've heard very good things about Valgrind. "automatically detect many memory management and threading bugs, and profile your programs in detail"
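A hedged example of the kind of bug it catches (the file name is mine): the allocation below is never freed, and valgrind --leak-check=full will report it as definitely lost, with a stack trace if the program was built with -g.

#include <cstdlib>

int main() {
    // Deliberate leak: allocated, used, never freed.
    int* data = static_cast<int*>(std::malloc(100 * sizeof(int)));
    if (data == nullptr)
        return 1;
    data[0] = 42;
    return 0;   // 'data' goes out of scope here and the memory leaks
}
// g++ -g leak_demo.cpp -o leak_demo
// valgrind --leak-check=full ./leak_demo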
The number one stink in programs is code duplication.
You can use clone detectors to find duplicates. Many clone detectors compare just text lines for exact matches; others compare token streams and will find almost-exact matches where the differences are just changed identifiers. You can use our CloneDR to find duplication in which arbitrary language structures are inserted or removed, using the language grammar as a guide. CloneDR works for large C++ systems, as well as many other languages. At the link you can find typical clone detection reports.
A popular broad-spectrum static checker is PC-Lint. This checks for a variety of common coding errors predefined by the tool. I don't know how well it handles the dialects of C++ of "arcane" compilers.
If you want to define custom checks, you need a full C++ front end parser and the ability to configure your checks arbitrarily. Our DMS Software Reengineering Toolkit is an engine that can be configured to accomplish this. DMS's C++ front end can be configured to handle "arcane" C++ dialects, but already covers ANSI, GCC3 and GCC4, MS Visual Studio 7 and 2005. Because DMS is a program transformation engine, it can even be used to "improve" the code quality by replacing poor constructs with better ones.
While not static analysis, test coverage tools for measuring how well you've tested your code are very helpful in assessing your code quality. Just because all your tests pass, doesn't mean you've tested well; unexercised code arguably can have any/all variety of problems.
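For example, GCC's built-in coverage support (the --coverage flag plus the gcov tool) can show which branches a test run never exercised; the file names below are illustrative:

#include <cstdio>

int classify(int x) {
    if (x < 0)
        return -1;   // gcov marks this line as never executed by the run below
    return 1;
}

int main() {
    std::printf("%d\n", classify(5));   // only exercises the non-negative path
    return 0;
}
// g++ --coverage coverage_demo.cpp -o coverage_demo
// ./coverage_demo
// gcov coverage_demo.cpp    # writes an annotated coverage_demo.cpp.gcov report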
There's CCCC: http://cccc.sourceforge.net/ -- the result of a research project on metrics.
To tell the truth, I've not found much benefit in such things. What do you hope to get?
You could try out Vigilant Sentry, which analyzes C and C++ and looks for advanced errors in your software. This includes memory or resources leaks, and crash causing memory corruption, among other things.
The small business edition is currently only $795 (by far the cheapest on the market for the value) and the enterprise is $4995. Good luck finding what you need.
I have years of C++ programming experience in Windows. Now I need to program some applications for Linux. Is there any resource that helps me quickly get the required information about Linux technologies available to C++ developers?
Programming in C++ under Linux isn't all that different at the core. Linux compilers are generally more standards-conforming than MSVC; however, that is changing as MSVC is becoming a better compiler. The difference comes more from the environment and the available libraries. Visual Studio isn't available (obviously), but some other environments like Visual SlickEdit and Eclipse are available on both.
The build system is widely varied and will probably be dictated by your preference between Gnome, KDE, or the ever-present command line. Personally, I find the latter to be the cleanest and most consistent. If you end up at the command line, then learn GNU Make and pick up a copy of GNU Autoconf, Automake, and Libtool. This will introduce the GNU command line development stack pretty nicely.
Debugging is a lot different being that VS provides a nice GUI debugging environment. Most Linux environments simply wrap a command line debugger (usually gdb) with a GUI. The result is less than satisfactory if you expect a nicely integrated debugger. I would recommend getting comfortable with gdb. There are some decent tutorials for gdb online. Just google for a bunch of them. Once you get a little comfortable, read the online manual for the really neat stuff.
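A small sketch of that command-line workflow (program and file names are made up): compile with debug info, reproduce the problem under gdb, then inspect the stack.

#include <vector>

int main() {
    std::vector<int> v;      // empty on purpose
    return v.at(3);          // throws std::out_of_range and aborts
}
// g++ -g -O0 crash_demo.cpp -o crash_demo
// gdb ./crash_demo
//   (gdb) run          # reproduce the abort
//   (gdb) backtrace    # where did it happen?
//   (gdb) frame 2      # select a frame of interest
//   (gdb) print v      # inspect variables in that frame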
The other choice is to use whatever development environment is packaged with your windowing system, or to use something like Eclipse and a C++ plug-in.
As for books on the subject, Advanced Programming in the UNIX Environment is a must-read. UNIX Systems Programming is also a good read since it gives you a solid grounding in shells, processes, and what not. I would recommend both the POSIX Programmer's Guide and POSIX.4 Programmer's Guide since they give you a lot of the systems programming stuff.
With all of that said, enjoy your foray into an operating system that really caters to programmers ;)
I'm in the process of making the switch from Windows to Linux right now for a program and so far I have found that man and grep are great. Instead of looking up function prototypes in MSDN (or similar) I just use man.
If I need a code example, grepping through an existing project that has some similarities to mine is a great help. Or, if there is a project similar enough to warrant this, setting up an LXR of their code base to make it easier to read helps a lot.
In general, the open source nature of Linux has been the greatest resource to learning to program on Linux.
Also Stevens' Advanced Programming in the UNIX Environment was a huge boon. But as for IDE's and the like, call me a luddite, but I just like vim and make.
I've learned a lot from Beginning Linux Programming by Matthew and Stones, though it's more C than C++.
I use die.net a lot, and look things up at The Open Group's website, http://www.opengroup.org/onlinepubs/000095399/functions/{function}.html. They have much the same information as man. I use SciTE, and have the C API and The Open Group POSIX lookup as hotkeys, as described here.
I am curious if anyone have used UnderC, Cint, Cling, Ch, or any other C++ interpreter and could share their experience.
There is Cling, CERN's project for a C++ interpreter based on Clang. It's a new approach based on 20 years of experience with ROOT's CINT, it's quite stable, and it is recommended by the CERN guys.
Here is a nice Google Talk: Introducing cling, a C++ Interpreter Based on clang/LLVM.
NOTE: what follows is rather CINT-specific, but given that it's probably the most widely used C++ interpreter, it may be valid for them all.
As a graduate student in particle physics who's used CINT extensively, I should warn you away. While it does "work", it is in the process of being phased out, and those who spend more than a year in particle physics typically learn to avoid it for a few reasons:
Because of its roots as a C interpreter, it fails to interpret some of the most critical components of C++. Templates, for example, don't always work, so you'll be discouraged from using the things that make C++ so flexible and usable.
It is slower (by at least a factor of 5) than minimally optimized C++.
Debugging messages are much more cryptic than those produced by g++.
Scoping is inconsistent with compiled C++: it's quite common to see code of the form
if (energy > 30) {
    float correction = 2.4;
}
else {
    float correction = 6.3;
}
somevalue += correction;
whereas any working C++ compiler would complain that correction has gone out of scope, CINT allows this. The result is that CINT code isn't really C++, just something that looks like it.
In short, CINT has none of the advantages of C++, and all the disadvantages plus some.
The fact that CINT is still used at all is likely more of a historical accident owing to its inclusion in the ROOT framework. Back when it was written (20 years ago), there was a real need for an interpreted language for interactive plotting / fitting. Now there are many packages which fill that role, many which have hundreds of active developers.
None of these are written in C++. Why? Quite simply, C++ is not meant to be interpreted. Static typing, for example, buys you great gains in optimization during compilation, but mostly serves to clutter and over-constrain your code if the computer is only allowed to see it at runtime. If you have the luxury of being able to use an interpreted language, learn Python or Ruby; the time it takes you to learn will be less than the time you lose stumbling over CINT, even if you already know C++.
In my experience, the older researchers who work with ROOT (the package you must install to run CINT) end up compiling the ROOT libraries into normal C++ executables to avoid CINT. Those in the younger generation either follow this lead or use Python for scripting.
Incidentally, ROOT (and thus CINT) takes roughly half an hour to compile on a fairly modern computer, and will occasionally fail with newer versions of gcc. It's a package that served an important purpose many years ago, but now it's clearly showing its age. Looking into the source code, you'll find hundreds of deprecated C-style casts, huge holes in type safety, and heavy use of global variables.
If you're going to write C++, write C++ as it's meant to be written. If you absolutely must have a C++ interpreter, CINT is probably a good bet.
cint is the command processor for the particle physics analysis package ROOT. I use it regularly, and it works very well for me.
It is fairly complete and gets on well with compiled code (you can load compiled modules for use in the interpreter...)
Late edit: copied from a later duplicate, because the poster on that question didn't seem to want to post here: igcc. Never tried it personally, but the web page looks promising.
I have (about a year ago) played around with Ch and found it to be pretty good.
Also, long ago, I used a product called Instant C, but I don't know whether it was ever developed further.
Long ago, I used a C++ interpreter called CodeCenter. It was pretty nice, although it couldn't handle things like bitfields or fancy pointer mangling. The two cool things about it were that you could watch when variables changed, and that you could evaluate C/C++ code on the fly while debugging. These days, I think a debugger like GDB is basically just as good.
I looked at using ch a while back to see if I could use it for black box testing DLLs for which I am responsible. Unfortunately, I couldn't quite figure out how to get it to load and execute functions from DLLs. Then again, I wasn't that motivated and there may well be a way.
There is a program called c-repl which works by repeatedly compiling your code into shared libraries using GCC, then loading the resulting objects. It seems to be evolving rapidly, considering the version in Ubuntu's repository is written in Ruby (not counting GCC of course), while the latest git is in Haskell. :)