How to write fast (low level) code? [closed] - c++

As it currently stands, this question is not a good fit for our Q&A format. We expect answers to be supported by facts, references, or expertise, but this question will likely solicit debate, arguments, polling, or extended discussion. If you feel that this question can be improved and possibly reopened, visit the help center for guidance.
Closed 10 years ago.
I would like to learn more about low level code optimization, and how to take advantage of the underlying machine architecture. I am looking for good pointers on where to read about this topic.
More details:
I am interested in optimization in the context of scientific computing (which is a lot of number crunching but not only) in low level languages such as C/C++. I am in particular interested in optimization methods that are not obvious unless one has a good understanding of how the machine works (which I don't---yet).
For example, it's clear that a better algorithm is faster, without knowing anything about the machine it's run on. It's not at all obvious that it matters if one loops through the columns or the rows of a matrix first. (It's better to loop through the matrix so that elements that are stored at adjacent locations are read successively.)
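That loop-order point can be sketched in a few lines of C++ (assuming the usual row-major layout of C/C++; the function names are illustrative):

```cpp
#include <vector>
#include <cstddef>

// Sum a row-major matrix. With rows in the outer loop, memory is
// visited sequentially, one cache line after another.
double sum_row_major(const std::vector<double>& m, std::size_t rows, std::size_t cols) {
    double s = 0.0;
    for (std::size_t i = 0; i < rows; ++i)        // cache-friendly order
        for (std::size_t j = 0; j < cols; ++j)
            s += m[i * cols + j];
    return s;
}

// Same sum, but striding by `cols` elements per step: each access may
// touch a new cache line, which tends to be much slower on large matrices.
double sum_col_major_order(const std::vector<double>& m, std::size_t rows, std::size_t cols) {
    double s = 0.0;
    for (std::size_t j = 0; j < cols; ++j)        // strided access
        for (std::size_t i = 0; i < rows; ++i)
            s += m[i * cols + j];
    return s;
}
```

Both functions compute the same sum; only the memory access pattern differs, which is exactly the kind of machine-level detail the question is about.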
Basic advice on the topic or pointers to articles are most welcome.
Answers
Got answers with lots of great pointers, a lot more than I'll ever have time to read. Here's a list of all of them:
The software optimization cookbook from Intel (book)
What every programmer should know about memory (pdf book)
Write Great Code, Volume 2: Thinking Low-Level, Writing High-Level (book)
Software optimization resources by Agner Fog (five detailed pdf manuals)
I'll need some time to skim them and decide which to use (not having time for all).

Drepper's What Every Programmer Should Know About Memory [pdf] is a good reference to one aspect of low-level optimisation.

For Intel architectures this is priceless: The Software Optimization Cookbook, Second Edition

It's been a few years since I read it, but Write Great Code, Volume 2: Thinking Low-Level, Writing High-Level by Randall Hyde was quite good. It gives good examples of how C/C++ code translates into assembly, e.g. what really happens when you have a big switch statement.
Also, altdevblogaday.com is focused on game development, but the programming articles might give you some ideas.

An interesting book about bit manipulation and smart ways of doing low-level things is Hacker's Delight.
This is definitely worth a read for everyone interested in low-level coding.

Check out: http://www.agner.org/optimize/

C and C++ are usually the languages used for this because of their speed (ignoring Fortran, as you didn't mention it). One thing you can take advantage of (which the icc compiler does a lot) is the SSE instruction sets for floating-point number crunching. Another possibility is using CUDA and the Stream APIs, for Nvidia and ATI respectively, to do very fast floating-point operations on the graphics card while leaving the CPU free to do the rest of the work.
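As a minimal sketch of the SSE point (assuming an x86 target; `add_sse` is a made-up name, but the intrinsics are the real SSE ones from `xmmintrin.h`):

```cpp
#include <xmmintrin.h>  // SSE intrinsics; assumes an x86 target

// Add two float arrays four lanes at a time with SSE. Unaligned loads
// are used for simplicity; a tuned version would align the data and
// use _mm_load_ps / _mm_store_ps instead.
void add_sse(const float* a, const float* b, float* out, int n) {
    for (int i = 0; i + 4 <= n; i += 4) {
        __m128 va = _mm_loadu_ps(a + i);
        __m128 vb = _mm_loadu_ps(b + i);
        _mm_storeu_ps(out + i, _mm_add_ps(va, vb));
    }
    for (int i = n - n % 4; i < n; ++i)  // scalar tail for leftover elements
        out[i] = a[i] + b[i];
}
```

In practice compilers like icc and gcc will often auto-vectorize such a loop for you; writing intrinsics by hand is only worth it when the compiler doesn't.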

Another approach to this is hands-on comparison. You can get a library like Blitz++ (http://www.oonumerics.org/blitz/) which - I've been told - implements aggressive optimisations for numeric/scientific computing, then write some simple programs doing operations of interest to you (e.g. matrix multiplications). As you use Blitz++ to perform them, write your own class that does the same, and if Blitz++ proves faster start investigating its implementation until you realise why. (If yours is significantly faster you can tell the Blitz++ developers!)
You should end up learning about a lot of things, for example:
memory cache access patterns
expression templates (there are some bad links atop Google search results for expression templates - the key property you want to find discussed is that they can encode many successive steps in a chain of operations so that they are all applied during one loop over a data set)
some CPU-specific instructions (though I haven't checked they've used such non-portable techniques)...
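The expression-template idea from the list above can be sketched minimally; `Vec` and `Add` are hypothetical illustrative types, not Blitz++'s actual implementation:

```cpp
#include <vector>
#include <cstddef>

// Minimal expression-template sketch. Add only records references to
// its operands and computes elements lazily, so a + b + c builds no
// temporary vectors: the whole chain is applied element by element in
// one loop, inside Vec::operator=. The stored references are only
// valid within the statement that built the expression.
template <class L, class R>
struct Add {
    const L& l;
    const R& r;
    double operator[](std::size_t i) const { return l[i] + r[i]; }
    std::size_t size() const { return l.size(); }
};

struct Vec {
    std::vector<double> data;
    double operator[](std::size_t i) const { return data[i]; }
    std::size_t size() const { return data.size(); }
    template <class E>
    Vec& operator=(const E& expr) {      // single loop over the data set
        data.resize(expr.size());
        for (std::size_t i = 0; i < data.size(); ++i) data[i] = expr[i];
        return *this;
    }
};

template <class L, class R>
Add<L, R> operator+(const L& l, const R& r) { return {l, r}; }
```

With this in place, `out = a + b + c;` performs one pass over the data instead of allocating a temporary for each `+`, which is the whole point of the technique.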

I learned a lot from the book Inner Loops. It's ancient now, in computer terms, but it's very well written and Rick Booth is so enthusiastic about his subject I would still say it's worth looking at to see the kind of mindset you need to make a CPU fly.

Related

Why C++ is not used in L2/L3/L4 development projects [closed]

Closed 10 years ago.
I have been involved in numerous C++ projects, mainly in the application domain pertaining to VoIP protocols. Now I have to move to L3/L2 protocol development projects, where I found that C is the preferred language of the L2/L3/L4 developers.
Now I am wondering: except for device-firmware-related applications, why are protocols developed using a stone-age-era language? Why don't people take advantage of OOP techniques? Would it be prudent for me to try to convince them to switch to C++? Most of the developers on the team are C experts and not comfortable with C++.
There are several reasons for continuing using C.
There are existing C projects. Who will pay for converting them into C++?
C++ compiler (of a good quality) is not available on every platform.
Psychological reasons. When you pass and return objects by value, temporary objects are created left and right. This is not OK for small systems. People do not really understand that passing and returning references completely avoids this problem. There are other similar issues.
And finally: what is wrong with C? It works! (Do not fix what is not broken.)
It is possible to write C++ code that performs just as well as C, but this requires better understanding, training, and code-control discipline. The common perception is that these flaws are unavoidable.
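The point about references in the list above can be made concrete with a copy counter; `Big`, `by_value`, and `by_ref` are hypothetical names for illustration:

```cpp
// Copy-counting sketch: passing by value copies the whole object on
// every call; passing by const reference creates no temporary at all.
struct Big {
    static int copies;                 // counts copy-constructions
    int payload[256] = {};
    Big() = default;
    Big(const Big& other) {
        ++copies;
        for (int i = 0; i < 256; ++i) payload[i] = other.payload[i];
    }
};
int Big::copies = 0;

int by_value(Big b)      { return b.payload[0]; }  // one full copy per call
int by_ref(const Big& b) { return b.payload[0]; }  // no copy
```

On a small embedded system the difference between these two signatures is visible in both stack usage and cycle counts, which is exactly the concern raised above.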
If you think of C as simply a "stone-age language," then I think you misunderstand why people continue to use it. I like and use both C and C++. I like them both for different reasons, and for different kinds of problems.
The C language presents a model of the computer that is both (mostly) complete and very easy to understand, with very few surprises. C++ presents a very complex model, and requires the programmer to understand a lot of nuance to avoid nasty surprises. The C++ compiler does a lot of stuff automatically (calling constructors, destructors, stack unwinding, etc.). This is usually nice, but sometimes it interferes with tracking down bugs. In general, I find that it's very easy to shoot yourself in the foot with both C and C++, but I find the resulting foot-surgery is much easier to do in C, simply because it's a simpler language model.
The C model of a computer is about as close to assembly as you can get while still being reasonably portable. The language does almost nothing automatically, and lets you do all kinds of crazy memory manipulations. This allows for unsafe programming, but it also allows for very optimized programming in an environment with very few surprises. It's very easy to tell exactly what a line of code does in C. That is not true in C++, where the compiler can create and destroy temporary objects for you. I've had C++ code where it took profiling to reveal that automatic destructors were eating a ton of cycles. This never happens in C, where a line of code has very few surprises. This is less of an issue today than it was in the past; C++ compilers have gotten a lot better at optimizing many of their temporaries away. It can still be an issue, though, especially in an embedded environment where memory (including stack space) is often tight.
Finally, code written in C++ often compiles slowly. The culprits are usually templates, but eliminating templates often makes your C++ code look a lot like C. And, I really cannot overstate how much this can affect productivity. It kills productivity when your debug-fix-recompile-test cycle is limited by the compilation time. Yes, I know and love pre-compiled headers, but they only do so much.
Don't get the impression that I'm anti-C++ here. I like and use the language. It's nice to have classes, smart pointers, std::vector, std::string, etc. But there's a reason that C is alive and kicking.
For a different perspective, and one that is firmly anti-C++, you should at least skim over Linus Torvalds' perspective on C++. His arguments are worth thinking about, even if you disagree with them.

Why is fortran used for scientific computing? [closed]

Closed 11 years ago.
I've read that Fortran is still heavily used for scientific computing. For code already heavily invested in Fortran this makes sense to me.
But is there a reason to use Fortran over other modern languages for a new project? Are there language design decisions in Fortran that makes it much more suitable for scientific computing compared to say the more popular languages (C++, Java, Python, Ruby, etc.)? For example, are there specific language features of Fortran that maybe allow numeric optimization in compilers to a much higher degree compared to other languages I mentioned?
Fortran is, for better or worse, the only major language out there specifically designed for scientific numerical computing. Its array handling is nice, with succinct array operations on both whole arrays and slices, comparable with Matlab or NumPy but super fast. The language is carefully designed to make it very difficult to accidentally write slow code - pointers are restricted in such a way that it's immediately obvious if there might be aliasing, to take the standard example - and so the optimizer can go to town on your code. Current incarnations have things like Coarray Fortran, and do concurrent and forall built into the language, allowing distributed-memory and shared-memory parallelism, and vectorization.
The downsides of Fortran are mainly the flip side of one of the upsides mentioned: Fortran has a long history. Upside: tonnes of great libraries. Downside: tonnes of historical baggage.
If you have to do a lot of number crunching, Fortran remains one of the top choices, which is why many of the most sophisticated simulation codes run at supercomputing centres around the world are written in it. But of course it would be a terrible, terrible, language to write a web browser in. To each task its tool.
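The aliasing point above can be illustrated from the C++ side; `scale_add` is a hypothetical name, and the comments state the assumption being made:

```cpp
// In C or C++ the compiler must assume `out` and `in` may point into
// the same array, so it cannot keep in[0] in a register across the
// stores: if out == in, the first iteration overwrites in[0]. Fortran
// dummy arguments are assumed not to alias, so its optimizer is free
// to hoist the load. (C99's restrict gives C the same guarantee,
// but only as an opt-in.)
void scale_add(double* out, const double* in, double k, int n) {
    for (int i = 0; i < n; ++i)
        out[i] = in[0] * k + in[i];   // in[0] may be reloaded every iteration
}
```

This is one concrete reason "the optimizer can go to town" on Fortran array code in ways it cannot, by default, on equivalent C/C++.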
The main reason for me is the nice array notation, and many other design decisions that make writing and debugging scientific code easier. The fact that it is usually the best choice in terms of performance on the relevant tasks (array operations) does not hurt either :)
Honestly, I would not consider most of the languages cited as real competitors for Fortran - Java and Ruby are far, far behind in terms of both convenience and performance, while C++ is much too complex and tricky a language to recommend to anyone whose main job for the last few years has been anything other than daily programming in C++. Python with NumPy could be an option, though. I am personally not a huge fan of the language, but I know a number of people who use NumPy regularly and seem quite happy with it.
Real competition I see is not from these, but from Matlab, R, and similar languages, that offer similar convenience, combined with many standard libraries. Luckily, it is usually possible to start a project in R or Matlab, and write performance-critical parts in Fortran later.
Few projects are completely new projects. I'm not sure it's specific to scientific computing, but at least in this field, you tend to build your applications based on existing (scientific) models, perhaps produced by other groups/people. You will always have to deal with some amount of legacy code, whether you want it or not.
Fortran is what a lot of scientists have been taught with and what a lot of the libraries they need are implemented in. A number of them are not computer scientists or IT people, but rather computational scientists. Their primary goal is rarely computing; it's their science first.
While a large number of programmers would have a tendency to learn a new programming language or framework whenever they get a chance (including during their spare time), most scientists would use that time exploring new ideas regarding their science.
A domain expert who's trained in Fortran (or any language) and surrounded by people who are in a similar situation will have no incentive to move away from it.
It's not just that now other languages can be as good as Fortran in terms of performance, they need to be much better: there needs to be a good reason to move away from what you have and know.
It's also a "vicious" circle to a degree. I've always found comparisons between Java and Fortran a bit difficult, simply because a number of Java scientific applications are not programmed in a Java way. Some of the Java Grande benchmark applications look clearly like Fortran programs turned into C programs, copied/pasted/tweaked into Java programs (in a method, passing the length of the array as an extra parameter next to the array itself gives a clue, if I remember correctly). Because of this, Java (for example) hasn't got a great reputation in the scientific community, even though its performance is getting better. A consequence of that is that there is little overlap between HPC experts and Java experts, for example. Even from the hardware vendors or library implementors, little demand from users leads to little support offered, which in turn deters users who would potentially be interested in moving to other languages.
Note that this doesn't preclude the same (or other) scientists from using other languages for other purposes (e.g. workflow management, data management, quicker modeling with Matlab, Numpy, ...).
As I understand it, there are libraries that are some of the most efficient implementations of their algorithms available, which makes Fortran popular for this kind of work in spite of the language's limitations.
One reason lies in how the arrays are laid out. They are column-major, unlike in most other languages, so loops written with the leftmost index varying fastest read adjacent memory locations, which speeds up the calculations.

Reasons for refactoring tools for C/C++ to be so limited [closed]

Closed 12 years ago.
Why has no industrial-level refactoring tool for C/C++ been created? I only need a tool that "just works".
What I mean by "industrial level" is the quality provided by JetBrains products (IntelliJ, ReSharper), or above. The available solutions (including Visual Assist from Whole Tomato Software and Eclipse CDT) are not mature enough.
Below are advantages for a start-up to drive such a project.
alleviation of C++'s boring syntax (making development more fun);
C++ is evolving (the 0x version is coming, hence a lot of work for tool implementers);
the market niche is broader than anything else (a lot of written C++ code, a lot of active C++ projects), even taking into account Web (HTML/JavaScript) projects;
C++ is chosen for system problems where each bug found by tool at compile time is a survival (a lot of corporations or governments should be interested in);
such tool can decrease project compilation time;
The only drawback is the technical challenges... but looking at what Google, Microsoft, Intel, etc. are doing, there should be no unsolvable technical problems.
Let's summarize:
it is possible to implement such product
it is enormously profitable
it doesn't exist
No one wants to make a profit? Collusion ;)? What is the reason?
Let's consider just one basic refactoring: Rename function
The implementation appears very simple. Find all references, and replace the identifier used.
It gets a little more complicated with virtual functions. Now it's necessary to find the base declaration (and implementation, if not abstract), and references may be to any derived implementation in the class hierarchy. Much more difficult, but still feasible, because the group of functions to be renamed is well-defined.
That's the complexity that applies to languages like C# and Java. Now comes the kicker: C++ templates are duck typed. The set of implementations in the class hierarchy is no longer well-defined, because it could include ANY function with the same name and possibly compatible parameters. You did remember about Koenig lookup, right?
Trying to separate the list of functions which must have the same name, because they started out in the same overload resolution set, from those truly independent, would be an absolute nightmare. At this point you might as well just do a textual search-and-replace.
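A small sketch of why the call target is not well-defined inside templates; the names `Widget`, `process`, and `run` are hypothetical:

```cpp
#include <string>

namespace lib {
    struct Widget {};
    // Found through argument-dependent (Koenig) lookup when process()
    // is called unqualified with a lib::Widget argument.
    std::string process(Widget) { return "lib::process"; }
}

std::string process(int) { return "global process"; }

// Duck typing: run() compiles against ANY T for which process(t)
// resolves. Nothing in the source ties the two process() functions
// together, so a tool renaming one of them must reason about every
// possible instantiation to know which call sites it affects.
template <class T>
std::string run(T t) { return process(t); }
```

Here `run(lib::Widget{})` binds to `lib::process` via ADL while `run(42)` binds to the global one; renaming either function safely requires the tool to resolve this for every instantiation in the codebase.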
So let's go down your list of claims/desires:
"alleviate boring syntax". This is just trolling and means nothing in any technical sense. If the tool doesn't input and output C++ syntax, it's not a C++ refactoring tool.
"C++ is evolving". This means that such a tool would need to be maintained at considerable expense to keep up. It also means that a well-maintained tool would easily steal market share from older stable tools. And it means that any tool needs a ton of user configuration -- you don't want to generate C++0x-isms on a C++03 codebase after all -- so users will have to use the tool an awful lot to win back the time spent configuring.
"Each bug is a survival"? Not sure exactly what this grammatically nonsensical statement means, but perhaps it means zero bug tolerance? But that's best achieved by maintaining stability (the antithesis of automated refactoring that touches multiple files), solid architecture and documentation (which a refactoring tool won't automatically update, or do you want that too?), massive test suites, and static analysis. In fact, refactoring makes the massive test suites even more important, if that's possible, in order to verify that the tool didn't break anything. Note that automated static analysis faces some of the same challenges as refactoring, but since it doesn't change the code it can and does work on post-preprocessed code and/or compiler ASTs.
"decrease compilation time"? This is an unsupported claim in serious need of some evidence or at least solid reasoning. Are you expecting a refactoring tool to introduce pimpl for compilation firewalling? None of the C# and Java-style refactorings will reduce C++ compile time one whit.
"it is possible" No, actually it seems like it isn't. If you or I can't design one basic refactoring like Rename function in a sane manner, never mind implement it correctly, how can anyone implement dozens of them? If it is possible, it's also assuredly extremely expensive, which leads to:
"it is enormously profitable". Profitability requires not only a large base of users who are willing to pay for a product, but that the potential gross sales exceeds the costs. I claim you have seriously underestimated the costs so until you provide some real research on costs and markets I'll call this one wishful thinking as well.
"it doesn't exist". After going to some length to explain why such a tool can't exist, I'm now going to tell you that it already does? Yup. At least insofar as refactoring makes the final code better by doing things like hoisting common subexpressions outside loops, unrolling loops, exchanging order of iteration, inlining, cache optimization, vectorization, they are already provided by optimizing compilers, an area in which C++ leads both C# and Java by a wide-margin. C++ simply doesn't require many of the source-level tweaks needed in C# and Java in order to get a good final product. It's only the manipulation of source code to make it more accessible to humans that remains, and the existing tools do an adequate if not exception job of this.
This link talks about some details and complexity involved
The reason is that C++ is not parseable by most common parsing techniques (LALR, LL, etc.) and actually requires some pretty heavy lifting to properly parse. Some tools like clang and gcc-xml make it possible for other tools to process C++ without implementing their own C++ parsers, although both are fairly new, and it can still be complicated to process C++ even with these parsing tools. I think that the industry will eventually see all the Java-related goodies ported/adapted to C++ ... the question is when.
The problem is that C++ is hard to parse (its grammar is not context free) and hard to make any sense of without actually compiling the source. Java tools can work because the language is a lot simpler, both grammar and semantics.
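A classic illustration of why a context-free parser is not enough is the "most vexing parse"; `Timer` and `Widget` are hypothetical types:

```cpp
#include <type_traits>

struct Timer {};
struct Widget { Widget(Timer) {} };

// Looks like it constructs a Widget from a default-constructed Timer,
// but it actually DECLARES a function named w, taking a pointer to a
// function returning Timer. Deciding between the two readings needs a
// symbol table and semantic analysis, not just a grammar.
Widget w(Timer());

// C++11 brace syntax is unambiguous: w2 really is a Widget object.
Widget w2{Timer{}};
```

The same token sequence parses as a declaration or an object definition depending on what the surrounding names mean, which is exactly the kind of thing that defeats LALR/LL parsers and burdens refactoring tools.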

Hopping from a C++ to a Perl/Unix job [closed]

Closed 11 years ago.
I have been a C++/Linux developer till now, and I am adept in this stack. Of late I have been getting opportunities that require Perl and Unix expertise (with knowledge of C++ and shell scripting). Organizations are showing interest even though I don't have much scripting experience to boast of. The role is more in a support/maintenance project involving SQL as well. So I am in a fix whether to forgo these offers or not.
I don't know the dynamics of an IT organization, and thus on one hand I fear that my C++ experience will be nullified, while on the positive side I would get to work on a new technology stack, which can only add to my skill set.
I am sure, most of you at some point of time have encountered such dilemmas and would have taken some decision.
I want you to share your perspectives on such a scenario, where a person is required to change his/her technology stack when changing jobs. What are the merits and demerits of going with either choice?
Also, I know that C++ isn't going anywhere in the near future. What about Perl? I have no clue as to what the future holds for Perl developers. Are there enough opportunities for a Perl developer?
I am asking this question here because most of my fellow programmers face this career choice dilemma.
EDIT:
Since the last time I asked this question, I made up my mind to switch.
I was just about to sign on the dotted line, but some divine intervention made me seek clarification about the working hours, and to my horror, the profile required me to work in shifts, which I am never comfortable with. I was all the more livid because they didn't clarify this point earlier. It was a reputed organisation, but still I gave them a piece of my mind and said thank you very much.
Thanks.
Regarding changing stacks: it definitely helps you long-term in your career, from the extra experience you can offer your next employer, to the expanded set of jobs you can qualify for, to the increased programming IQ that comes from knowing different points of view (e.g. Perl, for all its scripting origins, when used properly has both OO and very nice functional paradigms available; this point applies to any new technology, though).
However, you must be willing to put in extra effort to actually learn the new stack/environment/language, and to learn to do things the new way (e.g. don't write C++ code in Perl :) - especially for that last benefit to kick in. Please note "environment" there - the jump involves, for example, learning new debuggers and debugging techniques (for me, the hardest thing about C++ development after switching from Perl so far is probably doing effective gdb debugging after being used to the flexibility/power of the Perl debugger).
Personally I had to make this jump twice - from C developer to Perl and 10 years later Perl to C++. I learned a lot both times, and am not sorry I made the jump. The first jump was from IT role (Junior SA/Production with some C coding) to a full-on developer, the second was just a jump between different business teams.
As for demerits, please be aware that you WILL lose your edge in whichever stack you're not currently using for a while. Not completely forget, but nowhere near where you left off - and that does not even count the fact that the stack may have naturally evolved in the time elapsed. Also, as I said, to be effective you must expect to put in a lot of work to become fluent in the idiomatics, philosophy and ecosystem of the new stack. Simply learning Perl, for example, is a small piece of the puzzle - you need to become familiar with a large chunk of CPAN, just as you had to know the STL, etc. Not really a demerit as far as I'm concerned, but a point that needs to be kept in mind.
As for opportunities for Perl developer, this was extensively covered on SO before. While the absolute # of jobs is likely less than that of Java or C++ ones, a high quality developer will always be in demand, and there's plenty of companies (including, or may be especially, in financial industry) heavily using serious Perl development (as opposed to simple administrative scripting). The language itself is developing and moving forward as well.
This is a highly subjective question. Whether C++ is "going places" depends on where you look and who you ask. For instance, C++ is the development language for video games and graphics processing, and is also used a lot in device drivers in conjunction with C (usually I see a hybrid "C+-", where some features are used from C++ in conjunction with more C-style architecture).
I myself moved from a C/C++ environment into a mostly Perl one, with strong Unix all the while (I actually know next to no Windows API programming, .NET, Visual Basic, etc.).
Basically what I would suggest is sticking with what you enjoy most. This may not be the same as what you currently know or are best at. There are opportunities in a diverse set of technologies. Don't also assume you should tie yourself to one environment -- dabble a little and have some fun. Many facets of programming are constant across languages and environments. Get good at problem solving, writing unit tests, refactoring and planning a project, and you'll do well no matter what set of technologies you're working with.
Why not use your C++ expertise when working with Perl (where appropriate)? It's quite possible to extend Perl with C and C++. I'm not suggesting that you write all your code in C++ just because you know it and then put a thin Perl layer on top, of course.
Being experienced in related technologies is a really big advantage, not a mis-qualification for a technical job. I would suggest you take the opportunity to learn a new technology. Going back from "experienced but rusty" to "on top of it" in your bread-and-butter discipline should be a piece of cake if you have to at a later point in time.

Is C++ a "waste of time"? [closed]

Closed 9 years ago.
I ran into this supposed interview of Bjarne Stroustrup, the inventor of C++.
http://artlung.com/smorgasborg/Invention_of_Cplusplus.shtml
Stroustrup: Well, it's been long enough, now, and I believe most people have figured out for themselves that C++ is a waste of time but, I must say, it's taken them a lot longer than I thought it would...
Interviewer: Yes, but C++ is basically a sound language.
Stroustrup: You really believe that, don't you? Have you ever sat down and worked on a C++ project? Here's what happens: First, I've put in enough pitfalls to make sure that only the most trivial projects will work first time. Take operator overloading. At the end of the project, almost every module has it, usually, because guys feel they really should do it, as it was in their training course. The same operator then means something totally different in every module. Try pulling that lot together, when you have a hundred or so modules. And as for data hiding, God, I sometimes can't help laughing when I hear about the problems companies have making their modules talk to each other.
Is this a hoax? Do any of these points seem true for any of the veteran C++ programmers out there?
You just have to check Stroustrup's website (the FAQ part) to find that it's wrong - a well-known hoax, as Judah Himango already pointed out:
Did you really give an interview to IEEE in which you confessed that C++ was deliberately created as an awful language for writing unmaintainable code to increase programmers' salaries? Of course not. Read the real IEEE interview.
It's a well-known hoax.
And no, learning C++ isn't a waste of your time, something that's been discussed on StackOverflow many times.
As mentioned, this is a well-known hoax.
But it does provoke some interesting points. These days C++ is a waste of time, except for when you can't afford to waste time. Less opaquely: C++ is a waste of development time, except for when you can't afford to waste execution time.
From the article titled "The Real Stroustrup Interview" in IEEE Computer Magazine Vol. 31 Issue 6 pp.110-114 (June 1998):
For the past few months, a hoax interview between Stroustrup and Computer has been making the rounds in cyberspace. While we regret the incident, it offers us a welcome opportunity to have the father of C++ share his insights on Standard C++ and software development in general. We can also attest to his continued sense of proportion and humor—he suggests that the fictitious interview would have been a much funnier parody had he written it himself.
As others mentioned, this Interview is hoax.
Well, I am one of the people who hate C++ and normally don't use it, but learning it was definitely not a waste of time. At least now I know why I hate C++, and I understand why other people use this language and think it is good.
If you want to learn this language to know about its concepts, its benefits and its drawbacks, to be able to read code written in it, and in general to be able to "talk about" it, it is never a waste of time. The same goes for any other programming language. It will increase your experience. For example, C++ shows one common way of doing OOP - a way I don't like, but a way many other people use.
But if you want to learn it because "people say that it is the best" (as I sometimes read), then it really is a waste of time. The same goes for any other programming language.
Programmers who feel attracted to higher-level languages that take care of memory management and other tasks for them could feel that C++ is a waste of time.
It certainly is if you can achieve the same goal in another language in less time and with less bug fixing, and don't mind downsides such as efficiency.
But I don't regret having learned and spent so many hours coding in C/C++, for it's such a beautiful language and allows you to produce things that not many other languages can.
I mean, don't you want to use the language with which operating systems and compilers are written? That's not a waste of time at all, from my perspective.
C++ is far from being a waste of your time. You'll understand valuable concepts that will help you understand many other concepts in different programming languages, e.g. the vtable.
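The vtable mechanism mentioned above boils down to run-time dispatch of virtual calls; a minimal sketch with hypothetical types:

```cpp
#include <string>

// Virtual dispatch: the call s.name() is resolved at run time through
// the object's vtable pointer, not through the static type of the
// reference. This is the concept that carries over to virtual methods
// in Java, C#, and most other OO languages.
struct Shape {
    virtual std::string name() const { return "shape"; }
    virtual ~Shape() = default;
};

struct Circle : Shape {
    std::string name() const override { return "circle"; }
};

std::string describe(const Shape& s) { return s.name(); }  // picks the override
```

Passing a `Circle` to `describe` yields the derived behaviour even though the function only knows about `Shape`, which is the whole point of the vtable.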
There is not a single framework that uses all the language features of C++. This introduces a huge inconsistency into the language's ecosystem.
Qt is one of the few APIs I would call a framework (or an API for a lot of things):
But it defines its own string, its own array, ...
What's the point of a "standard" library when no one can use it in a portable and compatible way (from the standpoint of interaction with other APIs)?
I know there is Boost, but what is Boost compared to an API such as Qt? Nothing.
Look at Java: there is the standard Java API, and every "foreign" API uses it; it's all perfectly compatible.
C++ (and Java) are probably the best languages to learn to understand the concepts of OOP.
I remember learning it in college benefited me a lot.
Stroustrup is not stupid enough to say that! It is definitely a hoax!