C++ Better for Performance than C? [closed] - c++

As it currently stands, this question is not a good fit for our Q&A format. We expect answers to be supported by facts, references, or expertise, but this question will likely solicit debate, arguments, polling, or extended discussion. If you feel that this question can be improved and possibly reopened, visit the help center for guidance.
Closed 9 years ago.
I'm currently working on some high-performance, stability-critical frameworks, which will likely deploy on x86_64, ia64, and potentially ARM platforms. The library is so far done in C. The current standard is C99, though we are interested in experimenting with features of C11.
Initially, the choice to avoid C++ was because we wanted to prevent developers from using classes due to their inherent inefficiencies, such as larger memory footprints, vtables, and inheritance. We also wanted to keep structs free of member functions. In other words, C was chosen over C++ deliberately to prevent the use of certain C++ features.
However, we recently did a double-take after further investigating some of C++'s features. It certainly seems to have some benefits, mainly type safety and generics.
What I would like to know is:
1- What, exactly, does this type-safety mean for the programmer and compiler?
2- What are the benefits of C++'s type safety, and how can we avoid the pitfalls of unsafe typing with C?

1- What, exactly, does this type-safety mean for the programmer and compiler?
Type safety protects you from debugging silly mistakes, such as adding degrees and radians together or trying to multiply a "string" by an integer. I wouldn't worry about the effects on the compiler. Having programmed in both type-safe languages (C++) and non-type-safe languages (Perl, C), I would say that I normally spend less time debugging "computer internal" things in the type-safe languages (again, adding strings and integers) but more time chasing type values and definitions and converting between them.
2- What are the benefits of C++'s type safety, and how can we avoid the pitfalls of unsafe typing with C?
Type safety is a level of protection that allows the compiler to check that what you are doing is sane. For an individual this is less important than in a group setting, because while you know that your "GetNumberOfStudents" function outputs a string instead of an integer, your co-workers may not. The bigger advantage of C++ over C is that you can separate the way you store your data from the way you retrieve it, so that "GetListOfAllCustomers" won't change for the people using the function if you decide to change your internal data structures.
Short answer: If you're willing to trade developer time and hardware comprehension time for performance and compactness, I would lean towards C. If you're willing to trade a small amount of performance and aren't memory bound, to lessen developer time, I would lean towards C++. I program in C# for all my data analysis and C for all my embedded software work.

Templates in C++ can make it practical to write code that yields better performance than the same code in C. For example, you can do things like generate unrolled loops tuned at compile time to match your problem size and your target's cache size.
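As a minimal sketch of that idea (the Unroll template and add4 are invented names for illustration, not a real library): the recursion depth is a compile-time constant, so the compiler can flatten the calls into straight-line code with no loop counter at all.

```cpp
#include <cstddef>

// Recursive template: each instantiation handles one element, and the
// whole chain is known at compile time, so it can be fully inlined.
template <std::size_t N>
struct Unroll {
    static void apply(float* dst, const float* src) {
        dst[N - 1] += src[N - 1];
        Unroll<N - 1>::apply(dst, src);
    }
};

// Base case terminates the recursion.
template <>
struct Unroll<0> {
    static void apply(float*, const float*) {}
};

// add4 expands into four adds with no runtime loop overhead.
void add4(float* dst, const float* src) {
    Unroll<4>::apply(dst, src);
}
```

The unroll factor could just as easily be chosen to match a cache line or SIMD width; in C the equivalent usually means hand-written duplication or macros.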
Pointer arithmetic and casts in C can potentially lead to buffer over-runs, dangling references and memory leaks. C++ features like smart pointers and container classes can greatly reduce the incidence of these kinds of bugs.
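A small sketch of the point (Widget and use_widget are illustrative names): ownership and bounds checking are handled by the library types, so the leak and over-run opportunities simply don't arise.

```cpp
#include <memory>
#include <vector>

struct Widget { int value = 42; };

int use_widget() {
    // unique_ptr frees the Widget automatically, even on early return.
    auto w = std::make_unique<Widget>();
    // vector owns its buffer; at() checks bounds and throws on overflow.
    std::vector<int> data(10, 0);
    data.at(3) = w->value;
    return data.at(3);
}   // no free() or delete needed anywhere
```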
Powerful idioms like RAII are directly supported by C++ language features, and make it much easier to do things like write multi-threaded concurrency correctly without introducing race conditions or deadlocks.
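For example, a scoped lock is the classic RAII sketch: the constructor acquires the mutex and the destructor releases it, so every exit path (including exceptions) unlocks correctly without any explicit cleanup code.

```cpp
#include <mutex>

std::mutex m;
long counter = 0;

void increment() {
    std::lock_guard<std::mutex> lock(m);  // acquired here
    ++counter;
}                                         // released automatically here
```

In C the equivalent needs a matching unlock call on every path out of the function, which is exactly where deadlock bugs creep in.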
Rich types that represent critical data attributes like unit values and dimensions enable the compiler to catch unit conversion errors at compile time. It is possible to do checked conversions with structs and typedefs in C, but a typical C implementation would catch many unit errors only at run time, if at all, whereas a C++ implementation can be both safer and faster than the equivalent functionality written in C.
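A minimal sketch of such unit types (Degrees and Radians here are invented wrappers, not a library): mixing them is a compile error, yet each struct is just a bare double at run time, so the checking costs nothing.

```cpp
// Distinct types for distinct units; no implicit mixing allowed.
struct Degrees { double value; };
struct Radians { double value; };

// Only like units may be added.
Degrees operator+(Degrees a, Degrees b) { return {a.value + b.value}; }

// Degrees d{90}; Radians r{1.57};
// auto bad = d + r;   // compile error: no operator+(Degrees, Radians)
```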
These are just some simple examples, but there is much, much more to say about the pitfalls and limitations of C, and the ways that C++ language features may be used to write code that is fast, correct and resilient to change.

Thanks to everyone's contributions here, our team decided to leave the code in C.
To answer the questions posed, here is what I found:
1- Type safety means that type compatibility is checked at compile time. In C, variables are mostly type safe. The reason C is not regarded as type safe is generally that va_list, which is used in many common operations in C, especially stdio, is not type safe. Another reason C has a reputation for being unsafe is that it allows implicit conversions. To the programmer, type safety means catching type mistakes at compile time. To the compiler, it means checking type assignments and implicit conversions at compile time, and reacting more strictly than in a non-type-safe scenario. As far as I could find, it does not make any real difference in the compiled binary.
2- C++'s type safety mainly serves to catch invalid implicit conversions at compile time, hopefully making the programmer's life easier. However, by compiling with gcc and -Wconversion (we usually use -Wall) we get this feedback in the form of warnings rather than failures to build, so the benefits are relatively small as long as we pay close attention to our compiler output.
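A minimal illustration of the kind of conversion in question (the lossy function is invented for this example): both C and C++ accept it silently by default, and gcc only reports it when -Wconversion is enabled.

```cpp
int lossy() {
    double ratio = 0.75;
    int truncated = ratio;  // gcc -Wconversion warns here; plain -Wall does not
    return truncated;       // the fractional part is silently discarded
}
```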
With C, unsafe typing issues can be virtually eliminated by good coding practice and review.

Related

Why c++ is not used in L2/L3/L4 development projects [closed]

Closed 10 years ago.
I have been involved in numerous C++ projects, mainly in the application domain pertaining to VoIP protocols. Now I have to move to L3/L2 protocol development projects, where I found that C is the preferred language of choice for L2/L3/L4 developers.
Now I am wondering: except for device-firmware-related applications, why are protocols developed using a stone-age-era language? Why don't people take advantage of OOP techniques? Would it be prudent for me to try to convince them to switch to C++? Most of the developers working on the team are C experts and not comfortable with C++.
There are several reasons for continuing using C.
There are existing C projects. Who will pay for converting them into C++?
C++ compiler (of a good quality) is not available on every platform.
Psychological reasons. When you pass and return objects by value, temporary objects are created left and right. This is not OK for small systems. People do not really understand that passing and returning references completely solves this problem. There are other similar issues.
And finally. What is wrong with C? It works! (Do not fix what is not broken).
It is possible to write code in C++ that is just as performant as C, but this requires better understanding, training, and code-control discipline. The common perception is that these flaws are unavoidable.
If you think of C as simply a "stone-age language," then I think you misunderstand why people continue to use it. I like and use both C and C++. I like them both for different reasons, and for different kinds of problems.
The C language presents a model of the computer that is both (mostly) complete and very easy to understand, with very few surprises. C++ presents a very complex model, and requires the programmer to understand a lot of nuance to avoid nasty surprises. The C++ compiler does a lot of stuff automatically (calling constructors, destructors, stack unwinding, etc.). This is usually nice, but sometimes it interferes with tracking down bugs. In general, I find that it's very easy to shoot yourself in the foot with both C and C++, but I find the resulting foot-surgery is much easier to do in C, simply because it's a simpler language model.
The C model of a computer is about as close to assembly as you can get while still being reasonably portable. The language does almost nothing automatically, and lets you do all kinds of crazy memory manipulations. This allows for unsafe programming, but it also allows for very optimized programming in an environment with very few surprises. It's very easy to tell exactly what a line of code does in C. That is not true in C++, where the compiler can create and destroy temporary objects for you. I've had C++ code where it took profiling to reveal that automatic destructors were eating a ton of cycles. This never happens in C, where a line of code has very few surprises. This is less of an issue today than it was in the past; C++ compilers have gotten a lot better at optimizing many of their temporaries away. It can still be an issue, though, especially in an embedded environment where memory (including stack space) is often tight.
Finally, code written in C++ often compiles slowly. The culprits are usually templates, but eliminating templates often makes your C++ code look a lot like C. And, I really cannot overstate how much this can affect productivity. It kills productivity when your debug-fix-recompile-test cycle is limited by the compilation time. Yes, I know and love pre-compiled headers, but they only do so much.
Don't get the impression that I'm anti-C++ here. I like and use the language. It's nice to have classes, smart pointers, std::vector, std::string, etc. But there's a reason that C is alive and kicking.
For a different perspective, and one that is firmly anti-C++, you should at least skim Linus Torvalds' take on C++. His arguments are worth thinking about, even if you disagree with them.

Comparative advantages between C vs C++ for new projects [closed]

Closed 12 years ago.
For each new low-level program or library I write on POSIX systems, I always have to start out with the initial decision: do I write it in vanilla C, or do I go with C++? I like to think that each time I'm making a relatively informed decision, but I wonder if there's something I'm missing.
This isn't a "which is better" question, but rather: what aspects of each are better? Presumably, each has compelling strengths. In which cases should I choose one instead of the other?
For example, below are some of the points I consider. What else am I missing?
Favoring C
Compatibility: Virtually every language and framework has some mechanism for interfacing with code written in C.
Simplicity: Debugging template code makes you age faster
Popularity: Think of all your favorite applications, servers, interpreters, and other tools. Chances are most of them are written in C, even though C++ was available when they started. All the cool kids use C.
Favoring C++
The STL: You certainly could implement your own red-black tree, quicksort algorithm, or doubly-linked list. But it probably won't be as good.
Templates: Sure, it's a pre-processor function masquerading as a language feature, but it sure is convenient.
Classes: C++ isn't exactly smalltalk, but at least it's not fancy assembly language either.
Compatibility: You can still use C in a C++ project.
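The STL point above can be made concrete with a small sketch (the sorted helper is an invented name): instead of hand-rolling quicksort over a raw array, std::sort gives you an algorithm tuned and debugged by the library implementers.

```cpp
#include <algorithm>
#include <vector>

// Returns a sorted copy; std::sort is typically an introsort, which
// hand-written quicksort rarely beats in robustness or speed.
std::vector<int> sorted(std::vector<int> v) {
    std::sort(v.begin(), v.end());
    return v;
}
```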
I think you're making it more complicated than it really is. Which language are you better at expressing your idea in? If neither, and you're a beginner at both, use C; if you're good at both, pick whichever you feel like. Otherwise it doesn't matter nearly as much as just starting.
Alice: Would you tell me, please, which way I ought to go from here?
The Cat: That depends a good deal on where you want to get to.
Alice: I don't much care where.
The Cat: Then it doesn't much matter which way you go.
Alice: so long as I get somewhere.
The Cat: Oh, you're sure to do that, if only you walk long enough.
C++ simply has many more features than C. That makes it a more complex language. But the benefit of using these features is that you will have to write (and maintain) less code.
You're not required to use templates, the STL, exceptions, function overloading, or any other C++ feature. But if your problem needs just one of these features, your program will be more readable if you do it in C++ rather than emulating the missing functionality in C.
You forgot to mention that in C++ there are destructors that are called automatically, so when they are used correctly (RAII) you don't need to worry about resource deallocation. Another good feature is exceptions, which can make error handling easier and more maintainable.
For myself, there are only two reasons to use C. The first is if you need the code to be extremely portable (going to be used as a library in different languages and/or operating systems), and the second is if you need raw speed, which usually isn't a big deal, as C++ typically performs only slightly slower than C (not counting OO features).
I really enjoy the OO features of C++, which if used properly can make life a lot easier when developing applications.
It sounds like you favor C over C++. I do too. However, ease of use is the most important factor in programming. C++ has better string support and more libraries, so for non-trivial projects, such as database access and stuff like that, go with C++. If you are aiming to be cross platform and maybe want to work on a lower level, use C. Besides, they're both the same anyway.
C++ is better in almost every way: safer, more efficient, works better in large projects... The only exception is that you can't use it when you interface with other languages. But in that case you still use C++ and add a small C layer for the interfacing part.
C has some advantages above C++ in the early phase of a project, it's simpler, easier and requires less design decisions. However, as the project grows so do the advantages of C++ and Object Oriented Code, which are essentially: Encapsulation, Abstraction and Information Hiding. The drawback is usually slightly slower code unless you break encapsulation.
Yes, it's possible to write C++-style code in C, too, but it is far more complex and a hell to maintain.
When I have a choice, I go with a subset of C++.
compatibility - not a problem, you can use extern "C" for linking with C libraries
simplicity - avoid templates, operator overloading, and other C++ features that obfuscate code
You still get the advantages of classes and RAII.
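The compatibility point above can be sketched in a couple of lines (add_ints is an invented example function): extern "C" disables name mangling, so the function can be called from C code or listed in a plain C header.

```cpp
// Exported with the unmangled C symbol name "add_ints", so a C program
// (or any language with a C FFI) can link against it directly.
extern "C" int add_ints(int a, int b) {
    return a + b;
}
```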

Is it worth writing part of code in C instead of C++ as micro-optimization?

I am wondering if it is still worth with modern compilers and their optimizations to write some critical code in C instead of C++ to make it faster.
I know C++ might lead to bad performance in case classes are copied while they could be passed by reference or when classes are created automatically by the compiler, typically with overloaded operators and many other similar cases; but for a good C++ developer who knows how to avoid all of this, is it still worth writing code in C to improve performance?
I'm going to agree with a lot of the comments. C syntax is supported, intentionally (with divergence only in C99), in C++. Therefore all C++ compilers have to support it. In fact I think it's hard to find any dedicated C compilers anymore. For example, in GCC you'll actually end up using the same optimization/compilation engine regardless of whether the code is C or C++.
The real question is then: does writing plain C code and compiling it as C++ incur a performance penalty? The answer is, for all intents and purposes, no. There are a few tricky points about exceptions and RTTI, but those are mainly size changes, not speed changes. You'd be so hard pressed to find an example that actually takes a performance hit that it doesn't seem worth it to write a dedicated module.
What was said about what features you use is important. It is very easy in C++ to get sloppy about copy semantics and suffer huge overheads from copying memory. In my experience this is the biggest cost -- in C you can also suffer this cost, but not as easily I'd say.
Virtual function calls are ever so slightly more expensive than normal functions. At the same time forced inline functions are cheaper than normal function calls. In both cases it is likely the cost of pushing/popping parameters from the stack that is more expensive. Worrying about function call overhead though should come quite late in the optimization process -- as it is rarely a significant problem.
Exceptions are costly at throw time (in GCC at least). But setting up catch statements and using RAII doesn't have a significant cost associated with it. This was by design in the GCC compiler (and others) so that truly only the exceptional cases are costly.
But to summarize: a good C++ programmer would not be able to make their code run faster simply by writing it in C.
measure! measure before thinking about optimizing, measure before applying optimization, measure after applying optimization, measure!
If you must run your code 1 nanosecond faster (because it's going to be used by 1000 people, 1000 times in the next 1000 days and that second is very important) anything goes.
Yes! it is worth ...
changing languages (C++ to C; Python to COBOL; Matlab to Fortran; PHP to Lisp)
tweaking the compiler (enable/disable all the -f options)
use different libraries (even write your own)
etc
etc
What you must not forget is to measure!
pmg nailed it. Just measure instead of making global assumptions. Also think of it this way: compilers like gcc separate the front, middle, and back ends, so the front ends for Fortran, C, C++, Ada, etc. all feed into the same internal middle language, which is where most of the optimization happens. That generic middle language is then turned into assembler for the specific target, with target-specific optimizations applied. So a language may induce more code from front to middle when the languages differ greatly, but for C/C++ I would assume it is the same or very similar. Binary size is another story: the libraries that get pulled into the binary for C only vs. C++ (even if it is only C syntax) can and will vary. That doesn't necessarily affect execution performance, but it can bulk up the program file, costing storage and transfer differences as well as extra memory if the program is loaded as a whole into RAM. Here again, just measure.
To the "measure" comment I would also add: compile to assembler and/or disassemble the output, and compare the results of your different language/compiler choices. This can supplement the timing differences you see when you measure.
The question has been answered to death, so I won't add to that.
Simply as a generic question, assuming you have measured, etc, and you have identified that a certain C++ (or other) code segment is not running at optimal speed (which generally means you have not used the right tool for the job); and you know you can get better performance by writing it in C, then yes, definitely, it is worth it.
There is a certain common mindset of trying to do everything with one tool (Java or SQL or C++). It's not just Maslow's hammer, but the actual belief that one can code a C construct in Java, etc. This leads to all kinds of performance problems. Architecture, as a true profession, is about placing code segments in the appropriate architectural location or platform. It is the correct combination of Java, SQL, and C that will deliver performance, and that produces an app that does not need to be revisited: uneventful execution. In which case, it will not matter if or when C++ implements this constructor or that.
I am wondering if it is still worth with modern compilers and their optimizations to write some critical code in C instead of C++ to make it faster.
no. keep it readable. if your team prefers c++ or c, prefer that - especially if it is already functioning in production code (don't rewrite it without very good reasons).
I know C++ might lead to bad performance in case classes are copied while they could be passed by reference
then forbid copying and assigning
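a minimal sketch of what "forbid copying and assigning" looks like (BigBuffer and payload_size are made-up names for illustration): deleting the copy operations turns an accidental pass-by-value into a compile error at the call site.

```cpp
#include <cstddef>

struct BigBuffer {
    BigBuffer() = default;
    BigBuffer(const BigBuffer&) = delete;             // no copying
    BigBuffer& operator=(const BigBuffer&) = delete;  // no assigning
    char data[4096] = {};
};

// must take a reference; a by-value parameter would fail to compile
std::size_t payload_size(const BigBuffer& b) { return sizeof(b.data); }
```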
or when classes are created automatically by the compiler, typically with overloaded operators and many other similar cases
could you elaborate? if you are referring to templates, they don't have additional cost in runtime (although they can lead to additional exported symbols, resulting in a larger binary). in fact, using a template method can improve performance if (for example) a conversion would otherwise be necessary.
but for a good C++ developer who knows how to avoid all of this, is it still worth writing code in C to improve performance?
in my experience, an expert c++ developer can create a faster, more maintainable program.
you have to be selective about the language features that you use (and do not use). if you break c++ features down to the set available in c (e.g., remove exceptions, virtual function calls, rtti) then you're off to a good start. if you learn to use templates, metaprogramming, optimization techniques, avoid type aliasing (which becomes increasingly difficult or verbose in c), etc. then you should be on par or faster than c - with a program which is more easily maintained (since you are familiar with c++).
if you're comfortable using the features of c++, use c++. it has plenty of features (many of which have been added with speed/cost in mind), and can be written to be as fast as c (or faster).
with templates and metaprogramming, you could turn many runtime variables into compile-time constants for exceptional gains. sometimes that goes well into micro-optimization territory.
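a small illustration of that point (the factorial function is just an example, not from the question): constexpr lets the compiler fold the entire computation into a constant, so nothing is computed at run time.

```cpp
// Evaluated by the compiler when the argument is a compile-time constant.
constexpr long factorial(int n) {
    return n <= 1 ? 1 : n * factorial(n - 1);
}

// Proof that it happened at compile time: static_assert fails the build,
// not the run, if the value is wrong.
static_assert(factorial(10) == 3628800, "computed at compile time");
```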

Situations where "old" C features might be better than newer C++ ones? [closed]

Closed 12 years ago.
Recently I had a discussion with my boss (a long-time C developer) who discouraged me from using C++ streams and encouraged me to stick to "good old" printf & friends. Now, I can understand why he said this, and believe me, I did not follow his advice.
But this is still bugging me: are there things in C that are still better in some cases than the newer C++ implementations of the same/similar thing? By "better" I mean, for example, performance, stability, or even code readability/maintainability. And if so, can someone give me examples? I'm mainly talking about similar differences like printf vs. streams, not about features like inheritance or OOP for that matter. The reason I'm asking all this is that I consider myself a C++ developer, and as such I always try to code the C++ way.
C printf()-style output is typically faster than C++ ostream output. But of course it can't handle all the types that C++ output can. That's the only advantage I'm aware of - typically, because of aggressive inlining, C++ can be a lot faster than C.
There is one thing that C programmers sometimes point out and that is worth considering: If you stay away from macros, then it's mostly obvious what a line of C code does. Take for example this:
x = y;
In C, this is an assignment and only an assignment. The value of y is (after a possible conversion) copied into x.
In C++ this could literally mean anything.
A simple assignment,
a user-defined conversion operator on y which deletes the internet and returns a value of the same type as x,
a constructor which makes an object of x's type from y, after melting down a nuclear power plant, whose result is assigned to x,
a user-defined assignment operator which allows assignment from a bunch of other types, for which y has a conversion operator or which are in some other way obtainable from y. The assignment operator has a bug which might create a black hole, because it's part of the LHC operation software.
Or several of the above.
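A deliberately tame version of the point (the Celsius/Fahrenheit types are invented for illustration): in C++, a line that looks exactly like x = y can run a user-defined assignment operator with arbitrary logic inside.

```cpp
struct Celsius { double t; };

struct Fahrenheit {
    double t;
    // This runs on a plain-looking "f = c;" and does a unit conversion,
    // not a memberwise copy.
    Fahrenheit& operator=(Celsius c) {
        t = c.t * 9.0 / 5.0 + 32.0;
        return *this;
    }
};
```

Nothing at the assignment site hints that a conversion (or, in the scenarios above, anything else) is happening.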
To make it even more interesting, every single operation might throw an exception in C++, which means that every line must be written so that it can roll back what it changed, which is sometimes hard when you can't say what a line actually does. And to make it worse, your program might crash instantly, because the assignment is called during an exception unwind and throws a second exception. In C++ things tend to become "vertically complex", which poses its own requirements on the capabilities and the communication skills of the developers.
When you're writing C++, write C++. When you're writing C, write C. Whoever says different is probably uncomfortable with the differences, or thinks of C++ as a "better C". That isn't the case; C++ is its own language with its own features, and is mostly C-compatible for the sole purpose of easing conversion.
As far as performance goes, I used to be a USACO competitor. I quickly found that 98% of one of my programs' runtime was spent in C++ IOStreams. Switching to fscanf reduced that overhead by a factor of ten. Performance-wise, there's no contest at all.
I think the C style is better when you need raw memory management. It is a bit cumbersome to do that with C++ constructs, and you don't have an equivalent of realloc(), for example.
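To make the realloc() point concrete (the grow helper is an invented name): realloc can resize a buffer, potentially in place, which has no direct equivalent among the standard containers (a vector that outgrows its capacity allocates fresh storage and copies).

```cpp
#include <cstddef>
#include <cstdlib>

// Resizes a malloc'd int buffer; may extend it in place or move it.
// Caller keeps ownership and must eventually free() the result.
int* grow(int* buf, std::size_t new_count) {
    return static_cast<int*>(std::realloc(buf, new_count * sizeof(int)));
}
```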
Whoever downvoted that probably never tried to explore the topic.
I'm surprised that people can't imagine themselves in different positions. I'm not saying that everybody should use C-style constructs. I'm saying that the C style is better when you NEED raw memory management. Someone has to write all those safe classes/libraries (including the standard library, garbage collectors, and memory pools). Your experience, in which you never need it, does not cover all cases.
Another situation is when you write a library. With C you get a clean symbol table, which can easily be bound to many other programming languages. With C++ you get name mangling, which makes the library harder (but not impossible) to use in a non-C++ environment.
I couldn't give you a conclusive answer; however, I found this rather dated comparison interesting.
http://unthought.net/c++/c_vs_c++.html
I don't think using printf-style functions over iostreams is generally justified.
iostreams greatly speed up development and debugging time, and are much less error prone (e.g., think of buffer overflows, wrong % type specifiers, wrong numbers of arguments... and the biggest problem is that the compiler can't help you at all).
And if you don't use endl when it isn't needed, cout isn't that much slower than printf.
So generally you should go with C++ iostreams, and only if profiling shows that critical sections take too much time because of iostream calls should you optimize those sections with C-style functions, making sure to use the safer versions, like snprintf instead of sprintf.
Examples:
Consider that you have an int variable foo, which you printf in a number of places. Later during development, you realize you need foo to be a double instead. Now you have to change the type specifier in every printf-style call which uses foo, and if you miss a single line, welcome to the land of undefined behaviour.
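That failure mode can be sketched in a few lines (the show helper is an invented name): with operator<< there is no format specifier to keep in sync with the variable's type, so changing foo from int to double requires no edits at the use sites.

```cpp
#include <sstream>
#include <string>

// The stream picks the correct overload from the argument's type,
// so there is no "%d" to forget to update when the type changes.
template <typename T>
std::string show(T foo) {
    std::ostringstream out;
    out << foo;
    return out.str();
}
```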
Recently I had a case where my program crashed because I missed a simple comma, and because of the great printf-style interface, my compiler didn't help me: printf("i will crash %s" /*,*/ "here");. This wouldn't have happened with iostreams.
And of course you can't extend the behaviour of printf and friends to work with your own classes like you can with iostreams.
Good old C! Ah, the pre-ANSI days... <sarcasm>I certainly miss having practically no type checking on arguments and returns values or having the compiler assume anything untyped is an int and not an error.</sarcasm>
Seriously, though - there is a fairly good argument against using exceptions as error handling. I read a fairly decent argument against exceptions for system level work and mostly I think the problem is that you can't simply read a block of code and know it won't throw in C++, whereas you can read most C and say "all the errors (at this level) are trapped" or "the ones that aren't don't matter".
Where using C++ features might be problematic:
portability: IMHO C is still more portable
mixed language programming: calling a C function from another language is almost never problematic, with C++ you quickly get in trouble because of name mangling etc.
performance issues: features like templates may lead to code bloat, temporary object creation may have a huge impact too, etc...
maintainability: Since C++ is more complex than C, restrict yourself to the language features you expect the person who will later maintain your code to be capable of handling.
However, some/most of C++ features are quite handy and useful if used with care.
Remember the saying "With C++ it's harder to shoot yourself in the knee, but if you do, it will cost you the entire leg".
I sometimes prefer pointers and memcpy over iterators and std::copy when I don't need generic code.
Same for iostreams, they are convenient and extensible but there are a lot of situations when [f|s]printf / scanf are as simple.

Does it feel anytime that c++ sometimes reduces problem solving time and increases syntactic, semantic rigor? [closed]

Closed 12 years ago.
C++ introduces OOP, templates, and a variety of other concepts. But sometimes I am stuck in an unmanageable storm of calling-convention incompatibilities between methods, convoluted casting between distantly connected classes, and struggling with code that forces me to think at the level of byte alignment.
I am very tempted to go back to the plain old C language, which is low-level enough and simple enough that I can spend most of my time solving the problem at hand, rather than having to figure out the conceptual implementation and nuances of using C++.
What is your take on such an issue? Would using the C language as the first-class citizen in my code base and coating it at the end with a C++ primer make for a better way to manage a conceptual code base?
sometimes I am stuck in an unmanageable storm of calling-convention incompatibilities between methods, convoluted casting between distantly connected classes, and struggling with code that forces me to think at the level of byte alignment.
It sounds like you might be doing something wrong. I don't know what sorts of projects you work on, but I've rarely had to deal with any of those things--certainly never in over 99.9% of the code I've written--and I've written a bit of C++ code (though not nearly as much as others here on StackOverflow).
What is your take on such an issue.
You should consider getting a good book on C++ (like one of the introductory books or any of the best practices books listed in The Definitive C++ Book Guide and List) and really learn C++ if you want to use C++.
I often feel the way you do. C++ compilers are incredibly bitchy about insignificant details and, if you considered them an object, they have appalling encapsulation, and they give horrendously bad error messages. Sometimes, programming C++ feels like fighting against the Standard.
However, I'd never, ever ditch it for C. If you're considering it, you fail C++. Yes, templates and their syntax and some of their semantics can be a bitch, but the power they offer is unparalleled. The things offered like automatic memory management and the power of the STL just can't be matched by C.
In addition, nobody is forcing you to use templates. You could write a whole C++ program using nothing but the pre-provided templates. You could never use polymorphism, or templates, or encapsulation, or object-orientation as a whole, and yes, sometimes none of those solutions are appropriate. But it's plain stupid not to give yourself the option.
And if you're casting classes in C++ (frequently), it sounds to me like whoever wrote the original code flat out didn't know what they were doing. Same for byte alignment.
I've never had to worry about byte alignment unless I was writing binary files.
Casting distantly related classes to each other is bad design.
I've never worried about calling conventions unless I was writing interrupt routines.
If you are forced to frequently "cast distantly connected classes" and "think at the level of byte alignment", there is something wrong with the architecture of the project. The language is not the problem here.
C++ is heavy. You probably should not use all the features it provides. KISS principle, you know.
You can always pretend that C++ is just "C with classes" and exploit templates and other "hard" stuff only when it will provide reasonable improvement in some areas (like code simplicity, as a matter of fact).
I recently switched back from C++ to C and took a liking to C99. But it certainly depends on what you are doing. For a library of reasonable size, C99, with its advantages over C89, is good enough, and as somebody else said, you can easily provide a C++ wrapper if necessary. C++ I would only go for on a big project with a large amount of code reuse (internal or external) via templates.
The things from C++ I missed in C89 have nothing to do with objects; they are simple things such as declaring variables inside the for statement and a well-defined inline feature. C99 has those, plus variadic macros, so I am happy with it.
You are not alone. These are all well-known flaws in C++, a language which forces developers to attend incompatible conventions, to struggle to overcome a convoluted caste system, and to take their code in for byte-realignment every 3,000 lines. Definitely switch back to C.