Why is Boost.ProgramOptions not header-only? [closed] - c++

Some Boost libraries are header-only and some are not, for various reasons.
Is there a specific reason/design decision why Boost.ProgramOptions is not header-only?
I'm wondering because it claims to be a "small" library in its documentation, and I don't see any system-related reason (like threads or asio).

Program Options claims to be small, but it turns out to be the second largest library we were building, after Regex. (It is bigger than the Boost Filesystem and Thread libraries.) I believe you should be glad they're building a library for it instead of choking your project with a ton of included headers. Perhaps the author thought it would be small when he started and forgot to update that claim when the library continued to grow and add features.

Not all C++ code can be written in headers alone without violating the one-definition rule.
For example, the storage for a non-inline static data member of a class needs to be defined in exactly one translation unit (although C++17's inline variables now obviate that for data members).
The original intention was for Boost to be header-only, but they had to quickly relinquish that aspiration.
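A minimal sketch of that rule (the class and file names are illustrative, not from Boost): a non-inline static data member can be declared in a header that many translation units include, but its storage must be defined in exactly one .cpp file.

```cpp
// counter.h -- safe to include from many translation units
#ifndef COUNTER_H
#define COUNTER_H

struct Counter {
    static int instances;              // declaration only; storage lives elsewhere
    // Since C++17 you could write instead:
    //   inline static int instances = 0;   // definition allowed in the header
};

#endif

// counter.cpp -- the single translation unit that provides the storage
#include "counter.h"
int Counter::instances = 0;            // exactly one definition in the whole program
```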

Related

Should I use the Guidelines Support Library (GSL) in a new C++ project? [closed]

What are the pros and cons in favor of and against using the Guidelines Support Library (GSL) in a new C++ project? I find some constructs there very attractive but am a bit scared of including and relying on such a fundamental library.
The GSL is just a support library for the C++ Core Guidelines. If you are using the GSL, then those Core Guidelines should be the guidelines you apply to your code (not Google's or any other found online). You don't need the GSL to follow the Core Guidelines, nor do you need to use everything in the GSL. Personally, I have started using it for simple bits like index and not_null.
The GSL is not perfect and there are many things that could or should be added; it doesn't prevent me from doing crazy things, but it helps add a framework and some kind of verification to what I'm doing. It also removes the signed/unsigned issues with index.
I would advise using it in a new project, as its run-time overhead should be zero, but it's a matter of taste. If your project has lots of new developers (or toddlers), then it's worth considering as something to help them grow up.
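As a hedged illustration (assuming the Microsoft GSL implementation, which provides gsl::not_null, gsl::index and gsl::at), this is roughly what those "simple bits" buy you:

```cpp
#include <gsl/gsl>      // Microsoft GSL umbrella header
#include <iostream>
#include <vector>

// not_null documents -- and checks at the call boundary -- that the
// pointer can never be null inside this function.
void print_all(gsl::not_null<const std::vector<int>*> values) {
    const std::vector<int>& v = *values;
    // gsl::index is a signed integer type meant for indexing, which avoids
    // the usual signed/unsigned comparison warnings in loops.
    for (gsl::index i = 0; i < static_cast<gsl::index>(v.size()); ++i) {
        std::cout << gsl::at(v, i) << '\n';   // gsl::at is bounds-checked access
    }
}

int main() {
    std::vector<int> v{1, 2, 3};
    print_all(&v);   // passing a null pointer here would be rejected
}
```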

How to add c++ experimental library to mac compiler? [closed]

I want to add the experimental library to macOS.
http://en.cppreference.com/w/cpp/experimental
How can I add this?
There is no "experimental" library.
All of the C++ technical specifications that add to the standard library put their definitions into the std::experimental namespace. So std::experimental::future comes from the Concurrency TS. That TS effectively defines a few new functions in std::future, but it does so essentially by creating a new type in a new namespace with the old functions plus a few new ones. Should the TS be incorporated into the standard proper, those features will be added directly to std::future.
These technical specifications are effectively optional features that your standard library implementation may or may not support. If it does not support them, you may find libraries that provide the TS's functionality. For example, the FileSystem TS was based on Boost.Filesystem.
But there is no one thing you can download which will ensure that you will have all of the stuff in std::experimental.
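As a sketch of what that means in practice (assuming a reasonably recent Clang or GCC; on older toolchains the TS filesystem library also needs an extra link flag such as -lstdc++fs for libstdc++ or -lc++experimental for libc++), you typically probe for the header your standard library actually ships:

```cpp
// Probe for the header the standard library actually provides and alias it.
#if __has_include(<filesystem>)
  #include <filesystem>
  namespace fs = std::filesystem;               // standardised in C++17
#elif __has_include(<experimental/filesystem>)
  #include <experimental/filesystem>
  namespace fs = std::experimental::filesystem; // Filesystem TS version
#else
  #error "This standard library ships no filesystem support"
#endif

#include <iostream>

int main() {
    std::cout << "Current path: " << fs::current_path() << '\n';
}
```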

What advantage do we get in using xml as a database in Embedded systems? [closed]

I have seen recently that people use XML files as a database to store settings. However, I don't know exactly why this is done. I am from a C/C++, Linux background, so please help me understand this concept. A simple C/C++ example would help me understand its benefits better.
XML is a very common format with tons of libraries to handle it. Although it isn't the most beautiful format in the world, it is possible to read and modify it both by hand and by program. You probably want to use it when the program configuration is modified by a GUI or tool. If you intend manual configuration, it's probably better to choose something else, for example INI. This is why Linux tools rarely use XML, by the way.
As a C++ programmer, you'd probably find the boost::property_tree library interesting for dealing with configs. Usage examples are included in its documentation. It also provides plenty of different backends for storing configuration (XML, JSON, INI, INFO), so you don't have to stick to any one format.
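A minimal sketch of reading one setting from an XML file with boost::property_tree (the file name and key path are made up for illustration):

```cpp
#include <boost/property_tree/ptree.hpp>
#include <boost/property_tree/xml_parser.hpp>
#include <iostream>

int main() {
    boost::property_tree::ptree config;
    boost::property_tree::read_xml("settings.xml", config);

    // For a file like:
    //   <settings><device><baudrate>115200</baudrate></device></settings>
    int baudrate = config.get<int>("settings.device.baudrate", 9600); // 9600 if the key is missing
    std::cout << "baudrate = " << baudrate << '\n';
}
```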

Compilation process for C++ application? [closed]

I have been programming for 2 years now and always have faced difficulty when dealing with the compilation process.
I did not study Computer Science during my Engineering, but necessity drove me towards learning C++.
I tried understanding the compilation process from some blogs, but they were always in a language I could not understand.
So I searched this site for a similar question, but could find none.
So I would like to know: how is the text in a .cpp file converted into a binary executable?
Basically, the preprocessor runs first, resolving all your #includes, #defines, etc. with simple text substitution. Then the compiler turns each .cpp file (together with everything it includes) into an object file, boiling pretty much everything down to machine code except for the "connections", or linkages, between shared data and functions. Many levels of optimisation for speed and/or space may be performed along the way. This is repeated for all your .cpp files. Finally, a link phase ties all these object files and the libraries they use together into an executable.
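A small two-file sketch of those stages (file names and commands are illustrative, using g++ on the command line; clang++ accepts the same flags):

```cpp
// greet.cpp -- compiled on its own into the object file greet.o
#include <iostream>
#include <string>

void greet(const std::string& name) {
    std::cout << "Hello, " << name << '\n';
}

// main.cpp -- only sees a declaration; the call is resolved by the linker
#include <string>

void greet(const std::string& name);   // declaration; the definition lives in greet.cpp

int main() {
    greet("world");
}

// Build steps (illustrative):
//   g++ -E main.cpp  -o main.ii    // 1. preprocess: expand #include / #define
//   g++ -c main.ii   -o main.o     // 2. compile + assemble into an object file
//   g++ -c greet.cpp -o greet.o    //    (same for every other .cpp)
//   g++ main.o greet.o -o app      // 3. link the object files and libraries into an executable
```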

boost vs POCO as for learning curve and suitability for beginners (HTTP client) [closed]

Which library would you advise me to use? I don't know either of these libraries.
I heard that Boost is very often used, but also that it's hard to code in.
So, to make this question as objective as possible:
purely from the perspective of a beginner programmer (I've written ~1000 LOC of C++ in my life),
which library would be better to learn?
I'll be using it mainly as an HTTP client.
The answer is bound to be subjective, but with particular emphasis on "for a beginner", I think POCO is clearly the way to go. It actually has HTTP client classes, and once you get beyond the point of simply being happy that something works, the code is clear enough to follow that you can dig in and understand why it works, if that is where things lead you.
POCO is well-written OOP code and does not require much in the way of understanding templates and such. The classes are well integrated with one another and extensive, and the documentation more or less points you to the next (or previous) class that you need. You won't be dashing around 20 separate libraries as Boost is likely to have you doing. (There is always time for that later!)
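To give a feel for that, here is a hedged sketch of a plain HTTP GET with POCO's client classes (the host and path are placeholders, error handling is omitted, and you would typically link against PocoNet and PocoFoundation):

```cpp
#include <Poco/Net/HTTPClientSession.h>
#include <Poco/Net/HTTPMessage.h>
#include <Poco/Net/HTTPRequest.h>
#include <Poco/Net/HTTPResponse.h>
#include <Poco/StreamCopier.h>
#include <iostream>

int main() {
    Poco::Net::HTTPClientSession session("www.example.com", 80);

    Poco::Net::HTTPRequest request(Poco::Net::HTTPRequest::HTTP_GET, "/",
                                   Poco::Net::HTTPMessage::HTTP_1_1);
    session.sendRequest(request);

    Poco::Net::HTTPResponse response;
    std::istream& body = session.receiveResponse(response);

    std::cout << response.getStatus() << " " << response.getReason() << '\n';
    Poco::StreamCopier::copyStream(body, std::cout);   // print the response body
}
```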