Where to save my C++ libraries? (Ubuntu)

I am not sure this is strictly a programming question, so I apologize if it is not.
I developed a few libraries in C++ that I mean to use in several different projects.
Until now I have kept copying the updated libraries into the folders of the various projects.
As you can imagine this is not ideal, so I would like to create a "3rd-parties" folder
where I save the libraries I write and others that I might download in the future.
How can I do this? And considering that I'll want to share/release my code later on, what is the best strategy to make sure the libraries I use are included in the code I deploy?

There are no hard and fast rules. But if these are 1) general-purpose, for 2) global sharing, then I'd suggest /usr/local/lib (for your .a and .so libraries) and /usr/local/include (for the corresponding headers).
Here's a good description of "standard file locations" for Linux:
http://tldp.org/LDP/Linux-Filesystem-Hierarchy/html/Linux-Filesystem-Hierarchy.html
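For example, a minimal sketch of installing a home-grown static library into those locations (the names myutils.cpp, myutils.h and libmyutils.a are placeholders for whatever your library is actually called):
g++ -c -O2 myutils.cpp -o myutils.o      # compile the library sources
ar rcs libmyutils.a myutils.o            # pack them into a static library
sudo cp libmyutils.a /usr/local/lib/     # install the library
sudo cp myutils.h /usr/local/include/    # install the corresponding header
After that, any project on the machine can simply be built with
g++ main.cpp -lmyutils -o myapp
since /usr/local/include and /usr/local/lib are on the default search paths of most Ubuntu toolchains.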

If you want to share your modules with new projects, it's better to organise them in a single folder and add that folder to the include and library paths of your new projects.
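For instance, with a hypothetical ~/third-party folder holding your collected headers and built libraries, a project could be compiled roughly like this (libmyutils is an example name, not something from the question):
g++ -I"$HOME/third-party/include" -L"$HOME/third-party/lib" main.cpp -lmyutils -o myapp
Here -I adds the header search path and -L the library search path.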

Related

Suggestions on how to build a library that can be used in many of my applications

In the past I have created a jar file that contained many "helper functions" that I used and made common to many different applications. I felt this was important, as whenever I used my "helper" jar file in a new application, or made changes to an existing one, the latest and most up-to-date version of my "helper" jar was always used. It was developed separately and had its own version control.
I'm looking to do something similar with C/C++
At the moment I have a collection of headers doing something similar to my "helper" jar in Java, but I find it cumbersome to manage changes and ensure the most up-to-date collection is used. For example, if I make some changes to these "helper" headers, I need to copy them into each project and rebuild.
If we take the below as an example of what I do in Java;
and the below is the structure that I'd like to do something similar with in C++;
I'd like some way of keeping my_includes separate so that any changes I make to my_includes are automatically picked up by any existing or new applications, in the way Utilities.jar is in the Java example above.
I accept that I can't build a library or such, as it won't then be as portable, right?
I suspect I'm missing something quite obvious, just not to me.
All helpful comments appreciated, thanks in advance...
First, you could create a library from your utility .cpp and include files independently, and add this library to any project. The URL below shows a sample of how to generate static and shared libraries.
Create static and shared library (GCC)
Then, to use custom include files in any C++ project, you just need to add the relevant directories to your include path at compile time, depending on your platform; if you use CMake, you can do this with "include_directories". You should also link the generated library to your project, as described in the URL above.
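As a rough sketch of what the linked page describes (file and library names here are placeholders, not taken from the question):
g++ -c -fPIC utilities.cpp -o utilities.o      # compile once, position-independent so the object can also go into a .so
ar rcs libutilities.a utilities.o              # static library
g++ -shared utilities.o -o libutilities.so     # shared library
g++ -I/path/to/my_includes main.cpp -L/path/to/libs -lutilities -o app   # consuming project: add the include path and link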

Good practices when adding downloaded c++ source code to my project

I am trying to use gnuplot++, but this is really a more general question about downloaded source code. I have downloaded the gnuplot++ source code and it consists of multiple .h and .cc files. I would like to use it in other projects in the future, so I am reluctant to add all the files into my project directory.
From what I understand gcc will look in /usr/local/include for header files, so I have put the code there for now. But what is the best way to compile and link the code?
Should I use the makefile to include the directory of the source code?
Should I keep it somewhere easy to find like /usr/local/include?
How do I know the best way to compile the code in gnuplot++?
Typically, if the project itself doesn't come with install instructions, I add it somewhere "public", e.g. /usr/local/project/{lib,include,src,...}, where "project" in this case would be gnuplot++.
In this case, there doesn't appear to be any support for building this into a library, which makes it a little more awkward, as you need the sources included in your project itself. I'd still keep those sources separate, but you may prefer to just put them into a separate directory within the project [or spend an hour or three making a library of it].
For general practice, yes, keep the source for gnuplot++ (or any other similar 3rd-party project) separate from your own application source code. This makes it much easier to manage updates to the 3rd party projects, etc.
Yes, I would use the makefile for your application to also include the path to the headers for gnuplot++, and I would not copy those files directly into /usr/local/include. Instead, I would consider a couple of options: do nothing and point the include path in your makefile to the gnuplot++ directory, or put symbolic links in /usr/local/include that point to the gnuplot++ files.
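Sketching both options, assuming the gnuplot++ sources were unpacked under /usr/local/project/gnuplot++/src as suggested above (the path is only an example):
g++ -I/usr/local/project/gnuplot++/src -c main.cpp -o main.o         # option 1: point the include path at the gnuplot++ sources
sudo ln -s /usr/local/project/gnuplot++/src/*.h /usr/local/include/   # option 2: symlink the headers instead of copying them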
As for the best way to compile gnuplot++, I would have to look at gnuplot++ myself and see what it has to say, perhaps in a README file or similar.
In general, when using third-party libraries, you build and install those libraries according to the installation description that comes with the downloaded source.
If there is no installation guideline, it is typically a set of steps like
./configure
make
make install
Then it is the responsibility of the library to ensure the relevant headers and library files are easily locatable for use in your project.
gnuplot++ is an exception here, because it does not seem to come with its own build structure.
The best advice in cases such as this is to put the source from gnuplot++ in a directory within your project (possibly parallel to your own sources) and include the files in your own build setup.
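For example, if the gnuplot++ sources are copied into a third_party/gnuplot++ directory inside your project, the build can be as simple as compiling them along with your own files (a rough one-liner; adapt it to your makefile):
g++ -Ithird_party/gnuplot++ src/*.cpp third_party/gnuplot++/*.cc -o myapp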

CMake: how best to build multiple (optional) subprojects?

Imagine an overall project with several components:
basic
io
web
app-a
app-b
app-c
Now, let's say web depends on io which depends on basic, and all those things are in one repo and have a CMakeLists.txt to build them as shared libraries.
How should I set things up so that I can build the three apps, if each of them is optional and may not be present at build time?
One idea is to have an empty "apps" directory in the main repo and we can clone whichever app repos we want into that. Our main CMakeLists.txt file can use GLOB to find all the app directories and build them (not knowing in advance how many there will be). Issues with this approach include:
Apparently CMake doesn't re-glob when you just say make, so if you add a new app you must run cmake again.
It imposes a specific structure on the person doing the build.
It's not obvious how one could make two clones of a single app and build them both separately against the same library build.
The general concept is like a traditional recursive CMake project, but where the lower-level modules don't necessarily know in advance which higher-level ones will be using them. Yet, I don't want to require the user to install the lower-level libraries in a fixed location (e.g. /usr/local/lib). I do however want a single invocation of make to notice changed dependencies across the entire project, so that if I'm building an app but have changed one of the low-level libraries, everything will recompile appropriately.
My first thought was to use the CMake import/export target feature.
Have a CMakeLists.txt for basic, io and web and one CMakeLists.txt that references those. You could then use the CMake export feature to export those targets and the application projects could then import the CMake targets.
When you build the library project first the application projects should be able to find the compiled libraries automatically (without the libraries having to be installed to /usr/local/lib) otherwise one can always set up the proper CMake variable to indicate the correct directory.
When doing it this way, a make in the application project won't trigger a make in the library project; you would have to take care of that yourself.
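A rough sketch of that export/import idea (target and path names are illustrative, not taken from the question):
# Library project's top-level CMakeLists.txt
add_subdirectory(basic)
add_subdirectory(io)
add_subdirectory(web)
export(TARGETS basic io web FILE "${CMAKE_BINARY_DIR}/CoreTargets.cmake")
# An application's CMakeLists.txt
include(/path/to/library-build/CoreTargets.cmake)   # or locate it via a cache variable
add_executable(app-a main.cpp)
target_link_libraries(app-a web)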
Have multiple CMakeLists.txt.
Many open-source projects take this approach (LibOpenJPEG, LibPNG, Poppler, etc.). Take a look at their CMakeLists.txt files to find out how they've done this.
Basically allowing you to just toggle features as required.
I see two additional approaches. One is to simply have basic, io, and web be submodules of each app. Yes, there is duplication of code and wasted disk space, but it is very simple to implement and guarantees that different compiler settings for each app will not interfere with each other across the shared libraries. I suppose this makes the libraries not be shared anymore, but maybe that doesn't need to be a big deal in 2011. RAM and disk have gotten cheaper, but engineering time has not, and sharing of source is arguably more portable than sharing of binaries.
Another approach is to have the layout specified in the question, and have CMakeLists.txt files in each subdirectory. The CMakeLists.txt files in basic, io, and web generate standalone shared libraries. The CMakeLists.txt files in each app directory pull in each shared library with the add_subdirectory() command. You could then pull down all the library directories and whichever app(s) you wanted and initiate the build from within each app directory.
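A minimal sketch of one of those per-app CMakeLists.txt files (it assumes the library directories sit next to the app checkout; adjust the paths to your layout):
cmake_minimum_required(VERSION 3.5)
project(app-a)
# Pull the library sources into this build; each gets its own binary dir here
add_subdirectory(../basic ${CMAKE_BINARY_DIR}/basic)
add_subdirectory(../io ${CMAKE_BINARY_DIR}/io)
add_subdirectory(../web ${CMAKE_BINARY_DIR}/web)
add_executable(app-a main.cpp)
target_link_libraries(app-a web io basic)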
You can use ADD_SUBDIRECTORY for this!
https://cmake.org/cmake/help/v3.11/command/add_subdirectory.html
I ended up doing what I outlined in my question, which is to check in an empty directory (containing a .gitignore file that ignores everything) and tell CMake to GLOB any directories (which are put there by the user). Then I can just say cmake myrootdir and it finds all the various components. This works more or less OK. It does have some drawbacks, though, such as that some third-party tools like BuildBot expect a more traditional project structure, which makes integrating them with this sort of arrangement a little more work.
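That arrangement boils down to something like the following in the top-level CMakeLists.txt (a sketch of the glob-and-add idea, not the poster's exact code):
# Pick up whatever app checkouts the user has cloned into apps/
file(GLOB app_dirs RELATIVE ${CMAKE_CURRENT_SOURCE_DIR}/apps ${CMAKE_CURRENT_SOURCE_DIR}/apps/*)
foreach(app ${app_dirs})
  if(IS_DIRECTORY ${CMAKE_CURRENT_SOURCE_DIR}/apps/${app})
    add_subdirectory(apps/${app})
  endif()
endforeach()
As noted above, adding a new app directory still requires re-running cmake, because the glob is evaluated at configure time.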
The CMake BASIS tool provides utilities where you can create independent modules of a project and selectively enable and disable them using the ccmake command.
Full disclosure: I'm a developer for the project.

C++: how to build my own utility library?

I am starting to be proficient enough with C++ so that I can write my own C++ based scripts (to replace bash and PHP scripts I used to write before).
I find that I am starting to have a very small collection of utility functions and sub-routines that I'd like to use in several, otherwise unrelated C++ scripts.
I know I am not supposed to reinvent the wheel and that I could use external libraries for some of the utilities I'm creating for myself. However, it's fun to create my own utility functions, they are perfectly tailored to the job I have in mind, and it's for me a large part of the learning process. I'll see about using more polished external libraries when I am proficient enough to work on more serious, long term projects.
So, the question is: how do I manage my personal utility library in a way that the functions can be easily included in my various scripts?
I am using linux/Kubuntu, vim, g++, etc. and mostly coding CLI scripts.
Don't assume too much in terms of experience! ;) Links to tutorials or places where relevant topics are properly documented are welcome.
"Shared objects for the object disoriented!"
"Dissecting shared libraries"
Just stick your hpp and cpp files in separate directories somewhere. That way, it's easy to add the directory containing the C++ files to any new project, and easy to add the headers to the include path.
If you find compile time starts to suffer, then you might want to consider putting these files in a static library.
If you are compiling by hand you will want to create a makefile to remove the tedium of compiling your libraries. This tutorial helped me when I was learning to do what you are doing, and it has additional links on the site for more detailed tutorials on the makefile.
Unless it's very large, you should probably just keep your utility library in a .h file (for the declarations) and a .cpp file (for the implementation).
Just copy both files into your project folders and use #include "MyLibrary.h", or set the appropriate directory settings so you can use #include <MyLibrary.h> without copying the files each time you want to use them.
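For example, if the library lives in a hypothetical ~/mylib directory, compiling against it without copying anything looks like:
g++ -I"$HOME/mylib" myscript.cpp "$HOME/mylib/MyLibrary.cpp" -o myscript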
If the library gains substantial size, you might consider looking into static libraries.

Separate "include" and "src" folders for application-level code? [closed]

Closed. This question is opinion-based. It is not currently accepting answers.
Closed 3 years ago.
This question mostly concerns Unix/Linux-style C++ development. I see that many C++ libraries store their header files in an "include" folder and source files in a "src" folder. For the sake of conformance I adopted this in my own code. But it is not clear to me whether this should be done for application code as well. I've seen a few cases where a flat directory structure is used for that. What would be the recommended approach?
I also separate them, though not strictly by extension but by the accessibility of the file.
Suppose you have a module that manages customer information and uses 2 classes to do this: Customer, CustomerValidityChecker.
Also suppose that other parts in your application only need to know about the Customer class, and that the CustomerValidityChecker is only used by the Customer class to perform some checking.
Based on these assumptions I store the files like this:
Public folder (or include folder):
customer.h
Private folder (or source folder):
customer.cpp
customervaliditychecker.h
customervaliditychecker.cpp
That way, it becomes immediately clear for callers of your module which parts are accessible (public) and which parts aren't.
We have a build system that auto-generates our makefiles. One thing it does is recursively descend any subdirectories and build them as libraries, linking them together with the main directory's objects to make the application. (In practice, these "subdirectories" are usually symbolic links.) Libraries are static unless the directory name ends in ".so". One thing that's nice about this is that a full build of our system, which has many executables, doesn't have to repeatedly compile the common libraries.
However, as a result of this, there's no separation of headers and sources. And it has never been a problem. Honestly, I think it's better this way because headers and source files have commonality of location, and you can grab a directory and know you got everything you need to use it. It also works great with Subversion's "externals" feature, and similar features in other VCSs.
One last place where an include/src separation fails is if you use any code generators, such as flex, bison, or gengetopts. Figuring out where these tools should put their outputs so they get built is tricky if you've spread things out.
It makes sense to separate them for shared libraries because they may be distributed in a compiled form without the source. I've seen projects that separate out "public" headers (headers that may be accessed from code outside your project or library) while leaving "private" headers and source files in the same directory. I think it's good to use a consistent approach whether you're writing shared library or application level code because you never know when you may want to turn something that you've written at the application level into a lower level library that is shared by multiple projects.
A lot depends on the size of project involved. Up to a few dozen files or so, keeping them in one directory tends to be more convenient. For a bigger application that includes hundreds or thousands of files, you start to look for ways to separate them (though in the projects I've worked on, it was done more on functional lines than src/include). In between those, it's probably open to question.
I don't do this; there seems little advantage in it. Since headers tend to have a different extension from source files, you can have your editor show them separately if you really feel the need; Visual Studio does this by default, but I disable it since I prefer seeing them together.
Bottom Line: sources and headers that are still changing go in /src. Code that has crystallised should go in /lib & /include (actually you could keep all .libs and their .hs in /lib).
Keep own sources and headers together, provided they are (a) specific to this project or (b) have not yet been factored out as a shared library.
Once certain sources in the main project have been factored out as a (relatively stable) library, place the .a or .lib into /lib, and its public interface header into /include.
All third party libraries and their public interface headers also go into /lib & /include.
As others note, it is often more compatible for tools / IDEs to access .h/.c from one folder. But from an organisational view it can be useful to separate changing local code from stable lib code.
There is no clear advantage to either in my view. I finally decided to keep program and header files together because my editor (Visual SlickEdit) happens to provide additional referential features when they are not separated.
I almost always create include and src folders to split up my source code. I think it makes the folder less cluttered and files are easier to find in my IDE. But I think this is just a matter of taste.
Either method is valid; how you do this depends on the coding style you want to follow.
I place include (header) and source files in the same directory (folder). I create different folders for different themes. I get frustrated when trying to find header files (while debugging and also for researching). In some shops, there are only two folders: source and includes. These directories tend to grow exponentially. Reusing code becomes a nightmare at best.
IMHO, I believe organizing by theme is better. Each theme folder should build into at least one library. Different projects can easily include the themes by searching or including the folders. The projects only need to include the libraries. Smart build engines can list the theme folders as dependencies. This speeds up the build process.
The theme organization also adds a bit of safety to the project. Accidental damage to files (such as removing the wrong ones or replacing with different versions) is reduced since files are located in different directories. Deletion of files in the "Person" folder will not affect files in the "Shape" folder.
This is just my opinion, Your Mileage May Vary.
We have a build system which uses this rule. This build system is sconspiracy, a set of scripts for configuring SCons, dedicated to the C++ world. You can see an example which uses these tools: fw4spl