Regarding the Necessity of ROS Packages - python-2.7

Up until this point, while working on my project, I've been building ROS scripts using rospy: establishing topics and nodes, subscribing to things, and generally doing all sorts of work. I've been led to believe, though, that eventually my scripts will need to be made into 'packages', with the notion being that packages increase the modularity of programs (and that this is just the way things are done).
So far, my scripts are pretty compact, and I don't see why distributing a Python script that invokes rospy would require this extra level of wrapping (particularly given the obfuscatory nature of most of the ROS wiki's tutorials). I've not had to create catkin packages or anything for any of my programs so far. Is there some overwhelming reason why I need to concern myself with ROS packages, catkin, and the like? Right now, I just don't see the point when everything works well and likely would on any machine the script is run from.
Thanks!

There are a lot of cases in which you definitely want to use catkin:
Your package contains C++ code. This has to be compiled, which catkin takes care of.
You have custom message types. Custom messages have to be generated and compiled. Again, this is done by catkin.
You have dependencies on other ROS packages (or vice versa). catkin resolves these dependencies and builds them if necessary.
You have Python modules which need to be installed so other packages can use them. Of course you can write your own setup.py, but using catkin is the ROS way to do this.
When your scripts are in a catkin package, you can use the ROS command line tools (rosrun, roscd, rosed, ...), which are very convenient. (See the minimal package sketch below.)
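
For reference, here is a minimal sketch of what the CMakeLists.txt of a Python-only catkin package might look like. The package name my_scripts and the script path are hypothetical placeholders:

    cmake_minimum_required(VERSION 2.8.3)
    project(my_scripts)

    # Declare the catkin dependencies this package needs at build/run time
    find_package(catkin REQUIRED COMPONENTS rospy std_msgs)

    # Uncomment if the package also ships an importable Python module via setup.py
    # catkin_python_setup()

    catkin_package()

    # Install the node scripts so `rosrun my_scripts my_node.py` works
    install(PROGRAMS scripts/my_node.py
            DESTINATION ${CATKIN_PACKAGE_BIN_DESTINATION})

A package.xml alongside it declares the same dependencies for the dependency resolver.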
As long as you really only have simple Python scripts without dependencies on other non-core ROS packages, you are probably fine without bundling them in a package.
However, as soon as you are sharing your code with other ROS developers, I would package it nonetheless. While it may work as-is, it will be confusing for others if they don't get the package structure they are used to.

Related

How should intermediate library dependencies be handled in C++?

I'm building a C++ repo that depends on external company repos that exist to support this repo. I have to build these to target certain versions of Boost and other libraries specific to my system. At the end of the (long) build process, I have several static libraries and my finished executable. I use Docker for these builds.
I'm trying to decide what the cleanest approach is for managing these dependencies.
git submodules and build binaries from source each time (longest build)
build libraries individually and store them as artifacts/releases for each repo (most work, across several repos)
make a README on how to rebuild and commit the binaries to the main repo (feels dirty for some reason)
What is the common practice in C++ for dealing with these intermediate binaries?
It's probably impossible to give a generally valid answer. You have to think about who the users of your code are (I mean those who compile it) and what their workflow is.
I personally tend more and more to favor option three in your list: yes, the README file. The reason is that in many cases the users don't need to (and should not) bother with dependencies at all. Very often there is a higher-level build process in place that makes sure all dependencies are properly prepared (downloaded, optionally patched, compiled, and installed) as your application expects. With Docker, I have the feeling this is becoming the norm. I always provide a Dockerfile (or sometimes even a complete Docker image) where all dependencies are in place, so the user can compile without even thinking about them.
If it's not Docker, there may be another higher-level build process that handles dependencies. As I'm in the embedded industry, we mostly use Yocto, but there are others too. The users don't even need to use Yocto themselves, as I provide them with an SDK that contains all dependencies.
And for the very few who refuse all these options and insist on compiling natively on their main machine, I write a list of dependencies in the README file, with a few lines on each, describing how to fetch, compile, and install them.
As for the other two options you mentioned:
Option one (git submodules): I must say that I never really got comfortable with them. IMO they are somewhat poorly implemented in git (e.g. it's really cumbersome to find out exactly which version each submodule is currently checked out at). Also, with a higher-level build process in place, it might mean fetching each dependency twice, which is inefficient. It's also hard to apply patches to a dependency if you must; you would have to write an extra script and include it in your build process. Lastly, your dependencies may have dependencies of their own, and then it gets really ugly.
Option two (storing binary artifacts in the repo) is an absolute emergency solution that I would always try to avoid.
And since somebody mentioned git subtrees: we tried that in our team for about half a year. It was an absolute disaster. Nobody really understood it, and about once a week someone messed up the entire repository. I would never use that again.

Collaboration in a project with dependencies

I'm a DevOps engineer creating CI processes for projects. I was wondering what the best way is to deal with the following scenario: let's say I have a C++ project (using CLion + CMake) with several developers working on it. In order to be built, the project has some libraries it depends on, which is reflected in the CMakeLists.txt file that needs to know where to look for those libraries.
Basically the problem is that we need to make sure every developer has these libraries in the correct paths on their machine, which is a big hassle.
One approach would be to keep those dependencies in the repository. That's great, since all a developer has to do is clone the repo and they have everything they need to compile. But as we know, keeping binaries in SCM is not good practice.
The question is, is there a good method to handle project dependencies in a C++ project?
I know that with C#, for example, we could use NuGet packages to handle this kind of scenario. We'd have a NuGet repository in Artifactory that hosts the dependency packages, keep a reference to the required packages in our project, and at build time just download the dependencies and build the project.
Is there something alike in C++ (Running on Linux I mean)?
Hope the question is clear enough, lol. I had a hard time wording it.
It depends on how those dependencies are delivered and packaged. If whoever maintains your dependencies took CMake into account, you can probably use find_package. If they didn't account for this but they support pkg-config, you can use FindPkgConfig. Then all you need to do is let the developers know which dependencies they need to install. This should work regardless of the OS used for development.
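As a hedged sketch of both mechanisms (Foo and bar are placeholder dependency names; Foo ships CMake package config files, bar only a bar.pc pkg-config file):

    cmake_minimum_required(VERSION 3.10)
    project(myapp CXX)

    # Dependency that installs CMake package config files (FooConfig.cmake)
    find_package(Foo REQUIRED)

    # Dependency that only provides a pkg-config file (bar.pc)
    find_package(PkgConfig REQUIRED)
    pkg_check_modules(BAR REQUIRED IMPORTED_TARGET bar)

    add_executable(myapp main.cpp)
    target_link_libraries(myapp PRIVATE Foo::Foo PkgConfig::BAR)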
Other solutions may involve pulling and building the dependency code when you build your project, for example by using git submodules (if possible) or FetchContent, though this can become a nightmare if you have a lot of dependencies.
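A minimal FetchContent sketch (CMake 3.14+ for FetchContent_MakeAvailable; the repository and tag are illustrative):

    include(FetchContent)
    FetchContent_Declare(
        fmt
        GIT_REPOSITORY https://github.com/fmtlib/fmt.git
        GIT_TAG        10.2.1
    )
    # Clones and adds the dependency to the build at configure time
    FetchContent_MakeAvailable(fmt)

    target_link_libraries(myapp PRIVATE fmt::fmt)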
Additionally, you can try using a package manager like vcpkg, or conan (if all your dependencies are available there), or CPM.
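With vcpkg, for instance, the consuming CMakeLists.txt stays a plain find_package call. Assuming fmt was installed via `vcpkg install fmt` and CMake is configured with -DCMAKE_TOOLCHAIN_FILE=<vcpkg-root>/scripts/buildsystems/vcpkg.cmake:

    # Resolved through the vcpkg toolchain file passed at configure time
    find_package(fmt CONFIG REQUIRED)
    target_link_libraries(myapp PRIVATE fmt::fmt)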

C++ Libraries ecosystem using CMake and Ryppl

I am interested in building a cross-platform C++ Library and distributing it in source form. I want the consumers of this library to be able to acquire it, build it and consume it inside their software very easily on whatever platform they are working on and for whatever platform they are targeting. At the same time while building my library, I also want to be able to consume other popular OSS libraries through a similar mechanism.
I see that CMake and Ryppl were created with these intentions in mind and to some extent they do solve some of these problems, especially the build problem. But I don't quite know how exactly to go about achieving the above mentioned goals. Is it OK to settle on CMake as the build solution? How do I solve the library acquisition and distribution problem? Simply host the sources somewhere and let people discover, download and build them? Or is there a better way?
At the time of writing there is no accepted solution that handles everything you want. CMake gives you cross-platform builds, and git (with submodules) gives you a way to manage source-level dependencies if all the other projects are using CMake. But, in practice, many common dependencies your project will need don't use CMake, or even Git.
Ryppl is meant to solve this, but progress is slow because the challenge is such a hard one.
Arguably, the most successful solution so far is to write header-only libraries. This is common practice and means your users just include your files, and their build system of choice takes care of everything. There are no binary dependencies to manage.
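If you do use CMake, a header-only library maps naturally onto an INTERFACE target, which carries include paths but no compiled artifact. A minimal sketch with placeholder names:

    # An INTERFACE library has nothing to build; it only propagates usage
    # requirements (here, the include directory) to its consumers.
    add_library(mylib INTERFACE)
    target_include_directories(mylib INTERFACE
        $<BUILD_INTERFACE:${CMAKE_CURRENT_SOURCE_DIR}/include>
        $<INSTALL_INTERFACE:include>)

    # A consumer then just links the target and picks up the headers:
    # target_link_libraries(app PRIVATE mylib)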
TheHouse's answer is still essentially true. Also, there don't seem to have been any updates to Ryppl itself for a while (3 years), and the ryppl.org domain has expired.
There are some new projects aiming to solve the packaging issue.
Both build2 and wrap from mesonbuild have that goal in mind.
A proposal was made recently to add packages to the C++ standard, which may open up the debate (reddit discussion here).
Wrap looks promising, as Meson's author has learned from CMake.
There is a good video of its author discussing this here.
build2 seems more oblivious to prior art (and therefore condemned to reinvent it). Both, however, suffer from trying to solve the external project dependency issue while simultaneously providing a complete build system.
conan.io is another recent attempt, one which doesn't try to provide the build system as well. Time will tell if any of these gain traction.
The accepted standard for packaging C and C++ projects on Unix was always a source tarball + a configure script (autotools) + make.
CMake is now beginning to replace autotools as the first choice.
It is able to create RPMs and tarballs for distribution purposes (via CPack).
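A minimal CPack sketch: added after the project's install() rules, this makes `cpack` (or `make package`) emit a tarball and an RPM. The name and version are placeholders, and RPM generation assumes rpmbuild is installed:

    set(CPACK_PACKAGE_NAME "myproject")
    set(CPACK_PACKAGE_VERSION "1.0.0")
    set(CPACK_GENERATOR "TGZ;RPM")  # tarball + RPM
    include(CPack)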
It's also worth considering the package managers built into the various flavours of Linux. The easiest projects to build and install are those whose dependencies can mostly be pulled in via yum or apt. This won't help you on Windows, of course. While there is a high barrier to entry to getting your own projects added to the main Linux repositories (e.g. Red Hat, Debian), there is nothing to stop you from maintaining your own satellite repo.
The difference between that and just hosting your project on GitHub or similar is that you can provide pre-built binaries for a number of popular systems.
You might also consider that configure-time checks (e.g. from CMake's find_library()) together with your own documentation will tell people what needs to be installed as a prerequisite, and provided you don't make it too onerous, that might be enough.

Is there a build system for C++ which can manage release dependencies?

A little background: we have a fairly large code base which builds into a set of libraries that are then distributed for internal use in various binaries. At the moment, the build process for this is haphazard and everything is built off the trunk.
We would like to explore whether there is a build system which will allow us to manage releases and automatically pull in dependencies. Such a tool exists for Java: Maven. I like its package, repository, and dependency mechanism, and I know that with either the maven-native or maven-nar plugin we could get this. The problem, however, is that we cannot restructure the source trees to the "Maven way", and unfortunately the plugins (at least maven-nar) don't seem to like code that is not structured this way...
So my question is: is there a tool which satisfies the following for C++?
build
package (for example libraries with all headers, something like the .nar)
upload package to a "repository"
automatically pull in the required dependencies from said repository, extract headers and include them in the build, extract libraries and link them. The dependencies would be described in the "release" for that binary, so if we were to use a CI server to build that "release", the build script would have the necessary dependencies listed (like the pom.xml files).
I could roll my own by modifying make + shell scripts, or waf/scons with extra Python modules for the packaging and dependency management. However, I would have thought that this is a common problem and that someone somewhere has a tool for it. Or does everyone roll their own? Or have I missed a significant feature of waf/scons or CMake?
EDIT: I should add, open source is preferred, and non-MS...
Most of the Linux distributions, for example, contain dependency tracking for their packages. Everything I've tried to cobble together myself to tackle this problem has, in the end, turned out "not quite perfect". The best thing to do, IMHO, is to create a local yum/deb repository or something similar (continuing my Linux example) and then pull stuff from there as needed.
Many of the source packages also quickly tell you the minimum components that must be installed to do a self-build (as opposed to installing a binary pre-compiled package).
Unfortunately, none of these methods is that much easier, though it's still better than trying to do it all yourself. In the end, to support multiple platforms, you need one of these systems per OS as well. Fun!
I am not sure if I understand correctly what you want to do, but I will tell you what we use and hope it helps.
We use CMake for our build. It has to be noted that CMake is quite powerful. Among other things, you can "make install" into custom directories to collect headers and binaries there to build your release. We combine this with some Python scripting to build our releases. YMMV, but some things might just be too specific for a generic tool, and a custom script may be the simpler solution.
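To illustrate that collecting step, a hedged sketch of the install() rules (target and directory names are placeholders); staging into a custom directory is then `make install DESTDIR=/tmp/release`, or `cmake --install . --prefix /tmp/release` on newer CMake:

    # Copy the built artifacts into the install prefix
    install(TARGETS mylib myapp
            ARCHIVE DESTINATION lib    # static libraries
            LIBRARY DESTINATION lib    # shared libraries
            RUNTIME DESTINATION bin)   # executables

    # Copy the public headers alongside them
    install(DIRECTORY include/ DESTINATION include)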
Our build tool builds releases directly from an SVN repository (checkout, build, ...), which I can really recommend to avoid local state polluting the release in some unforeseen way. It also enforces reproducibility.
It depends a lot on the platforms you're targeting. I can only really speak for Linux, but there it also depends on the distributions you're targeting, packages being a distribution-level concept. To make things a bit simpler, there are families of distributions using similar packaging mechanisms and package names, meaning that the same recipe for making a Debian package will probably make an Ubuntu package too.
I'd definitely say that if you're willing to target a subset of all known Linux distros using a manageable set of packaging mechanisms, you will benefit in the long run from not rolling your own and instead building packages the way the distribution creators intended. These systems allow you to specify run-time and build-time dependencies, and automated CI environments also exist (like OBS for RPM-based distros).

Source code dependency manager for C++

There are already some questions about dependency managers here, but it seems to me that they are mostly about build systems, while I am looking for something targeted purely at making dependency tracking and resolution simpler (and I'm not necessarily interested in learning a new build system).
So, typically we have a project plus some code it shares with another project. This common code is organized as a library, so when I want to get the latest version of a project's code, I also need to fetch all the libraries from source control. To do this, I need a list of dependencies. Then, to build the project, I can reuse this list too.
I've looked at Maven and Ivy, but I'm not sure they would be appropriate for C++, as they look quite heavily Java-targeted (even though there might be plugins for C++, I haven't found people recommending them).
I see it as a GUI tool producing some standardized dependency list which can then be parsed by different scripts etc. It would be nice if it could integrate with source control (tag, get a tagged version with dependencies etc), but that's optional.
Would you have any suggestions? Maybe I'm just missing something, and usually it's done some other way with no need for such a tool? Thanks.
You can use Maven in conjunction with C++ in two ways. First, you can use it for dependency management between components. Second, you can use the maven-nar-plugin to create shared libraries and unit tests, in my experience in conjunction with the Boost library. In the end you can create RPMs out of it (maven-rpm-plugin) to have an adequate installation medium. Furthermore, I have created the installation for a CI environment via Maven (RPMs for Hudson, a Nexus installation as RPMs).
I'm not sure if you would consider a version control system (VCS) a build tool, but Mercurial and Git support sub-repositories. In your case, a sub-repository would hold your dependencies:
Join multiple subrepos into one and preserve history in Mercurial
Multiple git repo in one project
Use your VCS to archive the build results (needed anyway for maintenance) and refer to the libs and header files in your build environment.
If you are looking for a reference take a look at https://android.googlesource.com/platform/manifest.