Use VS Code for Cross-Platform Scalable C++ Project

This tutorial from the VS Code team explains how to set up C++ in VS Code with MinGW (https://code.visualstudio.com/docs/cpp/config-mingw), but it seems to target only small-scale projects: it is configured to rebuild all files every time, and it isn't set up for easy configuration of a project (global defines, adding link libraries, etc.) (or am I missing something?).
I am looking into using CMake as my build tool inside VS Code, but I am really put off by the idea of having to update the CMakeLists.txt file whenever my project's configuration changes, e.g. every time I add files to the project or link a new library. I have grown up using IDEs that do this for me, which I find orders of magnitude easier because it lets me focus purely on writing code.
My question is: is there a known way to set up VS Code that lets me use a tool such as CMake without having to tinker with it too often?
This tutorial seems like a good starting point for setting up CMake, but its method requires CMake knowledge: https://pspdfkit.com/blog/2019/visual-studio-code-for-cpp/
Ideally I would like the setup to cater for both small-scale and large-scale projects, both of which should allow cross-platform building.
Thanks

As Visual Studio Code is not a heavyweight IDE, you will need to put in some effort to get it running the way you want. Either:
Write a list-driven CMakeLists.txt that generates your targets for you, e.g.:
# Declare your executables and libraries in one place.
list(APPEND EXECUTABLE_LIST executable_1 executable_2)
list(APPEND LIBRARY_LIST library_1 library_2 library_3)

# One static library per entry, built from a same-named source file.
foreach(lib IN LISTS LIBRARY_LIST)
    add_library(${lib} STATIC ${lib}.cpp)
endforeach()

# One executable per entry, linked against all the libraries above.
foreach(exe IN LISTS EXECUTABLE_LIST)
    add_executable(${exe} ${exe}.cpp)
    target_include_directories(${exe} PUBLIC ${CMAKE_CURRENT_LIST_DIR})
    target_link_libraries(${exe} ${LIBRARY_LIST})
endforeach()
or use the Visual Studio Code CMake Tools extension.
Of course, you can't expect the same level of functionality as Visual Studio in VS Code, as they serve different purposes (highly specialised and deeply coupled to the system, but big and slow, versus small, fast, and highly configurable).

Related

Cross-Platform C++ development with VisualStudio

I want to start a cross-platform C++ project with Visual Studio 2019.
After some research I found two possible ways:
CMake
Tutorial: Create C++ cross-platform projects in Visual Studio
Create and configure a Linux CMake project
CMake is pretty common for cross-platform projects, but I haven't done much with it, and it feels like you need to put in a lot of effort to make it run the way you want. On the other hand, you have a huge number of possibilities for configuring it. This SO question also recommends CMake, but says that there are other ways.
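For a sense of scale, a basic cross-platform CMakeLists.txt can be quite short; the following is only a sketch, and the project, target, and source names are hypothetical:
# Minimal sketch of a cross-platform CMakeLists.txt (all names are placeholders).
cmake_minimum_required(VERSION 3.15)
project(MyApp CXX)

# List the sources explicitly; the same file works on Windows and Linux.
add_executable(my_app src/main.cpp src/engine.cpp)
target_compile_features(my_app PRIVATE cxx_std_17)

# Keep platform-specific settings in one place.
if(WIN32)
    target_compile_definitions(my_app PRIVATE MYAPP_PLATFORM_WINDOWS)
else()
    target_compile_definitions(my_app PRIVATE MYAPP_PLATFORM_POSIX)
endif()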
VS Shared Project
Shared Project : An Impressive Feature of Visual Studio 2015
Cross-platform code sharing with Visual C++
If you use cross-platform code sharing in VS, you have several projects in your solution (one for each platform) and at least one shared project, which contains your source and header files. The platform-specific projects contain all the configuration settings and the program entry point. The shared code can be used from each project as if it were part of that project.
All the configuration is possible in VS. It looks less complex, but offers more limited possibilities.
Question
Both ways look like they would fit my needs, but I haven't found a comparison between them. I already coded running examples for both of them, but those small code bases haven't given me the insight I need.
What are the pros and cons of those two ways? Consider attributes like build speed, flexibility, time to learn/master/market, VS feature support, unit testing, continuous integration, ...
Take a look at Premake. You can generate Makefiles and VS projects from simple Lua configurations. Not a lot of projects use Premake, but my experience so far has been positive.

How to write truly cross-platform C++ libraries for distribution

The problem:
I'm writing an SDK that is primarily C++. The source code will be licensed to developers who pay for it, and the output libraries and include headers will be free for public usage. The SDK will target a plethora of platforms including Windows, Xbox, PlayStation, Android, iOS, Mac OS X, and Linux. I'm the kind of guy who mostly likes Visual Studio and usually develops software on Windows machines. In the last few years, Visual Studio has made this quite a lot easier than it used to be: I have a mostly clear path to targeting all the previously mentioned platforms using a Visual Studio set of project files as the source of truth that brings all my source code files together... except for Mac OS X, unfortunately. Visual Studio is able to build executable code for iOS and Linux by remotely interfacing with a Mac or Linux box respectively for compilation and debugging, which is really quite cool, but for some reason Mac OS X as a target is left out. Additionally, I'm well aware that there are plenty of other professional developers out there who don't write code on Windows machines, nor have any interest in buying a commercial license for Visual Studio.
The question:
Since C++ still does not have a build system standard, and may never, how do I maintain a single source of truth for the build configuration of all my source files across so many different target platforms, while simultaneously minimizing the barrier to entry for the software developer clients who need to build, run, and debug my source code?
Possible answers I'm aware of:
A) Visual Studio projects remain the source of truth from which any other C++ project types (such as Android Studio, XCode, and .make files) are derived. I'm aware of tools that can convert VS to .make and the like, but haven't actually tried them yet (my source code base is starting to get somewhat large already). Or I could just bite the bullet and write them by hand and try to keep them all in sync.
B) CMake. Sigh. So frustrating that it's very popular, and seems to exactly solve my issues, but it has its own set of problems that seem to be deal-breakers. For starters, once you go in on CMake, you pretty much can't come back. Using Visual Studio and property sheets, I've been able to tweak my build configurations with properties that are mostly inherited and rarely duplicated across projects and configurations. As far as I can see, CMake doesn't care about respecting such things, and for common properties, it just duplicates them on all vcxproj files. To make matters worse, all file paths it generates in the output projects are absolute, not relative, and to top it all off, it forces anyone else who builds your code to use CMake, disallowing distribution of the project build files it creates without it. Also, does this even work for game consoles? Last I checked I couldn't find a reasonable way of supporting them without hacking the source code.
C) Roll my own script that's similar to CMake, but allows redistribution of its output projects, and supports all the platforms I need. It goes without saying that this would consume a lot of dev time.
Any other options I'm missing here? Your input is greatly appreciated.
I agree with the comments from #Scheff and #arrowd. Use CMake. I have built and deployed software to multiple platforms and CMake is the best, though not perfect, solution I have found for building C++ code.
I have not had to hack the cmake code to get it to work on various platforms.
Do not worry about properties being duplicated in vcxproj files. With CMake, the build language is in the CMakeLists.txt file(s); the vcxproj files are generated code. As long as you do not have redundant CMake logic, you should not care about replicated properties in the generated vcxproj files. Similarly, you should not care that the generated vcxproj files have absolute paths; you do not reuse the vcxproj files, you reuse the CMakeLists.txt files and regenerate the vcxproj files for each new platform or build.
Use the top-level CMakeLists.txt to define the properties that are common to all targets. Then, in the individual targets' CMakeLists.txt files, use target properties to tweak the builds of specific targets. In my experience the replication of properties in the generated files helps because it makes the builds more consistent; I am able to minimize replication in the source CMake logic.
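As a rough sketch of that layout (the directory, target, and property names here are invented, not taken from the answer):
# Top-level CMakeLists.txt: properties shared by every target.
cmake_minimum_required(VERSION 3.15)
project(MySdk CXX)
set(CMAKE_CXX_STANDARD 17)
set(CMAKE_CXX_STANDARD_REQUIRED ON)
add_compile_definitions(MYSDK_ENABLE_LOGGING)
add_subdirectory(core)
add_subdirectory(viewer)

# core/CMakeLists.txt: per-target tweaks only.
add_library(core STATIC core.cpp)
target_include_directories(core PUBLIC ${CMAKE_CURRENT_LIST_DIR}/include)
target_compile_definitions(core PRIVATE CORE_BUILDING_INTERNALS)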
Various IDEs (VS 2017, CLion, QtCreator) can use CMake-based projects directly.
There is nothing CMake-specific about the generated artifacts. The headers, libraries (and DLLs), and executables of our SDK are standalone artifacts. Yes, CMake can make things easier for your SDK users, but using CMake in your SDK will not force your customers to use CMake.
Have you tried CMake and not been able to use it? Or are you looking for something better? There are certainly multiple C++ build systems, but despite its shortcomings I believe CMake is the best one available right now.

How to convert a cmake project into a Visual Studio equivalent?

The situation is the following: I have the source code of one program (let's call it programA), written in C and C++, as well as its CMakeLists.txt and CTestConfig.cmake files. I already installed programA using CMake's graphical user interface and, obviously, it worked: it created the .exe file (I'm working on Windows 7).
The problem is that I've now been asked to edit the program (so I must be able to edit the code and debug it as changes are made). I also need to compile it not as an .exe anymore but as a .dll, so I can add it to a website we have.
I've read in forums that CMake can compile programA into a .dll if I need it to, but since I would need to make some changes, I feel that debugging through CMake is not as useful or easy as working entirely in VS. From the little I know of the CMake language, the CMakeLists.txt is mainly used to check the user's OS and to add some libraries in case they are not found.
I have to admit I have no idea how to write CMake directives, as I have been working mostly with ASP.NET, C, C++ and C#. So my idea is to work only in Visual Studio 2010 instead of using CMake as well; once the program is 'adapted' to VS and can be compiled using VS alone, I'm ready to start my job. The question I have is: how can I perform the same task CMake did using only Visual Studio (is there any way of implementing CMake directives in VS)? Can VS compile by receiving as an argument something similar to that CMakeLists.txt file (even if it needs to be translated into another language)?
To skip the use of CMake I tried copying the source code into a new project in VS. However, since the CMake directives are not used when compiling, it gives several errors, most of them related to the fact that some .h headers can't be found (because they are in subfolders). And there are so many subfolders whose paths would need to be added to the predefined search directories that it would take ages.
I'm sorry I can't be more precise in my explanation. I'm good at programming little projects on my own, but it's the first time I have to work on someone else's program. Please don't hesitate to ask if anything was not clear.
I would really appreciate any suggestion / advice / guidance you can give.
To make a DLL, use the add_library command with the SHARED keyword:
add_library(mylib SHARED ${files})
This is easy with CMake; don't go back to Visual Studio, that will be harder in the end.
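A slightly fuller sketch of a CMakeLists.txt that builds programA as a DLL might look like the following; the source layout and the use of WINDOWS_EXPORT_ALL_SYMBOLS are assumptions for illustration, not taken from the original project:
# Sketch: build programA as a shared library (.dll on Windows).
cmake_minimum_required(VERSION 3.15)
project(programA C CXX)

# Hypothetical source layout; list files explicitly in a real project if preferred.
file(GLOB_RECURSE PROGRAMA_SOURCES src/*.c src/*.cpp)

add_library(programA SHARED ${PROGRAMA_SOURCES})
target_include_directories(programA PUBLIC include)

# Export all symbols so MSVC also produces an import library alongside the .dll.
set_target_properties(programA PROPERTIES WINDOWS_EXPORT_ALL_SYMBOLS ON)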
The Good News
Fortunately, CMake can generate VS projects for you automatically (this tutorial is specific to OpenTissue, but Steps 1 to 3 should be the same for you).
The [not so] Bad News
Depending on the complexity of the project, the VS projects automatically generated by CMake can get pretty nasty, to the point of illegibility. For example, it will hard-link any library dependencies using the specific paths of your machine, so the project will most certainly not be portable across setups. In any case, that's the intended behavior, because the primary idea of supporting this generator is simply making it work, thus allowing users to easily compile projects using MSVC, so there's not much you can do here. Nonetheless, it should work on your machine and will certainly be a great starting point; just create a project yourself from scratch, copying the relevant parts out of the automatically generated version.

C++ Buildsystem with ability to compile dependencies beforehand

I'm in the middle of setting up a build environment for a C++ game project. Our main requirement is the ability to build not just our game code, but also its dependencies (Ogre3D, Cegui, Boost, etc.). Furthermore, we would like to be able to build on Linux as well as on Windows, as our development team consists of members using different operating systems.
Ogre3D uses CMake as its build tool, which is why we have based our project on CMake too so far. We can compile perfectly fine once all dependencies are set up manually on each team member's system, as CMake is able to find the libraries.
The question is whether there is a feasible way to get the dependencies set up automatically. As a Java developer I know of Maven, but what tools exist in the world of C++?
Update: Thanks for the nice answers and links. Over the next few days I will be trying out some of the tools to see what meets our requirements, starting with CMake. I've indeed had my share of experience with Autotools so far, and as much as I like the documentation (the autobook is a very good read), I fear Autotools is not meant to be used natively on Windows.
Some of you suggested letting an IDE handle the dependency management. We are a group of individuals using every possible technology to code, from pure Vim to fully blown Eclipse CDT or Visual Studio. This is where CMake gives us some flexibility with its ability to generate native project files.
In the latest CMake 2.8 version there is the new ExternalProject module.
It allows you to download/check out code, then configure and build it as part of your main build tree.
It should also allow you to set dependencies.
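A hedged sketch of what that can look like (the URL, directory layout, and the game target below are placeholders, not from the answer):
# Sketch: download and build a dependency as part of the main build tree.
include(ExternalProject)

ExternalProject_Add(ogre3d_external
    URL https://example.com/downloads/ogre-1.9.tar.gz    # placeholder URL
    PREFIX ${CMAKE_BINARY_DIR}/deps/ogre3d
    CMAKE_ARGS -DCMAKE_INSTALL_PREFIX=${CMAKE_BINARY_DIR}/deps/install)

# Build our own target only after the dependency is installed
# (assumes a 'game' target defined elsewhere in the project).
add_dependencies(game ogre3d_external)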
At my work (medical image processing group) we use CMake to build all our own libraries and applications. We have an in-house tool to track all the dependencies between projects (defined in an XML database). Most of the third-party libraries (like Boost, Qt, VTK, ITK, etc.) are built once for each system we support (MSWin32, MSWin64, Linux32, etc.) and are committed as zip files to the version control system. CMake will then extract and configure the correct zip file depending on which system the developer is working on.
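Their in-house, XML-driven tool isn't shown, but the extract-the-right-archive step could be sketched roughly like this (archive names and paths are invented, and file(ARCHIVE_EXTRACT) needs CMake 3.18+):
# Sketch: unpack the prebuilt third-party archive matching the current platform.
if(WIN32)
    set(THIRDPARTY_ARCHIVE ${CMAKE_SOURCE_DIR}/thirdparty/boost-mswin64.zip)
else()
    set(THIRDPARTY_ARCHIVE ${CMAKE_SOURCE_DIR}/thirdparty/boost-linux32.zip)
endif()

file(ARCHIVE_EXTRACT INPUT ${THIRDPARTY_ARCHIVE}
     DESTINATION ${CMAKE_BINARY_DIR}/thirdparty)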
I have been using GNU Autotools (Autoconf, Automake, Libtool) for the past couple of months in several projects that I have been involved in and I think it works beautifully. Truth be told it does take a little bit to get used to the syntax, but I have used it successfully on a project that requires the distribution of python scripts, C libraries, and a C++ application. I'll give you some links that helped me out when I first asked a similar question on here.
The GNU Autotools Page provides the best documentation on the system as a whole but it is quite verbose.
Wikipedia has a page which explains how everything works. Autoconf configures the project based upon the platform that you are about to compile on, Automake builds the Makefiles for your project, and Libtool handles libraries.
A Makefile.am example and a configure.ac example should help you get started.
Some more links:
http://www.lrde.epita.fr/~adl/autotools.html
http://www.developingprogrammers.com/index.php/2006/01/05/autotools-tutorial/
http://sources.redhat.com/autobook/
One thing that I am not certain about is any type of Windows wrapper for GNU Autotools. I know you are able to use it inside Cygwin, but for actually distributing files and dependencies on Windows platforms you are probably better off using a Windows MSI installer (or something that can package your project inside Visual Studio).
If you want to distribute dependencies you can set them up under a different subdirectory, for example, libzip, with a specific Makefile.am entry which will build that library. When you perform a make install the library will be installed to the lib folder that the configure script determined it should use.
Good luck!
There are several interesting make replacements that automatically track implicit dependencies (from header files), are cross-platform and can cope with generated files (e.g. shader definitions). Two examples I used to work with are SCons and Jam/BJam.
I don't know of a cross-platform way of getting *make to automatically track dependencies.
The best you can do is use some script that scans source files (or has the C++ compiler do it) and finds the #includes (conditional compilation makes this tricky), then generates part of the makefile.
But you'd need to call this script whenever something might have changed.
"The question is whether there is a feasible way to get the dependencies set up automatically."
What do you mean by "set up"?
As you said, CMake will compile everything once the dependencies are on the machines. Are you just looking for a way to package up the dependency source? Once all the source is there, CMake and a build tool (gcc, nmake, MSVS, etc.) is all you need.
Edit: As a side note, CMake has the file command, which can be used to download files if they are needed: file(DOWNLOAD url file [TIMEOUT timeout] [STATUS status] [LOG log])
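For example (the URL and destination path are placeholders):
# Sketch: fetch a dependency archive at configure time.
file(DOWNLOAD https://example.com/deps/boost.tar.gz
     ${CMAKE_BINARY_DIR}/boost.tar.gz
     TIMEOUT 120
     STATUS download_result)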
Edit 2: CPack is another tool by the CMake guys that can be used to package up files for distribution on various platforms. It can create NSIS installers for Windows and .deb or .tgz packages for *nix.
At my place of work (we build embedded systems for power protection) we used CMake to solve the problem. Our setup allows cmake to be run from various locations.
/
    CMakeLists.txt          "install precompiled dependencies and build project"
    project/
        CMakeLists.txt      "build the project, managing dependencies of subsystems"
        subsystem1/
            CMakeLists.txt  "build subsystem 1, assume dependencies are already met"
        subsystem2/
            CMakeLists.txt  "build subsystem 2, assume dependencies are already met"
The trick is to make sure that each CMakeLists.txt file can be called in isolation, but that the top-level file can still build everything correctly. Technically we don't need the sub CMakeLists.txt files, but they make the developers happy. It would be an absolute pain if we all had to edit one monolithic build file at the root of the project.
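One way to let a subsystem's CMakeLists.txt be invoked on its own while still composing under the top-level file is a guard like the sketch below (this is an illustration, not the original poster's setup; the names are hypothetical):
# subsystem1/CMakeLists.txt - sketch of a file usable standalone or via add_subdirectory().
cmake_minimum_required(VERSION 3.15)

if(CMAKE_CURRENT_SOURCE_DIR STREQUAL CMAKE_SOURCE_DIR)
    # Invoked directly: declare a project and assume dependencies are already installed.
    project(subsystem1 CXX)
endif()

add_library(subsystem1 STATIC subsystem1.cpp)
target_include_directories(subsystem1 PUBLIC ${CMAKE_CURRENT_LIST_DIR}/include)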
I did not set up the system (I helped, but it is not my baby). The author said that the Boost CMake build system had some really good stuff in it that helped him get the whole thing building smoothly.
On many *nix systems, some kind of package manager or build system is used for this. The most common one for building from source is GNU Autotools, which I've heard is a source of extreme grief. However, with a few scripts and an online repository for your deps, you can set up something similar, like so:
In your project Makefile, create a target (optionally with subtargets) that covers your dependencies.
Within the target for each dependency, first check to see if the dep source is in the project (on *nix you can use touch for this, but you could be more thorough)
If the dep is not there, you can use curl, etc to download the dep
In all cases, have the dep targets make a recursive make call (make; make install; make clean; etc) to the Makefile (or other configure script/build file) of the dependency. If the dep is already built and installed, make will return fairly promptly.
There are going to be lots of corner cases that will cause this to break though, depending on the installers for each dep (perhaps the installer is interactive?), but this approach should cover the general idea.
Right now I'm working on a tool that can automatically install all the dependencies of a C/C++ app with exact version requirements:
compiler
libs
tools (cmake, autotools)
Right now it works, for my app. (Installing UnitTest++, Boost, Wt, sqlite, cmake all in correct order)
The tool, named «C++ Version Manager» (inspired by the excellent ruby version manager), is coded in bash and hosted on github : https://github.com/Offirmo/cvm
Any advice and suggestions are welcome.

Using Makefile instead of Solution/Project files under Visual Studio (2005)

Does anyone have experience using makefiles for Visual Studio C++ builds (under VS 2005), as opposed to using the project/solution setup? For us, the way projects/solutions work is not intuitive and leads to configuration explosion when you are trying to tweak builds with specific compile-time flags.
Under Unix, it's pretty easy to set up a makefile that has its default options overridden by user settings (or other configuration setting). But doing these types of things seems difficult in Visual Studio.
By way of example, we have a project that needs to get built for 3 different platforms. Each platform might have several configurations (for example debug, release, and several others). One of my goals on a newly formed project is to have a solution with all platform builds living together, which makes building and testing code changes easier since you aren't having to open 3 different solutions just to test your code. But Visual Studio will require 3 * (number of base configurations) configurations, i.e. PC Debug, X360 Debug, PS3 Debug, etc.
It seems like a makefile solution is much better here. Wrapped with some basic batch files or scripts, it would be easy to keep the configuration explosion to a minimum and only maintain a small set of files for all of the different builds that we have to do.
However, I have no experience with makefiles under Visual Studio and would like to know if others have experiences or issues they can share.
Thanks.
(post edited to mention that these are C++ builds)
I've found some benefits to makefiles with large projects, mainly related to unifying the location of the project settings. It's somewhat easier to manage the list of source files, include paths, preprocessor defines and so on, if they're all in a makefile or other build config file. With multiple configurations, adding an include path means you need to make sure you update every config manually through Visual Studio's fiddly project properties, which can get pretty tedious as a project grows in size.
Projects which use a lot of custom build tools can be easier to manage too, such as if you need to compile pixel / vertex shaders, or code in other languages without native VS support.
You'll still need to have various different project configurations however, since you'll need to differentiate the invocation of the build tool for each config (e.g. passing in different command line options to make).
Immediate downsides that spring to mind:
Slower builds: VS isn't particularly quick at invoking external tools, or even working out whether it needs to build a project in the first place.
Awkward inter-project dependencies: It's fiddly to set up so that a dependee causes the base project to build, and fiddlier to make sure that they get built in the right order. I've had some success getting SCons to do this, but it's always a challenge to get working well.
Loss of some useful IDE features: Edit & Continue being the main one!
In short, you'll spend less time managing your project configurations, but more time coaxing Visual Studio to work properly with it.
Visual Studio is built on top of MSBuild configuration files. You can think of the *proj and *sln files as makefiles; they allow you to fully customize the build process.
While it's technically possible, it's not a very friendly solution within Visual Studio. It will be fighting you the entire time.
I recommend you take a look at NAnt. It's a very robust build system where you can do basically anything you need to.
Our NAnt script does this on every build:
Migrate the database to the latest version
Generate C# entities off of the database
Compile every project in our "master" solution
Run all unit tests
Run all integration tests
Additionally, our build server leverages this and adds 1 more task, which is generating Sandcastle documentation.
If you don't like XML, you might also take a look at Rake (ruby), Bake/BooBuildSystem (Boo), or Psake (PowerShell)
You can use NAnt to build the projects individually, thus replacing the solution, and have one coding solution and no build solutions.
One thing to keep in mind is that the solution and csproj files from VS 2005 and up are MSBuild scripts. So if you get acquainted with MSBuild, you might be able to wield the existing files to make VS easier and to make your deployment easier.
We have a similar setup to the one you are describing. We support at least 3 different platforms, and we found that CMake works well for managing the different Visual Studio solutions. Setup can be a bit painful, but it pretty much boils down to reading the docs and a couple of tutorials. You should be able to do virtually everything you could do by going to the properties of the projects and the solution.
Not sure if you can have all three platform builds living together in the same solution, but you can use CruiseControl to take care of your builds and run your testing scripts as often as needed.