Autoconf: Check if a program in an unsupported language compiles

To conditionally enable a part of an autotooled project, I need to check whether a short program stub in a language not supported by autotools out of the box compiles or not.
I need something like AC_TRY_COMPILE with an arbitrary compiler executable: create a temporary file, write a piece of code to it, and check whether invoking the compiler (found earlier via AC_CHECK_PROGS) returns a zero exit code.
What is the most elegant/common way to do this?

AC_TRY_COMPILE (which is deprecated and replaced by AC_COMPILE_IFELSE) only supports a limited set of languages: C, C++, Fortran 77, Fortran, Erlang, Objective C, Objective C++ (source).
configure.ac can contain custom shell code - it is simply passed through untouched by autoconf (really m4). Why not just write your test in shell? If you're going to use the test more than once, wrap it in an AC_DEFUN.
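A sketch of such a check as shell code wrapped in AC_DEFUN (the macro name, the FOOC variable holding the compiler found by AC_CHECK_PROGS, and the .foo suffix are all placeholder names):

```m4
dnl MY_PROG_FOO_COMPILES(action-if-ok, action-if-failed)
AC_DEFUN([MY_PROG_FOO_COMPILES],
[AC_MSG_CHECKING([whether $FOOC can compile a test program])
cat > conftest.foo <<'_EOF'
# minimal program stub in the target language goes here
_EOF
AS_IF([$FOOC conftest.foo >/dev/null 2>&1],
      [AC_MSG_RESULT([yes])
       $1],
      [AC_MSG_RESULT([no])
       $2])
rm -f conftest*])
```

Like AC_COMPILE_IFELSE, the macro takes an action-if-ok and an action-if-failed argument, so callers can set cache variables or AC_MSG_ERROR out of the configure run.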

To enable an optional part of an autotooled project, you should use an --enable-something option. Don't make it dependent on what happens to be available in the build environment: that is prone to masking errors in automated builds. (Example: Linux distributions have been known to ship crippled packages because of a missing build dependency or some other problem in the environment, where raising an error would have been more helpful than proceeding silently.)
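The conventional shape of such an option, sketched here with a placeholder feature name foo:

```m4
AC_ARG_ENABLE([foo],
  [AS_HELP_STRING([--enable-foo], [build the optional foo component])],
  [enable_foo=$enableval],
  [enable_foo=no])
AS_IF([test "x$enable_foo" = xyes],
  [dnl run the compiler checks here, and fail hard if they do not
   dnl pass, instead of silently disabling the feature
   AC_MSG_NOTICE([foo component enabled])])
AM_CONDITIONAL([ENABLE_FOO], [test "x$enable_foo" = xyes])
```

With the default set to no, the feature is only attempted when explicitly requested, and a broken environment then surfaces as a configure error rather than a silently smaller package.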

Related

Xcode autocompletion does not work for C++ libraries included via CMake

I have an Objective-C based project with some C++ code. I have included a library I want to use via CMake. However, Xcode autocompletion is not working properly for the library's methods, classes, etc.
Despite that, the project compiles and there are no errors during the build after typing some of the library's classes or functions in code. Xcode can also correctly point out errors: if I miss something like a required parameter for a method call, it shows a build error telling me which parameter I forgot.
The problem is that the lack of autocompletion dramatically slows down development, and I need to fix it.
Considering that Xcode is essentially just another UNIX make with a GUI on top of it, I would advise simply switching to VSCode, because the C++ plugin there is designed to work with this kind of setup.
In your case you could probably use some automatic cmake -> pbx generator, if such tools even exist. Or, of course, do it manually and configure the compilation outside of an Xcode project.

How can I parse C++ code in a prebuild event?

I have a prebuild-event tool (written in Ruby) in my C++ toolchain that generates additional C++ code from existing C++ source code. I would like to replace this tool with a faster generator, and using clang would be the best option.
Is there a way to write a C++ application that parses the C++ source code of a file, so that I can implement this prebuild tool with Clang? I am looking for a keyword or a page on how to start. Any help is highly appreciated!
Parsing C++ is not a simple thing. Compile-time trickery and implicit semantics make it incredibly hard. Because of this I would suggest going with Clang. Its developers made it possible to use Clang as a library. Check out this guide to see different interfaces Clang has. If you want real C++ experience you might want to choose LibTooling.
I want to warn you, though, that for any C/C++ parser to work as expected, it absolutely needs the compilation options used by the real compiler. Without the include directories or macro definitions the code can make little to no sense. Basically, your build system should tell your custom tool how each file is compiled. The simplest approach would be to use a compilation database; it is the go-to solution for many Clang-based tools. However, since you are making the tool a part of your build system, incorporating it there and taking the options directly from the build system might not be such a burden for you.
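To make the compilation-database idea concrete: it is just a JSON file, compile_commands.json, that records the exact command for each translation unit. The paths, flags, and the my-codegen-tool name below are all illustrative:

```shell
# For CMake projects, the database is generated by configuring with:
#   cmake -S . -B build -DCMAKE_EXPORT_COMPILE_COMMANDS=ON
# One record per translation unit; a typical entry looks like this:
entry='[
  { "directory": "/home/user/project/build",
    "command": "/usr/bin/clang++ -DNDEBUG -Iinclude -std=c++17 -c ../src/main.cpp",
    "file": "../src/main.cpp" }
]'
printf '%s\n' "$entry"
# A LibTooling-based tool then consumes it via the -p option:
#   my-codegen-tool -p build ../src/main.cpp
```

Note that the -D and -I flags the answer talks about are right there in the "command" field, which is why Clang tools can re-parse the file exactly as the compiler saw it.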
I hope this information is helpful!

Capture all compiler invocations and command line parameters during build

I want to run static code analysis tools for C/C++ (and possibly Python, Java, etc.) on a large software project built with make. As is well known, make (or any other build tool) invokes the compiler and similar tools for the specified source files. It is also possible to control compilation by defining environment variables that are later passed to the compiler as arguments.
The key to accurate static analysis is to provide the defines and include paths exactly as they were passed to the compiler (basically all of its -D and -I arguments). This way, the tool will be able to follow the same code paths the compiler has followed.
The problem is, the high complexity of the project means there is no way to statically determine such environment, as different files are built with different sets of defines/include paths and other compilation flags.
The idea is that it should be somehow possible to capture individual invocations of the compiler with all arguments passed to it for each input file. Having such information and after its straightforward filtering (e.g. there is no need to know -O optimization levels or -W warning settings) it should be possible to invoke the static analyzer for each input file with the identical set of defines/includes used just for that input file.
The question is: are there existing tools/workflows that implement the idea I've described? I am mostly interested in a solution for POSIX systems, but ideas for Windows are also welcome.
A few ideas I've come to on my own.
The most trivial solution would be to collect the make output and process it afterwards. However, certain projects have makefile rules that produce very concise output instead of a verbose one, so this might require some tinkering with the Makefiles, which is not always desirable. Parallel builds may also have their console output interleaved and impossible to parse. Adaptation to other build systems (CMake) will not be trivial either, so it is far from the most convenient way.
Running make under ptrace and recording all invocations of the exec* system calls that correspond to starting new applications, including compiler invocations. One would then need to parse ptrace's output. This approach is build-system and language agnostic (it will catch all invocations of any compiler for any language) and should work for parallel builds. However, it seems more technically complex, and the performance penalty of ptrace sitting on make's back is unclear. It would also be harder to port to Windows, as the program-tracing API is somewhat different there.
The proprietary static analyzer for C++ on Windows (and recently Linux AFAIK) PVS-Studio seems to implement the second approach, however details on how they do it are welcome. If there are other IDEs/tools that already have something similar to what I need, please share information on them.
There are the following ways to gather information about the parameters of compilation in Linux:
Override the CC/CXX environment variables. This is what the scan-build utility from Clang Analyzer does. The method works reliably only with simple Make-based projects.
procfs - all the information on processes is available under /proc/PID/... . But polling it is slow, so you may miss short-lived compiler processes and fail to capture every process of a build.
The strace utility (ptrace library). Its output contains a lot of useful information, but it requires complicated parsing, because events from different processes are interleaved. If the project is not built with many threads, it is a fairly reliable way to gather information about the processes. It's used in PVS-Studio.
JSON Compilation Database in CMake. You can get all the compilation parameters by configuring with -DCMAKE_EXPORT_COMPILE_COMMANDS=On. It is a reliable method as long as the project does not depend on non-standard environment variables. A CMake project can also be written with errors and emit incorrect JSON without affecting the build itself. It's supported in PVS-Studio.
The Bear utility (function interposition via LD_PRELOAD). It can produce a JSON Compilation Database for any project. But without the original environment variables it may be impossible to run the analyzer on some projects, and it cannot be used with projects that already rely on LD_PRELOAD during the build. It's supported in PVS-Studio.
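A minimal sketch of the strace-based capture. Step 1 is shown as a comment because it needs a real build to trace; a fabricated three-line log stands in for its output, and the grep pattern and log format are illustrative:

```shell
# Step 1 (on a real project; requires strace):
#   strace -f -e trace=execve -s 8192 -o build.log make
# Step 2: filter the log down to compiler invocations. The execve
# lines carry the full argv, including every -D and -I flag.
cat > build.log <<'EOF'
1201 execve("/usr/bin/gcc", ["gcc", "-DDEBUG", "-Iinclude", "-c", "a.c"], ...) = 0
1202 execve("/bin/sh", ["sh", "-c", "echo done"], ...) = 0
1203 execve("/usr/bin/g++", ["g++", "-DNDEBUG", "-Isrc", "-c", "b.cpp"], ...) = 0
EOF
compilers=$(grep -E 'execve\("[^"]*/(cc|gcc|g\+\+|clang|clang\+\+)"' build.log)
printf '%s\n' "$compilers"
```

Only the gcc and g++ lines survive the filter; the /bin/sh invocation (and similar non-compiler processes) are dropped, leaving exactly the per-file argument lists the analyzer needs.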
Collecting information about compiling in Windows for PVS-Studio:
Visual Studio API to get the compilation parameters of standard projects;
MSBuild API to get the compilation parameters of standard projects;
Win API to get the information on any compilation processes as, for example, Windows Task Manager does it.
Running make with VERBOSE=1 (or VERBOSE=true) displays all commands with all their parameters in Makefiles generated by CMake, for instance; plain Make echoes recipe commands by default unless they are silenced with @.
You might want to look at Coverity. They attach their tool to the compiler to capture everything the compiler receives. Alternatively, you could override the CC or CXX environment variables to first collect everything and then call the compiler as usual.
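The CC-override idea can be sketched as a tiny wrapper that records the full command line before delegating to the real compiler. In a real build you would save the function body as an executable script (cc-log, say) and run `make CC=cc-log`; the cc-log and REAL_CC names are illustrative:

```shell
# Log the full argument list, then delegate to the real compiler.
log_and_compile() {
    printf '%s %s\n' "$REAL_CC" "$*" >> compile-args.log
    "$REAL_CC" "$@"
}

# Demonstration with 'true' standing in for the real compiler, so the
# sketch runs even without a toolchain installed:
rm -f compile-args.log
REAL_CC=true
log_and_compile -DFOO -Iinclude -c a.c
cat compile-args.log   # prints: true -DFOO -Iinclude -c a.c
```

Every -D and -I flag ends up in compile-args.log, one line per invocation, ready to be replayed to the static analyzer.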

How to make building / compilation more comfortable

My current workflow when developing Apps or programs with Java or C/C++ is as follows:
I don't use any IDE like IntelliJ, Visual Studio, ...
Using Linux or OS X, I use vim as my code editor. When I build with a makefile or (when in Java) gradle, I run :!make and wait for the compiler and linker to create the executable, which is then run automatically.
In case of compilation errors, the output of the compiler can get very long, and the lines exceed the width of the console. So everything gets messy, and it sometimes takes too much time to find out what the first error is (the one often causing all the following compile errors).
My question is: what is your workflow as a C++ developer? For example, is there a way to generate a nicely formatted local HTML file that you can view and update in your browser window? Or other ideas?
Yes, I know. I could use Xcode or any other IDE. But I just don't want.
Compiling in vim with :!make instead of :make doesn't make any sense -- :make is even one of vim's early features. The former expects you to have good eyes. The latter displays compilation errors in the quickfix window, which you can navigate. In other words, there is no need for an auxiliary log file: you can navigate compilation errors even in (a couple of) editors that run in a console.
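The quickfix workflow boils down to a handful of commands; the makeprg setting is only needed when the build is not driven by a plain `make` in the current directory:

```vim
:make                         " run the build and jump to the first error
:copen                        " open the quickfix window listing all errors
:cnext                        " jump to the next error
:cprevious                    " jump back to the previous one
:set makeprg=make\ -C\ build  " point :make at a non-default build command
```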
I did expand on a related topic in https://stackoverflow.com/a/35702919/15934.
Regarding compilation, there are a few plugins that permit compiling in the background. I've added this facility to build-tool-wrapper lately (it requires vim 7.4-1980 -- and it's still in a development branch at this time). This plugin also lets me easily filter errors in the standard library with the venerable STLfilt, and manage several build configurations (each in a separate directory).

Figuring out the intended target system of a makefile

I am trying to understand the exact actions taken by a makefile (this is my first time working with builds that use make).
I want to know whether the makefile is intended to be used with BSD make, GNU make, or Windows nmake. 1) How can I do this without reading the documentation for all three and understanding their differences?
2) After reading several articles, I have come to the conclusion that make is a utility whose primary purpose is building source code into an executable, a DLL, or some other form of output -- something that IDEs usually allow us to do. A makefile is the means of giving instructions to the make utility. Is this right?
3) What is the relation between make and the platform for which the code will be built?
4) The commands that are issued under a dependency line (target:components) are shell commands?
Thanks in advance.
How can I do this without reading documentation of all three and understanding their differences?
Well, most likely you will use GNU Make. I believe it is relatively simple to distinguish Makefiles written for the different flavors of Make.
AFAIK, GNU Make and BSD Make have many differences, at least in their language syntax. For example, in GNU Make a typical conditional directive looks like:
ifdef VARIABLE
# ...
endif
And in BSD Make it is something like this (though I'm not sure):
.if defined(VARIABLE)
# ...
.endif
See also:
How similar/different are gnu make, microsoft nmake and posix standard make?
Use the same makefile for make (Linux) and nmake(Windows)
BSD Make and GNU Make compatible makefile
Makefile is the means of giving instructions to the make utility. Is this right?
Yep, absolutely right.
What is the relation between make and the platform for which the code will be built?
Make is just a utility for tracking dependencies between the artifacts to be built, and for running commands in the proper order to satisfy those dependencies. Which exact commands are used to accomplish the task is up to the user; thus you can use Make for cross-compilation as well.
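Cross compilation, for instance, is usually just a matter of overriding the compiler variables; the Makefile below is a minimal sketch, and the target triplet is one common example:

```make
# Keep the toolchain configurable rather than hard-coded
CC ?= cc
CFLAGS ?= -O2

hello: hello.c
	$(CC) $(CFLAGS) -o $@ $<
```

Invoking `make CC=arm-linux-gnueabihf-gcc` then builds the same target with a cross compiler; make itself never needs to know which platform the commands produce code for.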
The commands that are issued under a dependency line (target : components) are shell commands?
Yes, typically Make spawns a sub-shell to execute each line of the specified recipe.
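The one-shell-per-line behavior is worth a small illustration, since it is a common source of confusion (recipe lines must start with a tab):

```make
demo:
	cd subdir            # shell 1: changes directory, then exits
	pwd                  # shell 2: prints the original directory again
	cd subdir && pwd     # one shell: prints subdir as expected
```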