How to use standard library macros with std module in C++23 - c++

I am currently playing with C++ modules, trying to modernize our company's C++ code base to use modules for the core features. In particular there is the upcoming C++23 std module, which seems very interesting as a better alternative to precompiled headers.
So I am using Visual Studio 2022 17.5 Preview 2.0, which has preliminary support for the std module. By the way, I faced an internal compiler error, a bug that I reported to Microsoft.
In all our C++ source files there is now an import std; statement, and this works quite well. Every identifier that is supposed to be in the std namespace seems to be exported as expected. I measured a slight decrease in compilation time compared to the previous use of a precompiled header.
I found out that if you import std;, you get a lot of weird compilation errors if you also #include any standard C++ header, because the Microsoft compiler gets confused and complains about redefinitions. So I took care of removing all of them.
My problem is that there are a few macros defined in the standard library (mostly in the C compatibility library) which are obviously not exported, because C++ modules by design never export macros.
Our code base uses a very limited number of those standard macros, but I think it would be hard to avoid them. Here is the short list (I'm not sure it's complete):
stdout
errno
va_start, va_arg, va_end
For the va_* macros, I #include <stdarg.h> and it compiles fine on VS 2022, although it breaks the rule I mentioned above, probably because this header contains almost nothing but macros. But for stdout and errno, I don't know what to do.
Does C++23 specify how to access important standard macros like stdout or errno when importing the std module? Is there a good workaround?

According to the C++ working draft:
A declaration in the standard library denotes the same entity regardless of whether it was made reachable through including a header, importing a header unit, or importing a C++ library module.
So the redefinition errors you get when mixing #include and import std; are probably a conformance bug relative to C++23. Though considering that Microsoft only claims partial support for the C++23 modular standard library, that's not surprising.
However, for your specific problem, the issue is mostly solved. Many of these macros come from headers that define little besides the macros themselves: errno comes from <cerrno>, and assert comes from <cassert>, so you can simply keep including those headers. stdout is harder, because it lives in <cstdio>, which provides a lot more than just that macro.
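As a rough sketch of what that can look like (the function name is invented for illustration, and whether this mixes cleanly depends on the still-preliminary MSVC std module support):
// Sketch only: include the macro-only C compatibility headers before importing std.
#include <cassert>   // assert
#include <cerrno>    // errno and the E* error-number macros
#include <cstdarg>   // va_list, va_start, va_arg, va_end
import std;

int parse_flag(const char* text) {
    errno = 0;
    long value = std::strtol(text, nullptr, 10);  // std::strtol itself comes from import std;
    assert(errno == 0);
    return static_cast<int>(value);
}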

Related

The mess that is ctime, time.h, sys/time.h

I hope some Linux die-hard can tell me how I should write portable (POSIX) code when working with time functions.
Some SO threads suggest that including ctime is the right thing to do when writing C++ code, whereas you would still include time.h for C code. However, they both declare the same functions, albeit in different namespaces. Technically you should be able to include both.
One SO post suggested that one should AVOID using the sys/*-based includes altogether...
... while this thread implies that sys/time.h must be included before sys/resource.h, in particular on BSD-based platforms.
This post says including sys/time.h improves portability. I imagine the poster thinks it allows you to link against more 3rd-party libraries that use particular functions like gettimeofday... however...
gettimeofday() has been discouraged and now enjoys deprecated status, so I should use clock_gettime() instead. clock_gettime() is declared in time.h; see https://linux.die.net/man/3/clock_gettime ...
... if one installs and links with libavutil (e.g. as part of ffmpeg-dev), it becomes clear that time.h was created to drive people nuts. FFmpeg (and some other libs) has its own time.h, and even timeb.h. It turns out that if ANY .c or .cpp anywhere in your build stack ever includes a time.h, with the include path holding multiple valid entries (including the one for FFmpeg), it may pick up the wrong one, and the declarations are simply replaced. In FFmpeg, the reasoning seems to be that an ugly hack is sufficient to fix the problem. I haven't been that lucky yet. Also, Php-izing all sources does not sound like a solution at all.
Another time.h exists in /usr/include/i386-linux-gnu/bits on my system, so this isn't an FFmpeg-only phenomenon either. Simply adding /usr/include/i386-linux-gnu as an include path thus becomes deadly, which is odd for a directory of system includes.
I've rewritten my CMake scripts, taking care to use dedicated include folder specs for most of the targets. I've tried including all sorts of permutations of time.h/ctime and sys/time.h in a precompiled header that is referred to throughout the codebase. I still get errors like:
error: field ‘st_atim’ has incomplete type ‘timespec’, struct timespec st_atim;
error: ‘::time’ has not been declared
etc..
So, for a C++ setup linking with many 3rd party dependencies, what is the correct way to make sure everything keeps compiling w.r.t. including time.h? Should I pin the time.h include for the system to the specific platform I'm compiling to? Should I go over all targets that possibly need time.h? Dancing elephants and confetti lie ahead.
Update: The issue seems to be related to the C++ standard version, as was hinted at in the comments below. I've since updated GCC to 8.3.0 (from 5.4), and I've given up on supporting anything below C++11 on Linux. After updating and rebuilding all 3rd-party packages (including FFmpeg), I no longer have the issue I described, but that doesn't mean it's fixed in the sense that it can't occur again for someone else. In fact I think the issue is mainly with how FFmpeg compiles on older compilers without C++11 explicitly requested, so I'm leaving the question open.
I recommend you consider Howard Hinnant's date library that has been accepted into the next version of the C++ standard library (C++20):
https://github.com/HowardHinnant/date
It should work on any platform that supports C++11 or later.
The Standard version is documented here: https://en.cppreference.com/w/cpp/chrono
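A minimal sketch of the standardized C++20 facilities (assuming a standard library with complete calendar support):
#include <chrono>
#include <iostream>

int main() {
    using namespace std::chrono;
    const auto today = floor<days>(system_clock::now());  // truncate to whole days (UTC)
    const year_month_day ymd{today};
    std::cout << ymd << '\n';  // e.g. 2023-05-01
}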

Apply language standard to just part of a source file?

Is it possible to enforce a C or C++ language standard for just part of a source file? I want to ensure that my source file is C90 compliant, but it depends on some headers that require C99. The compliance of those headers isn't important to me right now (I can compile with C99 for the time being), but I want to minimize effort required to port my code to a more restrictive platform in the future.
In short, I want a language standard to apply to the entire file except for the included headers. Given how header inclusion works in C and C++, I figure that the general problem is to apply a language standard to an arbitrary portion of a given source file.
I'm working with GCC in particular, but I'm also curious whether this is possible with other compilers (MSVC, Clang).
If you want to make sure your code is C90-compatible while still relying on headers that use C99, you can enable the GCC warning flag -Wc90-c99-compat. This lets you use C99 features, but emits a warning wherever you use a feature not available in C90. To avoid generating these warnings for your header files (which are presumably fine as they are), enable the warning with a pragma after you include them.
This will basically achieve what you want. The warnings will only be emitted for the code you specify.
#include "myheader.h"
#pragma GCC diagnostic warning "-Wc90-c99-compat"
void func(void) { ... }
A stricter version would be to use:
#pragma GCC diagnostic error "-Wc90-c99-compat"
As far as I know, Clang is mostly compatible, but MSVC doesn't really support C99 that well to begin with, so there you're on your own.
No. You cannot build a file (compilation unit) as partly one language standard and partly another.
What you can do, however, is split the file into two files and compile each of them with a different language standard (just be sure that it is then still well defined to link the resulting object files together).
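As a rough sketch of that split (the file names, flags, and declaration are made up for illustration):
// shared.h - anything declared here must be valid and mean the same thing in both dialects
int sum_values(const int* values, int count);

// legacy_part.cpp is compiled with:  g++ -std=c++98 -c legacy_part.cpp
// modern_part.cpp is compiled with:  g++ -std=c++17 -c modern_part.cpp
// and the objects are linked with:   g++ legacy_part.o modern_part.o -o app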
A better solution would be to ensure that all the code complies with the newer or older language standard and then just use that.

Intel <math.h> vs C <math.h>?

I have a C++ project on Linux where I have added the include path:
/opt/intel/include/
so that I can use certain Intel libraries. However, I also wish to use the standard C/C++ math.h so that I can call pow(x,y);
I included <math.h>, used using namespace ::std, and then made a call to pow(x,y). The compiler (GCC 4.7) complains:
/opt/intel/include/math.h:27:3: error: #error "This Intel <math.h> is for use with only the Intel compilers!"
How do I specify that I am referring to the C/C++ math.h pow() and not the Intel pow()?
This is the reason C++ uses namespaces for this sort of thing and also uses more specific header names that are less likely to collide with other libraries.
If you #include <cmath> (which you ought to do in C++ software, rather than <math.h>), you can distinguish between the standard library's implementation and Intel's by writing std::pow(...). This is another reason not to apply using namespace std; willy-nilly: it might make code appear cleaner, but the function names used in the standard library are so generic that they frequently collide with other libraries.
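A minimal illustration (the function itself is just an example):
#include <cmath>

double cube(double x) {
    return std::pow(x, 3.0);  // unambiguously the standard library's pow
}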
It seems the #error in Intel's <math.h> is rather blunt and obvious: the header is guarded against use with other compilers, probably because it depends on specific extensions (e.g., built-in functions) not available in other compilers. If it really is <math.h>, it would be part of the standard C or C++ library and as such tied to the compiler, unless it is explicitly part of the platform ABI, which doesn't seem to be the case.
It doesn't quite look as if you want to use Intel's <math.h> but only other headers from Intel's library. In that case one of the following techniques may work:
Specify the location of the system/gcc <math.h> with another -I option preceding the one for Intel's headers: the order in which headers are searched is generally the same order in which the -I options appear.
Do not use a -I directive to find Intel's headers but include them with a path name or with a relative path name (the latter in combination with a -I option, e.g., -I/opt/intel).
Create a custom directory with symbolic links to the headers/directories in /opt/intel/include and remove any header which you want to get picked up from somewhere else. Alternatively, work the other way around: create a symbolic link to each header needed from /opt/intel/include.
Since this directory doesn't seem to be constructed to be usable as a drop-in for other compilers, it is quite possible that none of this will work: headers shipping with a specific compiler have a tendency to be specific to that compiler. In addition, you'll also need to link against the corresponding Intel libraries, and I'm not sure if the Intel compiler and gcc use the same ABI (on Linux they may use a common ABI, though).

Visual C++ Standard Library keywords

I wanted to write a Unicode version of std::exception and std::runtime_error.
So I thought, what better way than to just take the implementations from the C++ Standard Library and alter them to support Unicode?
So I pulled up the exception and stdexcept headers in Visual C++, copied the code, made my changes.
The thing is, I couldn't get it to link unless I removed the _CRTIMP_PURE prefix.
I also removed the _EXCEPTION_INLINE __CLR_OR_THIS_CALL prefix from all the member functions.
It's working but I'm very curious what all those things did.
_EXCEPTION_INLINE is literally defined right above it as #define _EXCEPTION_INLINE (i.e., to nothing), and my googling skills can't find any documentation on what these macros do.
So, does anyone know what these are meant to do? And why it wouldn't link until I removed the _CRTIMP_PURE prefix from the class?
These aren't really anything mysterious (though it might be a bit of a pain to track down where they're defined - but only a little). They're defined in headers that are part of the library, and they take on different definitions depending on how the compiler is configured for the current run. In particular, these macros seem to be concerned mostly with whether or not the current run is configured for /clr:pure.
_CRTIMP_PURE is defined to __declspec(dllimport) if you're linking against the DLL version of the C runtime (and not building with /clr:pure), and defined to nothing otherwise.
If your library isn't a DLL (or if it won't necessarily be a DLL whenever the DLL runtime is configured), then you shouldn't use it. You probably shouldn't use it anyway, because you'd need to define it differently when building your library than when your library is being used (that's what Microsoft does when they build the C runtime libraries).
__CLR_OR_THIS_CALL is used by Microsoft's libraries to declare a function with __clrcall if you're building with /clr:pure (indicating these functions will only be called by managed code - the compiler can perform certain optimizations in that case it seems).
Finally, _EXCEPTION_INLINE is used to make the member functions of class exception inline if building with /clr:pure.
So the bottom line is, don't use __CLR_OR_THIS_CALL or _EXCEPTION_INLINE unless you plan to support /clr for your library, and you probably shouldn't use _CRTIMP_PURE in your implementation, but probably should use something similar of your own making and under your own control.
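A rough sketch of what "something similar of your own making" could look like; all names here are invented for illustration and are not part of Microsoft's headers:
// mylib_config.h (hypothetical)
#if defined(MYLIB_BUILD_DLL)
#  define MYLIB_API __declspec(dllexport)  // defined while building the DLL itself
#elif defined(MYLIB_USE_DLL)
#  define MYLIB_API __declspec(dllimport)  // defined by consumers linking against the DLL
#else
#  define MYLIB_API                        // static or non-DLL build: expands to nothing
#endif

class MYLIB_API my_error { /* ... */ };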
Standard library implementation code often uses compiler-specific extensions for performance/documentation/reliability reasons. You should not use these extensions in your own code, because they can break in subsequent versions of the compiler.
There's no problem in duplicating the interface of std::exception and friends, and you can look at Visual Studio's implementation for inspiration. But your own implementation should only use publicly documented language/library features.
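For instance, a minimal sketch of such a class using only documented features (the names and the wide_what() design are purely illustrative, not taken from Microsoft's code):
#include <exception>
#include <string>
#include <utility>

// Illustrative only: a runtime-error-like type carrying a wide-character message.
class wruntime_error : public std::exception {
public:
    explicit wruntime_error(std::wstring message) : message_(std::move(message)) {}

    // std::exception::what() must return a narrow string, so the wide text is
    // exposed through a separate accessor.
    const char* what() const noexcept override { return "wruntime_error"; }
    const std::wstring& wide_what() const noexcept { return message_; }

private:
    std::wstring message_;
};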

Header Files in C/C++, Standard?

Are header files standard or different in gcc vs other compilers?
It's not really clear what you are asking, but the "standard" header files are only standard in the sense that they (should) meet the C/C++ standard (as specified by the governing body, e.g. ANSI, etc.).
Different compilers often meet these standards through different implementations, at least when the standard allows them to do so.
In other words, you should only rely on the behavior that is specified in the standard, as specific implementations may vary slightly.
Standard header files are called that because they are defined as part of the ANSI C/C++ standards, and so they will be the same for all compilers that are ANSI-compliant.
I hope I understand your question but here's my go.
Header files (.h) that go along with a .cpp file to create a class are how you do things in C++.
For most cases, a SomeClass.h will prototype the class, and the SomeClass.cpp will contain the code necessary for the class to work.
If for some reason GCC does something very different for compilation, then I have no idea. I assume it's the same idea for any compiler.
The concept of header files, outside of the standard ones required for the standard library, is not specified by the standard. But using #include to specify files to import is. So that is standard, as is the general order in which the compiler searches for those files. And the #ifndef BLAH method for avoiding multiple inclusion is also backed by the standard, insofar as the behavior of the preprocessor is well defined (although, as I said, the standard is silent on whether or not you use such guards at all). #pragma once is not standard, though, so use it at your own risk.
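For reference, the #ifndef idiom looks like this (the guard macro name is arbitrary but must be unique per header):
#ifndef MYPROJECT_WIDGET_H   // skip the body if this header was already included
#define MYPROJECT_WIDGET_H

// ... declarations ...

#endif /* MYPROJECT_WIDGET_H */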
You may find minor variations between different compiler suites. But more significantly, you will find a variety of libraries and header files across different platforms. For instance, GCC is often found on POSIX systems, so it's quite common to find, say, <pthread.h>, whenever __GNUC__ is pre-defined. This leads to assumptive code like the following:
#ifdef __GNUC__
#include <pthread.h>
#else
#include <windows.h>
#endif
If in doubt, favor the C++ Standard Library when using C++ and the C Standard Library when using C. (But continue to expect a few niggling inconsistencies caused mostly by different compiler versions.)
Also, test that your code builds and runs on different systems. If it works using Visual Studio under Windows and GCC under Linux, you can be somewhat assured that porting your code to other systems will be straightforward.