I have used Boost.Log in one project for a while and it's a really great logging library.
But to use even a basic log statement like BOOST_LOG_TRIVIAL(lvl) I need to include boost/log/trivial.hpp, which pulls a lot of Boost-related machinery into scope.
Is there any way (a wrapper or an alias, for example) to "hide" boost/log/trivial.hpp and BOOST_LOG_TRIVIAL(lvl), so I can be sure other developers can only call the wrapped version of BOOST_LOG_TRIVIAL(lvl)?
At the very beginning I thought BOOST_LOG_TRIVIAL(lvl) was a simple stream object, but then I found that it expands to:
#define BOOST_LOG_STREAM_WITH_PARAMS_INTERNAL(logger, rec_var, params_seq)\
for (::boost::log::record rec_var = (logger).open_record((BOOST_PP_SEQ_ENUM(params_seq))); !!rec_var;)\
::boost::log::aux::make_record_pump((logger), rec_var).stream()
There is a for-loop here and I have no idea how to wrap it.
Replicating what that macro does is a non-trivial task. You need a macro too, but if you want to hide all the Boost details in an implementation unit, then you need to replicate the method calls in some way.
Firstly you need a wrapper around boost::log::record, then a wrapper around boost::log::record_pump<>, and finally methods to create your wrapper instances. The end result should look something like this:
namespace foo {
enum severity {
debug, info, warn, error
};
struct record {
// use pimpl to hide
};
struct record_pump {
// use pimpl to hide
};
// This will call the trivial logger to create a record and then
// wrap with your record
record make_record(severity lvl);
// This method will call the boost `boost::log::make_record_pump` method
// with the trivial logger and wrap the resulting `record_pump<T>` by your wrapper
record_pump make_record_pump(record& rec);
}
Then declare a macro that looks like the Boost macro - remember you'll need to expose the severity levels through your own enum:
#define LOG(lvl) \
for (auto rec = foo::make_record(lvl); !!rec;) \
foo::make_record_pump(rec).stream()
You'll need to implement the necessary bits of the record and record_pump interface.
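For illustration, here is a minimal sketch of the hiding idea using a simplification of the above: instead of pimpl-wrapping record and record_pump, a small stream facade declared in the header buffers the message and hands the finished string to BOOST_LOG_TRIVIAL inside the .cpp, so client code never includes a Boost header. The names foo::log_stream and FOO_LOG are hypothetical.
// logging.h - no Boost headers here
#include <sstream>
namespace foo {
enum severity { debug, info, warn, error };
class log_stream {
public:
    explicit log_stream(severity lvl) : lvl_(lvl) {}
    ~log_stream() { flush(); }               // forwards the buffered message to Boost.Log
    template <class T>
    log_stream& operator<<(const T& value) { buf_ << value; return *this; }
private:
    void flush();                            // defined in logging.cpp
    severity lvl_;
    std::ostringstream buf_;
};
} // namespace foo
#define FOO_LOG(lvl) ::foo::log_stream(::foo::lvl)

// logging.cpp - the only file that includes Boost.Log
#include "logging.h"
#include <boost/log/trivial.hpp>
void foo::log_stream::flush() {
    switch (lvl_) {
    case debug: BOOST_LOG_TRIVIAL(debug)   << buf_.str(); break;
    case info:  BOOST_LOG_TRIVIAL(info)    << buf_.str(); break;
    case warn:  BOOST_LOG_TRIVIAL(warning) << buf_.str(); break;
    case error: BOOST_LOG_TRIVIAL(error)   << buf_.str(); break;
    }
}
A call like FOO_LOG(info) << "value = " << 42; builds the message in the temporary and emits it through the trivial logger when the temporary is destroyed at the end of the statement; the trade-off versus the record/record_pump wrappers is one extra string copy per message and losing the early exit when a severity is filtered out.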
Related
I am creating a GCC plugin that analyses a C++ file after parsing it.
The plugin walks through the classes and generates some information about them.
The plugin is working; this is how I walk through the classes:
cp_binding_level* level(NAMESPACE_LEVEL(nameSpace));
for (tree decl = level->names; decl != 0; decl = TREE_CHAIN(decl)) {
tree type(TREE_TYPE(decl));
tree_code dc(TREE_CODE(decl));
tree_code tc(TREE_CODE(type));
if (dc == TYPE_DECL && tc == RECORD_TYPE &&
!DECL_IS_BUILTIN(decl) && DECL_ARTIFICIAL(decl)) {
//Now we know this is a class
//Do something
}
}
I would like to choose which classes the plugin can analyse and which ones it can't.
My first idea is to add some sort of annotation that I would read when parsing the class, and then decide whether to analyse it or not.
I have never used any sort of annotation in C++, so I don't know if this is possible. If it is, how would you recommend using them, and how do I get at the annotation inside the plugin?
If it's not possible, is there a good way to do what I need?
It can be done, it is not too hard, and it is a pretty common thing to do using a GCC plugin.
First you must register a new attribute. GCC provides the PLUGIN_ATTRIBUTES callback as a convenient time to do so. Your callback function can then call register_attribute to register attributes. This is documented in the manual, just one link away from the spot you linked to.
With this function you register another callback that is called when your attribute is applied. You'll have to read some GCC header files or source to really understand what this function should do. But, it can easily track whether it is being applied to a class and, if so, make a note of this for later processing.
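As a rough illustration of these steps (a sketch only: the attribute_spec field layout changes between GCC versions, so the initializer below, written against the older 4.x-era tree.h, is an assumption to check against your headers), registering an "analyse" attribute might look like this:
#include "gcc-plugin.h"
#include "tree.h"

int plugin_is_GPL_compatible;

// Handler invoked when __attribute__((analyse)) is applied. Returning
// NULL_TREE and leaving *no_add_attrs untouched keeps the attribute attached,
// so it can be found later with lookup_attribute.
static tree handle_analyse_attribute(tree *node, tree name, tree args,
                                     int flags, bool *no_add_attrs)
{
    return NULL_TREE;
}

// Field order follows older GCC releases; newer ones add fields.
static struct attribute_spec analyse_attr =
    { "analyse", 0, 0, false, false, false, handle_analyse_attribute };

static void register_analyse_attribute(void *event_data, void *user_data)
{
    register_attribute(&analyse_attr);
}

int plugin_init(struct plugin_name_args *info,
                struct plugin_gcc_version *version)
{
    register_callback(info->base_name, PLUGIN_ATTRIBUTES,
                      register_analyse_attribute, NULL);
    // ... register the existing analysis callback here ...
    return 0;
}
Users could then mark a class with class __attribute__((analyse)) Foo { ... };, and inside the existing loop the plugin can test something like lookup_attribute("analyse", TYPE_ATTRIBUTES(type)) to decide whether that class should be analysed.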
In C++ I want to write functions that, when declared, get automatically added to a map (or vector, it doesn't really matter in this case) as function pointers and are called later automatically. For example this would be useful if I am writing a unit test framework and I just want users to declare each of their unit tests like this:
UNIT_TEST_FUNCTION(function_name){
// do something
}
and have something like this generated and called behind the scenes instead:
void function_name(){
//do something
}
int temp = register_function("function_name", function_name);
Here register_function() adds the user-defined function to a map of function pointers, for example. So basically, I need a mechanism that adds additional lines of code after a function definition, so that some action is performed automatically on the defined function. Is this possible using macros, perhaps?
A macro can only generate a consecutive block of text. It can't lay things out the way you show in the question.
However if you're willing to rearrange a little, it can be done.
#define UNIT_TEST_FUNCTION(function_name) \
void function_name(); /* forward declaration */ \
int temp##function_name = register_function(#function_name, function_name); \
void function_name()
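For completeness, a minimal sketch of the registration side this macro assumes (register_function and the registry are hypothetical names taken from the question):
#include <map>
#include <string>

typedef void (*test_fn)();

// Function-local static so the map exists before any registration runs.
inline std::map<std::string, test_fn>& test_registry()
{
    static std::map<std::string, test_fn> registry;
    return registry;
}

inline int register_function(const std::string& name, test_fn fn)
{
    test_registry()[name] = fn;
    return 0;
}
With that in place, UNIT_TEST_FUNCTION(my_test) { ... } expands to a forward declaration of my_test, a namespace-scope int tempmy_test whose initializer registers the function, and then the definition of my_test itself.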
A single preprocessor macro can't do what you want because it can only generate a single, contiguous block of text. Preprocessor macros are stupid in the sense that they don't understand anything about the language -- hence the preprocessor in 'preprocessor macro'.
What you can do is use a pair of macros or tuple of macros to delimit the begin and end of your test case mapping, and a single macro for each individual test case. Something along these lines:
TEST_CASES_BEGIN
UNIT_TEST_FUNCTION(function_name){
// do something
}
TEST_CASES_END
The Boost unit test facility uses a mechanism very similar to this. You might even (eventually) find this design to be a little more expressive than the design you are trying to achieve.
I was assigned the task of updating a very old project a while back. The first thing I had to do was to expand the existing code to incorporate a new feature. As part of this I modified existing macros to print JSON representations of incoming messages (over CORBA, into C++ structs). I then incorporated boost program_options and a new logger and now I want to modernise the macros.
The problem is that I have no idea how to implement what I did with the macros with templates. The key problem is that I use the name of the parameters to the macros to access the fields of the struct:
//Defines the string that precedes the variable name in a JSON name-value pair (newline,indent,")
#define JSON_PRE_VNAME _T("%s,\n\t\t\t\t\"")
//Defines the string that follows the variable name in a JSON name-value pair (":) preceding the value
#define JSON_SEP _T("\":")
#define printHex(Y,X) _tprintf(_T("%02X"), (unsigned char)##Y->##X );
// ******** MACRO **********
// printParam (StructureFieldName=X, ParamType=Y)
// prints out a json key value pair.
// e.g. printParam(AgentId, %s) will print "AgentId":"3910"
// e.g. printParam(TempAgent, %d) will print "TempAgent":1
#define printParam(X,Y) if(strcmp(#Y,"%s")==0){\
_byteCount += _stprintf(_logBuf,JSON_PRE_VNAME _T(#X) JSON_SEP _T("\"%s\""),_logBuf,myEvent->##X);\
}else{\
_byteCount += _stprintf(_logBuf,JSON_PRE_VNAME _T(#X) JSON_SEP _T(#Y),_logBuf,myEvent->##X);\
}\
printBufToLog();
And it is used like this:
//CORBA EVENT AS STRUCT "event"
else if(event.type == NI_eventSendInformationToHost ){
evSendInformationToHost *myEvent;
event.data >>= myEvent; //demarshall
printParam(EventTime,%d);
printParam(Id,%d);
printParam(NodeId,%d);
}
and this results in JSON like this:
"EventTime":1299239194,
"Id":1234567,
"NodeId":3
etc...
Obviously I have commented these macros fairly well, but for the sake of anyone else looking at the code I am hoping there is a nice way to achieve the same result with templates. I have to say the macros do make it very easy to add new events to the message logger.
Basically how do I do "#X" and ##X with templates?
Any pointers would be appreciated.
Thanks!
There are some things that you cannot really do without macros, and for some specific contexts macros are the solution. I would just leave the macros as they are and move on to the next task.
Well, I would actually try to improve the macros a bit. It is usually recommended not to put a trailing ; inside macros, and to wrap macros that contain more than a single statement in a do { } while(0) loop:
#define printHex(Y,X) _tprintf(_T("%02X"), (unsigned char)##Y->##X )
// remove ; ^
// add do while here:
#define printParam(X,Y) do { if(strcmp(#Y,"%s")==0){\
_byteCount += _stprintf(_logBuf,JSON_PRE_VNAME _T(#X) JSON_SEP _T("\"%s\""),_logBuf,myEvent->##X);\
}else{\
_byteCount += _stprintf(_logBuf,JSON_PRE_VNAME _T(#X) JSON_SEP _T(#Y),_logBuf,myEvent->##X);\
}\
printBufToLog();\
} while (false)
This helps avoid small mistakes that would otherwise be hard to track down, for example when the macros are used with if:
if (condition) printHex(a,b);
else printHex(c,d);
// looks good, but would originally expand to a syntax error:
if (condition) _tprintf(_T("%02X"), (unsigned char)a->b );;
else ...
Similarly
if (condition) printParam(a,b);
else ...
would expand to a whole lot of nonsense for the compiler, even if it looks correct enough to the casual eye.
I think that in many cases it's better to use an external code generator... starting from a nice neutral definition it's easy to generate C++, Javascript and whatnot to handle your data.
C++ templates are quite primitive, and structure/class introspection is simply absent. By playing some tricks you can manage to do ifs and loops (wow! what an accomplishment), but a lot of useful techniques are just out of reach. Also, once you get your hard-to-debug template trickery working, at the first error the programmer makes you get screens and screens of babbling nonsense instead of a clear error message.
On the other hand you have the C preprocessor, which is horribly weak at doing any real processing and is just a little more (and also less) than a regexp search/replace.
Why cling to poor tools instead of implementing a separate code generation phase (one that can easily be integrated into the make process), where you can use a serious language of your choice that handles both processing and text manipulation easily?
How easy would it be to write a neutral, easy-to-parse file and then use, for example, a Python program to generate the C++ struct declarations, the serialization code, and also the JavaScript counterpart for that?
I am creating a logging facility for my library, and have made some nice macros such as:
#define DEBUG myDebuggingClass(__FILE__, __FUNCTION__, __LINE__)
#define WARNING myWarningClass(__FILE__, __FUNCTION__, __LINE__)
where myDebuggingClass and myWarningClass both have an overloaded << operator, and do some helpful things with log messages.
Now, I have some base class that users will be overloading called "Widget", and I would like to change these definitions to something more like:
#define DEBUG myDebuggingClass(__FILE__, __FUNCTION__, __LINE__, this)
#define WARNING myWarningClass(__FILE__, __FUNCTION__, __LINE__, this)
so that when users call 'DEBUG << "Some Message"; ' I can check to see if the "this" argument dynamic_casts to a Widget, and if so I can do some useful things with that information, and if not then I can just ignore it. The only problem is that I would like users to be able to also issue DEBUG and WARNING messages from non-member functions, such as main(). However, given this simple macro, users will just get a compilation error because "this" will not be defined outside of class member functions.
The easiest solution is to just define separate WIDGET_DEBUG, WIDGET_WARNING, PLAIN_DEBUG, and PLAIN_WARNING macros and to document the differences to the users, but it would be really cool if there were a way to get around this. Has anyone seen any tricks for doing this sort of thing?
Declare a global Widget* const widget_this = NULL; and a protected member variable widget_this in the Widget class, initialized to this, and do
#define DEBUG myDebuggingClass(__FILE__, __FUNCTION__, __LINE__, widget_this)
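Spelled out, the suggestion might look something like this (a sketch; myDebuggingClass is assumed to take a Widget* that may be null as its fourth argument, and the member shadows the global inside Widget member functions through ordinary name lookup):
class Widget;                        // forward declaration

Widget* const widget_this = NULL;    // global fallback used outside member functions

class Widget {
protected:
    Widget* const widget_this;       // found before the global inside member functions
public:
    Widget() : widget_this(this) {}
};

#define DEBUG myDebuggingClass(__FILE__, __FUNCTION__, __LINE__, widget_this)
Inside any Widget (or derived) member function, DEBUG picks up the member and passes the real object; everywhere else it silently passes the null global.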
Macros are basically a straight text substitution done by the preprocessor. There's no way for a macro to know the context from which it's being called to do the sort of detection you're interested in.
The best solution is probably separate macros as you suspect.
I don't think you can do this with a macro. You can probably manage to do it with SFINAE, but code that uses SFINAE (at least directly) is¹ hard to write, harder to debug, and virtually impossible for anybody but an expert to read or understand. If you really want to do this, I'd try to see if you can get Boost enable_if (or a relative thereof) to handle at least part of the dirty work.
¹ ...at least in every case I've ever seen, and I have a hard time imagining it being otherwise either.
Inspired by solipist, but slightly simpler in the implementation:
class Widget {
protected:
::myDebuggingClass myDebuggingClass(char const* file, char const* function, int line) {
return ::myDebuggingClass(file, function, line, this);
}
// ...
};
This eliminates the need for a shadowed variable; it relies on simple class name lookup rules.
The only way I can think of to possibly get this to work is to define a global variable:
Widget * this = NULL;
If that even compiles (I have my doubts, but don't have a compiler to test it), member functions will use the nearest scoped variable (the real this pointer), and everything else will get a null. Everyone's happy (so to speak...)
You could use a weak reference to detect whether a variable or function exists.
For example, to detect whether int a exists:
int a __attribute__((weak));
if (&a)
    // a exists
else
    // a does not exist
the context
I'm working on a project that has some "modules".
What I call a module here is a simple class implementing a particular functionality and deriving from an abstract class GenericModule, which enforces an interface.
New modules are supposed to be added in the future.
Several instances of a module can be loaded at the same time, or none, depending on the configuration file.
I thought it would be great if a future developer could just "register" his module with the system in a single line, more or less the same way tests are registered in Google Test.
the context² (technical)
I'm building the project with visual studio 2005.
The code is entirely in a library, except the main() which is in an exec project.
I'd like to keep it that way.
my solution
I found inspiration in what they did with google test.
I created a templated factory, which looks more or less like this (I've skipped the uninteresting parts to keep this question somewhat readable):
class CModuleFactory : boost::noncopyable
{
public:
virtual ~CModuleFactory() {};
virtual CModuleGenerique* operator()(
const boost::property_tree::ptree& rParametres ) const = 0;
};
template <class T>
class CModuleFactoryImpl : public CModuleFactory
{
public:
CModuleGenerique* operator()(
const boost::property_tree::ptree& rParametres ) const
{
return new T( rParametres );
}
};
and a method that is supposed to register the module and add its factory to a list:
class CGenericModule
{
// ...
template <class T>
static int declareModule( const std::string& rstrModuleName )
{
// create the factory
CModuleFactoryImpl<T>* pFactory = new CModuleFactoryImpl<T>();
// adds the factory to a map of "id" => factory
CAcquisition::s_mapModuleFactory()[rstrModuleName ] = pFactory;
return 0;
}
};
Now, in a module, all I need to do to declare it is:
static int initModule =
acquisition::CGenericModule::declareModule<acquisition::modules::CMyMod>(
"mod_name"
);
(in the future it'll be wrapped in a macro allowing me to write
DECLARE_MODULE( "mod_name", acquisition::modules::CMyMod );
)
the problem
Alright, now the problem.
The thing is, it does work, but not exactly the way I'd want.
The method declareModule is not called if I put the definition of initModule in the module's .cpp (where I'd like to have it), or even in its .h.
If I put the static init in a used .cpp file... it works.
By used I mean: having code that is called from elsewhere.
The thing is, Visual Studio seems to discard the entire .obj when building the library. I guess that's because it's not being used anywhere.
I activated verbose linking, and in pass 2 it lists the .objs in the library; the module's .obj isn't there.
almost resolved?
I found this and tried to add the /OPT:NOREF option but it didn't work.
I didn't try to put a function in the module's .h and call it from elsewhere, because the whole point is being able to declare it in one line in its own file.
Also I think the problem is similar to this one, but the solution is for g++, not Visual C++ :'(
edit: I just read the note in the answer to this question. Well, if I #include the module's .h from another .cpp and put the init in the module's .h, it works, and the initialization is actually done twice... once in each compilation unit? Well, it seems it happens in the module's compilation unit...
side notes
Please, if you don't agree with what I'm trying to do, feel free to say so, but I'm still interested in a solution.
If you want this kind of self-registering behavior in your "modules", your assumption that the linker is optimizing out initModule because it is not directly referenced may be incorrect (though it could also be correct :-).
When you register these modules, are you modifying another static variable defined at file scope? If so, you at least have an initialization order problem. This could even manifest itself only in release builds (initialization order can vary depending on compiler settings) which might lead you to believe that the linker is optimizing out this initModule variable even though it may not be doing so.
The module-registry kind of variable (be it a list of registrants or whatever it is) should be lazily constructed if you want to do things this way. Example:
static vector<string> unsafe_static; // bad
vector<string>& safe_static()
{
static vector<string> f;
return f;
} // ok
Note that the above has problems with concurrency. Some thread synchronization is needed for multiple threads calling safe_static.
I suspect your real problem has to do with initialization order even though it may appear that the initModule definition is being excluded by the linker. Typically linkers don't omit references which have side effects.
If you find out for a fact that it's not an initialization order problem and that the code is being omitted by the linker, then one way to force it is to export initModule (ex: dllexport on MSVC).
You should think carefully about whether this kind of self-registration behavior really outweighs the simple process of adding on to a list of function calls to initialize your "modules".
You could also achieve this more naturally if each "module" was defined in a separate shared library/DLL, in which case your macro could just be defining the function to export, which can be added automatically by the host application. Of course that carries the burden of having to define a separate project for each "module" you create, as opposed to just adding a self-registering cpp file to an existing project.
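If it does turn out to be an initialization order problem, applying the lazily constructed registry idea to the factory map from the question could look roughly like this (a sketch reusing the question's CModuleFactory name; the s_mapModuleFactory() accessor may already work this way):
#include <map>
#include <string>

class CModuleFactory;  // as defined in the question

// Returns the registry, constructing it on first use so it is guaranteed to
// exist before any static initModule registration runs.
std::map<std::string, CModuleFactory*>& moduleFactoryMap()
{
    static std::map<std::string, CModuleFactory*> s_map;
    return s_map;
}

// declareModule then stores each factory through the accessor:
//     moduleFactoryMap()[rstrModuleName] = pFactory;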
I've got something similar based on the code from wxWidgets, however I've only ever used it as a DLL. The wxWidgets code works with static libs however.
The bit that might make a difference is that in wx the equivalent of the following is defined at class scope.
static int initModule =
acquisition::CGenericModule::declareModule<acquisition::modules::CMyMod>(
"mod_name"
);
Something like the following, where the Factory member, because it is static, is constructed at load time and thereby adds the class to the factory list.
#define DECLARE_CLASS(name)\
class name: public Interface { \
private: \
static Factory m_reg;\
static std::auto_ptr<Interface > clone();
#define IMPLEMENT_IAUTH(name,method)\
Factory name::m_reg(method,name::clone);
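For context, a Factory along these lines is roughly what those macros assume (a sketch rather than the real wxWidgets code; I'm assuming the method argument of IMPLEMENT_IAUTH is the registration key):
#include <map>
#include <memory>
#include <string>

class Interface;

class Factory {
public:
    typedef std::auto_ptr<Interface> (*CreateFn)();

    // Runs when the static m_reg member is constructed, i.e. at load time,
    // which is what pulls the class into the registry.
    Factory(const std::string& name, CreateFn create)
    {
        registry()[name] = create;
    }

    static std::map<std::string, CreateFn>& registry()
    {
        static std::map<std::string, CreateFn> s_registry;  // constructed on first use
        return s_registry;
    }
};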