Custom command to generate an object (.o) from a binary file using autoconf - C++

We have a C++ project that needs to include a binary file in a shared library. On Windows this is done by referencing the binary file from a resource file. On Linux it can be achieved using objcopy, as shown here.
The question is how this can be automated using autoconf/automake. Makefile.am and configure.ac files already exist. Is this going to be a manual task?
(Maybe this question belongs on the Unix Stack Exchange site?)

Does your binary file have a distinctive extension? If so, refer to the Suffixes chapter of the manual:
.bin.o:
	bin2o -o $@ $<
And you then list foo.bin in your foo_SOURCES variable.
If you don't have a distinctive extension, try something like this:
foo_SOURCES = foo.c bar.c baz.c
foo_LDADD = foobin.$(OBJEXT)
foobin.$(OBJEXT): foobin
	bin2o -o $@ $<
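If bin2o is not available, GNU objcopy (which the question already mentions) can fill the same role. A minimal sketch of the suffix rule, assuming an x86-64 ELF target; the architecture flags are assumptions that must match your platform, and a libtool shared library may additionally need the object wrapped as a .lo file:
# Hedged sketch: convert each listed .bin into a linkable object.
# The -O/-B flags below assume x86-64 ELF; adjust for your target.
SUFFIXES = .bin .o
.bin.o:
	objcopy -I binary -O elf64-x86-64 -B i386:x86-64 $< $@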

Related

gmake is deleting some files implicitly on Solaris 10

I am running GNU gmake for my build process. I use file extensions as follows:
.pc: Pro*C code (compiled using Oracle's Pro*C compiler)
.cpp: C++ code (compiled by the GNU g++ compiler)
.o: object file
.mc: C++ code (compiled by the GNU g++ compiler)
.mo: object file
A .pc file gets compiled by the Oracle Pro*C compiler; the output has the extension .cpp, and that in turn gets compiled by g++ to generate a .o file. All .o files eventually get linked together to make the executable file.
For another requirement, I need a special code block within the .pc file, as below:
#ifdef SPCL_BLCK
// some code
#endif // end of SPCL_BLCK
I need two different build paths to produce my executable variants, with and without the special code part.
Without the special code part, the path is:
.pc -> .cpp -> .o -> executable
With the special code part, the path is:
.pc -> .mc -> .mo -> executable
I have my PROC flags defined as:
PROCFLAGS := code=cpp 'include=(<<list of comma-separated include dirs>>)'
Normal rules (for the path without the special code):
.pc.c:
	proc $(PROCFLAGS) $<
%.o: %.c
	g++ -c $(CCFLAGS) $<
where CCFLAGS holds the usual C++ compiler flags.
Rules (for the path with the special code):
%.mo: %.mc
	$(CC) -c $(CCFLAGS) -DSPCL_BLCK -o $@ -x c++ $<
%.mc: %.pc
	$(PROC) $(PROCFLAGS) define=SPCL_BLCK oname=$@ $<
All is well, but after a successful build the .mc files get deleted. I can see that this is done by an rm command, but is there an implicit rule that drives the removal of the .mc files? Is there a way to stop these implicit rules from firing?
I needed to define the following in the makefile:
.PRECIOUS: %.c %.mc
Otherwise GNU make treats the .mc files as intermediate files, because they are created and consumed by a chain of pattern rules, and deletes them once the build finishes.
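A minimal sketch of that fix; .SECONDARY is shown as an alternative, since it also preserves chain-built intermediates but, unlike .PRECIOUS, does not keep half-written files around after an interrupted build:
# Keep chained intermediates: without this, make deletes the
# generated %.c and %.mc files once the final targets are built.
.PRECIOUS: %.c %.mc

# Alternative: keep all intermediate files, without .PRECIOUS's
# keep-on-interrupt behavior.
# .SECONDARY: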

Adding dependency on a non-compiled file in a Makefile

I have the following code generation scheme:
The original file is *.c.macro
It is processed by a special program that generates the *.c file. The program uses external XML files mentioned in the original .c.macro file. Something along this line:
macroprocess foo.c.macro -o foo.c
Then the *.c file is compiled to *.o normally.
What I want to do is make the .c file dependent on the XML files, so that if the XML changes, macroprocessing is automatically redone when the 'make' command is invoked.
I can modify the code of the macroprocessor so that it generates a list of all the XMLs and writes it somewhere, but I have no idea how to incorporate that into a makefile. Apparently I would need to play with the 'include' directive in the makefile, but usually it's employed to add dependencies on actual code like .h files.
If you can get your macroprocess to write a dependency file (e.g. *.xmldep) for each *.c.macro file you can do:
-include $(C_MACRO:.c.macro=.xmldep)
where $(C_MACRO) is a variable that contains your *.c.macro files.
The leading - ensures that make does not fail the first time through, when it cannot yet locate the dependency files.
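Putting it together, a minimal sketch, assuming macroprocess can be taught to emit a make-syntax dependency file such as foo.xmldep (that emission is the part you would add to the macroprocessor):
C_MACRO := foo.c.macro bar.c.macro

# Regenerate the .c whenever its .c.macro changes; assume
# macroprocess also writes foo.xmldep, e.g. "foo.c: a.xml b.xml".
%.c: %.c.macro
	macroprocess $< -o $@

# Pull in the XML dependencies recorded on previous runs; the
# leading '-' tolerates missing .xmldep files on a fresh tree.
-include $(C_MACRO:.c.macro=.xmldep)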
You can add foo.xml as a dependency of foo.c and use the filter function to pass only the %.c.macro files as arguments to macroprocess.
Something like this:
# Basic targets
all: foo
foo: foo.o
	gcc -o $@ $^
foo.o: foo.c
	gcc -o $@ -c $<
# Special target with filter
foo.c: foo.c.macro
	macroprocess -o $@ $(filter %.c.macro,$^)
# Adding foo.xml as a dependency of foo.c
foo.c: foo.xml
.PHONY: all

Managing Dependency Complexity in Make Files

I am working on my first open source C++ project: https://github.com/jehugaleahsa/spider-cpp.
I am managing my own Makefile, and I have a "best practices" question about how to manage dependencies. Right now, I make each .o file depend on every header included by its .cpp file. So:
code.o: code.cpp code.hpp dep1.hpp dep2.hpp
	g++ -c code.cpp
First of all, I am pretty sure Make supports a shorthand for creating object files. If someone would show an example of this, I'd appreciate it.
Next, is there a way to avoid listing every included header as a dependency? I want to be sure that if I change a header, everything that depends on it gets recompiled. Listing the included headers by hand is tedious and easy to get wrong.
OP:
First of all, I am pretty sure Make supports a shorthand for creating object files. If someone would show an example of this, I'd appreciate it.
From here:
OBJS := foo.o bar.o
# Your program should have the objects as dependencies, and link them
proggie: $(OBJS)
	gcc $(OBJS) -o proggie
# compile
%.o: %.c
	gcc -c $(CFLAGS) $*.c -o $*.o
OP:
Next, is there a way to avoid listing every included header as a dependency
Lower down on the same page, see these lines:
# pull in dependency info for *existing* .o files
-include $(OBJS:.o=.d)
# compile and generate dependency info
%.o: %.c
	gcc -c $(CFLAGS) $*.c -o $*.o
	gcc -MM $(CFLAGS) $*.c > $*.d
Basically, this uses gcc's -MM option to obtain the list of headers each source file includes, writes that list to a .d file, and on the next run pulls those lists back in as dependencies, which is what the -include directive does. The leading "-" avoids an error when the .d files don't exist yet.
Note that you should modify the above to account for .cpp files.
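A minimal sketch of that adaptation, assuming g++; this variant uses -MMD -MP to write the .d file as a side effect of compilation rather than in a separate -MM pass (the target and file names are illustrative):
OBJS := foo.o bar.o

proggie: $(OBJS)
	g++ $(OBJS) -o proggie

# -MMD writes foo.d while compiling foo.cpp; -MP adds phony
# targets for the headers so a deleted header doesn't break make.
%.o: %.cpp
	g++ -c $(CXXFLAGS) -MMD -MP $< -o $@

-include $(OBJS:.o=.d)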
Yes, make supports a shorthand for creating object files; these are called pattern rules. And yes, there is a way to avoid listing every included header as a dependency: g++/gcc's -MM option will generate the full list of dependencies.
Unfortunately, there is no simple explanation of how to do it that I could put here. You have to read the docs and play with the make utility. I found the book "Managing Projects with GNU Make" very helpful. There is also the manual on the GNU site, though I found it a bit harder to read: www.gnu.org/software/make/manual/make.html

how to write makefile to take care of changes in the header file

I have the library cryptopp, and what I want is that when I make a change to a file and issue the make command, it takes care of the changes made in any file in the source directory. The GNUmakefile of cryptopp takes care of changes made in the .cpp files, but not of changes made in a .h file.
So what changes can I make in the GNUmakefile of cryptopp so that it looks at all the modified header files and recompiles all the files dependent on a modified header file?
If you are building with g++ you can let g++ generate dependency makefiles.
You can include these in your main makefile.
Use the -M family of options to enable this feature (see http://gcc.gnu.org/onlinedocs/gcc-4.6.1/gcc/Preprocessor-Options.html#Preprocessor-Options).
You have to add all the dependencies to your Makefile:
mycode.o: mycode.cpp mycode.h somelib.h resources.h
	$(CXX) -c -o $@ $< $(CXXFLAGS) $(INCLUDES)
If you already have a generic pattern rule supplying the command line, you don't have to repeat the command; you can just list the dependencies:
%.o: %.cpp
	$(CXX) -c -o $@ $< $(CXXFLAGS) $(INCLUDES)
mycode.o: mycode.cpp mycode.h somelib.h resources.h
yourcode.o: yourcode.cpp yourcode.h mycode.h somethingelse.h
# ...
In general, this is a terrible and unscalable mess. You'll almost certainly want a higher-level build system to generate the Makefile for you. Even for very small projects, keeping the header dependencies up to date in the Makefile is such a pain that it is simply not worth it.
There are several popular portable build environments. I personally like cmake a lot; it detects when you change the build settings (say from Debug to Release) and always builds all the necessary files (for example, if you change the cmake master file and type "make", it automatically reruns cmake for you first).
For a Unix-only solution you could try makedepend, or the infamous autotools, though that's a whole other headache...
You might try makedepend if it's installed on your system. The easiest way is to add a target to your makefile, something like:
depend:
	makedepend *.cc
You might have to replace the '*.cc' with a list of your source files. Then you can regenerate all the dependencies with the 'make depend' command. You may want to redirect its error messages to /dev/null, since makedepend always seems to generate a lot of noise.
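A slightly fuller sketch of that target; the -Y option (skip system include directories) and the variable names are assumptions:
SRCS := $(wildcard *.cc)

# makedepend rewrites the dependency section it maintains at the
# bottom of the Makefile; stderr is silenced because makedepend
# complains loudly about headers it cannot find.
depend:
	makedepend -Y -- $(CXXFLAGS) -- $(SRCS) 2>/dev/null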

Object file generation and best practices for linking using makefiles - C++

Background
I am just getting started with C++ programming on Linux. In my last question, I asked about best practices for using makefiles for a big application. SO users suggested reading Miller's paper on recursive makefiles and avoiding makefile recursion (I was using recursive makefiles).
I have followed Miller and created a makefile like the one below. This is the project structure:
root
...makefile
...main.cpp
...foo
......foo.cpp
......foo.h
......module.mk
My makefile looks like this:
# Main makefile which does the build
CFLAGS =
CC = g++
PROG = fooexe

# each module will append its source files here
SRC :=

# include the module description
include foo/module.mk

OBJ := $(patsubst %.cpp, %.o, $(filter %.cpp,$(SRC))) main.o

# link the program
fooexe: $(OBJ)
	$(CC) -o $(PROG) $(OBJ)

%.o:
	$(CC) -c $(SRC)

main.o:
	$(CC) -c main.cpp

depend:
	makedepend -- $(CFLAGS) -- $(SRC)

.PHONY: clean
clean:
	rm -f *.o
Here is the module.mk in the foo directory.
SRC += foo/foo.cpp
When I run make -n, I get the following output.
g++ -c foo/foo.cpp
g++ -c main.cpp
g++ -o fooexe foo/foo.o main.o
Questions
Where should I create the object (.o) files: all object files in a single directory, or each object file in its own module's directory? I mean, which is the best place to generate foo.o: the foo directory or the root? (My example generates it in the root.)
In the provided example, the g++ -c foo/foo.cpp command generates the .o file in the root directory, but when linking (g++ -o fooexe foo/foo.o main.o) it looks for foo/foo.o. How can I correct this?
Any help would be great.
Where should I create the object (.o) files: all object files in a single directory, or each object file in its own module's directory? I mean, which is the best place to generate foo.o: the foo directory or the root? (My example generates it in the root.)
When investigating failed builds, I find it easier to keep the object files in a separate directory under each module's directory:
foo
|_ build
|_ src
Depending on the size of the project, these object files are grouped to form a component at a higher level, and so on. All components go into a main build directory, which is where the main application can be run from (it has all the dependent libraries etc.).
In the provided example, the g++ -c foo/foo.cpp command generates the .o file in the root directory, but when linking (g++ -o fooexe foo/foo.o main.o) it looks for foo/foo.o. How can I correct this?
Use:
g++ -o fooexe foo.o main.o
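Alternatively, a minimal sketch of a pattern rule with an explicit -o, so each object is written next to its source and the foo/foo.o path in OBJ stays correct (this would replace the %.o: rule from the question):
# $@ carries the directory part, so foo/foo.cpp compiles
# to foo/foo.o, matching what the link step expects.
%.o: %.cpp
	$(CC) $(CFLAGS) -c $< -o $@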
+1 for SCons.
I am using SCons, too. It scans the dependencies for you, and it only rebuilds when the source has changed, since it uses cryptographic hash sums instead of timestamps.
In my SCons builds the objects live in directories parallel to the source (to enable multiple builds, like combinations of 32-bit and 64-bit, release and debug):
src
.build
    linux
        i686
            debug
            release
        x86_64
            debug
            release
With regard to object and other generated interim files, I put these in a directory completely separate from the sources (i.e. under a directory that is excluded from backup and revision control). It may be slightly more bother to set up in projects or makefiles, but it saves time packaging up sources, and it makes clean backups and revision control easier.
I create a subdirectory structure for the object files that matches the subdirectory structure of the sources. Typically I have a separate subdirectory for each of my libraries and programs.
Additionally, I use multiple compilers (and versions) and multiple operating systems, so I reproduce the object-file directory structure under a directory for each of these compilers (which have different versions of the standard and vendor libraries) to prevent object files built against mismatched header versions from mixing.
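A minimal make sketch of that layout; the build directory name and the variables are illustrative:
# Mirror the source tree under build/ so objects never mix with
# sources; mkdir -p creates the module subdirectories on demand.
BUILDDIR := build
OBJ := $(addprefix $(BUILDDIR)/,$(SRC:.cpp=.o))

$(BUILDDIR)/%.o: %.cpp
	@mkdir -p $(dir $@)
	$(CC) $(CFLAGS) -c $< -o $@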
The best thing you can do for yourself is to use something better than Make. SCons is my tool of choice on POSIX systems. Boost also has a build tool that is very flexible, but I had a hard time wrapping my head around it.
Oh, and if you want to use make, go ahead and build recursive makefiles. It really isn't that big a deal. I worked on a gigantic project using tons of recursive makefiles over the last three years, and it worked just fine.