My autotools project has a couple of unit tests.
One of these tests (filereader) needs to read a file (data/test1.bin).
Here's my filesystem layout:
- libfoo/tests/filereader.c
- libfoo/tests/data/test1.bin
and my libfoo/tests/Makefile.am:
AUTOMAKE_OPTIONS = foreign
AM_CPPFLAGS = -I$(top_srcdir)/foo
LDADD = $(top_builddir)/src/libfoo.la
EXTRA_DIST = data/test1.bin
TESTS = filereader
check_PROGRAMS = filereader
filereader_SOURCES = filereader.c
This works great, as long as I do in-tree builds.
However, when running the test suite out-of-tree (e.g. during make distcheck), the filereader test cannot find the input file anymore.
This is obviously because only the source tree contains the input file, not the build tree.
I wonder what the canonical way to fix this problem is:
- compile the directory of the test file into the unit test (AM_CPPFLAGS += -DSRCDIR=$(srcdir))?
- pass the fully qualified input file as a command-line argument to the test (e.g. $(builddir)/filereader $(srcdir)/data/test1.bin)?
- copy the input file from the source tree to the build tree (cp $(srcdir)/data/test1.bin $(builddir)/data/test1.bin)? And what would a proper make rule for that look like?
Canonically, the solution is to compile the path to your file into the unit test, i.e. the first option you laid out. The second one is also possible, but it requires an in-between driver script.
I would suggest avoiding the third one, but if you do want to go down that route, use $(LN_S) rather than cp; this way you reduce the I/O load of the test.
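A minimal sketch of the first option (untested; the SRCDIR macro name is my own choice, and note the nested quoting so the macro expands to a C string literal):

In tests/Makefile.am:

AM_CPPFLAGS = -I$(top_srcdir)/foo -DSRCDIR='"$(srcdir)"'

In filereader.c:

#include <stdio.h>

static FILE *open_test_data(void)
{
    /* SRCDIR points at the source tree, so this finds the data
       file for both in-tree and out-of-tree builds. */
    return fopen(SRCDIR "/data/test1.bin", "rb");
}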
There is a way to do this with autoconf. From the netcdf-c configure.ac:
##
# Some files need to exist in build directories
# that do not correspond to their source directory, or
# the test program makes an assumption about where files
# live. AC_CONFIG_LINKS provides a mechanism to link/copy files
# if an out-of-source build is happening.
##
AC_CONFIG_LINKS([nc_test4/ref_hdf5_compat1.nc:nc_test4/ref_hdf5_compat1.nc])
AC_CONFIG_LINKS([nc_test4/ref_hdf5_compat2.nc:nc_test4/ref_hdf5_compat2.nc])
AC_CONFIG_LINKS([nc_test4/ref_hdf5_compat3.nc:nc_test4/ref_hdf5_compat3.nc])
AC_CONFIG_LINKS([nc_test4/ref_chunked.hdf4:nc_test4/ref_chunked.hdf4])
AC_CONFIG_LINKS([nc_test4/ref_contiguous.hdf4:nc_test4/ref_contiguous.hdf4])
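Applied to your layout, the same mechanism would presumably be a single line in configure.ac (untested sketch, assuming libfoo is the top-level directory):

AC_CONFIG_LINKS([tests/data/test1.bin:tests/data/test1.bin])

Both paths are relative: the first to the build tree, the second to the source tree, so the file gets linked (or copied) into the build directory at configure time.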
Related
I have a project and there is a ./tests directory at its root containing several hundred MB of data that is used by the tests of several libraries:
./tests
./src/lib1/dune
./src/lib1/tests/dune
./src/lib1/tests/tests.ml
./src/lib2/dune
./src/lib2/tests/dune
./src/lib2/tests/tests.ml
...
I also defined tests that use the data in ./tests for each library like this:
(rule
 (alias runtest)
 (action (run ./tests/tests.exe)))
I now have to somehow communicate the location of the test data to each of my tests.exe. What is the most elegant way of doing this using dune?
It seems that dune copies my test data into _build, which is unnecessary: the data never changes, and it doesn't make sense to waste several hundred MB of space that way. From the documentation it seems that %{project_root} should contain the path to my source files. Unfortunately, the variable evaluates to ., which is useless for the tests: they run after a cd _build/default/src/libX, so . no longer points to the project root. Is there a dune way to specify the path to the original source directory without ugly hacks?
Right now, I'm using an environment variable containing the full path before I run dune runtest, but is there a more integrated way?
I have not tried it myself but it sounds like the data_only_dirs stanza is what you are looking for: https://dune.readthedocs.io/en/stable/dune-files.html#data-only-dirs-since-1-6
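For reference, the stanza would go in the dune file next to the ./tests directory, i.e. at the project root (a hypothetical, untested sketch):

(data_only_dirs tests)

As I read the documentation, dune then stops interpreting the directory as part of the build (no dune files are expected inside it) and treats its contents as plain data.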
My project folder has the following structure:
Project/
  src/
    Main.cpp
    MyReader.cpp
  headers/
    MyReader.h
  DataFiles/
    File.dat
    File1.dat
My class Object.cpp has a couple of methods which read from File.dat and File1.dat and parse the information into Map objects. My problem is that I am using Autotools (with which I'm very much a newbie) to generate the configure and installer files, and I don't know how to make the DataFiles files accessible to the program after installation. The program doesn't work properly because the code fails when trying to read those files through relative paths. Locally, the program runs perfectly after executing make && ./program in a terminal.
How can I solve this issue? Thanks in advance for your help!
A platform-independent way to do this with Autotools is to use the $(datadir) variable to locate the system data directory and work relative to that.
So in your Makefile.am you can define an installation directory and a macro like this:
myprog_infodir = $(datadir)/myprog
# Set a macro for your code to use
myprog_CXXFLAGS = -DDATA_LOCATION=\"$(datadir)/myprog\"
# This will install the files from the development directories
myprog_info_DATA = $(top_srcdir)/DataFiles/File.dat $(top_srcdir)/DataFiles/File1.dat
# Make sure they get into the distribution tarball
EXTRA_DIST = $(top_srcdir)/DataFiles/File.dat $(top_srcdir)/DataFiles/File1.dat
Then in your program you should be able to refer to the data like this:
std::ifstream ifs(DATA_LOCATION "/File.dat");
Disclaimer: Untested code
I figured out one method and will give my example here:
In my Makefile.am
AM_CPPFLAGS = -D MATRIXDIR="\"$(pkgdatadir)/matrix\""
nobase_dist_pkgdata_DATA = matrix/AAcode.txt \
matrix/BLOSUM50 matrix/BLOSUM70.50 matrix/BLOSUM100 matrix/BLOSUM50.50 \
matrix/BLOSUM75 matrix/BLOSUM100.50 matrix/BLOSUM55 matrix/BLOSUM75.50 \
... more not shown
I put quite a number of data files in the matrix directory; only a few are shown here. In my source file, I simply use the macro MATRIXDIR, e.g. in scorematrix.cpp:
string MatrixScoreMethod::default_path = MATRIXDIR;
This seems to work well for me. You can use other flavors of the automake _DATA primary, such as dist_data_DATA instead of the pkgdata variant. But it is a good idea to use pkgdata; this way your data will not be mixed with that of other packages ($(pkgdatadir) typically expands to $(datadir)/$(PACKAGE), e.g. /usr/local/share/yourpackage). The nobase_ prefix tells automake not to strip the matrix directory during install. The escaped double quotes seem to be needed so that the macro expands to a C string literal and you don't get compiler errors.
Having used CMake, I've become used to out-of-source builds, which are encouraged with CMake. How can out-of-source builds be done with Cargo?
Using in-source builds again feels like a step backwards:
- Development tools need to be configured to ignore paths, sometimes across multiple plugins and development tools, especially when using Vim or Emacs!
- Some tools can't be configured to easily hide build files. While dotfiles are typically hidden, they will still show Cargo.lock and target/, and worse, recursively expose their contents.
- Deleting untracked files to remove everything outside of version control (typically to clean up editor temp files or some test output) can backfire if you forgot to add a new file to version control and don't carefully check the file list before deleting.
- Dependencies are downloaded into your source code path, sometimes adding *.rs files in the target directory as part of building indirect deps, so operating on all *.rs files may accidentally pick up files which aren't in a hidden directory and thus might not be ignored even after development tools have been configured.
While it's possible to work around all these issues, I'd rather just have an external build path and keep the source directory pristine.
You can specify the directory of the target/ folder either via configuration file (key build.target-dir) or environment variable (CARGO_TARGET_DIR). Here is an example using a configuration file:
Suppose you have a directory ~/work/ in which you want to keep the Cargo project (~/work/foo/) and, next to it, the target directory (~/work/my-target/).
$ cd ~/work
$ cargo new --bin foo
$ mkdir .cargo
$ $EDITOR .cargo/config
Then insert the following into the configuration file:
[build]
target-dir = "./my-target"
If you then build in your normal Cargo project directory:
$ cd foo
$ cargo build
You will notice that there is no target/ dir, but everything is in ~/work/my-target/.
However, the Cargo.lock is still saved inside the Cargo project directory, but that kinda makes sense: for executables, you should check the Cargo.lock file into your git; for libraries, you shouldn't. I guess having to ignore one file is better than having to ignore an entire folder.
Lastly, there are a few caveats to changing the target-dir, which are listed in the PR which introduced the feature.
While useful, manually setting this up isn't all that convenient. I wanted to be able to build multiple crates within a source tree, having all of them out-of-source, something that a ../target-dir configuration option wouldn't achieve.
Helper utility for convenient out-of-source builds
Using the environment variable, I've written a small utility that wraps cargo so it automatically builds out-of-source, supporting crates both at the top level and in a subdirectory of the source tree.
Thanks to Lukas for pointing out CARGO_TARGET_DIR and the target-dir configuration option.
What I really wanted was a dynamic CARGO_TARGET_DIR that changes relative to where I am.
This bash alias puts all builds in a mirrored directory structure, e.g. instead of putting target into ~/mydir/myproj it puts it into ~/rustbuild/mydir/myproj:
alias cargo='CARGO_TARGET_DIR=$(echo $PWD | sed "s|$HOME|$HOME/rustbuild|g") cargo'
You could also make your rustbuild directory hidden.
I'm working on a C++ project that has some hand coded source files, as well as some source and header files that are generated by a command line tool.
The actual source and header files generated are determined by the contents of a JSON file that the tool reads, and so cannot be hardcoded into the scons script.
I would like to set up scons so that if I clean the project, then make it, it will know to run the command line tool to generate the generated source and header files as the first step, and then after that compile both my hand coded files and the generated source files and link them to make my binary.
Is this possible? I'm at a loss as to how to achieve this, so any help would be much appreciated.
Yes, this is possible. Depending on which tool you're using to create the header/source files, you'll want to check our ToolsIndex at https://bitbucket.org/scons/scons/wiki/ToolsIndex , or read our guide https://bitbucket.org/scons/scons/wiki/ToolsForFools for writing your own Builder.
Based on your description you'll probably have to write your own Emitter, which parses the JSON input file and returns the filenames that will finally result from the call. Then, all you need to do is:
# creates foo.h/cpp and bar.h/cpp
env.YourBuilder('input.json')
env.Program(Glob('*.cpp'))
The Glob will find the created files, even if they don't physically exist on the hard drive yet, and add them to the overall dependencies.
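An emitter for this case might look roughly like the following (an untested sketch; the 'files' key and the .h/.cpp naming scheme are assumptions about your JSON layout, and your_tool is a placeholder):

import json

def json_emitter(target, source, env):
    # Parse the JSON description and declare every file the code
    # generator will produce as an additional target.
    with open(str(source[0])) as f:
        desc = json.load(f)
    for name in desc['files']:  # hypothetical JSON key
        target.append(name + '.h')
        target.append(name + '.cpp')
    return target, source

bld = Builder(action='your_tool $SOURCE', emitter=json_emitter)
env.Append(BUILDERS={'YourBuilder': bld})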
If you have further questions or problems arise, please consider subscribing to our user mailing list at scons-users@scons.org (see also http://scons.org/lists.html ).
Thanks to Dirk Baechle I got this working. For anyone else interested, here is the code I used:
import subprocess

env = Environment(MSVC_USE_SCRIPT="c:\\Program Files (x86)\\Microsoft Visual Studio 11.0\\VC\\bin\\vcvars32.bat")

def modify_targets(target, source, env):
    # Call the code generator to generate the list of file names that will be generated.
    subprocess.call(["d:/nk/temp/sconstest/codegenerator/CodeGenerator.exe", "-filelist"])
    # Read the file name list and add a target for each file.
    with open("GeneratedFileList.txt") as f:
        content = f.readlines()
    content = [x.strip('\n') for x in content]
    for newTarget in content:
        target.append(newTarget)
    return target, source

bld = Builder(action='d:/nk/temp/sconstest/codegenerator/CodeGenerator.exe', emitter=modify_targets)
env.Append(BUILDERS={'GenerateCode': bld})
env.GenerateCode('input.txt')

# Main.exe depends on all the CPP files in the folder. Note that this
# will include the generated files, even though they may not currently
# exist in the folder.
env.Program('main.exe', Glob('*.cpp'))
There's an example at:
https://github.com/SCons/scons/wiki/UsingCodeGenerators
I'll also echo Dirk's suggestion to join the users mailing list.
I've recently picked up scons to implement a multi-platform build framework for a medium sized C++ project. The build generates a bunch of unit-tests which should be invoked at the end of it all. How does one achieve that sort of thing?
For example, in my top-level SConstruct I have:
subdirs = ['list', 'of', 'my', 'subprojects']
for subdir in subdirs:
    SConscript(dirs=subdir, exports='env', name='sconscript',
               variant_dir=subdir + os.sep + 'build' + os.sep + mode,
               duplicate=0)
Each of the subdirs has its unit tests; however, since there are dependencies between the dlls and executables built inside them, I want to hold off running the tests until all the subdirs have been built and installed (I mean, using env.Install).
Where should I write the loop that iterates through the built tests and executes them? I tried putting it just after this loop, but since scons doesn't let you control the order of execution, it gets executed well before I want it to.
Please help a scons newbie. :)
thanks,
SCons, like Make, uses a declarative approach to solving the build problem. You don't tell SCons how to do its job; you document all the dependencies and then let SCons work out how to build everything.
If something is being executed before something else, you need to create and hook up the dependencies.
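For example, a minimal sketch (all names are made up):

prog = env.Program('mytest', ['mytest.c'])
# Running the test happens only after 'prog' is built, because the
# program node itself is the source of this Command.
run = env.Command('mytest.log', prog, '$SOURCE > $TARGET')

An explicit Depends(run, other_target) call adds any ordering that the node graph doesn't already capture.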
If you want to create dummy touch files, you can create a custom builder like:
import os
import time

def action(target, source, env):
    os.system('echo here I am running other build')
    dmy_fh = open('dmy_file', 'w')
    dmy_fh.write('Dummy dependency file created at %4d.%02d.%02d %02dh%02dm%02ds\n' % time.localtime()[0:6])
    dmy_fh.close()

bldr = Builder(action=action)
env.Append(BUILDERS={'SubBuild': bldr})
env.SubBuild(srcs, tgts)
It is very important to put the timestamp into the dummy file, because scons uses md5 hashes. If you have an empty file, the md5 will always be the same, and scons may decide not to run subsequent build steps. If you need to generate different tweaks on a basic command, you can use function factories to modify a template, e.g.:
def gen_a_echo_cmd_func(echo_str):
    def cmd_func(target, source, env):
        cmd = 'echo %s' % echo_str
        print(cmd)
        os.system(cmd)
    return cmd_func

bldr = Builder(action=gen_a_echo_cmd_func('hi'))
env.Append(BUILDERS={'Hi': bldr})
env.Hi(srcs, tgts)

bldr = Builder(action=gen_a_echo_cmd_func('bye'))
env.Append(BUILDERS={'Bye': bldr})
env.Bye(srcs, tgts)
If you have something that you want to automatically inject into the scons build flow (e.g. something that compresses all your build log files after everything else has run), see my question here.
The solution should be as simple as this: make the result of the Test builders depend on the result of the Install builder.
In pseudocode:
test = Test(dlls)
result = Install(dlls)
Depends(test,result)
The best way would be if the Test builder actually worked out the dll dependencies for you, but there may be all kinds of reasons it doesn't do that.
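In actual SCons code, the pseudocode might translate to something like this (a sketch under assumptions: install_dir and the stamp-file approach are mine, there is no built-in Test builder):

installed = env.Install(install_dir, dlls)
# Run the test binary and write a stamp file on success, so scons
# can tell whether the test already passed for these inputs.
test = env.Command('mytest.passed', 'mytest',
                   './$SOURCE && echo PASSED > $TARGET')
Depends(test, installed)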
In terms of dependencies, what you want is for all the test actions to depend on all the program-building actions. A way of doing this is to create and export a dummy target to all the subdirectories' SConscript files; in each SConscript, make the dummy target Depend on the main targets, and have the test targets Depend on the dummy target.
I'm having a bit of trouble figuring out how to set up the dummy target, but this basically works:
(in top-level SConstruct)
dummy = env.Command('.all_built', 'SConstruct', 'echo Targets built. > $TARGET')
Export('dummy')
(in each sub-directory's SConscript)
Import('dummy')
for target in target_list:
    Depends(dummy, target)
for test in test_list:
    Depends(test, dummy)
I'm sure further refinements are possible, but maybe this'll get you started.
EDIT: also worth pointing out this page on the subject.
Just have each SConscript return a value on which you will build dependencies.
SConscript file:
test = debug_environment.Program('myTest', src_files)
Return('test')
SConstruct file:
dep1 = SConscript([...])
dep2 = SConscript([...])
Depends(dep1, dep2)
Now the dep1 build will complete only after the dep2 build has completed.