Having used CMake, I've become used to out-of-source builds, which are encouraged with CMake. How can out-of-source builds be done with Cargo?
Using in-source builds again feels like a step backwards:
Development tools need to be configured to ignore paths, sometimes across multiple plugins and tools, especially when using Vim or Emacs!
Some tools can't easily be configured to hide build files. While dotfiles are typically hidden, Cargo.lock and target/ will still show up, and worse, target/ recursively exposes its contents.
Deleting untracked files to remove everything outside of version control, typically to clean up editor temp files or some test output, can backfire if you forgot to add a new file to version control and don't carefully check the file list before deleting.
Dependencies are downloaded into your source code path, and building indirect deps can even add *.rs files under the target directory, so operating on all *.rs files may accidentally pick up files which aren't in a hidden directory and thus might not be ignored even after development tools have been configured.
While it's possible to work around all these issues, I'd rather just have an external build path and keep the source directory pristine.
You can specify the directory of the target/ folder either via a configuration file (key build.target-dir) or an environment variable (CARGO_TARGET_DIR). Here is an example using a configuration file:
Suppose you have a directory ~/work/ in which you want to keep the Cargo project (~/work/foo/) and, next to it, the target directory (~/work/my-target/).
$ cd ~/work
$ cargo new --bin foo
$ mkdir .cargo
$ $EDITOR .cargo/config
Then insert the following into the configuration file:
[build]
target-dir = "./my-target"
If you then build in your normal Cargo project directory:
$ cd foo
$ cargo build
You will notice that there is no target/ dir, but everything is in ~/work/my-target/.
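Equivalently, for a one-off build you could skip the configuration file and set the environment variable instead:
$ CARGO_TARGET_DIR=~/work/my-target cargo build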
However, the Cargo.lock is still saved inside the Cargo project directory, but that kinda makes sense. For executables, you should check the Cargo.lock file into your git! For libraries, you shouldn't. I guess having to ignore one file is better than having to ignore an entire folder.
Lastly, there are a few caveats to changing the target-dir, which are listed in the PR which introduced the feature.
While useful, manually setting this up isn't all that convenient. I wanted to be able to build multiple crates within a source tree, with all of them out-of-source, something a ../target-dir configuration option wouldn't achieve.
Helper utility for convenient out-of-source builds
Using the environment variable, I've written a small utility to wrap cargo so it automatically builds out-of-source, supporting crates both at the top level and in a subdirectory of the source tree.
Thanks to Lukas for pointing out CARGO_TARGET_DIR and target-dir configuration option.
What I really wanted was a dynamic CARGO_TARGET_DIR that changes relative to where I am.
This bash alias puts all builds in a mirrored directory structure, e.g. instead of putting target/ into ~/mydir/myproj it puts it into ~/rustbuild/mydir/myproj:
alias cargo='CARGO_TARGET_DIR=$(echo $PWD | sed "s|$HOME|$HOME/rustbuild|g") cargo'
You could also make your rustbuild directory hidden.
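For example, with the alias active (and assuming a project at ~/mydir/myproj, a hypothetical path), a build would look like this:
$ cd ~/mydir/myproj
$ cargo build
# build artifacts (debug/, release/, ...) now live under ~/rustbuild/mydir/myproj
# instead of ~/mydir/myproj/target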
Related
In an iOS C++/Qt application, I need to ship a few files and keep them in their directory structure.
For the Android version, we bundle a zip which we unzip on the target before creating the QApplication.
On iOS, it seems that CMake is not capable of bundling files in a tree:
https://cmake.org/cmake/help/latest/prop_tgt/RESOURCE.html#prop_tgt:RESOURCE
https://cmake.org/cmake/help/latest/prop_sf/MACOSX_PACKAGE_LOCATION.html
I am not sure if this is a limitation of CMake or a global limitation on iOS.
From the docs about iOS bundles:
It uses a relatively flat structure with few extraneous directories in an effort to save disk space and simplify access to the files.
What would be the preferred approach?
Is there a solution to ship the files from CMake directly?
If not, how can I achieve this so that they are available before the QApplication is created?
The Xcode command
Thanks to @Cy-4AH, I added the folder in Xcode and could get the command it uses to do this:
CpResource _PATH_TO_DIRECTORY_ _APP_BUNDLE_DIRECTORY_/_RESOURCE_DIR_NAME_
cd /Users/denis/opt/qfield/ios/QField
export PATH="....."
builtin-copy -exclude .DS_Store -exclude CVS -exclude .svn -exclude .git -exclude .hg -strip-debug-symbols -strip-tool /Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/bin/strip -resolve-src-symlinks _PATH_TO_DIRECTORY_ _APP_BUNDLE_DIRECTORY_
But how can I create this from CMake? builtin-copy is an Xcode command.
Simple system copy command
From an old (2008) discussion, we could use simple cp commands.
This works up until signing, but then I get an error: unsealed contents present in the bundle root.
From this answer, it seems to be related to the fact that I cannot simply add folders in the Resources directory. From the docs on the anatomy of framework bundles: Nonlocalized resources reside at the top level of the Resources directory.
(Disclaimer: I'm not a CMake user, and there may be a more CMake-ey way to do this)
If you can set up a post-build action, the following terminal script can efficiently sync files into your bundle from another location. I use it in my game engine because it only copies updated or new files on subsequent builds, and it preserves directory structure:
# make sure the source and destination folders exist
mkdir -p PATHTO/ORIGINFOLDERNAME
mkdir -p PATHTOBUILDFOLDER/PROJECTNAME.app/Contents/Resources/DESTINATIONFOLDERNAME
# copy only new/updated files, remove deleted ones, and skip hidden files
rsync -avu --delete --exclude=".*" PATHTO/ORIGINFOLDERNAME/ PATHTOBUILDFOLDER/PROJECTNAME.app/Contents/Resources/DESTINATIONFOLDERNAME
The mkdir commands are only there to ensure that the folders exist, in case they were deleted.
So apparently the CMake method also works for directories.
# ${_resource} is assumed to hold the path of the directory to bundle
target_sources(${QT_IOS_TARGET} PRIVATE ${_resource})
set_source_files_properties(${_resource} PROPERTIES MACOSX_PACKAGE_LOCATION Resources)
It will just be added at the root directory of the bundle rather than within Resources.
If the embedded file is not too big, you might consider:
in your source tree, generating a C++ file embedding that file as a constant array. For example, if your file contains just hello, world with a newline, you could have something like
/// file contents.cc
const char file_contents[] = "hello, world\n";
and at the beginning of your program (perhaps in your main function, before creating your QApplication), calling a C++ function which writes such a file (perhaps in /tmp/).
in your build automation (e.g. your Makefile or your qmake rules), having something which generates the C++ contents.cc file from the genuine source; see the sketch below.
This is written from a POSIX/Linux point of view; adapt my answer to your iOS setup.
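As an illustration of that generation step (my sketch, using the xxd tool rather than a hand-rolled generator, and assuming the genuine source file is data/message.txt):
$ xxd -i data/message.txt > contents.cc
$ head -n 1 contents.cc
unsigned char data_message_txt[] = {
xxd -i also emits a matching _len variable, so the program knows how many bytes to write out.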
Context:
When I want to build a project, there are usually CMakeCache.txt and CMakeVars.txt files that are created at a first stage, e.g. using cmake-gui > Configure.
Questions:
What are the differences between these files and what are they used for?
If I have tweaked a lot of options within cmake-gui, which file should I back up to another location (so I can copy it back later and not spend time figuring out which options I had checked) before removing the contents of the build/ directory (which normally includes these two files) when I want to start from a clean build/ directory (which is sometimes needed)?
I have the following directory structure:
my_dir
|
--> src
| |
| --> foo.cc
| --> BUILD
|
--> WORKSPACE
|
--> bazel-out/ (symlink)
|
| ...
src/BUILD contains the following code:
cc_binary(
    name = "foo",
    srcs = ["foo.cc"],
)
The file foo.cc creates a file named bar.txt in the regular way, using the <fstream> utilities.
However, when I invoke Bazel with bazel run //src:foo, the file bar.txt is created at bazel-out/darwin-fastbuild/bin/src/foo.runfiles/foo/bar.txt instead of my_dir/src/bar.txt, where the original source is.
I tried adding an outs field to the foo rule, but Bazel complained that outs is not a recognized attribute for cc_binary.
I also thought of creating a filegroup rule, but there is no deps field where I can declare foo as a dependency for those files.
How can I make sure that the files generated by running the cc_binary rule are placed in my_dir/src/bar.txt instead of bazel-out/...?
Bazel doesn't allow you to modify the state of your workspace, by design.
The short answer is that you don't want the results of the past builds to modify the state of your workspace, hence potentially modifying the results of the future builds. It'll violate reproducibility if running Bazel multiple times on the same workspace results in different outputs.
Given your example: imagine calling bazel run //src:foo which inserts
#define true false
#define false true
at the top of the src/foo.cc. What happens if you call bazel run //src:foo again?
The long answer: https://docs.bazel.build/versions/master/rule-challenges.html#assumption-aim-for-correctness-throughput-ease-of-use-latency
Here's more information on the output directory: https://docs.bazel.build/versions/master/output_directories.html#documentation-of-the-current-bazel-output-directory-layout
There could be a workaround using genrule. Below is an example where I use a genrule to copy a file into the .git folder.
genrule(
    name = "precommit",
    srcs = glob(["git/**"]),
    outs = ["precommit.txt"],
    # the folder containing this BUILD.bazel file is tools/, which will be
    # symlinked; we use `cd -P` to get to the physical path
    cmd = "echo 'setup pre-commit.sh' > $(OUTS) && cd -P tools && ./path/to/your-script.sh",
    local = 1,  # required
)
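Assuming the BUILD.bazel file containing this rule lives under tools/ (as the comment suggests), you would trigger it with something like:
bazel build //tools:precommit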
If you're passing the name of the output file in when running, you can simply use absolute paths. To make this easier, you can use the realpath utility if you're on Linux; if you're on a Mac, it is included in brew install coreutils. Then running it looks something like:
bazel run my_app_dir:binary_target -- --output_file=`realpath relative/path/to.output`
This has been discussed and explained in a Bazel issue. The recommendation is to use a tool external to Bazel:
As I understand the use-case, this is out-of-scope for building and in the scope of, perhaps, workspace configuration. What I'm sure of is that an external tool would be both easier and safer to write for this purpose, than to introduce such a deep design change to Bazel.
The tool would copy the files from the output tree into the source tree, and update a manifest file (also in the source tree) that lists the path-digest pairs. The sources and the manifest file would all be versioned. A genrule or a sh_test would depend on the file-generating genrules, as well as on this manifest file, and compare the file-generating genrules' outputs' digests (in the output tree) to those in the manifest file, and would fail if there's a mismatch. In that case the user would need to run the external tool, thus update the source tree and the manifest, then rerun the build, which is the same workflow as you described, except you'd run this tool instead of bazel regenerate-autogenerated-sources.
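A minimal sketch of such an external tool (my illustration, not from the issue; the target name and paths are hypothetical):
#!/bin/sh
# copy a generated file from the output tree back into the source tree
bazel build //src:generate_bar   # hypothetical file-generating rule
cp -f bazel-bin/src/bar.txt src/bar.txt
# record the path-digest pair in a manifest kept under version control
sha256sum src/bar.txt > manifest.txt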
I would like to edit an existing piece of software to add a new source file (Source.cpp).
But I can't get a handle on the compilation process (it seems to use automake and looks very complicated).
The software (iperf 2: https://sourceforge.net/projects/iperf2/files/?source=navbar) is compiled using the classical ./configure, make, then make install.
If I just add the file to the corresponding source and include directories, I get this error message:
Settings.cpp:(.text+0x969) : undefined reference to ...
It looks like the makefile isn't able to produce the output file associated with my new source file (Source.cpp), so I probably need to declare it manually somewhere.
I searched a bit in the project files and it seemed that the file to edit was: "Makefile.am".
I added my source file to the iperf_SOURCES variable in that file, but it didn't work.
Could you help me find the file where I need to declare my new source file? It seems to be a pretty standard compilation scheme, but I have never used automake-based software and this one seems very complicated.
Thank you in advance
This project is built with the autotools, as you already figured out.
The makefiles are built by automake, which takes its input from files that usually have an .am file name extension.
The iperf program is built by the makefile generated from src/Makefile.am. This is indicated by:
bin_PROGRAMS = iperf
All source files of a to-be-built binary (actually this is a simplification, but it holds in this case) are listed in the corresponding name_SOURCES variable, in this case iperf_SOURCES. Just add your source file to the end of that list, like so (keeping the project's formatting):
iperf_SOURCES = \
    Client.cpp \
    # lines omitted
    tcp_window_size.c \
    my_new_file.c
Now, to reflect this change in any future generated src/Makefile, you need to run automake. This will modify src/Makefile.in, which is a template that config.status uses at the end of configure to generate the actual makefile.
Running automake can happen in various ways:
If you already have makefiles that were generated by a configure run, these should take care of rebuilding themselves. This seems to fail sometimes, though!
You could run automake (in the top level directory) by hand. I've never done this, as there is the better solution to...
Run autoreconf --install (possibly adding --force to the arguments) in the top-level directory. This will regenerate the entire build system, calling all the needed programs such as autoheader, autoconf and of course automake. This is my favorite solution.
The latter two options require calling configure again, IMO ideally doing an out-of-source build:
# in top level dir
mkdir build
cd build
../configure # arguments
make # should now also compile and link your new source file
I use git to interface with an SVN repository. I have several git branches for the different projects I work on.
Now, whenever I switch from one branch to another using git checkout <branch>, all the compiled executables and object files from the previous branch are still there. What I would like to see is that switching from branch A to B results in a tree with all object files and binaries from the last time I worked on branch B.
Is there a way to handle this without creating multiple git repositories?
Update: I understand that executables and binaries should not end up in the repository. I'm a bit disappointed that all the branching stuff in git turns out to be useless to me, as I'll have to clone my proxy git repository for every branch I want to start, something I already did for SVN and hoped to avoid with git. Of course, I don't have to do it, but skipping it would mean a new make most of the time after switching between branches (not fun).
What you want is a full context, not just the branch, which is generally out of scope for a version control tool. The best way to get that is to use multiple repositories.
Don't worry about the inefficiency of that, though. Make your second repository a clone of the first; Git will automatically use hard links to avoid having multiple copies on disk.
Here's a hack to give you what you want:
Since you have separate obj directories, you could modify your Makefiles to make the base location dynamic using something like this:
OBJBASE = `git branch --no-color 2> /dev/null | sed -e '/^[^*]/d' -e 's/* \(.*\)/\1\//'`
OBJDIR = "$(OBJBASE).obj"
# branch master: OBJBASE == "master/", OBJDIR == "master/.obj"
# non-git checkout: OBJBASE == "", OBJDIR == ".obj"
That will put your branch name into OBJBASE, which you can use to build your actual objdir location from. I'll leave it to you to modify it to fit your environment and make it friendly to the non-git users of your Makefiles.
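As an aside (my addition, not part of the original answer): on any reasonably recent git, the current branch name can also be obtained more directly, which you could substitute into OBJBASE:
$ git rev-parse --abbrev-ref HEAD
master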
This is not git or svn specific - you should have your compiler and other tools direct the output of intermediate files like .o files to directories that are not under version control.
To keep multiple checkouts of the same repo, you can use git --work-tree.
For example, with $BRANCH set to the name of the branch you want to check out:
mkdir $BRANCH.d
GIT_INDEX_FILE=$BRANCH.index git --work-tree $BRANCH.d checkout $BRANCH
You could set your IDE compiler to generate all private temporary files (.class and so on) under <output>\branchName\....
By configuring your compilation settings branch by branch, you can register the name of the branch in the output directory path.
That way, even though private files remain when you git checkout, your project on the new branch is ready to go.
In the contrib/ directory of the git distribution, there is a script called git-new-workdir that allows you to check out multiple branches in different directories without cloning your repository.
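Usage is roughly as follows (the paths here are illustrative; the script takes the existing repository, the new working directory, and optionally a branch):
git-new-workdir /path/to/repo /path/to/new/workdir branch-name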
Those files aren't tracked by Git or Subversion, so they're left alone on the assumption that they are of some use to you.
I just do my checkouts in different directories. Saves me the trouble of doing cleanup.
A make clean should not be necessary, because files that differ between branches get checked out with the actual (current) date!
This means that if your Makefile is correct, only those object files, libs and executables that really changed because of the checkout are compiled again, which is exactly the reason a makefile is there in the first place.
The exception is if you need to switch compiler options or even compilers between branches. In that case git-new-workdir is probably the best solution.
If the compiled executables are files that have been checked in, then git stash solves the problem.
[compile]
git stash save "first branch"
git checkout other_branch
[Fiddle with your code]
[compile]
git stash save "second branch"
git checkout first_branch
git stash apply [whatever index your "first branch" stash has]
# alternatively git stash pop [whatever index...]
If the compiled executables are files that have not and will not be checked in, then simply add them to .gitignore:
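For example (assuming your build produces object files and a binary named myprog; adjust the patterns to your project):
# .gitignore
*.o
myprog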