The title of this question does not begin to capture my years of exasperation with the RPM system. There is a vast gulf between a development system (./configure; make; make install) and an RPM system (tar files, patch files, spec files, arcane build scripts, environments and tools) which I cannot bridge.
All I want to do is to change a few lines of code in a bigger program.
The problems which I run into:
Getting the source code of the system as-installed (e.g. SRPM from EPEL, original tarball, something else). What source should I use?
Getting that source code into a ready-to-edit form - something that I can edit with my favourite editor. How can I know that I'm editing the code as-deployed, bugs and all? (rpm -ivh x.src.rpm gives me tar files and squabs of patch files littered about in the SOURCES directory ... how can I get it right?)
Editing the code to implement some amazing hack (this part I can actually do).
Compiling the amazing code as edited - just compiling it in-place. Usually I can get this right, but it would be nice to have a hand sometimes, e.g. with ./configure set to use something other than the default /usr/local and /lost+found/opt/etc/opt or whatever crazy default autoconf decides to use.
Transforming my edits into a patch against the previous source and building new RPMs to test on some remote system (this is the great promise of RPM - pristine sources and hacky patches). If I do a diff of the original and the edited directories, the resulting patch contains all sorts of rubbish that I don't want to delete because I'm still developing (e.g. object code). (Actually, I don't have an 'original' at this point to do a diff against ... because I was only looking at the code casually when I realised I could "improve" it ...) Should I use some revision control system to track the changes I am making?
This should be simple stuff, but somehow all I can do is edit the code. After I have edited the code, I can never get over the hump, even though packaging is supposedly an already-solved problem. I have a GREAT fix for an open source project, but every single time that I finish developing my amazing hack, having delved into the code and made it compile (and possibly work), I am completely stumped. Nothing I do can turn my modified, now-amazing source tree into an RPM. I end up deploying source code (into /usr/local), because that at least works.
How do people who do (say) security fixes actually go about the extract-edit-compile-test loop?
The SRPM is only (relatively) self-contained: there are often assumptions about build requirements that are not reflected in the spec file.
I would start by taking the SRPM and rebuilding it, to address the issue of build requirements (adding whatever is needed to get it to build).
Then, extract the spec file and sources from the SRPM, putting the patches and tar file(s) into ~/rpmbuild/SOURCES, and build from the spec file.
Next, modify the spec file to add my own patch file (or scripting changes).
Finally, there's a new SRPM with my changes.
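To make that concrete, here is a hedged sketch of the whole loop with rpmbuild (package name, version and patch name are made up; adjust paths and spec macros to your package):
rpm -ivh foo-1.2-3.el7.src.rpm              # unpack the spec and sources into ~/rpmbuild
rpmbuild -bp ~/rpmbuild/SPECS/foo.spec      # run %prep: untar and apply the existing patches into ~/rpmbuild/BUILD
cd ~/rpmbuild/BUILD
cp -a foo-1.2 foo-1.2.orig                  # keep a pristine copy to diff against
# ... edit inside foo-1.2, compile in place and test; clean out object files (or use diff -x '*.o') before diffing ...
diff -ur foo-1.2.orig foo-1.2 > ../SOURCES/foo-my-fix.patch
# add "Patch1: foo-my-fix.patch" and a matching "%patch1 -p1" in %prep, bump Release, then:
rpmbuild -ba ~/rpmbuild/SPECS/foo.spec      # builds new binary RPMs and a new SRPM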
For extracting, I use an unrpm script (essentially a wrapper around cpio) which can be found online.
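If you don't have such a script handy, the usual equivalent (as far as I remember) is a one-liner; the file name here is just an example:
rpm2cpio foo-1.2-3.el7.src.rpm | cpio -idmv    # spills the spec, tarballs and patches into the current directory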
Making your own patch file is discussed here:
HowTo Create A Patch File For A RPM
RPM - Creating Patches
Patches for .spec file
Ok, n00b question. I have a cpp file. I can build and run it in the terminal. I can build and run it using clang++ in VSCode.
Then I add gtest to it. I can compile in the terminal with g++ -std=c++0x $FILENAME -lgtest -lgtest_main -pthread and then run, and the tests work.
I install the C++ TestMate extension in VSCode. Everything I see on the internet implies it should just work. But my test explorer is empty and I don't see any test indicators in the code window.
I've obviously missed something extremely basic. Please help!
Executables should be placed inside the out or build folder of your workspace. Or one can modify the testMate.cpp.test.executables config.
I'd say, never assume something will "just work".
You'll still have to read the manual and figure out what the names of the config properties are. I won't provide exact examples, because even though I've only used this extension for a short time, its name (and therefore the full property path) has already changed, so any example might become obsolete quite fast.
The general idea is: the extension monitors some files/folders; when they change, it assumes they are executables built with either gtest or catch2. The extension tries to run them with the standard (for those frameworks) flags to obtain a list of test suites and test cases. If that succeeds, it parses the output and creates a nice list in the side panel. The markers in the code depend on exactly the same parsed output, so if you have one, you have the other as well.
From the above, you need 3 things to make this work:
Provide the correct path (or a glob pattern) for finding all test executables (while ignoring all non-test executables) in the extension config. There are different ways to do this depending on the complexity of your setup; they are all in the documentation, though.
Do not modify the output of the test executable. For example, if you happen to print something to stdout/stderr before the gtest implementation parses and processes its standard flags, the extension will fail to parse the output of ./your_test_binary --gtest_list_tests.
If your test executable needs additional setup to run correctly (env vars, cwd), make sure that you use the "advanced" configuration for the extension and configure those properties accordingly.
To troubleshoot #2 and #3 you can turn on debug logging for the extension (again, in VSCode's settings JSON). This will create an additional "Output" tab/category where you can see which files were considered, which were run, what the output was, and what caused a particular file to be ignored.
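Independently of the extension, you can also sanity-check points 1 and 2 by hand (paths and names here are just assumptions about your layout):
mkdir -p build
g++ -std=c++0x test.cpp -lgtest -lgtest_main -pthread -o build/test_runner   # put the binary somewhere the extension watches
./build/test_runner --gtest_list_tests    # roughly what the extension runs for discovery; it should print only suite/test names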
This messed with me for a while; I did as Mate059 answered above and it didn't work.
Later on I found out that the reason it didn't work was that I was using a Linux terminal inside Windows (enabled from the Features section), and I had previously installed the G++ compiler from that Linux terminal, so the compiler was turning my code into a .out file. For some reason TestMate could not read .out files.
Once I compiled the C++ source file using the PowerShell terminal, it created a .exe file; I then changed the path in settings.json as Mate059 said, and the tests showed up.
TL;DR
Mate059 gave a great answer: go into settings.json inside your .vscode folder and modify "testMate.cpp.test.executables": "filename.exe".
For me it also worked using the wildcard * instead of filename.exe, but I do not suggest doing that, as it might mess things up with the .exe from the main cpp file and whatnot.
As far as I know, CMake checks the time stamp of a source file to detect whether it is outdated and needs to be rebuilt (and, with it, all files including it). When switching branches in a large git repository, this can cause problems.
Let's say I have one source folder and two build directories (branch1_build and branch2_build), which correspond to two different branches (branch1 and branch2):
project
+-- src
+-- branch1_build
+-- branch2_build
Say my two branches have few differences, in few files; mostly, they differ only in some configuration options, all encapsulated in a config.h file generated by the CONFIGURE_FILE command in CMake. The source files for the two config.h files (the config.h.in, as it is often called) are different. For instance, one branch introduces a new subfolder, which can be activated with a config-time option that gets put in config.h.in with something like #cmakedefine HAVE_NEW_FEATURE_FOLDER. In such a scenario, when switching branches in the source folder, this happens: CMake recognizes that something changed in the config.h.in file, so it runs again; by running again, it generates a new config.h file; since config.h has a new time stamp, all files that include it (directly or indirectly) end up being recompiled.
Now, if I alternately switch between branch1 and branch2 in the source folder (because I'm working on both branches every day), two consecutive make commands issued in the same build folder (either branch1_build or branch2_build) will trigger a full recompilation: although config.h has not changed in content, its time stamp has changed, so CMake flags it as changed.
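To make it concrete, the cycle looks roughly like this (commands simplified, directory names as in the layout above):
cd branch1_build && make              # build branch1: everything ends up up to date
cd ../src && git checkout branch2
cd ../branch2_build && make           # build branch2 in its own build folder
cd ../src && git checkout branch1
cd ../branch1_build && make           # full rebuild: nothing really changed, but config.h got a new time stamp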
My question is: what options do I have to avoid this? Or, better phrased, how can I avoid recompiling a source-build tree pair that is in fact unchanged since the last build, while also minimizing the changes to the source code?
The only solution I can think of is to execute CONFIGURE_FILE on config.h.in with output config.h.tmp, compare config.h.tmp with config.h, and, only if they differ, copy config.h.tmp to config.h. However, this seems clumsy and overcomplicated. I hoped CMake already had a mechanism for this, perhaps hidden under some option/variation of CONFIGURE_FILE...
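A sketch of that workaround: at least the compare-and-copy half already exists as a CMake command-line helper, so no hand-written comparison would be needed (file locations here are hypothetical):
# after CONFIGURE_FILE(config.h.in config.h.tmp ...) has produced the temporary file:
cmake -E copy_if_different config.h.tmp config.h   # touches config.h only when the content actually differs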
Assuming this is not yet possible, I was wondering how complicated it would be for CMake to check the sha (rather than the timestamp) of a particular file when deciding whether it is outdated, comparing it with the sha from a previous build (yes, the word outdated has "date" in it, but let's not get into English vocabulary discussions here). I imagine this is more expensive, so I would think that, if possible at all, this behavior should not be the default, and the user should use this feature sparingly, by explicitly tagging a file as a check_sha_not_time kind of file. In the example above, the user would tag config.h as check_sha_not_time, and avoid recompilation of pretty much the whole library.
Note 0: I know little of how CMake works internally, so my suggestion of using a sha rather than a timestamp could be completely crazy and/or impossible given CMake's implementation. I apologize for that. But that's why one asks things here, because he/she doesn't know, right?
Note 1: I also tried using ccache, but unsuccessfully. Perhaps I need some particular flag or configuration option in ccache to trigger this capability (the way I currently hook it into the build is sketched after these notes).
Note 2: I want to avoid duplicating the source folder.
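For reference, this is how I currently point the build at ccache, via CMake's compiler-launcher variables; maybe I'm missing some ccache option on top of this:
cmake -DCMAKE_C_COMPILER_LAUNCHER=ccache -DCMAKE_CXX_COMPILER_LAUNCHER=ccache ../src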
How can I build local sources and dependencies with flatpak-builder?
I can build local sources
flatpak build ../dictionary ./configure --prefix=/app
I can extract and build an application with its dependencies from a .json manifest
flatpak-builder --repo=repo dictionary2 org.gnome.Dictionary.json
But is there no way to build both dependencies and local sources? I can't find a source type
like dir or similar, only archive, git (no hg?) ...
flatpak-builder is meant to automate the whole build process, with a single entry-point: the JSON manifest.
Everything else it obtains from Git, Bazaar or tarballs. Note that for these the "url" property may be a local URL starting with file://.
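For example (the path and module layout here are hypothetical), a module's source in the manifest can pull from a local clone, and the build is then the usual invocation:
# in the manifest, something along the lines of:
#   "sources": [ { "type": "git", "url": "file:///home/me/src/gnome-dictionary" } ]
flatpak-builder --repo=repo dictionary2 org.gnome.Dictionary.json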
(There is indeed no support for Hg. If that's important for you, feel free to request it.)
In addition to that, there are a few more source types (see the flatpak-manifest(5) manpage), which can be used to modify the extracted sources:
file, which points to a local file to copy somewhere in the extracted sources;
patch, which points to a local patch file to apply to the extracted sources;
script, which creates a script in the extracted sources from an array of commands;
shell, which modifies the extracted sources by running an array of commands.
Adding a dir source type might be useful.
However (and I have only flatpaked a few apps and contributed 2 or 3 patches to the code, so I might be completely wrong), care must be taken, as this could easily make builds completely unreproducible, and reproducibility is one thing flatpak-builder tries very hard to enable.
For example, when using a local file source, flatpak-builder will base64-encode the content of that file and use it as a data:text/plain;charset=utf8;base64,<content> URL for the file, which it stores in the manifest included inside the final build.
Something similar might be needed for a dir source (tar the folder then base64-encode the content of the tar?), otherwise it would be impossible to reproduce the build. I've just been told (after submitting this answer) that this changed in Git master, in favour of a new flatpak-builder --bundle-sources option. This would probably make it easier to support reproducible builds with a dir source type.
In any case, feel free to start the conversation around a new dir source type in the upstream bug tracker. :)
There's an experimental CLI tool if you want to use it: https://gitlab.com/csoriano/flatpak-dev-cli
You can read the docs
http://docs.flatpak.org/en/latest/building-simple-apps.html
http://docs.flatpak.org/en/latest/flatpak-builder.html
In a nutshell, this is what you need to use Flatpak as a development workbench:
https://github.com/albfan/gnome-builder/wiki/flatpak
LLVM can create graphs in Graphviz's "dot" format, and automatically invoke a viewer to display them. By default it uses dotty to display those graphs. I know that I can change it to use a different viewer, but I was not able to find precise instructions on how to do so.
How can I make it open the graphs with a different viewer?
I'm running on Linux but would be interested in an answer for Windows as well.
I found out I'm supposed to change the CMakeCache.txt file in my build folder. For instance, to use XDot instead of dotty, I edited the LLVM_PATH_XDOT_P property in that file to point to the full path of my xdot.py file.
After rebuilding the project, it now opens the alternative viewer successfully. That was all I needed to do.
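Re-running CMake on the existing build directory with the variable on the command line should (I believe) achieve the same thing without editing the cache file by hand; the xdot.py path is whatever it is on your system:
cd /path/to/llvm-build
cmake -DLLVM_PATH_XDOT_P=/home/me/tools/xdot.py .   # updates the cached value; rebuild afterwards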
I managed to do this with a workaround: I made a backup of dotty (just in case) and then replaced dotty with a link to XDot.
cp /usr/bin/dotty /usr/bin/dotty_copy
ln -sf /usr/bin/xdot /usr/bin/dotty
I believe you could also set some variable during the configuration step (possibly LLVM_PATH_DOTTY), but I never tried this, as I didn't want to recompile LLVM.
You may try hacking the DisplayGraph function or fiddling with the makefiles until you manage to enable one of the #ifdefs in DisplayGraph.
Context
I am working on a klee (http://klee.llvm.org) fork and want to clean up our repository to separate our stuff from the "canonical" klee code. Anyway, I'm having trouble understanding/extending the build system.
Problem
The directory structure in /lib/ looks like this
Basic/
Core/
Support/
Expr/
Solver/
Module/
Mine/
Mine was just added by me; so far we have thrown everything into Core, and I am now moving it to Mine. How do I tell the build system to do this properly?
My attempt
Being unable to figure this out on my own, I edited /lib/Makefile:
LEVEL=..
PARALLEL_DIRS=Basic Support Expr Solver Module Core Mine
include $(LEVEL)/Makefile.common
and copied /lib/Core/Makefile to /lib/Mine/Makefile, changing LIBRARYNAME=kleeCore to LIBRARYNAME=kleeMine.
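For reference, the copied Makefile then looks roughly like this (a sketch; I'm only showing the lines I touched, everything else stays as it was in the Core Makefile):
LEVEL=../..
LIBRARYNAME=kleeMine
include $(LEVEL)/Makefile.common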
Caveat
I have a feeling that this is not the proper way to do it, and I should rather modify some configure script or something. Also it does not link (it compiles, though).
A colleague just told me how to get it to link: modify /tools/klee/Makefile
USEDLIBS = kleeCore.a kleeModule.a kleaverSolver.a kleaverExpr.a kleeSupport.a kleeBasic.a kleeMine.a