How can I make my makefile overwrite a file? - c++

descript: program.cpp
    g++ program.cpp -o descript
    ./descript 2>output.txt | tee -a output.txt
From my understanding, the first command compiles program.cpp and the second command sends the output to both the terminal and a text file.
Is there a way to adjust this so that I can:
- Use "make", go through the program prompts, and have the output saved in output.txt
- Use "./descript" (or some command) a second time and overwrite output.txt with the new output
I'm fairly new to Linux commands in general, so anything would help.

It may be helpful to include a clean target in your Makefile.
An example clean target could be:
clean:
    rm -f output.txt
Then run make clean at the beginning of the descript portion of your Makefile (or list clean as a prerequisite) to automatically remove the previous output.
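Putting the two pieces together, here is a minimal sketch of what the whole Makefile might look like (the run target name is just for illustration, and the redirection line is copied unchanged from the question; note that recipe lines must start with a tab):

descript: program.cpp
    g++ program.cpp -o descript

.PHONY: run clean
# "make run" wipes the old log, rebuilds if needed, then runs the program.
run: clean descript
    ./descript 2>output.txt | tee -a output.txt

clean:
    rm -f output.txt

With this layout, every make run starts from a freshly removed output.txt, so the file is overwritten rather than appended to.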

Related

How to compile and run both at the same time for a C++ code in the Linux terminal?

I am using Ubuntu (latest). If I have a test.cpp file in my home directory, I write two commands in the terminal to compile and run this file.
prateek332@pp-pc:~$ g++ test.cpp
prateek332@pp-pc:~$ ./a.out
Is there a way to write these two commands together (or maybe even a better way)? I tried a pipe, but it doesn't work.
prateek332@pp-pc:~$ g++ test.cpp | ./a.out
This doesn't work: it doesn't recompile with the new changes to the test.cpp file, it just runs the old code.
g++ test.cpp && ./a.out
First compile, and then, if compilation was successful, run the code.
You can create a shell function since this is something you will do often.
In ~/.bashrc (or whatever your shell config file is, like ~/.zshrc):
function cpp() { g++ "$1" && ./a.out; }
Now you can just type
cpp test.cpp
You can name the function whatever you want. Open a new shell window to load the function (or run source ~/.bashrc).
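If you would rather get a binary named after the source file instead of a.out, a small variant along these lines works too (the name cpprun is just illustrative, not part of the original answer):

# Compile foo.cpp to ./foo and run it only if compilation succeeds.
function cpprun() {
    local bin="${1%.cpp}"      # strip the .cpp extension
    g++ "$1" -o "$bin" && "./$bin"
}

Usage is the same: cpprun test.cpp.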

Eclipse not calling sub-makefiles correctly

I am trying to refactor a single makefile project to hierarchical structure. The project is imported in Eclipse as "External C/C++ project with makefile".
The new folder with the separate makefile contains source files and a makefile with the following recipe:
.PHONY: test
test:
    echo "test"
The top directory contains the top-level makefile with the following recipe:
clean:
    # echo ...cleaning
    cd CppAudioPeriphs && make test
    rm -f $(OBJECTS) $(NAME).lst $(NAME).elf $(NAME).bin $(NAME).s19 $(NAME).map $(NAME).dmp
When I run Clean Project from Eclipse, the last line of the recipe clearly completes correctly. However, the line asking to go into the sub-directory and execute make test returns the following message:
make[1]: `build/PeriphPhysical.o' is up to date.
This is the first object file declared, and the message is the same even if the test recipe does not exist.
On the other hand, from the command line everything works: open cmd.exe, go to the project folder, type make clean, and the echo "test" command gets executed.
I am using gcc and binutils, compiled for Windows, for cross-compilation for ARM. Where could my problem be?
EDIT: response to jimmy
These may be additional clues.
1) If I replace
cd CppAudioPeriphs && "make test"
with
cd CppAudioPeriphs && C:\arm_tools\tools\bin\make.exe test
, the result is:
/usr/bin/sh: C:arm_toolstoolsbinmake.exe: command not found
If I change the backslashes to forward slashes, the old `build/PeriphPhysical.o' is up to date. message pops back in.
Replaced
cd CppAudioPeriphs && C:\arm_tools\tools\bin\make.exe test
with
make -C CppAudioPeriphs test
as a workaround and now everything compiles.
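For recursive invocations like this, the GNU make manual recommends using the $(MAKE) variable instead of a literal make, so the sub-make is run with the same executable and inherits flags such as -j. A sketch of the top-level clean recipe with that change (contents otherwise as in the question):

clean:
    # Recurse into the sub-project with the same make binary and flags.
    $(MAKE) -C CppAudioPeriphs test
    rm -f $(OBJECTS) $(NAME).lst $(NAME).elf $(NAME).bin $(NAME).s19 $(NAME).map $(NAME).dmp

Using -C also sidesteps the cd ... && ... construct entirely.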

Make uses old Makefile

I'm using make to build a C++ project. During the course of the project, I wanted to make some changes to the Makefile. Unfortunately, ever since I executed make once, it keeps using that particular version of the Makefile and just doesn't do anything with the changes at all.
I have run make clean, I have renamed the makefile, I've searched for other Makefiles which might be used instead, all to no avail. There is no mention of any caching mechanism in the man pages for make, nor anywhere on Google.
Does anyone have any idea why make isn't using the new version and what I can do about it? I'm compiling on a Ubuntu 12.04.2 LTS (x86_64) box, with (GNU) make version 3.81.
Update:
Some additional information. It seems make is using the current version of the makefile after all. If I change something in the main target, it's working just fine. But if I change something in the obj/%.o target, it just keeps running the same command, no matter what changes I make to that target.
The full Makefile can be found here: http://pastebin.com/WK43NRcL
CC_FILES = $(shell find -name "*.cc" -exec echo "{}" +;)
That find command is incorrect, shouldn't it be looking in the src directory? And why use echo to print the name when that's what find does anyway?
That means your list of CC_FILES, and therefore also your list of OBJ_FILES, is empty.
I think you want:
CC_FILES := $(shell find src -name "*.cc")
Note that this uses := not = because otherwise the shell function gets run every time you reference the CC_FILES variable. Using := means it is run and evaluated only once.
However, since it seems all your .cc files are in the same directory you don't need a recursive find, just do:
CC_FILES := $(wildcard src/*.cc)
As you've realised, your patsubst is broken, you can just do:
OBJ_FILES := $(patsubst src/%.cc,obj/%.o,$(CC_FILES))
(Again, use := here)
Also:
obj/%.o: obj src/%.cc
    $(CXX) $(CFLAGS) -c -o $@ $<
I think you need to read what the $< variable expands to, because the rule above isn't going to do what you expect.
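For reference, $< expands to the first prerequisite, which in the rule above is the obj directory rather than the source file. A minimal sketch of a corrected rule, assuming obj is only needed as an order-only prerequisite so the directory exists before compiling:

obj/%.o: src/%.cc | obj
    $(CXX) $(CFLAGS) -c -o $@ $<    # $< is now src/%.cc, $@ is obj/%.o

obj:
    mkdir -p obj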
This makefile is full of errors; you should use echo in your pattern rules to print out the values of variables, so you can verify they have the values you expect. (As another option for debugging, set SHELL=bash -x so every shell command is echoed.)
make does not somehow magically keep track of your old makefile; it will use whatever file is first in the list of files it looks for, in the current directory.
To find out which Makefile is actually used, see this question: Getting the name of the makefile from the makefile
Since you're using GNU make, check its excellent manual on what filenames it looks for, and in which order.

xgettext - extract translatable strings and update .pot

I have inherited a sample.pot file. Now I have added new messages in a1.c and a2.cpp. Is it possible for me to use xgettext and output the contents to the same sample.pot instead of creating a new one? E.g.:
xgettext -d sample -s -o sample.pot a1.c
xgettext -d sample -s -o sample.pot a2.cpp
Is this the preferred way to update the template so that old messages are also preserved? The other question is: how do we distinguish translatable strings from normal strings in the source code? I assume xgettext will pull all strings from the mentioned source files.
It would be great if anybody could share the correct approach. Thanks.
Does the -j, --join-existing option ("join messages with existing file") not do what you need?
Note that you can specify more than one input file on the command line.
xgettext -d sample -s -j -o sample.pot a1.c a2.cpp
The simplest way to achieve this is:
xgettext -o sample.pot -s a1.c a2.cpp sample.pot
You don't need -j, --join-existing because xgettext accepts .po and .pot files as regular input files.
The option -j, --join-existing is rarely useful. In conjunction with -D, --directory it has the effect that the output file sample.pot used as an input file is not searched in the list of directories. If you use -l c, --language=c you need -j, --join-existing because sample.pot would otherwise be parsed as a C/C++ source file.
Besides, -o sample.pot, --output=sample.pot has exactly the same effect as -d sample, --default-domain=sample. You can safely omit one of them.
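Regarding the second question: xgettext does not pull every string literal from the source. By default it only extracts strings passed to its recognized keywords (gettext, ngettext, and so on); if your code uses the common _() shorthand you have to add it yourself with -k_ (--keyword=_). A minimal sketch, assuming that convention:

#include <libintl.h>
#include <stdio.h>
#define _(str) gettext(str)

int main(void) {
    printf(gettext("Hello, world\n"));  /* extracted by default */
    printf(_("Goodbye\n"));             /* extracted only with -k_ */
    printf("debug output\n");           /* ignored by xgettext */
    return 0;
}

With that marking, the update command becomes something like xgettext -k_ -s -o sample.pot a1.c a2.cpp sample.pot.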

Does g++ still generate an output file even if the program fails to compile/load?

I am compiling some C++ programs through a perl script using:
g++ -o out `find ./ -iname "*.h" -or -iname "*.cpp"`
This seems to generate an out file every time, regardless of whether the program compiled successfully or not.
Whenever the script tries to run programs like this, it gets permission errors (weird since I'm running as root).
Is this accurate and if so, how can I prevent it?
Thanks.
The answer to your title's question ("Does g++ still generate an output file even if the program fails to compile/load?") is no:
% echo blah > test.cpp
% g++ -o out test.cpp
test.cpp:1: error: expected constructor, destructor, or type conversion at end of input
% ls *out*
/bin/ls: *out*: No such file or directory
%
I solved it as follows:
For some reason, naming the output executable with -o out seemed to leave a file behind even when the compile failed (at least as far as I can tell), so I compile to a temporary name and only rename it on success:
g++ -o out.tmp `find ./ -iname "*.h" -or -iname "*.cpp"` && mv out.tmp out