Combining two cpp files into one cpp file - c++

I am working on an assignment where I have main.cpp, lego.cpp, and lego.h. My program currently runs fine and gives the desired output. My issue is that, to submit, I must have everything in a single .cpp file. I would appreciate it if someone could give a simple example. (I want everything in main.cpp.)
Merge C++ files into a single source file
I found this link, which is similar, but I can't grasp the example.

If anyone else has this problem, I just fixed it by doing this:
class lego {                // class definition (what used to be in lego.h)
public:
    void functionA();
    void functionB();
};
void lego::functionA() {}   // member definitions (what used to be in lego.cpp)
void lego::functionB() {}
int main() {}               // your old main.cpp goes last
Make sure that you delete any other .cpp files in your source folder and remove the #include of your own header from main.cpp; that was what was causing my issue.

On a POSIX system, you can use the following command:
cat main.cpp lego.cpp > combined.cpp
On Windows, use type instead of cat.
This trivial approach works for most programs, and very likely for a simple assignment. But it can run into problems with more complicated programs, in particular those that rely heavily on the preprocessor (for example, a macro defined near the end of the first file will now also affect the second one).
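For a simple assignment, the whole round trip might look like this (a sketch only; the output names are arbitrary, and the merged file will most likely still #include "lego.h", so either keep that header next to it or paste its contents at the top):
cat main.cpp lego.cpp > combined.cpp
g++ -Wall combined.cpp -o combined    # quick check that the merged file still builds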

You can also combine the files using the command prompt. Put the files you need to combine in a single folder, open a command prompt (Windows + R, then type cmd, on Windows 10), and change to the directory that contains the files.
Enter the command copy /b *.cpp combined.cpp. That will save the two .cpp files into one file, combined.cpp.
Then open the combined file, combined.cpp, in the same directory.
This might help you too: https://www.youtube.com/watch?v=0atEx7EngoE

Related

How can I compile C++ to multiple files?

I have a program (C++) with many classes. Every class is in a separate source file (.h + .cpp).
How can I split the compiled program into multiple files (instead of one big executable file)?
Let's say, one file for every class (same as the code structure).
So that every time there is a change in a specific class, I compile only that class and replace the specific compiled file related to it.
(Something similar to .DLL files in Windows.)
Example from real life:
I am making a TUI interface for managing MySQL.
I would like to create a MySQL text editor (TUI) with ncurses.
The code (class) for creating and managing a single window object is in
textWin.cpp + textWin.h
The code (class) for managing multiple windows, by creating window objects from the previous class, is in winMan.cpp + winMan.h
The code (class) for managing the MySQL database is in:
mysql.cpp + mysql.h
and so on...
So, I have the following files:
MyProgram.cpp
- winMan.cpp + winMan.h
- textWin.cpp + textWin.h
- mysql.cpp + mysql.h
- ..
- ..
After g++ compilation, I get one executable file, './MyProgram' (about 15 MB), which I deliver to all my customers (thousands of them).
I just found a typo in textWin.cpp, I fixed it, and I told all my customers that there is an update... all of them need to download one big 15 MB file, which consumes a lot of bandwidth and server resources for just a small fix.
Is there a way to send all my customers a smaller file that contains only the compiled code for the textWin class?
I use g++ on CentOS 7.
The gcc compiler will happily take a list of .cpp files to compile together to make one executable. You don't need to write a "containing" cpp file. However, you still have the issue that it rebuilds them all each time.
The alternative is to build each source file separately to an object file, then link those all together. Hopefully each of those invocations of the compiler will add up to less time than the single command line. But how do you keep track of which .cpp files actually need to be rebuilt?
The usual approach is to use a makefile and a make utility which will check the dates of all the mentioned files. There are a variety of flavours of makefile, and helper makefile engines. Download a simple package like gzip and you can quickly get an idea of how the Makefile is structured. Then there is lots of help online, or you may decide that this is just too much trouble for a project with 5 files in it.
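In raw commands, the idea looks roughly like this (a sketch using the file names from the question; a makefile simply automates the bookkeeping of which object files are out of date):
g++ -c MyProgram.cpp winMan.cpp textWin.cpp mysql.cpp     # compile each source file to its own .o
g++ MyProgram.o winMan.o textWin.o mysql.o -o MyProgram   # link the objects into the executable
g++ -c textWin.cpp                                        # after a fix, rebuild only that file...
g++ MyProgram.o winMan.o textWin.o mysql.o -o MyProgram   # ...and relink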
As suggested in the comments by @RSahu:
Shared libraries (.so files) are the way to split your compiled code.
Here is a small example:
https://www.cprogramming.com/tutorial/shared-libraries-linux-gcc.html
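Applied to the files from the question, the build might look roughly like this (a sketch; the library name libtextwin.so is made up):
g++ -fPIC -shared textWin.cpp -o libtextwin.so                      # the class that changes often becomes a shared library
g++ MyProgram.cpp winMan.cpp mysql.cpp -L. -ltextwin -o MyProgram   # link the rest of the program against it
A fixed textWin then only requires shipping a new libtextwin.so; at run time the program has to be able to find it (for example via LD_LIBRARY_PATH or an rpath), which the tutorial above covers.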
Of course, you could put your texts into separate text files and only deploy those when the error is in one of them. For your special use case, where binary differences must be deployed, this question might be helpful: How do I create binary patches?
Another option: do proper versioning. That way, your customers can decide for themselves whether they need this update.

How to add lodepng.cpp and lodepng.h to my project using DevC++?

I have written this code, which is a simple utility to separate RG/GB Bayer color channels into individual files. It takes a RAW12 file as input and outputs PNG files corresponding to the different Bayer channels. I tried to compile it using DevC++ and it shows:
[Error] lodepng.h: No such file or directory
I'm kind of new to this sort of thing, and I don't know how to include lodepng.h and lodepng.cpp in DevC++; I tried a lot to find out how. Any help would be appreciated.
#include "lodepng.h" statement is enough to add the file, if it is present in the same path as the current file in which you are working.
Once you have added the file, you can directly call all the methods in that file or refer to any objects or variables in that file.
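DevC++ drives a MinGW port of g++ underneath, so the command-line equivalent of adding both files to the project would be roughly the following (a sketch; yourfile.cpp and raw2png are placeholder names):
g++ yourfile.cpp lodepng.cpp -o raw2png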

Writing output files in different directory when many output files are being created

I am using Fortran 95. I have a question very similar to Accessing files in sub directory of main program
The additional problem that I am having is this: I am creating files in a loop using the following commands:
write(fn,fmt='(a,i0,a)')"degseqA",filenumber,'.dat'
open(unit=filenumber,file=fn)
Hence I cannot use 'output/myfile.dat' to make myfile.dat go to the directory output. Is there any way to solve this?
Thanks
If the directory already exists, it is totally straightforward.
write(fn,fmt='(a,i0,a)') "output/degseqA",filenumber,'.dat'
open(unit=filenumber,file=fn)
or in general
write(fn,fmt='(a,i0,a)') trim(directory_name)//"degseqA",filenumber,'.dat'
where directory_name is a character variable with the name of the directory (including the trailing /).
Make sure fn is large enough.
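If the output directory might not exist yet, one simple option is to create it before the program runs, for example from the shell (a sketch; myprogram is a placeholder for your executable):
mkdir -p output      # creates the directory only if it is missing
./myprogram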

Where should I put this .h file, or how can I properly set my path in TextMate?

I'm just getting my feet wet in C++ using the Stanford CS 106B lectures available online. The assignments have the students use some custom libraries which are available for download online, although the installation instructions are gone.
While I can do the assignments in Xcode using a pre-built blank project which includes the relevant files and source trees set up, I also have TextMate on hand and thought I'd like to try coding with it, since I liked using it a lot for coding LaTeX. So far so good.
The first program I'm trying to run (a very simple ten-line program) contains an #include "genlib.h" in the first line. I have the genlib.h file, but can't seem to get either of the following to work:
Add the path to the relevant file in TextMate: When I try to add the path to the folder on my desktop (/previouspathinthelist:/Users/me/Desktop/C++\ libraries) where the file lives I get an error: /Users/me/Documents/c++ programs/powertab.cpp:9:20: error: genlib.h: No such file or directory even though the file is right there! (Maybe I should note here that the file to be imported and the program file are in two different folders).
Add the file to one of the other paths: I can't move the files using mv in the terminal to /usr/bin, /usr/sbin, etc. because it says I don't have the proper permissions.
Is there something I'm doing wrong in setting my path to my folder in Documents? There aren't any spelling mistakes or anything, since the path came straight from Get Info in the Finder. I know this is a programming forum and not a TextMate support forum, but I thought it'd be good to know where people generally put these kinds of files on their systems.
Just put the file in the same directory as your other source files.
#include "filename"
searches the directory of the including source file first, whereas
#include <filename>
only searches the include file path.
The reason why /previouspathinthelist:/Users/me/Desktop/C++\ libraries doesn't work probably has to do with the space in the directory name. It is quite possible that a backslash is not the right way to quote the space in the tool you're using. Many tools from the C/Unix tradition deal rather badly with pathnames that contain spaces (even though the Unix kernel itself has no such problem); often you'll find that there is no single amount of quoting that will simultaneously satisfy all the tools and subsystems that use some setting. Better to avoid spaces in filenames entirely when you're doing development.
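For what it's worth, the command-line equivalent of that include-path setting is the compiler's -I flag, with the space quoted for the shell (a sketch; the output name is arbitrary):
g++ -I"/Users/me/Desktop/C++ libraries" powertab.cpp -o powertab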

How to bundle C/C++ code with C-shell-script?

I have a C shell script that calls two C programs, one after another, with some file handling before, in between, and afterwards.
Now, as such I have three different files - one C shell script and 2 .c files.
I need to give this script to other users. The problem is that I have to distribute three files - which the users must keep in the same folder and then execute the script.
Is there some better way to do this?
[I know I can make one C file out of those two... but I will still be left with a shell script and a C file. Actually, the two C programs do entirely different things... so I want them to be separate.]
Sounds like you're worried that your users aren't savvy enough to figure out how to resolve issues like "command not found" errors and the like. If you absolutely MUST hide the "complexity" of a collection of files, you could have your script create the other files. In most other circumstances I would suggest that this approach is only going to increase your support workload, since semi-experienced users are less likely to know how to troubleshoot the process.
If you choose to rely on the presence of a compiler on the system that you are running on, you can store the C code in the script as a collection of echo "$STRING" >> file.c commands (or here documents) that create your two C files, which you then compile and use.
If you want to use pre-compiled programs instead, the same basic process can be used, except you use xxd both to generate the strings in your script and to reverse the conversion, giving you working binaries. Note: remember to chmod the binaries so that they are executable.
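A rough sketch of the xxd round trip (prog1 is a placeholder name): at packaging time convert the binary to hex text, and in the distributed script convert it back:
xxd -p prog1 > prog1.hex        # packaging step: binary -> plain hex text
xxd -p -r prog1.hex > prog1     # in the script: hex text -> binary again
chmod +x prog1                  # remember to make the rebuilt binary executable
./prog1
In practice the contents of prog1.hex would be pasted into the script itself (for example inside a here document) so that only one file has to be shipped.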
Use the shar command to create a self-extracting archive.
Or better yet, use unzipsfx with the AUTORUN option.
This provides users with ONE file, and only ONE command to execute (as opposed to one for untarring and one for execution).
NOTE: The unzip command to run should use the -n option; that way only the first run extracts the files, and subsequent runs skip the extraction.
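A rough sketch of both approaches (file names are placeholders; the unzipsfx stub ships with Info-ZIP and has to be available when you build the bundle):
shar run.csh prog1.c prog2.c > bundle.sh   # recipients just run: sh bundle.sh
zip bundle.zip run.csh prog1.c prog2.c     # or: build a self-extracting zip...
cat unzipsfx bundle.zip > bundle           # ...by prepending the unzipsfx stub,
zip -A bundle                              # fixing up the archive offsets,
chmod +x bundle                            # and making it executable; recipients run ./bundle
Passing -n on that run, as noted above, makes repeated runs skip files that were already extracted.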
Use a zip or tar file? And you do realize that .c files aren't executable; you need to compile and link them first?
You can include the C code inside the shell script as a here document:
#!/bin/bash
# write the embedded C source out to code.c
# (quoting 'EOF' keeps the shell from expanding $ or backticks inside the C code)
cat > code.c << 'EOF'
line #1
line #2
...
EOF
# compile
gcc code.c -o code
# execute
./code
If you want to get fancy, you can test for the existence of the executable and skip compiling if it already exists.
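A minimal sketch of that check, reusing the code.c/code names from the snippet above (the -nt test also rebuilds when the source is newer than the executable):
if [ ! -x ./code ] || [ code.c -nt ./code ]; then
    gcc code.c -o code
fi
./code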
If you are doing much shell programming, the rest of the Advanced Bash-Scripting Guide is worth looking at as well.