Creating an ofstream for a `FILE*` - possible? [duplicate] - c++

The well known way of creating an fstream object is:
ifstream fobj("myfile.txt");
i.e. using a filename.
But I want to create an ifstream object from a FILE* instead.
Reason: I want to execute a command using _popen(), which returns the command's output as a FILE*. So there is a FILE* involved, but no filename.

You cannot do that in standard C++ alone, since iostreams and C I/O are entirely separate and unrelated. You could, however, write your own iostream that's backed by a C FILE stream. I believe GCC ships such a stream class (__gnu_cxx::stdio_filebuf, in <ext/stdio_filebuf.h>) as a library extension.
Alternatively, if all you want is an object-y way of wrapping a C FILE stream, you could use a unique pointer for that purpose.

You can try QTextStream.
See https://doc.qt.io/qt-6/qtextstream.html#QTextStream-2

You can create a string, and use fread to read and append to it. It's not clean, but you're working with a C interface.
Something like this should work:
FILE *f = popen(...);
const size_t N = 1024;
std::string total;
std::vector<char> buf(N);
while (true) {
    size_t nread = fread(buf.data(), 1, N, f);
    if (nread) { total.append(buf.data(), nread); }
    if (nread < N) { break; }
}
pclose(f);
Note that buf is a single vector of N chars (not an array of vectors), and that only the nread bytes actually read are appended.

Related

Read an image or pdf using C++ without external library

I was just wondering, after reading about Java & C#, whether C++ can also read image & PDF files without the use of external libraries. C++ doesn't have the byte type that Java & C# have, so how can we accomplish the task (again, without using an external library)?
Can anyone give a small demonstration (i.e. a program or code to read, copy, or write image or PDF files)?
You can use unsigned char, or char reinterpreted as some integer type, to parse binary file formats like PDF, JPEG etc. You can create a buffer as std::vector<char> and read into it as follows:
std::vector<char> buffer((
std::istreambuf_iterator<char>(infile)), // Ensure infile was opened with binary attribute
(std::istreambuf_iterator<char>()));
Related questions: Reading and writing binary file
It makes no difference what kind of file you are reading, as long as it's opened in binary mode; the only difference is how you interpret the data you get from it.
It's significantly better to use a ready-made library such as libjpeg; there are plenty of them. But if you really want to do this yourself, you should first define suitable structures and constants (see the links below) to keep the code convenient and usable. Then you just read the data and try to interpret it step by step. The code below is just pseudo code; I didn't compile it.
#include <fstream>
#include <stdexcept>
#include <string>
// define header structure
struct jpeg_header
{
enum class marker: unsigned short { soi = 0xffd8, sof0 = 0xffc0 ... };
...
};
bool is_soi(unsigned short m) { return static_cast<unsigned short>(jpeg_header::marker::soi) == m; }
jpeg_header read_jpeg_header(const std::string& fn)
{
std::ifstream inf(fn, std::ifstream::binary);
if (!inf)
{
throw std::runtime_error("Can't open file: " + fn);
}
inf.exceptions(std::ifstream::failbit | std::ifstream::eofbit);
unsigned short marker = inf.get() << 8; // 0xffd8 is SOI (start of image), the first marker in a JPEG
marker |= inf.get();
if (!is_soi(marker))
{
throw std::runtime_error("Invalid jpeg header");
}
}
...
jpeg_header header;
// read further and fill header structure
...
return header;
}
To read a large block of data, use the ifstream::read() and ifstream::readsome() methods. There is a good example at http://en.cppreference.com/w/cpp/io/basic_istream/read.
Those functions are also faster than stream iterators. It's also better to define your own exception classes derived from std::runtime_error.
For details on the file formats you are interested in, look here:
Structure of a PDF file?
https://en.wikipedia.org/wiki/JPEG_File_Interchange_Format
https://en.wikipedia.org/wiki/JPEG
It would be a strange world to have a systems language like C (and in this case C++) without a byte type :).
Yes, it has a strange name, unsigned char, but it is still there :).
Really, just think about the magnitude of redeveloping everything to avoid bytes: peripherals, many registers in CPUs and other chips, communication, data protocols. It would all have to be redone :).

File loader problems

I have a text file which contains lists of authors and books, and I need to load it into my program. Here is the code of the method which should load it:
void Loader::loadFile(const char* path)
{
FILE* file = fopen(path, "r");
char* bufferString;
while (feof(file) != 1) {
fgets(bufferString, 1000, file);
printf("%s", bufferString);
}
}
I use it in my main file:
int main(int argc, char** argv) {
Loader* loader = new Loader();
loader->loadFile("/home/terayon/prog/parser/data.txt");
return 0;
}
And the data.txt file is not completely printed.
What should I do to get the complete data?
fgets reads into the memory pointed to by the pointer passed as the first parameter, bufferString in your case.
But your bufferString is an uninitialised pointer (leading to undefined behaviour):
char * bufferString;
// not initialised,
// and definitely not pointing to valid memory
So you need to provide some memory to read into, e.g. by making it an array:
char bufferString[1000];
// that's a bit large to store on the stack
As a side note: your code is not idiomatic C++. You're using the I/O functions provided by the C standard library, which is possible, but using the facilities of the C++ standard library would be more appropriate.
You have undefined behavior: you have a pointer bufferString, but you never actually make it point anywhere. Since it's not initialized, its value is indeterminate, meaning you will write to unallocated memory in the fgets call.
It's easy to solve though, declare it as an array, and use the array size when calling fgets:
char bufferString[500];
...
fgets(bufferString, sizeof(bufferString), file);
Besides the problem detailed above, you should not do while (!feof(file)); it will not work as you expect. The reason is that the EOF flag is not set until you try to read beyond the end of the file, causing the loop to iterate once too many times.
You should instead do e.g. while (fgets(...) != NULL)
The code you have is not very C++-ish; instead it uses the old C functions for file handling. I suggest you read more about the C++ standard I/O library and std::string, which is an auto-expanding string class that won't have the limits of C arrays and won't suffer from potential buffer overflows in the same way.
The code could then look something like this
std::ifstream input_file(path);
std::string input_buffer;
while (std::getline(input_file, input_buffer))
std::cout << input_buffer << '\n';

C++: cin >> *char

So, I'm currently writing a line editor as a learning project on I/O, writing files, and the like. It is written in C++, and I am currently trying to write out to a file of the user's choosing. I have CLI arguments implemented, but I currently have no idea how to implement an in program way of specifying the file to write to.
char *filename;
if (argc >= 2){
filename = argv[1];
} else{
cout << "file>";
cin >> filename;
cin.ignore();
}
This works perfectly well when I use command line arguments; however, when I do not, the program segmentation faults as soon as it starts. The place where I use the actual filename is in the save command:
void save(char filename[], int textlen, string file[]){
ofstream out(filename);
out << filestring(textlen, file);
out.close();
}
Which also works perfectly well. Is there any way you can help me? Full source code, for review, is up on https://github.com/GBGamer/SLED
The problem is that char* filename is just a pointer to some memory containing characters. It does not own any memory itself.
When you use the command line argument, the program handles storing that string somewhere, and you get a pointer to it. When you try to read using cin >> filename there isn't actually anywhere to store the read data.
Solution: Replace char* filename with std::string filename (and #include <string>).
Then to open the output file, you need a c-style string (null terminated char array). std::string has a function for this. You would write
std::ofstream out(filename.c_str());
^^^^^
Or, in fact, if you can use a recent compiler with c++11 features, you don't even need to use c_str(). A new std::ofstream constructor has been added to accept a std::string.
Your filename variable points to argv[1] when a command line argument is provided, so no memory needs to be allocated; but when execution enters the else block, you have not allocated any memory for filename. It's just a pointer.
Use malloc to give filename some memory before taking user input:
filename = (char *)malloc(sizeof(char) * (FILE_NAME_LENGTH + 1));

Read .part files and concatenate them all

So I am writing my own custom FTP client for a school project. I managed to get everything to work with the swarming FTP client and am down to one last small part...reading the .part files into the main file. I need to do two things. (1) Get this to read each file and write to the final file properly (2) The command to delete the part files after I am done with each one.
Can someone please help me to fix my concatenate function I wrote below? I thought I had it right to read each file until the EOF and then go on to the next.
In this case *numOfThreads is 17. I ended up with a file of 4742442 bytes instead of 594542592 bytes. Thanks, and I am happy to provide any other useful information.
EDIT: Modified code for comment below.
std::string s = "Fedora-15-x86_64-Live-Desktop.iso";
std::ofstream out;
out.open(s.c_str(), std::ios::out | std::ios::binary);
for (int i = 0; i < 17; ++i)
{
std::ifstream in;
std::ostringstream convert;
convert << i;
std::string t = s + ".part" + convert.str();
in.open(t.c_str(), std::ios::in | std::ios::binary);
int size = 32*1024;
char *tempBuffer = new char[size];
if (in.good())
{
while (in.read(tempBuffer, size))
out.write(tempBuffer, in.gcount());
}
delete [] tempBuffer;
in.close();
}
out.close();
return 0;
Almost everything in your copying loop has problems.
while (!in.eof())
This is broken. Not much more to say than that.
bzero(tempBuffer, size);
This is fairly harmless, but utterly pointless.
in.read(tempBuffer, size);
This the "almost" part -- i.e., the one piece that isn't obviously broken.
out.write(tempBuffer, strlen(tempBuffer));
You don't want to use strlen to determine the length -- it's intended only for NUL-terminated (C-style) strings. If (as is apparently the case) the data you read may contain zero-bytes (rather than using zero-bytes only to signal the end of a string), this will simply produce the wrong size.
What you normally want to do is a loop something like:
while (read(some_amount) == succeeded)
write(amount that was read);
In C++ that will typically be something like:
while (infile.read(buffer, buffer_size))
outfile.write(buffer, infile.gcount());
It's probably also worth noting that since you're allocating memory for the buffer using new, but never using delete, your function is leaking memory. Probably better to do without new for this -- an array or vector would be obvious alternatives here.
Edit: as for why while (infile.read(...)) works, the read returns a reference to the stream. The stream in turn provides a conversion to bool (in C++11) or void * (in C++03) that can be interpreted as a Boolean. That conversion operator returns the state of the stream, so if reading failed, it will be interpreted as false, but as long as it succeeded, it will be interpreted as true.

Writing to file using c and c++

When I write a struct to a file using C's fwrite, which accepts its data through a void pointer, the result is not interpretable by a text editor:
struct index
{
index(int _x, int _y):x(_x), y(_y){}
int x, y;
};
index i(4, 7);
FILE *stream;
fopen_s(&stream, "C:\\File.txt", "wb");
fwrite(&i, sizeof(index), 1, stream);
but when I try the same with C++'s ofstream write in binary mode, it is readable. Why doesn't it come out the same as what fwrite wrote?
This is the way to write binary data using a stream in C++:
#include <fstream>

struct C {
    int a, b;
} c;

int main() {
    std::ofstream f("foo.txt", std::ios::binary);
    f.write((const char*)&c, sizeof c);
}
This will save the object the same way fwrite would. If it doesn't for you, please post your code with streams and we'll see what's wrong.
C++'s ofstream stream insertion (operator<<) only does text. The difference between opening an iostream in binary vs text mode is whether or not end-of-line character conversion happens. If you want to write a binary format where a 32-bit int takes exactly 32 bits, use the C functions in C++.
Edit on why fwrite may be the better choice:
Ostream's write method is more or less a clone of fwrite (except that it is a little less useful, since it only takes a byte array and a length instead of fwrite's four parameters), but by sticking to fwrite there is no way to accidentally use stream insertion in one place and write in another. More or less, it is a safety mechanism. While you gain that margin of safety, you lose a little flexibility: you can no longer drop in an iostream derivative that compresses output without changing any file-writing code.