ifstream binary read/write only takes char*? - c++

I'm having a little trouble figuring out how to write this value to a file correctly. I did a little research on the internet and found this article.
http://www.eecs.umich.edu/courses/eecs380/HANDOUTS/cppBinaryFileIO-2.html
#include <fstream>
#include <iostream>

int main()
{
    int testVar = 71;
    std::ofstream outputFile;
    outputFile.open("C:/binary.dat", std::ios::out | std::ios::binary);
    outputFile.seekg(0);
    outputFile.write(&testVar, sizeof(testVar));
    outputFile.close();
}
What I understand from the article is that the first parameter is a void pointer, which means it will accept any type? But when I'm typing it out, IntelliSense says there is no overload and that the first parameter takes type char*.
Am I using the wrong header or something from an older C++ version??
Could really use some help here.
Thanks!

I am not familiar with the history of these functions, so I can't comment on why the argument is of type char* rather than void*.
To solve your problem...
You can use:
outputFile.write(reinterpret_cast<char*>(&testVar), sizeof(testVar));
Use reinterpret_cast with istream::read() as well.
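For the read side, here is a minimal sketch (assuming the same binary.dat written above):
#include <fstream>
#include <iostream>

int main()
{
    int testVar = 0;
    std::ifstream inputFile("C:/binary.dat", std::ios::in | std::ios::binary);
    // read() also expects a char*, so the same cast is needed
    inputFile.read(reinterpret_cast<char*>(&testVar), sizeof(testVar));
    std::cout << testVar << '\n';   // prints 71 if the write above succeeded
}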

You really just need to cast it to char* as was said before, but there are other problems with the code.
seekg() is for input streams, and you are writing an output file. If you meant to clear the file, just open it with trunc.
#include <fstream>
#include <iostream>

int main()
{
    int testVar = 71;
    std::ofstream outputFile("C:/binary.dat", std::ios::out | std::ios::binary | std::ios::trunc);
    outputFile.write((char*)&testVar, sizeof(testVar));
    outputFile.close();
}

The first parameter is a char*.
A char has a size of 1 byte, so it represents the data byte by byte, and a char* refers to a block of such bytes.
Hence, when writing raw binary data, you pass the data as a char* together with the size of the block.
Am I using the wrong header or something from an older C++ version??
No, that is not the problem. As stated, it really is char*, and not void*, that is taken as the argument.
NOTE: The seekg() member function is for input streams. I think the functionality you need is seekp().
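For reference, a minimal sketch of the same write using seekp() on the output stream (same file path as in the question):
#include <fstream>

int main()
{
    int testVar = 71;
    std::ofstream out("C:/binary.dat", std::ios::out | std::ios::binary);
    out.seekp(0);   // seekp() moves the put (write) position; redundant right after opening, but legal
    out.write(reinterpret_cast<const char*>(&testVar), sizeof(testVar));
}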

Related

How does read and write function work in C++ file handling?

I'm learning file handling in C++ from the internet on my own. I came across the read and write functions, but the parameters they take confused me.
So, I found the syntax as
fstream fout;
fout.write( (char *) &obj, sizeof(obj) );
and
fstream fin;
fin.read( (char *) &obj, sizeof(obj) );
In both of these, what is the function of char*?
And how does it read and write the file?
The function fstream::read has the following function signature:
istream& read (char* s, streamsize n);
You need to cast your arguments to the correct type. (char*) tells the compiler to pretend &obj is the correct type. Usually, this is a really bad idea.
Instead, you should do it this way:
// C++ program to demonstrate getline() function
#include <fstream>
#include <iostream>
#include <string>
using namespace std;

int main()
{
    string str;
    fstream fin("file.txt");   // the stream has to be opened; "file.txt" is just an example name
    getline(fin, str);         // use cin instead to read from stdin
    return 0;
}
Source: https://www.geeksforgeeks.org/getline-string-c/
The char* cast with read and write is there to treat the obj variable as generic, contiguous characters (ignoring any structure).
The read function will read from the stream directly into the obj variable, without any byte translation or mapping to data members (fields). Note, pointers in classes or structures will be replaced with whatever value comes from the stream (which means the pointer will probably point to an invalid or improper location). Beware of padding issues.
The write function will write the entire area of memory occupied by obj to the stream. Any padding between structure or class members will also be written. The values of pointers will be written to the stream, not the items the pointers point to.
Note: these functions work "as-is". There are no conversions or translations of the data. For example, there is no conversion between big endian and little endian, and no processing of "end of line" or "end of file" characters. They are basically mirror-image data transfers.
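To make that concrete, here is a small sketch (the Record struct and file name are made up for illustration) of writing a plain struct as raw bytes and reading it back:
#include <fstream>

// A hypothetical plain struct: no pointers, so its raw bytes are self-contained
struct Record {
    int    id;
    double value;
};

int main()
{
    Record out{42, 3.14};
    std::ofstream ofs("record.bin", std::ios::binary);
    ofs.write(reinterpret_cast<const char*>(&out), sizeof(out));   // raw bytes, padding included
    ofs.close();

    Record in{};
    std::ifstream ifs("record.bin", std::ios::binary);
    ifs.read(reinterpret_cast<char*>(&in), sizeof(in));            // bytes copied straight into 'in'
}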

C++ Writing to file vector of byte

I have:
typedef unsigned char byte;
std::vector<byte> data;
I tried to save data to a file this way (but I get an error):
fstream file(filename,ios::out);
file.write(&data, data.size());
How do I process or cast data so I can write it to a file?
To store a vector in a file, you have to write the contents of the vector, not the vector itself. You can access the raw data with &vector[0], the address of the first element (provided the vector contains at least one element).
ofstream outfile(filename, ios::out | ios::binary);
outfile.write(&data[0], data.size());
This should be fairly efficient at writing. fstream is generic, use ofstream if you are going to write.
* The statement file.write(&buffer[0], buffer.size()) gives an error:
error C2664: 'std::basic_ostream<_Elem,_Traits>::write' : cannot
convert parameter 1 from 'unsigned char *' to 'const char *'
* In my compiler (VS2008) vector doesn't have a data() method.
I think the following is correct:
file.write((const char*)&buffer[0], buffer.size());
Use vector::data to get a pointer to the underlying data (the cast is still needed here because the elements are unsigned char):
file.write(reinterpret_cast<const char*>(data.data()), data.size());
You are to pass the address of the first element, not the address of the vector object itself.
&data[0]
Note: Make sure that the vector is not empty before doing this.
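For example, a guarded version of that call might look like this (the cast is still needed because the elements are unsigned char):
if (!data.empty()) {
    file.write(reinterpret_cast<const char*>(&data[0]), data.size());
}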
A lot of these solutions are only partially complete (lacking includes & casts), so let me post a full working example:
#include <cstddef>
#include <fstream>
#include <string>
#include <vector>

int main()
{
    std::vector<std::byte> dataVector(10, std::byte{ 'Z' });
    const std::string filename = "C:\\test_file.txt";
    std::ofstream outfile(filename, std::ios::out | std::ios::binary);
    outfile.write(reinterpret_cast<const char*>(dataVector.data()), dataVector.size());
    return 0;
}
I think that ostream::write(myVector[0], ...) will not work, just as it did not work for reading into a vector (which is what I had).
What works is ostream::write(MyVector.data(), ...).
Note: for reading, use ifstream::read() with the same kind of cast, e.g. ifstream::read(reinterpret_cast<char*>(MyVector.data()), ...).
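And a minimal sketch of the read side, assuming the same file name and that the whole file fits in memory (error handling omitted):
#include <cstddef>
#include <fstream>
#include <vector>

int main()
{
    std::ifstream infile("C:\\test_file.txt", std::ios::in | std::ios::binary | std::ios::ate);
    const std::streamsize size = infile.tellg();   // opened with ios::ate, so tellg() gives the file size
    infile.seekg(0, std::ios::beg);

    std::vector<std::byte> dataVector(static_cast<std::size_t>(size));
    // read() wants a char*, so the same reinterpret_cast is needed on the way in
    infile.read(reinterpret_cast<char*>(dataVector.data()), size);
}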

c++: Reading a file line by line

I'm wondering if there's a C++ way of opening a file and reading the input line by line.
I encountered the following code that accomplishes the task:
#include <iostream>
#include <fstream>
using namespace std;

int main () {
    ifstream myfile;
    myfile.open ("example.txt");
    return 0;
}
I'm encouraged to not use any C functions or commands.
The thing is, my "example.txt" is in the form of a string, and using str.c_str() is a C function, so I guess I have two ways to solve the issue.
Is there another way to read input from a file line by line? Perhaps using something that will accept a string as a parameter for the filepath? Is there a C++ way of doing things? :)
Or, is there another way to convert the string in to a const char *, which is what the myfile.open() function needs?
Many thanks in advance!
EDIT: My lack of research led me to think c_str() was a C function, and it isn't. My apologies. Since it isn't, I have found my answer.
C++11's fstream constructor accepts a std::string. In most cases you want to use fstream's constructor rather than .open() - you save one line and one function call.
For reading the file line by line, you should use std::getline().
Also note that string::c_str() is still a C++ function, not a C one, as is the fstream constructor taking a const char *. Most (if not all, I'm not 100% sure) C standard library functions are also included in the C++ standard library.
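Putting both points together, a minimal sketch that just echoes each line to stdout (using the file name from the question):
#include <fstream>
#include <iostream>
#include <string>

int main()
{
    std::string filename = "example.txt";
    std::ifstream myfile(filename);              // C++11: the constructor takes a std::string directly
    std::string line;
    while (std::getline(myfile, line)) {         // loops until end of file or a read error
        std::cout << line << '\n';
    }
}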
Since the issue with str.c_str() is already answered, I'm just going to add a bit about reading input line by line. For example, say you want to read two ints per line, extract them, and put them into a vector.
fstream fs(filename.c_str(), ios_base::in);
string line;
stringstream ss;
int a, b;
vector<int> d;
int numlines;
int i;
for (i = 0; getline(fs, line); i++) {
    // load the line into the stringstream, extract pairs of ints, then reset the stream flags
    for (ss.str(line); ss >> a >> b; d.push_back(a), d.push_back(b)) {}
    ss.clear();
}
numlines = i;
Hope you get the idea of using getline() and fstream()
It's going to look very similar. You'll want an ifstream instead of an ofstream, you'll want the >> operator, and assuming your file has more than one line, you'll need a loop and a check for the end of the file (ifstream::eof()).
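A minimal sketch of that shape, using the extraction itself as the loop condition (which also covers the end-of-file case):
#include <fstream>
#include <string>

int main()
{
    std::ifstream myfile("example.txt");
    std::string word;
    while (myfile >> word) {   // extraction fails, and the loop ends, at end of file
        // process each whitespace-separated token here
    }
}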

saving files in c++ at different full paths

I am writing a program in C++ in which I need to save some .txt files to different locations according to a counter variable in the program. What should the code be? Please help.
I know how to save file using full path
ofstream f;
f.open("c:\\user\\Desktop\\data1\\example.txt");
f.close();
I want "c:\user\Desktop\data[CTR]\filedata.txt"
But here data1, data2, data3, and so on have to be accessed in turn and a text file created in each, so what is the code?
The counter variable "ctr" is already computed in my program.
You could use snprintf to create a custom string. An example is this:
char filepath[100];
snprintf(filepath, 100, "c:\\user\\Desktop\\data%d\\example.txt", datanum);
Then whatever you want to do with it:
ofstream f;
f.open(filepath);
f.close();
Note: snprintf limits the maximum number of characters that can be written on your buffer (filepath). This is very useful for when the arguments of *printf are strings (that is, using %s) to avoid buffer overflow. In the case of this example, where the argument is a number (%d), it is already known that it cannot have more than 10 characters and so the resulting string's length already has an upper bound and just making the filepath buffer big enough is sufficient. That is, in this special case, sprintf could be used instead of snprintf.
You can use the standard string streams, such as:
#include <fstream>
#include <string>
#include <sstream>
using namespace std;

void f ( int data1 )
{
    ostringstream path;
    path << "c:\\user\\Desktop\\data" << data1 << "\\example.txt";  // e.g. data1 == 3 gives ...\data3\example.txt
    ofstream file(path.str().c_str());
    if (!file.is_open()) {
        // handle error.
    }
    // write contents...
}
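If you can use C++11 or later, std::to_string gives a compact alternative for building the same kind of path. A sketch (the helper name save_file and the exact path are illustrative):
#include <fstream>
#include <string>

void save_file(int ctr)   // ctr is the counter from the question
{
    std::string path = "c:\\user\\Desktop\\data" + std::to_string(ctr) + "\\filedata.txt";
    std::ofstream f(path);   // C++11: ofstream accepts a std::string directly
    if (!f.is_open()) {
        // handle error (e.g. the dataN folder does not exist)
    }
    // write contents...
}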

What is the most elegant way to read a text file with c++?

I'd like to read the whole content of a text file into a std::string object with C++.
With Python, I can write:
text = open("text.txt", "rt").read()
It is very simple and elegant. I hate ugly stuff, so I'd like to know - what is the most elegant way to read a text file with C++?
Thanks.
There are many ways; pick whichever is the most elegant for you.
Reading into char*:
ifstream file ("file.txt", ios::in | ios::binary | ios::ate);
if (file.is_open())
{
    file.seekg(0, ios::end);
    streamsize size = file.tellg();   // size of the file in bytes
    char *contents = new char[size];
    file.seekg(0, ios::beg);
    file.read(contents, size);
    file.close();
    //... do something with it
    delete[] contents;
}
Into std::string:
std::ifstream in("file.txt");
std::string contents((std::istreambuf_iterator<char>(in)),
std::istreambuf_iterator<char>());
Into vector<char>:
std::ifstream in("file.txt");
std::vector<char> contents((std::istreambuf_iterator<char>(in)),
std::istreambuf_iterator<char>());
Into string, using stringstream:
std::ifstream in("file.txt");
std::stringstream buffer;
buffer << in.rdbuf();
std::string contents(buffer.str());
file.txt is just an example; everything works fine for binary files as well, just make sure you use ios::binary in the ifstream constructor.
There's another thread on this subject.
My solutions from this thread (both one-liners):
The nice (see Milan's second solution):
string str((istreambuf_iterator<char>(ifs)), istreambuf_iterator<char>());
and the fast:
string str(static_cast<stringstream const&>(stringstream() << ifs.rdbuf()).str());
You seem to speak of elegance as if it simply meant "little code". This is of course subjective to some extent. Some would say that omitting all error handling isn't very elegant; others would say that clear, compact code you understand right away is elegant.
Write your own one-liner function/method which reads the file contents, but make it rigorous and safe underneath the surface, and you will have covered both aspects of elegance.
All the best
/Robert
But beware that a C++ string (or more concretely, an STL string) is as little capable as a C string of holding a string of arbitrary length - of course it can't!
Take a look at the member max_size(), which gives you the maximum number of characters a string might contain. This is an implementation-defined number and may not be portable among different platforms. Visual Studio gives a value of about 4 GB for strings, others might give you only 64k, and on 64-bit platforms it might give you something really huge! It depends, and of course normally you will run into a bad_alloc exception due to memory exhaustion long before reaching the 4 GB limit...
BTW: max_size() is a member of other STL containers as well! It will give you the maximum number of elements of a certain type (the one for which you instantiated the container) which that container will (theoretically) be able to hold.
So, if you're reading from a file of unknown origin you should:
- Check its size and make sure it's smaller than max_size()
- Catch and process bad_alloc exceptions
And another point:
Why are you keen on reading the file into a string? I would expect you to process it further, by incrementally parsing it or something, right? So instead of reading it into a string you might as well read it into a stringstream (which is basically just some syntactic sugar for a string) and do the processing there. But then you could just as well do the processing directly from the file, because if properly programmed, the stringstream can seamlessly be replaced by a file stream, i.e. by the file itself, or by any other input stream: they all share the same members and operators and can thus be interchanged seamlessly!
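As an illustration of that interchangeability, here is a small sketch of a processing function written against std::istream&, which accepts a file stream and a stringstream alike:
#include <fstream>
#include <iostream>
#include <sstream>
#include <string>

// Works with any input stream: a file, a stringstream, std::cin, ...
void process(std::istream& in)
{
    std::string line;
    while (std::getline(in, line)) {
        // ... parse the line ...
    }
}

int main()
{
    std::ifstream file("text.txt");
    process(file);                                    // read directly from the file

    std::stringstream ss("line one\nline two\n");
    process(ss);                                      // or from an in-memory buffer
}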
And for the processing itself: there's also a lot you can have automated by the compiler! E.g. let's say you want to tokenize the string. With a proper template, the following actions:
- Reading from a file (or a string or any other input stream)
- Tokenizing the content
- Pushing all found tokens into an STL container
- Sorting the tokens alphabetically
- Eliminating any duplicate values
can all(!!) be achieved in one single(!) line of C++ code (leaving aside the template itself and the error handling)! It's just a single call of the function std::copy()! Just google for "token iterator" and you'll get an idea of what I mean; a sketch follows below. So this appears to me to be even more "elegant" than just reading from a file...
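For illustration, here is a minimal sketch of that single std::copy() call, using istream_iterator<string> as a simple whitespace token iterator and a std::set, which takes care of the sorting and de-duplication:
#include <algorithm>
#include <fstream>
#include <iostream>
#include <iterator>
#include <set>
#include <string>

int main()
{
    std::ifstream in("text.txt");
    std::set<std::string> tokens;   // a set keeps the tokens sorted and unique

    // Read, tokenize on whitespace, insert, sort and de-duplicate in one call:
    std::copy(std::istream_iterator<std::string>(in),
              std::istream_iterator<std::string>(),
              std::inserter(tokens, tokens.end()));

    for (const std::string& t : tokens)
        std::cout << t << '\n';
}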
I like Milan's char* way, but with std::string.
#include <iostream>
#include <string>
#include <fstream>
#include <cstdlib>
using namespace std;

string& getfile(const string& filename, string& buffer) {
    ifstream in(filename.c_str(), ios_base::binary | ios_base::ate);
    in.exceptions(ios_base::badbit | ios_base::failbit | ios_base::eofbit);
    buffer.resize(in.tellg());
    in.seekg(0, ios_base::beg);
    in.read(&buffer[0], buffer.size());
    return buffer;
}

int main(int argc, char* argv[]) {
    if (argc != 2) {
        cerr << "Usage: this_executable file_to_read\n";
        return EXIT_FAILURE;
    }
    string buffer;
    cout << getfile(argv[1], buffer).size() << "\n";
}
(with or without the ios_base::binary, depending on whether you want newlines translated or not. You could also change getfile to just return a string so that you don't have to pass a buffer string in. Then, test to see whether the compiler optimizes the copy away when returning.)
However, this might look a little better (and be a lot slower):
#include <iostream>
#include <string>
#include <fstream>
#include <cstdlib>
using namespace std;

string getfile(const string& filename) {
    ifstream in(filename.c_str(), ios_base::binary);
    in.exceptions(ios_base::badbit | ios_base::failbit | ios_base::eofbit);
    return string(istreambuf_iterator<char>(in), istreambuf_iterator<char>());
}

int main(int argc, char* argv[]) {
    if (argc != 2) {
        cerr << "Usage: this_executable file_to_read\n";
        return EXIT_FAILURE;
    }
    cout << getfile(argv[1]).size() << "\n";
}