I have a binary file created by some Fortran code. I want to write a C++ program to read this binary file and then spit it out through std::cout. Here is my code so far:
#include<fstream>
#include<iostream>
using namespace std;
int main(){
ifstream file("tofu.txt", ios::binary | ios::in | ios::ate);
ifstream::pos_type size;
if(file.is_open()){
size = file.tellg();
cout << "size = " << size << '\n';
file.seekg(0);
char bar[500];
file.read((char*) (&bar), size);
file.close();
string foo(bar);
cout << "foo = " << foo << '\n';
}
else cout << "Unable to open file";
return 0;
}
However, when compiled and run, the code prints nothing for foo:
size = 250
foo =
Could someone tell me where I'm going wrong in the code? Thanks!
You forgot to null-terminate your char array, so constructing a std::string from it is undefined behaviour. Fix it like this:
char bar[500];
assert(size < 500); // needs <cassert>
file.read(bar, size);
bar[size] = '\0';
(Make sure you check that size isn't larger than you have space for, too!)
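Alternatively, you can skip the fixed-size buffer and the manual terminator entirely and read straight into a std::string. A minimal sketch, reusing the file name from the question:
#include <fstream>
#include <iostream>
#include <string>
using namespace std;
int main() {
    ifstream file("tofu.txt", ios::binary | ios::in | ios::ate);
    if (!file.is_open()) {
        cout << "Unable to open file";
        return 1;
    }
    ifstream::pos_type size = file.tellg();       // ios::ate put us at the end, so this is the file size
    file.seekg(0);
    string foo(static_cast<size_t>(size), '\0');  // buffer sized exactly to the file
    file.read(&foo[0], size);                     // read directly into the string's storage
    cout << "foo = " << foo << '\n';
    return 0;
}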
I am trying to get the size of a file with C++. The code is like this:
ifstream fin(filepth, ios::in | ios::binary | ios::ate);
if (!fin.is_open()) {
cout << "cannot open file:" << filepth << endl;
}
int len = fin.tellg();
fin.seekg(0, ios::beg);
fin.clear();
cout << "file length is: " << len << endl;
cout << "file length is: " << fs::file_size(fs::path(filepth)) << endl;
It turns out that the ios::ate method gives the wrong result. What did I miss, and how can I get the correct result?
I found the reason for the problem. My file is about 9 gigabytes, which cannot be represented in a 32-bit int variable. I used int64_t instead and the problem no longer exists.
Below is a link with more details; note that the return type of file_size may need a cast:
https://www.codingame.com/playgrounds/5659/c17-filesystem
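A minimal sketch of both approaches using 64-bit types (big.dat is a hypothetical path standing in for filepth):
#include <cstdint>
#include <filesystem>
#include <fstream>
#include <iostream>
using namespace std;
namespace fs = std::filesystem;
int main() {
    const char* filepth = "big.dat";   // hypothetical path
    ifstream fin(filepth, ios::in | ios::binary | ios::ate);
    if (!fin.is_open()) {
        cout << "cannot open file: " << filepth << endl;
        return 1;
    }
    int64_t len = fin.tellg();   // store the position in a 64-bit type, not int
    fin.seekg(0, ios::beg);
    cout << "file length is: " << len << endl;
    // std::filesystem::file_size returns uintmax_t, which is already 64-bit
    cout << "file length is: " << fs::file_size(fs::path(filepth)) << endl;
    return 0;
}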
I am trying to write a string to a binary file and read it back, but I can't understand why sizeof(t) returns 4.
//write to file
ofstream f1("example.bin", ios::binary | ios::out);
string s = "Valentin";
char* t = new char[s.length()+1];
strcpy(t, s.c_str());
cout << s.length()+1 << " " << sizeof(t) << endl; // prints 9 4
for(int i = 0; i < sizeof(t); i++)
{
//t[i] += 100;
}
f1.write(t, sizeof(t));
f1.close();
// read from file
ifstream f2("example.bin", ios::binary | ios::in);
while(f2)
{
int8_t x;
f2.read((char*)&x, 1);
//x -= 100;
cout << x; //print Valee
}
cout << endl;
f2.close();
No matter what size I allocate for the char* array t, the code always prints "4" as its size. What must I do to write more than 4 bytes of data?
Here's how to do the writing code the easy way:
//write to file
ofstream f1("example.bin", ios::binary | ios::out);
string s = "Valentin";
f1.write(s.c_str(), s.size() + 1);
f1.close();
EDIT: the OP actually wants something like this:
#include <algorithm> // for transform
string s = "Valentin";
// copy s to t and add 100 to all bytes in t
string t = s;
transform(t.begin(), t.end(), t.begin(), [](char c) { return c + 100; });
// write to file
ofstream f1("example.bin", ios::binary | ios::out);
f1.write(t.c_str(), t.size() + 1);
f1.close();
sizeof(char*) gives the size used by a pointer to char. It's 4 on your platform.
If you need the size of the string, you should use strlen. Or, simply, s.length().
char *t is a pointer, not an array, so sizeof will return the size of a pointer on your machine, which is apparently 4 bytes.
The correct way to determine the length of a C-style string is to include <cstring> and use std::strlen.
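A short sketch of the difference, reusing the names from the question:
#include <cstring>
#include <iostream>
#include <string>
using namespace std;
int main() {
    string s = "Valentin";
    char* t = new char[s.length() + 1];
    strcpy(t, s.c_str());
    cout << sizeof(t) << endl;   // size of the pointer itself (4 or 8), not the data
    cout << strlen(t) << endl;   // 8: length of the C string, excluding the '\0'
    cout << s.length() << endl;  // 8: same, straight from the std::string
    delete[] t;
    return 0;
}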
I want to read a binary file of integers and print the number of occurrences of 3 in the file. I managed to write a program to open and read a binary file.
Here are the two problems I am facing:
If I try to print the file contents on my terminal, the execution continues forever and the loop never ends.
I have no idea how to filter out the 3's.
Here is my code:
#include <iostream>
#include <fstream>
using namespace std;
int main () {
streampos size;
char * memblock;
ifstream file ("threesData.bin", ios::in|ios::binary|ios::ate);
if (file.is_open())
{
size = file.tellg();
memblock = new char [size];
file.seekg (0, ios::beg);
file.read (memblock, size);
file.close();
cout << "the entire file content is in memory";
for (int i = 0; i < size; i += sizeof(int))
{
cout << *(int*)&memblock[i] << endl;
}
delete[] memblock;
}
else
cout << "Unable to open file";
return 0;
}
Here is a way to implement your requirements:
int main()
{
unsigned int quantity = 0U;
ifstream file ("threesData.bin", ios::in|ios::binary|ios::ate);
uint8_t byte;
while (file >> byte)
{
if (byte == 3U)
{
++ quantity;
}
}
cout << "The quantity of 3s is: " << quantity << endl;
return 0;
}
The first step should always be to get a simple version working. Only optimize if necessary.
Allocating memory for a file and reading the entire file is an optimization. For example, your platform may not have enough available memory to read the entire file into memory before processing.
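If the file really does hold native 32-bit integers and you want to count whole values equal to 3 rather than single bytes, here is a sketch under that assumption (same endianness and integer size as the program that wrote the file):
#include <cstdint>
#include <fstream>
#include <iostream>
using namespace std;
int main() {
    unsigned int quantity = 0U;
    ifstream file("threesData.bin", ios::in | ios::binary);
    int32_t value;
    // read one 4-byte integer at a time until the file runs out
    while (file.read(reinterpret_cast<char*>(&value), sizeof value)) {
        if (value == 3) {
            ++quantity;
        }
    }
    cout << "The quantity of 3s is: " << quantity << endl;
    return 0;
}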
This program takes an input, writes it to a file character by character, counts the number of characters entered, and then at the end copies it into an array of characters. The program works just fine until it reaches the snippet file.getline(arr, inputLength);. That call changes the .txt file data and returns only the first character of the original input.
Any ideas?
#include <iostream>
#include <fstream>
using namespace std;
int getLine(char *& arr);
int main() {
char * arr = NULL;
cout << "Write something: ";
getLine(arr);
return 0;
}
int getLine(char *& arr) {
fstream file("temp.txt");
char input = '\0'; //initialize
int inputLength = 0; //initialize
if (file.is_open()) {
while (input != '\n') { //while the end of this line is not reached
input = cin.get(); //get each single character
file << input; //write it on a .txt file
inputLength++; //count the number of characters entered
}
arr = new char[inputLength]; //dynamically allocate memory for this array
file.getline(arr, inputLength); //HERE IS THE PROBLEM!!! ***
cout << "Count : " << inputLength << endl; //test counter
cout << "Array : " << arr << endl; //test line copy
file.close();
return 1;
}
return 0;
}
I see at least two problems with this code.
1) The std::fstream constructor, by default, opens an existing file for both reading and writing. It will not create a new one. If temp.txt does not exist, is_open() will fail. This code should pass an appropriate value for the second parameter to std::fstream's constructor specifying that a new file should be created, or an existing one truncated.
Related to this: if the file already exists, running this code will not truncate it, so contents left over from a previous run of the program will produce obvious unexpected results.
2) The intent of this code appears to be to read back the contents of temp.txt that were previously written to it. To do that correctly, after writing and before reading, it is necessary to seek back to the beginning of the file. That step is missing.
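A minimal sketch of getLine with both fixes applied: an open mode that creates or truncates the file, and a seek back to the beginning before reading.
#include <fstream>
#include <iostream>
using namespace std;
int getLine(char*& arr) {
    // ios::trunc creates the file if needed and discards any old contents
    fstream file("temp.txt", ios::in | ios::out | ios::trunc);
    if (!file.is_open())
        return 0;
    char input = '\0';
    int inputLength = 0;
    while (input != '\n') {
        input = cin.get();      // read each character typed
        file << input;          // and write it to the file
        inputLength++;
    }
    file.flush();               // make sure everything written reaches the file
    file.seekg(0, ios::beg);    // seek back to the start before reading
    arr = new char[inputLength];
    file.getline(arr, inputLength);
    cout << "Count : " << inputLength << endl;
    cout << "Array : " << arr << endl;
    return 1;
}
int main() {
    char* arr = NULL;
    cout << "Write something: ";
    getLine(arr);
    delete[] arr;
    return 0;
}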
There is no need for dynamic allocation here; it is easy to get the standard library functions confused with mixed arguments such as a C-string and a pointer to a C-string. I tested this code with the Visual Studio 2015 compiler and it works well. Make sure to include all of the needed headers:
#include <iostream>
#include <fstream>
#include<cstring>
#include<string>
using namespace std;
void getLine();
int main() {
cout << "Write something: ";
// no need to pass a pointer to a cstring
getLine();
system("pause");
return 0;
}
void getLine() {
char input[100]; // a C-string with a safe, constant number of elements
int inputLength; // will hold the length of the actual input
// cin.get requires a C-string as its first argument and a constant length as its second
cin.get(input, 100, '\n'); // read the whole line, up to but not including '\n'
// cast streamsize to int
inputLength = static_cast<int>(cin.gcount());
//testing input
cout << "Input: \n";
for (int i = 0; i < inputLength; i++)
{
cout << input[i];
}
cout << endl;
char arr[100];
strcpy_s(arr, input);
cout << "Count : " << inputLength << endl; //test counter
cout << "Array : " << endl; //test line copy
for (int i = 0; i < inputLength; i++)
{
cout << arr[i];
}
cout << endl;
// write cstring to a file
ofstream file;
file.open("temp.txt", ios::out);
if (file.is_open())
{
//write only what was entered in input
for (int i = 0; i < inputLength; i++)
file << arr[i];
file.close();
}
else cout << "Unable to open file";
}
My problem is that ifstream::read and fread on a FILE* don't seem to produce the same results.
I open a file and read its contents using ifstream open/read in ios::binary mode. Then I write this buffer out to a file, out1.
Next, I open the same file, read its contents using a FILE* and fread. Then I write this buffer out to another file, out2.
When I compare out1 to out2 they do not match. out2, which uses FILE*, seems to stop reading near the end.
More worrisome is that neither buffer matches the input file. The ifstream::read method seems to be modifying the end of line characters, even though I open the input file as ios::binary.
The fread method seems to be returning a value less than length (199) even though it's reading significantly more characters than that, as I can see the buffer that got read. This doesn't match the length determined by the seekg commands.
I'm quite confused and any help would be appreciated. Code is attached.
Thanks!
-Julian
ifstream read_file;
read_file.open("V:\\temp\\compressiontest\\out\\test_20224-5120_256x256.jpg", ios::binary);
read_file.seekg(0, ios::end);
unsigned long length = read_file.tellg();
cout << "Length: " << length << endl;
read_file.seekg(0, ios::beg);
unsigned char* buffer = new unsigned char[length];
unsigned char* buf = new unsigned char[length];
for(int i = 0; i < length; i++)
{
buffer[i] = 0;
buf[i] = 0;
}
if(read_file.is_open())
{
read_file.read((char*)buffer, length);
}
else
{
cout << "not open" << endl;
}
read_file.close();
FILE* read_file_1 = NULL;
read_file_1 = fopen("V:\\temp\\compressiontest\\out\\test_20224-5120_256x256.jpg", "r");
size_t read_len = fread(buf, 1, length, read_file_1);
fclose(read_file_1);
if(read_len != length)
cout << "read len != length" << " read_len: " << read_len << " length: " << length << endl;
int consistent = 0;
int inconsistent = 0;
for(int i = 0; i < length; i++)
{
if(buf[i] != buffer[i])
inconsistent++;
else
consistent++;
}
cout << "inconsistent:" << inconsistent << endl;
cout << "consistent:" << consistent << endl;
FILE* file1;
file1 = fopen("V:\\temp\\compressiontest\\out1.jpg", "w");
fwrite((void*) buffer, 1, length, file1);
fclose(file1);
FILE* file2;
file2 = fopen("V:\\temp\\compressiontest\\out2.jpg", "w");
fwrite((void*) buf, 1, length, file2);
fclose(file2);
return 0;
You're calling fopen() for reading with mode "r" instead of "rb", and for writing with mode "w" instead of "wb". On Windows those defaults mean you're both reading and writing with text translation (CR/LF conversion), not in binary mode.
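A sketch of the corrected calls, with the paths kept from the question; in binary mode no CR/LF translation happens, so read_len will match length:
// read in binary mode
FILE* read_file_1 = fopen("V:\\temp\\compressiontest\\out\\test_20224-5120_256x256.jpg", "rb");
// write in binary mode
FILE* file1 = fopen("V:\\temp\\compressiontest\\out1.jpg", "wb");
FILE* file2 = fopen("V:\\temp\\compressiontest\\out2.jpg", "wb");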