Segfault after fread() call - c++

I have the following code:
char*
Sender::PrepareData(char* filename, unsigned long long int bytesToTransfer)
{
    FILE* dataFile = fopen(filename, "rb");
    if (dataFile==NULL) {fputs ("File error",stderr); exit (1);}
    cout << "File Open: " << filename << endl;

    char* theData;
    size_t bytesRead = fread(&theData, 1, bytesToTransfer, dataFile);
    if (bytesRead != bytesToTransfer) {fputs ("Reading error",stderr); exit (3);}
    cout << "Data Read -- Num Bytes: " << bytesRead << endl;
    cout << "Data to Send: " << *theData << endl;
    return theData;
}
When this method gets hit, my output is:
File Open: t.bin
Data Read -- Num Bytes: 10
Segmentation fault (core dumped)
My t.bin file contains the following:
This is a test.
98172398172837129837
alsjdf89u32ijofiou2
TEST TEST...
!!## TESTING TEST!! ###(DLKAJ)
When I run through gdb, the segfault output is:
File Open: t.bin
Data Read -- Num Bytes: 10

Program received signal SIGSEGV, Segmentation fault.
0x00000000004015e2 in Sender::PrepareData (this=0x603010, filename=0x7fffffffe363 "t.bin", bytesToTransfer=10)
    at sender.cpp:98
98          cout << "Data to Send: " << *theData << endl;
Can someone tell me what I'm doing wrong?

You need a buffer; theData is just a pointer.
Something like
char theData[1000];
size_t bytesRead = fread(theData, 1, bytesToTransfer, dataFile);
might work, depending on the maximum value of bytesToTransfer.
If you are sure you are reading a string, you might also need to terminate theData before writing it to cout:
theData[bytesRead] = '\0';
You'll need to allocate your buffer on the heap if you want to return it.
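For example, a rough sketch of that (keeping the original signature; the caller then owns the buffer and has to delete[] it):
char*
Sender::PrepareData(char* filename, unsigned long long int bytesToTransfer)
{
    FILE* dataFile = fopen(filename, "rb");
    if (dataFile==NULL) {fputs ("File error",stderr); exit (1);}

    // heap buffer, +1 so it can be NUL-terminated before printing
    char* theData = new char[bytesToTransfer + 1];
    size_t bytesRead = fread(theData, 1, bytesToTransfer, dataFile);
    if (bytesRead != bytesToTransfer) {fputs ("Reading error",stderr); exit (3);}
    theData[bytesRead] = '\0';

    fclose(dataFile);
    return theData; // caller must delete[] this
}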
It'd be a lot easier to do something like
std::vector<char>
PrepareData(char* filename, unsigned long long int bytesToTransfer)
{
    std::ifstream file(filename, file.binary);
    if (!file) {
        std::cerr << "File error";
        exit(1);
    }
    std::cout << "File Open: " << filename << '\n';

    std::vector<char> data(bytesToTransfer + 1);
    file.read(data.data(), bytesToTransfer);
    data.back() = '\0';
    if (!file) {
        std::cerr << "Reading error";
        exit(3);
    }
    std::cout << "Data Read -- Num Bytes: " << bytesToTransfer << '\n';
    std::cout << "Data to Send: " << data.data() << '\n';
    return data;
}
Then again, if all you are doing is reading chars from a file, you should probably consider using strings.
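For instance, a minimal sketch of a std::string version (just a sketch, assuming the whole file fits in memory):
std::string
PrepareData(char* filename, unsigned long long int bytesToTransfer)
{
    std::ifstream file(filename, std::ios::binary);
    if (!file) { std::cerr << "File error"; exit(1); }

    std::string data(bytesToTransfer, '\0');
    file.read(&data[0], bytesToTransfer); // &data[0] is writable; data.data() only returns char* since C++17
    if (!file) { std::cerr << "Reading error"; exit(3); }
    return data;
}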

Related

c++ read binary data contained in a string variable

I have a function that uses zlib to unzip files from a zip file into memory and then stores the resulting file content in a string object. The problem is that the string now holds the binary representation of the files, and I don't know how to read that string in blocks the way I can with ifstream::read():
file.read((char*)result, sizeof(int));
It's necessary that I read the data block by block, but I haven't found a way to do that from a string variable.
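To make the goal concrete, the kind of access I'm after would look roughly like this sketch, wrapping the string in a std::istringstream (this is the approach I attempt in the edit below, so far without success; the archive and file names are just placeholders):
#include <sstream>
#include <string>

externalresourcemanager manager; // placeholder instance
std::string content = manager.getFileInsideZip("archive.zip", "file.bin");

// treat the string as a stream and read it block by block,
// the same way I would with ifstream::read()
std::istringstream in(content);
int firstValue = 0;
in.read(reinterpret_cast<char*>(&firstValue), sizeof(int));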
This is the function that returns the file content as a string (I can change the function if there's a better approach):
string externalresourcemanager::getFileInsideZip(string zipFile, string fileInZip) {
    int err = UNZ_OK; // error status
    uInt size_buf = WRITEBUFFERSIZE; // byte size of buffer to store raw csv data
    void* buf; // the buffer
    string sout; // output strings
    char filename_inzip[256]; // for unzGetCurrentFileInfo
    unz_file_info file_info; // for unzGetCurrentFileInfo

    unzFile uf = unzOpen(zipFile.c_str()); // open zipfile stream
    if (uf==NULL) {
        cerr << "Cannot open " << zipFile << endl;
        return sout;
    } // file is open

    if ( unzLocateFile(uf,fileInZip.c_str(),1) ) { // try to locate file inside zip
        // second argument of unzLocateFile: 1 = case sensitive, 0 = case-insensitive
        cerr << "File " << fileInZip << " not found in " << zipFile << endl;
        return sout;
    } // file inside zip found

    if (unzGetCurrentFileInfo(uf,&file_info,filename_inzip,sizeof(filename_inzip),NULL,0,NULL,0)) {
        cerr << "Error " << err << " with zipfile " << zipFile << " in unzGetCurrentFileInfo." << endl;
        return sout;
    } // obtained the necessary details about file inside zip

    buf = (void*)malloc(size_buf); // setup buffer
    if (buf==NULL) {
        cerr << "Error allocating memory for read buffer" << endl;
        return sout;
    } // buffer ready

    err = unzOpenCurrentFilePassword(uf,NULL); // Open the file inside the zip (password = NULL)
    if (err!=UNZ_OK) {
        cerr << "Error " << err << " with zipfile " << zipFile << " in unzOpenCurrentFilePassword." << endl;
        return sout;
    } // file inside the zip is open

    // Copy contents of the file inside the zip to the buffer
    cout << "Extracting: " << filename_inzip << " from " << zipFile << endl;
    do {
        err = unzReadCurrentFile(uf,buf,size_buf);
        if (err<0) {
            cerr << "Error " << err << " with zipfile " << zipFile << " in unzReadCurrentFile" << endl;
            sout = ""; // empty output string
            break;
        }
        // copy the buffer to a string
        if (err>0)
            for (int i = 0; i < (int) err; i++)
                sout.push_back( *(((char*)buf)+i) );
    } while (err>0);

    err = unzCloseCurrentFile (uf); // close the zipfile
    if (err!=UNZ_OK) {
        cerr << "Error " << err << " with zipfile " << zipFile << " in unzCloseCurrentFile" << endl;
        sout = ""; // empty output string
    }

    free(buf); // free up buffer memory
    return sout;
}
Thank you!
EDIT:
I'm trying your solution (john) but I get an empty result. Here is some example code to explain the problem:
int store = 1024;
string result = "";
ofstream file("test.txt", std::ios::binary);
//save a few data
file.write(reinterpret_cast<const char*>(&store), sizeof(int));
file.write(reinterpret_cast<const char*>(&store), sizeof(int));
file.close();
//remember that I receive the binary data as a string from the getFileInsideZip() function (here I am simulating the problem and storing the binary data in a string only for explanation purposes)
ifstream newFile("test.txt", std::ios::binary);
newFile.seekg(0, std::ios::end);
int length = newFile.tellg();
newFile.seekg(0, std::ios::beg);
char* begin = &*result.begin();
//here I have a binary file in a string variable
newFile.read(begin, length);
newFile.close();
cout << "result:" << result << "\n";
//now I'm trying to get the first int from this binary data
int stored = 0;
std::istringstream bin(result);
bin.read((char*)&stored, sizeof(int));
cout << "stored: " << stored << "\n";
//but the output is:
//result -> empty (initial value)
//stored: 0 -> (initial value)
Any suggestions?

How to read from IStream to a char array and write it to another IStream?

I'm trying to send an image through a socket connection, but I have a problem with the following code:
//stream to char array
STATSTG myStreamStats;
ULONG bytesSaved;
myStream->Stat(&myStreamStats, 0);
char* streamData = new char[myStreamStats.cbSize.QuadPart];
if(myStream->Read(streamData, myStreamStats.cbSize.QuadPart, &bytesSaved) == S_OK)
    cout << "OK!" << endl;
else
    cout << "Not OK!" << endl;
//char array to stream
if(myStreamR->Write(streamData, myStreamStats.cbSize.QuadPart, &bytesSaved) == S_OK)
    cout << "OK!" << endl;
else
    cout << "Not OK!" << endl;
//saving the image to a file
myImage = Image::FromStream(myStreamR);
myImage->Save(lpszFilename, &imageCLSID, NULL);
The program compiles and runs, but I don't get my image. I do get it if I use the original "myStream" but not with "myStreamR" which is constructed from the char array read from the original stream.
The output is two "OK!"s, which means all the bytes are copied into the array and all of them are pasted into the new stream. However, I checked bytesSaved and discovered that after Read() its value is 0 (not good), while after Write() it is equal to the stream size I gave. Then why on Earth is Read() giving me S_OK if nothing is read?
You are not seeking myStreamR back to the beginning after writing data to it. Image::FromStream() starts reading at the stream's current position, so if you don't seek back there will be no data for it to read.
Try this:
STATSTG myStreamStats = {0};
if (FAILED(myStream->Stat(&myStreamStats, 0)))
    cout << "Stat failed!" << endl;
else
{
    char* streamData = new char[myStreamStats.cbSize.QuadPart];
    ULONG bytesSaved = 0;
    if (FAILED(myStream->Read(streamData, myStreamStats.cbSize.QuadPart, &bytesSaved)))
        cout << "Read failed!" << endl;
    else
    {
        //char array to stream
        if (FAILED(myStreamR->Write(streamData, bytesSaved, &bytesSaved)))
            cout << "Write failed!" << endl;
        else
        {
            LARGE_INTEGER li;
            li.QuadPart = 0;
            if (FAILED(myStreamR->Seek(li, STREAM_SEEK_SET, NULL)))
                cout << "Seek failed!" << endl;
            else
            {
                //saving the image to a file
                myImage = Image::FromStream(myStreamR);
                if (myImage->GetLastStatus() != Ok)
                    cout << "FromStream failed!" << endl;
                else
                {
                    if (myImage->Save(lpszFilename, &imageCLSID, NULL) != Ok)
                        cout << "Save failed!" << endl;
                    else
                        cout << "OK!" << endl;
                }
            }
        }
    }
    delete[] streamData; // release the temporary buffer once the image has been saved
}

copying large files in c/c++ under FreeBSD freezes system

This code seems to work under Windows (with unexpected results) and Ubuntu, but when I run it under FreeBSD 9.0 amd64 it causes the system to freeze. I get error messages like this:
ahcich0: Timeout on slot 28 port 0
Does anybody know what the problem could be?
Thanks.
#include <cmath>
#include <cstdlib>
#include <sys/time.h>
#include <iostream>
#include <fstream>
#include <string>

using namespace std;

int main(int argc, char *argv[])
{
    const string FILENAME = "testfile";
    const string COPYNAME = "copy";
    const int FILES = 5;
    const int SIZE_MULTIPLIER = 6;
    const int BUFFER_SIZE = pow(2.0, 16);
    time_t times[2][FILES];
    srand (time(NULL));

    // create test files
    for (int i = 1; i < FILES + 1; i++){
        ofstream os;
        string filename(FILENAME);
        filename += (char)i + 48;
        os.open(filename.c_str(), ios::binary);
        if (os.is_open()){
            cout << "Writing file " << i << " of " << FILES;
            long filesize = pow(2.0, i * SIZE_MULTIPLIER);
            cout << " (" << filesize << " bytes)" << endl;
            while(filesize--){
                os << (char)(rand() % 256);
            }
            cout << os.tellp() << " bytes written.\n";
            os.close();
        }else{
            cerr << "Could not create file " << filename;
            cerr << endl;
        }
    }

    // copy the files
    timeval tv;
    time_t start;
    char buffer[BUFFER_SIZE];
    char ci;
    for (int i = 0; i < FILES; i++){
        ci = (char)i + 49;
        string filename(FILENAME);
        filename += ci;
        string copyname("c");
        copyname += COPYNAME;
        copyname += ci;
        cout << "Copying file " << filename.c_str() << endl;

        cout << "the c way: ";
        cout.flush();
        start = time(NULL);
        FILE *pFile = fopen(filename.c_str(), "rb");
        FILE *pCopy = fopen(copyname.c_str(), "wb");
        if (!(pFile == NULL || pCopy == NULL)){
            do{
                int bytesRead = fread(buffer, 1, BUFFER_SIZE, pFile);
                fwrite(buffer, 1, bytesRead, pCopy);
            }while(!feof(pFile));
            fclose(pFile);
            fclose(pCopy);
            cout << " Done.\n";
        }else{
            cerr << "Could not open either " << filename;
            cerr << " or " << copyname << endl;
        }
        times[0][i] = time(NULL) - start;
        remove(copyname.c_str());

        copyname = "cpp";
        copyname += COPYNAME;
        copyname += ci;
        cout << "the c++ way: ";
        cout.flush();
        start = time(NULL);
        ifstream in;
        in.open(filename.c_str(), ios::binary);
        in.rdbuf()->pubsetbuf(buffer, BUFFER_SIZE);
        ofstream out;
        out.open(copyname.c_str(), ios::binary);
        char copyBuffer[BUFFER_SIZE];
        out.rdbuf()->pubsetbuf(copyBuffer, BUFFER_SIZE);
        if (in.is_open() && out.is_open()){
            out << in.rdbuf();
            in.close();
            out.close();
            cout << " Done.\n";
        }else{
            cerr << "Could not open either " << filename;
            cerr << " or " << copyname << endl;
        }
        times[1][i] = time(NULL) - start;
        remove(copyname.c_str());
    }

    cout << "Summary:\n";
    cout << "\tc\tc++\n";
    for (int i = 0; i < FILES; i++){
        ci = (char)i + 49;
        cout << "copy" << ci << "\t" << times[0][i];
        cout << "\t" << times[1][i] << endl;
    }
    return 0;
}
After changing FILES to 4 (because it takes very long otherwise), your program ran just fine here:
Writing file 1 of 4 (64 bytes)
64 bytes written.
Writing file 2 of 4 (4096 bytes)
4096 bytes written.
Writing file 3 of 4 (262144 bytes)
262144 bytes written.
Writing file 4 of 4 (16777216 bytes)
16777216 bytes written.
Copying file testfile1
the c way: Done.
the c++ way: Done.
Copying file testfile2
the c way: Done.
the c++ way: Done.
Copying file testfile3
the c way: Done.
the c++ way: Done.
Copying file testfile4
the c way: Done.
the c++ way: Done.
Summary:
c c++
copy1 0 0
copy2 0 0
copy3 0 0
copy4 0 0
(FreeBSD 9.0-RELEASE-p3 amd64, compiled with clang++)
There could've been a bug in the ahci driver in 9.0 that showed up under heavy load. Or it could've been a buggy controller that was failing under the same load -- and not failing under other OSes because they weren't taxing it as much.
Is this still a problem with FreeBSD 9.2?
As for your program, you ought to check not just for feof(), but also for ferror() in your read/write loop. Further, in my opinion, such read/write loops are a thing of the past. These days, when size_t and off_t are the same width (64-bit platforms), you ought to simply mmap() your source file and fwrite() it into the destination in one go. Look, ma, no loop!
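A rough sketch of what I mean (minimal error handling; it assumes a non-empty source file that fits into the address space, which is exactly what 64-bit platforms give you):
#include <cstdio>
#include <fcntl.h>
#include <sys/mman.h>
#include <sys/stat.h>
#include <unistd.h>

bool copy_with_mmap(const char* src, const char* dst)
{
    int fd = open(src, O_RDONLY);
    if (fd == -1) return false;

    struct stat st;
    if (fstat(fd, &st) == -1) { close(fd); return false; }

    // map the whole source file read-only
    void* data = mmap(NULL, st.st_size, PROT_READ, MAP_SHARED, fd, 0);
    if (data == MAP_FAILED) { close(fd); return false; }

    // write it out with a single fwrite() -- no read/write loop
    FILE* out = fopen(dst, "wb");
    bool ok = out != NULL
           && fwrite(data, 1, st.st_size, out) == (size_t)st.st_size;

    if (out) fclose(out);
    munmap(data, st.st_size);
    close(fd);
    return ok;
}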

fseek not reading all the data?

I am making a C++ installer, and I have appended both the file to extract and an 8-byte file size for that file to the executable. My program exits on a file read error; what's going wrong? Note that I don't have any knowledge of C file handling apart from what I've learned today. I am writing the file test_before.tar.gz, which is 161 bytes; the executable is 12335 bytes long, and the file-size field is 8 bytes long, containing 0000161. What's wrong? Ask for more info if needed.
#include <iostream>
#include <cstdio>
#include <cstdlib>
#include <cstring>

using namespace std;

int main(int argc, char* argv[])
{
    cout << "Opening the executable as read-only!" << endl;
    FILE *exeFile; // The executable file pointer
    FILE *outFile; // The file to write pointer

    // Check whether a file name was supplied
    if(argc < 2)
    {
        cout << "Please enter the file to write!" << endl;
        return 1;
    }

    // Open the executable as read-only
    if((exeFile = fopen(argv[0], "rb")) == 0)
    {
        cout << "Error opening the executable!" << endl;
        return 1;
    }

    cout << "Getting the executables size!" << endl;
    // Get the files size
    fseek(exeFile, 0, SEEK_END);
    int size = ftell(exeFile);

    cout << "Reading ofset!" << endl;
    // Read the ofset bytes contained in the last 7-bytes
    char filesize_char[9];
    fseek(exeFile, -8, SEEK_END);
    fgets(filesize_char, 9, exeFile);

    // Convert
    int filesize = atoi(filesize_char);
    int ofset = (size - filesize) - 8;
    cout << "The ofset size is " << ofset << " bytes!" << endl;
    cout << "The file size is " << filesize << " bytes!" << endl;

    cout << "Reading the file to extract!" << endl;
    // Create the variable to contain the file and goto the ofset
    char* contents = new char[filesize + 1];
    fseek(exeFile, ofset, SEEK_SET);

    // Read the file to extract
    if(fread(contents, sizeof(char), filesize + 1, exeFile) != sizeof(contents))
    {
        // Error has occured
        if(feof(exeFile)) {
            cout << "Premature end of file!" << endl;
            // Delete variables so they dont "leak"
            fclose(exeFile);
            delete[] contents;
            return 1;
        } else {
            cout << "File read error!" << endl;
            // Delete variables so they dont "leak"
            fclose(exeFile);
            delete[] contents;
            return 1;
        }
    }

    cout << "Writing the file to " << argv[1] << "!" << endl;
    // Write the file to extract
    if((outFile = fopen(argv[1], "wb")) == 0)
    {
        cout << "Error opening the file to write!" << endl;
        // Delete variables so they dont "leak"
        fclose(exeFile);
        fclose(outFile);
        delete[] contents;
        return 1;
    }
    fwrite(contents, 1, sizeof(contents), outFile);

    //delete variables
    fclose(exeFile);
    fclose(outFile);
    delete[] contents;
    return 0;
}
You don't need a read loop at all. After all, you have already allocated the memory that will contain all of your data. Move the file pointer to the start of the data with fseek(exeFile, ofset, SEEK_SET), then use fread to read it as a whole, and then use fwrite to write it into outFile.
You should open your exeFile and outFile with the "rb" and "wb" flags, otherwise your code would work reliably only with text data.
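Something along these lines (a sketch only, reusing the variables from the question; note that the counts passed to fread/fwrite should be filesize, not sizeof(contents), which is just the size of a pointer):
// read the appended payload in one go, then write it out in one go
fseek(exeFile, ofset, SEEK_SET);
size_t bytesRead = fread(contents, 1, filesize, exeFile);
if (bytesRead != (size_t)filesize)
{
    if (feof(exeFile))
        cout << "Premature end of file!" << endl;
    else
        cout << "File read error!" << endl;
    // clean up and return as in the original code
}
else
{
    fwrite(contents, 1, bytesRead, outFile);
}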

Why does read of /proc/cpuinfo seem to not advance file position?

I have the following code, which ends up reading '/proc/cpuinfo' forever as it keeps getting the same result on every read. Why doesn't the file position ever advance and reach EOF? It seems this special file has different semantics.
const int bufSize = 4096;
char buf[bufSize + 1];
const string cpuInfo = "/proc/cpuinfo";
int cpuFD = ::open(cpuInfo.c_str(), O_RDONLY);
if (cpuFD == -1) {
    logOutputStream << "Failed attempt to open '" << cpuInfo << "': "
                    << strerror(errno) << endl;
} else {
    assert(bufSize <= SSIZE_MAX);
    logOutputStream << "Contents of: '" << cpuInfo << "'.\n";
    for (int nRead = ::read(cpuFD, buf, bufSize); nRead != 0;) {
        if (nRead == -1) {
            logOutputStream << "Failed attempt to read '" << cpuInfo << "': "
                            << strerror(errno) << endl;
            break;
        } else {
            buf[nRead] = '\0';
            logOutputStream << buf;
        }
    }
    if (::close(cpuFD) == -1) {
        logOutputStream << "Failed attempt to close '" << cpuInfo << "': "
                        << strerror(errno) << endl;
    }
}
for (int nRead = ::read(cpuFD, buf, bufSize); nRead != 0;) {
is wrong. You're using read as an initializer, so read is only being called once, not once per loop. After that, you're just looping forever printing it out (because nothing is changing nRead).
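One way to fix it, as a sketch, is to make the read() call part of the loop itself so it runs on every pass:
// call read() on each iteration, not just once in the initializer
ssize_t nRead;
while ((nRead = ::read(cpuFD, buf, bufSize)) != 0) {
    if (nRead == -1) {
        logOutputStream << "Failed attempt to read '" << cpuInfo << "': "
                        << strerror(errno) << endl;
        break;
    }
    buf[nRead] = '\0';
    logOutputStream << buf;
}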
What happens if you try dumping the content into an actual text file with something like
cat /proc/cpuinfo > cpuinfo.txt
and then reading that file?
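For example (just a sketch; cpuinfo.txt here is the file produced by the cat command above):
// read the regular file produced by the redirect, line by line
std::ifstream in("cpuinfo.txt");
std::string line;
while (std::getline(in, line)) {
    logOutputStream << line << '\n';
}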