EXC_BAD_ACCESS using << on stringstream object - c++

I'm writing code that needs to write a large string into memory.
I used a stringstream object to do so, but something odd (to me) happens: even though the size of the underlying string buffer has not exceeded the maximum size of a string object, my program crashes with a BAD_ACCESS error.
I've created a test program like this:
#include <sstream>  // std::stringstream
#include <iostream> // std::cout

int main(int argc, const char * argv[]) {
    std::stringstream stream;
    std::string string;
    std::cout << "Max string size: " << string.max_size() << "\n";
    for (int i = 0; true; i++) {
        if (i >= 644245094) {
            stream.seekg(0, std::ios::end);
            std::stringstream::pos_type size = stream.tellg();
            stream.seekg(0, std::ios::beg);
            std::cout << "Size of stringstream: " << size << "\n";
        }
        stream << "hello";
    }
    return 0;
}
That if (i >= 644245094) inside the loop is only there to print the size of the stringstream buffer just before the program crashes. I used my debugger to find the number of the last iteration, then used it to print the size of the buffer just before the crash happens.
This is the output I get:
Max string size: 18446744073709551599
Size of stringstream: 3221225470
After this the program crashes.
I thought the cause might be that the program fills up my computer's RAM, but this program uses only ~6.01 GB, not enough to fill it. For the record, I have a MacBook Pro with 16 GB of RAM.
What could be the problem? Am I missing something about how the << operator works?
Thank you in advance!

The behaviour of a std::stringstream when it gets full and fails may not be consistent across all platforms.
I modified your code and ran it on Yocto 3.19.0-32 64-bit, with gcc 5.4.1. I did not get an exception thrown; rather, the stream set one of its failure mode bits.
The code I ran was:
#include <sstream>  // std::stringstream
#include <iostream> // std::cout

std::stringstream::pos_type get_size(std::stringstream& stream)
{
    stream.seekg(0, std::ios::end);
    std::stringstream::pos_type size = stream.tellg();
    stream.seekg(0, std::ios::beg);
    return size;
}

int main(int argc, const char * argv[])
{
    std::stringstream stream;
    std::string string;
    std::cout << "Max string size: " << string.max_size() << std::endl;

    std::stringstream::pos_type size;
    for (unsigned long i = 0; true; ++i)
    {
        size = get_size(stream);
        stream.write("x", 1);
        if (stream.fail())
        {
            std::cout << "Fail after " << i + 1 << " insertions" << std::endl;
            std::cout << "Size of stringstream just before fail: " << size << std::endl;
            break;
        }
    }
    size = get_size(stream);
    std::cout << "Size of stringstream just after fail: " << size << std::endl;
    return 0;
}
And I got the following output, which shows that my stringstream filled up and failed 56 bytes short of 8 GB:
Max string size: 4611686018427387897
Fail after 8589934536 insertions
Size of stringstream just before fail: 8589934535
Size of stringstream just after fail: -1
Can you not use a different container and pre-allocate the memory, instead of using such a large stringstream?
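If the final size is roughly known in advance, one option is a std::string with a reserve() call up front, which makes the large allocation explicit and fails with a catchable std::bad_alloc rather than crashing mid-append. A minimal sketch of that idea (the 8 GB figure is only an illustrative target, not a measured limit):

#include <iostream>
#include <new>    // std::bad_alloc
#include <string>

int main() {
    std::string buffer;
    try {
        // ask for all the capacity at once; throws if the allocator can't provide it
        buffer.reserve(8ULL * 1024 * 1024 * 1024);
    } catch (const std::bad_alloc&) {
        std::cerr << "Could not pre-allocate the buffer\n";
        return 1;
    }
    buffer.append("hello"); // appends now reuse the reserved capacity
    std::cout << "Capacity: " << buffer.capacity() << "\n";
    return 0;
}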

Related

Why does C++ posix_memalign give the wrong array size?

I have the following C++ code, which tries to read a binary file and print out the resulting 32-bit values as hex:
// hello.cpp file
#include <iostream>
#include <fstream>
#include <stdio.h>
#include <stdlib.h>
#include <sys/stat.h>

int main()
{
    int file_size; // font file size in bytes
    int i;

    std::cout << "Hello World\n";
    std::string binary_data_file("font.dat");

    struct stat statbuff;
    stat(binary_data_file.c_str(), &statbuff);
    file_size = statbuff.st_size;

    void *data_buffer;
    posix_memalign(&data_buffer, 4096, file_size);

    std::ifstream data_input_file(binary_data_file.c_str(), std::ios::in | std::ios::binary);
    data_input_file.read((char *) data_buffer, file_size);
    data_input_file.close();

    int * debug_buffer = (int *) data_buffer;
    for (int j = 0; j < 148481; j++) {
        std::cout << "Data size: " << std::dec << file_size << std::endl;
        std::cout << "Element: " << j << " Value: " << std::hex << *(debug_buffer + j) << std::endl;
    }
    return 0;
}
This code causes a segmentation fault when j == 148480:
Data size: 211200
Element: 148477 Value: 0
Data size: 211200
Element: 148478 Value: 0
Data size: 211200
Element: 148479 Value: 0
Data size: 211200
Segmentation fault (core dumped)
Why is this the case? Surely the buffer size should be equal to 211200, so j should be able to go up to 211200, right?
You allocated 211200 bytes, but you're reading through an int*, so each index covers sizeof(int) bytes, typically 4. Accessing 148481 elements means touching 148481 * 4 = 593,924 bytes, far past the end of the buffer (and past the end of the file content). A 211200-byte buffer holds only 211200 / 4 = 52800 ints, so j must stay below 52800.
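A minimal corrected loop, as a sketch using the question's file_size and debug_buffer variables and assuming a 4-byte int, derives the element count from the file size instead of hard-coding it:

// number of whole ints the buffer actually holds: 211200 / 4 == 52800
size_t num_elements = file_size / sizeof(int);
for (size_t j = 0; j < num_elements; j++) {
    std::cout << "Element: " << j << " Value: "
              << std::hex << debug_buffer[j] << std::endl;
}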

std::fstream read block of data from file and write data back to file until EOF

I'm reading blocks of data from a file, but not all at once (e.g. 3 bytes per read/write), and then writing the same 3 bytes back to the very same position in the file, looping until there are no more blocks to read.
In other words, I'm trying to rewrite the file with its very own contents.
However, there is a problem: the final output isn't the same as it was at the beginning.
The following sample code reads 3 bytes per iteration from a file "sample.txt", whose contents are simply:
0123456789
after reading the data and writing it back to the file, the contents are:
012345345345
As you can see, the data doesn't get rewritten correctly for some reason.
#include <fstream>
#include <iostream>
#include <cstring> // memset

using namespace std;

#define BLOCK_SIZE 3

int main()
{
    // open file
    fstream file;
    file.open("sample.txt", ios::binary | ios::out | ios::in);

    // determine size and number of blocks to read
    file.seekg(0, ios::end);
    streampos size = file.tellg();
    int blocks = size / BLOCK_SIZE;
    cout << "size:\t" << size << endl;
    if (size % BLOCK_SIZE != 0)
    {
        ++blocks;
    }
    cout << "blocks:\t" << blocks << endl;

    // return to beginning
    file.seekg(0, ios::beg);

    // we will read data here
    unsigned char* data = new unsigned char[BLOCK_SIZE];
    streampos pos;

    // read blocks of data and write data back
    for (int i = 0; i < blocks; ++i)
    {
        pos = file.tellg();
        cout << "before read:\t" << pos << endl;

        // read block
        file.read(reinterpret_cast<char*>(data), BLOCK_SIZE);
        cout << "after read:\t" << file.tellg() << endl;

        // write same block back to same position
        file.seekp(pos);
        cout << "before write:\t" << file.tellg() << endl;
        file.write(reinterpret_cast<char*>(data), BLOCK_SIZE);
        cout << "after write:\t" << file.tellg() << endl;

        // reset buffer
        memset(data, 0, BLOCK_SIZE);
    }

    file.close();
    delete[] data;
    cin.get();
    return 0;
}
Do you see what could be the reason for the bad overwrite?
EDIT:
Sorry, I can't see how the linked duplicate answers my question; I'm simply unable to apply the given answer to the code above.
Your code does not handle the EOF condition well, and it leaves the stream in a bad state after trying to read past the end of the file. On my system, this results in all further calls to the stream having no effect. I bet that isn't the case on your system (which I suspect is a bug in its iostream implementation). I redid your code to handle the EOF condition correctly, and also to be a lot cleaner in a few other ways:
#include <fstream>
#include <iostream>

using namespace std;

const int BLOCK_SIZE = 3;

int main()
{
    // open file
    fstream file;
    file.open("sample.txt", ios::binary | ios::out | ios::in);

    bool found_eof = false;

    // read blocks of data and write data back
    while (!found_eof)
    {
        unsigned char data[BLOCK_SIZE] = {0};
        char * const data_as_char = reinterpret_cast<char *>(data);

        streampos const pos = file.tellp();
        int count_to_write = BLOCK_SIZE;
        cout << "before read:\t" << file.tellg() << ' ' << pos << '\n';

        // read block
        if (!file.read(data_as_char, BLOCK_SIZE)) {
            found_eof = true;
            count_to_write = file.gcount();
            file.clear();
            cout << "Only " << count_to_write << " characters extracted.\n";
        }
        cout << "after read:\t" << file.tellg() << ' ' << file.tellp() << '\n';

        // write same block back to same position
        file.seekp(pos);
        cout << "before write:\t" << file.tellg() << ' ' << file.tellp() << '\n';
        file.write(data_as_char, count_to_write);
        cout << "after write:\t" << file.tellg() << ' ' << file.tellp() << '\n';

        file.seekp(file.tellp());
    }

    file.close();
    cin.get();
    return 0;
}
But this is not fundamentally different; both versions work for me just the same. I'm on Linux with g++.
From the linked possible dupe, I would also suggest adding this just before the closing } of your for loop, since a file stream (like C's stdio) requires a seek when switching between reading and writing:
file.seekp(file.tellp());
I've put that in my code in the appropriate place.

C++ How to create byte[] array from file (I don't mean reading file byte by byte)?

I have a problem I can neither solve on my own nor find an answer to anywhere. I have a file that contains this string:
01000000d08c9ddf0115d1118c7a00c04
I would like to read the file in such a way that I end up with what I would otherwise write manually like this:
char fromFile[] =
"\x01\x00\x00\x00\xd0\x8c\x9d\xdf\x011\x5d\x11\x18\xc7\xa0\x0c\x04";
I would really appreciate any help.
I want to do it in C++ (ideally VC++).
Thank you!
#include <iostream> // std::cout
#include <iomanip>  // std::setw, std::setfill
#include <sstream>  // std::stringstream
#include <vector>   // std::vector
#include <cstdint>  // uint8_t

int t194(void)
{
    // imagine you have n pairs of chars; for simplicity,
    // here n is 3 (you should recognize them)
    char pair1[] = "01"; // note:
    char pair2[] = "8c"; // initialize with 3 char c-style strings
    char pair3[] = "c7"; //
    {
        // let us put these into a ram based stream, with spaces
        std::stringstream ss;
        ss << pair1 << " " << pair2 << " " << pair3;

        // each pair can now be extracted into
        // pre-declared int vars
        int i1 = 0;
        int i2 = 0;
        int i3 = 0;

        // use formatted extractor to convert
        ss >> i1 >> i2 >> i3;

        // show what happened (for debug only)
        std::cout << "Confirm1:" << std::endl;
        std::cout << "i1: " << i1 << std::endl;
        std::cout << "i2: " << i2 << std::endl;
        std::cout << "i3: " << i3 << std::endl << std::endl;
        // output is:
        // Confirm1:
        // i1: 1
        // i2: 8
        // i3: 0
        // Shucks, not correct.
        // We know the default radix is base 10.
        // I hope you can see that the input radix is wrong,
        // because c is not a decimal digit;
        // the i2 and i3 conversions stop before the 'c'
    }

    // pre-declare
    int i1 = 0;
    int i2 = 0;
    int i3 = 0;
    {
        // so we try again, with radix info added
        std::stringstream ss;
        ss << pair1 << " " << pair2 << " " << pair3;

        // strings are already in hex, so we use them as is
        ss >> std::hex // change radix to 16
           >> i1 >> i2 >> i3;

        // now show what happened
        std::cout << "Confirm2:" << std::endl;
        std::cout << "i1: " << i1 << std::endl;
        std::cout << "i2: " << i2 << std::endl;
        std::cout << "i3: " << i3 << std::endl << std::endl;
        // output now:
        // i1: 1
        // i2: 140
        // i3: 199
        // not what you expected? Though correct,
        // now we can see we have the wrong radix for output

        // add output radix to cout stream
        std::cout << std::hex // add radix info here!
                  << "i1: " << i1 << std::endl
                  // Note: only need to do once for std::cout
                  << "i2: " << i2 << std::endl
                  << "i3: " << i3 << std::endl << std::endl
                  << std::dec;
        // output now looks correct, and easily comparable to input
        // i1: 1
        // i2: 8c
        // i3: c7

        // So: What next?
        // read the entire string of hex input into a single string
        // separate this into pairs of chars (perhaps using
        //   string::substr())
        // put space separated pairs into stringstream ss
        // extract hex values until ss.eof()
        // probably should add error checks
        // and, of course, figure out how to use a loop for these steps
        //
        // alternative to consider:
        // read 1 char at a time, build a pairing, convert, repeat
    }

    //
    // Eventually, you should get far enough to discover that the
    // extracts I have done are integers, but you want to pack them
    // into an array of binary bytes.
    //
    // You can go back, and recode to extract bytes (either
    // unsigned char or uint8_t), which you might find interesting.
    //
    // Or ... because your input is hex, and the largest 2 char
    // value will be 0xff, and this fits into a single byte, you
    // can simply static_cast them (I use unsigned char)
    unsigned char bin[] = {static_cast<unsigned char>(i1),
                           static_cast<unsigned char>(i2),
                           static_cast<unsigned char>(i3)};

    // Now confirm by casting these back to ints to cout
    std::cout << "Confirm4: "
              << std::hex << std::setw(2) << std::setfill('0')
              << static_cast<int>(bin[0]) << " "
              << static_cast<int>(bin[1]) << " "
              << static_cast<int>(bin[2]) << std::endl;

    // you also might consider a vector (and I prefer uint8_t)
    // because push_back operations do a lot of hidden work for you
    std::vector<uint8_t> bytes;
    bytes.push_back(static_cast<uint8_t>(i1));
    bytes.push_back(static_cast<uint8_t>(i2));
    bytes.push_back(static_cast<uint8_t>(i3));

    // confirm
    std::cout << "\nConfirm5: ";
    for (size_t i = 0; i < bytes.size(); ++i)
        std::cout << std::hex << std::setw(2) << std::setfill(' ')
                  << static_cast<int>(bytes[i]) << " ";
    std::cout << std::endl;
Note: The cout (or ss) of bytes or chars can be confusing, not always giving the result you might expect. My background is embedded software, and I have surprisingly little experience making stream I/O of bytes work. Just saying this tends to bias my work when dealing with stream I/O.
    // other considerations:
    //
    // you might read 1 char at a time. this can simplify
    // your loop, possibly easier to debug
    // ... would you have to detect and remove eoln? i.e. '\n'
    // ... how would you handle a bad input,
    //     such as a non-hex char, or an odd char count in a line?
    //
    // I would probably prefer to use getline();
    // it will read until eoln(), and discard the '\n'.
    // then in each string, loop char by char, creating char pairs, etc.
    //
    // Converting a vector<uint8_t> to char bytes[] can be an easier
    // effort in some ways. A vector<> guarantees that all the values
    // contained are 'packed' back-to-back, and contiguous in
    // memory, just right for binary stream output.
    //
    // vector.size() tells how many chars have been pushed
    //
    // NOTE: the formatted 'insert' operator ('<<') can not
    // transfer binary data to a stream. You must use
    // stream::write() for binary output.
    //
    std::stringstream ssOut;

    // possible approach:
    // 1 step reinterpret_cast
    // - a binary block output requires "const char*"
    const char* myBuff = reinterpret_cast<const char*>(&bytes.front());
    ssOut.write(myBuff, bytes.size());
    // block write puts binary info into stream

    // confirm
    std::cout << "\nConfirm6: ";
    std::string s = ssOut.str(); // string with binary data
    for (size_t i = 0; i < s.size(); ++i)
    {
        // because binary data is _not_ signed data,
        // we need to 'cancel' the sign bit
        unsigned char ukar = static_cast<unsigned char>(s[i]);

        // because formatted output would interpret some chars
        // (like null, or '\n'), we cast to int
        // (the cast does not generate code)
        int intVal = static_cast<int>(ukar);

        // now the formatted 'insert' operator
        // converts and displays what we want
        std::cout << std::hex << std::setw(2) << std::setfill('0')
                  << intVal << " ";
    }
    std::cout << std::endl;

    return (0);
} // int t194(void)
The below snippet should be helpful!
#include <cstdlib>  // strtol
#include <fstream>
#include <iterator>
#include <string>
#include <vector>

std::ifstream input("filePath", std::ios::binary);

// slurp the whole file into a string of hex characters
std::string hex((std::istreambuf_iterator<char>(input)),
                (std::istreambuf_iterator<char>()));

// convert each pair of hex digits into one byte
std::vector<char> bytes;
for (unsigned int i = 0; i < hex.size(); i += 2) {
    std::string byteString = hex.substr(i, 2);
    char byte = (char) strtol(byteString.c_str(), NULL, 16);
    bytes.push_back(byte);
}
char* byteArr = bytes.data();
The way I understand your question, you want just the binary representation of the numbers, i.e. with the ASCII (or EBCDIC) part removed. Your output array will be half the length of the input array.
Here is some crude pseudo-code.
For each input char c:
    if (isdigit(c))        c -= '0';
    else if (isxdigit(c))  c = c - 'a' + 0xa; // need to check for isupper or islower
Then, depending on the index of c in your input array:
    if (!(index % 2)) output[outputindex]    = (c << 4) & 0xf0;
    else              output[outputindex++] |= c & 0x0f;
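A runnable sketch of that pseudo-code, assuming the input may be upper- or lowercase hex and silently skipping anything else (real code should treat stray characters and odd digit counts as errors):

#include <cctype>   // isdigit, isxdigit, tolower
#include <iostream>
#include <string>
#include <vector>

// Convert a string of hex digits to raw bytes; the output is half
// the length of the input.
std::vector<unsigned char> unhex(const std::string& in) {
    std::vector<unsigned char> out;
    int nibbles = 0;
    unsigned char current = 0;
    for (char raw : in) {
        unsigned char c = static_cast<unsigned char>(
            std::tolower(static_cast<unsigned char>(raw)));
        int value;
        if (std::isdigit(c))       value = c - '0';
        else if (std::isxdigit(c)) value = c - 'a' + 0xa;
        else                       continue; // skip non-hex chars
        if (nibbles % 2 == 0) current = (value << 4) & 0xf0;       // high nibble
        else                  out.push_back(current | (value & 0x0f)); // low nibble completes a byte
        ++nibbles;
    }
    return out;
}

int main() {
    for (unsigned char b : unhex("01008c"))
        std::cout << static_cast<int>(b) << " "; // prints: 1 0 140
    std::cout << "\n";
}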
Here is a function that takes a string as in your description, and outputs a string that has \x in front of each pair of digits.
#include <iostream>
#include <algorithm>
#include <string>

std::string convertHex(const std::string& str)
{
    std::string retVal;
    std::string hexPrefix = "\\x";
    if (!str.empty())
    {
        std::string::const_iterator it = str.begin();
        do
        {
            if (std::distance(it, str.end()) == 1)
            {
                retVal += hexPrefix + "0";
                retVal += *(it);
                ++it;
            }
            else
            {
                retVal += hexPrefix + std::string(it, it + 2);
                it += 2;
            }
        } while (it != str.end());
    }
    return retVal;
}

using namespace std;

int main()
{
    cout << convertHex("01000000d08c9ddf0115d1118c7a00c04") << endl;
    cout << convertHex("015d");
}
Output:
\x01\x00\x00\x00\xd0\x8c\x9d\xdf\x01\x15\xd1\x11\x8c\x7a\x00\xc0\x04
\x01\x5d
Basically it is nothing more than a do-while loop. A string is built from each pair of characters encountered. If only one character is left (a lone digit), a "0" is added in front of it.
I think I'd use a proxy class for reading and writing the data. Unfortunately, the code for the manipulators involved is just a little on the verbose side (to put it mildly).
#include <vector>
#include <algorithm>
#include <iterator>
#include <iostream>
#include <iomanip>
#include <string>
#include <sstream>

struct byte {
    unsigned char ch;

    friend std::istream &operator>>(std::istream &is, byte &b) {
        std::string temp;
        if (is >> std::setw(2) >> temp) // setw(2) limits extraction to two chars
            b.ch = std::stoi(temp, 0, 16);
        return is;
    }

    friend std::ostream &operator<<(std::ostream &os, byte const &b) {
        return os << "\\x" << std::setw(2) << std::setfill('0')
                  << std::hex << (int)b.ch;
    }
};

int main() {
    std::istringstream input("01000000d08c9ddf115d1118c7a00c04");
    std::ostringstream result;
    std::istream_iterator<byte> in(input), end;
    std::ostream_iterator<byte> out(result);

    std::copy(in, end, out);
    std::cout << result.str();
}
I really do dislike how verbose the I/O manipulators are, but other than that it seems pretty clean.
You can try a loop with fscanf (note that %2x requires an unsigned int, not a char):
unsigned int b;
fscanf(pFile, "%2x", &b);
Edit:
#define MAX_LINE_SIZE 128

FILE* pFile = fopen(...);
char fromFile[MAX_LINE_SIZE] = {0};
unsigned int b = 0;
int currentIndex = 0;
while (fscanf(pFile, "%2x", &b) > 0 && currentIndex < MAX_LINE_SIZE)
    fromFile[currentIndex++] = (char) b;

C++ equivalent of C fgets

I am looking for a C++ fstream equivalent of the C fgets function. I tried the get function of fstream but did not get what I wanted: get does not extract the delimiter character, whereas fgets does. So I wrote code to insert the delimiter character myself, but it is giving strange behaviour. Please see my sample code below:
#include <stdio.h>
#include <fstream>
#include <iostream>

int main(int argc, char **argv)
{
    char str[256];
    int len = 10;

    std::cout << "Using C fgets function" << std::endl;
    FILE * file = fopen("C:\\cpp\\write.txt", "r");
    if (file == NULL) {
        std::cout << " Error opening file" << std::endl;
    }

    int count = 0;
    while (!feof(file)) {
        char *result = fgets(str, len, file);
        std::cout << result << std::endl;
        count++;
    }
    std::cout << "\nCount = " << count << std::endl;
    fclose(file);

    std::fstream fp("C:\\cpp\\write.txt", std::ios_base::in);
    int iter_count = 0;
    while (!fp.eof() && iter_count < 10) {
        fp.get(str, len, '\n');
        int count = fp.gcount();
        std::cout << "\nCurrent Count = " << count << std::endl;
        if (count == 0) {
            // only new line character encountered
            // adding newline character
            str[1] = '\0';
            str[0] = '\n';
            fp.ignore(1, '\n');
            //std::cout << fp.get(); // ignore new line character from stream
        }
        else if (count != (len - 1)) {
            // adding newline character
            str[count + 1] = '\0';
            str[count] = '\n';
            //std::cout << fp.get(); // ignore new line character from stream
            fp.ignore(1, '\n');
            //std::cout << "Adding new line \n";
        }
        std::cout << str << std::endl;
        std::cout << " Stream State : Good: " << fp.good() << " Fail: " << fp.fail() << std::endl;
        iter_count++;
    }
    std::cout << "\nCount = " << iter_count << std::endl;
    fp.close();
    return 0;
}
The text file I am using is write.txt, with the following content:
This is a new lines.
Now writing second
line
DONE
If you observe my program, I am using the fgets function first and then the get function on the same file. In the case of the get function, the stream state goes bad.
Can anyone please point out what is going wrong here?
UPDATED: I am now posting the simplest code which does not work at my end. Here I don't care about the delimiter character for now and just read the entire file 10 characters at a time using getline:
void read_file_getline_no_insert() {
    char str[256];
    int len = 10;
    std::cout << "\nREAD_GETLINE_NO_INSERT FUNCTION\n" << std::endl;

    std::fstream fp("C:\\cpp\\write.txt", std::ios_base::in);
    int iter_count = 0;
    while (!fp.eof() && iter_count < 10) {
        fp.getline(str, len, '\n');
        int count = fp.gcount();
        std::cout << "\nCurrent Count = " << count << std::endl;
        std::cout << str << std::endl;
        std::cout << " Stream State : Good: " << fp.good() << " Fail: " << fp.fail() << std::endl;
        iter_count++;
    }
    std::cout << "\nCount = " << iter_count << std::endl;
    fp.close();
}

int main(int argc, char **argv)
{
    read_file_getline_no_insert();
    return 0;
}
If we see the output of the above code:
READ_GETLINE_NO_INSERT FUNCTION
Current Count = 9
This is a
Stream State : Good: 0 Fail: 1
Current Count = 0
Stream State : Good: 0 Fail: 1
You can see that the state of the stream goes bad and the fail bit is set. I am unable to understand this behavior.
Rgds
Sapan
std::getline() will read a string from a stream until it encounters a delimiter (newline by default).
Unlike fgets(), std::getline() discards the delimiter. But, also unlike fgets(), it will read the whole line (available memory permitting), since it works with a std::string rather than a char *. That makes it somewhat easier to use in practice.
All types derived from std::istream (the base class for all input streams) also have a member function called getline(), which works a little more like fgets(), accepting a char * and a buffer size. It still discards the delimiter, though, and it sets failbit when it fills the buffer before reaching the delimiter. That is exactly what your output shows: the first line is longer than 9 characters, so getline() stops with Fail: 1.
The C++-specific options are overloaded functions (i.e. available in more than one version), so you need to read the documentation to decide which one is appropriate for your needs.
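For the common case, a minimal sketch of the std::string-based replacement, re-appending the delimiter that fgets would have kept (note it also appends one to a final line that lacked it):

#include <fstream>
#include <iostream>
#include <string>

int main() {
    std::ifstream fp("C:\\cpp\\write.txt");
    std::string line;
    int count = 0;
    // std::getline consumes the '\n' but does not store it
    while (std::getline(fp, line)) {
        line += '\n'; // restore the delimiter, fgets-style
        std::cout << line;
        ++count;
    }
    std::cout << "\nCount = " << count << std::endl;
    return 0;
}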

How many values can be put into an Array in C++?

I wanted to read an array of double values from a file into an array. I have about 128^3 values. My program worked just fine as long as I stayed at 128^2 values, but now I get a "segmentation fault" error, even though 128^3 ≈ 2,100,000 is far below the maximum of int. So how many values can you actually put into an array of doubles?
#include <iostream>
#include <fstream>

int LENGTH = 128;

int main(int argc, const char * argv[]) {
    const int arrLength = LENGTH * LENGTH * LENGTH;

    std::string filename = "density.dat";
    std::cout << "opening file" << std::endl;
    std::ifstream infile(filename.c_str());

    std::cout << "creating array with length " << arrLength << std::endl;
    double* densdata[arrLength];
    std::cout << "Array created" << std::endl;

    for (int i = 0; i < arrLength; ++i) {
        double a;
        infile >> a;
        densdata[i] = &a;
        std::cout << "read value: " << a << " at line " << (i + 1) << std::endl;
    }
    return 0;
}
You are allocating the array on the stack, and stack size is limited (by default, the limit tends to be single-digit megabytes). Your array holds 128^3 = 2,097,152 pointers, which at 8 bytes each is about 16 MB, well over that limit.
You have several options:
- increase the size of the stack (ulimit -s on Unix);
- allocate the array on the heap using new;
- move to using std::vector, as sketched below.
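A minimal sketch of the std::vector option, which also stores the doubles themselves rather than pointers (the question's code stores the address of the loop-local variable a, so every densdata[i] ends up dangling):

#include <fstream>
#include <iostream>
#include <vector>

int main() {
    const int LENGTH = 128;
    const int arrLength = LENGTH * LENGTH * LENGTH;

    std::ifstream infile("density.dat");

    // heap-allocated storage: no stack-size limit to worry about
    std::vector<double> densdata(arrLength);
    for (int i = 0; i < arrLength; ++i) {
        infile >> densdata[i];
    }
    std::cout << "read " << densdata.size() << " values" << std::endl;
    return 0;
}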