I have a function that allocates a buffer for the size of a file with
char *buffer = new char[size_of_file];
Then I loop over the buffer and copy part of it into a subbuffer so I can work with smaller units of it.
char *subbuffer = new char[size+1];
for (int i =0; i < size; i++) {
subbuffer[i] = (buffer + cursor)[i];
}
Next I call a function and pass it this subbuffer, an arbitrary cursor for a location in the subbuffer, and the size of the text to be extracted.
wchar_t* FileReader::getStringForSizeAndCursor(int32_t size, int cursor, char *buffer) {
    int wlen = size/2;
#if MARKUP_SIZEOFWCHAR == 4 // sizeof(wchar_t) == 4
    uint32_t *dest = new uint32_t[wlen+1];
#else
    uint16_t *dest = new uint16_t[wlen+1];
#endif
    char *bcpy = new char[size];
    memcpy(bcpy, (buffer + cursor), size+2);
    unsigned char *ptr = (unsigned char *)bcpy; // need to be careful not to read outside the buffer
    for(int i=0; i<wlen; i++) {
        dest[i] = (ptr[0] << 8) + ptr[1];
        ptr += 2;
    }
    //cout << "size:: " << size << " wlen:: " << wlen << " c:: " << c << "\n";
    dest[wlen] = ('\0' << 8) + '\0';
    return (wchar_t *)dest;
}
I store this in a value as the property of a struct whilst looping through the file.
My issue is that when I free subbuffer and then start reading the title properties of my structs by looping over an array of struct pointers, my app segfaults. GDB tells me it finished normally, though, but a bunch of records that I cout are missing.
I suspect this has to do with the scope of something. I thought the memcpy in getStringForSizeAndCursor would fix the segfault, since it copies the bytes out of subbuffer before I free it. Right now I would expect those copies to be cleaned up by my struct destructor, but either things are being destroyed earlier than I expect or some memory still points into the original subbuffer. If I let subbuffer leak, I get back the data I expected, but this is not a solution.
The only definite error I can see in your question's code is the too-small allocation of bcpy: you allocate a buffer of size bytes and promptly copy size+2 bytes into it. Since you're not using the extra 2 bytes in the code, just drop the +2 in the copy.
Besides that, I can only see one suspicious thing; you're doing:
char *subbuffer = new char[size+1];
and copying size bytes to the buffer. The allocation hints that you're allocating extra memory for a zero terminator, but either it shouldn't be there at all (no +1) or you should allocate 2 extra bytes (since your function hints at a double-byte character set). Either way, I can't see you zero-terminating it, so using it as a zero-terminated string will probably break.
@Grizzly in the comments has a point too: allocating and handling memory for strings and wstrings is probably something you could "offload" to the STL with good results.
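For example, something along these lines could replace the manual allocation in getStringForSizeAndCursor (a minimal sketch, assuming the source bytes are big-endian two-byte units, as the shifts in your code suggest; the function name readWideString is just for illustration):
#include <string>
#include <cstdint>

std::wstring readWideString(const char *buffer, int cursor, int32_t size) {
    std::wstring out;
    out.reserve(size / 2);
    const unsigned char *ptr = reinterpret_cast<const unsigned char *>(buffer + cursor);
    for (int32_t i = 0; i < size / 2; ++i) {
        out.push_back(static_cast<wchar_t>((ptr[0] << 8) | ptr[1])); // big-endian byte pair
        ptr += 2;
    }
    return out; // std::wstring owns the memory, so there is nothing to delete by hand
}
The returned std::wstring can be stored directly in your struct, which removes the question of who owns and deletes the buffer.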
I'm new to C++ and trying to follow a tutorial series to learn the language. The challenge given for allocating memory is to allocate memory for 26 chars and then fill them with the alphabet: abcde... etc.
I thought I knew the solution but ran into this error:
Invalid address specified to RtlValidateHeap( 00490000, 0049D9EC )
The part that is throwing me off is that the program executes fully (a-z) but still throws this error.
Here is my code:
char c = 'a';
char *pChar = new char[26];
for (int i = 0; i < 26; i++, pChar++, c++) {
*pChar = c;
cout << *pChar << flush;
}
delete[] pChar;
Sorry if the question is worded poorly; I am new to both C++ and Stack Overflow.
When you say delete[] pChar; you are in fact attempting to delete what pChar is currently pointing at, which is not the same spot as where it was originally allocated.
In short, when you allocate something with new, it stores some data about the allocation (for example its size, so you do not need to say delete[26] pChar; like you had to when C++ was new), usually just before the newly allocated memory. When freeing, it is probably interpreting the things you have written (the alphabet) as that bookkeeping information, which is of course not going to work.
You should store a copy of the original pointer to the memory you have allocated and use that to delete, or, perhaps a better option, keep pChar where it is and use i as a subscript instead, like:
char c = 'a';
char *pChar = new char[26];
for (int i = 0; i < 26; i++, c++) {
pChar[i] = c;
cout << pChar[i] << flush;
}
delete[] pChar;
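If you prefer the first option (keeping a copy of the original pointer), a minimal sketch might look like this:
char c = 'a';
char *pChar = new char[26];
char *original = pChar;            // remember the pointer new[] returned
for (int i = 0; i < 26; i++, pChar++, c++) {
    *pChar = c;
    cout << *pChar << flush;
}
delete[] original;                 // delete using the original pointer, not the advanced one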
The trouble is that when control leaves the loop, pChar points to a location one past the end of the array. You then call delete[] on that pointer, which is like putting a wrecking ball through the wrong house.
As already mentioned in the other answers, you cannot call delete[] on a pointer you have changed since calling new []. It's undefined behavior.
delete[] needs to be passed exactly the same pointer value as you obtained when calling new[].
And here's the most simple fix, without need to change pChar:
#include <iostream>
int main()
{
    char c = 'a';
    char *pChar = new char[26];
    for (int i = 0; i < 26; ++i) {
        pChar[i] = c; // access the character by index ...
        std::cout << pChar[i] << std::flush; // access the character by index ...
        // ++pChar; DON'T change the original pointer
        ++c;
    }
    delete[] pChar;
    return 0;
}
In C++ I'd like to read input into a C-style string one character at a time. How do you do this without first creating a char array with a set size (you don't know how many chars the user will enter)? And since you can't resize the array, how is this done? I was thinking something along these lines, but this does not work.
char words[1];
int count = 0;
char c;
while(cin.get(c))
{
words[count] = c;
char temp[count+1];
count++;
words = temp;
delete[] temp;
}
Since you cannot use std::vector, I am assuming you cannot use std::string either. If you can use std::string, you can use the solution provided in the answer by @ilia.
Without that, your only option is to:
Use a pointer that points to dynamically allocated memory.
Keep track of the size of the allocated array. If the number of characters to be stored exceeds the current size, increase the array size, allocate new memory, copy the contents from the old memory to new memory, delete old memory, use the new memory.
Delete the allocated memory at the end of the function.
Here's what I suggest:
#include <iostream>
int main()
{
    size_t currentSize = 10;
    // Always make space for the terminating null character.
    char* words = new char[currentSize+1];
    size_t count = 0;
    char c;
    while(std::cin.get(c))
    {
        if ( count == currentSize )
        {
            // Allocate memory to hold more data.
            size_t newSize = currentSize*2;
            char* temp = new char[newSize+1];
            // Copy the contents from the old location to the new location.
            for ( size_t i = 0; i < currentSize; ++i )
            {
                temp[i] = words[i];
            }
            // Delete the old memory.
            delete [] words;
            // Use the new memory
            words = temp;
            currentSize = newSize;
        }
        words[count] = c;
        count++;
    }
    // Terminate the string with a null character.
    words[count] = '\0';
    std::cout << words << std::endl;
    // Deallocate the memory.
    delete [] words;
}
You asked for a C-style array. Neither stack nor plain dynamic allocation will serve you here: you would need to change the size of the array each time you add a new element, which is not possible automatically. You have to work around it and delete and re-allocate the array each time a new char is read. So you have two options:
Use std::vector (which was created for this purpose)
Duplicate what is inside std::vector and write it yourself during your code( which seems terrible)
std::vector solution:
std::vector<char> words;
words.reserve(ESTIMATED_COUNT); // if you do not know the estimated count, just delete this line
char c;
while(cin.get(c)){
words.push_back(c);
}
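If a C-style string is really needed afterwards, one possibility (a small addition to the sketch above) is to null-terminate the vector and use its data() pointer:
words.push_back('\0');              // terminate so it can be used as a C string
const char *cstr = words.data();    // valid while 'words' is alive and not resized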
#include <iostream>
#include <string>
using namespace std;
int main()
{
    string s1;
    char c;
    while (cin.get(c))
    {
        if (c == '\n')
            continue;
        s1 += c;
        cout << "s1 is: " << s1.c_str() << endl; //s1.c_str() returns c style string
    }
}
You have two ways. The first is to use a zero-size array: after each input you delete the array and allocate a new one that is 1 bigger, then store the input. This uses less memory but is inefficient. (In C, you can use realloc to improve efficiency.)
The second is to use a buffer: for example, you store the read input in a fixed-size array and when it gets full, you append the buffer to the end of the main array (by deleting and re-allocating).
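A rough sketch of this buffered approach might look like the following; BUF_SIZE and the variable names are only illustrative:
#include <iostream>
#include <cstring>

int main() {
    const size_t BUF_SIZE = 64;
    char buf[BUF_SIZE];                 // fixed-size staging buffer
    size_t bufCount = 0;
    char *words = nullptr;              // main array, grown in chunks
    size_t total = 0;
    char c;
    while (std::cin.get(c)) {
        buf[bufCount++] = c;
        if (bufCount == BUF_SIZE) {     // buffer full: append it to the main array
            char *temp = new char[total + bufCount];
            if (total) std::memcpy(temp, words, total);
            std::memcpy(temp + total, buf, bufCount);
            delete[] words;
            words = temp;
            total += bufCount;
            bufCount = 0;
        }
    }
    // Append the remainder plus a terminating '\0'.
    char *temp = new char[total + bufCount + 1];
    if (total) std::memcpy(temp, words, total);
    std::memcpy(temp + total, buf, bufCount);
    temp[total + bufCount] = '\0';
    delete[] words;
    words = temp;
    std::cout << words << std::endl;
    delete[] words;
}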
By the way, you can use std::vector which grows the size of itself automatically and efficiently.
If you're set on using c-style strings then:
char* words = nullptr; // start empty, so the first delete[] below is safe
int count = 0;
char c;
while(cin.get(c))
{
    // Create a new array and store the value at the end.
    char* temp = new char[++count];
    temp[count - 1] = c;
    // Copy over the previous values. Notice the -1.
    // You could also replace this FOR loop with a memcpy().
    for(int i = 0; i < count - 1; i++)
        temp[i] = words[i];
    // Delete the data in the old memory location.
    delete[] words;
    // Point to the newly updated memory location.
    words = temp;
}
I would do what Humam Helfawi suggested and use std::vector<char>; since you are using C++, it will make your life easier. The implementation above is basically a less elegant version of vector. If you don't know the size beforehand then you will have to resize memory.
You need to allocate a string buffer of arbitrary size. Then, if the maximum number of characters is reached upon appending, you need to enlarge the buffer with realloc.
In order to avoid calling realloc at each character, which is not optimal, a growth strategy is recommended, such as doubling the size at each allocation. There are even more fine-tuned growth strategies, which depend on the platform.
Then, at the end, you may use realloc to trim the buffer to the exact number of appended bytes, if necessary.
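A rough sketch of that strategy with malloc/realloc (all names here are illustrative and error handling is kept minimal):
#include <cstdlib>
#include <cstdio>

int main() {
    size_t capacity = 16;
    size_t count = 0;
    char *buf = (char *)std::malloc(capacity);
    if (!buf) return 1;
    int ch;
    while ((ch = std::getchar()) != EOF) {
        if (count + 1 == capacity) {                      // keep room for the '\0'
            capacity *= 2;                                // doubling growth strategy
            char *tmp = (char *)std::realloc(buf, capacity);
            if (!tmp) { std::free(buf); return 1; }
            buf = tmp;
        }
        buf[count++] = (char)ch;
    }
    buf[count] = '\0';
    std::puts(buf);
    std::free(buf);    // optionally realloc(buf, count + 1) first to trim to the exact size
    return 0;
}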
I have a vector of chars:
vector<char> bytesv;
I push 1024 chars to this vector in a loop (not important to the context) using push_back(char):
bytesv.push_back(c);
I know this vector has an exact size of 1024. It indeed prints 1024 when doing the following:
cout << bytesv.size() << "\n";
What I am trying to do: I need to transform this vector into a char array (char[]) of the same length and elements as the vector. I do the following:
char* bytes = &bytesv[0];
The problem: But when I print the size of this array, it prints 4, so the size is not what I expected:
cout << sizeof(bytes) << "\n";
Full code:
vector<char> bytesv;
for (char c : charr) { // Not important, there are 1024 chars in this array
bytesv.push_back(c);
}
cout << bytesv.size() << "\n";
char* bytes = &bytesv[0];
cout << sizeof(bytes) << "\n";
Prints:
1024
4
This obviously has to do with the fact that bytes is actually a char*, not really an array AFAIK.
The question: How can I safely transfer all the vector's contents into an array, then?
How can I safely transfer all the vector's contents into an array, then?
Allocate the required memory by using dynamic memory allocation.
size_t size = bytesv.size();
char* char_array = new char[size];
Copy the elements from the vector to the array.
for ( size_t i = 0; i < size; ++i )
char_array[i] = bytesv[i];
Make sure you deallocate the memory after you are done using it.
delete [] char_array;
Having said that, I realize that you mentioned in a comment,
My ultimate goal is to save these bytes to a file, using fstream, which requires an array of chars as far as I am concerned.
You don't need to copy the contents of the vector to an array to save them to an fstream. The contents of a std::vector are guaranteed to be in contiguous memory. You can just use:
outStream.write(bytesv.data(), bytesv.size());
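For instance (a self-contained sketch; the file name and fill value are just placeholders):
#include <fstream>
#include <vector>

int main() {
    std::vector<char> bytesv(1024, 'x');                // stands in for your 1024 bytes
    std::ofstream outStream("bytes.bin", std::ios::binary);
    outStream.write(bytesv.data(), bytesv.size());      // write the whole vector at once
}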
sizeof(bytes) is the size of the pointer, not what it points to. Also,
char* bytes = &bytesv[0];
doesn't transfer anything to an array; all you've done is save a pointer to the beginning of the underlying array in the std::vector.
To correctly move the data to an array you'll need to dynamically allocate an array. But the question is why would you do that? You already have a vector. It's like an array but about a billion times better.
How can I safely transfer all the vector's contents into an array, then?
There's no need to "transfer" (i.e. copy). You can access the vector's underlying storage as an array by using the data method.
char* arr = bytesv.data();
http://en.cppreference.com/w/cpp/container/vector/data
bytes is actually a char*, not really an array
The char* is not an array but a pointer to the first value in the array. You can get the number of elements in the array from bytesv.size()
sizeof(bytes) is always 4 because bytes is a pointer and you are using a machine that uses 4-byte pointers.
You already know that you have 1024 bytes; just use that knowledge.
First of all: you need to copy the content of the vector to the array; otherwise you can't access the elements of your array when the vector is gone. So you need to allocate memory for your array (not just define a pointer).
char* bytes = new char[bytesv.size()];
for (size_t i = 0; i < bytesv.size(); ++i) {
bytes [i] = bytesv.at(i);
}
//...
delete[] bytes;
Secondly, sizeof() doesn't do what you expect: it's not reporting the length of an array, but the size of a type/pointer. In the case of stack-allocated arrays it can be used as sizeof(array)/sizeof(array[0]); to determine the size, but as sizeof() is a compile-time operator it can't know the size of your dynamically allocated arrays or vectors.
If you use an array, you need to use a separate variable to store the length of this array (alternatively you could use std::array instead).
#include <iostream>
#include <string>
#include <vector>
#include <cstdint> // for uint8_t
int main(int argc, char* argv[]){
    std::vector<uint8_t> bytesv = {0x01, 0x02, 0x03, 0x04, 0x05, 0x06 };
    size_t bytesLength = bytesv.size();
    char* bytes = new char[bytesLength];
    for (size_t i = 0; i < bytesv.size(); ++i) {
        bytes[i] = bytesv.at(i);
    }
    //...
    std::cout << bytesLength << std::endl;
    delete[] bytes;
    return 0;
}
That's because the size of any pointer is 4 (on a 32-bit target). What
char* bytes = &bytesv[0];
is giving you is a pointer to the first element, not necessarily an array of chars.
Now if you used:
char (*bytes)[1024] = (char (*)[1024])&bytesv[0];
std::cout << sizeof(bytes) << " " << sizeof(*bytes);
that would give you a pointer to a 'char[1024]' array.
I'm new to C++ and I don't know what this error means. It reads from a file and tries to store the values in a char * [].
The file contains:
5,Justin,19,123-45-6789,State Farm,9876,Jessica,Broken Hand,
This is my code.
void Hospital::readRecordsFile(){
    std::ifstream fileStream;
    fileStream.open(fileName); // Opens the file stream to read fileName
    char * temp [8];
    int i = 0;
    while(!fileStream.eof()){
        fileStream.get(temp[i],256,',');
        i++;
    }
    i = 0;
    for(char * t:temp){
        std::cout << t << std::endl;
    }
}
The error is at the line fileStream.get(temp[i],256,',');
You define an array of 8 pointers to char, but forget to allocate memory so that the pointers point to a valid chunk of memory:
char * temp [8]; // need then to allocate memory for the pointers
Because of this, in the line
fileStream.get(temp[i],256,',')
you end up using memory that's not yours.
Solution:
for(int i = 0; i<8; i++)
temp[i] = new char[256]; // we allocate 256 bytes for each pointer
Better though, use a std::vector<std::string> instead.
In the code you have right now it looks like you implicitly assume that the file has no more than 8 lines, which I find hard to believe. If your file has more than 8 lines, then you'll end up accessing the array of 8 pointers out of bounds, so you'll get another undefined behaviour (usually a segfault). That's why it's much better to use standard STL containers like std::vector, to avoid all these headaches.
In case you MUST use pointers and want a variable number of lines, then you have to use a pointer to pointer,
char** temp;
then allocate memory for enough pointers-to-char,
temp = new char* [1000]; // we believe we won't have more than 1000 lines
then, for each pointer-to-char, allocate memory
for(int i = 0; i < 1000; ++i)
temp[i] = new char[256];
At the end of the program, you must then delete[] in reverse order
for(int i = 0; i < 1000; ++i)
delete[] temp[i];
delete[] temp;
As you can see, it's getting messy.
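For comparison, here is a minimal sketch of the std::vector<std::string> approach suggested above (written as a free function for brevity; the comma-separated layout comes from the question):
#include <fstream>
#include <iostream>
#include <string>
#include <vector>

void readRecordsFile(const std::string &fileName) {
    std::ifstream fileStream(fileName);
    std::vector<std::string> fields;
    std::string field;
    while (std::getline(fileStream, field, ',')) {   // split on commas
        fields.push_back(field);
    }
    for (const std::string &f : fields) {
        std::cout << f << std::endl;
    }
}
No manual new/delete is involved: the vector and the strings manage their own memory.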
You never allocated memory for each pointer in temp.
You probably want something like:
for (unsigned int i = 0u; i < 8; ++i)
{
temp[i] = new char[256];
}
This says that the temp variable points to 8 dynamically allocated byte buffers of 256 bytes each.
I want to create jagged character two dimensional array in c++.
int arrsize[3] = {10, 5, 2};
char** record;
record = (char**)malloc(3);
cout << endl << sizeof(record) << endl;
for (int i = 0; i < 3; i++)
{
record[i] = (char *)malloc(arrsize[i] * sizeof(char *));
cout << endl << sizeof(record[i]) << endl;
}
I want to set record[0] for name (should have 10 letters), record[1] for marks (should have a 5-digit mark) and record[2] for Id (should have a 2-digit number). How can I implement this? I directly write the record array to the binary file. I don't want to use a struct or class.
In C++ it would look like this:
std::vector<std::string> record;
Why would you not use a struct when it is the sensible solution to your problem?
struct record {
char name[10];
char mark[5];
char id[2];
};
Then writing to a binary file becomes trivial:
record r = get_a_record();
write( fd, &r, sizeof r );
Notes:
You might want to allocate a bit of extra space for NUL terminators, but this depends on the format that you want to use in the file.
If you are writing to a binary file, why do you want to write mark and id as strings? Why not store an int (4 bytes, greater range of values) and an unsigned char (1 byte)? See the sketch below.
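A variant along those lines might look like this (just a sketch; the field names are taken from the struct above):
struct record {
    char name[10];       // still a fixed-size string
    int mark;            // 4 bytes, larger range than a 5-char string
    unsigned char id;    // 1 byte is enough for a 2-digit id
};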
If you insist on not using a user-defined type (really, you should use one), then you can just create a single block of memory and use pointer arithmetic, but beware that the binary generated by the compiler will be the same; the only difference is that your code will be less maintainable:
char record[ 10+5+2 ];
// copy name to record
// copy mark to record+10
// copy id to record+15
write( fd, record, sizeof record);
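Filling that block might look like this (a sketch assuming name, mark and id are existing buffers that already hold exactly 10, 5 and 2 bytes; fd is the descriptor from the snippet above):
#include <cstring>

char record[10 + 5 + 2];
std::memcpy(record,      name, 10);   // bytes 0..9:   name
std::memcpy(record + 10, mark, 5);    // bytes 10..14: mark
std::memcpy(record + 15, id,   2);    // bytes 15..16: id
write( fd, record, sizeof record );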
Actually the right pattern for malloc is:
T * p = (T *) malloc(count * sizeof(T));
where T could be any type, including char *. So the right code for allocating memory in this case is like that:
int arrsize[3] = { 10, 5, 2 };
char** record;
record = (char**) malloc(3 * sizeof(char *));
cout << sizeof(record) << endl;
for (int i = 0; i < 3; ++i) {
record[i] = (char *) malloc(arrsize[i] * sizeof(char));
}
I removed the cout of sizeof(record[i]) because it will always yield the size of (one) pointer to char (4 on my laptop). sizeof is evaluated at compile time and has no idea how much memory pointed to by record[i] (which is really a pointer of type char *) was allocated at execution time.
malloc(3) allocates 3 bytes. Your jagged array would be an array containing pointers to character arrays. Each pointer usually takes 4 bytes (on a 32-bit machine), but more correctly sizeof(char*), so you should allocate using malloc(3 * sizeof(char*) ).
And then record[i] = (char*)malloc((arrsize[i]+1) * sizeof(char)), because a string is a char* and a character is a char, and because each C-style string is conventionally terminated with a '\0' character to mark its end. You could do without it, but it would be harder to use. For instance:
strcpy(record[0], name);
sprintf(record[1], "%0.2f", mark);
sprintf(record[2], "%d", id);
to fill in your record, because sprintf puts in a \0 at the end. I assumed mark was a floating-point number and id was an integer.
As regards writing all this to a file, if the file is binary why put everything in as strings in the first place?
Assuming you do, you could use something like:
ofstream f("myfile",ios_base::out|ios_base::binary);
for (int i=0; i<3; i++)
f.write(record[i], arrsize[i]);
f.close();
That being said, I second Anders' idea. If you use STL vectors and strings, you won't have to deal with ugly memory allocations, and your code will probably look cleaner as well.
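For what it's worth, a rough sketch of that STL-based route (the file name and the 10/5/2 field widths are the assumptions used earlier in this thread; the values are placeholders):
#include <fstream>
#include <string>

int main() {
    std::string name = "some name", mark = "95.5", id = "42";
    // Pad/truncate to fixed widths if the binary format expects fixed-size fields.
    name.resize(10, '\0');
    mark.resize(5, '\0');
    id.resize(2, '\0');
    std::ofstream f("myfile", std::ios_base::out | std::ios_base::binary);
    f.write(name.data(), name.size());
    f.write(mark.data(), mark.size());
    f.write(id.data(), id.size());
}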