I want to write code that causes a buffer overflow. Could you give me some advice on why my test code below does not cause a buffer overflow?
I think a buffer overflow should occur in the copy loop in the code below:
#include <cstdio>
#include <string>
#include <cstring>
using namespace std;
#define buffer_size 10
int main() {
    string m_string = "My name is String";
    char* node_reference = new char[buffer_size];
    unsigned int len = strlen(m_string.c_str());
    unsigned int buffer_len = sizeof(node_reference)/sizeof(char);
    std::printf("len: %d\n", len);
    std::printf("buffer len: %d\n", buffer_len);
    for (unsigned int i = 0; i < strlen(m_string.c_str()) + 1; i++) {
        node_reference[i] = m_string[i];
        std::printf("index: %d and value: %c\n", i, node_reference[i]);
    }
    return 0;
}
EDIT
I removed the line with the typo'd de_referece variable. Thank you for the kind answers to my poorly worded question.
sizeof(node_reference) does not do what you think it does, since node_reference is allocated dynamically. You are obtaining the size of the pointer itself, not the size of the allocated buffer it points at. Your printf() of buffer_len should have indicated that to you. sizeof(node_reference) would report the buffer size only if node_reference had been declared as a fixed-size array instead of a pointer.
That being said, your code does cause a buffer overflow. You are allocating 10 bytes for node_reference and then writing 18 bytes into it. The behavior is undefined, so anything can happen. A crash is not guaranteed, if that is what you are expecting. You are certainly corrupting memory, though it may not be memory you would normally see since it is outside of your allocation.
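To make the sizeof point concrete, here is a minimal sketch (not part of the question or the answer above) that contrasts sizeof on a real array with sizeof on a pointer obtained from new[], and shows a bounds-checked copy where the allocation size is carried separately:
#include <cstddef>
#include <cstdio>
#include <cstring>
#include <string>

int main() {
    const std::string m_string = "My name is String";

    char stack_buffer[10];                    // real array: sizeof gives 10
    char* heap_buffer = new char[10];         // pointer: sizeof gives sizeof(char*)
    const std::size_t heap_size = 10;         // the size must be tracked separately

    std::printf("sizeof stack_buffer: %zu\n", sizeof stack_buffer);
    std::printf("sizeof heap_buffer:  %zu\n", sizeof heap_buffer);

    // Bounds-checked copy: stop before running past the allocation,
    // leaving room for the terminating null byte.
    std::size_t n = m_string.size();
    if (n >= heap_size) {
        n = heap_size - 1;  // truncate instead of overflowing
    }
    std::memcpy(heap_buffer, m_string.c_str(), n);
    heap_buffer[n] = '\0';

    std::printf("copied: %s\n", heap_buffer);
    delete[] heap_buffer;
    return 0;
}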
What is the proper size of a char array (buffer) when I want to use the sprintf function?
I don't know why this part of the code works if the buffer can hold only 1 char. I put far more than 1 character into it.
/* sprintf example */
#include <stdio.h>

int main ()
{
    char buffer[1];
    int n, a = 5, b = 3;
    n = sprintf(buffer, "%d plus %d is %d", a, b, a + b);
    printf("[%s] is a string %d chars long\n", buffer, n);
    return 0;
}
Results:
[5 plus 3 is 8] is a string 13 chars long
What is the proper size of a char array (buffer) when I want to use the sprintf function?
There isn't one.
If you can work out an upper bound from the format string and types of input, then you might use that. For example, a 32-bit int won't take up more than 11 characters to represent in decimal with an optional sign, so your particular example won't need more than 44 characters (unless I miscounted).
Otherwise, use something safer: std::stringstream in C++, or snprintf and care in C.
I don't know why this part of the code works if the buffer can hold only 1 char?
It isn't. It's writing past the end of the buffer into some other memory.
Maybe that won't cause any visible errors; maybe it will corrupt some other variables; maybe it will cause a protection fault and end the program; maybe it will corrupt the stack frame and cause all kinds of havoc when the function tries to return; or maybe it will cause some other kind of undefined behaviour. But it's certainly not behaving correctly.
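Here is a minimal sketch (not part of the original answer) of both safer options named above, reusing the a and b values from the question; the 44-byte bound is the one worked out earlier:
#include <cstdio>
#include <iostream>
#include <sstream>
#include <string>

int main() {
    int a = 5, b = 3;

    // C++: let the stream grow the storage as needed, so there is no size to guess.
    std::ostringstream oss;
    oss << a << " plus " << b << " is " << (a + b);
    std::string s = oss.str();
    std::cout << "[" << s << "] is a string " << s.size() << " chars long\n";

    // C-style: snprintf never writes past the given size and reports
    // how many characters the full result would have needed.
    char buffer[44];  // upper bound worked out from the format string above
    int n = std::snprintf(buffer, sizeof buffer, "%d plus %d is %d", a, b, a + b);
    if (n < 0 || n >= (int)sizeof buffer) {
        std::fputs("output was truncated or failed\n", stderr);
    }
    return 0;
}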
A buffer overflow did occur in your code; there were no apparent consequences, but that does not mean it worked correctly. Try running it under a memory debugger like Valgrind and you will see what I mean.
You can't ensure that sprintf() will not overflow the buffer; that's why there is a snprintf() function, to which you pass the size of the buffer.
Sample usage
char buffer[100];
int result;
result = snprintf(buffer, sizeof(buffer), "%d plus %d is %d", a, b, a + b);
if (result >= sizeof(buffer))
{
    fprintf(stderr, "The string does not fit `buffer'.\n");
}
Assuming code must use sprintf() and not some other function:
pre-determine the worst-case output size and add a margin.
Unless there are major memory concerns, a 2x buffer is suggested. Various locales can do interesting things like adding ',' to integer output, as in "123,456,789".
#include <stdio.h>
#include <limits.h>

/* enough room for the decimal digits of any integer this size, plus sign and slack */
#define INT_DECIMAL_SIZE(i) (sizeof(i)*CHAR_BIT/3 + 3)
#define format1 "%d plus %d is %d"

char buffer[(sizeof format1 + 3 * INT_DECIMAL_SIZE(int)) * 2];
int n = sprintf(buffer, format1, a, b, a + b);
A challenging example is when code tries sprintf(buf, "%Lf", some_long_double), as the output could be thousands of characters should some_long_double == LDBL_MAX; about 5000 characters with binary128 as long double.
// sign, leading digit, up to LDBL_MAX_10_EXP further integer digits, '.', 6 fraction digits, '\0'
#define LDBL_DECIMAL_SIZE(i) (1 + 1 + LDBL_MAX_10_EXP + 1 + 6 + 1)
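A brief usage sketch (not in the original answer), assuming the corrected macro above:
#include <float.h>
#include <stdio.h>

// sign + leading digit + LDBL_MAX_10_EXP further integer digits + '.' + 6 fraction digits + '\0'
#define LDBL_DECIMAL_SIZE(i) (1 + 1 + LDBL_MAX_10_EXP + 1 + 6 + 1)

int main(void) {
    long double some_long_double = LDBL_MAX;
    char buf[LDBL_DECIMAL_SIZE(some_long_double)];  // worst-case "%Lf" output

    int n = sprintf(buf, "%Lf", some_long_double);
    printf("needed %d characters out of %zu\n", n, sizeof buf);
    return 0;
}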
My task is to read a YUV file, append some data to each of its components (Y, Cb, Cr), and store the result in another file. I have tried the code below:
#include <stdio.h>
#include <stdlib.h>
void main()
{
    FILE *fp = fopen("traffic_1920x1080.yuv", "rb");
    FILE *myYUV = fopen("traffic_1920x1088.yuv", "ab");
    int count = 0;
    unsigned char *y = (unsigned char*)malloc(sizeof(unsigned char)*1920*1080);
    unsigned char *u = (unsigned char*)malloc(sizeof(unsigned char)*(1920/2)*(1080/2));
    unsigned char *v = (unsigned char*)malloc(sizeof(unsigned char)*(1920/2)*(1080/2));
    unsigned char ypad[1920*8];
    unsigned char upad[(1920/2)*4];
    unsigned char vpad[(1920/2)*4];
    for (int i = 0; i < (1920/2)*4; i++)
    {
        ypad[i] = 255;
        upad[i] = 128;
        vpad[i] = 128;
    }
    for (int i = (1920/2)*4; i < 1920*8; i++)
        ypad[i] = 255;
    while (!feof(fp))
    {
        fread(y, sizeof(unsigned char), 1920*1080, fp);
        fread(u, sizeof(unsigned char), 1920/2*1080/2, fp);
        fread(v, sizeof(unsigned char), 1920/2*1080/2, fp);
        fwrite(y, sizeof(unsigned char), 1920*1080, myYUV);
        fwrite(ypad, sizeof(unsigned char), 1920*8, myYUV);
        fwrite(u, sizeof(unsigned char), 1920/2*1080/2, myYUV);
        fwrite(upad, sizeof(unsigned char), 1920/2*4, myYUV);
        fwrite(v, sizeof(unsigned char), 1920/2*1080/2, myYUV);
        fwrite(vpad, sizeof(unsigned char), 1920/2*4, myYUV);
        printf("Frame %d created\r", count);
        y += 1920*1080;
        u += 1920/2*1080/2;
        v += 1920/2*1080/2;
        count++;
    }
    free(y);
    free(u);
    free(v);
    fclose(fp);
    fclose(myYUV);
}
However, the above code works fine on the first loop iteration, but on the second iteration I get an exception:
Access violation writing location 0x0092f000.
at the line fwrite(y, sizeof(unsigned char), 1920*1080, myYUV);
Is this a problem with the pointer increments, or is it something else? Please reply. Thanks in advance.
These increments:
y+=1920*1080;
u+=1920/2*1080/2;
v+=1920/2*1080/2;
will increment the pointers past the end of the allocated memory. For example, y points to the start of 1920*1080 bytes of allocated memory. Increasing it by that much makes it point past the end of that memory. This results in reading/writing to/from unallocated memory. That's why you get an access violation.
I don't actually see a reason for those pointers to be incremented at all.
Other than that, your code should check for error conditions (did fopen() succeed, etc.)
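To show what this means in practice, here is a minimal sketch (not the poster's actual fix) of the loop with the increments removed: the same buffers are reused for every frame, and fread()'s return value decides when to stop instead of feof(); the padding writes from the original are only indicated by comments:
#include <stdio.h>
#include <stdlib.h>

#define W 1920
#define H 1080

int main(void) {
    FILE *fp = fopen("traffic_1920x1080.yuv", "rb");
    FILE *myYUV = fopen("traffic_1920x1088.yuv", "ab");
    if (fp == NULL || myYUV == NULL) {
        perror("fopen");
        return 1;
    }

    unsigned char *y = (unsigned char*)malloc(W * H);
    unsigned char *u = (unsigned char*)malloc((W/2) * (H/2));
    unsigned char *v = (unsigned char*)malloc((W/2) * (H/2));
    /* the ypad/upad/vpad arrays would be filled here as in the original code */

    int count = 0;
    /* Read one full frame per iteration into the SAME buffers; stop when a
       complete Y plane can no longer be read. No pointer arithmetic needed. */
    while (fread(y, 1, W * H, fp) == (size_t)(W * H)) {
        fread(u, 1, (W/2) * (H/2), fp);
        fread(v, 1, (W/2) * (H/2), fp);

        fwrite(y, 1, W * H, myYUV);
        /* ...write ypad here... */
        fwrite(u, 1, (W/2) * (H/2), myYUV);
        /* ...write upad here... */
        fwrite(v, 1, (W/2) * (H/2), myYUV);
        /* ...write vpad here... */

        printf("Frame %d created\n", count);
        count++;
    }

    free(y); free(u); free(v);
    fclose(fp);
    fclose(myYUV);
    return 0;
}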
I read the code below in a book, which said that it is vulnerable to a stack buffer overflow. Since fgets() is used, I was not able to understand why it is vulnerable.
My understanding is that using fgets() instead of gets() usually gets rid of buffer overflows, because it takes a size limit and places a null at the end. Am I missing something? What should be used instead of fgets() to correct the stack overflow?
#include <stdio.h>

void getinp(char *inp, int siz)
{
    puts("Input value: ");
    fgets(inp, siz, stdin);
    printf("buffer3 getinp read %s\n", inp);
}

void display(char *val)
{
    char tmp[16];
    sprintf(tmp, "read val: %s\n", val);
    puts(tmp);
}

int main(int argc, char *argv[])
{
    char buf[16];
    getinp(buf, sizeof(buf));
    display(buf);
    printf("buffer3 done\n");
}
In display, tmp is declared as 16 chars long, but the sprintf writes into it not only val (which fgets guarantees to be at most 15 characters plus the terminator), but also the literal "read val: " and the final "\n".
This means that the fixed text "read val: ", the trailing "\n", and the null terminator already use 12 of tmp's 16 bytes, so any val longer than 4 characters overflows the buffer in display (and remember that fgets usually leaves the newline in val, so even a 4-character input can trigger it).
One solution could be declaring tmp in display large enough to store both val and the additional text, although in the real world you would simply write to stdout with printf (no intermediate buffer needed).
Also, whenever a sprintf carries a potential risk of buffer overflow, you would normally use snprintf instead (I always do); rather than overflowing the buffer, snprintf truncates the output if it is too long and returns the number of characters that would have been written had the buffer been big enough.
In display, there is no way to be sure that val plus 12 bytes is going to fit into a 16-character buffer.
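A minimal sketch (not from the book or the answers above) of a bounded display() using snprintf, as suggested:
#include <stdio.h>

/* Bounded version of display(): snprintf never writes more than
   sizeof(tmp) bytes, including the null terminator. */
void display(const char *val)
{
    char tmp[16];
    int n = snprintf(tmp, sizeof(tmp), "read val: %s\n", val);
    if (n < 0 || (size_t)n >= sizeof(tmp)) {
        /* the output did not fit and was truncated */
    }
    puts(tmp);
}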
I am trying to constantly read data into a buffer of type unsigned char* from different files. However, I can't seem to set the buffer to NULL prior to reading in the next file.
Here is only the relevant code:
#include <stdio.h>
#include <stdlib.h>
#include <fstream>

int main(int argc, char** argv)
{
    FILE* dataFile = fopen("C:\\File1.txt", "rb");
    unsigned char *buffer = NULL;
    buffer = (unsigned char*)malloc(1000);
    fread(buffer, 1, 1000, dataFile);
    fclose(dataFile);

    dataFile = fopen("C:\\File2.txt", "rb");
    buffer = NULL;
    fread(buffer, 1, 1000, dataFile);
    fclose(dataFile);

    system("pause");
    return 0;
}
The error I run into is at the second occurrence of this line: fread(buffer,1,1000,dataFile);
The error I get is:
Debug Assertion Failed!
Expression: (buffer != NULL)
It points me to Line 147 of fread.c which is basically:
/* validation */
_VALIDATE_RETURN((buffer != NULL), EINVAL, 0);
if (stream == NULL || num > (SIZE_MAX / elementSize))
{
    if (bufferSize != SIZE_MAX)
    {
        memset(buffer, _BUFFER_FILL_PATTERN, bufferSize);
    }
    _VALIDATE_RETURN((stream != NULL), EINVAL, 0);
    _VALIDATE_RETURN(num <= (SIZE_MAX / elementSize), EINVAL, 0);
}
I Googled for ways to set the buffer pointer to NULL and tried the various suggestions, but none seem to work. Can anyone clarify the right way to set it to NULL?
Your buffer is a pointer.
When you do this:
buffer = (unsigned char*)malloc(1000);
you allocate some space in memory, and assign its starting position to buffer. Remember, buffer holds the address of the beginning of the space, that's all. When you do this:
buffer = NULL;
you have thrown away that address.
EDIT:
C++ style, without dynamic memory:
#include <fstream>
#include <string>

using std::string;
using std::ifstream;

void readFromFile(string fname)
{
    char buffer[1000];
    ifstream fin(fname.c_str());
    fin.read(buffer, sizeof(buffer));
    // maybe do things with the data
}

int main()
{
    readFromFile("File1.txt");
    readFromFile("File2.txt");
    return 0;
}
There's no need to erase the contents of the buffer. If the cost of allocating and deallocating the buffer with each call is too much, just add static:
static char buffer[1000];
It will be overwritten each time.
You can't set buffer = NULL because fread will try to dereference it, and dereferencing NULL is one of the things that are completely illegal in C++. You are also leaking the memory you got from malloc, since you overwrote the only copy of its address. Perhaps you're looking for memset, to zero the buffer:
memset(buffer, 0, 1000);
However, you don't need to do this before calling fread. There's simply no reason since fread will write the buffer anyway: it doesn't care if it's zeroed or not.
As a side note: you're writing very C-ish code in what I suspect is C++ (given your fstream header). There are better-suited I/O options for C++.
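To tie this back to the question's code, here is a minimal sketch (not from the original answers) that reuses one malloc'd buffer for both files with no resetting to NULL in between; the file names are the ones from the question:
#include <stdio.h>
#include <stdlib.h>

int main(void) {
    unsigned char *buffer = (unsigned char*)malloc(1000);
    if (buffer == NULL)
        return 1;

    const char *files[] = { "C:\\File1.txt", "C:\\File2.txt" };
    for (int i = 0; i < 2; i++) {
        FILE *dataFile = fopen(files[i], "rb");
        if (dataFile == NULL) {
            perror(files[i]);
            continue;
        }
        /* fread simply overwrites whatever was in the buffer before;
           it returns how many bytes were actually read. */
        size_t got = fread(buffer, 1, 1000, dataFile);
        printf("%s: read %zu bytes\n", files[i], got);
        fclose(dataFile);
    }

    free(buffer);
    return 0;
}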
#include <stdio.h>

typedef struct {
    unsigned char b1, b2;
} cont;

cont buf[1024];

int main(int argc, char *argv[]) {
    FILE* fp;
    fp = fopen(argv[1], "rb");
    if (fp != NULL)
        fread(buf, sizeof(cont), sizeof(buf), fp);
    //do something with buf
    return 0;
}
Hello there, I am facing a segmentation fault when I try to run this program. It used to work fine; all of a sudden the segmentation fault appeared. The fread() call is generating the error. Please help me!
You're using fread() wrong: the second argument is the size of each element and the third is the number of elements to read (which should be 1024 in your case).
As written, the call asks for sizeof(cont) * sizeof(buf) bytes, and that overflows your buffer.
See:
http://www.opengroup.org/onlinepubs/009695399/functions/fread.html
for the function documentation.
To clarify, you want to read 1024 elements but sizeof(buf) is 2048 (at least, maybe more if the struct is padded by the ABI of your platform).
Examples (coded so that they don't rely on a specific number of elements):
fread(buf, 1, sizeof(buf), fp); // fills the buffer (assuming it's buf[...])
fread(buf, sizeof(*buf), sizeof(buf)/sizeof(*buf), fp); // ditto
That is, if you want to pass the total size of the destination buffer via sizeof(), then the element size must be 1, whereas if you pass the size of one element, the count must be the number of such elements that fit into the buffer.
Always check return values. How else do you know if you actually managed to read anything?
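As a small sketch (not part of the original answer), the corrected call with its return value checked might look like this:
#include <stdio.h>

typedef struct {
    unsigned char b1, b2;
} cont;

cont buf[1024];

int main(int argc, char *argv[]) {
    if (argc < 2)
        return 1;
    FILE *fp = fopen(argv[1], "rb");
    if (fp == NULL)
        return 1;

    /* size of one element, then number of elements */
    size_t got = fread(buf, sizeof(cont), sizeof(buf) / sizeof(buf[0]), fp);
    printf("read %zu elements\n", got);

    fclose(fp);
    return 0;
}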
I think this may be because of padding. The "cont" type is defined as 2 bytes big, but will probably be padded to 4. However this should not cause a problem because even if sizeof(cont) returns 2 or 4, "buf" must be using the padded size and so still be big enough.
sizeof(buf) gives you the grand total size of buf in bytes, not the number of elements in it. Nevertheless, you should never ever read directly into structs; bad things await you if you do it that way.
Also structs may be padded anywhere between their members, so you don't even know the exact memory layout of that struct you define.
To keep your program portable and safe always read files element by element and construct the data from that.
int i;
for (i = 0; i < MAX_ELEMENTS && !feof(fil); ++i) {
    int c1, c2;
    c1 = fgetc(fil);
    c2 = fgetc(fil);
    if (c1 == EOF || c2 == EOF)
        break;
    buf[i].b1 = c1;
    buf[i].b2 = c2;
}
Does this look tedious and verbose? Yes, but there is a good reason for that. Always assume that the contents of a file may be corrupted. Just reading a file straight into memory and assuming it is well-formed is dangerous!