I'm not using the format specifiers in C correctly. Here are a few lines of code:
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

int main()
{
    char dest[] = "stack";
    unsigned short val = 500;
    char c = 'a';
    char* final = (char*) malloc(strlen(dest) + 6);
    snprintf(final, strlen(dest)+6, "%c%c%hd%c%c%s", c, c, val, c, c, dest);
    printf("%s\n", final);
    return 0;
}
What I want is to copy into final:
final[0] = a random char
final[1] = a random char
final[2] and final[3] = the two bytes of the short
final[4] = another char ...
My problem is that I want to copy the two bytes of the short int into 2 bytes of the final array.
Thanks.
I'm confused - the problem is that you are saying strlen(dest)+6, which limits the length of the final string to 10 chars (plus a null terminator). If you say strlen(dest)+8 then there will be enough space for the full string.
Update
Even though a short may only be 2 bytes in size, when it is printed as text each digit takes up a byte. That means writing a short to a string can require up to 5 bytes, if the value is 10000 or above.
Now, if you write the short to the string as a hexadecimal number using the %x format specifier, it will take up no more than 4 characters, since each byte becomes two hex digits.
You need to allocate space for 13 characters - not 11. Don't forget the terminating '\0'.
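For example, a minimal corrected sketch of the question's code (assuming val stays a 3-digit value; note that %hu fits the unsigned short better than %hd):

char* final = (char*) malloc(strlen(dest) + 8); /* 2 + 3 + 2 + strlen(dest) + 1 for '\0' */
snprintf(final, strlen(dest) + 8, "%c%c%hu%c%c%s", c, c, val, c, c, dest);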
When formatted, the number (500) takes up three characters, not one. So your snprintf should give the final length as strlen(dest)+5+3. Then also fix your malloc call to match. If you want to compute the string length of the number at run time, you can format it into a scratch buffer first and take the strlen of the result (e.g. strlen(itoa(val, tmp, 10)), though itoa is non-standard). Also, don't forget the '\0' at the end: strlen does not count the terminator, so you have to add 1 for it yourself.
Simple answer: you only allocated enough space for strlen(dest) + 6 characters, when in reality you need 8 extra characters... since you have 2 chars + 3 chars in your number + 2 chars after + dest (5 chars) + the terminating null = 13 chars, but you allocated 11.
Unsigned shorts can take up to 5 characters, right? (0 - 65535)
Seems like you'd need to allocate 5 characters for your unsigned short to cover all of the values.
Which would point to using this:
char* final = (char*) malloc(strlen(dest) + 10);
You lose one byte because you think the short variable takes 2 bytes, but it takes three: one for each digit character ('5', '0', '0'). You also need a '\0' terminator (+1 byte).
==> You need strlen(dest) + 8
Use 8 instead of 6 in:
char* final = (char*) malloc(strlen(dest) + 6);
and
snprintf(final, strlen(dest)+6, "%c%c%hd%c%c%s", c, c, val, c, c, dest);
Seems like the primary misunderstanding is that a "2-byte" short can't be represented on-screen as 2 1-byte characters.
First, leave enough room:
char* final = (char*) malloc(strlen(dest) + 9);
Not every possible value of a 1-byte character is printable. If you want to display this on screen and have it be readable, you'll have to encode the 2-byte short as 4 hex digits, such as:
/* as hex, 4 zero-padded characters */
snprintf(final, strlen(dest) + 9, "%c%c%04x%c%c%s", c, c, val, c, c, dest);
If you are writing this to a file, that's OK, and you might try the following:
/* print raw bytes, upper byte, then lower byte */
snprintf(final, strlen(dest) + 9, "%c%c%c%c%c%c%s", c, c, (val >> 8) & 0xFF, val & 0xFF, c, c, dest);
But that won't make sense to a human looking at it, is sensitive to endianness, and if either byte happens to be zero the %c will embed a '\0' that truncates the string. I'd strongly recommend against it.
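If the goal really is to store the short's two raw bytes (as the question asks), memcpy is the usual tool. A minimal self-contained sketch; the byte order you observe depends on the machine's endianness:

#include <stdio.h>
#include <string.h>

int main(void)
{
    unsigned short val = 500;
    unsigned char buf[2];

    memcpy(buf, &val, sizeof val);         /* copy the raw 2 bytes of the short */
    printf("%02x %02x\n", buf[0], buf[1]); /* order is platform-dependent */
    return 0;
}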
I have a base64-encoded string:
static const unsigned char base64_test_enc[] =
"VGVzdCBzdHJpbmcgZm9yIGEgc3RhY2tvdmVyZmxvdy5jb20gcXVlc3Rpb24=";
It does not have a CRLF every 72 characters.
How to calculate a decoded message length?
Well, base64 represents 3 bytes in 4 characters... so to start with you just need to divide by 4 and multiply by 3.
You then need to account for padding:
If the text ends with "==" you need to subtract 2 bytes (as the last group of 4 characters only represents 1 byte)
If the text ends with just "=" you need to subtract 1 byte (as the last group of 4 characters represents 2 bytes)
If the text doesn't end with padding at all, you don't need to subtract anything (as the last group of 4 characters represents 3 bytes as normal)
For example, the string in the question is 60 characters ending in a single "=", so (60 / 4) * 3 - 1 = 44 decoded bytes.
Base 64 uses 4 characters per 3 bytes. If it uses padding it always has a multiple of 4 characters.
Furthermore, there are three padding possibilities:
2 characters and two padding characters (==) for one byte
3 characters and one padding character (=) for two bytes
and of course no padding characters, making 3 bytes.
So you can simply divide the number of characters by 4, then multiply by 3 and finally subtract the number of padding characters.
Possible C code could be (if I wasn't extremely rusty in C, please adjust):
#include <string.h>

size_t encoded_base64_bytes(const char *input)
{
    size_t len, padlen;
    const char *last, *first_pad;

    len = strlen(input);
    if (len == 0) return 0;
    last = input + len - 4;          /* start of the last 4-character group */
    first_pad = strchr(last, '=');   /* first padding character, if any */
    padlen = first_pad == NULL ? 0 : (size_t)(input + len - first_pad);
    return (len / 4) * 3 - padlen;
}
Note that this code assumes that the input is valid base 64.
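A quick usage sketch (assuming the function above is in scope; per the arithmetic earlier, it should print 44 for the string in the question):

#include <stdio.h>

int main(void)
{
    static const char base64_test_enc[] =
        "VGVzdCBzdHJpbmcgZm9yIGEgc3RhY2tvdmVyZmxvdy5jb20gcXVlc3Rpb24=";
    printf("%zu\n", encoded_base64_bytes(base64_test_enc));
    return 0;
}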
An observant reader will notice that there are spare bits, usually set to 0, in the final characters when padding is used.
I am adding values into the combo box as a string. Below is my code.
Platform Windows XP and I am using Microsoft Visual Studio 2003
language C++
error encountered -> "Run-Time Check Failure #2 - Stack around the variable 'buffer' was corrupted."
If I increase the size of the buffer to say 4 and above then I won't get this error.
My question is not related to how to fix that error, but I am wondering why I got this error if buffer size = 2.
According to my logic I gave buffer size = 2, as char[0] will store the value and char[1] the null terminating character.
Now since a char can store values from 0 to 255, I thought this should be OK, as my inserted values are from 1 to 63 and then from 183 to 200.
CComboBox m_select_combo;
const unsigned int max_num_of_values = 63;
m_select_combo.AddString( "ALL" );
for( unsigned int i = 1; i <= max_num_of_values ; ++i )
{
char buffer[2];
std::string prn_select_c = itoa( i, buffer, 10 );
m_select_combo.AddString( prn_select_c.c_str() );
}
const unsigned int max_num_of_high_sats = 202 ;
for( unsigned int i = 183; i <= max_num_of_high_sats ; ++i )
{
char buffer[2];
std::string prn_select_c = itoa( i, buffer, 10 );
m_select_combo.AddString( prn_select_c.c_str() );
}
Could you guys please give me an idea as to what I'm not understanding?
itoa() zero-terminates its output, so when you call itoa(63, char[2], 10) it writes three characters: '6', '3' and the terminating '\0'. But your buffer is only two characters long.
The itoa() function is best avoided in favour of snprintf() or boost::lexical_cast<>().
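For example, a minimal sketch of the snprintf() approach (add_value is a hypothetical helper; the MFC call from the question is shown as a comment):

#include <stdio.h>

/* Hypothetical helper: formats i safely; the buffer is sized for any
   32-bit unsigned int (10 digits plus the terminating '\0'). */
void add_value(unsigned int i)
{
    char buffer[11];
    snprintf(buffer, sizeof(buffer), "%u", i);
    /* m_select_combo.AddString(buffer); */
}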
You should read the documentation for itoa.
Consider the following loop:
for( unsigned int i = 183; i <= max_num_of_high_sats ; ++i )
{
char buffer[2];
std::string prn_select_c = itoa( i, buffer, 10 );
m_select_combo.AddString( prn_select_c.c_str() );
}
The first iteration converts the integer 183 to the 3 character string "183", plus a terminating null character. That's 4 bytes, which you are trying to cram into a two byte array. The docs tell you specifically to make sure your buffer is large enough to hold any value; in this case it should be at least the number of digits in max_num_of_high_sats long, plus one for the terminating null.
You might as well make it large enough to hold the maximum value you can store in an unsigned int, which would be 11 (e.g. 10 digits for 4294967295 plus a terminating null).
The itoa function is used to convert an int to a C-style string, in the base given by the 3rd parameter.
As an example, it works just like printing the int 63 with printf: you need two ASCII bytes, one to store the character '6' and the other to store the character '3', and the third must be the null terminator. In your case the maximum value has three digits, so you need 4 bytes in the string.
You are converting an integer to ASCII; that is what itoa does. If you have a number like 183, that is four chars as a string: '1', '8', '3', '\0'.
Each character takes one byte; for example the character '1' is the value 0x31 in ASCII.
I have the following program, which is crashing. Does anybody know why?
#include <stdio.h>

/* writes a, b, c into dst
** dst must have enough space for the result
** assumes all 3 numbers are positive */
void concat3(char *dst, int a, int b, int c) {
sprintf(dst, "%08x%08x%08x", a, b, c);
}
/* usage */
int main(void) {
printf("The size of int is %d \n", sizeof(int));
char n3[3 * sizeof(int) + 1];
concat3(n3, 0xDEADFACE, 0xF00BA4, 42);
printf("result is 0x%s\n", n3);
return 0;
}
You're confusing the size of the binary data (which is what sizeof gives you) with the size of a textual representation in hexadecimal, which is what you're trying to store.
On most current systems, sizeof(int) evaluates to 4. Your buffer n3 will therefore be capable of storing 13 characters (3 * 4 + 1 == 13).
Then, you format three integers into 8-character hex format, which will require 3 * 8 + 1 == 25 characters to store. The resulting buffer overflow causes the crash.
It should be obvious that the size of the data type int doesn't matter, when you're formatting it as text (and specifying the field width yourself!).
Try 3*2*sizeof(int)+1, where 2*sizeof(int) is the number of characters needed to print each byte worth of an int in hex. Of course, since you're using that %08x format and expecting fixed-width results, you really should be using uint32_t. By the way, your program is also incorrectly passing 0xDEADFACE as int, which it probably doesn't fit in, and thus entering the realm of implementation-defined conversion-to-signed-type.
Here is a version with those corrections:
#include <inttypes.h>
#include <stdio.h>
/* writes a, b, c into dst
** dst must have enough space for the result
** assumes all 3 numbers are positive */
void concat3(char *dst, uint32_t a, uint32_t b, uint32_t c) {
sprintf(dst, "%08"PRIX32"%08"PRIX32"%08"PRIX32, a, b, c);
}
/* usage */
int main(void) {
printf("The size of int is %d \n", sizeof(int));
char n3[25];
concat3(n3, 0xDEADFACE, 0xF00BA4, 42);
printf("result is 0x%s\n", n3);
return 0;
}
I don't really understand what sizeof has to do with your code. In concat3, you're attempting to print a text representation of each provided integer as an 8-char hexadecimal string: the required buffer size should thus be 8 * 3 + 1 = 25, and sizeof(int) has nothing to do with it.
You seem to be mixing up the size an int occupies in memory with the length of its textual representation (which in your case is easily determined, as it's fixed by your sprintf format string).
On a side note: sprintf is a truly unsafe function that you should consider deprecated.
It is crashing because sizeof(int) is (most likely on your system) 4, meaning that n3 is 13 bytes long. You then try to write 8 + 8 + 8 = 24 characters to it, plus the terminating null.
Use snprintf instead of sprintf. Think of the kittens!
But seriously, you should not be creating interfaces with buffer pointers but no length information. concat3 should have a max length parameter. Then use snprintf inside. The length to give to concat3 is sizeof(n3).
It still won't work, but it won't crash either. The other answers explain how to get the functionality right.
(Oh, and don't use gets() either. Just because it is in the standard library doesn't mean it is good code.)
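A minimal sketch of that length-aware interface (same formatting as the question; the parameters are made unsigned to sidestep the signed-conversion issue noted above, and at worst the output is truncated instead of overflowing):

#include <stdio.h>

void concat3(char *dst, size_t dstlen, unsigned a, unsigned b, unsigned c) {
    snprintf(dst, dstlen, "%08x%08x%08x", a, b, c); /* never writes past dstlen */
}

/* usage: char n3[25]; concat3(n3, sizeof(n3), 0xDEADFACE, 0xF00BA4, 42); */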
#define STR "test1"
Why does this take 6 bytes?
sizeof(STR) = 6
There is a trailing '\0' at the end.
a #define just does a text replacement before compiling.
#define STR "test1"
sizeof(STR);
is actually seen by the compiler as
sizeof("test1");
Now why is that 6 and not 5? Because there's a null terminator at the end of the string.
It has nothing to do with #define. A character array would be the same size:
const char str[] = { "test1" };
sizeof (str) == 6
The reason this string is 6 bytes long is that strings in C have a terminating NUL character to mark the end.
Strings in C are arrays of chars, with a null terminator i.e. they end with the \0. The common alternative is Pascal-style strings, where the string stores the array of chars without the null terminator, and stores the length of the string somewhere instead.
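A small sketch showing the difference (sizeof counts the terminator, strlen stops at it):

#include <stdio.h>
#include <string.h>

int main(void)
{
    const char str[] = "test1";
    printf("sizeof: %zu\n", sizeof str);   /* 6 - the array includes '\0' */
    printf("strlen: %zu\n", strlen(str));  /* 5 - counts chars before '\0' */
    return 0;
}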
What the others said ... BUT
In C, preprocessing tokens take no space; it depends on how you use them:
#define STR "test1"
char x[] = STR; /* 6 bytes */
char *y = STR; /* sizeof (char*) bytes (plus possibly 6 bytes) */
int ch = STR[3]; /* 1 byte (or sizeof (int), depending on how you look at it) */
if (ch == STR[1]) /* 1 byte (or sizeof (int) or no bytes or ...) */
printf("==>" STR "<==") /* 5 bytes ??? */
Why does this take 6 bytes?
Actually, it will take (6 bytes × the number of times you use it), because it's a preprocessor macro.
Try const char *STR = "test1" instead.
The latest C compiler has a feature to guess if the person writing the program is in a learning phase, and gives answers which make them search wider and deeper, thus enriching their knowledge.
After programming for some time, depending on your learning, you might see the value go down to 5. ;-)
JK... as someone else said, it's the symbolic "nothing" at the end which, ironically, takes a byte.
I have a funny problem using this function.
I use it as follow:
int nSeq = 1;
char cBuf[8];
int j = sprintf_s(cBuf, sizeof(cBuf), "%08d", nSeq);
And every time I get an exception: "buffer too small".
It works when I change the second argument of the function to sizeof(cBuf) + 1.
Why do I need to add one if I only want to copy 8 bytes and I have an array that holds 8 bytes?
Your buffer contains 8 places; your string contains 8 characters plus a null character to close it.
Your string requires 8 bytes of data ("00000001", due to %08d) plus the terminating '\0'.
So you have to pass 9 as the size.
All sprintf functions append a null to terminate the string. So in effect your string is 9 characters long: 8 bytes of text and the ending zero.
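A minimal sketch of the proper fix: size the buffer itself for 9 characters rather than overstating its size to sprintf_s.

int nSeq = 1;
char cBuf[9];                                        /* 8 digits + '\0' */
int j = sprintf_s(cBuf, sizeof(cBuf), "%08d", nSeq); /* now fits */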