I have a funny problem using this function.
I use it as follows:
int nSeq = 1;
char cBuf[8];
int j = sprintf_s(cBuf, sizeof(cBuf), "%08d", nSeq);
And every time I get an exception: "buffer too small".
The exception goes away when I change the second argument of the function to sizeof(cBuf) + 1.
Why do I need to add one if I only want to copy 8 bytes and I have an array that contains 8 bytes?
Your buffer contains 8 places. Your string contains 8 characters plus a null character to close it.
Your string requires 8 bytes of data (00000001, because of %08d) plus the terminating '\0'.
So you have to size it as 9.
All sprintf functions add a null to terminate the string, so in effect your string is 9 characters long: 8 bytes of text and the ending zero.
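A minimal corrected sketch (assuming the Microsoft CRT's sprintf_s; note that the array itself has to grow, not just the size argument):

#include <stdio.h>

int main(void)
{
    int nSeq = 1;
    char cBuf[9];                                     // 8 digits plus the terminating '\0'
    int j = sprintf_s(cBuf, sizeof(cBuf), "%08d", nSeq);
    printf("%s (%d characters)\n", cBuf, j);          // prints: 00000001 (8 characters)
    return 0;
}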
I've seen that the program below seems to take only 7 bits of memory to store a character, but everywhere I've studied says that a char occupies 1 byte of memory, i.e. 8 bits.
Does a single character require 8 bits or 7 bits?
If it requires 8 bits, what will be stored in the other bit?
#include <iostream>
using namespace std;

int main()
{
    char ch = 'a';
    int val = ch;
    while (val > 0)
    {
        (val % 2) ? cout << 1 << " " : cout << 0 << " ";
        val /= 2;
    }
    return 0;
}
Output:
1 0 0 0 0 1 1
The code below shows the gap in memory between the characters, i.e. 7:
9e9 <-> 9f0 <-> ... <-> a13
#include <iostream>
using namespace std;

int main()
{
    char arr[] = {'k','r','i','s','h','n','a'};
    for (int i = 0; i < 7; i++)
        cout << &arr + i << endl;
    return 0;
}
Output:
0x7fff999019e9
0x7fff999019f0
0x7fff999019f7
0x7fff999019fe
0x7fff99901a05
0x7fff99901a0c
0x7fff99901a13
Your first code sample doesn't print leading zero bits. As ASCII characters all have the upper bit set to zero, you'll get at most seven bits printed when using ASCII characters. Extended ASCII characters and UTF-8 use the upper bit for characters outside the basic ASCII character set.
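To see all 8 bits, including the leading zero, you can walk from the most significant bit down (a minimal sketch of the first program, assuming an 8-bit char):

#include <iostream>
using namespace std;

int main()
{
    char ch = 'a';
    for (int bit = 7; bit >= 0; --bit)       // walk from the top bit down
        cout << ((ch >> bit) & 1) << " ";    // the leading zero is now printed
    cout << endl;                            // output: 0 1 1 0 0 0 0 1
    return 0;
}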
Your second example is actually printing addresses seven bytes apart, making it look as if each character were seven bytes long, which is obviously incorrect. If you change the size of the array so it is not seven characters long, you'll see different results.
&arr + i is equivalent to (&arr) + i; since &arr is a pointer to char[7], which has a size of 7 bytes, the + i adds 7 * i bytes to the pointer. (&arr) + 1 points one past the end of the array, so if you try printing the values these pointers point to you'll get junk or a crash: **(&arr + i).
Your code should be static_cast<void*>(&arr[i]); you'll then see the pointer going up by one for each iteration. The cast to void* is necessary to stop the standard library from trying to print the pointer as a null-terminated string.
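Putting that together, a corrected sketch of the second program:

#include <iostream>
using namespace std;

int main()
{
    char arr[] = {'k','r','i','s','h','n','a'};
    for (int i = 0; i < 7; i++)
        cout << static_cast<void*>(&arr[i]) << endl;  // consecutive addresses, one byte apart
    return 0;
}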
It has nothing to do with the space assigned to a char. You are simply converting the ASCII representation of the char into binary.
ASCII is a 7-bit character set, normally represented in C by an 8-bit char. If the highest bit of an 8-bit byte is set, it is not an ASCII character. The eighth bit was historically used for parity when communicating between computers using different encodings.
ASCII stands for American Standard Code for Information Interchange, with the emphasis on American. The character set could not represent things like accented Latin letters (characters with umlauts, for example) or Arabic.
Various systems "extended" the ASCII set, using the extra 128 values that became available by using all 8 bits, which caused compatibility problems. Eventually Unicode came along, which can represent every character, but 8 bits remained the standard size for char.
I have a Base64-encoded string:
static const unsigned char base64_test_enc[] =
"VGVzdCBzdHJpbmcgZm9yIGEgc3RhY2tvdmVyZmxvdy5jb20gcXVlc3Rpb24=";
It does not have CRLF line breaks every 72 characters.
How do I calculate the decoded message length?
Well, base64 represents 3 bytes in 4 characters... so to start with you just need to divide by 4 and multiply by 3.
You then need to account for padding:
If the text ends with "==" you need to subtract 2 bytes (as the last group of 4 characters only represents 1 byte)
If the text ends with just "=" you need to subtract 1 byte (as the last group of 4 characters represents 2 bytes)
If the text doesn't end with padding at all, you don't need to subtract anything (as the last group of 4 characters represents 3 bytes as normal)
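For the string in the question, for example: it is 60 characters long and ends in a single "=", so the decoded length is (60 / 4) * 3 - 1 = 44 bytes.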
Base 64 uses 4 characters per 3 bytes. If it uses padding it always has a multiple of 4 characters.
Furthermore, there are three padding possibilities:
two characters and two padding characters (==) for one encoded byte
three characters and one padding character (=) for two encoded bytes
and of course no padding characters, making three bytes.
So you can simply divide the number of characters by 4, then multiply by 3 and finally subtract the number of padding characters.
Possible C code (I'm extremely rusty in C, so please adjust):
#include <stddef.h>   /* size_t */
#include <string.h>   /* strlen, strchr */

size_t encoded_base64_bytes(const char *input)
{
    size_t len, padlen;
    const char *last, *first_pad;

    len = strlen(input);
    if (len == 0) return 0;
    last = input + len - 4;                 /* start of the final 4-character group */
    first_pad = strchr(last, '=');          /* first padding character, if any */
    padlen = first_pad == NULL ? 0 : (size_t)(input + len - first_pad);
    return (len / 4) * 3 - padlen;
}
Note that this code assumes that the input is valid base 64.
A good observer will notice that there are spare bits, usually set to 0 in the final characters if padding is used.
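As a quick check, a sketch that calls the function above on the string from the question:

#include <stdio.h>

int main(void)
{
    static const char enc[] =
        "VGVzdCBzdHJpbmcgZm9yIGEgc3RhY2tvdmVyZmxvdy5jb20gcXVlc3Rpb24=";
    printf("%zu\n", encoded_base64_bytes(enc));  /* 60 chars, one '=' -> prints 44 */
    return 0;
}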
I am adding values into the combo box as strings. Below is my code.
Platform: Windows XP, using Microsoft Visual Studio 2003.
Language: C++.
Error encountered: "Run-Time Check Failure #2 - Stack around the variable 'buffer' was corrupted."
If I increase the size of the buffer to 4 or above, I don't get this error.
My question is not about how to fix that error; I am wondering why I get it when the buffer size is 2.
According to my logic, buffer size 2 should be fine: char[0] will store the value and char[1] the null terminating character.
Since a char can store values from 0 to 255, I thought this should be OK, as my inserted values are from 1 to 63 and then from 183 to 200.
CComboBox m_select_combo;
const unsigned int max_num_of_values = 63;

m_select_combo.AddString( "ALL" );

for( unsigned int i = 1; i <= max_num_of_values; ++i )
{
    char buffer[2];
    std::string prn_select_c = itoa( i, buffer, 10 );
    m_select_combo.AddString( prn_select_c.c_str() );
}

const unsigned int max_num_of_high_sats = 202;

for( unsigned int i = 183; i <= max_num_of_high_sats; ++i )
{
    char buffer[2];
    std::string prn_select_c = itoa( i, buffer, 10 );
    m_select_combo.AddString( prn_select_c.c_str() );
}
Could you guys please give me an idea as to what I'm not understanding?
itoa() zero-terminates its output, so when you call itoa(63, char[2], 10) it writes three characters: 6, 3 and the terminating \0. But your buffer is only two characters long.
The itoa() function is best avoided in favour of snprintf() or boost::lexical_cast<>().
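For example, the first loop could be written with snprintf instead (a sketch, using m_select_combo and max_num_of_values from the question; on older MSVC you may need _snprintf):

char buffer[11];                                // 10 digits of 4294967295 plus '\0'
for (unsigned int i = 1; i <= max_num_of_values; ++i)
{
    snprintf(buffer, sizeof(buffer), "%u", i);  // always null-terminates, never overruns
    m_select_combo.AddString(buffer);
}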
You should read the documentation for itoa.
Consider the following loop:
for( unsigned int i = 183; i <= max_num_of_high_sats; ++i )
{
    char buffer[2];
    std::string prn_select_c = itoa( i, buffer, 10 );
    m_select_combo.AddString( prn_select_c.c_str() );
}
The first iteration converts the integer 183 to the 3-character string "183", plus a terminating null character. That's 4 bytes, which you are trying to cram into a two-byte array. The docs tell you specifically to make sure your buffer is large enough to hold any value; in this case it should be at least the number of digits in max_num_of_high_sats, plus one for the terminating null.
You might as well make it large enough to hold the maximum value you can store in an unsigned int, which would be 11 (e.g. 10 digits for 4294967295 plus a terminating null).
The itoa function is used to convert an int to a C-style string in the base given by the 3rd parameter.
As an example, it is just like printing the int 63 with printf: you need two ASCII bytes, one to store the character '6' and the other to store the character '3', and the 3rd must be the terminating NULL. In your case the largest value has three digits, so you need 4 bytes for the string.
You are converting an integer to ASCII, that is what itoa does. If you have a number like 183 that is four chars as a string, '1', '8', '3', '\0'.
Each character takes one byte, for example character '1' is the value 0x31 in ASCII.
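A quick way to see those bytes (a sketch using snprintf, which the first answer recommends over itoa):

#include <stdio.h>

int main(void)
{
    char buffer[4];                              // "183" is 3 digits plus '\0'
    snprintf(buffer, sizeof(buffer), "%d", 183);
    for (int i = 0; i < 4; ++i)
        printf("0x%02x ", buffer[i]);            // prints: 0x31 0x38 0x33 0x00
    return 0;
}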
#define STR "test1"
Why does this take 6 bytes?
sizeof(STR) = 6
There is a trailing '\0' at the end.
A #define just does a text replacement before compiling.
#define STR "test1"
sizeof(STR);
is actually seen by the compiler as
sizeof("test1");
Now why is that 6 and not 5? Because there's a null terminator at the end of the string.
It has nothing to do with #define. A character array would be the same size:
const char str[] = { "test1" };
sizeof (str) == 6
The reason this string is 6 bytes long is that strings in C have a terminating NUL character to mark the end.
Strings in C are arrays of chars with a null terminator, i.e. they end with '\0'. The common alternative is Pascal-style strings, where the string stores the array of chars without the null terminator and stores the length of the string somewhere instead.
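A quick check of both views (a sketch):

#include <stdio.h>
#include <string.h>

int main(void)
{
    const char str[] = "test1";
    printf("sizeof: %zu\n", sizeof str);   /* 6: five characters plus the '\0' */
    printf("strlen: %zu\n", strlen(str));  /* 5: the terminator is not counted */
    return 0;
}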
What the others said ... BUT
In C, preprocessing tokens take no space by themselves; it depends on how you use them:
#define STR "test1"

char x[] = STR;    /* 6 bytes */
char *y = STR;     /* sizeof (char*) bytes (plus possibly 6 bytes) */
int ch = STR[3];   /* 1 byte (or sizeof (int), depending on how you look at it) */
if (ch == STR[1])  /* 1 byte (or sizeof (int) or no bytes or ...) */
    printf("==>" STR "<==");  /* 5 bytes ??? */
Why does this take 6 bytes?
Actually, it will take (6 bytes × the number of times you use it), because it's a preprocessor macro.
Try const char *STR = "test1" instead.
The latest C compilers have a feature that guesses whether the person writing the program is in a learning phase, and gives answers that make them search wider and deeper, and thus enrich their knowledge.
After programming for some time, depending on your learning, you might see the value go down to 5. ;-)
JK.. as someone else said, there is symbolically nothing at the end, which ironically takes a byte.
I'm not using the format specifiers in C correctly. A few lines of code:
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

int main()
{
    char dest[] = "stack";
    unsigned short val = 500;
    char c = 'a';
    char* final = (char*) malloc(strlen(dest) + 6);
    snprintf(final, strlen(dest) + 6, "%c%c%hd%c%c%s", c, c, val, c, c, dest);
    printf("%s\n", final);
    return 0;
}
What I want is:
final[0] = a random char
final[1] = a random char
final[2] and final[3] = the short value
final[4] = another char ...
My problem is that I want to copy the two bytes of the short int into 2 bytes of the final array.
Thanks.
I'm confused; the problem is that you are saying strlen(dest)+6, which limits the length of the final string to 10 chars (plus a null terminator). If you say strlen(dest)+8 then there will be enough space for the full string.
Update
Even though a short may only be 2 bytes in size, when it is printed as a string each character will take up a byte. So that means it can require up to 5 bytes of space to write a short to a string, if you are writing a number above 10000.
Now, if you write the short to the string as a hexadecimal number using the %x format specifier, it will take up no more than 4 characters (65535 is 0xFFFF).
You need to allocate space for 13 characters, not 11. Don't forget the terminating NULL.
When formatted, the number (500) takes up three characters, not one. So your snprintf should give the final length as strlen(dest)+5+3; then also fix your malloc call to match. If you want to compute the string length of the number, do it with a call like strlen(itoa(val, tmp, 10)), where tmp is a scratch buffer large enough for the digits. Also, don't forget the NUL at the end of dest; strlen does not count it.
Simple answer: you only allocated enough space for strlen(dest) + 6 characters, when in reality you need 8 extra: 2 chars + 3 chars in your number + 2 chars after + dest (5 chars) + the terminating null = 13 chars, while you allocated 11.
Unsigned shorts can take up to 5 characters, right? (0 - 65535)
Seems like you'd need to allocate 5 characters for your unsigned short to cover all of the values.
Which would point to using this:
char* final = (char*) malloc(strlen(dest) + 10);
You lose a byte because you think the short variable takes 2 bytes as text, but it takes three: one for each digit character ('5', '0', '0'). You also need a '\0' terminator (+1 byte).
==> You need strlen(dest) + 8
Use 8 instead of 6 in:
char* final = (char*) malloc(strlen(dest) + 6);
and
snprintf(final, strlen(dest)+6, "%c%c%hd%c%c%s", c, c, val, c, c, dest);
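A minimal corrected sketch of the whole program with both sizes fixed, per the answers above:

#include <stdio.h>
#include <stdlib.h>
#include <string.h>

int main()
{
    char dest[] = "stack";
    unsigned short val = 500;
    char c = 'a';
    size_t size = strlen(dest) + 8;   /* "aa" + "500" + "aa" + "stack" + '\0' = 13 */
    char* final = (char*) malloc(size);
    snprintf(final, size, "%c%c%hd%c%c%s", c, c, val, c, c, dest);
    printf("%s\n", final);            /* prints: aa500aastack */
    free(final);
    return 0;
}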
Seems like the primary misunderstanding is that a "2-byte" short can't be represented on-screen as 2 1-byte characters.
First, leave enough room:
char* final = (char*) malloc(strlen(dest) + 9);
The entire range of possible values for a 1-byte character is not printable. If you want to display this on screen and have it be readable, you'll have to encode the 2-byte short as 4 hex characters, such as:
/* as hex, 4 characters */
snprintf(final, strlen(dest) + 9, "%c%c%04x%c%c%s", c, c, val, c, c, dest);
If you are writing this to a file, that's OK, and you might try the following:
/* print raw bytes, upper byte then lower byte */
snprintf(final, strlen(dest) + 9, "%c%c%c%c%c%c%s", c, c, (val >> 8) & 0xFF, val & 0xFF, c, c, dest);
But that won't make sense to a human looking at it, and is sensitive to endianness. I'd strongly recommend against it.