I defined a struct of bytes, with a size of 3 bytes (1 byte packetID and 2 bytes packetSize). I checked the size with the sizeof operator, and it works well:
#pragma pack(1)
typedef struct ENVIRONMENT_STRUCT {
    unsigned char packetID[1];
    unsigned char packetSize[2];
} ENVIRONMENT_STRUCT;
I created a variable and reserved memory like this:
ENVIRONMENT_STRUCT * environment_struct = new ENVIRONMENT_STRUCT();
For now I want to initialize environment_struct.
The problem is that I am trying to initialize this struct member by member, like this:
*environment_struct->packetSize = 100;
But when I checked this value, using:
std::cout << "Packet Size: " << environment_struct->packetSize << endl;
Result: Packet Size: d
Expected result: Packet Size: 100
If I am going to work with numbers, should I define the struct using the <cstdint> library? For example, uint8_t and similar types.
When you do
ENVIRONMENT_STRUCT * environment_struct = new ENVIRONMENT_STRUCT();
you initialize packetSize to be {0, 0}. Then
*environment_struct->packetSize = 100;
turns the array into {100, 0}. Since the array is a character array, when you send it to cout with
std::cout << "Packet Size: " << environment_struct->packetSize << endl;
it treats it as a C string and prints the string contents. Since you see d, your system uses ASCII: the character 'd' has the integer value 100. To see the 100 you need to cast it to an int like
std::cout << "Packet Size: " << static_cast<int>(*environment_struct->packetSize) << endl;
Do note that since packetSize is an array of two chars, you can't assign a single value that takes up the whole space with one assignment. If you want that, then you need to use fixed-width types like
#include <cstdint>
#include <iostream>

struct ENVIRONMENT_STRUCT {
    uint8_t packetID;    // unsigned integer exactly 8 bits wide; a compile error if the type does not exist
    uint16_t packetSize; // unsigned integer exactly 16 bits wide; a compile error if the type does not exist
};

int main()
{
    ENVIRONMENT_STRUCT * environment_struct = new ENVIRONMENT_STRUCT();
    environment_struct->packetSize = 100;
    std::cout << "Packet Size: " << environment_struct->packetSize << std::endl;
    delete environment_struct;
}
Let's first consider what *environment_struct->packetSize = 100; does. It sets the first byte of ENVIRONMENT_STRUCT::packetSize to 100. A more conventional syntax for this is environment_struct->packetSize[0] = 100;.
There's really no way to initialize the struct so that the expression std::cout << environment_struct->packetSize results in the output 100. Let us consider what it does. environment_struct->packetSize is an array, which in this case decays to a pointer to its first element. Character pointers inserted into character streams are interpreted as null-terminated character strings. Luckily, you had value-initialized the second byte of environment_struct->packetSize, so your array is indeed null-terminated. The value of the first byte is interpreted as an encoded character. In your system's encoding, the character d happens to be encoded as the value 100.
If you wish to print the numeric value of the first byte of environment_struct->packetSize, which you had set to 100, you can use:
std::cout << "Packet Size: " << (int)environment_struct->packetSize[0] << endl;
You get this result because you are trying to print a character symbol, not an integer.
To fix it, cast the value, or declare the member as an integer type, depending on your needs.
Cast example:
std::cout << "Packet Size: " << static_cast<int>(*environment_struct->packetSize) << std::endl;
As packetSize is declared as a char type, it's being output as a character. (The ASCII code of the character 'd' is 100.)
Try casting it to an integer-type:
std::cout << "Packet Size: " << (int)environment_struct->packetSize[0] << endl;
Alternatively, since you appear to want to store the number in a 2-byte type, you could avoid the cast entirely and declare packetSize as unsigned short, which cout will print as a number.
Related
A char stores a numeric value (0 to 255 for an unsigned char). But there seems to also be an implication that this type should be printed as a letter rather than a number by default.
This code produces 34 (0x22):
int Bits = 0xE250;
signed int Test = ((Bits & 0x3F00) >> 8);
std::cout << "Test: " << Test << std::endl; // 34
But I don't need Test to be 4 bytes long. One byte is enough. But if I do this:
int Bits = 0xE250;
signed char Test = ((Bits & 0x3F00) >> 8);
std::cout << "Test: " << Test <<std::endl; // "
I get " (a double quote symbol). Because char doesn't just make it an 8 bit variable, it also says, "this number represents a character".
Is there some way to specify a variable that is 8 bits long, like char, but also says, "this is meant as a number"?
I know I can cast or convert the char, but I'd like to just use a number type to begin with. Is there a better choice? Is it better to use short int even though it's twice the size needed?
Cast your character variable to int before printing:
signed char Test = ((Bits & 0x3F00) >> 8);
std::cout << "Test: " << (int)Test << std::endl;
I have the following declaration
char c[] = "Hello";
string c2 = "Hello";
I want to compare a) how many bytes of memory both need and b) the character lengths.
I know the character arrays add a null terminator at the end of the string, whereas the string data type doesn't.
Using
cout << "The sizeof of c: " << sizeof(c);
cout << "The sizeof of c2: " << sizeof(c2);
returns 6 and 4, and I'm not sure why it's 4 and not 5?
also how does length function compare here ...
When I use the following code
cout << "The sizeof of c: " << sizeof(c);
cout <<"The sizeof of c2: " << c2.length();
I get 6 and 5 ... but is it comparing the lengths the same way?
Thanks.
a) how many bytes of memory both need, and
You correctly used the sizeof operator to determine how many bytes the character array occupies.
It is
sizeof( c )
As for the object of type std::string, it occupies two extents of memory. The first one holds the object itself and the second one holds the string the object manages.
So
sizeof( c2 )
will give you the size of the memory occupied by the object.
c2.capacity()
will give you the size of the buffer the object allocated to store the string, possibly with room for additional characters to be filled in the future.
When I use the following code cout << "The sizeof of c: " <<
sizeof(c); cout <<"The sizeof of c2: " << c2.length();
I get 6 and 5
If you want to compare the strings themselves, without the terminating zero that the character array has, then you should write
cout << "The length of c: " << std::strlen(c);
cout <<"The length of c2: " << c2.length();
and you will get result 5 and 5.
You could make the following experiment with objects of type std::string.
std::string s;
std::cout << sizeof( s ) << '\t' << s.capacity() << '\t' << s.length() << std::endl;
std::string s1( 1, 'A' );
std::cout << sizeof( s1 ) << '\t' << s1.capacity() << '\t' << s1.length() << std::endl;
std::string s2( 2, 'A' );
std::cout << sizeof( s2 ) << '\t' << s2.capacity() << '\t' << s2.length() << std::endl;
std::string s3( 16, 'A' );
std::cout << sizeof( s3 ) << '\t' << s3.capacity() << '\t' << s3.length() << std::endl;
sizeof(c) is the size of the array, which contains the five characters in the literal you initialise it with, plus a zero-valued terminator at the end, giving a total of six bytes.
sizeof(c2) is the size of the string class, which doesn't tell you anything particularly useful. The class manages dynamically allocated memory containing the string's characters; that memory is not part of the string object itself.
c2.length() is the number of characters in the string managed by c2; five characters.
a) how many bytes of memory both need, and b) the character lengths
variable 'c' uses 6 bytes on stack (the 5 letters and the null terminator)
sizeof(c) = 6, strlen(c) = 5
Total bytes of memory needed: 6
if 'c' had 1000 chars,
sizeof(c) = 1001, strlen(c) = 1000
Total bytes of memory needed: 1001
variable 'c2' uses 4 bytes on stack (I suspect a pointer, but have not confirmed),
and at least 5 bytes somewhere else (I think heap).
sizeof(c2) = 4, c2.size() = 5, strlen(c2.c_str()) = 5
Total bytes of memory needed: 9 or more (4 + 5+)
if 'c2' had 1000 chars, i.e. c2.size() == 1000
4 bytes on the stack, and
at least 1000 bytes somewhere else (depending on implementation, probably a few more)
Total bytes of memory needed: 1004+
NOTE: std::string is a container. Such values are not specified by the standard
and should be considered implementation-dependent.
I guess the size of c[] includes the terminating null character, thus 5 + 1 = 6 bytes.
sizeof on the string object returns 4 bytes, which is probably the size of an internal pointer to the string data (32 bits).
In the last case you're using length(), which counts the number of characters.
Well, the size of c[] = "Hello" is 6 because the char array needs to allocate one more byte of
memory for the null \0 character.
The length() function returns the number of characters in the string. It does not count the null \0 character.
std::string, like std::unique_ptr, std::shared_ptr, and std::vector, is a smart pointer plus some extra manipulation member functions.
When you pass one to sizeof, you're measuring the size of the smart-pointer part, sizeof(std::string), not the content it manages.
The author of this code states that (long)tab returns the address of tab. Is it true? If yes, why is it so?
char tab[] = "PJC";
cout << " tab = " << tab << ", address: " << (long)tab << "\n" << endl;
Yes, it's true. In most expressions, the name of a raw array in C/C++ decays to a pointer to its first element. So you can write:
char tab[] = "PJC";
char c = *(tab + 1); // c == 'J'
As a pointer is essentially an integer value representing an address in memory, casting the pointer to long will print the address value.
You must be sure the integer type can hold all pointer values. Pointers typically match the word size, so on a 32-bit CPU a pointer is 4 bytes, while on a 64-bit CPU it is 8 bytes and you'll need a 64-bit integer to avoid overflow; which exact type that is depends on the system (it may be long long). You can use intptr_t (thanks #Avt) to store pointer values.
Typecasting a variable changes its interpretation, but the actual value remains the same. If you were to print the value with the format specifier %x, you'll always get the same result; which typecast you use won't matter.
In this case, tab decays to a char*, which is nothing but the "address" of the location.
You should cast to void* to print the address. Run the following to check:
char tab[] = "PJC";
cout << " tab = " << tab << ", address1: " << (void*)tab << ", address2: " << (long)tab << "\n" << endl;
But remember that the result depends on the architecture!
Why does the sizeof operator not return the same size when it's used on the struct itself as when it's used on a pointer to it?
I need the cast because of a Winsock program that I'm working on.
Thanks for any help, true.
#include <iostream>
#include <string>
using namespace std;
struct stringstruct
{
string s1;
string s2;
};
int main()
{
stringstruct ss = {"123","abc"};
char *NX = (char*)&ss;
cout << sizeof(NX) << endl << sizeof(*NX) << endl;
cout << sizeof(&ss) << endl << sizeof(ss) << endl;
getchar();
return 0;
}
the example above outputs
4
1
4
64
sizeof will tell you the size of the given expression's type. In both sizeof(NX) and sizeof(&ss), the result is 4 because pointers on your machine take up 4 bytes. For sizeof(*NX), you are dereferencing a char*, which gives you a char, and a char always takes up 1 byte, so you get the output 1. When you do sizeof(ss), ss is a stringstruct, so you get the size of a stringstruct, which is apparently 64 bytes on your implementation.
stringstruct ss = {"123","abc"};
char *NX = (char*)&ss;
cout << sizeof(NX) << endl << sizeof(*NX) << endl;
cout << sizeof(&ss) << endl << sizeof(ss) << endl;
I'm pretty sure these casts are meaningless for what you want. NX points at the beginning of your struct. Inside the struct are two objects of type string, which in turn hold pointers to the data they were initialized with, "123" and "abc" respectively. sizeof(*NX) is just that: the size of a char. sizeof(NX) is indeed the size of a pointer. sizeof(ss) is the size of your two string members (plus any padding added by the compiler), and sizeof(&ss) is the size of a pointer to a stringstruct.
Now, I expect what you REALLY want is a way to send your data, "123" and "abc", as two separate strings over a network. None of the above will help you do that, since even if sizeof(ss) gives you the size of the data structure you want to send, the string contents are not stored within that structure [1]. What you really need is something called serialization: writing out your strings as separate elements as text.
Something like this would work:
struct stringstruct {
    string s1;
    string s2;
    string to_string();
};

string stringstruct::to_string()
{
    string res = s1 + " " + s2;
    return res;
}
Then use to_string like this:
string temp = ss.to_string();
const char *to_send = temp.c_str();
int send_len = temp.length();
... send the string `to_send` with number of bytes `send_len`.
[1] There is an optimization where std::string actually stores short strings within the class itself. But given a sufficiently long string, it won't do that.
A pointer has size 4 (your system appears to be 32-bit) no matter what it points to. sizeof applied to the object itself, on the other hand, returns the real number of bytes that an object of that structure takes.
Please see the code snippet below :
#include <iostream>
using namespace std;
int main()
{
uint32_t len, x;
char abc[] = "12345678";
uint8_t *ptr = (uint8_t *)abc;
copy(ptr, ptr + 4, reinterpret_cast<uint32_t*>(&len));
cout << " len: " << len << endl;
}
The output is 49! I would want the output to be 1234. Am I missing something?
Your target is a “container” of length 1 (namely, a single object, len).
You are copying four consecutive byte values into this container, which of course fails; in particular, it causes a buffer overflow, since the target only has space for a single element.
Other errors in your code (not an exhaustive list):
You are confusing character codes and their string representation
You are performing redundant casts
The first point in particular is relevant, since what you actually want to do is parse the number encoded in the first four characters of the string as a decimal number, but what you actually do is copy its character codes.
To parse a number in C++, use std::stringstream or, since C++11, std::stoi.
std::copy doesn't work the way you're expecting. It copies the source element-wise to the destination. So it copies the first uint8_t (the char '1', which has the value 49, i.e. 0x31) into len, and then proceeds to trample on the three uint32_t values following it in memory.
Try this instead to see what's actually happening.
#include <iostream>
using namespace std;
int main()
{
uint32_t len[4];
char abc[] = "12345678";
copy(abc, &abc[4], &len[0]);
cout << " len: " << len[0] << " " <<len[1] << " " << len[2] << " " << len[3] << endl;
}
First of all, std::copy does roughly this:
template <typename InputItr, typename OutputItr>
void copy(InputItr begin, InputItr end, OutputItr obegin)
{
while (begin != end)
*obegin++ = *begin++;
}
Your output iterator is uint32_t*, which actually causes you to overwrite four 32-bit words (a buffer overflow, since len is a single word). You are seeing 49 because the first character copied, '1', has the ASCII value 49.