Storing hex in QByteArray, extracting it and converting it to decimal - C++

int dd = 0xA5;
QByteArray p;
p.push_back(0xA5);
qDebug() << "SOP: " << (int)p[0];
This prints -91, whereas 0xA5 is 165 in decimal.
How do I store hex in a QByteArray, extract it, and convert it to decimal?

-91 is just a representation of a char value.
char has a range of -128..127. You are storing the value 165, which is larger than 127.
However, unsigned char has a range of 0..255. So in this case you may read your value as an unsigned char:
qDebug() << "SOP: " << (unsigned char)p[0];
In addition, you may use QString to display the corresponding hex value:
QString hex = QString("%1").arg((unsigned char)p[0], 0, 16);
qDebug() << "Hex: " << hex;

Related

Is there an alternative to char for storing one-byte numeric values?

A char stores a one-byte numeric value (0 to 255 unsigned, or -128 to 127 signed). But the type also carries an implication that it should be printed as a letter rather than a number by default.
This code produces 34 (0x22):
int Bits = 0xE250;
signed int Test = ((Bits & 0x3F00) >> 8);
std::cout << "Test: " << Test << std::endl; // 34
However, I don't need Test to be 4 bytes long; one byte is enough. But if I do this:
int Bits = 0xE250;
signed char Test = ((Bits & 0x3F00) >> 8);
std::cout << "Test: " << Test <<std::endl; // "
I get " (a double quote symbol). Because char doesn't just make it an 8 bit variable, it also says, "this number represents a character".
Is there some way to specify a variable that is 8 bits long, like char, but also says, "this is meant as a number"?
I know I can cast or convert char, but I'd like to just use a number type to begin with. It there a better choice? Is it better to use short int even though it's twice the size needed?
Cast your character variable to int before printing:
signed char Test = ((Bits & 0x3F00) >> 8);
std::cout << "Test: " <<(int) Test <<std::endl;

How to avoid 0xFF prefix when converting char to short?

When I do:
cout << std::hex << (short)('\x3A') << std::endl;
cout << std::hex << (short)('\x8C') << std::endl;
I expect the following output:
3a
8c
but instead, I have:
3a
ff8c
I suppose that this is due to the way char—and more precisely a signed char—is stored in memory: everything below 0x80 would not be prefixed; the value 0x80 and above, on the other hand, would be prefixed with 0xFF.
When given a signed char, how do I get a hexadecimal representation of the actual character inside it? In other words, how do I get 0x3A for \x3A, and 0x8C for \x8C?
I don't think conditional logic is well suited here. While I could subtract 0xFF00 from the resulting short when needed, that doesn't seem very clear.
Your output might make more sense if you looked at it in decimal instead of hexadecimal:
std::cout << std::dec << (short)('\x3A') << std::endl;
std::cout << std::dec << (short)('\x8C') << std::endl;
output:
58
-116
The values were cast to short, so we are (most commonly) dealing with 16-bit values. The 16-bit binary representation of -116 is 1111 1111 1000 1100, which becomes FF8C in hexadecimal. So the output is correct given what you requested (on systems where char is a signed type). It is not so much the way the char is stored in memory as the way the bits are interpreted: as a signed value, the 8-bit pattern 1000 1100 represents -116, and the conversion to short is supposed to preserve this value, rather than preserving the bits.
Your desired output of a hexadecimal 8C corresponds (for a short) to the decimal value 140. To get this value out of 8 bits, the value has to be interpreted as an unsigned 8-bit value (since the largest signed 8-bit value is 127). So the data needs to be interpreted as an unsigned char before it gets expanded to some flavor of short. For a character literal like in the example code, this would look like the following.
std::cout << std::hex << (unsigned short)(unsigned char)('\x3A') << std::endl;
std::cout << std::hex << (unsigned short)(unsigned char)('\x8C') << std::endl;
Most likely, the real code would have variables instead of character literals. If that is the case, then rather than casting to an unsigned char, it might be more convenient to declare the variable to be of unsigned char type. Which is possibly the type you should be using anyway, based on the fact that you want to see its hexadecimal value. Not definitively, but this does suggest that the value is seen simply as a byte of data rather than as a number, and that suggests that an unsigned type is appropriate. Have you looked at std::byte?
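Not from the original answer, but for reference, a minimal C++17 sketch of the std::byte idea mentioned above:
#include <cstddef>
#include <iostream>

int main()
{
    std::byte b{0x8C};                   // a raw byte with no arithmetic meaning
    std::cout << std::hex
              << std::to_integer<int>(b) // prints 8c
              << std::endl;
}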
One other nifty thought to throw out: the following also gives the desired output as a reasonable facsimile of using an unsigned char variable.
#include <iostream>
unsigned char operator""_u(char c) { return c; } // suffix for unsigned char literals
int main()
{
    std::cout << std::hex << (unsigned short)('\x3A'_u) << std::endl;
    std::cout << std::hex << (unsigned short)('\x8C'_u) << std::endl;
}
A more straightforward approach is to cast a signed char to an unsigned char. In other words, this:
cout << std::hex << (short)(unsigned char)('\x3A') << std::endl;
cout << std::hex << (short)(unsigned char)('\x8C') << std::endl;
produces the expected result:
3a
8c
Not sure this is particularly clear, though.

Printing out the hex value of an unsigned char array in C++

I want to print out the hex value of an unsigned char array using the cout function.
The most obvious approach would be something like the following.
unsigned char str[] = "foo bar baz\n";
for (unsigned short int i = 0; i < sizeof(str); i++) {
    std::cout << std::hex << str[i] << std::dec << ' ';
}
std::cout << std::endl;
Surprisingly, this outputs the following string:
foo bar baz
For some reason this does not print out the proper hexadecimal value of str.
How can I cout the proper hex value of str?
To cout the proper hex value of an unsigned char, it needs to be converted to an integer first:
unsigned char str[] = "foo bar baz\n";
for (unsigned short int i = 0; i < sizeof(str); i++) {
    std::cout << std::hex << (int) str[i] << std::dec << ' ';
}
std::cout << std::endl;
This gives the following output:
66 6f 6f 20 62 61 72 20 62 61 7a a 0
which corresponds to the hex value of each unsigned char in str, including the trailing newline (a) and the null terminator (0).
An explanation of this behavior can be found in the std::hex documentation:
std::hex
Sets the basefield format flag for the str stream to hex.
When basefield is set to hex, integer values inserted into the stream are expressed in hexadecimal base (i.e., radix 16). For input streams, extracted values are also expected to be expressed in hexadecimal base when this flag is set.
http://www.cplusplus.com/reference/ios/hex/
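As an aside beyond the original answer: if fixed two-digit output is wanted (e.g. 0a rather than a for the newline byte), <iomanip> padding can be added:
#include <cstddef>
#include <iomanip>
#include <iostream>

int main()
{
    unsigned char str[] = "foo bar baz\n";
    for (std::size_t i = 0; i < sizeof(str); i++) {
        std::cout << std::hex << std::setw(2) << std::setfill('0')
                  << (int) str[i] << ' ';
    }
    std::cout << std::dec << std::endl; // 66 6f 6f 20 62 61 72 20 62 61 7a 0a 00
}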

Initializing an unsigned char array with hex values in C++

I would like to initialize an unsigned char array with 16 hex values. However, I don't seem to know how to properly initialize/access those values. When I try to access them as I might want to intuitively, I'm getting no value at all.
This is my output
The program was run with the following command: 4
Please be a value! -----> p
Here's some plaintext
when run with the code below -
#include <cstdio>
#include <iostream>
#include <string>

int main(int argc, char** argv)
{
    int n;
    if (argc > 1) {
        n = std::stoi(argv[1]);
    } else {
        std::cerr << "Not enough arguments\n";
        return 1;
    }
    char buff[100];
    sprintf(buff, "The program was run with the following command: %d", n);
    std::cout << buff << std::endl;
    unsigned char plaintext[16] =
        {0x0f, 0xb0, 0xc0, 0x0f,
         0xa0, 0xa0, 0xa0, 0xa0,
         0x00, 0x00, 0xa0, 0xa0,
         0x00, 0x00, 0x00, 0x00};
    unsigned char test = plaintext[1] ^ plaintext[2];
    std::cout << "Please be a value! -----> " << test << std::endl;
    std::cout << "Here's some plaintext " << plaintext[3] << std::endl;
    return 0;
}
By way of context, this is part of a group project for school. We are ultimately trying to implement the Serpent cipher, but keep on getting tripped up by unsigned char arrays. Our project specification says that we must have two functions that take what would be Byte arrays in Java. I assume the closest relative in C++ is an unsigned char[]. Otherwise I would use vector. Elsewhere in the code I've implemented a setKey function which takes an unsigned char array, packs its values into 4 long long ints (the key needs to be 256 bits) and performs various bit-shifting and xor operations on those ints to generate the keys necessary for the cryptographic algorithm. Hope that's enough background on what I'm looking to do. I'm guessing I'm just overlooking some basic C++ functionality here. Thanks for any and all help!
A char is an 8-bit value capable of storing -128 <= n <= +127. It is frequently used to store character representations in different encodings; commonly, in Western, Roman-alphabet installations, char holds ASCII- or UTF-8-encoded values. 'Encoded' means the symbols/letters in the character set have been assigned numeric values. Think of the periodic table as an encoding of elements, so that 'H' (Hydrogen) is encoded as 1 and Germanium as 32. In the ASCII (and UTF-8) tables, position 32 represents the character we call "space".
When you use operator << on a char value, the default behavior is to assume you are passing it a character encoding, e.g. an ASCII character code. If you do
char c = 'z';
char d = 122;
char e = 0x7A;
char f = '\x7a';
std::cout << c << d << e << f << "\n";
All four assignments are equivalent. 'z' is a shortcut/syntactic-sugar for char(122), 0x7A is hex for 122, and '\x7a' is an escape that forms the ascii character with a value of 0x7a or 122 - i.e. z.
Where many new programmers go wrong is that they do this:
char n = 8;
std::cout << n << std::endl;
This does not print "8"; it prints the ASCII character at position 8 in the ASCII table (backspace, a non-printing control character).
Think for a moment:
char n = 8; // stores the value 8
char n = 'a'; // what does this store?
char n = '8'; // why is this different than the first line?
Let's rewind a moment: when you store 120 in a variable, it can represent the ASCII character 'x', but ultimately what is being stored is just the numeric value 120, plain and simple.
Specifically: when you pass 122 to a function that will ultimately use it to look up a font entry from a character set using the Latin-1, ISO-8859-1, UTF-8, or similar encodings, then 122 means 'z'.
At the end of the day, char is just one of the standard integer value types. It can store values -128 <= n <= +127, and it can trivially be promoted to a short, int, long, or long long.
While it is generally used to denote characters, it also frequently gets used as a way of saying "I'm only storing very small values" (such as integer percentages).
int incoming = 5000;
int outgoing = 4000;
char percent = char(outgoing * 100 / incoming);
If you want to print the numeric value, you simply need to promote it to a different value type:
std::cout << (unsigned int)test << "\n";
std::cout << unsigned(test) << "\n"; // functional-style cast; note "unsigned int(test)" would not compile
or the preferred C++ way
std::cout << static_cast<unsigned int>(test) << "\n";
I think (it's not completely clear what you are asking) that the answer is as simple as this
std::cout << "Please be a value! -----> " << static_cast<unsigned>(test) << std::endl;
If you want to output the numeric value of a char or unsigned char, you have to cast it to an int or unsigned first.
Not surprisingly, by default, chars are output as characters not integers.
BTW this funky code
char buff[100];
sprintf(buff,"The program was run with the following command: %d",n);
std::cout << buff << std::endl;
is more simply written as
std::cout << "The program was run with the following command: " << n << std::endl;
std::cout and std::cin always treat a char variable as a character.
If you want to read or write it as an int, you must do the conversion manually, as below:
int int_var; char c;
std::cin >> int_var; c = int_var; // read numerically, then store in the char
std::cout << (int)c;              // promote to int to print numerically
If you use scanf or printf instead, there is no such problem, because the format specifier ("%d", "%c", "%s") tells it how to convert the input or output (integer, char, string).
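A self-contained sketch of that round trip (not from the original answer; names follow the snippet above):
#include <iostream>

int main()
{
    int int_var;
    std::cin >> int_var;              // e.g. enter 65
    char c = static_cast<char>(int_var);
    std::cout << (int)c << std::endl; // prints 65, not 'A'
}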

Signed Hexadecimal to decimal in C++

By using std::hex and std::dec, it is possible to parse hexadecimal from a string and convert it to a decimal number in C++. But what if the hexadecimal number is signed?
The following code, for example, results in 241, which is correct if the input "F1" is unsigned hex, but the result should be -15 if the input is signed hex. Is there a C++ function that can process signed hex values?
int n;
std::stringstream("F1") >> std::hex >> n;
std::cout << std::dec << "Parsing \"F1\" as hex gives " << n << '\n';
When you say "signed hex", you mean that F1 is the bitwise representation of a signed char, in which case it stands for -15. However, -15 written directly in signed hex is simply -F.
If you want to get -15 from this bitwise representation, you'll have to do something like the following:
#include <iostream>
#include <string>
int main()
{
    std::string szTest = "F1";
    unsigned char chTest = std::stoi(szTest, nullptr, 16); // 241
    char chTest2 = *reinterpret_cast<char*>(&chTest);      // reinterpret the byte as signed
    std::cout << szTest << ": " << static_cast<int>(chTest2) << std::endl; // F1: -15
    return 0;
}
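As an aside not in the original answer: the pointer reinterpret_cast can be replaced by a plain static_cast to signed char, which gives the same two's-complement wraparound on mainstream platforms (and is fully defined since C++20):
#include <iostream>
#include <string>

int main()
{
    std::string szTest = "F1";
    int raw = std::stoi(szTest, nullptr, 16);             // 241
    signed char chSigned = static_cast<signed char>(raw); // wraps to -15
    std::cout << szTest << ": " << static_cast<int>(chSigned) << std::endl;
}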