Why is an unsigned int right shift always filled with '1'? - C++

#include <iostream>
#include <string>
#include <bitset>

int main()
{
    char c = 128;
    unsigned int shift2 = (unsigned int)c;
    std::string shift2bin = std::bitset<8>(shift2).to_string(); // to binary
    std::cout << " shift2bin: " << shift2bin << std::endl;
    unsigned int shift3 = shift2 >> 1;
    std::string shift3bin = std::bitset<8>(shift3).to_string(); // to binary
    std::cout << " shift3bin: " << shift3bin << std::endl;
}
Output:
shift2bin: 10000000
shift3bin: 11000000
I expect the result to be as follows:
shift2bin: 10000000
shift3bin: 01000000
Question> Why does unsigned int right shift use 1 as the filler?

As seen in this answer, unsigned right shifts always zero-fill. However, try this to print out all the bits in the unsigned int:
std::string shift2bin = std::bitset<sizeof(shift2)*8>(shift2).to_string(); //to binary
std::cout << " shift2bin: " << shift2bin << std::endl;
You will see something like this (as char appears to be signed by default on your platform):
shift2bin: 11111111111111111111111110000000
^^^^^^^^
If you do the same for shift3bin, you will see:
shift3bin: 01111111111111111111111111000000
^^^^^^^^
So, you can see how you appear to get a "1" fill.
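If the zero-filled output the OP expected is the goal, a minimal sketch of the usual fix is to convert through unsigned char first, so only the low 8 bits reach the unsigned int:

#include <bitset>
#include <iostream>
#include <string>

int main()
{
    char c = 128; // implementation-defined value where char is signed
    // Converting through unsigned char keeps only the low 8 bits,
    // so no sign extension reaches the unsigned int.
    unsigned int shift2 = static_cast<unsigned char>(c);
    std::cout << " shift2bin: " << std::bitset<8>(shift2).to_string() << std::endl; // 10000000
    unsigned int shift3 = shift2 >> 1;
    std::cout << " shift3bin: " << std::bitset<8>(shift3).to_string() << std::endl; // 01000000
}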

Related

Printing hex characters

I would like to understand the results of printing out a char and an unsigned char.
#include <cstdint>
#include <cstdio>
#include <iostream>
#include <string>
using namespace std;

int main()
{
    int8_t a = 0xA1;
    uint8_t b = 0xA1;
    printf("0x%x,", a);
    printf("0x%x,", b);
    std::cout << std::hex << a << ",";
    std::cout << std::hex << b << std::endl;
}
Result
0xffffffa1,0xa1,�,�
I don't understand why the signed char turns into an uint or int, and why std::hex fails miserably.
int8_t a = 0xA1; tries to store 161 in a type whose maximum value is 127, so the value is converted (implementation-defined before C++20) to -95. If you turned on the right compiler flags, you'd get something like:
error: overflow in conversion from 'int' to 'int8_t' {aka 'signed char'} changes value from '161' to '-95' [-Werror=overflow]
8 | int8_t a = 0xA1;
| ^~~~
Besides, %x expects an unsigned int. This also causes undefined behavior. You meant to do something like:
#include <cstdint>
#include <cstdio>
#include <iostream>

int main()
{
    int8_t a = 42; // doesn't overflow
    uint8_t b = 42;
    std::printf("%#x,", static_cast<unsigned int>(a));
    std::printf("%#x,", static_cast<unsigned int>(b));
    std::cout << std::hex << static_cast<unsigned int>(a) << ",";
    std::cout << std::hex << static_cast<unsigned int>(b) << std::endl;
}
Output: 0x2a,0x2a,2a,2a
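Note that for a negative int8_t the direct cast sign-extends, so the original 0xffffffa1 comes back. A minimal sketch (my addition, same printf approach) of masking down to one byte by converting through unsigned char first:

#include <cstdint>
#include <cstdio>

int main()
{
    int8_t a = -95; // same bit pattern as 0xA1
    // Direct conversion sign-extends: prints 0xffffffa1.
    std::printf("%#x\n", static_cast<unsigned int>(a));
    // Converting through unsigned char keeps only the low 8 bits: prints 0xa1.
    std::printf("%#x\n", static_cast<unsigned int>(static_cast<unsigned char>(a)));
}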

Converting unsigned char * to hexstring

The code below takes a hex string (every byte is represented as its corresponding hex value),
converts it to an unsigned char* buffer, and then converts it back to a hex string.
This code tests the conversion from an unsigned char* buffer to a hex string,
which I need to send over the network to a receiver process.
I chose a hex string because an unsigned char can be in the range 0 to 255 and there is no printable character after 127.
The code below shows just the portion that bugs me; it's in the comment.
#include <iostream>
#include <sstream>
#include <iomanip>
using namespace std;

// converts a hex string to the corresponding integer, e.g. "c0" -> 192
int convertHexStringToInt(const string& hexString)
{
    stringstream geek;
    int x = 0;
    geek << std::hex << hexString;
    geek >> x;
    return x;
}

// converts a complete hex string to an unsigned char* buffer
void convertHexStringToUnsignedCharBuffer(string hexString, unsigned char* hexBuffer)
{
    int i = 0;
    while (hexString.length())
    {
        string hexStringPart = hexString.substr(0, 2);
        hexString = hexString.substr(2);
        int hexStringOneByte = convertHexStringToInt(hexStringPart);
        hexBuffer[i] = static_cast<unsigned char>(hexStringOneByte & 0xFF);
        i++;
    }
}

int main()
{
    // The hex string below is a hex representation of an unsigned char* buffer.
    // It is generated by an encryption algorithm in unsigned char* format;
    // I convert it to a hex string to make it printable for verification purposes,
    // and take the hex string as input here to test the conversion logic.
    string inputHexString = "552027e33844dd7b71676b963c0b8e20";
    string outputHexString;
    stringstream geek;
    unsigned char* hexBuffer = new unsigned char[inputHexString.length() / 2];
    convertHexStringToUnsignedCharBuffer(inputHexString, hexBuffer);
    for (size_t i = 0; i < inputHexString.length() / 2; i++)
    {
        geek << std::hex << std::setw(2) << std::setfill('0') << (0xFF & hexBuffer[i]); // this works
        //geek << std::hex << std::setw(2) << std::setfill('0') << hexBuffer[i]; // --> this does not work
        // I am not able to figure out why I need the bitwise AND with "0xFF & hexBuffer[i]".
        // Without it the conversion does not work for individual bytes with values above 127.
    }
    geek >> outputHexString;
    cout << "input hex string:  " << inputHexString << endl;
    cout << "output hex string: " << outputHexString << endl;
    if (0 == inputHexString.compare(outputHexString))
        cout << "hex encoding successful" << endl;
    else
        cout << "hex encoding failed" << endl;
    delete[] hexBuffer; // delete[] on a null pointer is a no-op, so no check is needed
    return 0;
}
// output
// Can someone explain? I am sure it's something silly that I am missing.
The C++20 way (this needs <algorithm>, <cstddef>, <format>, <iostream>, <iterator>, <span>, and <string>):

unsigned char* data = new unsigned char[]{ "Hello world\n\t\r\0" };
std::size_t data_size = sizeof("Hello world\n\t\r\0") - 1;
auto sp = std::span(data, data_size);
std::transform(sp.begin(), sp.end(),
               std::ostream_iterator<std::string>(std::cout),
               [](unsigned char c) -> std::string {
                   return std::format("{:02X}", int(c));
               });
or, if you want to store the result into a string (std::transform with a back_inserter would try to append each whole std::string as a single char and fail to compile, so format directly into the string instead):

std::string result{};
result.reserve(sp.size() * 2);
for (unsigned char c : sp)
    std::format_to(std::back_inserter(result), "{:02X}", c);
Output:
48656C6C6F20776F726C640A090D00
The output of an unsigned char is like the output of a char, which is obviously not what the OP expects.
I tested the following on coliru:
#include <iomanip>
#include <iostream>

int main()
{
    std::cout << "Output of (unsigned char)0xc0: "
              << std::hex << std::setw(2) << std::setfill('0') << (unsigned char)0xc0 << '\n';
    return 0;
}
and got:
Output of (unsigned char)0xc0: 0�
This is caused by which std::ostream::operator<<() overload is chosen from the available operators. I looked on cppreference at
operator<<(std::basic_ostream) and
std::basic_ostream::operator<<
and found
template< class Traits >
basic_ostream<char,Traits>& operator<<( basic_ostream<char,Traits>& os,
                                        unsigned char ch );
in the former (with a little bit help from M.M).
The OP suggested a fix: a bitwise AND with 0xff, which seemed to work. Checking this on coliru.com:
#include <iomanip>
#include <iostream>

int main()
{
    std::cout << "Output of (unsigned char)0xc0: "
              << std::hex << std::setw(2) << std::setfill('0') << (0xff & (unsigned char)0xc0) << '\n';
    return 0;
}
Output:
Output of (unsigned char)0xc0: c0
Really, this seems to work. Why?
0xff is an int constant (strictly speaking, an integer literal) and has type int. Hence, the bitwise AND promotes (unsigned char)0xc0 to int as well and yields a result of type int, and thus std::ostream's operator<< for int is applied.
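The promotion can even be checked at compile time; a tiny sketch (assuming C++17 for std::is_same_v):

#include <type_traits>

// The operands of & undergo the usual arithmetic conversions:
// unsigned char promotes to int, so the whole expression has type int,
// and the int overload of operator<< is selected.
static_assert(std::is_same_v<decltype(0xff & (unsigned char)0xc0), int>);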
This is one option to solve it. I can provide another: just convert the unsigned char to unsigned.
While promoting a plain char to int can sign-extend negative values (which would be undesired here), an unsigned char is never negative, so converting it to unsigned preserves the value exactly. The output stream operator for unsigned provides the intended output as well:
#include <iomanip>
#include <iostream>

int main()
{
    std::cout << "Output of (unsigned char)0xc0: "
              << std::hex << std::setw(2) << std::setfill('0') << (unsigned)(unsigned char)0xc0 << '\n';
    const unsigned char c = 0xc0;
    std::cout << "Output of unsigned char c = 0xc0: "
              << std::hex << std::setw(2) << std::setfill('0') << (unsigned)c << '\n';
    return 0;
}
Output:
Output of (unsigned char)0xc0: c0
Output of unsigned char c = 0xc0: c0
Live Demo on coliru

c++ reading argv into unsigned char fixed size: Segmentation fault

I am trying to read a command line argument into a fixed-size unsigned char array, and I get a segmentation fault.
My code:
#include <stdio.h>
#include <iostream>
#include <stdlib.h>
#include <memory.h>

unsigned char key[16] = {};

int main(int argc, char** argv){
    std::cout << "Hello!" << std::endl;
    long a = atol(argv[1]);
    std::cout << a << std::endl;
    memcpy(key, (unsigned char*)a, sizeof key);
    // std::cout << sizeof key << std::endl;
    // for (int i = 0; i < 16; i++)
    //     std::cout << (int)(key[i]) << std::endl;
    return 0;
}
What am I doing wrong?
To call the program:
compile: g++ main.cpp
Execute: ./a.out 128
You get the SEGV because your address is wrong: you convert a value to an address. Also, the size is that of the destination; it should be the size of the source.
The compiler issues a warning, and that's never good. You should take it into account, because it points at exactly your error:
xxx.c:12:38: warning: cast to pointer from integer of different size [-Wint-to-pointer-cast]
memcpy(key, (unsigned char*) a, sizeof key);
^
Fix it like this:
memcpy(key, &a, sizeof(a));
BTW, you don't have to declare key with 16 bytes. It would be safer to size it like this:
unsigned char key[sizeof(long)];
and when you print the bytes, iterate until sizeof(long) too, or you'll just print trash bytes at the end.
Here's a fix proposal using uint64_t (an unsigned 64-bit integer from stdint.h, which gives exact control over the size), zero initialization for your key, and parsing with strtoull (the unsigned counterpart of strtoll):
#include <stdio.h>
#include <iostream>
#include <stdlib.h>
#include <memory.h>
#include <stdint.h>

unsigned char key[sizeof(uint64_t)] = {0};

int main(int argc, char** argv){
    std::cout << "Hello!" << std::endl;
    uint64_t a = strtoull(argv[1], NULL, 10);
    memcpy(key, &a, sizeof a);
    for (size_t i = 0; i < sizeof(key); i++)
        std::cout << (int)(key[i]) << std::endl;
    return 0;
}
(if you want to handle signed values, just change to int64_t and strtoll)
Test on a little endian architecture:
% a 10000000000000
Hello!
0
160
114
78
24
9
0
0
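As a sanity check: 10000000000000 is 0x0000'0918'4E72'A000, and reading those eight bytes in little-endian order gives 00 A0 72 4E 18 09 00 00, i.e. 0, 160, 114, 78, 24, 9, 0, 0, exactly the output above.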
Looks like you are copying too much data.
I also added the & before a for the memcpy.
#include <stdio.h>
#include <iostream>
#include <stdlib.h>
#include <memory.h>

unsigned char key[16] = {};

int main(int argc, char** argv)
{
    memset(key, 0x0, sizeof(key));
    std::cout << "Hello!" << std::endl;
    long a = atol(argv[1]);
    std::cout << a << std::endl;
    // the size parameter needs to be the size of a,
    // or the lesser of the sizes of key and a
    memcpy(key, (void*)&a, sizeof(a));
    std::cout << "size of key " << sizeof(key) << "\n";
    std::cout << "key " << key << "\n"; // note: streams key's raw bytes as a C string
    for (int i = 0; i < 16; i++)
        std::cout << " " << i << " '" << ((int)key[i]) << "'\n";
    return 0;
}
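One further note: the byte order that memcpy produces depends on the host CPU. If the key must look the same on every machine (e.g. when sent over a network), a sketch of an endianness-independent encoding using shifts (store_be64 is a hypothetical helper name, not from the answers above):

#include <cstdint>
#include <iostream>

// Hypothetical helper: writes 'a' into 'out' in big-endian order.
// Shifting operates on the value, not the representation, so the
// result is identical on little- and big-endian hosts.
void store_be64(uint64_t a, unsigned char out[8])
{
    for (int i = 0; i < 8; ++i)
        out[i] = static_cast<unsigned char>(a >> (56 - 8 * i));
}

int main()
{
    unsigned char key[8];
    store_be64(10000000000000ULL, key);
    for (unsigned char b : key)
        std::cout << static_cast<int>(b) << '\n'; // 0 0 9 24 78 114 160 0
}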

C++ convert Int value to 4 bytes [duplicate]

#include <iostream>
using namespace std;

int main()
{
    char c1 = 0xab;
    signed char c2 = 0xcd;
    unsigned char c3 = 0xef;
    cout << hex;
    cout << c1 << endl;
    cout << c2 << endl;
    cout << c3 << endl;
}
I expected the output to be as follows:
ab
cd
ef
Yet, I got nothing.
I guess this is because cout always treats char, signed char, and unsigned char as characters rather than as 8-bit integers, even though char, signed char, and unsigned char are all integral types.
So my question is: how do I output a character as an integer through cout?
PS: static_cast<int>(...) is ugly and needs more work to trim extra bits.
char a = 0xab;
cout << +a; // promotes a to a type printable as a number, regardless of type.
This works as long as the type provides a unary + operator with ordinary semantics. If you are defining a class that represents a number, to provide a unary + operator with canonical semantics, create an operator+() that simply returns *this either by value or by reference-to-const.
source: Parashift.com - How can I print a char as a number? How can I print a char* so the output shows the pointer's numeric value?
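One caveat worth noting (my addition, not from the FAQ above): where char is signed, + promotes a negative char with sign extension, so a mask may still be needed for hex output. A quick sketch:

#include <iostream>

int main()
{
    char a = 0xab;                                 // typically -85 where char is signed
    std::cout << std::hex << +a << '\n';           // ffffffab: promotion sign-extends
    std::cout << std::hex << (+a & 0xff) << '\n';  // ab: mask keeps only the low byte
}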
Cast them to an integer type (and bitmask appropriately!), i.e.:

#include <iostream>
using namespace std;

int main()
{
    char c1 = 0xab;
    signed char c2 = 0xcd;
    unsigned char c3 = 0xef;
    cout << hex;
    cout << (static_cast<int>(c1) & 0xFF) << endl;
    cout << (static_cast<int>(c2) & 0xFF) << endl;
    cout << (static_cast<unsigned int>(c3) & 0xFF) << endl;
}
Maybe this:
char c = 0xab;
std::cout << (int)c;
Hope it helps.
Another way is to overload the << operator:
#include <iostream>
using namespace std;

typedef basic_ostream<char, char_traits<char>> basicOstream;

/*inline*/ basicOstream& operator<<(basicOstream& stream, char c) {
    return stream.operator<<(+c);
}
/*inline*/ basicOstream& operator<<(basicOstream& stream, signed char c) {
    return stream.operator<<(+c);
}
/*inline*/ basicOstream& operator<<(basicOstream& stream, unsigned char c) {
    return stream.operator<<(+c);
}

int main() {
    char var1 = 10;
    signed char var2 = 11;
    unsigned char var3 = 12;
    cout << var1 << endl;
    cout << var2 << endl;
    cout << var3 << endl;
    return 0;
}
which prints the following output:
10
11
12
I think it's very neat and useful. Hope it helps!
Also, if you want it to print a hex value, you can do it like this:
basicOstream& operator<<(basicOstream& stream, char c) {
    return stream.operator<<(hex).operator<<(+c);
} // and so on...
What about:
char c1 = 0xab;
std::cout << int{ c1 } << std::endl;
It's concise and safe, and produces the same machine code as other methods.
Another option is to combine the (int) cast with std::hex for hexadecimal output:
std::cout << std::hex << (int)myVar << std::endl;
I hope it helps.

Print hexcode in c++, but just "g" is printed

I have a question regarding the following code. When I run it, it always prints just "g" instead of a hex code. Why? How can I output the hex code? Fiddle: http://ideone.com/FjYr2M
#include <cstdio>
#include <iostream>
using namespace std;

void prepareAndSend() {
    char Command[50];
    sprintf(Command, "%04XT1000A", "076");
    unsigned char checksum = 0x02;
    char* p = Command;
    while (*p) {
        checksum ^= *p++;
    }
    checksum ^= 0x03;
    std::cout << std::hex << checksum << std::endl;
}

int main() {
    prepareAndSend();
    return 0;
}
sprintf(Command,"%04XT1000A", "076");
Undefined behavior, turn your compiler warnings on.
sprintf(Command,"%04XT1000A", 0x76);
You also need to cast checksum to avoid using the unsigned char version of operator<<
std::cout << std::hex << static_cast<int>(checksum) << std::endl;
Cast checksum to int
std::cout << std::hex << static_cast<int>(checksum) << std::endl;
Since checksum is an unsigned char, operator<< tries to print it as a character.
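Putting both fixes together (the int argument for %04X and the cast for printing), a corrected version of the OP's function could look like this sketch, using snprintf rather than sprintf for safety:

#include <cstdio>
#include <iostream>

void prepareAndSend() {
    char Command[50];
    std::snprintf(Command, sizeof Command, "%04XT1000A", 0x76); // int argument, not a string
    unsigned char checksum = 0x02;
    for (char* p = Command; *p; ++p)
        checksum ^= *p;
    checksum ^= 0x03;
    // Cast so the int overload of operator<< prints digits instead of a character.
    std::cout << std::hex << static_cast<int>(checksum) << std::endl;
}

int main() {
    prepareAndSend();
    return 0;
}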