Formatting output of signed hex digits - C++

I have been using stringstream to convert my data and it has been working great except for one case.
I am subtracting two integer values that can end up being negative or positive. I take that value and send it to my stringstream object using std::hex, and it also gets dumped to std::cout.
My problem is my field for this value can only be 3 digits long and when I get a negative value it pads it with too many leading F's. I can't seem to get any std functions to help (setw, setfill, ...).
Can anyone point me in the right direction?
Example:
Value - Value = -7, so what I want is FF9 but what I get is FFFFFFF9.
My code to send the value to my stringstream object ss
ss << hex << value - LocationCounter;

You are trying to output a value that is 12 bits max in size. There is no 12-bit data type, so the closest you can get is to use a 16-bit signed type with its high 4 bits set to 0. For instance, calculate your desired value into an 8-bit signed type first (which will reduce its effective range to -128 .. 127), then sign-extend it to a 16-bit signed type, zero the high 4 bits, and finally output the result as hex:
signed char diff = (signed char)(value - LocationCounter);
// the setw() and setfill() are used to pad
// values that are 8 bits or fewer in size...
ss << hex << setw(3) << setfill('0') << (((signed short)diff) & 0x0fff);
To read the value back, read the 12-bit hex into a signed short and then truncate its value to a signed char:
signed short tmp;
ss >> hex >> tmp;
signed char diff = (signed char)tmp;
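Putting the write and read snippets together, here is a minimal compilable sketch of the round trip (the names value and LocationCounter come from the question; the example inputs chosen here are arbitrary and give a difference of -7):
#include <iostream>
#include <iomanip>
#include <sstream>

int main() {
    int value = 0x100, LocationCounter = 0x107;   // example inputs, difference is -7

    // Write: truncate to 8 bits, sign-extend to 16 bits, keep only the low 12 bits.
    signed char diff = (signed char)(value - LocationCounter);
    std::stringstream ss;
    ss << std::hex << std::setw(3) << std::setfill('0')
       << (((signed short)diff) & 0x0fff);
    std::cout << ss.str() << std::endl;           // prints ff9

    // Read back: parse the 12-bit hex value, then truncate to a signed char.
    signed short tmp;
    ss >> std::hex >> tmp;
    signed char restored = (signed char)tmp;
    std::cout << (int)restored << std::endl;      // prints -7
    return 0;
}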

Related

How do I convert an std::stringstream to a uintptr_t and maintain the hex value?

Currently, I am attempting to calculate offsets from a pointer address, and the number of offsets to be calculated can change, so the approach must be done dynamically.
I start by looping over the number of offsets I am trying to calculate. Each offset is 4 bytes apart, so I multiply the current iteration by 4, then attempt to convert the resulting value to a hex address and store it back in the uintptr_t as a hex value.
This value, in theory, should be the offset I am looking for.
In reality, that is not the case, the value seems to be getting converted back to an integer and stored in the uintptr_t variable.
Expected Output:
4
8
C
10
14
(etc)
Actual Output:
4
8
12
16
20
(etc)
Code
for (int i = 1; i < totalEntities + 1; i++)
{
    // Define a stringstream to store the hex value.
    std::stringstream ss;
    // Define a value that will be converted to hex.
    uintptr_t valueToHex = i * 4;
    // Convert valueToHex to hex, and store the result in stringstream ss.
    ss << std::hex << valueToHex;
    // Convert stringstream ss back to a uintptr_t stored in valueToHex.
    ss >> valueToHex;
    // Output the result.
    std::cout << valueToHex << std::endl;
}
uintptr_t represents the value of an integer. It does not represent the textual representation of that value. The base of the number is not part of the value: 0xC, 12, and 014 are indistinguishable values regardless of their different representations.
The base is part of the textual representation, and all information besides the value (i.e. all representational details) is lost when an integer is extracted from a character stream.
You can either:
a) Extract a string instead, and insert the extracted string into the output stream. Strings retain most of the textual representation (an exception being that system-specific new-line character sequences are converted to \n), or
b) Use std::hex to insert the integer into the output stream in the representation that you want.
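As a rough sketch of both options (totalEntities is taken from the question's loop and is assumed to be defined; the value used here is only for illustration):
#include <iostream>
#include <sstream>
#include <string>
#include <cstdint>

int main() {
    int totalEntities = 5;   // assumed value, just for the sketch

    for (int i = 1; i < totalEntities + 1; i++) {
        uintptr_t valueToHex = i * 4;

        // Option a) round-trip through the stream as a string,
        // so the hex digits are kept as text.
        std::stringstream ss;
        ss << std::hex << valueToHex;
        std::string asText;
        ss >> asText;
        std::cout << asText << std::endl;                    // 4, 8, c, 10, 14

        // Option b) keep the integer and apply std::hex only on output.
        std::cout << std::hex << valueToHex << std::endl;    // 4, 8, c, 10, 14
    }
    return 0;
}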
Seems like you just want to output an integer as a hex string, i.e.
for (int i = 1; i < numEntities + 1; i++)
{
    std::cout << std::hex << i * 4 << std::endl;
}

How to read ASCII value from a character and convert it into hexadecimal formatted string

I need to read the value of a character as a number and find the corresponding hexadecimal value for it.
#include <iostream>
#include <iomanip>
using namespace std;
int main() {
    char c = 197;
    cout << hex << uppercase << setw(2) << setfill('0') << (short)c << endl;
}
Output:
FFC5
Expected output:
C5
The problem is that when you write char c = 197 you are overflowing the (signed) char type, which produces a negative number (-59). From there it doesn't matter what conversion you make to larger types; it will remain a negative number.
To fully understand why you must know how two's complement works.
Basically, -59 and 197 have the same binary representation, 1100 0101; depending on the data type it is interpreted one way or the other. When you print it using hexadecimal format, the binary representation (the actual value stored in memory) is the one used, producing C5.
When the char is converted into a short/unsigned short, the -59 is converted to its short/unsigned short representation, which is 1111 1111 1100 0101 (FFC5) in both cases.
The correct way to do it is to store the initial value (197) in a variable whose data type is able to represent it (unsigned char, short, unsigned short, ...) from the very beginning.
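For instance, a minimal sketch of that fix, keeping the formatting from the question:
#include <iostream>
#include <iomanip>
using namespace std;

int main() {
    unsigned char c = 197;   // a type that can actually hold 197
    cout << hex << uppercase << setw(2) << setfill('0') << (int)c << endl;   // prints C5
}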

Converting a bitset to signed

I have a std::bitset<32> and I want to isolate the right 16 bits and output those bits as if they were a signed number. I also am going to want to output the entire 32 bit thing as a signed number down the road. However, Bitset does not support a signed int to_string().
For example, given 1010000000100001 1111111111111111:
I want one output to be:
-1608384513 for the whole sequence
-1 for the right 16 bits.
Any slick ways of converting them?
To get a 16-bit number you can use to_ulong(), drop the upper 16 bits, and reinterpret as int16_t.
Similarly, for a signed 32-bit number you can call to_ulong(), and reinterpret as a signed int32_t.
std::bitset<32> b("10100000001000011111111111111111");
int16_t x16 = (int16_t)(b.to_ulong() & 0xFFFF); // low 16 bits, reinterpreted as signed
int32_t x32 = (int32_t)b.to_ulong();            // all 32 bits, reinterpreted as signed
cout << x16 << endl; // -1
cout << x32 << endl; // -1608384513

How to save hex value from string in uint8_t or char without converting the value?

I have a binary file which I am reading and saving as hex values. These hex values will be used to flash a device, but I need the hex values to be stored as either uint8_t or char (preferably uint8_t). Currently the values are stored in a string, and only one byte at a time. What I want to do is to store the value as uint8_t instead of string.
So if the value of std::string result is 0x3A, I want the value of uint8_t flash[1] to be 0x3A, and not 58 or some other value.
How can this be achieved?
Any help is appreciated :)
EDIT:
As I first suspected, and as has been confirmed, the values are the same whichever method I use.
For those who wish to see the code generating the string, here it is:
unsigned char x;
std::ifstream input(file, std::ios::binary);
input >> std::noskipws;
input >> x; // read one byte from the file
std::stringstream stream, str;
stream << std::hex << std::setw(2) << std::setfill('0') << (int)x;
std::string result( stream.str() );
str << result;
int value;
str >> std::hex >> value;
uint8_t data = value;
In order to fully understand what you are saying we really need to see the code that puts data into your std::string result.
However this may be of help:
std::string result;
result += 58; // this is the same as
result += 0x3A; // this
The compiler treats the first literal as a decimal number because it has no prefix, and converts it from decimal to binary before appending it to the string.
The compiler treats the second literal as a hexadecimal number because it begins with 0x, and converts it from hexadecimal to binary before appending it to the string.
Both literals denote the same value, so in both cases the same byte is appended to the string.
Now to put that into your array we can do:
uint8_t flash[2]; // big enough for our 2 chars
flash[0] = result[0]; // copy first value
flash[1] = result[1]; // copy second value
Voila! All done.
0x3A IS 58. They're represented the same in binary and mean exactly the same thing - the only difference is the formatting when output to the user (integers, by default, are output in base 10).
If you're trying to represent 58 as 0x3A in the output stream, just change the way you're outputting it.
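For instance, a minimal sketch of changing only the output formatting (the array name flash is taken from the question):
#include <cstdint>
#include <iostream>
#include <iomanip>

int main() {
    uint8_t flash[1] = { 0x3A };   // 0x3A and 58 are the very same stored byte
    // Print it as two uppercase hex digits instead of decimal.
    std::cout << std::hex << std::uppercase << std::setw(2) << std::setfill('0')
              << (int)flash[0] << std::endl;   // prints 3A
    return 0;
}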
A value is a value: 0x3A == 58. There is no difference; if you put it in a uint8_t it is exactly the same. There is no such thing as storing it as 58 versus storing it as 0x3A. Either way it ends up as 00111010.
You can use the strtol function to convert a hex string to a uint8_t, using 16 as the base.
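A minimal sketch of that approach (the string "3A" here is just for illustration):
#include <cstdlib>
#include <cstdint>
#include <iostream>

int main() {
    const char* hexText = "3A";                                 // one byte as hex text
    uint8_t value = (uint8_t)std::strtol(hexText, nullptr, 16); // parse with base 16
    std::cout << (int)value << std::endl;                       // prints 58
    std::cout << std::hex << (int)value << std::endl;           // prints 3a
    return 0;
}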

Hexadecimal in String to Hexadecimal in Integer

I want to know how to convert something like string x = "1f" to int y = 0x1f. Every topic I found was solved either by simply turning it into its integer value (31) or by turning the string into its hexadecimal equivalent ("Hello" > 48656C6C6F).
std::stringstream Strm;
std::string Stng = "1f";
Strm << Stng;
int Hexa;
Strm >> std::hex >> Hexa;
cout << Hexa;
This is the closest I could get to it (but it turned out it just converts it to an integer).
EDIT: I guess my problem was that I didn't know it must be stored as an integer and can only be shown as hexadecimal if I add std::hex after cout. That was stupid, sorry.
Integers don't carry labels saying 'I'm a decimal integer' or 'I'm a hexadecimal integer'. All integers are the same. So if you have found some code that converts a hexadecimal string to an integer then that is the code you should use.
Once you have your integer you can then choose to print it out in hexadecimal if you want. You do that with std::hex:
int hexa = ...;
cout << hex << hexa; // prints an int in hexadecimal form
One quite fast solution is to use boost::lexical_cast. You can find everything here: http://www.boost.org/doc/libs/1_53_0/doc/html/boost_lexical_cast.html
You have two choices:
std::strtol
std::stoi
The latter will throw an exception if the input string is not a proper hexadecimal number.
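A minimal sketch of both choices, using the string "1f" from the question:
#include <cstdlib>
#include <string>
#include <iostream>

int main() {
    std::string stng = "1f";

    long a = std::strtol(stng.c_str(), nullptr, 16); // returns 0 on bad input, no exception
    int b = std::stoi(stng, nullptr, 16);            // throws std::invalid_argument on bad input

    std::cout << a << " " << b << std::endl;             // prints 31 31
    std::cout << std::hex << a << " " << b << std::endl; // prints 1f 1f
    return 0;
}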
And remember, in the computer all integers are stored in binary; hexadecimal is just a presentation. For example, the ASCII character 'a' is the same as the decimal number 97, the same as the octal number 141, the same as the hexadecimal number 61, and the same as the binary number (which is how it is ultimately stored in memory) 01100001.