Convert a signed int to a hex string with spaces - C++

I was wondering if I could get some help converting an integer to a hex string with a space between each byte, like so:
int val = -2147483648;
char hexval[32];
sprintf(hexval, "%x", val);
Output = 80000000
How could I add spaces between each byte so I would have a string like -> 80 00 00 00?
Is there an easier way than malloc'ing memory and moving a pointer around?
Thanks!

A simple function:
#include <stdio.h>

/**
 * hexstr(char *str, int val);
 *
 * `str` needs to point to a char array with at least 12 elements
 * ("xx xx xx xx" is 11 characters plus the terminating NUL).
 **/
int hexstr(char *str, int val) {
    return snprintf(str, 12, "%02hhx %02hhx %02hhx %02hhx",
                    val >> 24, val >> 16, val >> 8, val);
}
Example:
int main(void) {
    int val = -2147483648;
    char hexval[12];

    hexstr(hexval, val);
    printf("Integer value: %d\n", val);
    printf("Result string: %s\n", hexval);
    return 0;
}
Integer value: -2147483648
Result string: 80 00 00 00

As an alternative, you may consider using std::hex. Example:
#include <iostream>

int main() {
    int n = 255;
    std::cout << std::hex << n << std::endl;
    return 0;
}
UPDATE:
A more flexible implementation that does not rely on printing the content could be:
void gethex(int n, std::ostream &o) {
    o << std::hex << n;
}
then
std::ostringstream ss;   // needs #include <sstream>
gethex(myNumber, ss);
std::cout << "Hex number: " << ss.str() << std::endl;


How to add hex values (no hex output)? [duplicate]

This question was closed as a duplicate of "C++ cout hex values?".
My code:
#include <iostream>
using namespace std;

int main() {
    unsigned int a = 0x0009, b = 0x0002;
    unsigned int c = a + b;
    cout << c;
}
Now
c = 11
I want this:
c = 000B
How can I do that?
When you do this
int main() {
    unsigned int a = 0x0009, b = 0x0002;
    unsigned int c = a + b;
}
then c has the value 11. It also has the value 0x000B, and the value 10 in a representation that uses 11 as its base.
11 and 0x000B (and 10) are different representations of the same value.
When you use std::cout, the number is printed as decimal by default. Which representation you choose to print the value on the screen has no influence whatsoever on the actual value of c.
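A quick check of that claim (my own snippet, not part of the original answer): 11, 0x000B, and octal 013 all compare equal because they are the same int value.
#include <iostream>
int main() {
    std::cout << std::boolalpha << (11 == 0x000B) << ' ' << (11 == 013) << '\n';   // prints: true true
    return 0;
}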
What I understand is that you want to retrieve the result in a specific hexadecimal format, XXXX.
The addition is computed the same way in any base; you only need to display the result in your format.
You can do this, for instance:
#include <iostream>
#include <iomanip>
#include <sstream>
#include <string>

std::string displayInPersonalizedHexa(unsigned int a)
{
    std::stringstream ss;
    ss << std::uppercase << std::setfill('0') << std::setw(4) << std::hex << a;
    std::string x;
    ss >> x;
    //std::cout << x;
    return x;
}
int main() {
    unsigned int a = 0x0009, b = 0x0002;
    unsigned int c = a + b;
    // displays 000B
    std::cout << displayInPersonalizedHexa(c) << std::endl;
    // adds c = c + 1
    c = c + 1;
    // displays 000C
    std::cout << displayInPersonalizedHexa(c) << std::endl;
    // 0xC + 5 = 0x11
    c = c + 5;
    // displays 0011
    std::cout << displayInPersonalizedHexa(c) << std::endl;
}
This will output
000B
000C
0011
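If C++20 is available, std::format gives the same XXXX layout without the stream-manipulator dance. A minimal sketch (assuming the compiler ships <format>):
#include <format>
#include <iostream>

int main() {
    unsigned int c = 0x0009 + 0x0002;
    // "{:04X}": uppercase hex, zero-padded to at least 4 digits.
    std::cout << std::format("{:04X}\n", c);   // prints: 000B
    return 0;
}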

Using std::hex to convert hex to decimal

I'm working with VS 2010 on Windows. I have a function which takes a char pointer. Now, inside the function, I am calling std::hex to convert it to decimal, but for some reason it is not working. It is outputting a large value which makes me think that it is converting the address instead.
void convertHexToDec(char* hex, char* dec)
{
    long long decimal;
    std::stringstream ss;
    ss << hex;
    ss >> std::hex >> decimal;
    sprintf(dec, "%llu", decimal);
}
So, if I pass in a char pointer containing "58", the output decimal value is something like 1D34E78xxxxxxxxx. It looks like it is converting the address of the hex string.
I tried these ways too:
ss << *hex;
ss << (char*)hex[0];
ss << (int *)&hex[0];
None of the above worked.
Any idea how I can make this function work?
The reason for your error is probably the wrong printf specifier: "%llu" is for unsigned long long, while decimal is a (signed) long long, so "%lld" is the one to use. Also, sprintf is not safe: it assumes the destination buffer (dec) is large enough.
A possible solution using your function signature (not recommended, since you do not know the size of the destination):
void convertHexToDec( char* hex, char* dec )
{
    // needs <cstdio> and <cstdlib>
    std::sprintf( dec, "%lld", std::strtoll( hex, 0, 16 ) );
}
A safe solution:
std::string convertHexToDec( const char* h )
{
    // needs <string> and <cstdlib>
    return std::to_string( std::strtoll( h, 0, 16 ) );
}
A safe solution using streams:
std::string convertHexToDec( const char* h )
{
    long long lld;
    std::istringstream( h ) >> std::hex >> lld;
    std::ostringstream os;
    os << lld;
    return os.str();
}
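As yet another variant (not from the original answer), std::stoll reports bad input with an exception instead of silently yielding 0; this sketch takes a std::string instead of a char pointer:
#include <string>

std::string convertHexToDec(const std::string& h)
{
    // std::stoll throws std::invalid_argument or std::out_of_range on bad input.
    return std::to_string(std::stoll(h, nullptr, 16));
}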
Apart from you not using std::string and references, I tried the following code:
#include <iostream>
#include <sstream>
#include <cstdio>

void convertHexToDec(char* hex, char* dec)
{
    long long decimal;
    std::stringstream ss;
    ss << hex;
    ss >> std::hex >> decimal;
    std::cout << "Decimal: " << decimal << "\n";
    sprintf(dec, "%llu", decimal);
}
int main()
{
    char hex[] = "58";
    char dec[4];
    convertHexToDec(hex, dec);
    std::cout << "Output string: " << dec << "\n";
}
Output:
Decimal: 88
Output string: 88
So what's your problem?

Converting unsigned char * to hexstring

The code below takes a hex string (every byte is represented as its corresponding hex value),
converts it to an unsigned char * buffer, and then converts it back to a hex string.
This code is testing the conversion from an unsigned char* buffer to a hex string,
which I need to send over the network to a receiver process.
I chose a hex string because an unsigned char can be in the range 0 to 255 and there is no printable character after 127.
The code below just shows the portion that bugs me. It's in the comment.
#include <iostream>
#include <sstream>
#include <iomanip>
using namespace std;

// converts a hex string to the corresponding integer, i.e. "c0" -> 192
int convertHexStringToInt(const string& hexString)
{
    stringstream geek;
    int x = 0;
    geek << std::hex << hexString;
    geek >> x;
    return x;
}
// converts a complete hex string to an unsigned char * buffer
void convertHexStringToUnsignedCharBuffer(string hexString, unsigned char* hexBuffer)
{
    int i = 0;
    while (hexString.length())
    {
        string hexStringPart = hexString.substr(0, 2);
        hexString = hexString.substr(2);
        int hexStringOneByte = convertHexStringToInt(hexStringPart);
        hexBuffer[i] = static_cast<unsigned char>(hexStringOneByte & 0xFF);
        i++;
    }
}
int main()
{
    // The hex string below is a hex representation of an unsigned char * buffer.
    // It is generated by an encryption algorithm in unsigned char* format;
    // I am converting it to a hex string to make it printable for verification purposes,
    // and I take the hex string as input here to test the conversion logic.
    string inputHexString = "552027e33844dd7b71676b963c0b8e20";
    string outputHexString;
    stringstream geek;
    unsigned char* hexBuffer = new unsigned char[inputHexString.length()/2];
    convertHexStringToUnsignedCharBuffer(inputHexString, hexBuffer);
    for (size_t i = 0; i < inputHexString.length()/2; i++)
    {
        geek << std::hex << std::setw(2) << std::setfill('0') << (0xFF & hexBuffer[i]); // this works
        //geek << std::hex << std::setw(2) << std::setfill('0') << (hexBuffer[i]);      // --> this does not work
        // I am not able to figure out why I need the bit-wise AND with 0xFF ("0xFF & hexBuffer[i]").
        // Without it, the conversion does not work for individual bytes whose values are above 127.
    }
    geek >> outputHexString;
    cout << "input hex string:  " << inputHexString << endl;
    cout << "output hex string: " << outputHexString << endl;
    if (0 == inputHexString.compare(outputHexString))
        cout << "hex encoding successful" << endl;
    else
        cout << "hex encoding failed" << endl;
    if (NULL != hexBuffer)
        delete[] hexBuffer;
    return 0;
}
// output
// Can someone explain? I am sure it's something silly that I am missing.
The C++20 way (this is a snippet, not a full program; it needs <algorithm>, <format>, <iostream>, <iterator>, <span>, and <string>):
unsigned char* data = new unsigned char[]{ "Hello world\n\t\r\0" };
std::size_t data_size = sizeof("Hello world\n\t\r\0") - 1;

auto sp = std::span(data, data_size);
std::transform(sp.begin(), sp.end(),
               std::ostream_iterator<std::string>(std::cout),
               [](unsigned char c) -> std::string {
                   return std::format("{:02X}", int(c));
               });
or, if you want to store the result in a string. Note that std::back_inserter(result) on a std::string appends single chars, so a std::transform whose lambda returns a std::string does not compile; a plain loop is the simplest fix:
std::string result{};
result.reserve(data_size * 2);
for (unsigned char c : sp)
    result += std::format("{:02X}", int(c));
Output:
48656C6C6F20776F726C640A090D00
The output of an unsigned char is like the output of a char, which obviously is not what the OP expects.
I tested the following on coliru:
#include <iomanip>
#include <iostream>

int main()
{
    std::cout << "Output of (unsigned char)0xc0: "
              << std::hex << std::setw(2) << std::setfill('0') << (unsigned char)0xc0 << '\n';
    return 0;
}
and got:
Output of (unsigned char)0xc0: 0�
This is caused by the std::ostream::operator<<() which is chosen out of the available operators. I looked on cppreference at
operator<<(std::basic_ostream) and
std::basic_ostream::operator<<
and found
template< class Traits >
basic_ostream<char,Traits>& operator<<( basic_ostream<char,Traits>& os,
                                        unsigned char ch );
in the former (with a little bit of help from M.M).
The OP suggested a fix: a bit-wise AND with 0xff, which seemed to work. Checking this on coliru.com:
#include <iomanip>
#include <iostream>

int main()
{
    std::cout << "Output of (unsigned char)0xc0: "
              << std::hex << std::setw(2) << std::setfill('0') << (0xff & (unsigned char)0xc0) << '\n';
    return 0;
}
Output:
Output of (unsigned char)0xc0: c0
Really, this seems to work. Why?
0xff is an int constant (strictly speaking, an integer literal) and has type int. Hence, the bit-wise AND promotes (unsigned char)0xc0 to int as well and yields a result of type int, so the std::ostream::operator<< overload for int is applied.
This is an option to solve this. I can provide another one: just convert the unsigned char to unsigned.
While promoting a plain (possibly signed) char to int can introduce sign-bit extension (which would be undesired here), that cannot happen when an unsigned char is converted to unsigned. The output stream operator for unsigned provides the intended output as well:
#include <iomanip>
#include <iostream>

int main()
{
    std::cout << "Output of (unsigned char)0xc0: "
              << std::hex << std::setw(2) << std::setfill('0') << (unsigned)(unsigned char)0xc0 << '\n';
    const unsigned char c = 0xc0;
    std::cout << "Output of unsigned char c = 0xc0: "
              << std::hex << std::setw(2) << std::setfill('0') << (unsigned)c << '\n';
    return 0;
}
Output:
Output of (unsigned char)0xc0: c0
Output of unsigned char c = 0xc0: c0
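Applied to the loop in the question, the cast approach would look roughly like this; a self-contained sketch that uses only the first few bytes of the buffer from the question:
#include <iomanip>
#include <iostream>
#include <sstream>

int main()
{
    unsigned char buf[] = { 0x55, 0x20, 0x27, 0xe3 };   // first 4 bytes of the question's hex string
    std::ostringstream geek;
    for (unsigned char b : buf)
        geek << std::hex << std::setw(2) << std::setfill('0')
             << static_cast<unsigned>(b);               // cast instead of the 0xFF mask
    std::cout << geek.str() << '\n';                    // prints: 552027e3
    return 0;
}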

ifstream >> reading same line over again

I'm trying to read in formatted hex data into unsigned ints using the >> operator. The code I'm using is
#include <iostream>
#include <fstream>
using namespace std;   // needed: the code uses ifstream, hex, and cout unqualified

int main(int argc, char** argv)
{
    ifstream in(argv[1]);
    unsigned int addr;
    unsigned int op;
    unsigned int data;
    do
    {
        in >> hex >> addr >> hex >> op >> hex >> data;
        cout << addr << " " << op << " " << data << '\n';
        if (in.eof()) break;
    } while (1);
    return 0;
}
This works on a 300 line file just fine, but when I try it on a different file, it reads the 5th line repeatedly then seg faults, and I cannot figure out why. The first five lines are
FD2C FF EB
4FE9 FF 32
276E FF 6E
5C09 FF A3
7739 FF 36
The offending line is
7739 FF 36
Any help is appreciated. Thanks!
Edit:
I modified my code so it looks like
#include <iostream>
#include <fstream>
using namespace std;

int main(int argc, char** argv)
{
    ifstream in(argv[1]);
    unsigned int addr;
    unsigned int op;
    unsigned int data;
    while (in >> hex >> addr >> hex >> op >> hex >> data)
    {
        cout << addr << " " << op << " " << data << '\n';
    }
    return 0;
}
This solves the problem of reading the 5th line over and over again but it still segfaults, albeit on another line. I'm going to look further and see if I can pin it down.
FF is the new-page (form feed) symbol, and it is followed by a dollar sign $, which is 36; maybe it thinks that 36 is a pointer to something. Try changing unsigned int to unsigned char.
EDIT: this reads everything in without problems, and the output is in hex values. Good luck.
#include <iostream>
#include <fstream>

int main(int argc, char** argv)
{
    std::ifstream in;
    in.open("tst.tst", std::ios::in);
    unsigned char addr;
    unsigned char op;
    unsigned char data;
    while (in >> std::hex >> addr >> std::hex >> op >> std::hex >> data)
    {
        std::cout << (int)addr << std::hex << " " << (int)op << std::hex << " " << (int)data << std::hex << "\n";
    }
    return 0;
}
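For what it's worth, keeping unsigned int (as in the edit in the question) and checking both argv and the stream state avoids the repeated line as well as the crash. A sketch, not a tested fix for the exact input file:
#include <fstream>
#include <iostream>

int main(int argc, char** argv)
{
    if (argc < 2) {                                  // guard against running without a filename
        std::cerr << "usage: " << argv[0] << " <file>\n";
        return 1;
    }
    std::ifstream in(argv[1]);
    unsigned int addr, op, data;
    while (in >> std::hex >> addr >> op >> data)     // std::hex is sticky, so one use covers all three
        std::cout << addr << " " << op << " " << data << '\n';
    return 0;
}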

Extract integer from a string

I have a string like "y.x-name", where y and x are numbers ranging from 0 to 100. From this string, what would be the best method to extract x into an integer variable in C++?
You could split the string on '.' and convert each part directly to an integer type. The second number pushed in the while loop is the one you want; see the sample code:
#include <iostream>
#include <sstream>
#include <string>
#include <vector>
using std::string;

template<typename T>
T stringToDecimal(const string& s)
{
    T t = T();
    std::stringstream ss(s);
    ss >> t;
    return t;
}

int func()
{
    string s("100.3-name");
    std::vector<int> v;
    std::stringstream ss(s);
    string line;
    while (std::getline(ss, line, '.'))
    {
        v.push_back(stringToDecimal<int>(line));
    }
    std::cout << v.back() << std::endl;
    return v.back();
}
It will output: 3
It seems that this thread has a problem similar to yours; it might help ;)
Simple string parsing with C++
You can achieve it with boost::lexical_cast, which utilizes streams like in billz' answer.
Pseudo code would look like this (indices might be off in this example):
std::string yxString = "56.74-name";
size_t xStart = yxString.find(".") + 1;
size_t xLength = yxString.find("-") - xStart;
int x = boost::lexical_cast<int>( yxString.data() + xStart, xLength );
Parsing errors can be handled via exceptions thrown by lexical_cast.
For more flexible / powerful text matching I suggest boost::regex.
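For illustration only, the same extraction works with the standard <regex> header instead of boost::regex; a sketch assuming the "y.x-name" shape from the question:
#include <iostream>
#include <regex>
#include <string>

int main()
{
    std::string s = "56.74-name";
    std::smatch m;
    // capture the digits between the '.' and the '-'
    if (std::regex_search(s, m, std::regex(R"(\.(\d+)-)")))
        std::cout << std::stoi(m[1].str()) << '\n';   // prints: 74
    return 0;
}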
Use two calls to unsigned long strtoul( const char *str, char **str_end, int base ), e.g.:
#include <cstdlib>
#include <iostream>
using namespace std;

int main(){
    char const * s = "1.99-name";
    char *endp;
    unsigned long l1 = strtoul(s, &endp, 10);
    if (endp == s || *endp != '.') {
        cerr << "Bad parse" << endl;
        return EXIT_FAILURE;
    }
    s = endp + 1;
    unsigned long l2 = strtoul(s, &endp, 10);
    if (endp == s || *endp != '-') {
        cerr << "Bad parse" << endl;
        return EXIT_FAILURE;
    }
    cout << "num 1 = " << l1 << "; num 2 = " << l2 << endl;
    return EXIT_SUCCESS;   // was EXIT_FAILURE, presumably a copy/paste slip
}