I'm not used to C++, so bear with me...
Two bytes are read from a device into a buffer.
The buffer is then to be printed.
The code below is supposed to return the string "0x204D"
However, it returns "0x M" which in hex is 30 78 20 4d
So the bytes are streamed as ASCII characters instead of being rendered as hex digits.
void vito_unit::decodeAsRaw(unsigned char *buffer, int bufferLen)
{
    std::stringstream *decodedClearText;
    decodedClearText = new std::stringstream;
    *decodedClearText << "0x" << std::hex;
    for (int i = 0; i < bufferLen; i++) {
        *decodedClearText << buffer[i];
    }
    setValue(decodedClearText->str());
}
How should it be done?
This has nothing to do with std::hex.
When you stream a [signed/unsigned] char, its ASCII representation is used, because that is usually what is expected of chars.
You can stream a number instead by converting it to int. Then the feature that renders numbers in hexadecimal notation (i.e. std::hex) will be triggered.
You should also fix that memory leak and unnecessary dynamic allocation:
void vito_unit::decodeAsRaw(unsigned char const* const buffer, int const bufferLen)
{
    std::stringstream decodedClearText;
    decodedClearText << "0x" << std::hex;
    for (int i = 0; i < bufferLen; i++) {
        decodedClearText << +buffer[i];
    }
    setValue(decodedClearText.str());
}
The unary "+" performs an integral promotion to int.
buffer[i] is of type unsigned char and is thus printed as a character instead of its hexadecimal representation. You can cast the value to an unsigned int to avoid that.
void vito_unit::decodeAsRaw(unsigned char *buffer, int bufferLen)
{
    std::stringstream *decodedClearText;
    decodedClearText = new std::stringstream;
    *decodedClearText << "0x" << std::hex;
    for (int i = 0; i < bufferLen; i++) {
        *decodedClearText << (unsigned int) buffer[i];
    }
    setValue(decodedClearText->str());
}
The hint from Bo Persson was what I needed.
for (int i = 0; i < bufferLen; i++) {
    *decodedClearText << (int)buffer[i];
}
did the trick.
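One caveat with the plain cast (this note and the padded variant below are my addition, not part of the thread): a byte such as 0x05 prints as "5", so multi-byte output loses its byte boundaries. std::setw and std::setfill pad each byte to two digits:

// needs #include <iomanip> and <sstream>
void vito_unit::decodeAsRaw(unsigned char *buffer, int bufferLen)
{
    std::stringstream decodedClearText;
    decodedClearText << "0x" << std::hex;
    for (int i = 0; i < bufferLen; i++) {
        // setw applies only to the next field, so set it inside the loop
        decodedClearText << std::setw(2) << std::setfill('0') << (int)buffer[i];
    }
    setValue(decodedClearText.str());
}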
I'm working with VS 2010 on Windows. I have a function which takes a char pointer. Now, inside the function, I am calling std::hex to convert it to decimal, but for some reason it is not working. It is outputting a large value which makes me think that it is converting the address instead.
void convertHexToDec(char* hex, char* dec)
{
    long long decimal;
    std::stringstream ss;
    ss << hex;
    ss >> std::hex >> decimal;
    sprintf(dec, "%llu", decimal);
}
So, if I pass in a char pointer containing "58", the output decimal value is something like 1D34E78xxxxxxxxx. It looks like it is converting the address of hex instead of its contents.
I tried these ways too:
ss << *hex;
ss << (char*)hex[0];
ss << (int *)&hex[0];
None of the above worked.
Any idea how I can make this function work?
The reason for your error is, probably, the wrong printf specifier. Also, sprintf is not safe: it assumes the destination buffer (dec) is large enough.
A possible solution using your function signature - not recommended since you do not know the size of the destination:
void convertHexToDec( char* hex, char* dec )
{
    std::sprintf( dec, "%lld", std::strtoll( hex, 0, 16 ) );
}
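If the interface can be changed so that the caller passes the destination size, std::snprintf bounds the write; the extra size parameter below is an assumption on my part, not part of the original signature:

#include <cstdio>
#include <cstdlib>

void convertHexToDec( const char* hex, char* dec, std::size_t decSize )
{
    // strtoll parses the hex text; snprintf never writes more than decSize bytes
    std::snprintf( dec, decSize, "%lld", std::strtoll( hex, nullptr, 16 ) );
}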
A safe solution:
std::string convertHexToDec( const char* h )
{
    return std::to_string( std::strtoll( h, 0, 16 ) );
}
A safe solution using streams:
std::string convertHexToDec( const char* h )
{
    long long lld;
    std::istringstream( h ) >> std::hex >> lld;
    std::ostringstream os;
    os << lld;
    return os.str();
}
Apart from you not using std::string and references, I tried the following code:
#include <iostream>
#include <sstream>
#include <cstdio>

void convertHexToDec(char* hex, char* dec)
{
    long long decimal;
    std::stringstream ss;
    ss << hex;
    ss >> std::hex >> decimal;
    std::cout << "Decimal: " << decimal << "\n";
    sprintf(dec, "%llu", decimal);
}

int main()
{
    char hex[] = "58";
    char dec[4];
    convertHexToDec(hex, dec);
    std::cout << "Output string: " << dec << "\n";
}
Output:
Decimal: 88
Output string: 88
live example
So what's your problem?
The code below takes a hex string (every byte is represented as its corresponding hex value),
converts it to an unsigned char* buffer, and then converts it back to a hex string.
This code is testing the conversion from unsigned char* buffer to hex string,
which I need to send over the network to a receiver process.
I chose a hex string because an unsigned char can be in the range 0 to 255 and there is no printable character after 127.
The code below just shows the portion that bugs me. It's in the comment.
#include <iostream>
#include <sstream>
#include <iomanip>
using namespace std;

// converts a hex string to the corresponding integer, i.e. "c0" -> 192
int convertHexStringToInt(const string & hexString)
{
    stringstream geek;
    int x = 0;
    geek << std::hex << hexString;
    geek >> x;
    return x;
}

// converts a complete hex string to an unsigned char * buffer
void convertHexStringToUnsignedCharBuffer(string hexString, unsigned char* hexBuffer)
{
    int i = 0;
    while (hexString.length())
    {
        string hexStringPart = hexString.substr(0, 2);
        hexString = hexString.substr(2);
        int hexStringOneByte = convertHexStringToInt(hexStringPart);
        hexBuffer[i] = static_cast<unsigned char>(hexStringOneByte & 0xFF);
        i++;
    }
}

int main()
{
    // the hex string below is a hex representation of an unsigned char * buffer.
    // this is generated by an encryption algorithm in unsigned char* format.
    // I am converting it to a hex string to make it printable for verification purposes,
    // and take the hex string as input here to test the conversion logic.
    string inputHexString = "552027e33844dd7b71676b963c0b8e20";
    string outputHexString;
    stringstream geek;
    unsigned char * hexBuffer = new unsigned char[inputHexString.length()/2];
    convertHexStringToUnsignedCharBuffer(inputHexString, hexBuffer);
    for (int i = 0; i < inputHexString.length()/2; i++)
    {
        geek << std::hex << std::setw(2) << std::setfill('0') << (0xFF & hexBuffer[i]); // this works
        //geek << std::hex << std::setw(2) << std::setfill('0') << (hexBuffer[i]); // --> this does not work
        // I am not able to figure out why I need to do the bit-wise AND with "0xFF & hexBuffer[i]".
        // Without it the conversion does not work for individual bytes having ASCII values more than 127.
    }
    geek >> outputHexString;
    cout << "input hex string: " << inputHexString << endl;
    cout << "output hex string: " << outputHexString << endl;
    if (0 == inputHexString.compare(outputHexString))
        cout << "hex encoding successful" << endl;
    else
        cout << "hex encoding failed" << endl;
    if (NULL != hexBuffer)
        delete[] hexBuffer;
    return 0;
}
// output
// can someone explain? I am sure it's something silly that I am missing.
The C++20 way:
// needs <span>, <format>, <algorithm>, <iterator>, <iostream>, <string>
unsigned char* data = new unsigned char[]{ "Hello world\n\t\r\0" };
std::size_t data_size = sizeof("Hello world\n\t\r\0") - 1;
auto sp = std::span(data, data_size);
std::transform(sp.begin(), sp.end(),
               std::ostream_iterator<std::string>(std::cout),
               [](unsigned char c) -> std::string {
                   return std::format("{:02X}", int(c));
               });
or if you want to store the result into a string:
std::string result{};
result.reserve(data_size * 2);
// std::transform with std::back_inserter(result) would try to assign a whole
// std::string to a single char, so append each formatted byte instead:
for (unsigned char c : sp)
    std::format_to(std::back_inserter(result), "{:02X}", int(c));
Output:
48656C6C6F20776F726C640A090D00
The output of an unsigned char is like the output of a char, which obviously does not do what the OP expects.
I tested the following on coliru:
#include <iomanip>
#include <iostream>

int main()
{
    std::cout << "Output of (unsigned char)0xc0: "
              << std::hex << std::setw(2) << std::setfill('0') << (unsigned char)0xc0 << '\n';
    return 0;
}
and got:
Output of (unsigned char)0xc0: 0�
This is caused by the std::ostream::operator<<() which is chosen out of the available operators. I looked on cppreference
operator<<(std::basic_ostream) and
std::basic_ostream::operator<<
and found
template< class Traits >
basic_ostream<char,Traits>& operator<<( basic_ostream<char,Traits>& os,
                                        unsigned char ch );
in the former (with a little bit of help from M.M).
The OP suggested a fix: a bit-wise AND with 0xff, which seemed to work. Checking this on coliru.com:
#include <iomanip>
#include <iostream>

int main()
{
    std::cout << "Output of (unsigned char)0xc0: "
              << std::hex << std::setw(2) << std::setfill('0') << (0xff & (unsigned char)0xc0) << '\n';
    return 0;
}
Output:
Output of (unsigned char)0xc0: c0
Really, this seems to work. Why?
0xff is an int constant (strictly speaking: an integer literal) and has type int. Hence, the bit-wise AND promotes (unsigned char)0xc0 to int as well, yields a result of type int, and hence, the std::ostream::operator<< for int is applied.
This is one option to solve it. Another is simply converting the unsigned char to unsigned.
Strictly speaking, the promotion of an unsigned char to int can never sign-extend (int can represent every unsigned char value); only a plain or signed char could. Converting to unsigned simply states the intent directly, and the output stream operator for unsigned provides the intended output as well:
#include <iomanip>
#include <iostream>

int main()
{
    std::cout << "Output of (unsigned char)0xc0: "
              << std::hex << std::setw(2) << std::setfill('0') << (unsigned)(unsigned char)0xc0 << '\n';
    const unsigned char c = 0xc0;
    std::cout << "Output of unsigned char c = 0xc0: "
              << std::hex << std::setw(2) << std::setfill('0') << (unsigned)c << '\n';
    return 0;
}
Output:
Output of (unsigned char)0xc0: c0
Output of unsigned char c = 0xc0: c0
Live Demo on coliru
I'm using this code to convert the unsigned char* (points to an array of 256 values) to std::string:
int ClassA::Func(unsigned char *dataToSend, int sendLength)
{
    std::stringstream convertStream;
    std::string dataToSendStr = "";
    for (int i = 0; i <= sendLength; i++) convertStream << dataToSend[i];
    while (!convertStream.eof()) convertStream >> dataToSendStr;
    ...
}
but then I have dataToSendStr in this format:
dataToSendStr ""
[0] 0x00
[1] 0x00
[2] 0x04
[3] 0xC0
if I now use this value I only get "" and not the important values [0-3]!
-> need something like: dataToSendStr "000004C0"
Thx for your help!
Use the IO manipulator std::hex if you want a hexadecimal representation of the characters in dataToSend (which is what I think you want, based on the "need something like" part of the question):
std::ostringstream convertStream;
convertStream << std::hex << std::setfill('0');
for (int i = 0; i < sendLength; i++)
{
    convertStream << std::setw(2) << static_cast<short>(dataToSend[i]);
}
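As a small follow-up (my addition, not from the answer): the assembled text is retrieved with convertStream.str(), and std::uppercase produces the "000004C0" spelling asked for in the question:

convertStream << std::hex << std::uppercase << std::setfill('0');
for (int i = 0; i < sendLength; i++)
{
    convertStream << std::setw(2) << static_cast<short>(dataToSend[i]);
}
std::string dataToSendStr = convertStream.str(); // "000004C0" for the bytes shown above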
I have the following class (only partial; there are many more fields in the class):
class Network {
public:
    string src_ip_;
    string alternative_src_ip_;
    array<unsigned char, 6> mac_;
    string toString() {
        stringstream ss;
        ss << src_ip_ << SEPERATOR << alternative_src_ip_ << SEPERATOR;
        return ss.str();
    }
};
I want to add a formatted MAC (with ':' separators) to the toString method.
Is there a simple way to adapt my printMac method (by generalizing it or writing a new one) so that it can be combined with the << operator?
void printMac(array<unsigned char, 6> mac) {
    printf("%02x:%02x:%02x:%02x:%02x:%02x\n",
           (unsigned char) mac[0], (unsigned char) mac[1],
           (unsigned char) mac[2], (unsigned char) mac[3],
           (unsigned char) mac[4], (unsigned char) mac[5]);
}
Use the IO manipulators:
std::ostringstream s;
unsigned char arr[6] = { 0, 14, 10, 11, 89, 10 };
s << std::hex << std::setfill('0');
for (int i = 0; i < sizeof(arr); i++)
{
    if (i > 0) s << ':';
    // Need to:
    // - set width each time as it only
    //   applies to the next output field.
    // - cast to an int as std::hex is for
    //   integer I/O
    s << std::setw(2) << static_cast<int>(arr[i]);
}
You can replace your use of printf with sprintf and then use it to implement operator<< for ostreams
void printMac(array<unsigned char, 6> mac, char (&out)[18]) {
    sprintf(out, "%02x:%02x:%02x:%02x:%02x:%02x",
            (unsigned char) mac[0], (unsigned char) mac[1],
            (unsigned char) mac[2], (unsigned char) mac[3],
            (unsigned char) mac[4], (unsigned char) mac[5]);
}

std::ostream &operator<<(std::ostream &os, std::array<unsigned char, 6> mac) {
    char buf[18];
    printMac(mac, buf);
    return os << buf << '\n';
}
If you want to maintain printf-like code, you could try Boost.Format.
ss << boost::format("%02x:%02x:%02x:%02x:%02x:%02x\n")
      % +mac[0] % +mac[1] % +mac[2] % +mac[3] % +mac[4] % +mac[5];
(The unary + promotes each unsigned char to int; without it Boost.Format, like the plain stream operator, would output the bytes as characters rather than as hex digits.)
I have tried to find this topic on the web, but I couldn't find what I need.
I have a string of characters:
char * tempBuf = "qj";
The result I want is 0x716A, and that value is then going to be converted into a decimal value.
Is there any function in VC++ that can be used for that?
You can use a stringstream to convert each character to a hexadecimal representation.
#include <iostream>
#include <sstream>
#include <cstring>

int main()
{
    const char* tempBuf = "qj";
    std::stringstream ss;
    const char* it = tempBuf;
    const char* end = tempBuf + std::strlen(tempBuf);
    for (; it != end; ++it)
        ss << std::hex << unsigned(*it);
    unsigned result;
    ss >> result;
    std::cout << "Hex value: " << std::hex << result << std::endl;
    std::cout << "Decimal value: " << std::dec << result << std::endl;
}
So, if I understood the idea correctly...
#include <stdint.h>

// Copies up to four characters into the bytes of a uint32_t.
// Note: the result depends on byte order; on a little-endian machine
// "qj" yields 0x00006A71 rather than 0x716A.
uint32_t charToUInt32(const char* src) {
    uint32_t ret = 0;
    char* dst = (char*)&ret;
    for (int i = 0; (i < 4) && (*src); ++i, ++src)
        dst[i] = *src;
    return ret;
}
If I understand what you want correctly: just loop over the characters, start to finish; at each character, multiply the sum so far by 256, and add the value of the next character; that gives the decimal value in one shot.
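A minimal sketch of that loop (the function name is mine; the bytes are treated as unsigned, and it overflows for inputs longer than the integer type can hold):

#include <cstring>

unsigned long long charsToDecimal(const char* s)
{
    unsigned long long value = 0;
    for (std::size_t i = 0; i < std::strlen(s); ++i)
        value = value * 256 + static_cast<unsigned char>(s[i]);
    return value; // for "qj": 0x71 * 256 + 0x6A = 29034, i.e. 0x716A
}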
What you are looking for is called "hex encoding". There are a lot of libraries out there that can do that (unless what you were looking for was how to implement one yourself).
One example is crypto++.
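For example, with Crypto++'s HexEncoder (a sketch under the assumption that the library is installed and linked; the sample bytes are made up):

#include <cryptopp/filters.h>  // StringSource, StringSink
#include <cryptopp/hex.h>      // HexEncoder
#include <iostream>
#include <string>

int main()
{
    const unsigned char data[] = { 0x55, 0x20, 0x27, 0xe3 };
    std::string encoded;
    // Pump the raw bytes through a HexEncoder into a string sink.
    CryptoPP::StringSource ss(data, sizeof(data), true,
        new CryptoPP::HexEncoder(new CryptoPP::StringSink(encoded)));
    std::cout << encoded << '\n'; // "552027E3"
    return 0;
}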