I have tried to find this topic on the web, but I couldn't find what I need.
I have a string of characters:
char * tempBuf = "qj";
The result I want is 0x716A, and that value will then be converted to a decimal value.
Is there any function in VC++ that can be used for that?
You can use a stringstream to convert each character to a hexadecimal representation.
#include <iostream>
#include <sstream>
#include <cstring>
int main()
{
    const char* tempBuf = "qj";
    std::stringstream ss;
    const char* it = tempBuf;
    const char* end = tempBuf + std::strlen(tempBuf);

    // write each character's code into the stream as hex digits;
    // casting through unsigned char avoids sign extension for
    // characters above 127
    for (; it != end; ++it)
        ss << std::hex << unsigned(static_cast<unsigned char>(*it));

    // std::hex is sticky on the stream, so extraction parses hex as well
    unsigned result;
    ss >> result;

    std::cout << "Hex value: " << std::hex << result << std::endl;
    std::cout << "Decimal value: " << std::dec << result << std::endl;
}
So, if I understood the idea correctly...
#include <stdint.h>

uint32_t charToUInt32(const char* src) {
    uint32_t ret = 0;
    char* dst = (char*)&ret;
    // copy up to four characters into the integer's bytes; note that
    // the resulting value depends on the machine's byte order
    for (int i = 0; (i < 4) && (*src); ++i, ++src)
        dst[i] = *src;
    return ret;
}
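A quick usage check (my addition, not part of the answer above): because the bytes are copied into the integer in memory order, the printed value depends on the machine's endianness.

#include <iostream>
#include <stdint.h>

uint32_t charToUInt32(const char* src); // as defined above

int main()
{
    // a little-endian machine prints 6a71, not the 716a the question
    // asks for; a big-endian machine would print 716a0000
    std::cout << std::hex << charToUInt32("qj") << '\n';
}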
If I understand what you want correctly: just loop over the characters from start to finish; at each character, multiply the sum so far by 256 and add the value of the current character. That gives the decimal value in one shot, as in the sketch below.
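A minimal sketch of that approach (the function name strToNumber is my own, not from the answer):

#include <iostream>

// accumulate base-256: each character shifts the previous sum up one byte
unsigned long strToNumber(const char* s)
{
    unsigned long sum = 0;
    for (; *s; ++s)
        sum = sum * 256 + static_cast<unsigned char>(*s);
    return sum;
}

int main()
{
    // "qj" -> 0x71 * 256 + 0x6a = 29034, i.e. 0x716a
    std::cout << strToNumber("qj") << '\n'; // prints 29034
}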
What you are looking for is called "hex encoding". There are a lot of libraries out there that can do that (unless what you were looking for was how to implement one yourself).
One example is Crypto++.
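For instance, a sketch using Crypto++'s HexEncoder (assuming the library is installed and linked, e.g. with -lcryptopp):

#include <iostream>
#include <string>
#include <cryptopp/hex.h>
#include <cryptopp/filters.h>

int main()
{
    std::string input = "qj", encoded;
    // the StringSource pumps the input through the encoder into the sink;
    // the pipeline takes ownership of the attached transformations
    CryptoPP::StringSource ss(input, true,
        new CryptoPP::HexEncoder(new CryptoPP::StringSink(encoded)));
    std::cout << encoded << '\n'; // prints 716A
}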
The code below takes a hex string (every byte represented as its corresponding hex value), converts it to an unsigned char* buffer, and then converts it back to a hex string. It tests the conversion from an unsigned char* buffer to a hex string, which I need to send over the network to a receiver process. I chose a hex string because an unsigned char can hold values from 0 to 255 and there is no printable character after 127. The code below shows just the portion that bugs me; it's in the comment.
#include <iostream>
#include <sstream>
#include <iomanip>
using namespace std;
// converts a hex string to the corresponding integer, e.g. "c0" -> 192
int convertHexStringToInt(const string& hexString)
{
    stringstream geek;
    int x = 0;
    geek << std::hex << hexString;
    geek >> x;
    return x;
}

// converts a complete hex string to an unsigned char* buffer
void convertHexStringToUnsignedCharBuffer(string hexString, unsigned char* hexBuffer)
{
    int i = 0;
    while (hexString.length())
    {
        string hexStringPart = hexString.substr(0, 2);
        hexString = hexString.substr(2);
        int hexStringOneByte = convertHexStringToInt(hexStringPart);
        hexBuffer[i] = static_cast<unsigned char>(hexStringOneByte & 0xFF);
        i++;
    }
}
int main()
{
    // The hex string below is the hex representation of an unsigned char* buffer.
    // It was generated by an encryption algorithm in unsigned char* format;
    // I convert it to a hex string to make it printable for verification purposes,
    // and take the hex string as input here to test the conversion logic.
    string inputHexString = "552027e33844dd7b71676b963c0b8e20";
    string outputHexString;
    stringstream geek;
    unsigned char* hexBuffer = new unsigned char[inputHexString.length() / 2];
    convertHexStringToUnsignedCharBuffer(inputHexString, hexBuffer);
    for (size_t i = 0; i < inputHexString.length() / 2; i++)
    {
        geek << std::hex << std::setw(2) << std::setfill('0') << (0xFF & hexBuffer[i]); // this works
        //geek << std::hex << std::setw(2) << std::setfill('0') << (hexBuffer[i]); // --> this does not work
        // I am not able to figure out why I need the bit-wise AND with "0xFF & hexBuffer[i]".
        // Without it the conversion does not work for individual bytes with values above 127.
    }
    geek >> outputHexString;
    cout << "input hex string:  " << inputHexString << endl;
    cout << "output hex string: " << outputHexString << endl;
    if (0 == inputHexString.compare(outputHexString))
        cout << "hex encoding successful" << endl;
    else
        cout << "hex encoding failed" << endl;
    delete[] hexBuffer; // deleting a null pointer would be safe anyway
    return 0;
}
// output
// can someone explain? I am sure it's something silly that I am missing.
The C++20 way:
// requires <algorithm>, <format>, <iterator>, <span> (and <iostream> for std::cout)
unsigned char data[] = "Hello world\n\t\r\0"; // a plain array avoids the leaked new[]
std::size_t data_size = sizeof(data) - 1;     // drop only the implicit terminator
auto sp = std::span(data, data_size);

std::transform(sp.begin(), sp.end(),
               std::ostream_iterator<std::string>(std::cout),
               [](unsigned char c) -> std::string {
                   return std::format("{:02X}", int(c));
               });
or, if you want to store the result in a string:
std::string result{};
result.reserve(data_size * 2);
// std::back_inserter(result) appends single chars, so a std::transform whose
// lambda returns std::string would not compile; std::format_to appends the
// two hex digits per byte directly instead
for (unsigned char c : sp)
    std::format_to(std::back_inserter(result), "{:02X}", int(c));
Output:
48656C6C6F20776F726C640A090D00
The output of an unsigned char is like the output of a char, which obviously does not do what the OP expects.
I tested the following on coliru:
#include <iomanip>
#include <iostream>
int main()
{
    std::cout << "Output of (unsigned char)0xc0: "
              << std::hex << std::setw(2) << std::setfill('0') << (unsigned char)0xc0 << '\n';
    return 0;
}
and got:
Output of (unsigned char)0xc0: 0�
This is caused by the std::ostream::operator<<() which is chosen out of the available operators. I looked on cppreference
operator<<(std::basic_ostream) and
std::basic_ostream::operator<<
and found
template< class Traits >
basic_ostream<char,Traits>& operator<<( basic_ostream<char,Traits>& os,
unsigned char ch );
in the former (with a little bit help from M.M).
The OP suggested a fix: a bit-wise AND with 0xff, which seemed to work. Checking this on coliru.com:
#include <iomanip>
#include <iostream>
int main()
{
    std::cout << "Output of (unsigned char)0xc0: "
              << std::hex << std::setw(2) << std::setfill('0') << (0xff & (unsigned char)0xc0) << '\n';
    return 0;
}
Output:
Output of (unsigned char)0xc0: c0
Really, this seems to work. Why?
0xff is an int constant (strictly speaking: an integer literal) and has type int. Hence, the bit-wise AND promotes (unsigned char)0xc0 to int as well, yields a result of type int, and thus the std::ostream::operator<< for int is applied.
This is an option to solve this. I can provide another one – just converting the unsigned char to unsigned.
The promotion of unsigned char to int can never actually sign-extend (an unsigned char is zero-extended), but converting to unsigned avoids even the appearance of it. The output stream operator for unsigned provides the intended output as well:
#include <iomanip>
#include <iostream>
int main()
{
    std::cout << "Output of (unsigned char)0xc0: "
              << std::hex << std::setw(2) << std::setfill('0') << (unsigned)(unsigned char)0xc0 << '\n';
    const unsigned char c = 0xc0;
    std::cout << "Output of unsigned char c = 0xc0: "
              << std::hex << std::setw(2) << std::setfill('0') << (unsigned)c << '\n';
    return 0;
}
Output:
Output of (unsigned char)0xc0: c0
Output of unsigned char c = 0xc0: c0
Live Demo on coliru
What I'm trying to do is converting a string's bytes into hexadecimal format.
Based on this answer (and many others consistent with it) I've tried this code:
#include <sstream>
#include <iomanip>
#include <iostream>
int main()
{
    std::string inputText = u8"A7°";
    std::stringstream ss;
    // print every char of the string as hex on 2 digits
    for (unsigned int i = 0; i < inputText.size(); ++i)
    {
        ss << std::hex << std::setfill('0') << std::setw(2) << (int)inputText[i];
    }
    std::cout << ss.str() << std::endl;
}
but with some characters encoded in UTF-8 it doesn't work.
For instance, with strings containing the degree symbol ( ° ) encoded in UTF-8, the result is ffffffc2ffffffb0 instead of c2b0.
I would expect the algorithm to work on individual bytes regardless of their contents, and furthermore the result seems to ignore the setw(2) parameter.
Why do I get such a result?
As Pete Becker already hinted in a comment, converting a negative value to a larger integer type fills the higher bits with '1' (sign extension). The solution is to cast the char to unsigned char first, before casting it to int:
#include <string>
#include <iostream>
#include <iomanip>
int main()
{
    std::string inputText = "-12°C";
    // print every char of the string as hex on 2 digits
    for (unsigned int i = 0; i < inputText.size(); ++i)
    {
        std::cout << std::hex << std::setfill('0')
                  << std::setw(2) << (int)(unsigned char)inputText[i];
    }
}
setw sets the minimal width; it does not truncate longer values.
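A quick demonstration of that point:

#include <iomanip>
#include <iostream>

int main()
{
    // setw(2) pads short values up to two characters...
    std::cout << std::hex << std::setw(2) << std::setfill('0') << 0x5 << '\n';        // prints 05
    // ...but wider values are printed in full, exactly like the ffffffc2 above
    std::cout << std::hex << std::setw(2) << std::setfill('0') << 0xffffffc2 << '\n'; // prints ffffffc2
}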
Here is the code I have right now:
#include <iostream>
#include <string>
#include <sstream>
std::string string_to_hex(const std::string& input)
{
    static const char* const lut = "0123456789ABCDEF";
    size_t len = input.length();

    std::string output;
    output.reserve(2 * len);
    for (size_t i = 0; i < len; ++i)
    {
        const unsigned char c = input[i];
        output.push_back(lut[c >> 4]);
        output.push_back(lut[c & 15]);
    }
    return output;
}

std::string encrypt(std::string msg, std::string key)
{
    // Make sure the key is at least as long as the message
    std::string tmp(key);
    while (key.size() < msg.size())
        key += tmp;

    // And now for the encryption part
    for (std::string::size_type i = 0; i < msg.size(); ++i)
        msg[i] ^= key[i];
    return msg;
}

std::string decrypt(std::string msg, std::string key)
{
    return encrypt(msg, key); // lol
}
int main()
{
std::cout << string_to_hex(encrypt("Hello World!", "monkey")) << std::endl;
std::cout << decrypt("\x25\x0A\x02\x07\x0A\x59\x3A\x00\x1C\x07\x01\x58", "monkey") << std::endl;
std::cout << string_to_hex(encrypt("Hello. This is a test of encrypting strings in C++.", "monkey")) << std::endl;
std::cout << decrypt("\x25\x0A\x02\x07\x0A\x57\x4D\x3B\x06\x02\x16\x59\x04\x1C\x4E\x0A\x45\x0D\x08\x1C\x1A\x4B\x0A\x1F\x4D\x0A\x00\x08\x17\x00\x1D\x1B\x07\x05\x02\x59\x1E\x1B\x1C\x02\x0B\x1E\x1E\x4F\x07\x05\x45\x3A\x46\x44\x40", "monkey") << std::endl;
}
The output is the following:
250A02070A593A001C070158
Hello W
250A02070A574D3B06021659041C4E0A450D081C1A4B0A1F4D0A000817001D1B070502591E1B1C020B1E1E4F0705453A464440
Hello. This is a test of e
The decryption seems to stop when reaching a \x00. Does anyone have any ideas on how to fix or get around that?
Thanks!
The std::string constructor that takes a char* assumes the input is a null-terminated string, so even though your string literal has lots of data past the null byte, the std::string constructor stops reading as soon as it hits that null byte.
You have a couple of options to fix this. As one option, std::string has a two-argument constructor where you give a pointer to the first byte of the string and a pointer one past the last byte. The std::string then initializes itself to the text in that range, ignoring any intermediate null bytes.
char s1[] = "\x25\x0A\x02\x07\x0A\x59\x3A\x00\x1C\x07\x01\x58";
char s2[] = "\x25\x0A\x02\x07\x0A\x57\x4D\x3B\x06\x02\x16\x59\x04\x1C\x4E\x0A\x45\x0D\x08\x1C\x1A\x4B\x0A\x1F\x4D\x0A\x00\x08\x17\x00\x1D\x1B\x07\x05\x02\x59\x1E\x1B\x1C\x02\x0B\x1E\x1E\x4F\x07\x05\x45\x3A\x46\x44\x40";
std::cout << string_to_hex(encrypt("Hello World!", "monkey")) << std::endl;
std::cout << decrypt(std::string(std::begin(s1), std::end(s1)-1), "monkey") << std::endl;
std::cout << string_to_hex(encrypt("Hello. This is a test of encrypting strings in C++.", "monkey")) << std::endl;
std::cout << decrypt(std::string(std::begin(s2), std::end(s2)-1), "monkey") << std::endl;
I'm trying to convert Unicode code points to percent encoded UTF-8 code units.
The Unicode -> UTF-8 conversion seems to be working correctly as shown by some testing with Hindi and Chinese characters which show up correctly in Notepad++ with UTF-8 encoding, and can be translated back properly.
I thought the percent encoding would be as simple as adding '%' in front of each UTF-8 code unit, but that doesn't quite work. Rather than the expected %E5%84%A3, I'm seeing %xE5%x84%xA3 (for the unicode U+5123).
What am I doing wrong?
Added code (note that utf8.h belongs to the UTF8-CPP library).
#include <fstream>
#include <iostream>
#include <vector>
#include "utf8.h"
std::string unicode_to_utf8_units(int32_t unicode)
{
    unsigned char u[5] = {0, 0, 0, 0, 0};
    unsigned char *iter = u, *limit = utf8::append(unicode, u);
    std::string s;
    for (; iter != limit; ++iter) {
        s.push_back(*iter);
    }
    return s;
}

int main()
{
    std::ofstream ofs("test.txt", std::ios_base::out);
    if (!ofs.good()) {
        std::cout << "ofstream encountered a problem." << std::endl;
        return 1;
    }

    utf8::uint32_t unicode = 0x5123;
    auto s = unicode_to_utf8_units(unicode);
    for (auto &c : s) {
        ofs << "%" << c;
    }

    ofs.close();
    return 0;
}
You actually need to convert the byte values to the corresponding ASCII strings. For example:
"é" in UTF-8 is the sequence { 0xc3, 0xa9 }. Please note that these are bytes, i.e. char values in C++.
Each byte needs to be converted to "%C3" and "%A9" respectively.
One way to do so is with a stringstream:
// needs <sstream> and <iomanip>
std::ostringstream out;
std::string utf8str = "\xE5\x84\xA3";
for (std::size_t i = 0; i < utf8str.length(); ++i) {
    // setw/setfill keep bytes below 0x10 two digits wide
    out << '%' << std::hex << std::uppercase << std::setw(2) << std::setfill('0')
        << (int)(unsigned char)utf8str[i];
}
Or in C++11:
for (auto c : utf8str) {
    out << '%' << std::hex << std::uppercase << std::setw(2) << std::setfill('0')
        << (int)(unsigned char)c;
}
Please note that the bytes need to be cast to int, because otherwise the << operator writes the raw byte itself rather than its numeric representation.
Casting to unsigned char first is needed because otherwise the sign bit propagates into the int value, producing output of negative values like FFFFFFE5.
I have the following string: s="80". I need to put this in an unsigned char k[]. The unsigned char array should look like this: unsigned char k[] = {0x38, 0x30}, where 0x38 is '8' and 0x30 is '0'. These are the hexadecimal values for 8 and 0. How do I do this? Please give some code. Thanks!
I am working with C++ code on Ubuntu. I use this for encryption; I need 0x38 in an unsigned char.
EDIT:
How do I obtain the dec/char values and put them in an unsigned char k[]?
I've realised that it's OK if the unsigned char[] holds the decimal values {56, 48} of the 8 and 0 that I have in the string!
Assuming you want this string interpreted as ASCII (or UTF-8), it is already in the correct format:
std::string s="80";
std::cout << "0x" << std::hex << static_cast<int>(s[0]) << "\n";
std::cout << "0x" << std::hex << static_cast<int>(s[1]) << "\n";
If you want it in an int array, then just copy it:
int data[2];
std::copy(s.begin(), s.end(), data);
Whether you store '8' or 0x38, the computer represents them with the same binary value.
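A one-line check of that claim (assuming an ASCII execution character set):

static_assert('8' == 0x38, "'8' and 0x38 are the same value in ASCII");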
I think you do not really understand what you are asking.
The following are synonyms:
std::string s = "\x38\x30";
std::string s = "80";
As the following are synonyms:
char c = '8', d = '0';
char c = s[0], d = s[1];
char c = 0x38, d = 0x30;
It is exactly the same (unless your base encoding is not ASCII). This is not encryption.
std::string s = "80";
unsigned char* pArray = new unsigned char[s.size()];
const char* p = s.c_str();
unsigned char* p2 = pArray;
while (*p)
    *p2++ = *p++;
// ... use pArray here ...
delete[] pArray;
You can try this. I did not write this code myself; I found it, just like you did:
#include <algorithm>
#include <sstream>
#include <iostream>
#include <iterator>
#include <iomanip>
#include <string>
#include <cstdlib>

namespace {
    const std::string test = "mahmutefe";
}

int main() {
    std::ostringstream result;
    // note: setw(2) only affects the next insertion; it happens not to
    // matter here because every byte of "mahmutefe" needs two hex digits
    result << std::setw(2) << std::setfill('0') << std::hex << std::uppercase;
    std::copy(test.begin(), test.end(), std::ostream_iterator<unsigned int>(result, " "));
    std::cout << test << ":" << result.str() << std::endl;
    std::system("PAUSE");
}
Convert that string to a char array, then subtract '0' from each char.
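For what it's worth, a minimal sketch of that suggestion; note that subtracting '0' yields the digit values 8 and 0, not the ASCII codes 0x38 and 0x30 that the question settled on:

#include <iostream>
#include <string>

int main()
{
    std::string s = "80";
    for (char ch : s)
        std::cout << (ch - '0') << ' '; // prints: 8 0
}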