I have created a function to count the byte length of an incoming hex string, then convert that length into hexadecimal. It first assigns the byte length of the incoming string to an int, then converts that int to a string. After assigning the byte length to an int, I check whether it is more than 255; if it is, I insert a zero in front so that I get a full 2 bytes (4 hex digits) back instead of 3 hex digits.
I do the following:
1) Takes in the hex string and divides its length by 2.
static int ByteLen(std::string sHexStr)
{
    return (sHexStr.length() / 2);
}
2) Takes in the hex string, then converts its length to a hex-formatted string with itoa()
static std::string ByteLenStr(std::string sHexStr)
{
    //Assign the length to an int
    int iLen = ByteLen(sHexStr);
    std::string sTemp = "";
    std::string sZero = "0";
    std::string sLen = "";
    char buffer[1000];

    if (iLen > 255)
    {
        //returns the number passed, converted to hex (base-16);
        //if it is over 255 then it will insert a 0 in front
        //so we have a full 2 bytes (4 hex digits) instead of 3
        sTemp = itoa(iLen, buffer, 16);
        sLen = sTemp.insert(0, sZero);
        return sLen;
    }
    else
    {
        return itoa(iLen, buffer, 16);
    }
}
I convert the length to hexadecimal. This seems to work fine; however, I am looking for a simpler way to format the text, like I would in C# with the ToString("X2") method. Is this it for C++, or does my method work well enough?
Here is how I would do it in C#:
public static int ByteLen(string sHexStr)
{
    return (sHexStr.Length / 2);
}

public static string ByteLenStr(string sHexStr)
{
    int iLen = ByteLen(sHexStr);
    if (iLen > 255)
        return iLen.ToString("X4");
    else
        return iLen.ToString("X2");
}
My logic may be off a bit in C++, but the C# method is good enough for me in what I want to do.
Thank you for your time.
static std::string ByteLenStr(std::string& sHexStr)
{
    int iLen = ByteLen(sHexStr);
    char buffer[16];
    snprintf(buffer, sizeof(buffer), (iLen > 255) ? "%04x" : "%02x", iLen);
    return buffer;
}
snprintf formats text in a buffer using a format string and a variable list of arguments. We are using the %x format code to convert an int argument into a hex string. In this instance, we have two format strings to choose from:
When iLen > 255, we want the number to be four digits long. %04x means format as a hex string, with zero-padding at the beginning up to four places.
Otherwise, we want the number to be two digits long. %02x means format as a hex string, with zero-padding up to two places.
We use the ternary operator to select which format string we use. Finally, iLen is passed as the single argument which will be used to provide the value that is formatted by the function.
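As a quick sanity check of those two format strings, a small helper along the lines of the code above (the name format_len is mine) behaves as follows:

```cpp
#include <cassert>
#include <cstdio>
#include <string>

// Hypothetical helper mirroring the answer's formatting logic:
// 4 zero-padded hex digits above 255, otherwise 2.
static std::string format_len(int len)
{
    char buffer[16];
    std::snprintf(buffer, sizeof(buffer), (len > 255) ? "%04x" : "%02x", len);
    return buffer;
}
```

For example, format_len(10) gives "0a", format_len(255) gives "ff", and format_len(300) gives "012c".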
For a purely C++ solution that does not use any C library functions, try using a std::stringstream to help you with the formatting:
static std::string ByteLenStr(std::string sHexStr)
{
    //Assign the length to an int
    int iLen = ByteLen(sHexStr);

    //return the number converted to hex (base-16);
    //if it is over 255 then insert a 0 in front
    //so we have a full 2 bytes (4 hex digits)
    std::stringstream ss;
    ss.fill('0');
    ss.width((iLen > 255) ? 4 : 2);
    ss << std::right << std::hex << iLen;
    return ss.str();
}
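If you prefer the manipulator style, the same zero-padded hex formatting can also be written with the <iomanip> manipulators; this is an equivalent sketch of the stream approach, not a different algorithm (len_to_hex is a made-up name):

```cpp
#include <cassert>
#include <iomanip>
#include <sstream>
#include <string>

// Same formatting expressed with std::setfill/std::setw
// instead of the fill()/width() member functions.
static std::string len_to_hex(int len)
{
    std::ostringstream ss;
    ss << std::setfill('0') << std::setw((len > 255) ? 4 : 2)
       << std::hex << len;
    return ss.str();
}
```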
I am looking for a way to get a filled spot in the string when I send the 0x00 value.
For example, by sending the value hexToASCII("6765656b73") I get the string "geeks".
But when I send the value hexToASCII("0065006b73") I only get the string "eks"; the 00 bytes do not seem to count at all.
Is there something I can add to fill those spots in the string with a value equal to ASCII NUL?
Thanks a lot in advance.
// C++ program to convert a hexadecimal
// string to an ASCII string
#include <iostream>
#include <string>
using namespace std;

string hexToASCII(string hex)
{
    // initialize the ASCII string as empty
    string ascii = "";
    for (size_t i = 0; i < hex.length(); i += 2)
    {
        // extract two characters from the hex string
        string part = hex.substr(i, 2);

        // parse them as base 16 and
        // cast the value to a character
        char ch = stoul(part, nullptr, 16);

        // add this char to the final ASCII string
        ascii += ch;
    }
    return ascii;
}

// Driver code
int main()
{
    // print the ASCII string
    cout << hexToASCII("6765656b73") << endl;
    return 0;
}
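For what it's worth, the NUL bytes are already in the returned string: unlike a C string, std::string tracks its length explicitly, so the '\0' characters are stored, they just render as nothing on a terminal. A slightly tightened sketch of the same conversion makes that checkable via size():

```cpp
#include <cassert>
#include <string>

// Same conversion as in the question; std::string keeps '\0' bytes,
// they simply print as nothing on most terminals.
static std::string hexToASCII(const std::string& hex)
{
    std::string ascii;
    for (std::size_t i = 0; i + 1 < hex.length(); i += 2)
        ascii += static_cast<char>(std::stoul(hex.substr(i, 2), nullptr, 16));
    return ascii;
}
```

Here hexToASCII("0065006b73") returns a string of size 5 whose first byte is '\0'. If you want a visible placeholder instead of an invisible NUL, substitute a printable character such as '.' whenever the parsed value is 0.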
QByteArray ba;
QDataStream ds(&ba, QIODevice::WriteOnly);
ds << quint8(1) << quint16(2) << quint32(3); // 1+2+4
qDebug() << "size:" << ba.size(); // 7
I use QDataStream to write 3 numbers; ba.size() is 7. But I'm confused about this:
QByteArray ba;
QDataStream ds(&ba, QIODevice::WriteOnly);
QString s = "a";
ds << quint8(1) << quint16(2) << quint32(3) << s; // 1+2+4+a
qDebug() << "size:" << ba.size(); // 13
If the QString's size is 1, why does ba's size grow by 6? sizeof(QString) is 4.
Let's analyze the difference between the two outputs:

"\x01\x00\x02\x00\x00\x00\x03"
"\x01\x00\x02\x00\x00\x00\x03\x00\x00\x00\x02\x00""a"
-----------------------------------------------------
                              \x00\x00\x00\x02\x00""a
And for that, let's review the source code:
QDataStream &operator<<(QDataStream &out, const QString &str)
{
    if (out.version() == 1) {
        out << str.toLatin1();
    } else {
        if (!str.isNull() || out.version() < 3) {
            if ((out.byteOrder() == QDataStream::BigEndian) == (QSysInfo::ByteOrder == QSysInfo::BigEndian)) {
                out.writeBytes(reinterpret_cast<const char *>(str.unicode()), sizeof(QChar) * str.length());
            } else {
                QVarLengthArray<ushort> buffer(str.length());
                const ushort *data = reinterpret_cast<const ushort *>(str.constData());
                for (int i = 0; i < str.length(); i++) {
                    buffer[i] = qbswap(*data);
                    ++data;
                }
                out.writeBytes(reinterpret_cast<const char *>(buffer.data()), sizeof(ushort) * buffer.size());
            }
        } else {
            // write null marker
            out << (quint32)0xffffffff;
        }
    }
    return out;
}
That operator uses the writeBytes() method, and according to the docs:
QDataStream &QDataStream::writeBytes(const char *s, uint len)
Writes the length specifier len and the buffer s to the stream and
returns a reference to the stream.
The len is serialized as a quint32, followed by len bytes from s. Note
that the data is not encoded.
That is, apart from writing the data, writeBytes() writes the length of the text as a quint32 (4 bytes), and the length of the buffer is equal to sizeof(QChar) × the length of the QString.
Taking that into account, we can understand the result better:

\x00\x00\x00\x02   \x00""a
----------------   -------
length of buffer   buffer
In general you can use the following formula to calculate the size of the stored data:
length of stored data = 4 + 2 × length of string
Checking the Qt documentation for QDataStream on how strings are stored and retrieved:
a char * string is written as a 32-bit integer equal to the length of
the string including the '\0' byte, followed by all the characters of
the string including the '\0' byte. When reading a char * string, 4
bytes are read to create the 32-bit length value, then that many
characters for the char * string including the '\0' terminator are
read.
So in your case that is 32 bits for the length of the string, plus 1 byte for "a", plus 1 byte for the '\0', which sums to 6 bytes. (Note that a QString is actually serialized as a quint32 byte count followed by UTF-16 data, so the same total also falls out as 4 + sizeof(QChar) × 1 = 6.)
I want to print the MD5 of some string. For this I have written the function
std::string generateHashMD5(std::string text)
{
    unsigned char * resultHash;
    resultHash = MD5((const unsigned char*)text.c_str(), text.size(), NULL);
    std::string result;
    result += (char *) resultHash;
    return result;
}
Now I want to print the result of this function. I have tried two versions of such a function:
void printHash(std::string hash)
{
    for (unsigned i = 0; i < hash.size(); i++)
    {
        int val = (short) hash[i];
        std::cout << std::hex << val << ':';
    }
    std::cout << std::endl;
}
std::string printHash(std::string hash)
{
    char arrayResult[200];
    for (int i = 0; i < 16; i++)
        sprintf(&arrayResult[i*2], "%02x", (unsigned short int)hash[i]);
    std::string result;
    result += arrayResult;
    return result;
}
The problem is that, unfortunately, neither of them shows the correct result. What should be changed in these functions, or where are the mistakes?
You improperly use std::string as a buffer:
result += (char *) resultHash;
treats resultHash as a C string, so if there is a \0 byte in the middle you will not get all of the data, and if there is no \0 byte at all you will copy too much and get undefined behavior. You should use the constructor that takes a size:
std::string result( reinterpret_cast<const char *>( resultHash ), blocksize );
where blocksize is probably 16 (the MD5 digest length). But I would recommend using std::array<uint8_t, blocksize> or std::vector<uint8_t> instead of std::string, as std::string used as a raw buffer is very confusing.
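To see the difference the (pointer, size) constructor makes, here is a minimal sketch using a made-up 16-byte digest that contains embedded zeros (the values are purely illustrative, not a real MD5 of anything):

```cpp
#include <cassert>
#include <string>

// Hypothetical 16-byte "digest" with embedded zeros, for illustration only.
static const unsigned char kFakeDigest[16] = {
    0xd4, 0x1d, 0x00, 0x8d, 0x98, 0xf0, 0x0b, 0x20,
    0x4e, 0x98, 0x00, 0x99, 0x8e, 0xcf, 0x84, 0x27
};

// C-string append: reading stops at the first 0x00 byte.
static std::string asCString()
{
    std::string result;
    result += reinterpret_cast<const char *>(kFakeDigest);
    return result;
}

// (pointer, size) constructor: every byte is copied, zeros included.
static std::string asSizedBuffer()
{
    return std::string(reinterpret_cast<const char *>(kFakeDigest), 16);
}
```

The C-string version keeps only the 2 bytes before the first zero, while the sized version keeps all 16.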
In case MD5 returns a byte array,
result += (char *) resultHash;
return result;
will lose the bytes after a 0, because appending a char * interprets the input as a null-terminated string.
So a vector can be used, or string construction with an explicit number of characters.
Still, there is not enough information to say exactly.
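For the printing side, a hex-dump helper along these lines works on every byte of the buffer; going through unsigned char avoids the sign-extension that makes (short)hash[i] print values like ffffffab for bytes at or above 0x80 (the name toHex is my own):

```cpp
#include <cassert>
#include <cstdio>
#include <string>

// Hex-dump every byte of a buffer. The unsigned char loop variable
// prevents sign-extension of bytes >= 0x80.
static std::string toHex(const std::string& hash)
{
    std::string result;
    char buf[3];
    for (unsigned char c : hash) {
        std::snprintf(buf, sizeof(buf), "%02x", c);
        result += buf;
    }
    return result;
}
```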
What is the easiest way to convert a string holding an octal number into a string holding the decimal representation of the same number?
I could convert it into an int value using strtol, then convert it back into a string using a stringstream:
string oct_number("203");
// converts into integer
int value = strtol(oct_number.c_str(), NULL, 8);
// converts int to decimal string
stringstream ss;
ss << value;
string dec_number = ss.str();
But: is there any quicker way to do it?
I have a rather poor understanding of the stringstream class, am I missing something?
std::string result;

try
{
    result = std::to_string( std::stoi( oct_number, 0, 8 ) );
}
catch ( ... )
{
    //...
}
In C++11 you can use string dec_number = to_string (value);.
Also you can use sprintf, but you will need a buffer:
char buf[64];
sprintf(buf,"%d",value);
string dec_number = buf;
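Putting those pieces together under the C++11 assumption, the whole conversion can be a one-line helper (octToDec is a made-up name):

```cpp
#include <cassert>
#include <string>

// Octal string -> decimal string: parse as base 8, format as base 10.
static std::string octToDec(const std::string& oct_number)
{
    return std::to_string(std::stoi(oct_number, nullptr, 8));
}
```

For example, octToDec("203") returns "131". Note that std::stoi throws std::invalid_argument or std::out_of_range on bad input, so wrap the call in a try/catch as shown above if the input is untrusted.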
In C++, I have an AES-encrypted char* which I transform to its hex representation before sending it as a URL parameter, just as is done in this question. Now I want to do the opposite, that is, convert such hex back to a char* again. However, I am puzzled here: using sprintf with either %x or %s would result in a totally different value. How could I convert it back again? Thanks...
You can use sscanf() like this:

#define LEN 16 /* 128/8 */

void aes_to_char(char *aes, char *res)
{
    int i;
    for (i = 0; i < LEN; i++) {
        sscanf(aes, "%2hhx", &res[i]);
        aes += 2;
    }
}

"%2hhx" means "a 2-character hexadecimal value, to be stored through a char *".
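A small round-trip sketch of both directions, using a 4-byte buffer in place of the full 16-byte AES block (the helper names are hypothetical):

```cpp
#include <cassert>
#include <cstdio>
#include <string>

// Bytes -> hex with "%02x"; two hex characters per byte.
static std::string bytesToHex(const unsigned char *p, int n)
{
    std::string hex;
    char buf[3];
    for (int i = 0; i < n; i++) {
        std::snprintf(buf, sizeof(buf), "%02x", p[i]);
        hex += buf;
    }
    return hex;
}

// Hex -> bytes with "%2hhx"; reads exactly two characters per byte.
static void hexToBytes(const char *hex, unsigned char *out, int n)
{
    for (int i = 0; i < n; i++)
        std::sscanf(hex + 2 * i, "%2hhx", &out[i]);
}
```

Converting {0xde, 0xad, 0x00, 0x7f} to hex and back recovers the original bytes, including the embedded zero.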