I am new to C++ and still trying to find my way in. I have attempted to adapt a function I found on SO to convert my string into the bytes I need:
void hexconvert(const char *text, unsigned char bytes[])
{
    int i;
    int temp;

    for (i = 0; i < 4; ++i) {
        sscanf(text + 2 * i, "%2x", &temp);
        bytes[i] = temp;
    }

    cout << bytes;
}

hexconvert("SKY 000.001\n", );
Issues I have are:
1) I am not sure how to amend the for loop to handle my string.
2) I am not sure what I should use as the input for the second parameter in the function.
Can anyone assist?
Thanks
This is my suggested solution. I used it to encode a GUID as a byte array. It should perform better than calling printf for every character.
typedef unsigned char byte;

std::map<char, byte> char2hex =
{
    {'0', 0x0},
    {'1', 0x1},
    {'2', 0x2},
    {'3', 0x3},
    {'4', 0x4},
    {'5', 0x5},
    {'6', 0x6},
    {'7', 0x7},
    {'8', 0x8},
    {'9', 0x9},
    {'a', 0xa},
    {'b', 0xb},
    {'c', 0xc},
    {'d', 0xd},
    {'e', 0xe},
    {'f', 0xf}
};

void convertToBytes(const string &chars, byte bytes[])
{
    for (size_t i = 0; i < chars.length() / 2; i++) {
        byte b1 = (byte)(char2hex[chars[2*i]] << 4);
        byte b2 = char2hex[chars[2*i+1]];
        bytes[i] = b1 | b2;
    }
}
Remember that two ASCII characters make up one byte, so for every pair of characters I have to convert the first character to a byte, shift it up by 4 bits, and then OR it with the next character to get one byte.
To print a string as bytes:
const size_t length = data.length();

for (size_t i = 0; i < length; ++i)
{
    unsigned int value = static_cast<unsigned char>(data[i]);
    std::cout << std::dec << std::setw(3) << std::setfill(' ') << value
              << " (0x" << std::setw(2) << std::setfill('0') << std::hex << value << ')'
              << "\n";
}
Some important rules to remember:
1. Copy the character into an integer type variable, so that cout doesn't print as character.
2. Bytes are unsigned.
3. When filling the width with 0 for hex, remember to reset it to space before printing decimal.
4. Use std::hex for printing in hexadecimal, and remember to reset it with std::dec afterwards (if you are printing in decimal afterwards).
See <iomanip>.
Edit 1: C-Style
To use the C language style:
static const char data[] = "Hello World!";
const size_t length = strlen(data);

for (size_t i = 0; i < length; ++i)
{
    printf("%3d (0x%02X)\n", data[i], data[i]);
}
The above assumes that data is a NUL-terminated character array.
Related
I have a string value:
string str = "2018";
Now I have to store it in an unsigned char array as a hex-looking representation, without really converting the value:
unsigned char data[2]; // [0x20, 0x18]
If I do it this way:
data[0] = 0x20;
data[1] = 0x18;
it works, but my input is a string; how can I resolve it?
Edit
If my input is unsigned char instead of string like
unsigned char y1 = 20;
unsigned char y2 = 18;
Is there any better way?
A brief search turned up the function QString::toInt(bool *, int), which can be useful for your intent.
Basically you could:
bool res;
if (str.size() % 2 == 1) {
    str = '0' + str;
}

for (int i = 0; i < str.size() / 2; i++) {
    data[i] = str.mid(2 * i, 2).toInt(&res, 16);
}
I did not try this code; there is surely a better way to extract the substring, and probably a more efficient approach than iterating over it.
Perhaps you could try something like this:
#include <cstdio>
#include <iostream>
#include <string>

int main()
{
    std::string s = "2018";
    unsigned i;
    std::sscanf(s.c_str(), "%04x", &i);

    unsigned char data[2];
    data[0] = i >> 8;
    data[1] = i;

    std::cout << std::hex << (int)data[0] << " " << (int)data[1] << std::endl;
    return 0;
}
https://ideone.com/SyYKUl
Prints:
20 18
If you can assume the string to have 4 digits, you can convert it to BCD format simply and efficiently this way:
void convert_to_bcd4(unsigned char *data, const char *str) {
    data[0] = (str[0] - '0') * 16 + (str[1] - '0');
    data[1] = (str[2] - '0') * 16 + (str[3] - '0');
}
You can complete the conversion of "2018" to 0x20 0x18 using a hex string to binary converter. I think, for example, sscanf("%x",....) will do this. This typically gives an int. You can extract the byte values from the int in the normal way. (This method does not check for errors.)
How do I convert a char array to its equivalent ASCII decimal array? I tried in Qt using QString and QByteArray, but it is not working.
i/p: "1,9,10,2"
o/p: "4944574449484450" ('1''s ASCII decimal value is 49, ','''s is 44, '9''s is 57, and so on).
How to convert char array to its equivalent ascii decimal array ?
You do not, it is already stored that way. If you want to print your array as decimal numbers for ASCII just output it accordingly (as integers):
char str[] = "1,9,10,2";
for (const char *s = str; *s; ++s)
    std::cout << static_cast<int>(*s) << ","; // output number instead of character
char in C++ is a numerical type similar to int; the difference is that std::ostream outputs a char as a symbol, while int and the other integer types are output as integers.
You can easily do it this way. It's C++, but it will work in C with the proper includes and printf in place of std::cout.
#include <cstdio>
#include <cstring>
#include <iostream>

int main() {
    char input[] = "1,9,10,2";
    char output[3 * sizeof(input) + 1];
    char *p = output;

    for (size_t i = 0; i < strlen(input); i++) {
        // append each character's decimal value; writing through a moving
        // pointer avoids the undefined behavior of sprintf(output, "%s...", output)
        p += sprintf(p, "%d", input[i]);
    }

    std::cout << input << std::endl;
    std::cout << output << std::endl;
    return 0;
}
Following a hint from @Nathan, it should be better now...
This should work
char arr[] = {'1', '9', '2'};
const size_t size = sizeof(arr); // size of the char array - in this case 3
std::vector<int> vec(size);

for (size_t i = 0; i < size; i++)
{
    vec[i] = arr[i];
}
I have a string of 1s and 0s that I padded with enough 0s to make its length exactly divisible by 8. My goal is to convert this string to a number of bytes, ordered so that the first character I read becomes the least significant bit, the next one the next least significant bit, and so on until I have read 8 bits. I save that as a byte, then continue reading the string, saving the next bit as the least significant bit of the second byte.
As an example the string "0101101101010010" is length 16 so it will be converted into two bytes. The first byte should be "11011010" and the second byte should be "01001010".
I am unsure how to do this because it is not as simple as reversing the string (I need to maintain the order of the bytes).
Any help is appreciated, thanks!
You could iterate backwards through the string, but reversing it like you suggest might be easier. From there, you can just build the bytes one at a time. A nested for loop would work nicely:
unsigned char bytes[8] = {0}; // zero-initialized

for (size_t i = 0, j = 0; i < str.length(); j++) {
    for (int k = 0; k < 8; k++, i++) {
        bytes[j] >>= 1;
        if (str[i] == '1') bytes[j] |= 0x80;
    }
}
i is the current string index, j is the current byte array index, and k counts how many bits we've set in the current byte. We set the bit if the current character is 1, otherwise we leave it unset. It's important that the byte array is unsigned since we're using a right-shift.
You can get the number of bytes using string::size() / 8.
Then, it is just a matter of reversing the sub-strings.
You can do something like that:
for (size_t i = 0; i < number_of_bytes; i++)
{
    std::string temp_substr = original.substr(i * 8, 8);
    std::string reversed(temp_substr.rbegin(), temp_substr.rend()); // using reverse iterators
    // now you can save that "byte" represented in the "reversed" string,
    // for example using memcpy
}
Depends whether you want to expose it as a general purpose function or encapsulate it in a class which will ensure you have all the right constraints applied, such as all the characters being either 0 or 1.
#include <cstdint>
#include <string>
#include <algorithm>
#include <iostream>
#include <stdexcept>
static const size_t BitsPerByte = 8;
// Suitable for a member function where you know all the constraints are met.
uint64_t crudeBinaryDecode(const std::string& src)
{
    uint64_t value = 0;
    const size_t numBits = src.size();

    for (size_t bitNo = 0; bitNo < numBits; ++bitNo)
        value |= uint64_t(src[bitNo] - '0') << bitNo;

    return value;
}

uint64_t clearerBinaryDecode(const std::string& src)
{
    if ((src.size() & (BitsPerByte - 1)) != 0)
        throw std::invalid_argument("binary value must be padded to a byte size");

    uint64_t value = 0;
    const size_t numBits = std::min(src.size(), sizeof(value) * BitsPerByte);

    for (size_t bitNo = 0; bitNo < numBits; ++bitNo) {
        uint64_t bitValue = (src[bitNo] == '0') ? 0ULL : 1ULL;
        value |= bitValue << bitNo;
    }
    return value;
}
int main()
{
    std::string dead("1011" "0101" "0111" "1011");
    std::string beef("1111" "0111" "0111" "1101");
    std::string bse ("1111" "0111" "0111" "1101" "1011" "0101" "0111" "1011" "1111" "0111" "0111" "1101" "1011" "0111" "0111" "1111");

    std::cout << std::hex;
    std::cout << "'dead' is: " << crudeBinaryDecode(dead) << std::endl;
    std::cout << "'beef' is: " << clearerBinaryDecode(beef) << std::endl;
    std::cout << "'bse' is:  " << crudeBinaryDecode(bse) << std::endl;

    return 0;
}
I am trying to convert a "double" value (say 1.12345) to 8 byte hex string. I am using the following function to convert double value to hex string.
std::string double_to_hex_string(double d)
{
    unsigned char *buffer = (unsigned char*)&d;
    const int bufferSize = sizeof(double);
    char converted[bufferSize * 2 + 1];
    int j = 0;

    for (int i = 0; i < bufferSize; ++i)
    {
        sprintf(&converted[j * 2], "%02X", buffer[i]);
        ++j;
    }

    std::string hex_string(converted);
    return hex_string;
}
This function returns a 16-character hex string. I then compress this string into 8 bytes with this code:
string hexStr = double_to_hex_string(TempD);
unsigned char sample[8];

for (size_t i = 0; i < hexStr.length() / 2; i++)
{
    // %X expects an unsigned int, not an unsigned char, so read into a temporary
    unsigned int byteVal;
    sscanf(hexStr.substr(i * 2, 2).c_str(), "%02X", &byteVal);
    sample[i] = byteVal;
}
Now, how can I get the hex digits representing these 8 bytes in the "sample" array? There should be only one hex digit per byte. I need to append this 8-byte hex string to a global string.
If there is any other solution which can convert a double value to 8 hex digits and vice versa, that would be highly appreciated.
Regards.
A hexadecimal digit represents half a byte, so if you are limited to 8 hex digits you are also limited to storing 4 bytes.
This solution will encode the number from a float, which is commonly 4 bytes.
#include <iomanip>
#include <sstream>
#include <string>

std::string double_to_hex_string(double d)
{
    // Create a stream that writes 2-digit hex values
    std::stringstream stream;
    stream << std::hex << std::setfill('0');

    float f = d;
    const unsigned char *buffer = reinterpret_cast<unsigned char*>(&f);
    const unsigned char *buffer_end = buffer + sizeof(f);

    // Write each byte as 2-character hex.
    while (buffer != buffer_end)
    {
        stream << std::setw(2) << static_cast<int>(*buffer);
        ++buffer;
    }
    return stream.str();
}
I want to convert an integer (whose maximum value can reach 99999999) into BCD and store it into an array of 4 characters.
Like for example:
Input is : 12345 (Integer)
Output should be: "00012345" in BCD, stored into an array of 4 characters.
Here 0x00 0x01 0x23 0x45 is stored in BCD format.
I tried it in the below manner but it didn't work:
int decNum = 12345;
long aux;
aux = (long)decNum;
cout << " aux = " << aux << endl;

char* str = (char*)& aux;
char output[4];
int len = 0;
int i = 3;

while (len < 8)
{
    cout << "str: " << len << " " << (int)str[len] << endl;
    unsigned char temp = str[len] % 10;
    len++;
    cout << "str: " << len << " " << (int)str[len] << endl;
    output[i] = ((str[len]) << 4) | temp;
    i--;
    len++;
}
Any help will be appreciated
str actually points to a long (probably 4 bytes), but the iteration accesses 8 bytes.
The operation str[len] % 10 looks as if you are expecting decimal digits, but there is only binary data there. In addition, I suspect that i goes negative.
First, don't use C-style casts (like (long)a or (char*)). They are a bad smell. Instead, learn and use C++-style casts (like static_cast<long>(a)), because they point out where you are doing dangerous things, instead of silently compiling and causing undefined behavior.
char* str = (char*)& aux; gives you a pointer to the bytes of aux -- it is actually char* str = reinterpret_cast<char*>(&aux);. It does not give you a traditional string with digits in it. sizeof(char) is 1, sizeof(long) is almost certainly 4, so there are only 4 valid bytes in your aux variable. You proceed to try to read 8 of them.
I doubt this is doing what you want it to do. If you want to print out a number into a string, you will have to run actual code, not just reinterpret bits in memory.
std::string s; std::stringstream ss; ss << aux; ss >> s; will create a std::string with the base-10 digits of aux in it.
Then you can look at the characters in s to build your BCD.
This is far from the fastest method, but it at least is close to your original approach.
First of all, sorry about the C code; I was misled since this started as a C question. Porting it to C++ should not be a big deal.
If you really want the result in a char array, I would do something like the following code. I find it useful to leave the result in little-endian format so I can just cast it to an int for printing, although that is not strictly necessary:
#include <stdio.h>

typedef struct
{
    char value[4];
} BCD_Number;

BCD_Number bin2bcd(int bin_number);

int main(int argc, char **argv)
{
    BCD_Number bcd_result;

    bcd_result = bin2bcd(12345678);
    /* Assuming an int is 4 bytes */
    printf("result=0x%08x\n", *((int *)bcd_result.value));
    return 0;
}
BCD_Number bin2bcd(int bin_number)
{
    BCD_Number bcd_number;

    for (int i = 0; i < sizeof(bcd_number.value); i++)
    {
        bcd_number.value[i] = bin_number % 10;
        bin_number /= 10;
        bcd_number.value[i] |= (bin_number % 10) << 4;
        bin_number /= 10;
    }
    return bcd_number;
}