QString samp_buff[100];
QByteArray data;
uint8_t speed;
samp_buff[3] = data.toHex(); //I converted the QByteArray into a string
qDebug() << "read_every_data_"<< samp_buff[3];
speed = samp_buff[3].toUInt(); //Trying to convert the string to uint8_t
qDebug() << "Converted to UINT8" << speed;
Hi! I successfully got the QByteArray value (data) stored as a QString in the samp_buff array of strings, and I'm also converting that QString (which holds the value in hex form) to a uint8_t.
Data: "\x07" //QByteArray
read_every_data_ "07" //QString
Converted to UINT8 7 //Uint8_t
It's working fine for this, but the problem arises when this happens:
Data: "\x0B" //QByteArray
read_every_data_ "0b" //QString
Converted to UINT8 0 //Uint8_t
Whenever the hex string has letters in it, the result of the conversion becomes zero.
As the documentation of QString::toUInt shows, the signature of the function looks like this:
uint QString::toUInt(bool *ok = nullptr, int base = 10) const
The second argument, base, specifies the numeric base. To convert from hex strings, pass 16 for it.
speed = samp_buff[3].toUInt(nullptr, 16);
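If you also want to catch malformed input, you can check the ok flag as well. A minimal sketch, reusing the variables from the question:
bool ok = false;
const uint parsed = samp_buff[3].toUInt(&ok, 16); // interpret the string as hexadecimal
if (ok)
    speed = static_cast<uint8_t>(parsed);         // two hex digits always fit in a uint8_t
else
    qDebug() << "not a valid hex string:" << samp_buff[3];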
Related
I have a QByteArray; inside the QByteArray are multiple values of different data types which I want to extract. The difficulty is that the values have a defined length of x bits and a defined start position (also in bits).
e.g. an int8 is stored inside the QByteArray from bit no. 4 (of the first byte) to bit no. 12 (inside the second byte).
On the Qt wiki I found a method to disassemble a QByteArray into a bit array: https://wiki.qt.io/Working_with_Raw_Data
So I'm cutting my bits out of the byte array like this:
QByteArray MyCutter::cutMessage(qint32 bitStart, qint32 bitLength)
{
    qDebug() << mBuffer;
    QBitArray bits(mBufferLen * 8);
    for(quint32 i=0; i<mBufferLen; ++i)
    {
        for(quint32 b=0; b<8; b++)
        {
            bits.setBit(i*8 + b, mBuffer.at(i) & (1<<(7-b)));
        }
    }
    qDebug() << bits;
    QByteArray bytes;
    //round up to the next n*8 length of a byte: (length + y) = x*8
    qint32 bitLengthWithZeros = (bitLength + (8 - 1)) & ~(8 - 1);
    bytes.resize(bitLengthWithZeros/8);
    bytes.fill(0);
    for(qint32 b=bitStart, c=0; b<(bitStart + bitLength); b++, c++)
    {
        bytes[c/8] = (bytes.at(c/8) | ((bits.testBit(b)?1:0)<<(7-b%8)));
    }
    qDebug() << bytes;
    return bytes.data();
}
This is working fine so far - I can cut my QByteArray down to any other.
The problem is converting the values into int/float/double, and more specifically into signed values.
To convert, I've tried two things:
QByteArray::toHex().toLong(nullptr, 16) ... toLong/toLongLong etc. This works, but it only returns the UNSIGNED value of the QByteArray. If I cut mBuffer with MyCutter::cutMessage, as in the example above, from bit 4 to bit 12 (which is also 0xFF), I get 255 as a signed int - and that's wrong?
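To make it concrete, what I'm after is (I think) a sign extension of the cut field - something like this sketch (toSigned is just a name I made up to illustrate; bitLength is the length of the cut field):
// Sketch: treat the low bitLength bits of raw as a two's-complement value
// by replicating the field's top bit into the upper bits (1 <= bitLength <= 64).
qint64 toSigned(quint64 raw, int bitLength)
{
    const quint64 signBit = quint64(1) << (bitLength - 1);
    if (raw & signBit)                  // top bit of the field is set -> negative
        raw |= ~((signBit << 1) - 1);   // fill all higher bits with ones
    return static_cast<qint64>(raw);
}
With that, an 8-bit field containing 0xFF would come out as -1 instead of 255.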
On the other side, I've tried to convert it with QDataStream:
QDataStream stream(mBuffer);
stream.setByteOrder(QDataStream::LittleEndian);
qint64 result;
stream >> result;
qDebug() << QString::number(result,16);
qDebug() << QString::number(result);
mBuffer is the raw data. If I put "\xFF\xFF\xFF\xFF\xFF\xFF\xFF\xFF" inside mBuffer, the printed value is -1, which is correct.
QByteArray h = cutMessage(0,8);
qDebug() << h.toHex().toLongLong(nullptr, 16);
QDataStream stream2(h);
stream2.setByteOrder(QDataStream::LittleEndian);
qint64 result2;
stream2 >> result2;
qDebug() << QString::number(result2,16);
qDebug() << QString::number(result2);
Converting the cut message with the code block above always returns "0".
So without cutting, the interpretation of the whole QByteArray is correct, but if I cut something off, it always returns either the unsigned value or "0".
Somehow I'm losing some information during the transformation into QBitArray and back.
Hopefully my explanations are understandable ;)
This has been troubling me for days and I really have no clue, so I have to trouble y'all.
I have an input field in Qt to get a number from the user.
After the user enters the number, I retrieve it and convert it into hexadecimal (base 16). After the conversion, if the hexadecimal value is more than one byte, I split it into one-byte pieces. I have already done all the conversion and splitting.
Now my problem is: after I convert and split, the hexadecimal value stays as a QString, but in order to send it inside the QByteArray, I need to convert it back to an int.
Can you tell me if there is any convenient way to convert the QString back to an int? I have tried a lot of approaches, but all of the conversions give me the value in base 10; I want the value to stay in base 16, but as an int.
Example: the user inputs 10800 (base 10) in the lineEdit. I retrieve the 10800 from the lineEdit and convert it to base 16, so the hexadecimal of 10800 is 2A30. After the conversion, the value 2A30 is a string. How can I convert it to an int so that the value stays as 2A30 rather than going back to 10800 (base 10)?
The closest answer I've got is with this approach:
bool ok = false;
char abc[16];
unsigned int value = QString("0x2A").toUInt(&ok, 16);
int abcde = sprintf(abc, "%x", value);
qDebug() << QString::number(value);
QByteArray test_a = abc;
but it returns me either a QByteArray or a char. I want it as an int, because I need to specify each byte I send in the writeDatagram() function, like this:
QByteArray datagram(4, '\x000');
datagram[0] = 0x02;
datagram[1] = 0x10;
datagram[2] = 0x00;
datagram[3] = 0x00;
Please tell me if my question is not clear enough. I'm using Qt 5.2.1
Thanks !!!
Why do you do the conversion etc.? I presume that you want to send arbitrarily long integers in, effectively, base 256, i.e. one byte at a time, instead of, say, one decimal digit at a time. I also assume you want to represent them big endian, i.e. the most significant base-256 digit comes first. Say you had to send 12384928; it'd be sent as the bytes 188, 250, 160 (0xbc, 0xfa, 0xa0).
That's pretty easy to do:
QByteArray numberToBytes(const QString &number) {
    QByteArray result;
    bool ok = false;
    auto value = number.toLongLong(&ok);
    if (ok) {
        int n = sizeof(value);
        while (value && n--) {
            result.append(quint8(value & 0xFF));
            value = value >> 8;
        }
        std::reverse(result.begin(), result.end());
    }
    return result;
}
QString bytesToNumber(const QByteArray &bytes) {
    qlonglong value = 0;
    for (auto b : bytes)
        value = (value << 8) | quint8(b);
    return QString::number(value);
}
void test() {
    Q_ASSERT(sizeof(qlonglong) == 8);
    Q_ASSERT(numberToBytes("256") == QByteArray::fromRawData("\x01\x00", 2));
    Q_ASSERT(numberToBytes("2134789") == QByteArray::fromRawData("\x20\x93\x05", 3));
    Q_ASSERT(numberToBytes("-58931") == QByteArray::fromRawData("\xFF\xFF\xFF\xFF\xFF\xFF\x19\xCD", 8));
}
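A round trip through bytesToNumber can be checked the same way (a sketch for values that fit into a qlonglong):
Q_ASSERT(bytesToNumber(numberToBytes("2134789")) == "2134789");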
You might also consider numbers too long to fit in 8 bytes. Those require a slightly more involved radix change operation - after all, you don't really want to be doing repetitive long divisions. See this page for details.
But it really looks as if you want to simply send strings in datagrams. If you wish to append a checksum (here: CCITT CRC-16) to the data, that's not hard either, because Qt does it for us:
QByteArray serialize(const QString &str, bool withCRC = false) {
    QByteArray result;
    QDataStream ds(&result, QIODevice::WriteOnly);
    ds << str;
    if (withCRC) ds << qChecksum(result.constData(), result.size());
    return result;
}
QString deserialize(const QByteArray &packet, bool withCRC = false) {
    QString result;
    QDataStream ds(packet);
    ds >> result;
    if (withCRC) {
        quint16 crc;
        ds >> crc;
        crc ^= qChecksum(packet.data(), packet.size() - 2);
        if (crc) return {};
    }
    return result;
}
The format of the datagram is as follows: length of the string (4 bytes), followed by each character in the string (2 bytes each - it's a QChar). The optional CRC is another 2 bytes. That's all there is to it.
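For example, a round trip could look like this (just a sketch; the actual sending is left out, and udpSocket, address and port are placeholders):
const QByteArray datagram = serialize(QStringLiteral("10800"), true);
// udpSocket.writeDatagram(datagram, address, port);   // hypothetical socket call
const QString received = deserialize(datagram, true);
Q_ASSERT(received == QStringLiteral("10800"));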
If the string has only ASCII characters, then sending the UTF-8 representation will take half the space if the string is long:
QByteArray serialize(const QString &str, bool withCRC = false) {
    QByteArray result;
    QDataStream ds(&result, QIODevice::WriteOnly);
    ds << str.toUtf8();
    if (withCRC) ds << qChecksum(result.constData(), result.size());
    return result;
}
QString deserialize(const QByteArray &packet, bool withCRC = false) {
    QByteArray result;
    QDataStream ds(packet);
    ds >> result;
    if (withCRC) {
        quint16 crc;
        ds >> crc;
        crc ^= qChecksum(packet.data(), packet.size() - 2); // checksum over everything but the CRC itself
        if (crc) return {};
    }
    return QString::fromUtf8(result);
}
Perhaps you tried to make the string smaller by knowing ahead of time that it is a number, and thus representing it optimally. How long do you expect those strings to be, and what is your limit on the datagram size?
I am wondering what the most efficient way would be to convert a binary number that is saved as a QString into the corresponding hex string, stored in the same QString:
QString value = "10111100";
into
value = "bc";
It's simple. First convert your binary string to an integer:
QString value = "10111100";
bool fOk;
int iValue = value.toInt(&fOk, 2); //2 is the base
Then convert the integer to hex string:
value = QString::number(iValue, 16); //The new base is 16
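Putting both steps together, a small sketch (the two-digit zero padding is optional):
QString value = "10111100";
bool fOk = false;
const int iValue = value.toInt(&fOk, 2);                  // parse as base 2
if (fOk)
    value = QString("%1").arg(iValue, 2, 16, QChar('0')); // "bc", padded to two hex digits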
I have a method that returns an unsigned char * array and I am trying to encode this as base64 and decode it later. So what I am doing is as follows:
unsigned char * val = myMethod();
char * encodedMsg = reinterpret_cast<char *>(val);
std::cout << "Returned message: " << val << std::endl;
QByteArray raw = QByteArray(encodedMsg).toBase64(QByteArray::Base64Encoding | QByteArray::OmitTrailingEquals);
The output from the method is \u0001z\ri!i, and the encoded value is XHUwMDAxelxyaSFpLA.
Now, I decode it as follows:
QByteArray decoded = QByteArray::fromBase64(raw,
QByteArray::Base64Encoding | QByteArray::OmitTrailingEquals);
qDebug() << decoded;
Now this returns \\u0001z\\ri!i,. Notice that the backslashes have been escaped.
I know I can replace these in a post-processing step, but is there a way to avoid this? Perhaps I am using the encoding/decoding incorrectly?
Try it like below.
Base 64 Encode :
unsigned char* Ciphertext = Encrypt();
QByteArray qByteArray((char*)(Ciphertext), (int)strlen((char *)Ciphertext));
qDebug() << "Ciphertext : " << qByteArray.toBase64();
Output :
Ciphertext : "WLdC+ri94z940BiAven6qXQH6rPbE64nQXt5aFByLGPFh1n//tOGLG02zBSoZ79qMA"
Base 64 Decode :
QString data = QString::fromLatin1(QByteArray::fromBase64(Base64QByteArray.toBase64()).data());
In my case I used this Base64 conversion for AES encryption; adapt it to suit your case.
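If the length of the raw buffer is known, a sketch that passes the length explicitly avoids relying on strlen (ciphertext can contain '\0' bytes, which would cut strlen short). Here Encrypt() is the placeholder from the snippet above and cipherLength is an assumption:
unsigned char* Ciphertext = Encrypt();
const int cipherLength = 48;    // assumption: the real buffer length is known from the cipher
const QByteArray bytes(reinterpret_cast<const char *>(Ciphertext), cipherLength);
const QByteArray encoded = bytes.toBase64();
const QByteArray decoded = QByteArray::fromBase64(encoded);
Q_ASSERT(decoded == bytes);     // round trip preserves the raw data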
I have some code as below:
QByteArray bla("abcde");
QDataStream ds(bla.right(bla.size()-1));
QChar c;
ds>>c;
qDebug()<<c; // It prints '?' instead of 'b'
It prints out 'b' if I change the code to:
qint8 c;
ds>>c;
qDebug()<<QChar(c); // It now prints 'b'.
It's OK for a single character, but suppose I have a lot of characters; then I need to make a loop and cast every single one of them. Please suggest a good approach.
ds >> c; is equivalent to ds >> c.unicode();, where c.unicode() has type ushort & (two bytes), while a QByteArray contains single chars.
The correct way to convert a QByteArray to a sequence of QChar would be:
QByteArray bla("abcde");
QTextCodec *codec = QTextCodec::codecForLocale();
const QString string = codec->toUnicode(bla);
foreach (const QChar &c, string) {
    qDebug() << c;
}
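If the bytes are known to be plain ASCII/Latin-1 (as in this example), a simpler sketch that skips the codec lookup would be:
QByteArray bla("abcde");
const QString string = QString::fromLatin1(bla);   // one QChar per byte
for (const QChar &c : string)
    qDebug() << c;                                 // 'a', 'b', 'c', 'd', 'e'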