I have a text edit where I type in a hex number, and I want this text converted into a QByteArray.
This is my code:
QByteArray parsedValue = QByteArray::fromHex(expectedPacketStr.toUtf8());
qDebug() << parsedValue;
When I set it to 001102, the console reports "\x00\x11\x02", which is what I expected.
But if I set it to 001122, the console reports "\x00\x11\"", which is missing the 0x22 byte.
I really cannot understand what's going on. Does anybody have a clue why this is?
0x22 is the character " in ASCII, so it's just qDebug() printing it as an escaped quote (\") rather than as \x22; nothing is missing inside the QByteArray.
To convince yourself, you can always display the bytes one by one:
for (auto b : parsedValue)
    qDebug() << (int)(unsigned char)b; // go through unsigned char so bytes above 0x7F don't print as negative
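Alternatively, toHex() shows the whole array unambiguously, with no escaping involved:
qDebug() << parsedValue.toHex(); // "001122" for the input above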
I've run into this in two places in my code: one written by me and the one I'm posting below, which was written by someone else. I can't get the base64 string to print with qDebug at all. I thought base64 was supposed to be readable. It has a size, but qDebug won't print the entire line.
Thanks in advance for any help.
Here's the code. I'm on Qt Kit 5.12.1 64-bit MinGW, in release mode.
QFile file("C:\\Qr-Pic\\Poll_Directory\\IMG_00000001 - Copy (53).jpg"); // stack-allocated, so it can't leak
if (!file.open(QIODevice::ReadOnly))
    qWarning() << "could not open file";
QByteArray image = file.readAll();
int originalSize = image.length();
QString encoded = QString(image.toBase64());
int encodedSize = encoded.size();
qDebug() << "encodedSize=" << encodedSize;
qDebug() << "encode=" << encoded;
Output (the second qDebug line, "encode=", never appears):
encodedSize= 34036
I'm kind of at a loss here. I'm trying such a simple thing that I can't believe there is nothing built into Qt for it (using Qt 5.6.2). I'm trying to convert the data inside a QByteArray from big endian to little endian. I always start with the same test QByteArray, like this:
QByteArray value;
value.append(0x01);
value.append(0x02);
value.append(0x03);
qDebug() << "Original value is: " << value.toHex(); // “010203” like expected
What I need is the little-endian version, which means the output should be "030201". Is there anything built into the Qt framework for this? I couldn't find anything. Here's what I tried so far:
// Try the built-in QtEndian functions
QByteArray test = qToLittleEndian(value);
qDebug() << "Test value is: " << test.toHex(); // 010203
// Try via QDataStream
QByteArray data;
QDataStream out(&data, QIODevice::ReadWrite);
out.setByteOrder(QDataStream::LittleEndian);
out << value;
qDebug() << "Changed value is: " << data.toHex(); // "03000000010203"
Any good ideas? Or do I really need to swap the bytes by hand? I found nothing helpful on SO or Google, or maybe I'm asking the wrong question...
It sounds like you want to reverse the array, rather than manipulate the endianness of any of the multi-byte types inside the array. The standard library has a solution for this:
#include <algorithm> // std::reverse lives here

QByteArray arr;
std::reverse(arr.begin(), arr.end());
// arr is now in reversed order
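Applied to the three-byte value from the question, this gives exactly the output you asked for:
QByteArray value;
value.append(0x01);
value.append(0x02);
value.append(0x03);
std::reverse(value.begin(), value.end());
qDebug() << value.toHex(); // "030201"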
I've just noticed something when using QNetworkReply for which I was unable to find the slightest hint in the Qt documentation of QIODevice::readAll() (which QNetworkReply inherits this method from).
Here is what the documentation states:
Reads all remaining data from the device, and returns it as a byte array.
This function has no way of reporting errors; returning an empty QByteArray can mean either that no data was currently available for reading, or that an error occurred.
Let's say I have the following connection:
connect(this->reply, &QIODevice::readyRead, this, &MyApp::readyReadRequest);
The readyReadRequest() slot looks like this:
void MyApp::readyReadRequest()
{
LOG(INFO) << "Received data from \"" << this->url.toString() << "\"";
LOG(INFO) << "Data contents:\n" << QString(this->reply->readAll());
this->bufferReply = this->reply->readAll();
}
The surprise came after I used this->bufferReply (which is a QByteArray class member of MyApp). I passed it to a QXmlStreamReader and did:
while (!reader.atEnd())
{
LOG(DEBUG) << "Reading next XML element";
reader.readNext();
LOG(DEBUG) << reader.tokenString();
}
if (reader.hasError())
{
LOG(ERROR) << "Encountered error while parsing XML data:" << reader.errorString();
}
Imagine my surprise when I got the following output:
2017-10-17 16:12:18,591 DEBUG [default] [void MyApp::processReply()][...] Reading next XML element
2017-10-17 16:12:18,591 DEBUG [default] [void MyApp::processReply()] [...] Invalid
2017-10-17 16:12:18,591 ERROR [default] Encountered error while parsing XML data: Premature end of document
Through debugging I found that my bufferReply is empty at this point. I looked in the docs again but couldn't find anything that hints at the data being removed from the device (in my case the network reply) after it has all been read.
Removing the line where I print the byte array, or simply moving it after this->bufferReply = this->reply->readAll(); and printing the contents of the class member instead, fixed the issue:
void MyApp::readyReadRequest()
{
LOG(INFO) << "Received data from \"" << this->url.toString() << "\"";
this->bufferReply = this->reply->readAll();
LOG(INFO) << "Data contents:\n" << QString(this->bufferReply);
}
However, I would like to know if I'm doing something wrong or if the documentation is indeed incomplete.
Since readAll() doesn't report errors or indicate that no data was available at the given point in time, returning an empty byte array is the only hint that something didn't work as intended.
Yes. When you call QIODevice::readAll() twice, it is normal that the second time you get nothing: everything has already been read, and there is nothing more to read.
This behavior is standard in IO read functions: each call to a read() function returns the next piece of data. Since readAll() reads to the end, further calls return nothing.
However, this does not necessarily mean that the data has been flushed. For instance, when you read a file, readAll() just moves a "cursor" around, and you can go back to the start of the file with QIODevice::seek(0). For QNetworkReply, I'd guess that the data is simply discarded.
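A minimal sketch of that difference, using a QFile (the file name is hypothetical):
QFile file("reply-dump.bin"); // hypothetical file, just to illustrate the read cursor
if (file.open(QIODevice::ReadOnly)) {
    QByteArray first  = file.readAll(); // the whole file
    QByteArray second = file.readAll(); // empty: the cursor is now at the end
    file.seek(0);                       // a random-access device can rewind
    QByteArray third  = file.readAll(); // the whole file again
}
// QNetworkReply is a sequential device: there is no seeking back, so once
// readAll() has consumed the data, it is gone.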
I know this title might sound confusing. I have a simple question which I haven't been able to solve yet.
Let's imagine I have a file; opening it with a hex editor shows it has two characters inside, say 0x1B and 0x00 (obviously unprintable). I'd like to take that as 1B00 in hex, which is 6912 in decimal, as opposed to directly converting the printed characters, which would be wrong and is what all the other questions I saw were asking about. Well, that's the task I want to do here. It seems simple, but everything I've tried does it wrong, even though I am obviously opening the file in binary mode.
I have only managed to read the characters individually and mess around a bit, but never to do what I actually want, which is as simple as taking those two bytes, interpreting them as a hex number, and then converting it to decimal.
Sorry if anything is unclear; I'm not a native speaker. Any help will be appreciated. I'm sure you'll think this was quite a noobish question :P
EDIT: Sorry, apparently I didn't explain myself properly. I know this might seem abstract, but it is a really concrete little thing which I have struggled to solve. Maybe I can ask it another way:
How can I read a character in binary mode, let's say 0x1B, and convert that to the actual text "1B"? Just that.
Sounds like you want to read the file as raw data, and then display it on the screen in decimal? Super easy!
#include <cstdint>
#include <fstream>
#include <iostream>

int main() {
    std::ifstream myfile("filename.data", std::ios::binary);
    uint16_t number;
    char* buffer = reinterpret_cast<char*>(&number); // let the stream write straight into the integer
    while (myfile.read(buffer, sizeof(number))) {
        std::cout << number << ' ';
    }
}
The reason it's so easy is that there's no hexadecimal involved. The file is saved as a series of bytes, and each byte holds one of 256 values. They aren't hex; they're just values. Reading two bytes into a uint16_t is the easiest way to interpret two bytes as a single unsigned 2-byte value, and streaming out a uint16_t will, by default, display that value in decimal. There's no hexadecimal involved. The hexadecimal you saw in the hex editor was there only because a hex editor displays the bytes as hex values.
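One caveat: reading the two bytes straight into a uint16_t interprets them in the host's byte order. On a little-endian machine (e.g. x86), the bytes 0x1B 0x00 come out as 27, not as the 6912 the question asks for. If you want the big-endian interpretation (0x1B00 = 6912) regardless of platform, composing the value by hand is portable; a minimal sketch:
#include <cstdint>
#include <fstream>
#include <iostream>

int main() {
    std::ifstream myfile("filename.data", std::ios::binary);
    char bytes[2];
    while (myfile.read(bytes, sizeof(bytes))) {
        // the first byte is the high-order byte: 0x1B 0x00 -> 0x1B00 = 6912
        uint16_t number = static_cast<uint16_t>(
            (static_cast<unsigned char>(bytes[0]) << 8) |
             static_cast<unsigned char>(bytes[1]));
        std::cout << number << ' ';
    }
}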
If all you want to do is print a number in hexadecimal form, use std::hex:
int i = 0x1B;
std::cout << std::hex << i << std::endl; // prints "1b"
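If you also want the two-digit, upper-case form a hex editor shows ("1B" rather than "1b"), the standard manipulators from <iomanip> handle that:
int i = 0x1B;
std::cout << std::uppercase << std::hex << std::setw(2) << std::setfill('0') << i << std::endl; // prints "1B"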
#include <fstream>
#include <iostream>

std::ifstream infile("test.bin", std::ios::binary);
while (true)
{
    int c1 = infile.get(); // get() returns an int in 0..255 (or EOF), so no sign-extension issues
    if (!infile.good())
        break;
    int c2 = infile.get();
    if (!infile.good())
        break;
    int num = c1 | (c2 << 8);    // little-endian: first byte is the low-order byte
    // if you need the opposite (big-endian) order:
    // int num = c2 | (c1 << 8);
    std::cout << num << ' ';
}
I have a QByteArray storing data received from a GPS, which is part binary and part ASCII. For debugging purposes, I want to know what's being received, so I'm writing a qDebug like this:
//QByteArray buffer;
//...
qDebug() << "GNSS msg (" << buffer.size() << "): " << buffer;
And I get messages like this at console:
GNSS msg ( 1774 ): "ygnnsdgk...(many data)..PR085hlHJGOLH
(more data into a new line, which is OK because it is a new GNSS sentence and
probably has a \n at the end of each one) blablabla...
But suddenly I get a new print iteration. The data has not been erased yet; it has been appended to. So the new message size is, for example, 3204, obviously bigger than in the previous print. But it prints exactly the same output (only with the new size 3204 between the brackets). No new data is printed, just the same as the previous message:
GNSS msg ( 3204 ): "ygnnsdgk...(many data)..PR085hlHJGOLH
(more data into a new line, which is OK because it is a new GNSS sentence and
probably has a \n at the end of each one) blablabla...
I guess qDebug stops printing because it has a limit, or because it reaches a terminating character or something like that, but I'm only guessing.
Any help or explanation for this behaviour?
Solution / workaround:
Indeed, the qDebug() output of a QByteArray gets truncated at a '\0' character. This has nothing to do with QByteArray itself; you can't ever output a '\0' character using qDebug() at all. For an explanation, see below.
QByteArray buffer;
buffer.append("hello");
buffer.append('\0');
buffer.append("world");
qDebug() << "GNSS msg (" << buffer.size() << "): " << buffer;
Output:
GNSS msg ( 11 ): "hello
Any following arguments are ignored as well:
qDebug() << "hello" << '\0' << "world";
Output:
hello
You can work around this "problem" by replacing the special characters in your byte array before debugging it:
QByteArray dbg = buffer; // create a copy to not alter the buffer itself
dbg.replace('\\', "\\\\"); // escape the backslash itself
dbg.replace('\0', "\\0"); // get rid of 0 characters
dbg.replace('"', "\\\""); // more special characters as you like
qDebug() << "GNSS msg (" << buffer.size() << "): " << dbg; // not dbg.size()!
Output:
GNSS msg ( 11 ): "hello\0world"
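If you mainly need to see where the binary parts are, another option is to percent-encode the buffer before printing; QByteArray::toPercentEncoding() keeps plain ASCII letters and digits readable and escapes everything else:
qDebug() << "GNSS msg (" << buffer.size() << "): " << buffer.toPercentEncoding(); // "hello%00world"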
So why is this happening? Why can't I output a '\0' using qDebug()?
Let's dive into the Qt internal code to find out what qDebug() does.
The following code snippets are from the Qt 4.8.0 source code.
This method is called when you do qDebug() << buffer:
inline QDebug &operator<<(const QByteArray & t) {
stream->ts << '\"' << t << '\"'; return maybeSpace();
}
The stream->ts above is of type QTextStream, which converts the QByteArray into a QString:
QTextStream &QTextStream::operator<<(const QByteArray &array)
{
Q_D(QTextStream);
CHECK_VALID_STREAM(*this);
// Here, Qt constructs a QString from the binary data. Until now,
// the '\0' and following data is still captured.
d->putString(QString::fromAscii(array.constData(), array.length()));
return *this;
}
As you can see, d->putString(QString) is called (the type of d is the internal private class of the text stream), which calls write(QString) after doing some padding for constant-width fields. I skip the code of putString(QString) and directly jump into d->write(QString), which is defined like this:
inline void QTextStreamPrivate::write(const QString &data)
{
if (string) {
string->append(data);
} else {
writeBuffer += data;
if (writeBuffer.size() > QTEXTSTREAM_BUFFERSIZE)
flushWriteBuffer();
}
}
As you can see, the QTextStreamPrivate has a buffer of type QString. So what happens when that buffer is finally printed on the terminal? For this, we have to find out what happens when your qDebug() statement finishes and the buffer is passed to the message handler, which, by default, prints the buffer on the terminal. This happens in the destructor of the QDebug class, which is defined as follows:
inline ~QDebug() {
if (!--stream->ref) {
if(stream->message_output) {
QT_TRY {
qt_message_output(stream->type, stream->buffer.toLocal8Bit().data());
} QT_CATCH(std::bad_alloc&) { /* We're out of memory - give up. */ }
}
delete stream;
}
}
So here is the non-binary-safe part. Qt takes the textual buffer and converts it to its "local 8-bit" binary representation (up to this point, AFAIK, we should still have the binary data we want to debug).
But then it passes the buffer to the message handler without also passing the length of the binary data. As you know, it is impossible to find out the length of a C string that is itself supposed to contain '\0' characters. (That's why QString::fromAscii() in the code above needs the additional length parameter to be binary-safe.)
So if you want to handle the '\0' characters, even writing your own message handler will not solve the problem, as you can't know the length. Sad, but true.
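To illustrate, here is a minimal sketch of a custom Qt 4 message handler (the handler itself is made up; qInstallMsgHandler() is the real Qt 4 API). It shows that the length information is already gone by the time the handler runs:
#include <cstdio>
#include <cstring>
#include <QtGlobal>

// The handler only ever receives a plain C string; everything after the
// first '\0' in the original buffer is unreachable from here, because no
// length parameter is passed along.
void myMessageHandler(QtMsgType type, const char *msg)
{
    Q_UNUSED(type);
    std::fprintf(stderr, "[%d visible bytes] %s\n", (int)std::strlen(msg), msg);
}

// At startup:
// qInstallMsgHandler(myMessageHandler);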