I'm writing a simple TCP-based network application in Qt and wanted to use QDataStream and QByteArray to send data over the network. The problem is that when I put data into the QByteArray it comes out "zeroed". For example (a slot in MainWindow that is connected to a timer's timeout signal):
void MainWindow::SendPlayer1Data() {
    QByteArray block;
    QDataStream s(&block, QIODevice::OpenModeFlag::ReadWrite);
    QString h = "hello";
    s << h;
    qDebug() << "h: " << h;
    qDebug() << "block: " << QString(block); // equivalent to s >> h on the receiving end
    qDebug() << "block[0]: " << int(block[0]);
}
h: "hello"
block: ""
block[0]: 0
I receive "hello" once at the beginning but after that I only get "". The same goes for qint32. Both client and server shows that QByteArray size is 14 bytes, so QDataStream writes data into that array, but it makes them 0 (it shows "" when I use s >> h and then use qDebug() << h)
The issue here is writing a QString directly to the stream and then reading the raw QByteArray back as if it held plain text; consider the following:
QByteArray block;
QDataStream s(&block, QIODevice::OpenModeFlag::ReadWrite);
QString h = "hello";
s << h;
qDebug() << block;
Which outputs
"\x00\x00\x00\n\x00h\x00""e\x00l\x00l\x00o"
So the data is there, it just isn't stored the way one might expect. The easiest way to solve this is to create a QByteArray from the string encoded with UTF-8 (or another encoding of your choice). This can trivially be done on the fly:
QByteArray block;
QDataStream s(&block, QIODevice::OpenModeFlag::ReadWrite);
QString h = "hello";
QByteArray data(h.toUtf8(), 5);
s << data;
qDebug() << block;
Which outputs
"\x00\x00\x00\x05hello"
Because when this QByteArray is sent through the QDataStream, its length is prepended as a 4-byte (quint32, big-endian) value - for a small length of 5 the first three of those bytes are zero (serialize a longer array, say 256 bytes or more, and you will see the prefix change). But if you construct a QString directly from that NULL-commenced QByteArray (which is essentially what qDebug() << QString(block) above did), the conversion stops at the first 0 character and you get an empty string. To correct for this you can use QByteArray::remove() to strip the first 4 bytes, like this:
QByteArray block;
QDataStream s(&block, QIODevice::OpenModeFlag::ReadWrite);
QString h = "hello";
QByteArray data(h.toUtf8());
s << data;
qDebug() << QString::fromUtf8(block.remove(0, 4));
Which outputs
"hello"
Complete example
#include <QByteArray>
#include <QDataStream>
#include <QDebug>
#include <QIODevice>
#include <QString>

int main() {
    QByteArray block;
    QDataStream s(&block, QIODevice::OpenModeFlag::ReadWrite);
    QString h = "hello";
    QByteArray data(h.toUtf8());
    s << data;
    qDebug() << QString::fromUtf8(block.remove(0, 4));
}
Ok, I have figured it out.
The problem was not the QByteArray or the socket because, as @William Miller mentioned, the data was there.
The problem was with the QDataStream on the client side - I decided to create a new QDataStream object every time the slot responsible for receiving data was called. This way I was able to pack data easily into the QByteArray, send it and receive it correctly every time.
The client function for receiving:
void ClientTcpHelper::ReceivePacket() {
    if (socket.waitForReadyRead(20)) {
        //qDebug() << "Packet: " << socket.peek(30) << endl;
        qDebug() << "Receiving packet!" << endl;
        Data = socket.readAll();
        emit DataReceived();
    }
    else {
        //qDebug() << "Failed to receive packet!" << endl;
    }
}
and unpacking the data into variables:
void ClientTcpHelper::UnpackData() {
    stream = new QDataStream(&this->Data, QIODevice::OpenModeFlag::ReadWrite);
    *stream >> h >> a >> b;
    Data.clear();
    delete stream;
}
h, a and b are members of the class.
Unfortunately I cannot explain why the QDataStream needs to be destroyed and recreated every time in order to handle the data the way I wanted from the beginning.
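A likely explanation (not verified against the original non-working code): QDataStream keeps an internal status and its device keeps a read position, so a long-lived stream that once read past the end of the buffer stays in the ReadPastEnd state and silently returns empty/zero values afterwards. Recreating the stream resets both. The same effect can be had without new/delete by constructing the stream on the stack; a minimal sketch using the member names from above:
void ClientTcpHelper::UnpackData() {
    // A fresh stream per packet starts at position 0 with a clean status,
    // and it is destroyed automatically at the end of the function.
    QDataStream packetStream(&Data, QIODevice::ReadOnly);
    packetStream >> h >> a >> b;
    Data.clear();
}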
Related
If you look in the Qt documentation, you can use readBytes or readRawData to read binary data from a file. Either is fine for my case, whether reading the data from a file or a stream.
In the case of readBytes: reads the buffer s from the stream and returns a reference to the stream.
In the case of readRawData: reads len bytes from the stream into s (a char* buffer) and returns the number of bytes read.
void readBinary(QString path)
{
    QFile file(path);
    if (!file.open(QIODevice::ReadOnly)) {
        qDebug() << "Could not open bin file for reading";
        return;
    }

    QDataStream in(&file);
    char *data = new char[30];
    qint32 bytes = in.readRawData(data, 30);
    //in >> data;

    qInfo() << "bytes read: " << bytes;
    qInfo() << data;
    file.close();
}
It shows the number of bytes read but does not show the binary data on screen.
What am I missing here?
Do we need to do serialization/de-serialization of the binary data, in other words marshalling/un-marshalling? I ask because I have read in the official documentation that you need to take care of encoding/decoding yourself and need to set the Qt version while reading/writing the data in a file.
How do we write back to the file/stream?
Is there another method to read/write the data directly from a file?
Do we need to write the whole binary data into a buffer and then read it again? That way we could maintain the format of the data.
Want some answers from you guys!
For reference, consider the snippet of binary data from the file shown below.
00000000: 0000 0520 a52a 0108 8520 0108 9320 0108 ... .*... ... ..
00000010: 9920 0108 9f20 0108 a520 0108 0000 0000 . ... ... ......
Links I followed on StackOverflow while trying to resolve this issue: Link1, Link2, Link3, Link4
What am I missing here?
You are using raw pointers instead of QByteArray, and then you are trying to print binary data as text.
...
QDataStream in(&file);
// Use QByteArray, which was created specifically to avoid raw char pointers
QByteArray ba(30, 0);
qint32 bytes = in.readRawData(ba.data(), ba.size());
qInfo() << "bytes read: " << bytes;
qInfo() << ba.toHex(' '); // You read binary data, not text, so do not print it as text
Do we need to do serialization/de-serialization of the binary data, in other words marshalling/un-marshalling? I ask because I have read in the official documentation that you need to take care of encoding/decoding yourself and need to set the Qt version while reading/writing the data in a file.
Generally speaking, yes. Either way you will have some serialization/de-serialization mechanism in your code. Qt provides such a mechanism out of the box; with it you can forget about raw pointers, sizes and byte orders.
From the docs:
You don't have to set a version if you are using the current version of Qt, but for your own custom binary formats we recommend that you do; see Versioning in the Detailed Description.
QByteArray ba;
// fill it with some data
// Write to file
QDataStream out(&file); // output file, socket, etc.
out << ba;
// Read from file
QDataStream in(&file);
in >> ba;
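For a complete round trip through an actual file, including the version pinning the docs recommend, a sketch might look like this (the file path is a placeholder and error handling is reduced to early returns):
#include <QByteArray>
#include <QDataStream>
#include <QFile>
#include <QString>

QByteArray roundTrip(const QString &path, const QByteArray &ba)
{
    {
        QFile file(path);
        if (!file.open(QIODevice::WriteOnly))
            return QByteArray();
        QDataStream out(&file);
        out.setVersion(QDataStream::Qt_5_0); // pin the format for your own files
        out << ba;                           // quint32 length prefix + raw bytes
    }                                        // file is closed here

    QFile file(path);
    if (!file.open(QIODevice::ReadOnly))
        return QByteArray();
    QDataStream in(&file);
    in.setVersion(QDataStream::Qt_5_0);      // must match the writer
    QByteArray copy;
    in >> copy;
    return copy;                             // identical to ba on success
}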
How do we write back to the file/stream?
// Write to file
QDataStream out(&file); // output file, socket, etc.
out << ba;
Is there another method to read/write the data directly from a file?
Do we need to write the whole binary data into a buffer and then read it again? That way we could maintain the format of the data. Want some answers from you guys!
With QDataStream you can read/write the data structures you need directly, for example:
struct MyStruct
{
    double d;
    int a;
    QString str;
    QPixmap pix;
    QVector<int> vec;

    // overload the stream operators
    friend QDataStream &operator << (QDataStream &out, const MyStruct &d)
    {
        out << d.d << d.a << d.str << d.pix << d.vec;
        return out;
    }
    friend QDataStream &operator >> (QDataStream &in, MyStruct &d)
    {
        in >> d.d >> d.a >> d.str >> d.pix >> d.vec;
        return in;
    }
};
// Now you can:
MyStruct data;
out << data;
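For completeness, reading the struct back is symmetric; a minimal sketch, assuming out and in are QDataStreams attached to the same (re-opened) file or buffer as above, and noting that the QPixmap member needs a QGuiApplication to be constructed:
MyStruct data;
// ... fill data ...
out << data;   // uses the operator<< overload defined in the struct

MyStruct copy;
in >> copy;    // uses operator>>; fields come back in the same order they were written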
I have created an encrypt/decrypt program; when encrypting, I store the encrypted QByteArray in a text file.
When trying to decrypt, I retrieve it and then pass it to the decryption method. The problem is that I need a way to convert it back to a QByteArray without changing the contents, otherwise it will not decrypt properly. What I mean is: if the file gave me an encrypted value of 1234 and I converted that to a QByteArray by calling toLatin1() on it, the value changes and the decryption does not work. Any suggestions?
My Code:
QFile file(filename);
QString encrypted;
QString content;
if (file.open(QIODevice::ReadOnly)) {
    QTextStream stream(&file);
    content = stream.readAll();
}
encrypted = content.replace("\n", "");
qDebug() << encrypted; // Returns correct encrypted value
QByteArray a;
a += encrypted;
qDebug() << "2 " + a; // Returns different value than previous qDebug()
QByteArray decrypted = crypto.Decrypt(a, key);
return decrypted;
I guess you should use:
QString::fromUtf8(const QByteArray &str)
Or:
QString::QString(const QByteArray &ba)
to convert the QByteArray to a QString, then write it into the file with QTextStream.
After that, read the file with QTextStream and use:
QString::toUtf8()
to convert QString to QByteArray.
QString::QString(const QByteArray &ba)
Constructs a string initialized with the byte array ba. The given byte array is converted to Unicode using fromUtf8().
P.S.: Maybe using QFile::write and QFile::read is a better way.
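Following up on that P.S., a sketch of storing the encrypted bytes verbatim with QFile::write and reading them back with QFile::readAll, so no text conversion touches the data (the path is a placeholder):
#include <QByteArray>
#include <QFile>
#include <QString>

// Store the ciphertext exactly as produced; binary-safe, no encoding step.
bool saveEncrypted(const QString &path, const QByteArray &encrypted)
{
    QFile file(path);
    if (!file.open(QIODevice::WriteOnly))
        return false;
    return file.write(encrypted) == encrypted.size();
}

QByteArray loadEncrypted(const QString &path)
{
    QFile file(path);
    if (!file.open(QIODevice::ReadOnly))
        return QByteArray();
    return file.readAll();   // bytes come back unchanged, ready for Decrypt()
}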
Try using toUtf8(); it works fine for me.
If I understand correctly, the text from the file is stored in the QString content. I think you could create a new QByteArray. Because the constructor of QByteArray does not accept a QString as input, you will probably have to append the QString to the empty QByteArray.
//After if:
QByteArray tempContent;
tempContent.append(content);
QByteArray decrypted = crypto.Decrypt(tempContent, key);
I do not have much experience in the Qt library, but I hope this helps.
Or simply go with b64 = data.toUtf8().toBase64();
First convert it to a QByteArray with toUtf8() and then immediately convert that to base64 with toBase64().
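A sketch of that base64 round trip, assuming encryptedBytes holds the output of the encryption step (the file stays readable text, but the bytes survive unchanged); filename, crypto and key are the question's own names:
// Writing: encrypted bytes -> base64 text, safe for a text file
QFile outFile(filename);
if (outFile.open(QIODevice::WriteOnly | QIODevice::Text)) {
    QTextStream out(&outFile);
    out << QString::fromLatin1(encryptedBytes.toBase64());
}

// Reading: base64 text -> exactly the original bytes
QByteArray restored;
QFile inFile(filename);
if (inFile.open(QIODevice::ReadOnly | QIODevice::Text)) {
    QTextStream in(&inFile);
    restored = QByteArray::fromBase64(in.readAll().toLatin1());
}
QByteArray decrypted = crypto.Decrypt(restored, key); // same bytes as before encoding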
There's a simple way:
QByteArray ba;
QString qs = "String";
ba += qs;
A harder way:
QByteArray ba;
QDataStream in(&ba, QIODevice::WriteOnly);
in << QString("String");
Extreme way, for people who want to use QBuffer:
#include <QDebug>
#include <QBuffer>
#include <QDataStream>
#include <QIODevice>
#include <QByteArray>
#include <QString>
#include <QCoreApplication>
int main(int argc, char *argv[])
{
    QCoreApplication a(argc, argv);

    QByteArray byteArray;
    QBuffer buffer(&byteArray);
    buffer.open(QIODevice::ReadWrite);
    QDataStream in(&buffer);
    in << QString("String");
    buffer.close();

    for (int i = 0; i < byteArray.length(); ++i) {
        printf("%c - %x\n", byteArray.at(i), byteArray.at(i));
    }
    printf("\n");

    return a.exec();
}
Anyone who runs the last example may wonder where the null bytes come from.
QDataStream serializes the QString with a big-endian 32-bit length value (4 bytes), followed by the string itself in UTF-16. The length of this string in UTF-16 is 12 bytes, which fits in a single byte, so the first three bytes of the length prefix in the QByteArray will be zero.
There is often a problem with how qDebug() displays the QByteArray; the data itself is written correctly.
Don't forget to remove QT_NO_CAST_FROM_BYTEARRAY from your .pro file, if it is defined there.
I have to populate a QByteArray with different data. So I'm using the QDataStream.
QByteArray buffer;
QDataStream stream(&buffer, QIODevice::WriteOnly);
qint8 dataHex = 0x04;
qint8 dataChar = 'V';
stream << dataHex << dataChar;
qDebug() << buffer.toHex(); // "0456" This is what I want
However, I would also like to append a QByteArray to the buffer.
QByteArray buffer;
QDataStream stream(&buffer, QIODevice::WriteOnly);
qint8 dataHex = 0x04;
qint8 dataChar = 'V';
QByteArray moreData = QByteArray::fromHex("ff");
stream << dataHex << dataChar << moreData.data(); // char * QByteArray::data()
qDebug() << buffer.toHex(); // "045600000002ff00" I would like "0456ff"
What am I missing?
When a char* is appended, QDataStream assumes \0 termination and serializes it with writeBytes, which also writes out the size first (as a quint32).
writeBytes' doc:
Writes the length specifier len and the buffer s to the stream and returns a reference to the stream.
The len is serialized as a quint32, followed by len bytes from s. Note that the data is not encoded.
You can use writeRawData to circumvent it:
stream << dataHex << dataChar;
stream.writeRawData(moreData.data(), moreData.size());
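Putting it together, a complete sketch with the question's variable names that produces "0456ff":
QByteArray buffer;
QDataStream stream(&buffer, QIODevice::WriteOnly);
qint8 dataHex = 0x04;
qint8 dataChar = 'V';
QByteArray moreData = QByteArray::fromHex("ff");

stream << dataHex << dataChar;                               // one byte each
stream.writeRawData(moreData.constData(), moreData.size());  // no length prefix, no '\0'

qDebug() << buffer.toHex();                                  // "0456ff"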
The 00000002 is the length of the char array (including the terminating '\0'), which is written to the stream.
What you are missing is, QDataStream is not raw data. It has its own simple serialization format. It is most suitable for use cases where data is both written (serialized) and read back (deserialized) with QDataStream, and using a reliable QIODevice (QBuffer or QFile for example).
If you want to add raw data to a QBuffer, you could use a suitable overload of write method. But then you might as well just append to the QByteArray directly.
I have a problem reading more than 2048 bytes from a QLocalSocket.
This is my server-side code:
clientConnection->flush(); // <-- clientConnection is a QLocalSocket
QByteArray block;
QDataStream out(&block, QIODevice::WriteOnly);
out.setVersion(QDataStream::Qt_5_0);
out << (quint16)message.size() << message; // <--- message is a QString
qint64 c = clientConnection->write(block);
clientConnection->waitForBytesWritten();
if(c == -1)
qDebug() << "ERROR:" << clientConnection->errorString();
clientConnection->flush();
And this is how I read the data in my client:
QDataStream in(sock); // <--- sock is a QLocalSocket
in.setVersion(QDataStream::Qt_5_0);
while (sock->bytesAvailable() < (int)sizeof(quint16)) {
    sock->waitForReadyRead();
}
in >> bytes_to_read; // <--- quint16
while (sock->bytesAvailable() < (int)bytes_to_read) {
    sock->waitForReadyRead();
}
in >> received_message;
The client code is connected to the readyRead signal and it's disconnected after the first call to the slot.
Why am I able to read only 2048 bytes?
==EDIT==
After peppe's reply I updated my code. Here is how it looks now:
server side code:
clientConnection->flush();
QByteArray block;
QDataStream out(&block, QIODevice::WriteOnly);
out.setVersion(QDataStream::Qt_5_0);
out << (quint16)0;
out << message;
out.device()->seek(0);
out << (quint16)(block.size() - sizeof(quint16));
qDebug() << "Bytes client should read" << (quint16)(block.size() - sizeof(quint16));
qint64 c = clientConnection->write(block);
clientConnection->waitForBytesWritten();
client side code:
QDataStream in(sock);
in.setVersion(QDataStream::Qt_5_0);
while (sock->bytesAvailable() < sizeof(quint16)) {
    sock->waitForReadyRead();
}
quint16 btr;
in >> btr;
qDebug() << "Need to read" << btr << "and we have" << sock->bytesAvailable() << "in sock";
while (sock->bytesAvailable() < btr) {
    sock->waitForReadyRead();
}
in >> received_message;
qDebug() << received_message;
I'm still not able to read more data.
out.setVersion(QDataStream::Qt_5_0);
out << (quint16)message.size() << message; // <--- message is a QString
This is wrong. The serialized length of "message" will be message.size() * 2 + 4 bytes, as QString prepends its own length as a quint32, and each QString character is actually a UTF-16 code unit, so it requires 2 bytes. Expecting only message.size() bytes to read in the reader will cause QDataStream to short read, which is undefined behaviour.
Please do check the size of "block" after those lines -- it'll be 2 + 4 + 2 * message.size() bytes. So you need to fix the math. You can safely assume it won't change, as the format of serialization of Qt datatypes is known and documented.
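A quick way to convince yourself of that math is to serialize the string on its own and compare sizes; this is a throwaway check, not part of the protocol:
QString message = "hello world";     // 11 characters
QByteArray probe;
QDataStream ps(&probe, QIODevice::WriteOnly);
ps.setVersion(QDataStream::Qt_5_0);
ps << message;
qDebug() << probe.size()             // 26
         << message.size() * 2 + 4;  // also 26: quint32 length + UTF-16 payload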
I do recognize the "design pattern" of prepending the length, though. It probably comes from the fortune network example shipped with Qt. The notable difference there is that the example doesn't prepend the length of the string in UTF-16 code units (which is pointless, as it's not how it's going to be serialized) -- it prepends the length of the serialized QString. Look at what it does:
out << (quint16)0;
out << fortunes.at(qrand() % fortunes.size());
out.device()->seek(0);
out << (quint16)(block.size() - sizeof(quint16));
First it reserves some space in the output, by writing a 0. Then it serializes a QString. Then it backtracks and overwrites the 0 with the length of the serialized QString -- which at this point is exactly block.size() minus the prepended integer stating the length (and we know that the serialized length of a quint16 is sizeof(quint16)).
To repeat myself, there are actually two reasons why that example was coded that way, and they're related:
QDataStream has no means to recover from short reads: all the data it needs to successfully decode an object must be available when you use the operator>> to deserialize the object. Therefore, you cannot use it before being sure that all data was received. Which brings us to:
TCP has no built in mechanism for separating data in "records". You can't just send some bytes followed by a "record marker" which tells the receiver that he has received all the data pertinent to a record. What TCP provides is a raw stream of bytes. Eventually, you can (half-)close the connection to signal the other peer that the transmission is over.
1+2 imply that you must use some other mechanism to know (on the receiver side) if you already have all the data you need or you must wait for some more. For instance, you can introduce in-band markers like \r\n (like IRC or - up to a certain degree - HTTP do).
The solution in the fortune example is prepending to the "actual" data (the serialized QString with the fortune message) the length, in bytes, of that data; then it sends the length (as a 16 bit integer) followed by the data itself.
The receiver first reads the length; then it reads up that many bytes, then it knows it can decode the fortune. If there's not enough data available (both for the length - i.e. you received less than 2 bytes - and the payload itself) the client simply does nothing and waits for more.
Note that:
the design ain't new: it's what almost all protocols do. In the "standard" TCP/IP stack, TCP, IP, Ethernet and so on all have a field in their "headers" which specifies the length of the payload (or of the whole "record");
the transmission of the "length" uses a 16bit unsigned integer sent in a specific byte order: it's not memcpy()d into the buffer, but QDataStream is used on it to both store it and read it back. Although it may seem trivial, this actually completes the definition of the protocol you're using.
if QDataStream had been able to recover from short reads (f.i. by throwing an exception and leaving the data in the device), you would not have needed to send the length of the payload, since QDataStream already sends the length of the string (as a 32 bit unsigned bigendian integer) followed by the UTF-16 chars.
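Tying the above together, here is a sketch of a purely signal-driven reader for this length-prefixed protocol, with no waitForReadyRead() inside the slot; sock, m_expected and handleMessage are illustrative member names, not taken from the question:
// Slot connected to sock's readyRead(); it is called repeatedly as data arrives.
void Client::onReadyRead()
{
    QDataStream in(sock);
    in.setVersion(QDataStream::Qt_5_0);

    while (true) {
        if (m_expected == 0) {                               // still waiting for a header
            if (sock->bytesAvailable() < (qint64)sizeof(quint16))
                return;                                      // not even a full length yet
            in >> m_expected;                                // the 2-byte record length
        }
        if (sock->bytesAvailable() < m_expected)
            return;                                          // payload not complete yet

        QString message;
        in >> message;                                       // guaranteed not to short-read now
        m_expected = 0;                                      // ready for the next record
        handleMessage(message);
    }
}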
I am a beginner with C++ and Qt. The data sent is a string of ASCII characters, e.g. "jdlsfjffjf: XX", from which I would like to extract the number XX. I know I should probably use indexOf to find it, but I'm not sure how. Any direction? Here's the server-side code that receives, displays and writes. I get the correct numbers in the application but gibberish characters in the file I'm writing to.
void Receiver::processPendingDatagrams()
{
    while (udpSocket->hasPendingDatagrams()) {
        QByteArray datagram;                               //array of bytes
        datagram.resize(udpSocket->pendingDatagramSize()); //size it depending on sent data
        udpSocket->readDatagram(datagram.data(), datagram.size()); //read all
        statusLabel->setText(tr("%1 C").arg(datagram.data()));

        //writing stream to file
        bool ok;
        QFile file("file.dat");
        file.open(QIODevice::WriteOnly);
        QDataStream out(&file);
        out << datagram.toInt(&ok, 10);
    }
}
One way to extract the number is to take everything to the right of the colon and convert it:
int num = datagram.right(datagram.size() - datagram.indexOf(':') - 1).toInt();
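As for the "gibberish" in file.dat: QDataStream writes the int in its binary serialization format, which is not human-readable. If the file is meant to hold plain text, a sketch with QTextStream (an assumption about the desired file format) would be:
QFile file("file.dat");
if (file.open(QIODevice::WriteOnly | QIODevice::Append | QIODevice::Text)) {
    QTextStream out(&file);
    out << num << '\n';   // readable text instead of QDataStream's raw binary bytes
}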