Qt: QSslSocket::bytesWritten() signal is emitted too often - c++

I use this code to transfer a large file through a socket without spikes in memory usage:
connect(socket, SIGNAL(bytesWritten(qint64)), this, SLOT(refillSocketBuffer(qint64)));
refillSocketBuffer(128 * 1024);

void FtpRetrCommand::refillSocketBuffer(qint64 bytes)
{
    if (!file->atEnd()) {
        socket->write(file->read(bytes));
    } else {
        socket->disconnectFromHost();
    }
}
This works fine with QTcpSocket, but with an encrypted QSslSocket the bytesWritten() signal is emitted constantly, so my slot keeps writing to the socket far faster than the data can actually be sent through it. Eventually the process's memory usage grows to about 400 MB and the OS kills it.

I found the answer after some more digging; it was in the documentation all along. For SSL sockets I should use encryptedBytesWritten() instead:
Note: Be aware of the difference between the bytesWritten() signal and the encryptedBytesWritten() signal. For a QTcpSocket, bytesWritten() will get emitted as soon as data has been written to the TCP socket. For a QSslSocket, bytesWritten() will get emitted when the data is being encrypted and encryptedBytesWritten() will get emitted as soon as data has been written to the TCP socket.
So I needed to change this code:
connect(socket, SIGNAL(bytesWritten(qint64)), this, SLOT(refillSocketBuffer(qint64)));
to this:
if (socket->isEncrypted()) {
    connect(socket, SIGNAL(encryptedBytesWritten(qint64)), this, SLOT(refillSocketBuffer(qint64)));
} else {
    connect(socket, SIGNAL(bytesWritten(qint64)), this, SLOT(refillSocketBuffer(qint64)));
}
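For completeness, the same branching can be written with the Qt 5 pointer-to-member connect syntax, which catches signal/slot typos at compile time. A minimal sketch, assuming socket is a QSslSocket* and reusing the class and slot names from the snippet above:

if (socket->isEncrypted()) {
    connect(socket, &QSslSocket::encryptedBytesWritten,
            this, &FtpRetrCommand::refillSocketBuffer);
} else {
    connect(socket, &QSslSocket::bytesWritten,
            this, &FtpRetrCommand::refillSocketBuffer);
}
refillSocketBuffer(128 * 1024);  // prime the write buffer with the first chunk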

Related

Is the signal/slot mechanism faster than a queue mechanism?

I am working on a Qt program that communicates with a specific digital board over LAN and a serial port. Packets read from the LAN must be forwarded immediately to another device connected to the PC. The problem is with the packets coming from the serial port: we receive them via the readyRead() signal and call readAll() into a QByteArray inside the connected slot, and reading these packets is delayed. I want to replace the signal/slot mechanism with a producer/consumer queue (using a mutex to protect the critical section). Is a queue mechanism faster than signal/slot or not?
Some pseudocode of what I have in mind (not an executable function):
func1Thread() // consumer thread
{
    while (!stopThread)
    {
        if (queueSize > 0)
        {
            // parse packet
        }
        else
            msleep(100);
    }
}

func2Thread() // producer thread
{
    while (!stopThread)
    {
        if (socket.waitForReadyRead())
        {
            QByteArray ba = socket.readAll();
            // enqueue here
        }
    }
}
thanks
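For what it's worth, here is a minimal sketch of the producer/consumer queue described above, assuming the producer and consumer run in separate QThreads. A QWaitCondition lets the consumer block until data arrives instead of polling with msleep(100); whether this ends up faster than queued signal/slot delivery depends on your workload, since both ultimately hand data between threads under a lock. The class and member names are made up for the example.

#include <QByteArray>
#include <QMutex>
#include <QQueue>
#include <QWaitCondition>

// Thread-safe packet queue (illustrative sketch).
class PacketQueue
{
public:
    // Called from the producer thread after socket.readAll().
    void enqueue(const QByteArray &packet)
    {
        QMutexLocker locker(&m_mutex);
        m_queue.enqueue(packet);
        m_notEmpty.wakeOne();               // wake a blocked consumer
    }

    // Called from the consumer thread; blocks until a packet is available.
    QByteArray dequeue()
    {
        QMutexLocker locker(&m_mutex);
        while (m_queue.isEmpty())
            m_notEmpty.wait(&m_mutex);      // atomically releases and re-acquires the mutex
        return m_queue.dequeue();
    }

private:
    QMutex m_mutex;
    QWaitCondition m_notEmpty;
    QQueue<QByteArray> m_queue;
};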

The slot of readMessageFromTCPServer could not be triggered sometimes after receiving message

I use Qt Creator 5.5.1 on Windows 7.
The compiler is VC 2010, 32-bit.
I have written a socket client. It connects fine, but sometimes its readyRead() signal is not triggered after a message is received from the server, so the readMessageFromTCPServer() slot is not called and the thread cannot run.
void MainWindow::on_pushBtn_LoadCfg_clicked()
{
    if (tcpClient == NULL)
    {
        tcpClient = new QTcpSocket;
        tcpClient->connectToHost(ui->txtIPServer->text(), ui->txtPortServer->text().toInt());
        QObject::connect(tcpClient, SIGNAL(readyRead()), this, SLOT(readMessageFromTCPServer()));
    }
}
void MainWindow::readMessageFromTCPServer()
{
    std::string r = "start";
    QByteArray qba;
    qba = tcpClient->readAll();
    if (qba.contains(r.c_str()))
    {
        std::cout << "thread run";
    }
}
When I tried to debug this program, I put a breakpoint at the Sleep(800) line, but sometimes this slot was not triggered after receiving a message from the socket server. I have checked that the socket is still connected, so why is the slot not triggered?
There are some errors in your code. As posted, the slot will only be triggered once.
Get rid of those Sleep calls. There is no good reason to use them if you are working in the main thread.
QByteArray qba = NULL; makes no sense; QByteArray qba; is fine.
while(1) means looping forever; you should break out of the loop at some point. Actually, you do not need this loop at all: move the code inside it out and remove the loop. When the readyRead() signal is emitted, it means there is some data to be read. You can readAll() that chunk, do your work and return.
There is no guarantee that you will get your entire message in one round, so in some circumstances you may get "St" in one signal and "art" in another. You should therefore implement your own buffering mechanism for this situation; it tends to happen with big chunks of data. If you are sure you are receiving very short packets, it is acceptable to rely on the internal buffer of QTcpSocket.
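To illustrate that last point, a minimal sketch of such a buffering slot, assuming messages are newline-terminated (the delimiter and the m_buffer member are assumptions for the example, not part of the original code):

void MainWindow::readMessageFromTCPServer()
{
    // m_buffer is a QByteArray member that survives between readyRead() calls.
    m_buffer.append(tcpClient->readAll());          // may end with a partial message

    for (;;) {
        const auto pos = m_buffer.indexOf('\n');    // assumed message delimiter
        if (pos == -1)
            break;                                  // no complete message yet
        const QByteArray message = m_buffer.left(pos);
        m_buffer.remove(0, pos + 1);                // drop the message and its delimiter
        if (message.contains("start"))
            qDebug() << "thread run";               // needs #include <QDebug>
    }
}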

Serial Communication Timeout in QT with Arduino

I want to implement a timeout mechanism such that if the Arduino doesn't read the command within one second, a timeout is reported, the new command is discarded, and the program keeps running normally.
But right now, the program hangs if any new command is sent during the execution of the old one.
This is the timeout section of my code:
QByteArray requestData = myRequest.toLocal8Bit();
serial.write(requestData);
if (serial.waitForBytesWritten(waitTime)) {
    if (serial.waitForReadyRead(myWaitTimeout)) {
        QByteArray responseData = serial.readAll();
        while (serial.waitForReadyRead(10))
            responseData += serial.readAll();
        QString response(responseData);
        emit this->response(response);
    } else {
        emit timeout(tr("Wait Read Request Timed Out %1")
                     .arg(QTime::currentTime().toString()));
    }
} else {
    emit timeout(tr("Wait Write Request Timed Out %1")
                 .arg(QTime::currentTime().toString()));
}
The timeout signal is connected to a slot that just prints the timeout message and does nothing.
How can I fix this so that it behaves the way I want?
You are using a blocking approach to transmit data via the serial port. Unless you are using threads, I don't see how any additional data could be sent while the previous loop is still executing.
By the way, your program will block indefinitely if the Arduino manages to keep sending something within every 10 ms period, because the waitForReadyRead(10) loop never ends.
Add a couple of qDebug() << "I'm here"; lines to check where your code gets stuck; it is possible that you are blocking somewhere outside the code you pasted here. An alternative is to use a debugger.
What if the previous command you tried to send is still in the buffer? You will end up filling the output buffer. Check with qDebug() how many bytes are in the output buffer before writing more data to it; it should be empty (qint64 QIODevice::bytesToWrite() const).
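A minimal sketch of that check, placed just before the write in the code above (the early return is only one possible policy; you could also queue the command and retry later):

// Refuse to send a new command while the previous one is still buffered.
if (serial.bytesToWrite() > 0) {
    qDebug() << "Previous command still pending:" << serial.bytesToWrite() << "bytes";
    return;
}
serial.write(requestData);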

Qt QNetworkAccessManager long delay to emit finished signal

I use a QNetworkAccessManager (QNAM) to handle uploads over the FTP protocol.
The whole process works, but I see some strange behavior.
This is my method:
void ftp::uploadFile(const QString &origin, const QString &destination)
{
    QUrl url("ftp://" + host + "" + destination);
    url.setUserName(user);
    url.setPassword(pwd);
    url.setPort(21);
    localFile = new QFile(origin, this);
    if (localFile->open(QIODevice::ReadOnly))
    {
        reply = nam->put(QNetworkRequest(url), localFile);
        QObject::connect(reply, SIGNAL(uploadProgress(qint64, qint64)), SLOT(transferProgress(qint64, qint64)));
        QObject::connect(reply, SIGNAL(finished()), this, SLOT(transferFinished()));
    }
    else
        qDebug() << localFile->errorString();
}
When I upload a file, uploadProgress is emitted:
qDebug() << sent << "/" << total;
outputs everything from 0/x up to x/x.
Then it takes a long time, maybe up to 20 seconds, before the finished signal is emitted.
Why this delay?
I tried ignoring the finished signal and emitting the signal myself when the progress reaches sent == total, but the file is corrupted at the other end. (It's not really corrupted, as I only send JPEGs; the resulting file is an upper-half-only JPEG, and a big part of it is just grey.)
I'd like to provide my users with a progress bar where 100% really means the process is finished.
Uploading for 5 seconds and then sitting at 100% for another 20 seconds isn't really nice.
File upload does some buffering in the background (Qt socket buffers, system socket buffers, network buffers), so the uploadProgress signal only means you have handed the data off somewhere, not that the server has received it.
The finished signal, on the other hand, is emitted when all data has been transferred to the remote side and the buffers have been flushed. If you need to know the exact amount transferred, you may look into disabling buffering/caching on the request, the socket(s) or the QNAM.
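If the goal is just an honest progress bar, one option (an illustrative sketch, not from the answer; progressBar is an assumed QProgressBar member) is to hold the displayed value below 100% until finished() actually fires:

void ftp::transferProgress(qint64 sent, qint64 total)
{
    if (total > 0)
        progressBar->setValue(qMin(99, int(sent * 100 / total)));  // hold back the last percent
}

void ftp::transferFinished()
{
    progressBar->setValue(100);   // only now is the upload really complete
}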

Broadcast large data with Qt sockets

I'm using Qt and I need to broadcast data, so I am trying QUdpSocket. But the data can be too big: after writeDatagram(), QUdpSocket::error() returns DatagramTooLargeError. So I split the data and call writeDatagram() several times, once per part. However, the receiving socket only receives data once, only the first packet. The receive code is:
connect(&m_socketReceiver, &QUdpSocket::readyRead, this, &LocalNetSharing::onDataRead);

void LocalNetSharing::onDataRead()
{
    while (m_socketReceiver.hasPendingDatagrams())
    {
        QByteArray datagram;
        datagram.resize(m_socketReceiver.pendingDatagramSize());
        m_socketReceiver.readDatagram(datagram.data(), datagram.size());
        // process data
    }
}
From the Qt documentation for the QUdpSocket class:
Note: An incoming datagram should be read when you receive the readyRead() signal, otherwise this signal will not be emitted for the next datagram.
So it seems that you are not reading the entire datagram in each call of onDataRead().
Also, you don't specify the host and port in readDatagram(). I am not sure if that is the reason, but the correct form is:
QHostAddress host;
quint16 port;
while (m_socketReceiver.hasPendingDatagrams())
{
    QByteArray datagram;
    datagram.resize(m_socketReceiver.pendingDatagramSize());
    m_socketReceiver.readDatagram(datagram.data(), datagram.size(), &host, &port);
    // process data
}
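On the sending side, each part has to fit in a single datagram, and the receiver needs enough information to put the parts back together (and to notice lost or reordered datagrams, which UDP does not protect against). A sender-side sketch, where the 8000-byte chunk size and the index/total header are assumptions for illustration:

#include <QByteArray>
#include <QDataStream>
#include <QHostAddress>
#include <QUdpSocket>

void broadcastChunks(QUdpSocket &socket, const QByteArray &payload,
                     const QHostAddress &address, quint16 port)
{
    const int chunkSize = 8000;   // stays well below the ~64 KB UDP datagram limit
    const int total = (int(payload.size()) + chunkSize - 1) / chunkSize;

    for (int i = 0; i < total; ++i) {
        const int offset = i * chunkSize;
        const int len = qMin(chunkSize, int(payload.size()) - offset);

        QByteArray datagram;
        QDataStream out(&datagram, QIODevice::WriteOnly);
        out << qint32(i) << qint32(total);                 // small reassembly header
        out.writeRawData(payload.constData() + offset, len);

        socket.writeDatagram(datagram, address, port);
    }
}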