I use a QNetworkAccessManager (QNAM) to handle uploads over the FTP protocol.
The whole process works, but I see some strange behavior.
This is my method:
void ftp::uploadFile(const QString &origin, const QString &destination)
{
    QUrl url("ftp://" + host + destination);
    url.setUserName(user);
    url.setPassword(pwd);
    url.setPort(21);

    localFile = new QFile(origin, this);
    if (localFile->open(QIODevice::ReadOnly)) {
        reply = nam->put(QNetworkRequest(url), localFile);
        QObject::connect(reply, SIGNAL(uploadProgress(qint64,qint64)), this, SLOT(transferProgress(qint64,qint64)));
        QObject::connect(reply, SIGNAL(finished()), this, SLOT(transferFinished()));
    } else {
        qDebug() << localFile->errorString();
    }
}
When I upload a file, the uploadProgress signal is emitted, and
qDebug() << sent << "/" << total;
outputs 0/x up to x/x.
Then it takes a long time, maybe up to 20 seconds, before the finished signal is emitted.
Why this delay?
I tried ignoring the finished signal and emitting a signal myself once the progress reaches sent == total, but the file is corrupted at the other end. (It's not really corrupted; I only send JPGs, and the resulting file is an upper-half-only JPG, with a big part just grey.)
I'd like to provide my users with a progress bar where 100% really means the process is finished.
Uploading for 5 seconds and then sitting at 100% for 20 seconds isn't really nice.
File upload does some buffering in the background (Qt socket buffers, system socket buffers, network buffers), so the uploadProgress signal only means you have handed the data off to one of those buffers, not that the server has received it.
The finished signal is emitted once all data has been transferred to the remote side and the buffers have been flushed. If you need to know the exact amount transferred, you could look into disabling request, socket, or QNAM buffering/caching.
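If the goal is simply a progress bar where 100% genuinely means the upload is done, one option is to cap the displayed value until finished() fires. A minimal sketch of the two slots connected above, assuming a QProgressBar member named progressBar (which is not part of the original code):

void ftp::transferProgress(qint64 sent, qint64 total)
{
    if (total <= 0)
        return;
    int percent = static_cast<int>((sent * 100) / total);
    progressBar->setValue(qMin(percent, 99));   // hold just below 100% while data is still buffered
}

void ftp::transferFinished()
{
    progressBar->setValue(100);   // only now has the server really received everything
    localFile->close();
    reply->deleteLater();
}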
I use Qt Creator 5.5.1 on Windows 7.
The compiler is VC 2010, 32-bit.
I have written a socket client. It connects fine, but sometimes its readyRead() signal is not triggered after a message is received from the server, so the readMessageFromTCPServer() slot is not called and the thread cannot run.
void MainWindow::on_pushBtn_LoadCfg_clicked()
{
    if (tcpClient == NULL)
    {
        tcpClient = new QTcpSocket;
        tcpClient->connectToHost(ui->txtIPServer->text(), ui->txtPortServer->text().toInt());
        QObject::connect(tcpClient, SIGNAL(readyRead()), this, SLOT(readMessageFromTCPServer()));
    }
}

void MainWindow::readMessageFromTCPServer()
{
    std::string r = "start";
    QByteArray qba;
    qba = tcpClient->readAll();
    if (qba.contains(r.c_str()))
    {
        std::cout << "thread run";
    }
}
When I tried to debug this program, I put a breakpoint at the line Sleep(800), but sometimes this slot is not triggered after a message is received from the socket server. I have checked that the socket is still connected, so why is the slot not triggered?
There are some errors in your code; as written, the slot will only be triggered once.
Get rid of those Sleep calls. There is no good reason to use them if you are working in the main thread.
QByteArray qba = NULL; makes no sense; QByteArray qba; is fine.
while(1) means forever, so you should break the loop at some point. Actually, you do not need this loop at all: take the code inside it out and remove the loop. When the readyRead() signal is emitted, it means there is some data to be read; you can readAll() that chunk of data, do your work and return.
There is no guarantee that you will get your entire message in one go, so in some circumstances you may get "St" in one signal and "art" in another. You should therefore implement your own buffering mechanism for such situations, as in the sketch below. This tends to happen with big chunks of data; if you are sure you are receiving very short packets, it is fine to rely on the internal buffer of QTcpSocket.
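A minimal sketch of such a buffering slot, as a revised readMessageFromTCPServer(), assuming a QByteArray member named buffer and newline-terminated messages (both assumptions for illustration, not part of the original code):

void MainWindow::readMessageFromTCPServer()
{
    buffer.append(tcpClient->readAll());   // accumulate whatever has arrived so far

    // Process every complete, newline-terminated message in the buffer.
    int pos;
    while ((pos = buffer.indexOf('\n')) != -1) {
        QByteArray message = buffer.left(pos);
        buffer.remove(0, pos + 1);
        if (message.contains("start"))
            qDebug() << "thread run";
    }
}

Partial messages stay in the buffer until the rest arrives with a later readyRead().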
I'm trying to send data from a PSoC via UART to my PC, where I want to store the data with Qt. The PSoC sends 3 bytes of data, and these 3 bytes are repeated at a frequency of 2.5 Hz. When I check the signals with my oscilloscope everything is fine, and when I receive the data with the HTerm software everything is also as expected. But when I use my C++/Qt code, not all of the data is received; only a third of it ends up in memory. I expected the readyRead signal to be emitted for every new byte, but it seems the signal is only emitted at the start of the 3-byte package. My qDebug output also doesn't react to changes on the PSoC: when I change values on the PSoC, the qDebug output doesn't change.
I already tried reading 3 bytes (serial->read(3)); at first I received some single bytes, and only after a few reads did I get the 3 bytes I sent, but this is not very reproducible.
connect(serial, SIGNAL(readyRead()), this, SLOT(readData()));
serial->setPortName(gui->ui->comboBox->currentData().toString());
serial->setBaudRate(QSerialPort::Baud115200);
serial->setDataBits(QSerialPort::Data8);
serial->setParity(QSerialPort::NoParity);
serial->setStopBits(QSerialPort::OneStop);
serial->setFlowControl(QSerialPort::NoFlowControl);
void uart::readData()
{
QByteArray data = serial->read(1);
qDebug() << data;
}
I expect output like "0x01" "0x02" "0x03" 2.5 times a second, but I only get "0x01".
You are only reading a fixed size with read().
It could be that you get readyRead() signals with a varying number of bytes available, but you only ever read a fixed number of them.
In your readyRead() slot, try reading all available bytes:
qint64 available = serial->bytesAvailable();
if (available > 0)
{
QByteArray data = serial->read(available);
qDebug() << data;
}
You can also use the readAll() function.
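Since the serial port does not preserve message boundaries, a readyRead() handler that accumulates bytes and only acts on complete 3-byte frames is more robust than relying on whatever chunk sizes the driver happens to deliver. A minimal sketch, assuming a QByteArray member named rxBuffer (an assumption for illustration):

void uart::readData()
{
    rxBuffer.append(serial->readAll());   // take everything that has arrived

    // Hand out complete 3-byte frames; keep any partial frame for the next readyRead().
    while (rxBuffer.size() >= 3) {
        QByteArray frame = rxBuffer.left(3);
        rxBuffer.remove(0, 3);
        qDebug() << frame.toHex();
    }
}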
I just found the solution!
You have to set the read-buffer size to the right value.
So for reading a packet of three bytes I must set:
serial->setReadBufferSize(3);
I want to implement a timeout mechanism such that if the Arduino doesn't read the command within one second, it results in a timeout, the new command is discarded and the program carries on fine.
But right now, the program hangs if a new command is sent during the execution of the old one.
This is the timeout section of my code:
QByteArray requestData = myRequest.toLocal8Bit();
serial.write(requestData);
if (serial.waitForBytesWritten(waitTime)) {
if (serial.waitForReadyRead(myWaitTimeout)) {
QByteArray responseData = serial.readAll();
while (serial.waitForReadyRead(10))
responseData += serial.readAll();
QString response(responseData);
emit this->response(response);
} else {
emit timeout(tr("Wait Read Request Timed Out %1")
.arg(QTime::currentTime().toString()));
}
} else {
emit timeout(tr("Wait Write Request Timed Out %1")
.arg(QTime::currentTime().toString()));
}
The timeout signal is connected to a slot that just prints the timeout message and does nothing else.
How can I fix this so that I can achieve what I'm aiming for?
You are using a blocking approach to transmit data via the serial port. Unless you are using threads, I don't see how you could send any additional data while the previous loop is still executing.
BTW: your program will, for example, block indefinitely if the Arduino manages to keep sending something within 10 ms periods.
Add a couple of qDebug() << "I'm here"; lines to check where your code gets stuck; it is possible that you are blocking somewhere outside the code you pasted here. The alternative is to use a debugger.
What if the previous command you tried to send is still in the buffer? You'll end up filling the output buffer. Check with qDebug how many bytes are in the output buffer before writing more data to it; the buffer should be empty (qint64 QIODevice::bytesToWrite() const). A sketch of such a check follows.
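A minimal sketch of that check, assuming serial and requestData are the objects used above and that simply skipping the write is an acceptable way to discard the new command:

// Hypothetical guard: only send a new command once the previous one
// has actually left the output buffer.
if (serial.bytesToWrite() > 0) {
    qDebug() << "Output buffer not empty," << serial.bytesToWrite()
             << "bytes still pending; discarding new command";
    return;
}
serial.write(requestData);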
I'm trying to understand why the a.exec() call in the following Qt 4.8 code does not need to happen before my QProcess waitForFinished() and waitForStarted() calls can work. I understand that a.exec() starts the event loop, and in my mind the waitFor* functions need to receive a signal (i.e. started() or finished()) before execution can move on. How can this happen if the event loop has not been started?
Documentation for waitForStarted():
Blocks until the process has started and the started() signal has been emitted, or until msecs milliseconds have passed.
Code:
int main(int argc, char *argv[])
{
QCoreApplication a(argc, argv);
// Exec the i2c command to get the power button status
QProcess i2cprocess;
i2cprocess.start("/usr/bin/i2cget -f -y 1 0x4b 0x45");
// Wait for it to start
if(!i2cprocess.waitForStarted())
{
qDebug() << "Could not start QProcess to check power button status.";
exit(-1);
}
// Wait for it to finish
bool returnValue = i2cprocess.waitForFinished();
if ( returnValue )
{
QByteArray status = i2cprocess.readAllStandardOutput().trimmed();
bool ok;
quint16 hexValue = status.toUInt(&ok, 16);
qDebug() << "Power button status: " << status << hexValue << (hexValue & 0x01);
// We want LSB
exit(hexValue & 0x01);
}
else
{
qDebug() << "Error, process never completed to check power button status.";
exit(-1);
}
return a.exec();
}
Direct signal-slot connections are simply indirect function calls; they have nothing whatsoever to do with event loops.
What the waitForXxxx methods do, though, is spin a local event loop until the signal fires. The signal is fired from code that gets notified by the OS that the process state has changed; that code is executed, functionally, "by" the event loop.
Remember that in Qt you can create temporary event loops on a whim. It's a bad practice, though, and you should never write code that uses the waitFor methods: it places requirements on your code that are not normally present, and thus introduces bugs!
The question is then: what is the use of an event loop when waiting for a process to change state? Internally, the process notifications require waiting for native events or callbacks, and those all get handled from an event loop. Even if no events are used, the event loop performs an interruptible sleep that the OS needs in order to deliver callbacks to the application.
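For illustration, here is a minimal sketch of the signal-based alternative for the code in the question, in which a.exec() drives the process notifications and the waitFor calls disappear (error handling is reduced to also quitting on the error() signal; this is a sketch, not the original author's code):

#include <QCoreApplication>
#include <QProcess>
#include <QDebug>

int main(int argc, char *argv[])
{
    QCoreApplication a(argc, argv);

    QProcess i2cprocess;
    // Stop the event loop when the process finishes (or fails to start),
    // instead of blocking in waitForStarted()/waitForFinished().
    QObject::connect(&i2cprocess, SIGNAL(finished(int,QProcess::ExitStatus)), &a, SLOT(quit()));
    QObject::connect(&i2cprocess, SIGNAL(error(QProcess::ProcessError)), &a, SLOT(quit()));
    i2cprocess.start("/usr/bin/i2cget -f -y 1 0x4b 0x45");

    a.exec();   // the event loop now delivers the process state-change notifications

    QByteArray status = i2cprocess.readAllStandardOutput().trimmed();
    bool ok;
    quint16 hexValue = status.toUInt(&ok, 16);
    qDebug() << "Power button status:" << status << hexValue << (hexValue & 0x01);
    return ok ? (hexValue & 0x01) : -1;   // we want the LSB
}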
The Qt documentation for QProcess states: -
QProcess provides a set of functions which allow it to be used without an event loop, by suspending the calling thread until certain signals are emitted:
waitForStarted() blocks until the process has started.
waitForReadyRead() blocks until new data is available for reading on the current read channel.
waitForBytesWritten() blocks until one payload of data has been written to the process.
waitForFinished() blocks until the process has finished.
Calling these functions from the main thread (the thread that calls QApplication::exec()) may cause your user interface to freeze.
I use this code to transfer a large file through a socket without spikes in memory usage:
connect(socket, SIGNAL(bytesWritten(qint64)), this, SLOT(refillSocketBuffer(qint64)));
refillSocketBuffer(128*1024);
}
void FtpRetrCommand::refillSocketBuffer(qint64 bytes)
{
if (!file->atEnd()) {
socket->write(file->read(bytes));
} else {
socket->disconnectFromHost();
}
}
This works fine with QTcpSocket, but with an encrypted QSslSocket the bytesWritten() signal is emitted constantly, which causes my function to write to the socket all the time, far quicker than the data can actually be sent through the socket, so eventually its memory usage climbs to 400 MB and the OS kills the process.
I just found the answer after some more digging; it was in the documentation, actually. It seems that I should use encryptedBytesWritten() for SSL sockets instead:
Note: Be aware of the difference between the bytesWritten() signal and the encryptedBytesWritten() signal. For a QTcpSocket, bytesWritten() will get emitted as soon as data has been written to the TCP socket. For a QSslSocket, bytesWritten() will get emitted when the data is being encrypted and encryptedBytesWritten() will get emitted as soon as data has been written to the TCP socket.
So I needed to change this code:
connect(socket, SIGNAL(bytesWritten(qint64)), this, SLOT(refillSocketBuffer(qint64)));
to this:
if (socket->isEncrypted()) {
connect(socket, SIGNAL(encryptedBytesWritten(qint64)), this, SLOT(refillSocketBuffer(qint64)));
} else {
connect(socket, SIGNAL(bytesWritten(qint64)), this, SLOT(refillSocketBuffer(qint64)));
}