I am working on a Qt application which can use a script to perform several actions. One command within the script requires an external event to happen before the next command in the list can be processed (which is not the case for the rest of the commands).
Usually, I open the file, read a line of the script and process it. This is repeated until the EOF is reached.
Emitting a signal when the external event occurs is possible, but the function that runs through the script has to be paused during this time span.
How can I achieve this without blocking the GUI?
Thank you!
I would do it this way:
class ScriptRunner : public QObject
{
    Q_OBJECT
public:
    void execute_script() {
        // open the script file
        continue_execution();
    }

public slots:
    void continue_execution() {
        while (!file.atEnd()) {
            // read and process the next command
            if (async_command) {
                // make sure the signal indicating command completion
                // is connected to the continue_execution() slot,
                // then just return; execution resumes when that signal fires
                return;
            }
        }
        emit script_finished();
    }

signals:
    void script_finished();

private:
    QFile file;
    bool async_command = false; // true while a command waits for an external event
};
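For the wiring the comment refers to, here is a minimal sketch; eventSource, its externalEventDone() signal, and the runner instance are placeholders for whatever reports the external event, not anything defined above:
// hypothetical wiring: whatever emits the external event drives the slot
connect(eventSource, &EventSource::externalEventDone,
        &runner, &ScriptRunner::continue_execution);   // the script resumes on the next event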
Related
I am working on a Qt/C++ based front-end app for a Raspberry Pi powered robot. I am using Qt version 5.9 along with the QSerialPort and Pigpio libraries. In my app, when I give the run command for a command sequence to the robot, my Raspberry Pi starts serial communication with a microcontroller: it sends a message and then waits to receive a response. This sending and waiting causes the MainWindow thread to freeze up. I am trying to build in an emergency stop functionality, which would stop the command execution in the middle of the run process.
Towards that effort, I tried to push the serial communication part to a separate thread (QThread). It didn't work out. Now I am trying to build the emergency stop into a QDialog box that opens up when I give the run command and contains an emergency stop QPushButton. The dialog box is run in non-modal form. But in my current code, when I give the run command, a dialog box does open up, but it is completely blank, and it closes when the run command ends (which is intentional).
Can you suggest where I might be going wrong? Or is there a better approach to this issue? Any criticism and suggestions are welcome!
Thanks!
One shouldn't block the main thread in Qt. Every time you call a blocking function, your GUI freezes, and so do any dialog boxes.
One solution is to use signals and slots; they blend really well into Qt. But implementing complicated request/response logic this way usually requires a large state machine, which is prone to errors.
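As a rough sketch of what such a signal/slot request/response chain could look like (the RequestChain class, the newline-terminated protocol, and the command queue are all assumptions made for illustration, not the asker's code):
#include <QObject>
#include <QSerialPort>
#include <QByteArray>
#include <QList>

// Illustrative only: one request is written, the reply arrives via readyRead(),
// and the next request is sent from the slot, so the GUI thread never blocks.
class RequestChain : public QObject
{
    Q_OBJECT
public:
    explicit RequestChain(QSerialPort *port, QObject *parent = nullptr)
        : QObject(parent), serial(port)
    {
        connect(serial, &QSerialPort::readyRead, this, &RequestChain::onReadyRead);
    }

    void start(const QList<QByteArray> &commands) {
        queue = commands;
        sendNext();
    }

signals:
    void finished();

private slots:
    void onReadyRead() {
        buffer += serial->readAll();
        if (!buffer.endsWith('\n'))      // assumption: responses are newline-terminated
            return;                      // incomplete reply, keep waiting
        // ...handle the response in buffer here...
        buffer.clear();
        sendNext();
    }

private:
    void sendNext() {
        if (queue.isEmpty()) {
            emit finished();
            return;
        }
        serial->write(queue.takeFirst());   // next request; the reply drives onReadyRead()
    }

    QSerialPort *serial;
    QList<QByteArray> queue;
    QByteArray buffer;
};
Every extra quirk of the protocol (timeouts, retries, partial replies) adds more states to this chain, which is exactly the complexity mentioned above.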
Sometimes it is better to leave that code blocking, write a plain chain of request/response calls, and put it in another, non-GUI thread. Then use a signal to notify the main thread about the job result.
To stop the execution, you can use an atomic flag and check it between blocking steps. The longest delay before the worker function exits is then the timeout of the single longest blocking call, so tune the timeouts carefully. Alternatively, you can write your own wait function that emulates a timeout together with a stop condition: it checks in a loop whether incoming data is available, and on each iteration tests both the timeout and the stop flag.
// pseudocode here
while (true) {
    if (stopCondition) return;                       // emergency stop requested
    if (currentTime - startTime > timeout) return;   // waited too long, give up
    if (serial->dataReady()) break;                  // data has arrived
}
auto data = serial->getData();
If a step can block forever, then this method can't be used.
Below is an example using the QtConcurrent framework, which demonstrates the use of QFuture and running a function in a separate thread without blocking the main thread. You can put all of your communication logic inside it.
The code is an example only!
#ifndef WORKERCLASS_H
#define WORKERCLASS_H

#include <QObject>
#include <QtConcurrent/QtConcurrent>
#include <QFuture>
#include <QFutureWatcher>

class WorkerClass : public QObject
{
    Q_OBJECT
public:
    explicit WorkerClass(QObject *parent = nullptr) : QObject(parent) {
        // forward the watcher's finished() notification as our own signal
        connect(&futureWatcher, &QFutureWatcher<void>::finished, this, [this] () {
            emit workFinished();
        });
    }

    void startWork(int value) {
        atomic = 0;   // clear the stop flag before starting
        future = QtConcurrent::run(this, &WorkerClass::workFunction, value);
        futureWatcher.setFuture(future);
    }

    void stopWork() {
        atomic = 1;   // ask the worker to stop at the next check
    }

signals:
    void workFinished();

private:
    void workFunction(int value) {
        // runs in a worker thread; put the blocking communication logic here
        for (int i = 0; i < value; ++i) {
            if (atomic) return;   // stop requested
        }
    }

    QFuture<void> future;
    QFutureWatcher<void> futureWatcher;
    QAtomicInt atomic{0};
};

#endif // WORKERCLASS_H
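As a purely illustrative follow-up, the run command could start this worker and the emergency-stop button could be wired to stopWork(); the button and UI names below are assumptions about the asker's main window, not known code:
// e.g. somewhere in the MainWindow setup (names are assumptions)
auto worker = new WorkerClass(this);
connect(worker, &WorkerClass::workFinished, this, [this]() {
    // re-enable the run controls, close the progress dialog, etc.
});
connect(ui->emergencyStopButton, &QPushButton::clicked,
        worker, &WorkerClass::stopWork);   // sets the atomic flag; the worker returns soon after
worker->startWork(1000000);                // the blocking work now runs off the GUI thread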
I am trying to run a script in parallel to my Qt program and am having trouble starting it as a separate process. Check out my attempts below and let me know what you see wrong.
The first attempt was just a system call:
system("python3 startprocess.py");
This works, but it also blocks the program while the script runs.
I then followed this thread https://forum.qt.io/topic/92205/writing-commands-to-linux-bash with no success. No errors, but my script never starts.
After looking at the documentation, I am now trying the code below.
QProcess process;
process.start("python3 startprocess.py");
process.waitForStarted();
I just want to start this script and have it run at the same time as my C++ code. Perhaps I am using QProcess wrong?
UPDATE:
It was a lot easier to use a QThread and the original system call.
I think the issue is that QProcess doesn't have the file path and fails to find and start the script. I suggest using the full file path first. Also check QProcess::setWorkingDirectory and QProcess::setProcessEnvironment, which are useful for handling this case.
Update
To prevent the QProcess from being killed while it is running, and without freezing the GUI, you need to define it as a pointer, then connect to the QProcess::finished signal; in the slot, you can check the exit code and delete the sender using the QObject::deleteLater method. Check both the Qt example and the QProcess::finished documentation.
Update 2
Try this code:
auto process = new QProcess(this);
connect(process, QOverload<int, QProcess::ExitStatus>::of(&QProcess::finished),
        this, [process](int exitCode, QProcess::ExitStatus exitStatus)
{
    if (exitStatus == QProcess::ExitStatus::CrashExit
        || exitCode != 0) {
        // Process error!
    } else {
        // Process OK!
    }
    process->deleteLater();   // clean up the QProcess once it has finished
});
process->setWorkingDirectory("startprocess.py folder location");
process->start("python3", QStringList() << "startprocess.py");   // program and arguments passed separately
if (!process->waitForStarted(-1)) {
    // Failed to start process
    delete process;
}
I'm tracking a log file that is changed by another application. On Linux I receive the fileChanged signal correctly as soon as the other application changes the file. On Windows, QFileSystemWatcher doesn't emit any fileChanged signal until the other application is closed.
I have tried opening the log with Notepad to make sure it is actually being changed, and as soon as Notepad opens the log, QFileSystemWatcher sends the fileChanged signal.
My code:
void LogLoader::createFileWatcher()
{
if(fileWatcher != NULL) delete fileWatcher;
fileWatcher = new QFileSystemWatcher(this);
connect(fileWatcher, SIGNAL(fileChanged(QString)),
this, SLOT(prepareLogWorker(QString)));
if(fileWatcher->addPath(logPath))
{
qDebug() << "LogLoader: "<< "FileWatcher linked.";
}
}
void LogLoader::prepareLogWorker(QString path)
{
//Added this just in case because I read it as solution
//in other question. But in my case the file is not removed.
if (!fileWatcher->files().contains(path))
{
fileWatcher->addPath(path);
}
QTimer::singleShot(1000, this, SLOT(sendLogWorker()));
}
Am I doing something wrong? Is there any other solution than checking the file manually from time to time?
void OBJ_Loader::on_actionOpen_triggered()
{
QString filename = QFileDialog::getOpenFileName(this, tr("Open a File"));
if (!filename.isEmpty()) {
filepath=filename.toUtf8().constData();
command.append(filepath);
int TempNumOne=command.size();
for (int a=0;a<=TempNumOne;a++) { //get letters to a char list so it can be used by system();
cmd[a]=command[a];
}
openfile=true;
if (openfile) {
openfile=false;
system(cmd);
}
}
}
When system(cmd) is called, the QFileDialog window does not close until the system command finishes. I would like to know if I can close the file-selection window right after clicking Open.
The system function blocks the event loop: user interaction requires the event loop to run, and it only runs while your code isn't running. Since the system invocation is in your code, it must not block your process. Use QProcess instead, as it has an asynchronous interface. This answer provides a complete example of one process calling itself, all done from a single executable.
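As a hedged sketch of that idea (not the linked answer's code): replace the system(cmd) call with a heap-allocated QProcess started asynchronously, so the slot returns immediately and the file dialog closes; "some_viewer" is only a placeholder for whatever command the question builds into cmd:
void OBJ_Loader::on_actionOpen_triggered()
{
    const QString filename = QFileDialog::getOpenFileName(this, tr("Open a File"));
    if (filename.isEmpty())
        return;

    // start the external command without blocking the event loop
    auto process = new QProcess(this);
    connect(process, QOverload<int, QProcess::ExitStatus>::of(&QProcess::finished),
            process, &QObject::deleteLater);                      // clean up when the command finishes
    process->start("some_viewer", QStringList() << filename);     // placeholder program name
}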
I am running a QProcess in a timer slot at 1 Hz. The process is designed to invoke a Linux command and parse its output.
The problem is this: after the program runs for about 20 minutes, I get this error:
QProcessPrivate::createPipe: Cannot create pipe 0x104c0a8: Too many open files
QSocketNotifier: Invalid socket specified
Ideally, this program would run for the entire uptime of the system, which may be days or weeks.
I think I've been careful with process control by reading the examples, but maybe I missed something. I've used examples from the Qt website, and they use the same code that I've written, but those were designed for a single use, not thousands. Here is a minimal example:
class UsageStatistics : public QObject {
    Q_OBJECT
public:
    UsageStatistics() : process(new QProcess) {
        timer = new QTimer(this);
        connect(timer, SIGNAL(timeout()), this, SLOT(getMemoryUsage()));
        timer->start(1000); // one second
    }

    virtual ~UsageStatistics() {}

public slots:
    void getMemoryUsage() {
        process->start("/usr/bin/free");
        if (!process->waitForFinished()) {
            // error processing
        }
        QByteArray result = process->readAll();
        // parse result

        // edit, I added these
        process->closeReadChannel(QProcess::StandardOutput);
        process->closeReadChannel(QProcess::StandardError);
        process->closeWriteChannel();
        process->close();
    }

private:
    QProcess *process;
    QTimer *timer;
};
I've also tried manually deleting the process pointer at the end of the function and creating a new one at the beginning. It was worth a try, I suppose.
Free beer for whoever answers this :)
QProcess is derived from QIODevice, so I would say calling close() should close the file handle and solve your problem.
I cannot see the issue; however, one thing that concerns me is a possible invocation overlap in getMemoryUsage(), where it is invoked again before the previous run has finished.
How about restructuring this so that a new QProcess object is used within getMemoryUsage() (on the stack, not new'd), rather than being an instance variable of the top-level class? This would ensure clean-up (with the QProcess object going out-of-scope) and would avoid any possible invocation overlap.
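A minimal sketch of that restructuring, reusing the /usr/bin/free command from the question (illustrative only, not tested against the asker's setup):
void UsageStatistics::getMemoryUsage() {
    QProcess process;                        // local object: killed and cleaned up when it goes out of scope
    process.start("/usr/bin/free");
    if (!process.waitForFinished()) {
        return;                              // error processing
    }
    const QByteArray result = process.readAllStandardOutput();
    // parse result
}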
Alternatively, rather than invoking /usr/bin/free as a process and parsing its output, why not read /proc/meminfo directly yourself? This will be much more efficient.
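For instance, a small sketch of reading /proc/meminfo with QFile; the MemFree field name is as it appears in that file on typical Linux systems, and the helper below is just an illustration:
#include <QFile>
#include <QTextStream>

// returns the MemFree value in kB, or -1 on error
qint64 freeMemoryKb()
{
    QFile meminfo("/proc/meminfo");
    if (!meminfo.open(QIODevice::ReadOnly | QIODevice::Text))
        return -1;

    QTextStream in(&meminfo);
    QString line;
    while (in.readLineInto(&line)) {
        if (line.startsWith("MemFree:"))     // e.g. "MemFree:  123456 kB"
            return line.section(':', 1).trimmed().section(' ', 0, 0).toLongLong();
    }
    return -1;
}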
First, I was in the same situation as you and got the same results.
I think that QProcess cannot handle the opened pipes correctly.
So, instead of QProcess, I decided to use popen() + QFile.
#include <cstdio>   // popen() / pclose()

class UsageStatistics : public QObject {
    Q_OBJECT
public:
    UsageStatistics() {
        timer = new QTimer(this);
        connect(timer, SIGNAL(timeout()), this, SLOT(getMemoryUsage()));
        timer->start(1000); // one second
    }

    virtual ~UsageStatistics() {}

public slots:
    void getMemoryUsage() {
        if (!(in = popen("/usr/bin/free", "r"))) {
            qDebug() << "UsageStatistics::getMemoryUsage() <<" << "Can not execute free command.";
            return;
        }
        freePipe.open(in, QIODevice::ReadOnly);
        connect(&freePipe, SIGNAL(readyRead()), this, SLOT(parseResult()),
                Qt::UniqueConnection);   // avoid duplicate connections on repeated calls
        // OR waitForReadyRead() and parse here.
    }

    void parseResult() {
        // Parse your stuff
        freePipe.close();
        pclose(in); // the exit code can be obtained from the returned status (divide by 256, or use WEXITSTATUS).
    }

private:
    QFile freePipe;
    FILE *in;
    QTimer *timer;
};
tl;dr:
This occurs because your application wants to use more resources than the system-wide resource limits allow. You might be able to work around it with the command described in [2] if you have a huge application, but it is probably caused by a programming error.
Long:
I just solved a similar problem myself. I use a QThread to log the exit codes of QProcesses. The QThread uses curl to connect to an FTP server and upload the log. Since I am testing the software, I hadn't connected the FTP server, and curl_easy_perform apparently waits for a connection. As such, my resource limit was reached and I got this error. After a while my program started complaining, which was the main indicator for figuring out what was going wrong.
[..]
QProcessPrivate::createPipe: Cannot create pipe 0x7fbda8002f28: Too many open files
QProcessPrivate::createPipe: Cannot create pipe 0x7fbdb0003128: Too many open files
QProcessPrivate::createPipe: Cannot create pipe 0x7fbdb4003128: Too many open files
QProcessPrivate::createPipe: Cannot create pipe 0x7fbdb4003128: Too many open files
[...]
curl_easy_perform() failed for curl_easy_perform() failed for disk.log
[...]
I've tested this by connecting the machine to the FTP server after this error transpired. That solved my problem.
Read:
[1] https://linux.die.net/man/3/ulimit
[2] https://ss64.com/bash/ulimit.html
[3] https://bbs.archlinux.org/viewtopic.php?id=234915