Events pool in Qt - C++

I'm working on a Qt app, and at some point I have a class (I'll call it "engine" here) that governs the program: it is driven by a timer whose timeout signal makes the class draw and advance the app logic. Moreover, it receives events that are caught from a QGraphicsScene.
On each engine "tick", update() is called on the scene, refreshing the drawing according to the app's evolution.
Naturally, I want the drawing to be synchronized with the reactions to the events; otherwise, some object could be drawn while the reaction to an event is destroying that very object, causing a segfault.
I've tried using a queue in the engine, so that the engine only reacts to those events at a specific point of an update and thus doesn't interfere with the drawing part.
Two problems arose:
I cannot make a copy of a QGraphicsEvent. Apparently the copy constructor is private (which I assume is for a good reason anyway).
While the class is processing the queued events, before the drawing, a new event may arrive, which can be "bad" because that event is again not synchronized.
Given this situation, is there any approach that solves it? Is there a standard procedure in Qt for this? That is, how do I ensure the drawing is never desynchronized from the application's reactions to events?

Related

What is the correct way to display widgets without calling QApplication::exec()?

For test purposes I'd like to create and display a widget. For now I only need the widget to render correctly, but in the future I may want to extend this to simulate various events and see how the widget behaves.
From various sources it would appear that the following should work:
QApplication app(argc, argv);
QPushButton button("Hello");
button.show();
// Might also be necessary:
QApplication::processEvents();
But for me the widget does not render correctly. A window is created to display the widget; however, it is entirely black.
I can get the widget to render correctly by adding the following lines:
std::this_thread::sleep_for(std::chrono::milliseconds(10));
QApplication::processEvents();
With 10 milliseconds being about the smallest time necessary to get the widget to render correctly.
Does anyone know how to get this to work without the time delay, or know why the delay is necessary?
To test a Qt GUI application you need at least a QApplication instance and an event loop being processed. The fastest way is to just use the QTEST_MAIN macro; this answer explains nicely what it does exactly. However, to have more flexibility (e.g. to use GTest, GMock) you can also simply create a QApplication instance in your test's main() (no need to call exec()).
Then, to have the events processed, you should invoke QTest::qWait. This will process your events for the specified amount of time. It is good practice to use qWaitFor, which accepts a predicate - this way you avoid race conditions in your tests.
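A minimal sketch of such a predicate-based wait, reusing the QPushButton from the question (for this particular condition, QTest::qWaitForWindowExposed would be a ready-made alternative):
#include <QTest>
#include <QPushButton>

// inside a test, after the QApplication instance has been created
QPushButton button("Hello");
button.show();
// Process events until the button's window has actually been exposed,
// instead of sleeping for an arbitrary fixed time.
ASSERT_TRUE(QTest::qWaitFor([&]() {
    return button.windowHandle() && button.windowHandle()->isExposed();
}, 5000));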
In the particular scenario where you expect some signal to be emitted, you can also use the similar functionality of QSignalSpy::wait.
A small example where we want to wait until some parameters are passed from one item to another:
QSignalSpy spy(&widget1, &Widget1::settingsChanged);
widget2->applySettings();
ASSERT_TRUE(spy.wait(5000));
// do some further tests based on the content of passed settings
Why don't you want to have the application run exec()? The process of displaying a widget is not "static". You don't "draw" the widget on the screen; rather, you have an application that listens for various events and receives draw events from the windowing manager. The application can only draw the widget when the windowing manager asks it to.
The reason your second code works is that you wait long enough for the windowing manager to have sent the "draw" request under your conditions. This does not guarantee it will always work.
If you want to guarantee the display of the widget, you need to start a loop and wait until you have received at least one draw event, but even that isn't foolproof.
As expertly described by Vincent Fourmond, widgets are not a one-off deal. The GUI is non-blocking and for this, it needs to run in an event loop.
The exec() method starts this event loop, which you mimicked by polling.
While it is possible to combine Qt's event loop with other event loops, I would recommend a simpler solution:
Proceed with your program inside the event loop by calling a method once the loop starts. There is an excellent answer here on how to do this: https://stackoverflow.com/a/8877968/21974
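A minimal sketch of that pattern, assuming the QPushButton from the question (the lambda body is just a placeholder for whatever checks or simulated events you want to run):
#include <QApplication>
#include <QPushButton>
#include <QTimer>
#include <QDebug>

int main(int argc, char *argv[])
{
    QApplication app(argc, argv);
    QPushButton button("Hello");
    button.show();
    // The zero-timeout single shot fires once the event loop is running,
    // i.e. inside exec(), so the widget receives its paint events normally.
    QTimer::singleShot(0, &button, [&button]() {
        qDebug() << "event loop started, button visible:" << button.isVisible();
        // ... drive your checks / simulated events from here ...
    });
    return app.exec();
}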
Since you mentioned unit testing, there is also a signal you can use for doing checks at the end of the lifecycle (before the widgets are destroyed): QApplication::aboutToQuit. It is emitted when the last window is closed (programmatically or by the user).

How does View get updated behind the scenes?

So when I use a setText() on a QLabel for example, Qt automatically updates the view/gui for me and the new text is shown, but what happens behind the scenes? Is there an update function that gets called automatically when using functions like setText()?
Thanks!!
You should check the basic documentation in this link.
The internal system is a little more complex, but in general it follows the observer pattern. This mechanism allows detecting a user action or a change of state, and responding to it.
Low-level interactions, like refreshing the screen, are implemented via the Event System:
In Qt, events are objects, derived from the abstract QEvent class, that represent things that have happened either within an application or as a result of outside activity that the application needs to know about. Events can be received and handled by any instance of a QObject subclass, but they are especially relevant to widgets. This document describes how events are delivered and handled in a typical application.
So, regarding the display process, there is a dedicated event. A QWidget object handles/subscribes to paint events; see QWidget::paintEvent:
This event handler can be reimplemented in a subclass to receive paint events passed in event. A paint event is a request to repaint all or part of a widget.
When you call QLineEdit::setText(), the widget will be repainted the next time a paint event is delivered, depending on the OS configuration, refresh rate, etc.
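As a hedged sketch of that flow (the class below is invented for illustration): a setter changes the widget's state and calls update(), which only schedules a paint event; the actual drawing happens later, when Qt delivers the event to paintEvent().
#include <QWidget>
#include <QPainter>

class ValueDisplay : public QWidget
{
    Q_OBJECT
public:
    void setValue(int value)
    {
        m_value = value;
        update();   // posts a QPaintEvent; nothing is drawn right here
    }

protected:
    void paintEvent(QPaintEvent *) override
    {
        // Called by Qt when the scheduled paint event is delivered.
        QPainter p(this);
        p.drawText(rect(), Qt::AlignCenter, QString::number(m_value));
    }

private:
    int m_value = 0;
};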
For high-level interactions, Qt uses a similar pattern based on the signal/slot mechanism:
Observer pattern is used everywhere in GUI applications and often leads to some boilerplate code. Qt was created with the idea of removing this boilerplate code and providing a nice and clean syntax, and the signal and slots mechanism is the answer.
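For example, a single connect() is enough to keep a label in sync with a line edit - no manual repaint call is needed, because the label's setText() slot schedules one internally (snippet assumes a QApplication and event loop are running):
#include <QLabel>
#include <QLineEdit>

QLineEdit edit;
QLabel label;
// Observer pattern via signals/slots: every text change updates the label,
// which in turn schedules its own repaint.
QObject::connect(&edit, &QLineEdit::textChanged,
                 &label, &QLabel::setText);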

How to show paginated text from a QTextDocument in QML?

I currently have a C++ class inheriting from QQuickPaintedItem. I use it to paint laid-out, paginated rich text from a QTextDocument via QTextDocument::drawContents (or by directly calling its QTextDocumentLayout's draw method).
However, as stated in QQuickPaintedItem's documentation, there are threading issues to be aware of:
Warning: Extreme caution must be used when creating QObjects, emitting signals, starting timers and similar inside this function as these will have affinity to the rendering thread.
Specifically, in this case, QTextDocumentLayoutPrivate has timers which get started/stopped when QTextDocumentLayout::draw is called. Unfortunately, the QTextDocument, and thus the timers, live in the QML main thread, while paint is called in the render thread, leading to messages like
QBasicTimer::start: Timers cannot be started from another thread
While this doesn't affect the functionality of my application (so far), this is probably not a good thing™.
Therefore, my question is whether there is a better way to show the paginated text in QML (not necessarily involving QQuickPaintedItem).
For now I'm still using the QQuickPaintedItem, and when paint is called I do the following:
First I check whether the QTextDocument has its thread affinity set to the current thread. If so, I proceed as normal.
Otherwise, QMetaObject::invokeMethod is used to call a method which moves the document to the rendering thread and calls update() to trigger a repaint, which now works because the thread affinity is correct. At the end of paint, the QTextDocument's thread affinity is set back to the original thread.
This works as far as I can tell (as in, no more warnings), but feels conceptually rather wrong.
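Roughly, the workaround looks like this (a sketch only: the item and member names are made up, and it relies on the functor overload of QMetaObject::invokeMethod available since Qt 5.10):
// member function of the QQuickPaintedItem subclass; m_document is the QTextDocument*
void PaginatedTextItem::paint(QPainter *painter)   // runs on the render thread
{
    QThread *renderThread = QThread::currentThread();
    if (m_document->thread() != renderThread) {
        // Ask the thread the document currently lives in (the QML main thread)
        // to push it over to the render thread, then schedule another paint.
        QMetaObject::invokeMethod(m_document, [this, renderThread]() {
            m_document->moveToThread(renderThread);
            update();
        });
        return;
    }

    m_document->drawContents(painter);

    // Hand the document back to the GUI thread before leaving paint().
    m_document->moveToThread(qApp->thread());
}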

Qt: Creating many complex widgets

I have a Qt application that allows a user to define a list of models. Each model defined has a fairly complex widget class that is created for it. Currently this is all being done in the main (GUI) thread. The program works fine for normal use, but when I stress-tested it by creating 200 to 1000 models, the creation of the GUI took a VERY long time (20 seconds to many minutes).
I made a couple of attempts at threading out the work using QtConcurrent::run, but the program always complained because "Widgets must be created in the GUI thread."
I found this answer on SO: How do I create a Window in different QT threads?. But that answer doesn't seem to do much in the new thread except tell the main thread to create a new widget, which does not distribute the workload as far as I can tell.
With all of that, I have a couple of direct questions:
1) Is there a better design (faster performance) than looping through my list of models and serially creating each widget?
2) If this process is possible to be multithreaded, can someone point me in the direction of a simple example?
There really is no way to move widget work to other threads. OpenGL rendering, or some data processing for a widget to show when ready - things like that can easily use another thread, but not the actual widgets.
A better design would be to create each widget only when you need it. Or just have one widget and update the data it displays (unless the widgets for different items are actually different).
If you are worried about performance when scrolling, one solution is to switch the widget only after the current item has been selected for some minimum time (say, 500 ms if the user keeps scrolling, or 200 ms after the last scroll event - some logic like that). Note that this is not a delay in reacting to the user's selection, it is a delay between the selection and the visible widget change: if the user just moves to the next item, the GUI updates immediately, but while scrolling the widget is not rebuilt for every item briefly selected, so things don't slow down. Use a QTimer or two to emit a signal when the widget should be updated; see the sketch below.
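A hedged sketch of that debounce idea (the class, slot, and helper names are invented):
#include <QWidget>
#include <QTimer>

class ModelBrowser : public QWidget
{
    Q_OBJECT
public:
    explicit ModelBrowser(QWidget *parent = nullptr) : QWidget(parent)
    {
        m_showTimer.setSingleShot(true);
        m_showTimer.setInterval(200);   // ms to wait after the last selection change
        connect(&m_showTimer, &QTimer::timeout,
                this, &ModelBrowser::showPendingModel);
    }

public slots:
    void onModelSelected(int index)
    {
        m_pendingIndex = index;
        m_showTimer.start();   // restarting the timer debounces fast scrolling
    }

private slots:
    void showPendingModel()
    {
        // Only now build (or update) the expensive widget for the selected model.
        buildWidgetForModel(m_pendingIndex);
    }

private:
    void buildWidgetForModel(int index);   // stands in for the heavy construction
    QTimer m_showTimer;
    int m_pendingIndex = -1;
};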

Navigation/View flow with MVC in the context of a C++ desktop application

Let's imagine you have a fullscreen C++ desktop application that consists of several screens with each of them having a distinct function and a ViewController with appropriate models as well. For example a set of the following very simplified screens:
Quiz: The user is navigated through a set of multiple-choice questions.
Quiz Results with Statistics.
Information: The user is presented with information about a specific subject.
Menu (Quiz, Information, Exit)
Judging by the GRASP principle Information Expert, each ViewController knows best when it is finished and it is time to move to a new screen. Yet by the same principle, it is not the right place to decide what the next screen should actually be. In this rather simple example one could argue it would be okay, but in a more complex application it will undoubtedly lead to duplicated code and logic as well as higher coupling and lower cohesion. There is also the problem that you would have to create the new widget and controller within the current screen's ViewController, which brings all sorts of new problems, and at least by the Creator principle it is not the right choice. You would have to introduce a Factory to alleviate some of the problems, amongst other things.
So, the next logical step is to introduce an ApplicationController with the sole responsibility of managing Views and their controllers including the navigation flow from one view to the next.
This still leaves one problem wide open in my opinion: how do you signal the ApplicationController that it is time to move to a different screen and properly hand over control to that object?
One could use the Observer pattern, for example. Yet what if you have an expensive View active at the moment and want it destroyed once the new screen is active? If the current ViewController signals the ApplicationController that the next screen should go up, the ApplicationController can manage everything up to the point where it would destroy the currently active screen - which it cannot do, because the current call comes from exactly that object. And there are several other problems with that approach.
So my question is (and sorry for the verbose introduction :P): how do you properly implement a navigation flow from one fullscreen widget to another with MVC, in a way that solves the above problems, splits the responsibility between the View- and ApplicationController, and is nicely object-oriented in terms of coupling and cohesion?
Sometimes you miss one detail in your thought process and you open up a whole can of problems without even realizing the mistake you made.
In this case, the detail was that you can naturally post asynchronous as well as synchronous events. If you have to make sure that you are no longer in the context of the event-posting method, post an asynchronous event. Once you receive that event in your handler, you can be sure that the context has been left, and you can, for example, safely delete the object if you so desire. Naturally, the event handler should not belong to the same object that you are trying to delete.
For completeness: in Qt, you can specify for each signal/slot connection you make with connect() that it should be of type Qt::QueuedConnection. If you emit a signal, it won't be delivered until control returns to the thread's event loop. Normally Qt::AutoConnection is used, which delivers a signal at the time it is emitted (Qt::DirectConnection) if the receiver is in the same thread, or falls back to queuing the signal (Qt::QueuedConnection) if the receiver is in a different thread.
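In the navigation example above, that might look roughly like this (the controller and signal names are purely illustrative):
// The queued connection guarantees the slot runs from the event loop,
// never from inside the ViewController's own call stack.
QObject::connect(quizController, &QuizController::finished,
                 appController, &ApplicationController::showResults,
                 Qt::QueuedConnection);

// ApplicationController::showResults() can then safely tear down the sender:
//     currentView->deleteLater();   // deferred deletion, also handled by the event loop
//     currentView = createResultsView();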
In wxWidgets you can queue events with wxEvtHandler::QueueEvent(wxEvent* event), which is available through the Application singleton, for example.