How does View get updated behind the scenes? - c++

So when I use a setText() on a QLabel for example, Qt automatically updates the view/gui for me and the new text is shown, but what happens behind the scenes? Is there an update function that gets called automatically when using functions like setText()?
Thanks!!

You should check the basic documentation in this link.
The internal system is a bit more complex, but in general it follows the observer pattern. This mechanism allows detecting a user action or a change of state and responding to it.
Low-level interactions, like refreshing the screen, are implemented via the Event System:
In Qt, events are objects, derived from the abstract QEvent class, that represent things that have happened either within an application or as a result of outside activity that the application needs to know about. Events can be received and handled by any instance of a QObject subclass, but they are especially relevant to widgets. This document describes how events are delivered and handled in a typical application.
So, regarding the display process, there is a dedicated event. A QWidget object handles/subscribes to paint events; see QWidget::paintEvent:
This event handler can be reimplemented in a subclass to receive paint events passed in event. A paint event is a request to repaint all or part of a widget.
When you call QLineEdit::setText(), the widget will be repainted the next time a paint event is triggered, depending on the OS configuration, refresh rate, etc.
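For instance, a custom widget might reimplement paintEvent() like this (a minimal sketch; the class and member names are hypothetical):
#include <QPainter>
#include <QPaintEvent>
#include <QWidget>

class BadgeWidget : public QWidget
{
public:
    void setText(const QString &text)
    {
        m_text = text;
        update();   // schedules a repaint; a QPaintEvent arrives later via the event loop
    }

protected:
    void paintEvent(QPaintEvent *) override
    {
        QPainter painter(this);                              // paint onto this widget
        painter.drawText(rect(), Qt::AlignCenter, m_text);   // draw the current text
    }

private:
    QString m_text;
};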
For high-level interactions, Qt uses a similar pattern based on the signal/slot mechanism:
Observer pattern is used everywhere in GUI applications and often leads to some boilerplate code. Qt was created with the idea of removing this boilerplate code and providing a nice and clean syntax, and the signal and slots mechanism is the answer.
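For example, a typical connection that keeps a label in sync with user input might look like this (the widget pointer names are illustrative):
// Whenever the line edit's text changes, QLabel::setText() runs and the label
// schedules a repaint of itself.
connect(lineEdit, &QLineEdit::textChanged, label, &QLabel::setText);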

Related

What is the correct way to display widgets without calling QApplication::exec()?

For test purposes I'd like to create and display a widget. For now I only need the widget to render correctly, but in the future I may want to extend this so that I can simulate various events and see how the widget behaves.
From various sources it would appear that the following should work:
QApplication app(argc, argv);
QPushButton button("Hello");
button.show();
// Might also be necessary:
QApplication::processEvents();
But for me the widget does not render correctly. A window is created to display the widget, however it is entirely black.
I can get the widget to render correctly by adding the following lines:
std::this_thread::sleep_for(std::chrono::milliseconds(10));
QApplication::processEvents();
With 10 milliseconds being about the smallest time necessary to get the widget to render correctly.
Does anyone know how to get this to work without the time delay, or know why the delay is necessary?
To test a Qt GUI application you need at least a QApplication instance and the event loop being processed. The fastest way is to just use the QTEST_MAIN macro; this answer explains nicely what it does exactly. However, to have more flexibility (e.g. to use GTest or GMock) you can also simply create the QApplication instance in your test's main() (there is no need to call exec()).
Then, to have the events processed, you should invoke QTest::qWait. This will process your events for the specified amount of time. It is good practice to use qWaitFor, which accepts a predicate - this way you avoid race conditions in your tests.
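For illustration, a minimal sketch of the qWaitFor pattern (the widget and the condition are placeholders for whatever your test is waiting for):
// Processes events until the predicate returns true, or gives up after 5 seconds.
const bool ready = QTest::qWaitFor([&]() {
    return button.windowHandle() && button.windowHandle()->isExposed();
}, 5000);
ASSERT_TRUE(ready);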
In the particular scenario where you expect some signal to be emitted, you can also use the similar functionality of QSignalSpy::wait.
A small example where we want to wait until some parameters are passed from one item to another:
QSignalSpy spy(&widget1, &Widget1::settingsChanged);
widget2->applySettings();
ASSERT_TRUE(spy.wait(5000));
// do some further tests based on the content of passed settings
Why don't you want to have the application run exec()? The process of displaying a widget is not "static". You don't "draw" the widget on the screen; rather, you have an application that listens for various events and receives draw events from the windowing manager. The application can only draw the widget when the windowing manager asks it to.
The reason your second snippet works is that you wait long enough for the windowing manager to have sent the "draw" request under your conditions. This does not guarantee it will always work.
If you want to guarantee the display of the widget, you need to start a loop and wait until you have received at least one draw event, but even that isn't foolproof.
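For what it's worth, if the Qt Test module is available, QTest::qWaitForWindowExposed runs such a loop for you and returns once the windowing system has exposed the window (a sketch):
QPushButton button("Hello");
button.show();
// Processes events until the window has been exposed, or times out after 5 seconds.
const bool exposed = QTest::qWaitForWindowExposed(&button);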
As expertly described by Vincent Fourmond, widgets are not a one-off deal. The GUI is non-blocking and for this, it needs to run in an event loop.
The exec() method starts this event loop, which you mimicked by polling.
While it is possible to combine Qt's event loop with other event loops, I would recommend a simpler solution:
Continue your program from within the event loop by calling a method once it starts. Find an excellent answer here on how to do this: https://stackoverflow.com/a/8877968/21974
As you mentioned unit testing, there is also a signal you can use for doing checks at the end of the lifecycle (before widgets are destroyed): QApplication::aboutToQuit. This will happen when the last window is closed (programmatically or by the user).
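A minimal sketch of running code right after the event loop starts (the body of the lambda is illustrative):
#include <QApplication>
#include <QPushButton>
#include <QTimer>

int main(int argc, char *argv[])
{
    QApplication app(argc, argv);

    QPushButton button("Hello");
    button.show();

    // Queued by the event loop and executed once it is running.
    QTimer::singleShot(0, [&]() {
        // ... drive the widget / run checks here ...
        app.quit();   // leave the event loop when done
    });

    return app.exec();
}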

Is there an equivalent to MFCs OnUpdate in Qt?

I have an existing application that uses MFC for the UI, and I'm trying to migrate to Qt. For the most part the migration is straightforward, but I'm not sure how to manage the enabled state of actions (menu and toolbar items).
In MFC you implement a callback with enable/disable logic, and this is called when the item is displayed. In Qt, you only have access to the setEnabled() method.
Is there a built-in or standardized way of connecting an update callback to an action? Or do I need to create my own solution using timers and registering actions with it? In a large application such as the one I'm working with, the 'should enable' logic can jump all over the place - e.g. certain files on disk must exist, the main display must have a selection, the application's ProcessManager::isProcessing() must be false, etc. It doesn't seem practical to rely on setEnabled() being called on specific actions when there are so many conditions behind the enable/disable logic.
The most "standard" Qt way would be the use of signals/slots.
In my MDI apps, which are based on the Qt MainWindow/MDI examples, I just connect a single "updateMenus()" function to the signal emitted whenever an MDI subwindow is shown or hidden.
Now that may not be enough granularity for your application. So what you could do is - still have a single "updateMenus()" method - but connect it to each menu's "aboutToShow()/aboutToHide()" signals.
That way you keep the logic from sprawling all over the place, and only update menus right when they are needed (like in MFC's OnCmdUI()).
Here's my mainwindow constructor:
mp_mdiArea = new QMdiArea();
setCentralWidget(mp_mdiArea);
connect(mp_mdiArea, SIGNAL(subWindowActivated(QMdiSubWindow*)), this, SLOT(updateMenus()));
And here's my updateMenus():
void MainWindow::updateMenus()
{
    bool hasMdiChild = (activeMdiChild() != nullptr);

    mp_actionSave->setEnabled(hasMdiChild);
    mp_actionSaveAs->setEnabled(hasMdiChild);
    mp_actionClose->setEnabled(hasMdiChild);
}
See the Qt 4.8 documentation for QMenu's aboutToShow()/aboutToHide() signals here
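For completeness, the per-menu variant suggested above would be wired up roughly like this (the menu pointer names are illustrative):
// Re-evaluate the enabled states right before a menu becomes visible,
// similar in spirit to MFC's update-UI handlers.
connect(mp_menuFile, SIGNAL(aboutToShow()), this, SLOT(updateMenus()));
connect(mp_menuEdit, SIGNAL(aboutToShow()), this, SLOT(updateMenus()));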

Is it possible to get information on the control on which an event was issued in Qt?

I am trying to create one event handler for button clicks and connect that to multiple buttons (creating a simple calculator where pressing each number adds its text to the lineEdit).
In C# we would use the sender object which was passed as a parameter and then cast it back to Button and get its Text or other needed property and go on.
I am new to Qt. Do we have such a thing or a similar approach in Qt? I couldn't figure it out from Qt's signal/slot mechanism.
On the QObject / QWidget that receives the signal, call this->sender() (QObject::sender()) and cast it with dynamic_cast<MyWidgetType*>(...).
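For the calculator case that could look roughly like this (a sketch; Calculator, digitClicked() and ui->lineEdit are hypothetical names, and qobject_cast is an alternative to dynamic_cast for QObject-derived types):
void Calculator::digitClicked()
{
    // sender() is the object whose signal triggered this slot.
    QPushButton *button = qobject_cast<QPushButton *>(sender());
    if (button)
        ui->lineEdit->setText(ui->lineEdit->text() + button->text());
}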
You can find some good examples here for linking back to the issuer of an event.
http://doc.qt.digia.com/qq/qq10-signalmapper.html
They give you different examples of:
The sender() Approach (like Jamin Grey's approach above)
The Subclass Approach
The Signal Mapper Approach

Events pool in qt

I'm working on a Qt app, and at some point I have a class (I'll call it "engine" here) that governs the program: it has a timeout signal that makes the class draw and advance the app logic. Moreover, it receives events that are caught from a QGraphicsScene.
On each engine "tick", update() is called on the scene, updating the drawing according to the app's evolution.
Naturally, I want the drawing to be synchronized with the reactions to the events; otherwise, some object could be drawn while the reaction to an event is destroying that same object, causing a segfault.
I've tried using a queue on the engine so that it would only react to those events at a specific point of an update, thus not interfering with the drawing part.
Two problems arose:
I cannot make a copy of a QGraphicsSceneEvent. Apparently the copy operator is private (which I assume is for a good reason anyway).
While the class is processing the events, before drawing, a new event can also arrive, which can be "bad" because it is again not synchronized.
Taking this situation into account, is there any approach that can solve it? Is there a standard procedure in Qt for this? I mean, how do I ensure the drawing is not potentially desynchronized with the application's reactions to events?

Qt events and signal/slots

In the Qt world, what is the difference of events and signal/slots?
Does one replace the other? Are events an abstraction of signal/slots?
In Qt, signals and events are both implementations of the Observer pattern. They are used in different situations because they have different strengths and weaknesses.
First of all, let's define what we mean by 'Qt event' exactly: a virtual function in a Qt class, which you're expected to reimplement in a subclass of yours if you want to handle the event. It's related to the Template Method pattern.
Note how I used the word "handle". Indeed, here's a basic difference between the intent of signals and events:
You "handle" events
You "get notified of" signal emissions
The difference is that when you "handle" the event, you take on the responsibility to "respond" with a behavior that is useful outside the class. For example, consider an app that has a button with a number on it. The app needs to let the user focus the button and change the number by pressing the "up" and "down" keyboard keys. Otherwise the button should function like a normal QPushButton (it can be clicked, etc). In Qt this is done by creating your own little reusable "component" (subclass of QPushButton), which reimplements QWidget::keyPressEvent. Pseudocode:
class NumericButton extends QPushButton

    private void addToNumber(int value):
        // ...

    reimplement base.keyPressEvent(QKeyEvent event):
        if (event.key == up)
            this.addToNumber(1)
        else if (event.key == down)
            this.addToNumber(-1)
        else
            base.keyPressEvent(event)
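For concreteness, the same idea as a C++ sketch (no new signals or slots are introduced, so no Q_OBJECT macro is needed):
#include <QKeyEvent>
#include <QPushButton>

class NumericButton : public QPushButton
{
public:
    using QPushButton::QPushButton;   // reuse QPushButton's constructors

protected:
    void keyPressEvent(QKeyEvent *event) override
    {
        if (event->key() == Qt::Key_Up)
            addToNumber(1);
        else if (event->key() == Qt::Key_Down)
            addToNumber(-1);
        else
            QPushButton::keyPressEvent(event);   // leave everything else to the base class
    }

private:
    void addToNumber(int value)
    {
        setText(QString::number(text().toInt() + value));
    }
};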
See? This code presents a new abstraction: a widget that acts like a button, but with some extra functionality. We added this functionality very conveniently:
Since we reimplemented a virtual, our implementation automatically became encapsulated in our class. If Qt's designers had made keyPressEvent a signal, we would need to decide whether to inherit QPushButton or just externally connect to the signal. But that would be stupid, since in Qt you're always expected to inherit when writing a widget with a custom behavior (for good reason - reusability/modularity). So by making keyPressEvent an event, they convey their intent that keyPressEvent is just a basic building block of functionality. If it were a signal, it'd look like a user-facing thing, when it's not intended to be.
Since the base-class-implementation of the function is available, we easily implement the Chain-of-responsibility pattern by handling our special cases (up&down keys) and leaving the rest to the base class. You can see this would be nearly impossible if keyPressEvent were a signal.
The design of Qt is well thought out - they made us fall into the pit of success by making it easy to do the right thing and hard to do the wrong thing (by making keyPressEvent an event).
On the other hand, consider the simplest usage of QPushButton - just instantiating it and getting notified when it's clicked:
button = new QPushButton(this);
connect(button, SIGNAL(clicked()), SLOT(sayHello()));
This is clearly meant to be done by the user of the class:
if we had to subclass QPushButton every time we want some button to notify us of a click, that would require a lot of subclasses for no good reason! A widget that always shows a "Hello world" messagebox when clicked is useful only in a single case - so it's totally not reusable. Again, we have no choice but to do the right thing - by connecting to it externally.
we may want to connect several slots to clicked() - or connect several signals to sayHello(). With signals there is no fuss. With subclassing you would have to sit down and ponder some class diagrams until you decide on an appropriate design.
Note that one of the places QPushButton emits clicked() is in its mousePressEvent() implementation. That doesn't mean clicked() and mousePressEvent() are interchangeable - just that they're related.
So signals and events have different purposes (but are related in that both let you "subscribe" to a notification of something happening).
I don’t like the answers so far. – Let me concentrate on this part of the question:
Are events an abstraction of signal/slots?
Short answer: no. The long answer raises a “better” question: How are signals and events related?
An idle main loop (Qt’s for example) is usually “stuck” in a select() call of the operating system. That call makes the application “sleep”, while it passes a bunch of sockets or files or whatever to the kernel asking for: if something changes on these, let the select() call return. – And the kernel, as the master of the world, knows when that happens.
The result of that select() call could be: new data on the socket connected to X11, a packet to a UDP port we listen on came in, etc. – That stuff is neither a Qt signal nor a Qt event, and the Qt main loop decides itself whether it turns the fresh data into the one, the other, or ignores it.
Qt could call a method (or several) like keyPressEvent(), effectively turning it into a Qt event. Or Qt emits a signal, which in effect looks up all functions registered for that signal, and calls them one after the other.
One difference of those two concepts is visible here: a slot has no vote on whether other slots registered to that signal will get called or not. – Events are more like a chain, and the event handler decides if it interrupts that chain or not. Signals look like a star or tree in this respect.
An event can trigger or be entirely turned into a signal (just emit one, and don’t call “super()”). A signal can be turned into an event (call an event handler).
What abstracts what depends on the case: the clicked()-signal abstracts mouse events (a button goes down and up again without too much moving around). Keyboard events are abstractions from lower levels (things like 果 or é are several key strokes on my system).
Maybe the focusInEvent() is an example of the opposite: it could use (and thus abstract) the clicked() signal, but I don’t know if it actually does.
The Qt documentation probably explains it best:
In Qt, events are objects, derived from the abstract QEvent class, that represent things that have happened either within an application or as a result of outside activity that the application needs to know about. Events can be received and handled by any instance of a QObject subclass, but they are especially relevant to widgets. This document describes how events are delivered and handled in a typical application.
So events and signal/slots are two parallel mechanisms accomplishing the same things. In general, an event will be generated by an outside entity (for example, keyboard or mouse wheel) and will be delivered through the event loop in QApplication. In general, unless you set up the code, you will not be generating events. You might filter them through QObject::installEventFilter() or handle events in a subclassed object by overriding the appropriate functions.
Signals and slots are much easier to generate and receive, and you can connect any two QObject subclasses. They are handled through the meta-object system (have a look at your moc_classname.cpp file for more), but most of the inter-class communication that you produce will probably use signals and slots. Signals can be delivered immediately or deferred via a queue (if you are using threads).
A signal can be generated.
Events are dispatched by the event loop. Every GUI program needs an event loop, whether you write it for Windows or Linux, using Qt, Win32 or any other GUI library. Each thread also has its own event loop. In Qt, the "GUI event loop" (which is the main loop of all Qt applications) is hidden, but you start it by calling:
QApplication a(argc, argv);
return a.exec();
Messages that the OS and other applications send to your program are dispatched as events.
Signals and slots are Qt mechanisms. During compilation, moc (the meta-object compiler) turns them into callback functions.
An event should have one receiver, which should dispatch it; no one else should get that event.
All slots connected to an emitted signal, on the other hand, will be executed.
You shouldn't think of Signals as events, because as you can read in the Qt documentation:
When a signal is emitted, the slots connected to it are usually executed immediately, just like a normal function call. When this happens, the signals and slots mechanism is totally independent of any GUI event loop.
When you post an event, it must wait until the event loop has dispatched all the events that came in earlier. Because of this, the execution of the code that follows sending an event or emitting a signal differs. Code following the posting of an event runs immediately; with the signals and slots mechanism it depends on the connection type. Normally the following code runs only after all the slots have executed; with Qt::QueuedConnection, it runs immediately, just like with events. Check all the connection types in the Qt documentation.
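For illustration, here is a sketch of choosing the connection type explicitly (Sender, Receiver and their members are hypothetical names):
// Direct connection (the default for same-thread objects): the slot runs
// immediately, before emit returns.
connect(sender, &Sender::valueChanged, receiver, &Receiver::onValueChanged,
        Qt::DirectConnection);

// Queued connection: the call is wrapped in an event, the code after emit
// continues immediately, and the slot runs later from the receiver's event loop.
connect(sender, &Sender::valueChanged, receiver, &Receiver::onValueChanged,
        Qt::QueuedConnection);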
There is an article that discusses event processing in some detail: http://www.packtpub.com/article/events-and-signals
It discusses the difference between events and signals here:
Events and signals are two parallel mechanisms used to accomplish the same thing. As a general difference, signals are useful when using a widget, whereas events are useful when implementing the widget. For example, when we are using a widget like QPushButton, we are more interested in its clicked() signal than in the low-level mouse press or key press events that caused the signal to be emitted. But if we are implementing the QPushButton class, we are more interested in the implementation of code for mouse and key events. Also, we usually handle events but get notified by signal emissions.
This seems to be a common way of talking about it, as the accepted answer uses some of the same phrases.
Note: please see the helpful comments below on this answer from Kuba Ober, which make me wonder if it might be a bit simplistic.
TL;DR: Signals and slots are indirect method calls. Events are data structures. So they are quite different animals.
The only time when they come together is when slot calls are made across thread boundaries. The slot call arguments are packed up in a data structure and get sent as an event to the receiving thread's event queue. In the receiving thread, the QObject::event method unpacks the arguments, executes the call, and possibly returns the result if it was a blocking connection.
If we're willing to generalize to oblivion, one could think of events as a way of invoking the target object's event() method. This is an indirect method call, after a fashion - but I don't think it's a helpful way of thinking about it, even if it's a true statement.
'Event processing' by Leow Wee Kheng says:
Jasmine Blanchette says:
The main reason why you would use events rather than standard function calls, or signals and slots, is that events can be used both synchronously and asynchronously (depending on whether you call sendEvent() or postEvent()), whereas calling a function or invoking a slot is always synchronous. Another advantage of events is that they can be filtered.
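For reference, a sketch of the two delivery modes mentioned there (targetWidget is a hypothetical receiver):
// Synchronous: the event is handled before sendEvent() returns.
QKeyEvent press(QEvent::KeyPress, Qt::Key_A, Qt::NoModifier, "a");
QCoreApplication::sendEvent(targetWidget, &press);

// Asynchronous: the event is heap-allocated, put on the queue (the event loop
// takes ownership) and handled later.
QCoreApplication::postEvent(targetWidget,
        new QKeyEvent(QEvent::KeyPress, Qt::Key_A, Qt::NoModifier, "a"));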
Events (in a general sense of user/network interaction) are typically handled in Qt with signals/slots, but signals/slots can do plenty of other things.
QEvent and its subclasses are basically just little standardized data packages for the framework to communicate with your code. If you want to pay attention to the mouse in some way, you only have to look at the QMouseEvent API, and the library designers don't have to reinvent the wheel every time you need to figure out what the mouse did in some corner of the Qt API.
It is true that if you're waiting for events (again in the general case) of some sort, your slot will almost certainly accept a QEvent subclass as an argument.
With that said, signals and slots can certainly be used without QEvents, although you'll find that the original impetus for activating a signal will often be some kind of user interaction or other asynchronous activity. Sometimes, however, your code will just reach a point where firing off a certain signal will be the right thing to do. For example, firing off a signal connected to a progress bar during a long process doesn't involve a QEvent up to that point.
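A sketch of that progress-bar case (Worker and its signal are illustrative names):
class Worker : public QObject
{
    Q_OBJECT
signals:
    void progress(int percent);

public:
    void doLongJob()
    {
        for (int i = 0; i <= 100; ++i) {
            // ... one step of the long-running work ...
            emit progress(i);   // no QEvent is involved at this point
        }
    }
};

// Elsewhere, e.g. in the window that owns the progress bar:
// connect(&worker, &Worker::progress, progressBar, &QProgressBar::setValue);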
Another minor pragmatic consideration: emitting or receiving signals requires inheriting QObject, whereas an object of any inheritance can post or send an event (since you invoke QCoreApplication::sendEvent() or postEvent()). This is usually not an issue, but: to use signals, PyQt strangely requires QObject to be the first superclass, and you might not want to rearrange your inheritance order just to be able to send signals.
In my opinion events are completely redundant and could be thrown out. There is no reason why signals could not be replaced by events or events by signals, except that Qt is already set up as it is. Queued signals are wrapped by events and events could conceivably be wrapped by signals, for example:
connect(this, &MyItem::mouseMove, [this](QMouseEvent*){});
Would replace the convenience mouseMoveEvent() function found in QWidget (but not in QQuickItem anymore) and would handle mouseMove signals that a scene manager would emit for the item. The fact that the signal is emitted on behalf of the item by some outside entity is unimportant and happens quite often in the world of Qt components, even though it is supposedly not allowed (Qt components often circumvent this rule). But Qt is a conglomerate of many different design decisions and pretty much cast in stone for fear of breaking old code (which happens often enough anyway).