Enable antialiasing for QPushButton - C++

I have a QPushButton that I am styling with a border-image through a stylesheet. However, the image quality is poor because the image isn't being drawn using antialiasing.
Is there any simple way to enable antialiasing without subclassing QPushButton and giving it a new painter? I don't really want to do that because I'm using the style sheet extensively and would have to create a bunch of QProperties to emulate the existing pseudo-state functionality.

You can normally set the antialiasing flag on a QPainter object using the setRenderHints() method. This is normally done by subclassing the widget and overriding its paintEvent().
According to the Qt Docs for QPainter:
... When the paintdevice is a widget, QPainter can only be used inside a paintEvent() function or in a function called by paintEvent() ...
Without subclassing, you will be limited to intercepting the paint event using an event filter and setting the flag yourself.
You will need to create a class that overrides the event handler for your object. This class will be installed using QObject::installEventFilter. It will need to filter the events to handle the specific ones that you care about (QPaintEvent). Then it will need to create a QPainter object that takes the originating object (using the second constructor) as its device, as shown in this qtforum post. This works because QWidget inherits from QPaintDevice.
...
void myView::handlePaintEvent(QObject *obj, QEvent *eve)
{
    // Construct the painter directly on the widget that is being painted.
    QPainter painter(static_cast<QWidget *>(obj));
}
...
From here you should be able to set whatever render hints you need.
This same event filter class can be installed on numerous objects so the same functionality can be added very quickly and without subclassing any other widget.
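A minimal sketch of such a filter class (the class name and the exact render hints chosen are just illustrative); it intercepts paint events and opens a painter on the intercepted widget with antialiasing enabled:

#include <QEvent>
#include <QObject>
#include <QPainter>
#include <QWidget>

// Illustrative filter: install it on any widget whose paint events you want
// to intercept so that a painter with antialiasing enabled is created for it.
class AntialiasFilter : public QObject
{
public:
    explicit AntialiasFilter(QObject *parent = nullptr) : QObject(parent) {}

protected:
    bool eventFilter(QObject *obj, QEvent *event) override
    {
        if (event->type() == QEvent::Paint) {
            // QWidget inherits QPaintDevice, so it can back a QPainter directly.
            QPainter painter(static_cast<QWidget *>(obj));
            painter.setRenderHint(QPainter::Antialiasing, true);
            painter.setRenderHint(QPainter::SmoothPixmapTransform, true);
        }
        return false; // let the widget continue with its normal painting
    }
};

// Usage (hypothetical):
//   button->installEventFilter(new AntialiasFilter(button));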

Related

Qt how to redirect widget painting to parent widget?

I'm creating some custom Qt Designer widget plugins for drawing purposes. With these widgets, the user can use Qt Designer for drawing, much like Microsoft Visio (hopefully).
As shown in the screenshot, there is one SvPage object, page_0, acting as the container; it contains one SvArc widget and one SvCircle widget.
Everything is good, except that when one widget (A) covers another widget (B), the user cannot select widget B easily.
To solve this problem, I'm trying to:
1. Set the size of each drawing widget (e.g. SvArc, SvCircle) to something very small (40 px * 40 px).
2. Paint the content of each drawing widget directly onto its parent widget (SvPage). In SvPage::paintEvent(QPaintEvent *event), it iterates over all child drawing widgets and calls the doPaint(QPainter *painter) method of each child.
3. To refresh the drawing automatically (e.g. when the SvArc widget is moved, its drawing on SvPage should be updated automatically), the drawing widget's SvArc::paintEvent(QPaintEvent *event) triggers SvPage to update its painting.
But step 3 leads to a recursive repaint issue:
SvArc::paintEvent() triggers SvPage::paintEvent(), and SvPage::paintEvent() then triggers SvArc::paintEvent() again, since the SvArc widget is a child of the SvPage widget.
So the question is: is it a good idea to redirect widget painting to the parent widget? If yes, how do I solve the recursive repaint issue? If no, what is a better approach?
Code (simplified):
void SvPage::paintEvent(QPaintEvent *event)
{
    initPainter();
    QList<SvWidget*> widgets = this->findChildren<SvWidget*>();
    for (int i = 0; i < widgets.count(); i++)
    {
        SvWidget* w = widgets.at(i);
        w->doPaint(this->painter);
    }
    destroyPainter();
}

void SvWidget::paintEvent(QPaintEvent *event)
{
    Q_UNUSED(event);
    emit signalDoPaint();
}

void SvArc::doPaint(QPainter* painter)
{
    painter->drawArc(x, y, w, h, a, alen);
}
You are messing things up here.
Every widget should be responsible for its own drawing. That's how Qt is designed to work.
You could use a single widget as a manager for some objects and draw them there, but then those objects don't need to be widgets; they can be simple data representations. In that case, the objects will not concern themselves with any painting; the manager widget does it.
However, that approach will be less efficient, because when you have multiple independent widgets, the paint engine can easily detect changes and efficiently repaint only the parts that need updating.
In your case you will either have to do a whole lot of redundant repainting, or implement more sophisticated item management, which will be a complex task that will definitely not be worth the effort, if you are even up to it to begin with, which you are probably not.
Your current approach is very bad. I'd suggest just sticking to regular widgets, in their actual sizes, doing their actual painting. It will be much easier for you to implement and manage, and it will be much easier for the computer to paint.
As for selecting between overlapping widgets, QWidget wasn't really designed to facilitate that. Widgets are supposed to be put in layouts, not to overlap, which is why QWidget::childAt() can only return a single widget at a given coordinate.
What you should really do is use QGraphicsScene, QGraphicsView and QGraphicsItem. Similarly to widgets, graphics items handle their own drawing efficiently; the difference is that the API was designed for graphics, and when you have overlapping items, QGraphicsScene::items() will give you a list of all items at that position, so you can choose an item other than the topmost.
I'm creating some custom Qt Designer widget plugins for drawing purposes. With these widgets, the user can use Qt Designer for drawing just like Microsoft Visio (hopefully).
The functionality you're reusing in Qt Designer is minimal, and could be easily factored out into a separate project. The only thing of any value for you is the property inspector pane.
For everything else, using widgets is about the most complicated way of implementing it. Use QGraphicsScene and QGraphicsView and start with 90% of your functionality already implemented and ready to go.
Implementing a rudimentary vector illustration system in QGraphicsScene is an afternoon job. You can have something with the functionality of early Corel Draw from the Windows 2.x era done in a few days. It's a testament to the power of the scene framework and of modern development frameworks in general.
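To make that concrete, here is a minimal, self-contained sketch of the QGraphicsScene route (all names are illustrative; plain QGraphicsEllipseItems stand in for things like SvArc/SvCircle). Overlapping items remain individually reachable because QGraphicsScene::items() reports every item under a point:

#include <QApplication>
#include <QBrush>
#include <QGraphicsEllipseItem>
#include <QGraphicsScene>
#include <QGraphicsView>
#include <QPainter>
#include <QPen>

int main(int argc, char *argv[])
{
    QApplication app(argc, argv);

    QGraphicsScene scene(0, 0, 400, 300);

    // Two overlapping items; each paints itself and the scene keeps track of them.
    QGraphicsEllipseItem *bottomItem = scene.addEllipse(50, 50, 120, 120,
                                                        QPen(Qt::black), QBrush(Qt::yellow));
    QGraphicsEllipseItem *topItem = scene.addEllipse(90, 90, 40, 40,
                                                     QPen(Qt::black), QBrush(Qt::red));
    bottomItem->setFlag(QGraphicsItem::ItemIsMovable);
    bottomItem->setFlag(QGraphicsItem::ItemIsSelectable);
    topItem->setFlag(QGraphicsItem::ItemIsMovable);
    topItem->setFlag(QGraphicsItem::ItemIsSelectable);

    // All items under a point, topmost first, so even a covered item can be picked:
    // QList<QGraphicsItem *> under = scene.items(QPointF(100, 100));

    QGraphicsView view(&scene);
    view.setRenderHint(QPainter::Antialiasing);
    view.show();

    return app.exec();
}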

Qt, multiple inheritance, wrappers or event filters

I have a few different QGLWidget-based display widgets which I need to embed in either an MDI- or QDockWidget-based app, but I need to handle some of the MDI/dock-specific events (minimize, dock, etc.) in my display widget.
Options are:
1. Multiply inherit the display widgets from QGLWidget and QMdiSubWindow/QDockWidget. Any issues with multiple inheritance and signals/slots?
2. Encapsulate the display inside a QMdiSubWindow/QDockWidget-derived widget, but then I have to wrap all the display's external functions in the MDI/dock wrapper widget.
3. When I make a new window, create a temporary MDI/dock widget and connect all the special signals to slots in the display before attaching the display to it and showing it. But this doesn't work for events.
4. Some QSignalMapper magic where I can receive QMdiSubWindow/QDockWidget-specific signals in a QGLWidget.
MDI/dock widgets are containers for other widgets, so mixing their features with display widgets is not a very nice solution: you end up with a hideous hybrid widget that looks like a container but cannot contain anything. Not that Qt would allow it anyway, as noted by Jeremy.
If your QGLWidget needs events from its parent container (e.g. minimize, dock, etc.), why not create partner methods in the QGLWidget for them, and call them whenever the action is performed by the parent?
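A rough sketch of that partner-method idea, assuming the container is a QDockWidget (the display class and its slot are made-up names for illustration); the dock's existing signals are simply wired to ordinary slots on the display widget:

#include <QDockWidget>
#include <QGLWidget>
#include <QObject>

// Illustrative display widget exposing a "partner" slot for container events.
class GlDisplay : public QGLWidget
{
    Q_OBJECT
public slots:
    // Called when the containing dock is floated or re-docked.
    void containerFloatingChanged(bool floating)
    {
        Q_UNUSED(floating);
        // adjust rendering, pause animations, etc.
    }
};

// Wiring, done wherever the dock and the display are created:
//   QDockWidget *dock = new QDockWidget("Display");
//   GlDisplay *display = new GlDisplay;
//   dock->setWidget(display);
//   QObject::connect(dock, SIGNAL(topLevelChanged(bool)),
//                    display, SLOT(containerFloatingChanged(bool)));
// For QMdiSubWindow, its windowStateChanged(Qt::WindowStates, Qt::WindowStates)
// signal can be wired to a similar slot in the same way.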

How to Layer independent widgets in Qt?

I'm creating an application using Qt which consists of a widget that is used as the background of the application, and a user control interface that is floating above.
A similar example is google maps, where the map is on the background and the controls are on top of the background.
But the thing is that the background widget can be changed to a different widget (there's a widget that displays a map, another widget that displays a video feed, ...).
The same goes for the buttons in the user control interface: they are not directly related to the current background and can be changed dynamically.
I've tried using a QStackedLayout with two layers: the background widget and the user control interface. But you cannot interact with the background layer, because all the clicks are blocked by the widget in front.
Any suggestions?
You could place a filter on the event stream to your interface widgets using the QObject::installEventFilter() function, and intercept all the incoming mouse-click events. Once you have captured these events, use the filter function to delegate them to either the background widget, or deliver them to the front interface buttons. You would most likely have to use the (x,y) coordinates of the mouse-click to determine if an event should go to the background widget, or one of the foreground button widgets.
Another option is to create a derived class from QAbstractButton (or whatever QWidget you're using for your buttons), and re-implement the event functions for mouse-clicks on that widget (i.e., QAbstractButton::mousePressEvent(), etc.). When a mouse-click arrives, check to see if the mouse was over the button, and if it wasn't, send the event to the background widget via a signal or QCoreApplication::sendEvent().
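As a minimal sketch of the event-filter variant (the overlay/background widget names are hypothetical): a filter installed on the front interface widget forwards clicks that miss its interactive children down to the background widget (depending on your layout, the click position may still need mapping between the two widgets):

#include <QCoreApplication>
#include <QEvent>
#include <QMouseEvent>
#include <QObject>
#include <QWidget>

// Illustrative filter: installed on the front (overlay) widget; clicks that do
// not hit one of its child controls are handed to the background widget.
class ClickThroughFilter : public QObject
{
public:
    ClickThroughFilter(QWidget *overlay, QWidget *background)
        : QObject(overlay), m_overlay(overlay), m_background(background) {}

protected:
    bool eventFilter(QObject *watched, QEvent *event) override
    {
        if (watched == m_overlay && event->type() == QEvent::MouseButtonPress) {
            QMouseEvent *me = static_cast<QMouseEvent *>(event);
            if (!m_overlay->childAt(me->pos())) {
                // No interactive child under the cursor: delegate to the background.
                QMouseEvent forwarded(me->type(), me->pos(), me->button(),
                                      me->buttons(), me->modifiers());
                QCoreApplication::sendEvent(m_background, &forwarded);
                return true; // the overlay itself does not handle this click
            }
        }
        return QObject::eventFilter(watched, event);
    }

private:
    QWidget *m_overlay;
    QWidget *m_background;
};

// Usage (hypothetical):
//   overlay->installEventFilter(new ClickThroughFilter(overlay, background));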
Your question is too generic to give you a specific answer, but the most obvious solution is to implement classes that inherit from QWidget for each possible component of your system. In your example I can visualize two distinct components: Background and Controls. Background would store all the image data, like maps and videos, while Controls would have the buttons to interact with the system. You can even break the Background into different classes to manage image or video. I recommend using a central GUIController class that inherits from QObject to manage all the interface interactions, like connecting the signals/slots or implementing any animations; this way you can add/manage multiple widgets without going through different .cpp's.
EDIT: From your comment, it seems that your main problem is that your mouse events are not propagating to your widgets as you expected. Probably the reason for this is that you are not setting the parent/child relationships between the components. Make sure that you are calling the default QWidget constructor in your custom widget classes, like below:
CustomWidget(QWidget *parent = 0, Qt::WFlags flags = 0) : QWidget(parent, flags)
{
    // your code here
}
When creating the Controller class, set the right relationships between the components. In the context of your system, it seems to me that all components will be added as children of the Background, so it would look like this:
class Controller : public QObject
{
public:
    Controller(QObject *parent = 0) : QObject(parent)   // QObject takes no window flags
    {
        wdg_back_ = new BackWidget();          // top-level background widget
        wdg_control_ = new Controls(wdg_back_);
        wdg_1_ = new GenericWidget(wdg_back_);
        // connect your signals/slots, etc.
    }

private:
    BackWidget *wdg_back_;
    Controls *wdg_control_;
    GenericWidget *wdg_1_;
};
OK, I've finally found a solution for my issue.
My approach of using QStackedWidget was wrong; widgets on the background are not meant to be clickable, and even though it might be done, it's not what I was looking for.
In the end, this is what I've done:
QWidget *centralWidget = new QWidget(this);
setCentralWidget(centralWidget);
MapView *backgroundWidget = new MapView(centralWidget);
backgroundWidget->setMinimumSize(1024,600);
QGridLayout *controlsLayout = new QGridLayout(centralWidget);
MyControlWidget *control1 = new MyControlWidget(centralWidget);
control1->setMinimumSize(140,140);
control1->show();
controlsLayout->addWidget(control1,2,0);
So I create a QWidget, centralWidget, which will be the parent of both the background and the foreground. I set the background to full screen and organize the controls in a QGridLayout, which doesn't affect the backgroundWidget.
If I click on a control, the event is processed by this control, but clicking on an empty space will trigger a mouse event on the backgroundWidget, which is what I needed.
I'll test this for some time and if it works fine I'll close the question.

How to paint with QPainter only after a specific event?

I have a main window with some widgets on it, each needs its own graphic. I would like to use QPainter to draw shapes, lines, etc. on them, but only after a specific event, like the press of a button.
The problem is, if I just create a QPainter in any function, it won't work:
QPainter::setPen: Painter not active
The QPainter methods can only be called inside a paintEvent(QPaintEvent *) function! This raises the following problems:
1. I have to derive my own classes for all the widgets I would like to paint on, so I can't use the Designer to place my widgets. This can get frustrating with a large number of widgets.
2. The widgets redraw themselves after each paint event of the window, e.g. when moving it around or moving other windows in front of it. I do a lot of drawing in those widgets, so they will visibly blink in these cases.
Is there a better and simpler way to solve this? I started to think about just displaying images, and regenerating those images only when the specific buttons are pressed. I doubt that's the most elegant solution...
You can use custom widgets in the designer: Creating Custom Widgets for Qt Designer.
Qt Designer's plugin-based architecture allows user-defined and third party custom widgets to be edited just like you do with standard Qt widgets.
For your second problem, one approach is to create a QPixmap for each of your widgets. When a widget's appearance needs to change, you draw into that pixmap (using QPainter's constructor that takes a QPaintDevice; QPixmap is a QPaintDevice).
In your widget's paintEvent function, you simply fill your widget with that "cache" pixmap. This way, you only do the (potentially expensive) painting when it's actually necessary.
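A small sketch of that caching pattern (the widget and method names are illustrative): the expensive drawing happens only inside redrawCache(), which you call from the button's slot, while paintEvent() just blits the cached pixmap:

#include <QPaintEvent>
#include <QPainter>
#include <QPixmap>
#include <QWidget>

// Illustrative widget that repaints its pixmap cache only on demand.
class CachedWidget : public QWidget
{
public:
    explicit CachedWidget(QWidget *parent = nullptr) : QWidget(parent) {}

    // Call this from the button's slot; only here is the expensive drawing done.
    void redrawCache()
    {
        m_cache = QPixmap(size());
        m_cache.fill(Qt::white);
        QPainter p(&m_cache);                 // QPixmap is a QPaintDevice
        p.setRenderHint(QPainter::Antialiasing);
        p.drawEllipse(rect().adjusted(10, 10, -10, -10));   // ...the expensive drawing...
        update();                             // schedule a cheap repaint from the cache
    }

protected:
    void paintEvent(QPaintEvent *) override
    {
        QPainter p(this);
        if (!m_cache.isNull())
            p.drawPixmap(0, 0, m_cache);      // just copy the cached image onto the widget
    }

private:
    QPixmap m_cache;
};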

How does a Qt custom widget notify ScrollArea parent about change of view

I'm writing an image viewer as a custom Qt widget (see: https://github.com/dov/Qviv) and I am now stuck on the question of how to make my widget notify a parent QScrollArea of changes in the viewport, and thus tell it to move the scrollbars. E.g. if the image viewer changes the zoom factor as the result of a keypress, the scrollbars need to change their page size.
One way of doing it would be to have the widget explicitly check whether the parent is a QScrollArea and then call its methods explicitly to notify it of any changes.
Of course I also need to connect the changes of the scroll area to the internal view of the image, but that is a different question. And I need to cut the infinite recursion where the widget reports changes to the scrollbars, which report changes back to the widget, and so on.
Edit, 20:15 Wednesday (GMT/UTC), trying to clarify to Vjo and myself what I need:
What I am trying to achieve is the equivalent of a Gtk widget that has been assigned a pair of GtkAdjustments connected to a horizontal and a vertical scrollbar. In the GtkImageViewer widget that QvivImageViewer is based on, whenever I change the view due to some internal event (e.g. a keypress), I update the GtkAdjustments. The scrollbars are connected to these changes and are updated accordingly. GtkImageViewer also listens to the GtkAdjustment changes, so if the user scrolls the scrollbars, GtkImageViewer receives this information and can change its view. My question is whether there is anything similar to GtkAdjustment in Qt: an object whose changes you can listen to, and which you can update so that the update is propagated to all listeners.
Thus I don't expect the ScrollArea to be part of QvivImageViewer, but if the user has placed QvivImageViewer within a ScrollArea, I want bidirectional communication with it so that the scrollbars reflect the internal state of the widget.
The simplest approach is to send a QResizeEvent from your widget object to the QScrollArea object.
I finally downloaded the Qt sources and investigated how QTextEdit does it. What I found is that QTextEdit inherits from QAbstractScrollArea itself, and thus the scroll area and the scrollbars are part of the widget. This is different from Gtk, which uses a higher level of abstraction through its GtkAdjustments, which are used to signal changes between the scrollbars and the widget. The Qt model is simpler, and this is the way I will implement it in my widget.
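For the record, a rough sketch of that QTextEdit-style model (the class, the zoom handling and the sizes are all illustrative): the viewer derives from QAbstractScrollArea, updates the scroll bar ranges itself whenever its internal view changes, and paints on the viewport offset by the scroll bar values:

#include <QAbstractScrollArea>
#include <QPainter>
#include <QScrollBar>

// Illustrative viewer that owns its scroll bars, QTextEdit-style.
class ImageScrollViewer : public QAbstractScrollArea
{
public:
    void setZoom(double zoom)
    {
        m_zoom = zoom;
        // The content size changed: update the scroll bar ranges and page sizes.
        QSize content = m_imageSize * m_zoom;
        horizontalScrollBar()->setRange(0, qMax(0, content.width() - viewport()->width()));
        horizontalScrollBar()->setPageStep(viewport()->width());
        verticalScrollBar()->setRange(0, qMax(0, content.height() - viewport()->height()));
        verticalScrollBar()->setPageStep(viewport()->height());
        viewport()->update();
    }

protected:
    void paintEvent(QPaintEvent *) override
    {
        QPainter painter(viewport());
        // Offset the drawing by the current scroll position.
        painter.translate(-horizontalScrollBar()->value(),
                          -verticalScrollBar()->value());
        // ... draw the zoomed image here ...
    }

private:
    QSize m_imageSize{1000, 1000};   // placeholder for the real image size
    double m_zoom = 1.0;
};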
It's been a while, but I ran across this same issue.
You can inherit QAbstractScrollArea if you'd like, but QScrollArea will work as well.
Your custom inner widget (i.e. the one that you are scrolling) should do the following when its size changes:
void MyCustomControl::resize_me() {
    // recompute internal data such that sizeHint() returns the new size
    ...
    updateGeometry();
    adjustSize();
}

QSize MyCustomControl::sizeHint() const {   // note: must be const to override QWidget::sizeHint()
    return ... ; // Return my internally computed size.
}
I was missing the adjustSize() call, and without it the QScrollArea will ignore size changes of the internal widget.