I'm trying to write a QML plugin that reads frames from a video (using a custom widget for that task, NOT QtMultimedia/Phonon). Each frame is converted to a QImage in RGB888 format and then displayed on a QGLWidget (for performance reasons). Right now nothing is drawn to the screen; it just stays white the whole time.
It's important to state that I already have all of this working without QGLWidget, so I know the issue is setting up and drawing on QGLWidget.
The plugin is being registered with:
qmlRegisterType<Video>(uri,1,0,"Video");
so Video is the main class of the plugin. In its constructor we have:
Video::Video(QDeclarativeItem* parent)
    : QDeclarativeItem(parent), d_ptr(new VideoPrivate(this))
{
    setFlag(QGraphicsItem::ItemHasNoContents, false);
    Q_D(Video);
    QDeclarativeView* view = new QDeclarativeView;
    view->setViewport(&d->canvas()); // canvas() returns a reference to my custom OpenGL Widget
}
Before I jump to the canvas object, let me say that I overrode Video::paint() so it calls canvas.paint(), passing the QImage as a parameter. I don't know if this is the right way to do it, so I would like some advice on this:
void Video::paint(QPainter* painter, const QStyleOptionGraphicsItem* option, QWidget* widget)
{
    Q_UNUSED(widget);
    Q_D(Video);
    // I know for sure at this point "d->image()" is valid, but I'm hiding that code for clarity
    d->canvas().paint(painter, option, d->image());
}
The canvas object is declared as GLWidget canvas; and the header of this class is defined as:
class GLWidget : public QGLWidget
{
    Q_OBJECT
public:
    explicit GLWidget(QWidget* parent = NULL);
    ~GLWidget();

    void paint(QPainter* painter, const QStyleOptionGraphicsItem* option, QImage* image);
};
Seems pretty simple. Now, the implementation of GLWidget is the following:
GLWidget::GLWidget(QWidget* parent)
    : QGLWidget(QGLFormat(QGL::SampleBuffers), parent)
{
    // Should I do something here?
    // Maybe setAutoFillBackground(false); ???
}

GLWidget::~GLWidget()
{
}
And finally:
void GLWidget::paint(QPainter* painter, const QStyleOptionGraphicsItem* option, QImage* image)
{
    // I ignore the painter that comes from Video and create a new one:
    QPainter gl_painter(this);

    // Perform the drawing with Qt::KeepAspectRatio (_ar holds that flag)
    gl_painter.fillRect(QRectF(QPoint(0, 0), QSize(this->width(), this->height())), Qt::black);
    QImage scaled_img = image->scaled(QSize(this->width(), this->height()), _ar, Qt::FastTransformation);
    gl_painter.drawImage(qRound(this->width()/2) - qRound(scaled_img.size().width()/2),
                         qRound(this->height()/2) - qRound(scaled_img.size().height()/2),
                         scaled_img);
}
What am I missing?
I originally asked this question on Qt Forum but got no replies.
Solved. The problem was that I was trying to create a new GL context within my plugin, when I should have been retrieving the GL context from the application that loaded it.
This code was very helpful in understanding how to accomplish that.
By the way, I discovered that the content was actually being drawn inside view; I just needed to call view->show(). But that opened another window, which was not what I was looking for. The link I shared above has the answer.
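For anyone hitting the same issue, here is a minimal sketch of that application-side setup (a sketch, not my exact code; the QML path and names are illustrative). The application hands its QDeclarativeView a QGLWidget viewport, and the plugin just paints into the scene instead of creating its own view or GL context:
#include <QApplication>
#include <QDeclarativeView>
#include <QGLWidget>
#include <QUrl>

int main(int argc, char* argv[])
{
    QApplication app(argc, argv);

    QDeclarativeView view;
    // Give the view an OpenGL viewport: every item in the scene,
    // including the Video plugin item, is then painted through this GL context.
    view.setViewport(new QGLWidget(QGLFormat(QGL::SampleBuffers)));
    view.setSource(QUrl("qrc:/main.qml")); // illustrative path
    view.show();

    return app.exec();
}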
I think you have to draw on your GLWidget using OpenGL functions. One possible way is to override the paintGL() method in GLWidget and then draw your image with glDrawPixels():
glClear(GL_COLOR_BUFFER_BIT);
glDrawPixels(buffer.width(), buffer.height(), GL_RGBA, GL_UNSIGNED_BYTE, buffer.bits());
Where buffer is a QImage object that has been converted using the static QGLWidget::convertToGLFormat() method.
Look at this source for reference:
https://github.com/nayyden/ZVector/blob/master/src/GLDebugBufferWidget.cpp
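To make that concrete, here is a rough paintGL() sketch along those lines; m_frame is an assumed QImage member holding the current video frame, and the calls are the legacy fixed-function GL that QGLWidget supports:
void GLWidget::paintGL()
{
    glClear(GL_COLOR_BUFFER_BIT);

    if (m_frame.isNull()) // m_frame: assumed member holding the current frame
        return;

    // convertToGLFormat() flips the image vertically and converts it to GL_RGBA,
    // which is the layout glDrawPixels() expects.
    QImage buffer = QGLWidget::convertToGLFormat(m_frame);

    glRasterPos2i(-1, -1); // start at the bottom-left corner of the default viewport
    glDrawPixels(buffer.width(), buffer.height(), GL_RGBA, GL_UNSIGNED_BYTE, buffer.bits());
}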
Related
I have some code that uses QPainter to render to a QWidget, QImage or QPrinter, since each is derived from QPaintDevice, like this:
void RenderWithQPainter(QPaintDevice* device)
{
    QPainter painter(device);
    painter.doThis();
    painter.doThat();
}

void RenderToImage(QImage* image)
{
    RenderWithQPainter(image);
}

void RenderToWidget(QWidget* widget)
{
    RenderWithQPainter(widget);
}

void RenderToPrinter(QPrinter* printer)
{
    RenderWithQPainter(printer);
}
Now though, I want to use OpenGL. I know how to render into a QOpenGLWidget, but I'd like my code to remain device-agnostic. So what can I render into that I can then copy to the various devices? Is there something I could derive from each of QWidget, QImage and QPrinter, not necessarily a Qt class, that I could render to using OpenGL commands instead of QPainter?
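One possible direction, sketched here under the assumption that Qt 5's offscreen GL classes (QOpenGLContext, QOffscreenSurface, QOpenGLFramebufferObject) are available: render into a framebuffer object, pull the result out as a QImage, and then reuse the QPainter path above to copy it to whatever device is given. The function names below are illustrative.
#include <QOpenGLContext>
#include <QOffscreenSurface>
#include <QOpenGLFramebufferObject>
#include <QPainter>
#include <QImage>

QImage RenderWithOpenGL(const QSize& size)
{
    QOpenGLContext context;
    context.create();

    QOffscreenSurface surface;
    surface.setFormat(context.format());
    surface.create();

    context.makeCurrent(&surface);

    QOpenGLFramebufferObject fbo(size);
    fbo.bind();
    // ... issue OpenGL commands here ...
    fbo.release();

    QImage result = fbo.toImage();
    context.doneCurrent();
    return result;
}

void RenderToDevice(QPaintDevice* device)
{
    QPainter painter(device);
    painter.drawImage(QPoint(0, 0), RenderWithOpenGL(QSize(device->width(), device->height())));
}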
I have a QML Canvas, on which I'm drawing in C++ by overriding the paint(QPainter *painter) method and using a bunch of statements that use that painter object.
Stuff like...
void myGraphDisplay::paint(QPainter* painter) {
    QPainterPath path;
    path.moveTo(0, 0);
    path.lineTo(100, 100);
    painter->strokePath(path, painter->pen());
    // etc.
}
Now some time later, another function wants to draw on the canvas, but the painter object is no longer available. I have tried saving it as a private member of the myGraphDisplay class but my application crashes if I try to access it again in the later function.
void myGraphDisplay::updateGraph() {
    QPainterPath path;
    path.moveTo(100, 0);
    path.lineTo(0, 100);
    painter->strokePath(path, painter->pen()); // where do I get "painter" here?
}
I also tried
QPainter painter(this);
as shown in the Qt reference pages, but that gives me an error:
no matching function for call to QPainter::QPainter(myGraphDisplay*)
How do I get the current QPainter object? If it's any help, updateGraph() is a Q_INVOKABLE called from the same QML.
With a QQuickPaintedItem, the rule is that all the painting happens inside the paint() method, not in other methods. The generic solution is:
Save the painting information in an attribute of the class.
Request a repaint by calling update().
Implement the drawing logic in the paint() method.
*.h
private:
QPainterPath m_path;
*.cpp
myGraphDisplay::myGraphDisplay(QQuickItem *parent) : QQuickPaintedItem(parent) {
    // default path
    m_path.moveTo(0, 0);
    m_path.lineTo(100, 100);
}
void myGraphDisplay::updateGraph() {
    QPainterPath path;
    path.moveTo(100, 0);
    path.lineTo(0, 100);
    m_path = path;
    update();
}
void myGraphDisplay::paint(QPainter* painter) {
    painter->strokePath(m_path, painter->pen());
}
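As a usage note, for updateGraph() to be invokable from QML the type has to be exposed to the QML engine, for example by registering it with qmlRegisterType; the URI and type name here are just examples:
qmlRegisterType<myGraphDisplay>("Graphs", 1, 0, "GraphDisplay");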
This is my first time using Qt, and I have to make an MSPaint equivalent with it. I am having trouble painting my lines, though. I can currently draw a line by clicking somewhere on the screen and releasing somewhere else, but when I draw a second line the previous one is erased. How can I keep the previously painted items when painting another item?
void Canvas::paintEvent(QPaintEvent *pe){
    QWidget::paintEvent(pe);
    QPainter p(this);
    p.drawPicture(0, 0, pic);
}

void Canvas::mousePressEvent(QMouseEvent *mp){
    start = mp->pos();
}

void Canvas::mouseReleaseEvent(QMouseEvent *mr){
    end = mr->pos();
    addline();
}

void Canvas::addline(){
    QPainter p(&pic);
    p.drawLine(start, end);
    p.end();
    this->update();
}
Canvas is a class that derives from QWidget; it has two QPoint attributes, start and end.
Class body:
class Canvas : public QWidget{
    Q_OBJECT
private:
    QPoint start;
    QPoint end;
    QPicture pic;
public:
    Canvas(){ setAttribute(Qt::WA_StaticContents); }
    void addline();
protected:
    void paintEvent(QPaintEvent *);
    void mousePressEvent(QMouseEvent *);
    //void mouseMoveEvent(QMouseEvent *);
    void mouseReleaseEvent(QMouseEvent *);
};
QPicture records QPainter commands. Its documentation also notes:
Note that the list of painter commands is reset on each call to the
QPainter::begin() function.
And the QPainter constructor that takes a paint device does call begin(), so each time the previously recorded commands are discarded.
It may sound tempting to use it, since the documentation does say a few good things about it, for example that it is resolution independent, but this is not how drawing applications work in practice. Switch to a QPixmap and your drawings will persist.
Also, don't forget to initialize the pixmap, because by default it will be empty and you will not be able to draw on it.
Canvas() : pic(width,height) {...}
Furthermore, if you would like to introduce the concept of brushes, as in artistic brushes and not QBrush, you might want to look at this approach to drawing the line.
EDIT: Note that you should be able to prevent QPicture from losing its content by not calling begin() on it more than once. If you create a dedicated painter at class scope that only ever draws on the picture, and call begin() once in the constructor, the recorded drawing operations will persist. But as their number grows, it will take more and more time to replay the QPicture onto your widget. You could work around that by using both a QPicture and a QPixmap and drawing to both: use the picture to record the actions and the pixmap to avoid continuously redrawing the picture. Even though you do double the work, it is still more efficient, and you keep the option of using the picture to re-rasterize at a different resolution or to save the drawing history. Still, I doubt QPicture will hold up once your drawing application starts to take the shape of a real one, for example when you start using pixmap brush stencils and the like.
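For reference, a minimal sketch of the QPixmap-based version suggested above; the member names roughly mirror the question's Canvas class, and the fixed 400x400 size is just an assumption for the example:
class Canvas : public QWidget {
    Q_OBJECT
public:
    Canvas() : pix(400, 400) {               // size the pixmap up front...
        pix.fill(Qt::white);                 // ...and clear it, since it starts out uninitialized
    }
protected:
    void paintEvent(QPaintEvent *) {
        QPainter p(this);
        p.drawPixmap(0, 0, pix);             // just blit the accumulated drawing
    }
    void mousePressEvent(QMouseEvent *mp) { start = mp->pos(); }
    void mouseReleaseEvent(QMouseEvent *mr) {
        end = mr->pos();
        QPainter p(&pix);                    // paint onto the pixmap, so earlier lines persist
        p.drawLine(start, end);
        update();
    }
private:
    QPoint start;
    QPoint end;
    QPixmap pix;
};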
I am creating a simple gauge in Qt 4.7.4, and everything is working wonderfully, except that, for the life of me, I cannot get the dial shape to paint over the text labels when it passes over them. It always paints behind the labels. I am just using a simple drawPolygon() call.
I'm thinking this has something to do with paint events? I am drawing everything inside a QFrame inside a MainWindow, using the QFrame's paintEvent.
Edit:
The QLabels are created at start-up with new QLabel(this). They are only created once and never touched again (similar to manually adding them to the UI with Designer). The drawPolygon() call is in the QFrame's paint event.
"myclass.h"
class gauge : public QFrame
{
    Q_OBJECT
public:
    explicit gauge(QWidget *parent = 0);
    ~gauge();

    void setValues(int req, int Limit, bool extra=false);
private:
    void drawDial();
protected:
    void paintEvent(QPaintEvent *e);
};
"myclass.cpp"
void gauge::paintEvent(QPaintEvent *e)
{
    Q_UNUSED(e);
    drawDial();
}

void gauge::drawDial()
{
    QPainter Needle(this);
    Needle.save();
    Needle.setRenderHint(QPainter::Antialiasing, true); // the needle looked jagged; antialiasing smooths it out
    Needle.translate(centrePt); // centre of the widget
    Needle.drawEllipse(QPoint(0, 0), 10, 10);
    Needle.restore();
    Needle.end();
}
If the gauge widget and the QLabels are siblings, then you can move the gauge widget to the front by calling its raise() method.
If the QLabels are children of the gauge widget, on the other hand, then they will always display in front of it. In that case you can either reorganize your widget hierarchy so that they are siblings instead, or get rid of the QLabels and simply call drawText() from your paintEvent() method (after drawDial() returns).
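A rough sketch of that second option, with made-up label strings and positions:
void gauge::paintEvent(QPaintEvent *e)
{
    Q_UNUSED(e);
    drawDial();                              // dial and needle go down first

    QPainter labels(this);                   // fresh painter, since drawDial() ended its own
    labels.drawText(QPoint(20, 30), "0");    // hypothetical label positions and text
    labels.drawText(QPoint(180, 30), "100");
}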
So it appears that Qt 4 doesn't let you draw on windows outside of a paint event. I have a lot of code that expects to be able to do that in order to draw rubber-band lines (generic drawing code for a particular, proprietary interface that I then implement in the given UI). I've read about the pixmap method; it would be a lot of work and I don't think it's really what I want.
Is there a workaround that allows me to do what I want anyway? I just need to draw XOR bands on the screen.
Tried the WA_PaintOutsidePaintEvent flag. Then I saw the bit that says it doesn't work on Windows.
In modern compositing desktops, window painting needs to be synchronized by the window manager so that alpha blending and other effects can be applied, in order, to the correct back buffers; the result is then flipped onto the screen to allow tear-free window animations.
Invoking painting operations outside of this process, while still supported for legacy reasons on the underlying platforms, subverts it and forces some very suboptimal code paths to be executed.
Basically, when you have painting to do on a window: call the invalidate function to schedule the painting soon, and paint during the paint event.
Just paint to a QPixmap, and copy it to the real widget in the paintEvent. This is the only standard way; you shouldn't try to work around it.
Seems like if you could get access to the HWND of the window in question, you could paint on that surface. Otherwise, I'm not sure. If by the pixmap method you mean something like this, I don't think it's a bad solution:
m_composed_image = QImage(size, QImage::Format_ARGB32);
m_composed_image.setDotsPerMeterX(dpm);
m_composed_image.setDotsPerMeterY(dpm);
m_composed_image.fill(Qt::transparent);
//paint all image data onto new image
QPainter painter(&m_composed_image);
painter.drawImage(QPoint(0, 0), m_alignment_image);
As mentioned in one of the other answers, the best way to do this is to use a pixmap buffer. The painting work is done in the buffer, and when it's done, repaint() is scheduled; the paintEvent() function then just paints the widget by copying the pixel buffer.
I was trying to draw a circle on a widget after the user inputs values and pushes a button. This was my solution: connect the drawCircle() slot to the button's clicked() signal.
class PaintHelper : public QWidget
{
    Q_OBJECT
private:
    QPixmap *buffer;
public:
    explicit PaintHelper(QWidget *parent = 0) : QWidget(parent)
    {
        buffer = new QPixmap(350, 250); // this is the fixed size of this widget
        buffer->fill(Qt::cyan);
    }

signals:

public slots:
    void drawCircle(int cx, int cy, int r){
        QPainter painter(buffer);
        painter.setBrush(QBrush(QColor(0, 0, 255)));

        // A part of the midpoint algorithm to draw 1/8 of the circle
        int x1 = 0, y1 = r;
        int p = 1 - r;
        for(int i = 0; y1 >= x1; i++){
            painter.drawPoint(x1 + cx, y1 + cy);
            x1++;
            if(p > 0){
                p += 3 + x1;
            }
            else{
                y1--;
                p += 2*x1 - 2*y1;
                p++;
            }
        }
        this->repaint();
    }

    // QWidget interface
protected:
    void paintEvent(QPaintEvent *event)
    {
        Q_UNUSED(event);
        QPainter painter(this);
        painter.drawPixmap(0, 0, *buffer);
    }
};
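As a usage note, clicked() carries no arguments, so the button has to be wired through something that supplies cx, cy and r. A hypothetical Qt 5 style wiring, where the spin boxes stand in for whatever input fields the form actually has:
connect(ui->drawButton, &QPushButton::clicked, paintHelper, [=]() {
    // drawButton, cxSpin, cySpin and rSpin are hypothetical UI widgets
    paintHelper->drawCircle(ui->cxSpin->value(), ui->cySpin->value(), ui->rSpin->value());
});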