Qt, OpenGL, widgets and offscreen rendering - C++

I am developing on RedHat Linux, cat /etc/redhat-release:
Red Hat Enterprise Linux Workstation release 7.2 (Maipo)
I am using Qt Creator 4.3.1:
Based on Qt 5.9.1 (GCC 5.3.1 20160406 (Red Hat 5.3.1-6), 64 bit)
The project I'm developing uses Qt 5.6.2 GCC 64-bit. It has been built with graphical objects derived from QWidget, including a live video stream.
Unfortunately we have experienced tearing in the video during playback, which is also evident in other widgets displayed around the video; I believe this is because the video is not using vsync.
I believe using OpenGL will rectify this situation, so the aim is to rewrite the widgets, including the video playback, using OpenGL. I've spent several days searching but so far have failed to find a complete and working solution.
I've been looking at using QOpenGLWidget, in a widget I am using to test:
class clsElevStrip : public QOpenGLWidget, protected QOpenGLFunctions {
Q_OBJECT
In the constructor, I set-up the format for offscreen rendering:
//Create surface format for rendering offscreen
mobjFormat.setDepthBufferSize(24);
mobjFormat.setSamples(4);
mobjFormat.setVersion(3, 0);
mobjFormat.setSwapBehavior(QSurfaceFormat::DoubleBuffer);
setFormat(mobjFormat);
In the paintGL method:
QOpenGLContext* pobjContext = context();
QSurface* pobjSurface = pobjContext->surface();
assert(pobjSurface != NULL);
int intSB1 = pobjSurface->format().swapBehavior();
qDebug() << (QString("paintGL:format: ")
+ QString::number(intSB1));
pobjContext->makeCurrent(pobjSurface);
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
glBegin(GL_TRIANGLES);
glColor3f(1.0, 0.0, 0.0);
glVertex3f(-0.5, -0.5, 0);
glColor3f(0.0, 1.0, 0.0);
glVertex3f( 0.5, -0.5, 0);
glColor3f(0.0, 0.0, 1.0);
glVertex3f( 0.0, 0.5, 0);
glEnd();
pobjContext->swapBuffers(pobjSurface);
Nothing is visible on the main display, the debug statement shows the format as 2 (DoubleBuffering).
If I comment out the line in the constructor:
setFormat(mobjFormat);
The debug statement then shows the format as 0 (DefaultSwapBehavior), and the graphics are visible. What have I missed?

The solution to your problem is simple:
Just don't do all that QOpenGLContext juggling. The whole point of paintGL is that this particular function is called inside a wrapper that already does all that context juggling for you. There is no need to call makeCurrent or swapBuffers; Qt already does that for you!
From the Qt documentation:
void QOpenGLWidget::paintGL()
This virtual function is called whenever the widget needs to be
painted. Reimplement it in a subclass.
There is no need to call makeCurrent() because this has
already been done when this function is called.
Before invoking this function, the context and the framebuffer are
bound, and the viewport is set up by a call to glViewport().
No other state is set and no clearing or drawing is performed
by the framework.
If you have just this as your paintGL it will show something, provided you have either a compatibility-profile OpenGL 3.x (or newer) context or an OpenGL 2.x (or older) context. You're using the legacy fixed-function pipeline here, which will not work with OpenGL 3.x core-profile contexts!
void glwidget::paintGL() {
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    glMatrixMode(GL_MODELVIEW);
    glLoadIdentity();

    glBegin(GL_TRIANGLES);
    glColor3f(1.0, 0.0, 0.0);
    glVertex3f(-0.5, -0.5, 0);
    glColor3f(0.0, 1.0, 0.0);
    glVertex3f( 0.5, -0.5, 0);
    glColor3f(0.0, 0.0, 1.0);
    glVertex3f( 0.0, 0.5, 0);
    glEnd();
}
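Since the underlying complaint was tearing, it may also help to request vsync up front instead of calling setFormat on the widget. A minimal sketch, assuming Qt 5.4 or later (QSurfaceFormat::setDefaultFormat must be called before QApplication is constructed; the widget class name is a placeholder for your own QOpenGLWidget subclass):

```cpp
#include <QApplication>
#include <QSurfaceFormat>

int main(int argc, char *argv[])
{
    // Configure the format once, globally, before any GL window exists.
    // This avoids the per-widget setFormat() call that broke rendering above.
    QSurfaceFormat fmt;
    fmt.setDepthBufferSize(24);
    fmt.setSamples(4);
    fmt.setSwapBehavior(QSurfaceFormat::DoubleBuffer);
    fmt.setSwapInterval(1);                 // request vsync (sync to vblank)
    QSurfaceFormat::setDefaultFormat(fmt);  // must precede QApplication

    QApplication a(argc, argv);
    // ... create and show your QOpenGLWidget-derived widgets here ...
    return a.exec();
}
```

Note that a swap interval of 1 is only a request; the driver may override it.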

Related

Is it possible to stick an image onto one face of a 3D object in OpenGL

I am currently working on a project with OpenGL. I wanted to stick an image onto the sign below to make it more realistic. Is there a way to do this with OpenGL? Maybe by loading the image in and painting it onto the shape?
Here is the code to create the block of sign:
//Set the size and position
glPushMatrix();
glTranslatef(-550.0, 500.0, fltSignOffset);
glScalef(700.0, 150.0, 20.0);
//Create the shape
glPushMatrix();
glColor3f(0.0, 0.55, 0.27); //Use SpringGreen4 for sign
glutSolidCube(1);
//Create the wireframe of the shape
glColor3f(0.0, 0.0, 0.0);
glutWireCube(1);
glPopMatrix();
glPopMatrix();
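No answer is preserved for this question, but the usual approach is to load the image as a texture and draw the face to be decorated as an explicitly textured quad, because glutSolidCube emits no texture coordinates. A rough sketch under that assumption (texture creation via glGenTextures/glTexImage2D is omitted; texID is a hypothetical, already-created texture):

```cpp
// Draw a textured unit quad matching glutSolidCube(1)'s +Z face.
// Call this inside the same glTranslatef/glScalef block as the cube;
// offset it slightly in Z if z-fighting with the cube face appears.
void drawSignFront(GLuint texID)
{
    glEnable(GL_TEXTURE_2D);
    glBindTexture(GL_TEXTURE_2D, texID);
    glColor3f(1.0f, 1.0f, 1.0f);  // white, so the texture is not tinted
    glBegin(GL_QUADS);
    glTexCoord2f(0.0f, 0.0f); glVertex3f(-0.5f, -0.5f, 0.5f);
    glTexCoord2f(1.0f, 0.0f); glVertex3f( 0.5f, -0.5f, 0.5f);
    glTexCoord2f(1.0f, 1.0f); glVertex3f( 0.5f,  0.5f, 0.5f);
    glTexCoord2f(0.0f, 1.0f); glVertex3f(-0.5f,  0.5f, 0.5f);
    glEnd();
    glBindTexture(GL_TEXTURE_2D, 0);
    glDisable(GL_TEXTURE_2D);
}
```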

OpenGL: Quads seemingly not culled properly

I have built a simple scene like the following:
The problem is, the blue shape is lower than the red one but somehow bleeds through. It looks proper when I rotate it like the following:
From what I searched this could be related to the order of vertices being sent, and here is my definition for those:
Shape* Obj1 = new Quad(Vec3(-5.0, 5.0, 0.0), Vec3(5.0, 5.0, 0.0), Vec3(5.0, 5.0, -10.0), Vec3(-5.0, 5.0, -10.0));
Shape* Obj2 = new Quad(Vec3(-5.0, 3.0, 0.0), Vec3(5.0, 3.0, 0.0), Vec3(5.0, 3.0, -10.0), Vec3(-5.0, 3.0, -10.0));
The Vec3 class just holds 3 doubles for x,y,z coordinates. I add these Vec3 classes to a vector, and iterate through them when I want to draw, as such:
glBegin(GL_QUADS);
for (auto it = vertex_list.begin(); it != vertex_list.end(); ++it)
glVertex3d(it->get_x(), it->get_y(), it->get_z());
glEnd();
Finally, my settings:
glEnable(GL_ALPHA_TEST | GL_DEPTH_TEST | GL_CULL_FACE);
glHint(GL_PERSPECTIVE_CORRECTION_HINT, GL_NICEST);
glAlphaFunc(GL_GREATER, 0.0f);
glViewport(0, 0, WINDOW_X, WINDOW_Y);
glMatrixMode(GL_PROJECTION);
glLoadIdentity();
glFrustum(-1.0, 1.0, -1.0, 1.0, 1.0f, 300.0);
// camera origin xyz, point to look at xyz, camera rot xyz
gluLookAt(10, 10, -20, 2.5, 2.5, -10, 0, 1, 0);
You should enable depth test, face culling and alpha testing separately.
glEnable(GL_DEPTH_TEST);
glEnable(GL_CULL_FACE);
They are not bit flags, so you cannot combine them with bitwise OR.
See glEnable:
glEnable — enable or disable server-side GL capabilities
void glEnable(GLenum cap);
cap Specifies a symbolic constant indicating a GL capability.
This means the parameter of glEnable is a single symbolic constant, not a set of bits; GL_ALPHA_TEST, GL_DEPTH_TEST and GL_CULL_FACE are symbolic constants, not bits of a bit set.
Change your code like this:
glEnable(GL_ALPHA_TEST);
glEnable(GL_DEPTH_TEST);
glEnable(GL_CULL_FACE);
See OpenGL Specification - 17.3.4 Depth Buffer Test, p. 500:
17.3.4 Depth Buffer Test
The depth buffer test discards the incoming fragment if a depth comparison fails. The comparison is enabled or disabled with the generic Enable and Disable commands using target DEPTH_TEST.
See OpenGL Specification - 14.6.1 Basic Polygon Rasterization, p. 473:
Culling is enabled or disabled by calling Enable or Disable with target CULL_FACE.

Can't get OpenGL view to work in Qt application

I'm trying to make a Qt application with OpenGL, but no matter what I do I can't get it to do anything; it just renders blackness.
void GraphView::initializeGL()
{
qglClearColor(Qt::black);
glClear(GL_COLOR_BUFFER_BIT );
glMatrixMode(GL_PROJECTION);
glLoadIdentity();
glOrtho(0.0 , 256.0, 256.0, 0.0, -1.0, 1.0);
glColor3f(1.0, 1.0, 1.0);
glBegin(GL_QUADS);
glVertex3f(0 , 0 , 0.0);
glVertex3f(0 , 128.0 , 0.0);
glVertex3f(128.0, 128.0 , 0.0);
glVertex3f(128.0, 0 , 0.0);
glEnd();
glFlush();
}
The only thing I've been able to get it to do is color the screen red, by changing the clear color to red. But even then, if I add the following code:
void GraphView::mousePressEvent(QMouseEvent *event)
{
qglClearColor(Qt::red);
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
glFlush();
}
It seems to do absolutely nothing, but the color will change if the window resizes (the resize method is just an empty { }). However, if I open a second window in the same application with another GL widget, it displays part of the Firefox window, typically the bookmarks. I'm not sure if there's any particular reason for that, but it doesn't seem to display any other application.

QOpenGLWidget show black screen

I tried the QOpenGLWidget example described here:
https://stackoverflow.com/a/31524956/4564882
but I get only a black widget. The code is exactly the same. This is the code associated with the QOpenGLWidget:
OGLWidget::OGLWidget(QWidget *parent)
: QOpenGLWidget(parent)
{
}
OGLWidget::~OGLWidget()
{
}
void OGLWidget::initializeGL()
{
glClearColor(0,0,0,1);
glEnable(GL_DEPTH_TEST);
glEnable(GL_LIGHT0);
glEnable(GL_LIGHTING);
glColorMaterial(GL_FRONT_AND_BACK, GL_AMBIENT_AND_DIFFUSE);
glEnable(GL_COLOR_MATERIAL);
}
void OGLWidget::paintGL()
{
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
glBegin(GL_TRIANGLES);
glColor3f(1.0, 0.0, 0.0);
glVertex3f(-0.5, -0.5, 0);
glColor3f(0.0, 1.0, 0.0);
glVertex3f( 0.5, -0.5, 0);
glColor3f(0.0, 0.0, 1.0);
glVertex3f( 0.0, 0.5, 0);
glEnd();
}
void OGLWidget::resizeGL(int w, int h)
{
glViewport(0,0,w,h);
glMatrixMode(GL_PROJECTION);
glLoadIdentity();
gluPerspective(45, (float)w/h, 0.01, 100.0);
glMatrixMode(GL_MODELVIEW);
glLoadIdentity();
gluLookAt(0,0,5,0,0,0,0,1,0);
}
I tried the example here: https://doc.qt.io/archives/qt-5.3/qtopengl-2dpainting-example.html. It works fine (trying both base classes, QGLWidget and QOpenGLWidget). This is the code associated with the widget:
GLWidget::GLWidget(Helper *helper, QWidget *parent)
: QGLWidget(QGLFormat(QGL::SampleBuffers), parent), helper(helper)
{
elapsed = 0;
setFixedSize(200, 200);
setAutoFillBackground(false);
}
void GLWidget::animate()
{
elapsed = (elapsed + qobject_cast<QTimer*>(sender())->interval()) % 1000;
repaint();
}
void GLWidget::paintEvent(QPaintEvent *event)
{
QPainter painter;
painter.begin(this);
painter.setRenderHint(QPainter::Antialiasing);
helper->paint(&painter, event, elapsed);
painter.end();
}
I use Qt 5.5.1 binaries built on my machine. I left the build configuration at its default, so it is based on Qt ANGLE, not Desktop OpenGL.
What is the cause of such behaviour?
In my case, my laptop uses an NVIDIA external graphics card. So I went to NVIDIA Control Panel -> Manage 3D Settings -> Program Settings, and then selected "high-performance" for the .EXE file. This worked.
The problem was that I used Qt5 binaries built with the default configuration. The default in Qt 5.5 is "dynamic" GL: both the ANGLE (ES2) backend and the Desktop OpenGL backend are built, and the decision on which one to use is taken at runtime.
(ANGLE (Almost Native Graphics Layer Engine) is an open source project by Google. Its aim is to map OpenGL ES 2.0 API calls to the DirectX 9 API.)
The problem is that ANGLE only supports the OpenGL ES 2.0 API, which has no fixed-function pipeline, so the first code I tested uses deprecated calls that ANGLE does not support. The second example is supported; that's why it worked.
So, I rebuild Qt to target Desktop OpenGL only to support my deprecated code, using:
configure -debug-and-release -opensource -opengl desktop -platform win32-msvc2015
and then run nmake, link my application to the new binaries, and my code works well!
I had a black screen on desktop. I solved the problem by adding this line of code:
QCoreApplication::setAttribute(Qt::AA_UseDesktopOpenGL);
For example, put it here:
#include "widget.h"
#include <QApplication>
int main(int argc, char *argv[])
{
    QCoreApplication::setAttribute(Qt::AA_UseDesktopOpenGL);
    QApplication a(argc, argv);
    Widget w;
    w.show();
    return a.exec();
}

OpenGL object not rotating

So, I've been trying to rotate a single object in an OpenGL/GLUT environment.
After going through several questions on SO and the like, I've written what appears to be correct code, but no dice. Does anyone know how to make this work?
PS: I've tried changing the matrix mode to GL_PROJECTION, but that just shows a black screen. The same thing happens if I use glLoadIdentity().
Here's my rendering code:
void display()
{
preProcessEvents();
glClear(GL_COLOR_BUFFER_BIT|GL_DEPTH_BUFFER_BIT);
glLoadIdentity();
gluLookAt(Camera::position.x, Camera::position.y, Camera::position.z,
Camera::position.x+Math::sind(Camera::rotationAngles.x)*Math::cosd(Camera::rotationAngles.y),
Camera::position.y+Math::cosd(Camera::rotationAngles.x),
Camera::position.z+Math::sind(Camera::rotationAngles.x)*Math::sind(Camera::rotationAngles.y),
0.0, 1.0, 0.0);
glBegin(GL_TRIANGLES);
glColor3f(1, 0, 0);
glVertex3f(-1, 0,-3);
glColor3f(0, 1, 0);
glVertex3f(0.0f, 2.0f,-3);
glColor3f(0, 0, 1);
glVertex3f(1.0f, 0.0f,-3);
glEnd();
glBindTexture(GL_TEXTURE_2D, tex->textureID);
glBegin(GL_QUADS);
glColor3f(1, 1, 1);
glTexCoord2f(100, 100);
glVertex3f(100,0,100);
glTexCoord2f(-100, 100);
glVertex3f(-100,0,100);
glTexCoord2f(-100,-100);
glVertex3f(-100,0,-100);
glTexCoord2f(100,-100);
glVertex3f(100,0,-100);
glEnd();
glBindTexture(GL_TEXTURE_2D, 0);
object1.draw();
glTranslatef(-10.0, 10.0, 0.0);
glBindTexture(GL_TEXTURE_2D, tex2->textureID);
gluQuadricTexture(quad,1);
gluSphere(quad,10,20,20);
glBindTexture(GL_TEXTURE_2D, 0);
//RELEVANT CODE STARTS HERE
glMatrixMode(GL_MODELVIEW);
glLoadIdentity();
glPushMatrix();
glRotatef(190, 0.0, 0.0, 1.0);
glPopMatrix();
glutSwapBuffers();
}
Are you aware of what glPushMatrix and glPopMatrix do? They save and restore the "current" matrix.
By enclosing your rotation in that pair and then doing no actual drawing operation before restoring the matrix, the entire sequence of code beginning with //RELEVANT CODE STARTS HERE is completely pointless.
Even if you did not push/pop, your rotation would only be applied the next time you draw something. Logically you might think that would mean the next time you call display(), but one of the first things you do in display() is replace the current matrix with an identity matrix (the glLoadIdentity() call near the top).
In all honesty, you should consider abandoning whatever resource you are currently using to learn OpenGL. You are using deprecated functionality and missing a few fundamentals. A good OpenGL 3.0 tutorial will usually touch on the basics of transformation matrices.
As for why changing the matrix mode to projection produces a black screen: the next time you call display(), gluLookAt operates on the projection matrix. In effect, you wind up applying the camera transformation twice. You should really add glMatrixMode(GL_MODELVIEW) to the beginning of display() to avoid this.
You do the rotation (and reset it with glPopMatrix) after you draw; do the rotation code before the glBegin/glEnd calls.
Or just move to a shader-based pipeline and manage your own transformation matrices.
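Putting both answers together, a sketch of the intended ordering for the sphere portion of display() above (tex2 and quad as in the question's code; 190 degrees is the question's example angle):

```cpp
// The rotation must be applied to the modelview matrix *before* the
// draw call it should affect; push/pop then confines it to this object.
glMatrixMode(GL_MODELVIEW);
glPushMatrix();
glTranslatef(-10.0f, 10.0f, 0.0f);    // position the sphere as before
glRotatef(190.0f, 0.0f, 0.0f, 1.0f);  // now the rotation affects the sphere
glBindTexture(GL_TEXTURE_2D, tex2->textureID);
gluQuadricTexture(quad, 1);
gluSphere(quad, 10, 20, 20);
glBindTexture(GL_TEXTURE_2D, 0);
glPopMatrix();                        // objects drawn later are unaffected
```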