I tried the QOpenGLWidget example described here:
https://stackoverflow.com/a/31524956/4564882
but I get only a black widget. The code is exactly the same. This is the code associated with the QOpenGLWidget:
OGLWidget::OGLWidget(QWidget *parent)
    : QOpenGLWidget(parent)
{
}

OGLWidget::~OGLWidget()
{
}

void OGLWidget::initializeGL()
{
    glClearColor(0, 0, 0, 1);
    glEnable(GL_DEPTH_TEST);
    glEnable(GL_LIGHT0);
    glEnable(GL_LIGHTING);
    glColorMaterial(GL_FRONT_AND_BACK, GL_AMBIENT_AND_DIFFUSE);
    glEnable(GL_COLOR_MATERIAL);
}

void OGLWidget::paintGL()
{
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    glBegin(GL_TRIANGLES);
        glColor3f(1.0, 0.0, 0.0);
        glVertex3f(-0.5, -0.5, 0);
        glColor3f(0.0, 1.0, 0.0);
        glVertex3f( 0.5, -0.5, 0);
        glColor3f(0.0, 0.0, 1.0);
        glVertex3f( 0.0, 0.5, 0);
    glEnd();
}

void OGLWidget::resizeGL(int w, int h)
{
    glViewport(0, 0, w, h);
    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    gluPerspective(45, (float)w/h, 0.01, 100.0);
    glMatrixMode(GL_MODELVIEW);
    glLoadIdentity();
    gluLookAt(0, 0, 5, 0, 0, 0, 0, 1, 0);
}
I tried the example here: https://doc.qt.io/archives/qt-5.3/qtopengl-2dpainting-example.html. It works fine with both base classes (QGLWidget and QOpenGLWidget). This is the code associated with the widget:
GLWidget::GLWidget(Helper *helper, QWidget *parent)
    : QGLWidget(QGLFormat(QGL::SampleBuffers), parent), helper(helper)
{
    elapsed = 0;
    setFixedSize(200, 200);
    setAutoFillBackground(false);
}

void GLWidget::animate()
{
    elapsed = (elapsed + qobject_cast<QTimer*>(sender())->interval()) % 1000;
    repaint();
}

void GLWidget::paintEvent(QPaintEvent *event)
{
    QPainter painter;
    painter.begin(this);
    painter.setRenderHint(QPainter::Antialiasing);
    helper->paint(&painter, event, elapsed);
    painter.end();
}
I use Qt 5.5.1 binaries built on my machine. I left the build configuration at its default, so it is based on ANGLE, not Desktop OpenGL.
What is the cause of this behaviour?
In my case, my laptop uses an external NVIDIA graphics card. So I went to NVIDIA Control Panel -> Manage 3D Settings -> Program Settings, and selected "high-performance" for the .EXE file. This worked.
The problem was that I used Qt5 binaries built with the default configuration. The default in Qt 5.5 is "dynamic" GL: both the ANGLE (ES2) backend and the Desktop OpenGL backend are built, and the decision on which one to use is taken at runtime.
(ANGLE (Almost Native Graphics Layer Engine) is an open source project by Google. Its aim is to map OpenGL ES 2.0 API calls to the DirectX 9 API.)
The problem is that ANGLE only implements the OpenGL ES 2.0 feature set, which has no fixed-function pipeline, so the first code I tested relies on deprecated calls that ANGLE does not support. The second example only uses what ANGLE supports, which is why it worked.
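To see which backend a dynamic-GL build actually picked, you can query it at runtime; a minimal sketch, assuming Qt >= 5.3 (where the static QOpenGLContext::openGLModuleType() was introduced):

#include <QApplication>
#include <QOpenGLContext>
#include <QDebug>

int main(int argc, char *argv[])
{
    QApplication app(argc, argv); // must exist before querying the module type

    // LibGL means the Desktop OpenGL backend; LibGLES means ANGLE / OpenGL ES 2.
    if (QOpenGLContext::openGLModuleType() == QOpenGLContext::LibGL)
        qDebug() << "Using Desktop OpenGL";
    else
        qDebug() << "Using ANGLE (OpenGL ES 2)";
    return 0;
}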
So I rebuilt Qt to target Desktop OpenGL only, to support my deprecated code, using:
configure -debug-and-release -opensource -opengl desktop -platform win32-msvc2015
then ran nmake, linked my application against the new binaries, and my code works well!
I had a black screen on desktop. I solved the problem by adding this line of code:
QCoreApplication::setAttribute(Qt::AA_UseDesktopOpenGL);
For example, put it here:
#include "widget.h"
#include <QApplication>
int main(int argc, char *argv[])
{
QCoreApplication::setAttribute(Qt::AA_UseDesktopOpenGL);
QApplication a(argc, argv);
Widget w;
w.show();
return a.exec();
}
I am developing on Red Hat Linux; cat /etc/redhat-release:
Red Hat Enterprise Linux Workstation release 7.2 (Maipo)
I am using Qt Creator 4.3.1:
Based on Qt 5.9.1 (GCC 5.3.1 20160406 (Red Hat 5.3.1-6), 64 bit)
The project I'm developing uses Qt 5.6.2 GCC 64-bit. It has been developed with graphical objects derived from QWidget, including a live video stream.
Unfortunately, we have experienced tearing in the video while it plays back, and this is also evident in other widgets displayed around the video; I believe this is because the video is not using vsync.
I believe using OpenGL will rectify this situation, so the aim is to rewrite the widgets, including the video playback, using OpenGL. I've spent several days on this but so far failed to find a complete and working solution.
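My understanding is that vsync can be requested through the default surface format before any widgets are created; a minimal sketch of that idea, assuming Qt >= 5.4 (where QSurfaceFormat::setDefaultFormat() is available):

#include <QApplication>
#include <QSurfaceFormat>

int main(int argc, char *argv[])
{
    QSurfaceFormat fmt;
    fmt.setSwapBehavior(QSurfaceFormat::DoubleBuffer);
    fmt.setSwapInterval(1); // sync buffer swaps to the display's vertical refresh
    QSurfaceFormat::setDefaultFormat(fmt); // inherited by QOpenGLWidgets created later

    QApplication app(argc, argv);
    // ... create and show the widgets here ...
    return app.exec();
}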
I've been looking at using QOpenGLWidget, in a widget I am using to test:
class clsElevStrip : public QOpenGLWidget, protected QOpenGLFunctions {
Q_OBJECT
In the constructor, I set up the format for offscreen rendering:
//Create surface format for rendering offscreen
mobjFormat.setDepthBufferSize(24);
mobjFormat.setSamples(4);
mobjFormat.setVersion(3, 0);
mobjFormat.setSwapBehavior(QSurfaceFormat::DoubleBuffer);
setFormat(mobjFormat);
In the paintGL method:
QOpenGLContext* pobjContext = context();
QSurface* pobjSurface = pobjContext->surface();
assert(pobjSurface != NULL);

int intSB1 = pobjSurface->format().swapBehavior();
qDebug() << (QString("paintGL:format: ")
             + QString::number(intSB1));

pobjContext->makeCurrent(pobjSurface);

glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
glBegin(GL_TRIANGLES);
    glColor3f(1.0, 0.0, 0.0);
    glVertex3f(-0.5, -0.5, 0);
    glColor3f(0.0, 1.0, 0.0);
    glVertex3f( 0.5, -0.5, 0);
    glColor3f(0.0, 0.0, 1.0);
    glVertex3f( 0.0, 0.5, 0);
glEnd();

pobjContext->swapBuffers(pobjSurface);
Nothing is visible on the main display; the debug statement shows the format as 2 (DoubleBuffer).
If I comment out this line in the constructor:
setFormat(mobjFormat);
then the debug statement shows the format as 0 (DefaultSwapBehavior) and the graphics are visible. What have I missed?
The solution to your problem is simple: just don't do all that QOpenGLContext juggling. The whole point of paintGL is that this particular function is called inside a wrapper that already does all the context juggling for you. There is no need to call makeCurrent or swapBuffers; Qt already does that for you!
From the Qt documentation:

void QOpenGLWidget::paintGL()

This virtual function is called whenever the widget needs to be painted. Reimplement it in a subclass.

There is no need to call makeCurrent() because this has already been done when this function is called.

Before invoking this function, the context and the framebuffer are bound, and the viewport is set up by a call to glViewport(). No other state is set and no clearing or drawing is performed by the framework.
If you have just the following as your paintGL, it will show something, provided you have either a compatibility-profile OpenGL 3.x (or later) context or an OpenGL 2.x (or earlier) context. It uses the legacy fixed-function pipeline, which will not work with OpenGL 3.x core-profile contexts!
void glwidget::paintGL()
{
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    glMatrixMode(GL_MODELVIEW);
    glLoadIdentity();

    glBegin(GL_TRIANGLES);
        glColor3f(1.0, 0.0, 0.0);
        glVertex3f(-0.5, -0.5, 0);
        glColor3f(0.0, 1.0, 0.0);
        glVertex3f( 0.5, -0.5, 0);
        glColor3f(0.0, 0.0, 1.0);
        glVertex3f( 0.0, 0.5, 0);
    glEnd();
}
The exact same code snippet works on another machine, but it does not work properly for me. GLUT itself is working fine, since it opens the created window, but the line segment is not shown in the window, which suggests a problem with OpenGL. It does not even change the background colour of the window.
I even tested OpenGL on my Windows machine with a test application, and it works fine.
#ifdef WIN32
#include <windows.h>
#endif
#include <GL/glut.h>
#include <GL/gl.h>
#include <GL/glu.h>

void init(void)
{
    glClearColor(1.0, 1.0, 1.0, 0.0);
    glMatrixMode(GL_PROJECTION);
    gluOrtho2D(0.0, 400.0, 0.0, 400.0);
}

void linesegment(void)
{
    glClear(GL_COLOR_BUFFER_BIT);
    glColor3f(0.0, 0.0, 0.0);
    glBegin(GL_LINES);
        glVertex2i(180, 15);
        glVertex2i(10, 145);
    glEnd();
    glFlush();
}

int main(int argc, char* argv[])
{
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_SINGLE | GLUT_RGB);
    glutInitWindowPosition(50, 50);
    glutInitWindowSize(400, 400);
    glutCreateWindow("Testing Open GL");
    init();
    glutDisplayFunc(linesegment);
    glutMainLoop();
    return 0;
}
Most likely you're running into the rather new class of problems introduced by compositing graphics systems (Aero in Windows, Quartz Extreme on Mac OS X, and a multitude of various programs on Linux/X11). The gist of the problem is that compositing is inherently double-buffered: there is always an offscreen (back) buffer for the window to be drawn into, and when a program indicates that it's finished drawing, the compositor integrates the result into the on-screen image.
This, however, brings a few caveats. Most importantly, single-buffered drawing somehow needs to indicate that it's finished. While OpenGL's glFinish call should suffice from an implementor's point of view, most compositing systems are not sensitive to it. You'll have to create a double-buffered window pixel format and do a buffer swap to make the compositor present your image.
So for your program (a full sketch follows this list):
replace GLUT_SINGLE with GLUT_DOUBLE in glutInitDisplayMode
replace glFlush with glutSwapBuffers
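Applied to the code above, a minimal sketch with both changes (everything else unchanged):

#include <GL/glut.h>
#include <GL/glu.h>

void linesegment(void)
{
    glClear(GL_COLOR_BUFFER_BIT);
    glColor3f(0.0, 0.0, 0.0);
    glBegin(GL_LINES);
        glVertex2i(180, 15);
        glVertex2i(10, 145);
    glEnd();
    glutSwapBuffers(); /* was glFlush(): presents the finished back buffer */
}

int main(int argc, char* argv[])
{
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB); /* was GLUT_SINGLE */
    glutInitWindowPosition(50, 50);
    glutInitWindowSize(400, 400);
    glutCreateWindow("Testing Open GL");
    glClearColor(1.0, 1.0, 1.0, 0.0);
    glMatrixMode(GL_PROJECTION);
    gluOrtho2D(0.0, 400.0, 0.0, 400.0);
    glutDisplayFunc(linesegment);
    glutMainLoop();
    return 0;
}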
I am starting out with OpenGL and C++, and I was wondering why I don't see anything in the window. Here is my code:
#include <GLUT/GLUT.h>
#include <stdlib.h>

void init() {
    glClearColor(0, 0, 1, 0);
    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    gluPerspective(60.0, 1.0, 1.0, 100.0);
    glMatrixMode(GL_MODELVIEW);
    glLoadIdentity();
    glTranslatef(0, 0, -10);
}

void display() {
    glClear(GL_COLOR_BUFFER_BIT);
    glBegin(GL_TRIANGLES);
        glVertex3i(-0.5, -0.5, 0);
        glVertex3i(0, 0.5, 0);
        glVertex3i(0.5, -0.5, 0);
    glEnd();
}

int main(int argc, char **argv) {
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_SINGLE | GLUT_RGBA);
    glutInitWindowPosition(200, 200);
    glutInitWindowSize(400, 400);
    glutCreateWindow("Window");
    init();
    glutDisplayFunc(display);
    glutMainLoop();
}
I have a few questions:
If I run the program like this, all I see is a white window... Didn't I set the clear color to blue?
When I call glutSwapBuffers() at the end of the display function and run the program, I see the blue window, but without the triangle. I thought glutSwapBuffers() only worked with double buffering.
And, most importantly, where the hell is my triangle? O.o Didn't I translate the camera to -10 on the z-axis with glTranslatef()? If you are wondering why I used gluPerspective: I am trying out new things, but it doesn't work with gluOrtho2D() either.
I don't know if I am missing something. Maybe I need to read up more on this, but I think most of the code is correct.
1 & 2) Well you don't have to call glutSwapBuffers() when using single buffer. But you have to call glFlush(), so the draw commands are executed on the GPU.
3) I noticed that you are creating vertices with double coordinates, but you are calling integer version of glVertex** function (decimal part will be truncated) - it means that you will be drawing triangle with zero size.
Use glVertex3d() or glVertex3f() instead of glVertex3i().
Small note: intermediate mode is deprecated in the latest OpenGL.
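Putting 1-3 together, a minimal sketch of the corrected display callback:

void display() {
    glClear(GL_COLOR_BUFFER_BIT);
    glBegin(GL_TRIANGLES);
        glVertex3f(-0.5f, -0.5f, 0.0f); // float variant keeps the fractional coordinates
        glVertex3f( 0.0f,  0.5f, 0.0f);
        glVertex3f( 0.5f, -0.5f, 0.0f);
    glEnd();
    glFlush(); // needed with GLUT_SINGLE so the commands are actually executed
}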
It isn't a problem with the code itself, because it compiles when I tell the compiler to compile it as C, but not when I leave the settings at the default (which is to compile it as C++). When I compile it as C++ I get numerous errors along the lines of "undefined reference to glClear".
I'm using Microsoft's Visual Studio C++ compiler. I have everything properly linked.
The code is:
#include <GL/glut.h>
#include <GL/freeglut.h>
#include <GL/gl.h>

void display(void)
{
    /* Clear all pixels */
    glClear(GL_COLOR_BUFFER_BIT);

    /* Draw a white polygon (rectangle) with
     * corners at (0.25, 0.25, 0.0) and (0.75, 0.75, 0.0)
     */
    glColor3f(1.0, 1.0, 1.0);
    glBegin(GL_POLYGON);
        glVertex3f(0.25, 0.25, 0.0);
        glVertex3f(0.75, 0.25, 0.0);
        glVertex3f(0.75, 0.75, 0.0);
        glVertex3f(0.25, 0.75, 0.0);
    glEnd();

    /* Don't wait!
     * Start processing buffered OpenGL routines.
     */
    glFlush();
}

void init(void)
{
    /* Select clearing background color */
    glClearColor(0.0, 0.0, 0.0, 0.0);

    /* Initialize viewing values */
    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    glOrtho(0.0, 1.0, 0.0, 1.0, -1.0, 1.0);
}

/*
 * Declare initial window size, position, and display mode
 * (single buffer and RGBA). Open window with "hello"
 * in its title bar. Call initialization routines.
 * Register callback function to display graphics.
 * Enter main loop and process events.
 */
int main(int argc, char** argv)
{
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_SINGLE | GLUT_RGB);
    glutInitWindowSize(250, 250);
    glutInitWindowPosition(100, 100);
    glutCreateWindow("hello");
    init();
    glutDisplayFunc(display);
    glutMainLoop();
    return 0; /* ISO C requires main to return int. */
}
Also, if anyone has a proper resource for learning OpenGL with C++, could you please recommend it?
It's likely because glClear is not declared in any of the header files currently included. In C, an undeclared function is assumed to have a type based on its arguments and to return an int. So when compiling as C you might get a warning about it being undeclared (hopefully you have warnings enabled, and read them when compiling?), but the compiler will do its best to compile and link it anyway.
C++ is more strict and rejects calls to undeclared functions.
As Alexadre Jasmin and Bart have pointed out, verify that you are linking the OpenGL libraries correctly. I use -lGLU -lGL -lglut with freeglut on Ubuntu.
If that doesn't solve the problem, try adding #define GLUT_DISABLE_ATEXIT_HACK at the top of your cpp file.
I am learning OpenGL these days, and I tried to compile some example code from the book (OpenGL SuperBible).
The code goes like this: first use glEnable(GL_LINE_STIPPLE) to enable line stippling, then call glLineStipple(2, (GLushort)0x00ff), and finally draw some lines. But when executed, it just displays normal lines (on Ubuntu).
However, when I compiled the same code on Windows, it worked!!
Why? Are there any differences between Windows and Linux here?
#include <QtGui>
#include "GLWidget.h"

GLWidget::GLWidget(QWidget *parent)
    : QGLWidget(parent)
{
    setFormat(QGLFormat(QGL::DoubleBuffer));
}

GLWidget::~GLWidget()
{
}

void GLWidget::initializeGL()
{
    glClearColor(0.0f, 0.0f, 0.0f, 1.0f);
    glEnable(GL_LINE_STIPPLE);
}

void GLWidget::resizeGL(int w, int h)
{
    if(h == 0)
        h = 1;
    glViewport(0, 0, w, h);
    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    glOrtho(-100, 100, -100, 100, -1, +1);
    glMatrixMode(GL_MODELVIEW);
    glLoadIdentity();
}

void GLWidget::paintGL()
{
    glClear(GL_COLOR_BUFFER_BIT);
    drawLine();
}

void GLWidget::drawLine()
{
    GLint factor = 1;
    GLushort pattern = 0x00ff;

    glColor3f(1.0f, 1.0f, 1.0f);
    for(GLfloat i = -90.0f; i < 90.0f; i += 20.0f)
    {
        glLineStipple(factor, pattern);
        glLineWidth(5.0);
        glBegin(GL_LINES);
            glVertex2f(-80.0f, i);
            glVertex2f(+80.0f, i);
        glEnd();
        factor++;
    }
}
I've seen from your comment answer to @Lefteris that you're using the Mesa3D/DRI based "radeon" driver. I suggest you download AMD's proprietary driver (fglrx) from their website, install it, and try again.
You would have to tell us what OpenGL context you are using before we can see why it does not work. If on Linux it is an OpenGL 4 context, for example, then glEnable(GL_LINE_STIPPLE) is not part of OpenGL 4.
On the other hand, on Windows, if you don't bother to create a specific OpenGL context, you get the default, which is a version that definitely supports glEnable(GL_LINE_STIPPLE).
So please tell us what OpenGL context you are using on Linux.
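To rule the context version out, you can request an explicit legacy/compatibility context from Qt; a minimal sketch against the constructor above, using QGLFormat::setVersion() and QGLFormat::setProfile() (a version and profile chosen here purely for illustration):

GLWidget::GLWidget(QWidget *parent)
    : QGLWidget(parent)
{
    QGLFormat fmt(QGL::DoubleBuffer);
    fmt.setVersion(2, 1);                            // a version that still has glLineStipple
    fmt.setProfile(QGLFormat::CompatibilityProfile); // keep the fixed-function pipeline
    setFormat(fmt);
}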
Stippled primitives are a TODO for anything above an R200.
glLineStipple is a deprecated API, so perhaps it was removed from the Linux driver.