glewInit() apparently successful, sets error flag anyway - OpenGL

I have recently migrated from Windows to Linux (Debian, 64-bit) and am trying to get a GPGPU development environment up and running, so I am testing a program that worked under Windows.
Compiling and linking go fine, but when I run the program I get some odd errors. I am using GLEW and freeglut.
First snippet: OpenGL only
i = 1;
info = PROGRAM_NAME;
glutInitContextVersion(4,2);
glutInit(&i, &info);
glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGBA);
glutInitWindowSize(W_SIZEX, W_SIZEY);
glutInitWindowPosition(W_POSX, W_POSY);
glutCreateWindow(info);
glClearColor(1.0,1.0,1.0,0);
/**/
printf("Before glewInit: %i\n", glGetError());
/**/
printf("glewInit returns: %i\n", glewInit());
/**/
printf("After glewInit: %i\n", glGetError());
/**/
From which I get the following output:
Before glewInit: 0
glewInit returns: 0
After glewInit: 1280
This is an invalid enum error (1280 = GL_INVALID_ENUM). I don't know what's causing it, but I suspect it might be related to the next error I get later in the program's execution.
Second snippet: OpenCL-OpenGL interop
/* BUFFERS */
(*BFR).C[0] = clCreateBuffer(*CTX, CL_MEM_READ_WRITE, SD, 0, 0);
(*BFR).C[1] = clCreateBuffer(*CTX, CL_MEM_READ_WRITE, SD, 0, &i);
dcl(i);
glGenBuffers(2, (*BFR).G);
glBindBuffer(GL_ARRAY_BUFFER, (*BFR).G[0]);
glBufferData(GL_ARRAY_BUFFER, SI, 0, GL_DYNAMIC_DRAW);
(*BFR).D[0] = clCreateFromGLBuffer(*CTX, CL_MEM_WRITE_ONLY, (*BFR).G[0], &i);
dcl(i);
glBindBuffer(GL_ARRAY_BUFFER, 0);
Here, the dcl(int) method just decodes the CL error code. When I run this, I get a CL_INVALID_GL_OBJECT error from clCreateFromGLBuffer(). However, OpenGL has no issues generating, binding or unbinding the buffers in question. The OpenCL context is apparently valid, generating no errors on creation or query. Everything works in VS2010 on Windows 7 64-bit.
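(For context on what clCreateFromGLBuffer requires: the CL context must have been created for sharing with the GL context that currently owns the buffer. On Linux/GLX that is typically requested with context properties along these lines; this is only a sketch, with platform and device discovery omitted and the variable names hypothetical:)
#include <GL/glx.h>
/* assumes a GL context is current and that `platform` and `device`
   were obtained earlier via clGetPlatformIDs/clGetDeviceIDs */
cl_context_properties props[] = {
    CL_GL_CONTEXT_KHR,   (cl_context_properties) glXGetCurrentContext(),
    CL_GLX_DISPLAY_KHR,  (cl_context_properties) glXGetCurrentDisplay(),
    CL_CONTEXT_PLATFORM, (cl_context_properties) platform,
    0
};
cl_int err;
cl_context ctx = clCreateContext(props, 1, &device, NULL, NULL, &err);
/* without the CL_GL_CONTEXT_KHR/CL_GLX_DISPLAY_KHR properties, a later
   clCreateFromGLBuffer call typically fails rather than sharing the buffer */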
Compilation Details
Here are the relevant includes:
/* OPENGL */
#include "GL/glew.h"
#include "GL/freeglut.h"
/* OPENCL */
#include "CL/cl.h"
#include "CL/cl_gl.h"
I am using GCC and linking like so:
gcc -w -I./include CLGL.c -o ~/Templates/GOL-CLGL/run/a.out -lGLEW -lGLU -lglut -lGL -lOpenCL;
Compilation and linking produce no errors (plenty of warnings about pointer abuse, but I doubt that's the culprit).
I'm currently out of ideas on how to debug this. Can anyone suggest further steps?

I had this issue recently too, so here is the answer:
OpenGL: glGetError() returns invalid enum after call to glewInit()
In short: glewInit() queries the extension string with glGetString(GL_EXTENSIONS), which is invalid in a core-profile context, so the driver sets GL_INVALID_ENUM even though initialization succeeded. You can discard that error.
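A minimal sketch of the usual cleanup, clearing the stale error once right after glewInit():
glewExperimental = GL_TRUE; /* let GLEW resolve core-profile entry points */
GLenum status = glewInit();
if (status != GLEW_OK)
    fprintf(stderr, "glewInit failed: %s\n", glewGetErrorString(status));
/* glewInit's glGetString(GL_EXTENSIONS) call leaves GL_INVALID_ENUM behind
   on core contexts; read the flag once so later glGetError() calls start clean */
glGetError();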

Related

GL_INVALID_OPERATION in glGetIntegerv() with GLAD

I use GLAD (config) to load OpenGL functions and GLFW 3.3.8 to create the context. Each time I start my program, it pops an ERROR 1282 in glGetIntegerv from GLAD's debug post-callback function (as far as I know, it is invoked after each gl* function and prints an error if one occurred). I figured out that this happens after returning from main().
Here's the code (it loads OpenGL 3.3 and shows a red window until it is closed; pretty simple, I think):
#include <iostream>
#include <glad/glad.h>
#include <GLFW/glfw3.h>
int main()
{
    if(glfwInit() != GLFW_TRUE)
        throw std::runtime_error{"Unable to initialize GLFW."};
    glfwWindowHint(GLFW_CONTEXT_VERSION_MAJOR, 3);
    glfwWindowHint(GLFW_CONTEXT_VERSION_MINOR, 3);
    glfwWindowHint(GLFW_OPENGL_PROFILE, GLFW_OPENGL_CORE_PROFILE);
    GLFWwindow * w{glfwCreateWindow(100, 100, "title", nullptr, nullptr)};
    if(w == nullptr)
        throw std::runtime_error{"Unable to create window."};
    glfwMakeContextCurrent(w);
    if(not gladLoadGLLoader(GLADloadproc(glfwGetProcAddress)))
        throw std::runtime_error{"Unable to load OpenGL functions."};
    glViewport(0, 0, 100, 100);
    while(not glfwWindowShouldClose(w))
    {
        glfwPollEvents();
        glClearColor(1.f, 0.f, 0.f, 1.f);
        glClear(GL_COLOR_BUFFER_BIT);
        glfwSwapBuffers(w);
    }
    glfwMakeContextCurrent(nullptr);
    glfwDestroyWindow(w);
    glfwTerminate();
    std::cout << "Hey!" << std::endl;
    return 0;
}
The output is:
Hey!
ERROR 1282 in glGetIntegerv
From this callstack:
#0 0x00416f91 in _post_call_callback_default_gl (name=0x446d40 <_glfwDataFormat+10036> "glGetIntegerv", funcptr=0x41c1ec <glad_debug_impl_glGetIntegerv#8>, len_args=2) at <glad.c>:45
#1 0x0041c265 in glad_debug_impl_glGetIntegerv#8 (arg0=33309, arg1=0x4526cc <num_exts_i>) at <glad.c>:1385
#2 0x00417168 in get_exts () at <glad.c>:220
#3 0x0042691f in find_extensionsGL () at <glad.c>:3742
#4 0x00426d12 in gladLoadGLLoader (load=0x402a2e <glfwGetProcAddress>) at <glad.c>:3821
#5 0x004016f8 in main () at <main.cpp>:33
Error 1282 is GL_INVALID_OPERATION, but it pops up after the program has ended (or at least after main() has ended). Even if I move the whole code into a separate function (i.e. create and destroy everything there) and then invoke it from main(), the error still appears after the return 0; from main().
This did not happen when I used GLEW to load OpenGL functions, but maybe it was silenced. I didn't find anything similar to my problem on the internet. What am I doing wrong? Do I have to unload OpenGL or something like that?
UPD: The error message actually pops in gladLoadGLLoader(), not after the end of main().

gdb Cannot find bounds of current function

I am developing an OpenGL program using MinGW32 on Windows 10 (64-bit).
The program runs without problems, but when I debug it using gdb, it shows:
(gdb) n
0x6a7706f8 in ?? ()
from C:\Windows\System32\DriverStore\FileRepository\c0310483.inf_amd64_ab6d2afa5c543409\atioglxx.dll
(gdb) n
Cannot find bounds of current function
(gdb)
Here is the code I want to debug
int main() {
    GLFWwindow * window = initGLContext();
    initImGui(window);
    int points[8] = { 0 };
    GLuint VAO, VBO;
    glGenVertexArrays(1, &VAO); // I set a breakpoint here
    glGenBuffers(1, &VBO);
    GLShader curveShader("", "", "");
    while (!glfwWindowShouldClose(window)) {
        glfwPollEvents();
        useGUI(points);
        render();
        glfwSwapBuffers(window);
    }
    ImGui_ImplGlfwGL3_Shutdown();
    ImGui::DestroyContext();
    glfwTerminate();
    return 0;
}
Please let me know if more info is needed. Thanks in advance.
Edited:
It turns out that my program lacks debug information for glGenVertexArrays(), which comes from atioglxx.dll, so I have decided to use printf() instead.
But when I debug my program using gdb, it shows:
(gdb) n
0x6a7706f8 in ?? () from C:\Windows\System32\DriverStore...
This is happening because you are stopped inside atioglxx.dll, which has no debugging info (or even a symbol table).
When debugging, you need to be aware of your current context (e.g. which function you are stopped in).
When you are in your own code, and assuming you compiled it with debug info, you can use next, step, info locals, etc. But when you are in somebody else's code (e.g. in a system-provided DLL), these commands will not work (and are not expected to).
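As a sketch, a session that climbs back out of the DLL might look like this (the breakpoint location is hypothetical):
(gdb) bt                  # your own frames are the ones with file:line info
(gdb) finish              # run until the current DLL function returns to its caller
(gdb) up                  # or just select the caller's frame to inspect locals
(gdb) break main.cpp:10   # or set a breakpoint past the call...
(gdb) continue            # ...and run to it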

QGLShaderProgram will not compile any shader since upgrading to Qt5

I am finding that QGLShaderProgram is consistently failing to compile any shader and providing no error log. Here are the symptoms:
QGLShaderProgram reports that it failed to compile but produces an empty error log. If I try to bind the shader, an exception is thrown.
I can compile a shader using glCompileShader without problems. However, the first time I try to compile this way after QGLShaderProgram has failed, it fails with this error log:
ERROR: error(#270) Internal error: Wrong symbol table level
ERROR: 0:2: error(#232) Function declarations cannot occur inside of functions:
main
ERROR: error(#273) 2 compilation errors. No code generated
Following that one failure, the next time I try to compile using glCompileShader works fine.
The problem has arisen only since upgrading from Qt 4.8 to 5.2. Nothing else has changed on this machine.
I have tested on two PCs, one with an ATI Radeon HD 5700, the other with an AMD FirePro V7900. The problem only appears on the Radeon PC.
Here is my test code demonstrating the problem:
main.cpp
#include <QApplication>
#include "Test.h"
int main(int argc, char* argv[])
{
    QApplication* app = new QApplication(argc, argv);
    Drawer* drawer = new Drawer;
    return app->exec();
}
Test.h
#pragma once
#include <QObject>
#include <QTimer>
#include <QWindow>
#include <QOpenGLContext>
#include <QOpenGLFunctions>
class Drawer : public QWindow, protected QOpenGLFunctions
{
    Q_OBJECT;
public:
    Drawer();
    QTimer* mTimer;
    QOpenGLContext* mContext;
    int frame;
public Q_SLOTS:
    void draw();
};
Test.cpp
#include "Test.h"
#include <QGLShaderProgram>
#include <iostream>
#include <ostream>
using namespace std;
Drawer::Drawer()
    : mTimer(new QTimer)
    , mContext(new QOpenGLContext)
    , frame(0)
{
    mContext->create();
    setSurfaceType(OpenGLSurface);
    mTimer->setInterval(40);
    connect(mTimer, SIGNAL(timeout()), this, SLOT(draw()));
    mTimer->start();
    show();
}

const char* vertex = "#version 110 \n void main() { gl_Position = gl_Vertex; }";
const char* fragment = "#version 110 \n void main() { gl_FragColor = vec4(0.0,0.0,0.0,0.0); }";

void Drawer::draw()
{
    mContext->makeCurrent(this);
    if (frame==0) {
        initializeOpenGLFunctions();
    }
    // Compile using QGLShaderProgram. This always fails
    if (frame < 5)
    {
        QGLShaderProgram* prog = new QGLShaderProgram;
        bool f = prog->addShaderFromSourceCode(QGLShader::Fragment, fragment);
        cout << "fragment " << f << endl;
        bool v = prog->addShaderFromSourceCode(QGLShader::Vertex, vertex);
        cout << "vertex " << v << endl;
        bool link = prog->link();
        cout << "link " << link << endl;
    }
    // Manual compile using OpenGL direct. This works except for the first time it
    // follows the above block
    {
        GLuint prog = glCreateShader(GL_FRAGMENT_SHADER);
        glShaderSource(prog, 1, &fragment, 0);
        glCompileShader(prog);
        GLint success = 0;
        glGetShaderiv(prog, GL_COMPILE_STATUS, &success);
        GLint logSize = 0;
        glGetShaderiv(prog, GL_INFO_LOG_LENGTH, &logSize);
        GLchar* log = new char[8192];
        glGetShaderInfoLog(prog, 8192, 0, log);
        cout << "manual compile " << success << endl << log << endl;
        delete[] log;
    }
    glClearColor(1,1,0,1);
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    mContext->swapBuffers(this);
    frame++;
}
Elsewhere, I have tested using QGLWidget, and on a project that uses GLEW instead of QOpenGLFunctions with exactly the same results.
The version of Qt I'm linking against was built with the following configuration:
configure -developer-build -opensource -nomake examples -nomake tests -mp -opengl desktop -icu -confirm-license
Any suggestions? Or shall I just send this in as a bug report?
Update
In response to peppe's comments:
1) What does QOpenGLDebugLogger says?
The only thing I can get from QOpenGLDebugLogger is
QWindowsGLContext::getProcAddress: Unable to resolve 'glGetPointerv'
This is printed when I initialize it (and not as a debug event firing, but just to console). It happens even though mContext->hasExtension(QByteArrayLiteral("GL_KHR_debug")) returns true and I'm initializing it within the first frame's draw() function.
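For reference, a typical QOpenGLDebugLogger setup looks like this (a sketch, not the asker's exact code; onMessage is a hypothetical slot):
QOpenGLDebugLogger* logger = new QOpenGLDebugLogger(this);
if (logger->initialize()) { // needs a current context with GL_KHR_debug
    connect(logger, SIGNAL(messageLogged(QOpenGLDebugMessage)),
            this, SLOT(onMessage(QOpenGLDebugMessage)));
    logger->startLogging(QOpenGLDebugLogger::SynchronousLogging);
}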
2) Can you print the compile log of the QOGLShaders even if they compile successfully?
I cannot successfully compile QOpenGLShader or QGLShader at any point so I'm not able to test this. However, when compiling successfully using plain GL functions, the log returns blank.
3) Which GL version did you get from the context? (Check with QSurfaceFormat).
I've tried with versions 3.0, 3.2, 4.2, all with the same result.
4) Please set the same QSurfaceFormat on both the context and the window before creating them
5) Remember to create() the window
I've implemented both of these now and the result is the same.
I've just tested on a third PC, which has no issues. So it is this specific computer, which, incidentally, happens to be a Mac Pro running Windows in Boot Camp. It has had absolutely no trouble in any other context running the latest ATI drivers, so I can only conclude that there is a bug somewhere between the ATI drivers, this computer's graphics chip, and QOpenGLShaderProgram.
I think I'm unlikely to find a solution, so I'm giving up. Thank you for all your input!

C++ OpenGL GLEW Static Library on Windows 7 MSVC2012

I am attempting to compile a simple OpenGL program on Windows with a statically linked glew32mxsd.lib... I am also working with GLFW, and if I compile without GLEW, everything works.
I downloaded the GLEW source and built the static MX debug library from it. I then copied the resulting glew32mxsd.lib file to my project directory. I am using CMake, so my CMake code appears as follows:
SET (GLEW_VERSION 1.9.0)
SET (GLEW_DIRECTORY ${EXTERNAL_LIBS}/glew/${GLEW_VERSION})
SET (GLEW_INCLUDES ${GLEW_DIRECTORY}/include)
SET (GLEW_LIBS ${GLEW_DIRECTORY}/win/${SYSTEM_ARC}/lib)
INCLUDE_DIRECTORIES (${GLEW_INCLUDES})
ADD_EXECUTABLE (myproject ${HEADER_FILES} ${SOURCE_FILES})
TARGET_LINK_LIBRARIES(engine OpenGL32.lib)
TARGET_LINK_LIBRARIES(engine ${GLEW_LIBS}/glew32mxsd.lib)
Also, in my source I am using the following in my header:
#define GLEW_STATIC
#define GLEW_MX
#include <GL/glew.h>
#include <GLFW/glfw3.h>
#ifdef WIN32
#include <GL/wglew.h>
#endif
//-----------------------------------------------------------------
static GLEWContext *glewGetContext()
{
    return nullptr;
}
Everything compiles and links without any errors, but...
When I run the program I get a memory access error. The call stack is:
engine.exe!glewContextInit(GLEWContextStruct * ctx) Line 8912 C
engine.exe!cext::graphics::internal::WindowManager::initGLExtentions() Line 204 C++
engine.exe!cext::graphics::WindowManager::initGLExtentions() Line 273 C++
engine.exe!main(int argc, char * * argv) Line 363 C++
engine.exe!__tmainCRTStartup() Line 536 C
engine.exe!mainCRTStartup() Line 377 C
And looking at line 8912 in glew.c reveals the following line:
CONST_CAST(GLEW_VERSION_4_3) = ( major > 4 ) || ( major == 4 && minor >= 3 ) ? GL_TRUE : GL_FALSE;
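(Note that in a GLEW_MX build, flags such as GLEW_VERSION_4_3 resolve through the GLEWContext returned by glewGetContext(), so glewContextInit() writes through that pointer. For comparison, a minimal non-null, single-context sketch would be:)
static GLEWContext g_glewContext;   /* one context is enough for a single window */
static GLEWContext *glewGetContext()
{
    return &g_glewContext;          /* never nullptr: glewContextInit writes into it */
}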
My glewInit looks like the following
void initGLExtentions()
{
    glewExperimental = true;
    GLenum err = glewInit();
    if (GLEW_OK != err)
    {
        printf("Error: %s\n", glewGetErrorString(err));
    }
    printf("Status: Using GLEW %s\n", glewGetString(GLEW_VERSION));
    if (!GLEW_ARB_vertex_buffer_object)
    {
        printf("VBO not supported\n");
        exit(EXIT_FAILURE);
    }
}
...
GLFWwindow *window = makeWindow(options, hints);
if (!window)
{
    fprintf(stderr, "Failed to open GLFW window\n");
    glfwTerminate();
    exit(EXIT_FAILURE);
}
glfwMakeContextCurrent(window);
initGLExtentions();
Using the same code on a Mac works without a problem, which leads me to believe that it is something to do with the static lib. However, even after following all the instructions on the GLEW website, I must still be missing something.
Edit: Additional Information
I ran Dependency Walker on my application after reading about it in another thread. Running it on my exe file produces the following missing files:
API-MS-WIN-CORE-COM-L1-1-0.DLL
API-MS-WIN-CORE-WINRT-ERROR-L1-1-0.DLL
API-MS-WIN-CORE-WINRT-L1-1-0.DLL
API-MS-WIN-CORE-WINRT-ROBUFFER-L1-1-0.DLL
API-MS-WIN-CORE-WINRT-STRING-L1-1-0.DLL
API-MS-WIN-SHCORE-SCALING-L1-1-0.DLL
DCOMP.DLL
GPSVC.DLL
IESHIMS.DLL
These are called from USER32.DLL. Are they related to glew.lib or wglew.lib in any way?

FFMPEG Undefined Reference to 'avcodec_open2' in C++

I get an error when compiling one of my C++ programs after updating the FFMPEG library from 0.8 to 'ffmpeg version git-2012-04-12-277f20c'.
The error I get when I make the program is as follows:
-------- begin --------
Linking: Analysing_Server
./source/Encoding_Thread.o: In function `CEncoding_Thread::do_work()':
/home/Analyser/source/Encoding_Thread.cpp:155: undefined reference to `avcodec_open2'
collect2: ld returned 1 exit status
make: *** [Analysing_Server] Error 1
The relevant lines of my Makefile are similar to running g++ as below:
g++ test2.cpp -lavformat -lavcodec -lavutil -D__STDC_CONSTANT_MACROS
A stripped-down version of the relevant CPP code that throws the error is:
#include <stdio.h>
#include <stdint.h>
#define LOG_OUT_STREAM_BUFF_SIZE 200000
extern "C" {
/* The ffmpeg library is completely written in C, so we need to tell the C++ compiler that so it links correctly. */
#include "stdint.h"
#include "libavcodec/avcodec.h"
#include "libavutil/mathematics.h"
#include "libswscale/swscale.h"
#include "libavfilter/avfilter.h"
int avcodec_open2(AVCodecContext *avctx, AVCodec *codec, AVDictionary **options);
int avcodec_encode_video2(AVCodecContext *avctx, AVPacket *avpkt, const AVFrame *frame, int *got_packet_ptr);
}
uint8_t m_outbuf[2][LOG_OUT_STREAM_BUFF_SIZE];
unsigned int m_out_size[2];
unsigned int m_OutBuffer_ID[2];
unsigned int m_Buffer_ID; /* This is just a uniqueish stamp we give to each buffer so we can tell when they change.. */
AVCodecContext * m_CodecContex;
AVCodec * m_codec;
struct SwsContext *m_img_convert_ctx;
unsigned char* m_DataBuff;
int Output_Width, Output_Height;
int Output_Bitrate;
int main(void) {
    // The new version of FFMPEG calls this in avcodec_register_all
    //avcodec_init();

    /* register all the codecs */
    avcodec_register_all();

    /* Initialise the encoder */
    m_codec = avcodec_find_encoder(CODEC_ID_MP2);
    if (!m_codec) {
        printf("Encoding codec not found\n");
    }

    /* init the pointers.. */
    m_CodecContex = NULL;

    /* Default values.. */
    Output_Width = 1600;
    Output_Height = 1200;
    Output_Bitrate = 600000;

    /* Create/setup the Codec details.. */
    // Changed to work with the new FFMPEG
    m_CodecContex = avcodec_alloc_context3(m_codec);
    avcodec_get_context_defaults3(m_CodecContex, m_codec);

    /* put sample parameters */
    m_CodecContex->bit_rate = Output_Bitrate;
    /* resolution must be a multiple of two */
    m_CodecContex->width = Output_Width;
    m_CodecContex->height = Output_Height;
    /* frames per second */
    m_CodecContex->time_base = (AVRational){1,25};
    m_CodecContex->gop_size = 10; /* emit one intra frame every ten frames */
    m_CodecContex->max_b_frames = 1;
    m_CodecContex->pix_fmt = PIX_FMT_YUV420P; /* must be YUV for encoding.. */

    AVDictionary * RetunedAVDic = NULL; /* must be NULL (or a valid dictionary) before the call */

    /* open it */
    // Changed to work with the new FFMPEG
    if (avcodec_open2(m_CodecContex, m_codec, &RetunedAVDic) < 0) {
        printf("could not open codec");
    }
}
Unfortunately, the example under 'doc/examples/decoding_encoding.c' that comes with FFMPEG no longer works, because all the functions it uses are now deprecated. My code is based on that example and worked fine with FFMPEG 0.8, but it does not compile with the newest version of FFMPEG. I have changed some of the deprecated functions to their newer versions, but it still doesn't compile.
Does anyone know why I am getting this error? Or does anyone have a link to an example like 'doc/examples/decoding_encoding.c' that uses the newest version of FFMPEG?
The relevant lines of my Makefile are similar to running g++ as below:
g++ test2.cpp -lavformat -lavcodec -lavutil -D__STDC_CONSTANT_MACROS
In programming, details matter. Your link command is not sufficiently similar to the above command, or it would have worked.
You probably are putting libraries in the wrong place on the link line. The order of sources and libraries matters.
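As a sketch of the usual failure mode (file names are just examples): GNU ld resolves symbols left to right, and an archive only contributes objects for symbols that are still undefined when the archive is named, so libraries must come after the sources or objects that use them.
# wrong: the linker scans libavcodec before anything references avcodec_open2
g++ -lavformat -lavcodec -lavutil test2.cpp
# right: sources first, then the libraries they depend on
g++ test2.cpp -lavformat -lavcodec -lavutil -D__STDC_CONSTANT_MACROS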
Update:
If you put the code supplied above in a CPP file, then run g++ with the supplied options, it does not work. You will get the error "undefined reference to `avcodec_open2'".
No, I don't. I get a different error (since I don't have avcodec installed at all).
If the example command already fails for you, then you should provide the error it produced, not the error from some other command, so we wouldn't have to guess what that other command might have looked like.
The order of the libraries worked for FFMPEG version 0.8, why does it not work with the latest version?
Probably because you've installed the latest libavcodec54, but didn't install the latest libavcodec-dev.
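On a Debian-style system, that would mean something like the following (the exact package name varies by release, so treat this as an example):
sudo apt-get install libavcodec-dev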