Bad OpenGL output, SDL2 x86 - C++

I have been working on a small hobby project for over a year now. It's written in C++, using SDL, SDL_image, SDL_mixer, and Minizip, and it uses OpenGL for rendering.
Until July of this year I had been maintaining and testing both the x64 and x86 versions of the code. Both compiled without any changes to the original code and ran exactly the same.
However, around August I moved the code up to SDL 2.0 from 1.2.15 and started maintaining and testing only the x64 version. Now when I try to build an x86 version I get the problem below.
(Screenshots: correct output vs. incorrect output.)
Things I have tried:
- gDebugger: both versions of the code create the same type of context. However, the accumulation buffer is 64 bits in both, and I cannot find a way to disable it.
- Dr. Memory: no alarming memory or heap corruption.
- Checking contexts on creation: both versions create the same context value in SDL, and both report the same "No OpenGL context has been made current" error even after calling SDL_GL_MakeCurrent (roughly as in the sketch below). Yet the x64 version works, the x86 debug version gives a black screen, and the x86 release version gives the output above.
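A minimal sketch of that check (placeholder names, not the actual project code):
#include <SDL.h>
#include <cstdio>

// Create the window and GL context, printing SDL's error string at each step.
static bool init_gl(SDL_Window** out_window, SDL_GLContext* out_context)
{
    if (SDL_Init(SDL_INIT_VIDEO) != 0) {
        std::printf("SDL_Init failed: %s\n", SDL_GetError());
        return false;
    }

    *out_window = SDL_CreateWindow("test",
        SDL_WINDOWPOS_CENTERED, SDL_WINDOWPOS_CENTERED,
        800, 600, SDL_WINDOW_OPENGL);
    if (!*out_window) {
        std::printf("SDL_CreateWindow failed: %s\n", SDL_GetError());
        return false;
    }

    *out_context = SDL_GL_CreateContext(*out_window);
    if (!*out_context) {
        std::printf("SDL_GL_CreateContext failed: %s\n", SDL_GetError());
        return false;
    }

    // Both builds report "No OpenGL context has been made current" around here.
    if (SDL_GL_MakeCurrent(*out_window, *out_context) != 0) {
        std::printf("SDL_GL_MakeCurrent failed: %s\n", SDL_GetError());
        return false;
    }
    return true;
}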
Both the x64 and x86 versions are the exact same code, which used to compile and work properly before SDL 2.0. I am not sure if it's a bug in SDL or something I did wrong. Let me know if you need more information on this.
Update:
I am using pure GL 1.1 code only, so no shaders or VBOs. I use only glVertexPointer and the associated glColorPointer and glTexCoordPointer functions. All arrays are declared with the GL types, and the gl functions are given pointers to client memory. All textures are rendered as quads.
GLfloat vertex_array_3f[12];
// ... initialize the array with the quad's corner positions ...
glEnableClientState(GL_VERTEX_ARRAY);
glVertexPointer(3, GL_FLOAT, 0, vertex_array_3f);
// ... set the color and texture-coordinate pointers the same way ...
glDrawArrays(GL_QUADS, 0, 4);
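For context, the full draw path per textured quad looks roughly like this (a sketch with hypothetical array and texture names, not the exact project code):
GLuint texture_id = 0;   // assume a real texture was created earlier
GLfloat positions[12];   // 4 corners * xyz
GLfloat colors[16];      // 4 corners * rgba
GLfloat texcoords[8];    // 4 corners * uv
// ... fill the arrays for this quad ...

glBindTexture(GL_TEXTURE_2D, texture_id);

glEnableClientState(GL_VERTEX_ARRAY);
glEnableClientState(GL_COLOR_ARRAY);
glEnableClientState(GL_TEXTURE_COORD_ARRAY);

glVertexPointer(3, GL_FLOAT, 0, positions);
glColorPointer(4, GL_FLOAT, 0, colors);
glTexCoordPointer(2, GL_FLOAT, 0, texcoords);

glDrawArrays(GL_QUADS, 0, 4);

glDisableClientState(GL_TEXTURE_COORD_ARRAY);
glDisableClientState(GL_COLOR_ARRAY);
glDisableClientState(GL_VERTEX_ARRAY);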
The context I am requesting is 2.1, but instead I get a backward-compatible context. This doesn't cause any issues in the x64 version.
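The version request itself is just the two attribute calls before window/context creation, roughly (sketch, not the project code):
// Request a 2.1 context before SDL_CreateWindow / SDL_GL_CreateContext.
SDL_GL_SetAttribute(SDL_GL_CONTEXT_MAJOR_VERSION, 2);
SDL_GL_SetAttribute(SDL_GL_CONTEXT_MINOR_VERSION, 1);

// After the context is current, print what was actually created.
std::printf("GL_VERSION:  %s\n", (const char*)glGetString(GL_VERSION));
std::printf("GL_RENDERER: %s\n", (const char*)glGetString(GL_RENDERER));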
I also changed from VS2010 Express to VS2012 Express during the same period, but I do remember it compiling successfully for x86 with VS2012 Express.
Edit: Has anyone experienced anything like this before? I am doing some testing in the meantime; if I find anything I will post the findings below.

Related

OpenGL - ARB extension

I am using a MacBook Pro (13-inch, Mid 2010) and I work with OpenGL. I noticed that some functions are missing from the library. I found specifications for my hardware on the internet, and they say it supports OpenGL 3.3. That seemed strange, so I printed my OpenGL version and it is 2.1, not 3.3. (Then I found that even the newest MacBooks (2014) report the same OpenGL version, 2.1.)
Then I almost jumped out of the window. (Just kidding.)
I googled for OpenGL 2.1 with the ARB extensions, but there is hardly any documentation or example usage, and nobody seems to use them. Can anybody please explain what that is, how to use it, and what the difference is?
From what I read (if I understand it correctly), instead of the new OpenGL 3.x there are ARB extensions that provide something similar. I would hope that if the specification says the hardware supports version 3.3, the ARB extensions should offer the same functions, at least.
I would be glad if somebody explained to me what is going on.
Question:
I have a problem with a multisample texture for FBO rendering. It can be created with glTexImage2DMultisample using the GL_TEXTURE_2D_MULTISAMPLE target, but that is only available from version 3.2 or greater.
So what should I use, or is it possible to do this with an ARB extension?
I found GL_ARB_multisample in the library. What is that, and how is it used? All the functions I found on the internet are missing; there are only some definitions like GL_MULTISAMPLE_ARB in the header. I tried enabling it with glEnable (GL_MULTISAMPLE is defined too), but it doesn't work.
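To be clear, this is roughly what I would like to write if I had a 3.2 context (a sketch with made-up sizes, copied from tutorials; it does not work on my 2.1 context):
// What I would like to use (requires OpenGL 3.2+): a multisample color attachment.
const GLsizei width = 800, height = 600, samples = 4;

GLuint tex = 0, fbo = 0;
glGenTextures(1, &tex);
glBindTexture(GL_TEXTURE_2D_MULTISAMPLE, tex);
glTexImage2DMultisample(GL_TEXTURE_2D_MULTISAMPLE, samples, GL_RGBA8,
                        width, height, GL_TRUE);

glGenFramebuffers(1, &fbo);
glBindFramebuffer(GL_FRAMEBUFFER, fbo);
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                       GL_TEXTURE_2D_MULTISAMPLE, tex, 0);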
Please help me. :(
Edit:
If you know a different way to solve this, I would be happy.
Original question: OpenGL - FBO and alpha blending
You must switch your OpenGL context from the Legacy profile to the Core profile. The Core profile requires some changes in your code: you must migrate your code and shaders, because it is a newer version of OpenGL and GLSL. Check the official video on how to migrate and rewrite functions so your code is valid for the new version: Apple Developer Site - OpenGL (the video on the right side).
The important thing you must do is add #import <OpenGL/gl3.h>; then all the functions will be visible for use.
To get it to work and to debug shaders, you need to set up the NSOpenGLPixelFormat. Add the NSOpenGLPFAOpenGLProfile key with the NSOpenGLProfileVersion3_2Core value to the NSOpenGLPixelFormatAttribute array:
NSOpenGLPixelFormatAttribute attribs[] = {
    // ...
    NSOpenGLPFAOpenGLProfile, NSOpenGLProfileVersion3_2Core,
    // ...
};
This helps you to debug your code.
Thanks a lot for the help, and I hope this helps you.

glGenVertexArrays not giving unique VAOs

My friend and I are working on a project using C++ and OpenGL. We've created a C++ class for a "ModelObject", and each ModelObject has a GLuint vao as a member variable. Then while initializing a ModelObject, we call
glGenVertexArrays( 1, &vao );
glBindVertexArray( vao );
and at the end of the initializing function we call
glBindVertexArray(0);
to unbind the VAO. Now we're trying to render 2 objects, a train car and a cube. On his machine (Linux Mint) they both render fine, with their own textures, and querying the objects for their VAOs returns 1 and 2 respectively.
On my machine, however (a MacBook Pro), both objects render as a cube (though one has the texture of the train and the other the texture of the cube). Querying their VAOs returns 0 and 0.
As an experiment we told glGenVertexArrays to create 5 vaos for each ModelObject. This resulted in the list 0, 1, 918273, 8, 7 (or something similar to that), and it was the same list for both ModelObjects.
So as far as I can tell, the problem is that glGenVertexArrays is a) using 0 as a valid name, and b) generating identical names on each call, even though we're never calling glDeleteVertexArrays. Why would it be doing this on my machine and not his, and how do I stop it?
Does your GPU support OpenGL 3.0? What does glview say about the entry point glGenVertexArrays? It is possible that your GPU/Driver doesn't support VAOs.
I had the same issue on an iMac. It turned out that, apparently, on Mac OS you need to use glGenVertexArraysAPPLE and glBindVertexArrayAPPLE. Replacing the calls gives consistent, unique VAOs.
VAOs were introduced in OpenGL 3.0, so they will only work in contexts that support 3.0 or later.
Mac OS only supports OpenGL 3.x and 4.x in Core Profile contexts. By default, you will get a context that supports OpenGL 2.1 with all the legacy features that are deprecated, and have been removed in the Core Profile. Mac OS does not support the Compatibility Profile, where features from 3.0 and later can be used in combination with legacy features.
How you create a Core Profile context depends on the window system interface you use. For two of the common ones:
With GLUT (which is marked as deprecated itself, but still works at the moment): Add GLUT_3_2_CORE_PROFILE to the flags passed to glutInitDisplayMode() (see Glut deprecation in Mac OSX 10.9, IDE: QT Creator for details).
With Cocoa, add this attribute/value pair to the pixel format attributes:
NSOpenGLPFAOpenGLProfile, NSOpenGLProfileVersion3_2Core,
Once you have a Core Profile context, glGenVertexArrays() and glBindVertexArray() will work fine. This obviously requires a machine that can support at least OpenGL 3.x. The table on this page lists the version support for each machine: http://support.apple.com/kb/HT5942.
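For the GLUT route, for example, the setup looks roughly like this (a minimal sketch; window title, size, and display-mode flags are arbitrary):
#include <GLUT/glut.h>    // Apple's GLUT framework
#include <OpenGL/gl3.h>   // Core Profile header (declares glGenVertexArrays etc.)

int main(int argc, char** argv) {
    glutInit(&argc, argv);
    // Ask for a 3.2 Core Profile context instead of the default legacy 2.1 one.
    glutInitDisplayMode(GLUT_RGBA | GLUT_DOUBLE | GLUT_DEPTH | GLUT_3_2_CORE_PROFILE);
    glutInitWindowSize(640, 480);
    glutCreateWindow("core profile test");

    // With a Core Profile context these return proper, non-zero names.
    GLuint vao = 0;
    glGenVertexArrays(1, &vao);
    glBindVertexArray(vao);

    // ... create buffers/shaders, register callbacks, then call glutMainLoop() ...
    return 0;
}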

Can't figure out why this OpenGL program is not rendering

I compiled Joe Groff's "An intro to modern OpenGL: Hello World: The Slideshow".
I have compiled it using Mingw-w64, with freeglut, with GLUT 3.7, and with a version that creates my own context.
However, when I run the program, the image doesn't fade back and forth like it's supposed to, and I can't figure out why (I have spent a whole day on it).
Also, I have examined most of the inputs and outputs, except for the shaders, and can't find anything wrong. Does anyone have any ideas?
Most likely, your OpenGL version doesn't support shaders. Are you by any chance running in a virtual machine or via remote desktop? These tend to only support OpenGL 1.1 even if the graphics card/drivers are much more recent, and OpenGL 1.1 does not support shaders. It's also possible, if you're using an older laptop with an integrated Intel GPU, that shaders are not (properly) supported.
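A quick way to check is to print the version and renderer strings right after the context is created, e.g. (sketch):
#include <cstdio>

// Print what the created context actually is. Microsoft's software fallback
// typically reports "GDI Generic" and version 1.1 here, which rules out shaders.
std::printf("GL_VENDOR:   %s\n", (const char*)glGetString(GL_VENDOR));
std::printf("GL_RENDERER: %s\n", (const char*)glGetString(GL_RENDERER));
std::printf("GL_VERSION:  %s\n", (const char*)glGetString(GL_VERSION));

const GLubyte* glsl = glGetString(GL_SHADING_LANGUAGE_VERSION);
std::printf("GLSL:        %s\n", glsl ? (const char*)glsl : "(not supported)");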

Where to get OpenGL 2.0 for Windows 7 64-bit

I've been looking for OpenGL version 2.0 or higher, but I haven't found anything I could use so far. There is no download section on the official website, and Google mostly finds things like an OpenGL viewer or an OpenGL screen saver, whereas I am looking for OpenGL to develop games/graphics/visualizations (specifically version 2.0, but since higher versions are compatible with 2.0 they are also OK). Could someone please point me to a source where I could get the appropriate OpenGL for my project? I only managed to download one, but it didn't work, because it was built for a 32-bit OS and I use 64-bit Windows 7. Does anyone know how to handle this problem as well?
This is my graphics card: NVIDIA GeForce 9600M GS.
You don't have to download an SDK to use OpenGL in 64-bit applications on Windows. All you need is a 64-bit capable compiler, and the Windows Platform SDK (which comes bundled with Microsoft Visual Studio).
But there is a catch: Microsoft's OpenGL implementation hasn't been updated since OpenGL 1.1, and to use functionality from later versions of OpenGL you need to use OpenGL extensions. Luckily, some nice people have made GLEW, a library that does the extension work for you and allows you to simply compile OpenGL 2.0 (and later, as GLEW is updated) source code for Windows. Perhaps this is what you're looking for?
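Roughly, once you have created a window and made an OpenGL context current (with whatever framework you use), the GLEW part looks like this (sketch):
#include <GL/glew.h>   // include before any other GL header
#include <cstdio>

// Call once, after an OpenGL context has been created and made current.
GLenum err = glewInit();
if (err != GLEW_OK)
    std::fprintf(stderr, "glewInit failed: %s\n", (const char*)glewGetErrorString(err));

if (!GLEW_VERSION_2_0)
    std::fprintf(stderr, "OpenGL 2.0 is not supported by this driver.\n");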
kusma is completely right, but maybe you'll need more precise directions.
First you'll need OpenGL libraries. These will be given with your Visual Studio / mingw / whatever installation.
Then you'll need to create an OpenGL window. You can do it with raw Windows API calls, but it is a nightmare. You should go for something like GLFW.
Then you'll need something to deal with OpenGL extensions (as kusma said, you don't want OpenGL 1.1 only). Use GLEW.
You will also need some math utilities: creating vectors (on the C++ side), computing your projection matrix, and so on. GLM can do that for you.
Last but not least, you may want to use Cg for your shaders (but you can use GLSL instead, which is "built-in" in OpenGL)
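To tie those pieces together, a minimal skeleton could look something like this (a sketch assuming GLFW 3 and GLEW; error handling mostly omitted):
#include <GL/glew.h>
#include <GLFW/glfw3.h>
#include <cstdio>

int main() {
    if (!glfwInit())
        return 1;

    GLFWwindow* window = glfwCreateWindow(800, 600, "OpenGL 2.0 test", NULL, NULL);
    if (!window) { glfwTerminate(); return 1; }

    glfwMakeContextCurrent(window);
    glewInit();   // loads the 2.0+ entry points via extensions
    std::printf("GL_VERSION: %s\n", (const char*)glGetString(GL_VERSION));

    while (!glfwWindowShouldClose(window)) {
        glClear(GL_COLOR_BUFFER_BIT);
        // ... draw with OpenGL 2.0 calls (shaders, VBOs) here ...
        glfwSwapBuffers(window);
        glfwPollEvents();
    }

    glfwTerminate();
    return 0;
}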
Here's the OpenGL SDK site: LINK. Is this what you are looking for?
The easy way to tell: if you're using glBegin/glEnd statements, you're using the old context methods (good for quick demos and prototyping, bad if you're looking to do something that needs to look professional). When you start dealing with OpenGL topics that cover buffers, and hints about VBOs (vertex buffer objects) and FBOs (frame buffer objects), you're in the area of more modern OpenGL methods. If you want to get up to speed in the shortest amount of time, start with buffers and keep working your way forward. Just remember, when you're dealing with device contexts (the methods used to create your window), that if you stick with OpenGL 2.1 or lower you're limiting yourself (think roughly DirectX 9/early DirectX 10). Your video card handles DirectX 10 and OpenGL 3, so that is the best place to start. Check out NVIDIA's developer site, http://developer.nvidia.com/, and take a look at the http://opengl.org site; check out the forums (http://www.opengl.org/discussion_boards) - the guys there are helpful (just be careful not to re-post old questions).
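To illustrate the difference described above, roughly (a sketch only; the buffer path assumes a shader program with a position attribute at location 0 is already bound):
// Old immediate-mode style (glBegin/glEnd): fine for quick demos.
glBegin(GL_TRIANGLES);
    glVertex3f(-1.0f, -1.0f, 0.0f);
    glVertex3f( 1.0f, -1.0f, 0.0f);
    glVertex3f( 0.0f,  1.0f, 0.0f);
glEnd();

// Buffer-based style (VBO): the more modern path.
GLfloat verts[] = { -1.0f, -1.0f, 0.0f,
                     1.0f, -1.0f, 0.0f,
                     0.0f,  1.0f, 0.0f };
GLuint vbo = 0;
glGenBuffers(1, &vbo);
glBindBuffer(GL_ARRAY_BUFFER, vbo);
glBufferData(GL_ARRAY_BUFFER, sizeof(verts), verts, GL_STATIC_DRAW);

glEnableVertexAttribArray(0);   // attribute 0 = position in the shader
glVertexAttribPointer(0, 3, GL_FLOAT, GL_FALSE, 0, (void*)0);
glDrawArrays(GL_TRIANGLES, 0, 3);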
Also check out http://swiftless.com - it's a good start, and he labels his tutorials by OpenGL version.

Lighting issue in OpenGL

I am having trouble developing an OpenGL application.
The weird thing is that a friend of mine and I are developing a 3D scene with OpenGL under Linux, with the code in a shared repository. If we both check out the same latest version, that is, the SAME code, this happens: on his computer, after he compiles, he can see the full lighting model, while on mine only the ambient light is active, not the diffuse or specular ones.
Can it be a problem with the drivers (since he uses an ATI card and I use an NVIDIA one)?
Or with the static libraries?
I repeat, it is the same code, compiled on different machines; that's the strange thing, it should look the same.
Thanks for any help or tip given.
This can very easily be a driver problem, or one card supporting extensions that the other does not.
Try his binaries on your machine. If they continue to fail, either your drivers are whack or you're using a command not supported by your card. On the other hand, if your screen looks right when running your code compiled on his machine, then your static libraries have a problem.
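It can also help to print what each machine is actually running and to check for GL errors around the lighting setup, along these lines (a sketch; substitute your real lighting code):
#include <cstdio>

// Compare these strings between the two machines.
std::printf("GL_VENDOR:   %s\n", (const char*)glGetString(GL_VENDOR));
std::printf("GL_RENDERER: %s\n", (const char*)glGetString(GL_RENDERER));
std::printf("GL_VERSION:  %s\n", (const char*)glGetString(GL_VERSION));

// Verify the lighting calls are not silently failing on one machine.
glEnable(GL_LIGHTING);
glEnable(GL_LIGHT0);
GLenum err = glGetError();
if (err != GL_NO_ERROR)
    std::printf("GL error after lighting setup: 0x%X\n", err);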