Rotation inside the OpenGL Shading Language

I wonder if there is a way to compute a rotation matrix using the OpenGL Shading Language, i.e. to do the calculation in a shader. (Ideally from two given vectors, using the power of quaternions.)
Some background:
I have a project where I want to implement a light field on the vertices of a 3D mesh (plus interpolation), which requires vectors to be in local coordinates (Torrance, Lafortune, etc.).
Now this requires calculating a rotation matrix quite a number of times (the number of vertices should be scalable). This could be done in ordinary source code on the CPU, but I was hoping to find a way to use the power of the graphics card to do this job for me.
So far I couldn't find any hint, neither in the OpenGL manual itself nor anywhere else...
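To make it concrete, the kind of computation I mean, building the rotation that takes one unit vector onto another via a quaternion, might look roughly like this in GLSL (a sketch only; quatToMat3 and rotationBetween are names I made up, and the degenerate case from == -to is not handled):

    // Sketch: rotation that takes unit vector `from` onto unit vector `to`,
    // built from the half-way quaternion. Both inputs must be normalized.
    mat3 quatToMat3(vec4 q)
    {
        float x = q.x, y = q.y, z = q.z, w = q.w;
        // Columns of the standard quaternion-to-matrix conversion
        return mat3(
            1.0 - 2.0*(y*y + z*z), 2.0*(x*y + w*z),       2.0*(x*z - w*y),
            2.0*(x*y - w*z),       1.0 - 2.0*(x*x + z*z), 2.0*(y*z + w*x),
            2.0*(x*z + w*y),       2.0*(y*z - w*x),       1.0 - 2.0*(x*x + y*y));
    }

    mat3 rotationBetween(vec3 from, vec3 to)
    {
        // Unnormalized quaternion: axis = from x to, w = 1 + from . to
        vec4 q = normalize(vec4(cross(from, to), 1.0 + dot(from, to)));
        return quatToMat3(q);
    }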

Related

How to create fog using OpenGL ES 2.0 or WebGL?

I would like to create a fog effect for my game, but I can't find any tutorials on how to do it using OpenGL ES 2.0. If anyone has links to tutorials, an explanation, or source code, I would be grateful.
There's a section on replicating fixed-function fog using shaders in the OpenGL ES 2.0 Programming Guide, on page 224. The source code is available on the Google Code project (MIT License). It's a gigantic RenderMonkey XML file, but the shader source embedded in it is pretty straightforward (I would copy it directly here, but I'm not sure if that's OK).
The idea is to use the distance to a particular pixel as the input to some fog function. In that example, they calculate the eye-to-vertex distance in the vertex shader, then provide the interpolated distance to each fragment by passing it as a varying.
They then apply a simple linear fog function: below some minimum distance there is zero fog color, and beyond some maximum distance the output is all fog color. You mix (linearly interpolate) the fog color and the fragment color according to where the pixel's distance falls between the maximum and the minimum.
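Here's a minimal sketch of that approach in GLSL ES (this is not the book's code; the uniform and varying names are mine):

    // Vertex shader: compute the eye-space distance, pass it as a varying.
    uniform mat4 u_mvpMatrix;    // assumed names, not from the book
    uniform mat4 u_mvMatrix;
    attribute vec4 a_position;
    varying float v_eyeDist;

    void main()
    {
        v_eyeDist = length((u_mvMatrix * a_position).xyz);
        gl_Position = u_mvpMatrix * a_position;
    }

    // Fragment shader: linear fog between a min and a max distance.
    precision mediump float;
    uniform vec4  u_fogColor;
    uniform float u_fogMinDist;
    uniform float u_fogMaxDist;
    varying float v_eyeDist;

    void main()
    {
        vec4 baseColor = vec4(1.0);  // stand-in for the real surface color
        float f = clamp((u_fogMaxDist - v_eyeDist) /
                        (u_fogMaxDist - u_fogMinDist), 0.0, 1.0);
        gl_FragColor = mix(u_fogColor, baseColor, f);  // f = 1.0 means no fog
    }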
As mentioned in the book, once you have that working, there's no reason to limit yourself to linear fog: you can easily make it exponential, make it depend on other variables (e.g. distance to a floor, or vary it with a texture lookup or noise function), make god rays through it, etc.
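For instance, the exponential variant is a one-line change in the fragment shader above (u_fogDensity being one more assumed uniform):

    // Exponential fog: replace the linear factor with e^(-density * dist).
    float f = clamp(exp(-u_fogDensity * v_eyeDist), 0.0, 1.0);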
It's not clear from your question what exactly you're after, so if you want to go really dynamic, that's a whole other ballgame (and not always worth the development effort and performance cost for the effect you get). For existing WebGL code, you might look at the loading screen of Three Dreams of Black (the relevant code is somewhere in its source), or you can go more simulation-based and actually model the fog as a 3D fluid.

How to rotate vertices exactly like with glRotatef() in OpenGL?

I need to optimize my rendering code. Currently I'm using glPushMatrix() with glTranslatef() and glRotatef(), but this costs more than rendering all objects in a single vertex array call.
Is there some fast built-in function in OpenGL to rotate my vertex data exactly like glRotatef() would? If not, what library or method would you recommend? The only method I know is to use sin/cos functions to rotate the vertices, but I'm not sure whether that is the fastest way, or whether it would even produce the same result.
I only need to rotate around one axis once per object (2D rendering), so it doesn't need to be complicated or support the full glPushMatrix() stack.
Edit: I don't want to use shaders for this.
Edit 2: I am only rendering individual quads (in 2D mode), each rotated around the origin, so vertex coordinates range from, say, -10 to 10. My current code: (quad.vert[i].x*cosval)-(quad.vert[i].y*sinval) (and similarly for y).
I'm assuming you're using an old version of OpenGL, since you're using glRotate, and truly ancient/strange hardware since you don't want to use shaders.
You can put the glRotate*() calls in a display list, or compute the rotation matrices yourself. Chapter 3 of the OpenGL Red Book, together with Appendix F, has the information you need to construct the matrices yourself. Look at Chapter 7 for more information about display lists.
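For the single-axis 2D case in the question, the matrix that glRotatef(angle, 0, 0, 1) builds reduces to the standard Z rotation, which is exactly the sin/cos formula in your Edit 2, so that approach does produce the same result. In GLSL-style notation for compactness (the same arithmetic works in plain C on the CPU; note that glRotatef takes degrees while sin/cos take radians):

    // The 2x2 core of the Z-axis rotation matrix that
    // glRotatef(angle, 0, 0, 1) constructs (see Red Book Appendix F).
    vec2 rotateZ(vec2 v, float angle)  // angle in radians
    {
        float c = cos(angle), s = sin(angle);
        return vec2(c * v.x - s * v.y,
                    s * v.x + c * v.y);
    }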

How do I retrieve the ModelView matrix in GLSL 1.5?

I have been reading through the GLSL 1.5 specification, and saw that any reference to what used to be a built-in variable holding the ModelView matrix (like gl_ModelViewMatrix) has been deprecated, and is only available in some kind of compatibility mode (which happens not to be supported on my GPU).
I have seen a few examples that first retrieve or construct the ModelView matrix on the CPU, then send it to the GPU as a uniform variable.
Now this all seems backward to me; even for a simple vertex shader you will in many cases want to apply some kind of transformation to the geometry.
So I am really wondering now: is there any way to get the current ModelView matrix from within a vertex shader, using GLSL 1.5?
OpenGL-3 core completely dropped the matrix stack, i.e. the built-in modelview, projection, texture and color matrices. You are now expected to implement the matrix math yourself and supply the matrices through uniforms of your own choosing.
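In practice that means a vertex shader along these lines (a minimal sketch; the uniform names are arbitrary, and the matrices are computed on the CPU and uploaded with glUniformMatrix4fv):

    #version 150

    uniform mat4 modelViewMatrix;    // arbitrary names; filled in from the CPU
    uniform mat4 projectionMatrix;

    in vec4 position;

    void main()
    {
        gl_Position = projectionMatrix * modelViewMatrix * position;
    }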
There is no built-in matrix system/library in core OpenGL since version 3.x.
A lot of people had similarly negative opinions about that "huge change" in OpenGL.
You have to use your own set of functions to perform matrix calculations. See libraries like GLM, or the math code from Lighthouse3D.
All in all, the built-in matrix functions were very useful when learning OpenGL. Now you have to look for other solutions...
On the other hand, this is not a problem for game engines or frameworks, which usually have their own math libraries. For them the "new" OpenGL is even easier to use.

In a big OpenGL game with lots of 3D moving objects, how are the points typically updated?

1. Points are calculated with your own physics engine and then sent to OpenGL every time it has to display, e.g. with glBufferSubDataARB, with the updated coordinates of a flying barrel.
2. There are lots of barrels with the same vertex coordinates, but for each one you somehow tell OpenGL to use a different matrix transformation. When a barrel moves, you update its transformation matrix to reflect how it rotated/translated in the world.
3. Some other way.
Also, if the answer is #2, is there any easy way to do it, e.g. with abstracted code, rather than manipulating the matrices yourself?
OpenGL is not a scene graph, it's a drawing API. The most recent versions of OpenGL (OpenGL-3 core and above) reflect this by not managing matrix state at all. Indeed the answer is #2, more or less, and you actually are expected to deal with the matrix math yourself; OpenGL-3 no longer provides any primitives for that.
Usually a physics engine treats an object as a rigid body with a convex hull. The natural way to represent such a body's placement is a 4×3 matrix (a 3×3 rotation matrix plus a translation vector), so if you use a physics engine you're handed such matrices anyway.
Also, you must understand that OpenGL doesn't maintain a scene, so there is nothing to "update". You just draw your data using OpenGL, loading the matrices as they are needed.
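As a sketch, the per-object part of #2 can be as small as one uniform: the shared barrel mesh stays in its buffer, and each draw call supplies that object's matrix (the names below are made up):

    // Vertex shader: one shared mesh, one matrix per drawn object.
    uniform mat4 u_viewProjection;  // camera, shared by all objects
    uniform mat4 u_model;           // this object's rigid-body transform,
                                    // e.g. straight from the physics engine
    attribute vec4 a_position;

    void main()
    {
        gl_Position = u_viewProjection * (u_model * a_position);
    }

Between draw calls you only change u_model (e.g. via glUniformMatrix4fv), so "updating a barrel" means updating one matrix, not re-uploading vertex data.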

How do you do nonlinear shading in OpenGL?

I am developing a visualization tool in OpenGL to visualize the output of a 3D finite element modeling application. The application uses a tetrahedral mesh (but I am only viewing the exterior facets, which are triangles). The output is a scalar variable, which I want to map to a color map (I already know how to do that). The tricky part is that the value of the variable within each cell is given by a polynomial function (I think of degree 3, but that hasn't been finalized yet) of the coordinates within that cell.
In OpenGL, if I use the "smooth" shading model and create a polygon whose vertices have different values, it will automatically interpolate (linearly) between the vertex values to obtain the color at interior points. But that gives only a linear function within each cell, and I want a nonlinear function that I specify. Is there a way to do this?
(Of course, one solution would be to interpolate "manually" by drawing each cell as a composite of much smaller OpenGL polygons, small enough that the color barely changes within each of them. But I want to know whether OpenGL itself has a solution.)
You could either use a fragment (pixel) shader, if you have experience with GLSL (or the time to learn it), or render your scalar values to a texture and texture-map your triangles with it.
If you use a shader, you should be able to take the values from your triangle's vertices and perform the interpolation yourself however you see fit.
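For example, instead of letting the rasterizer interpolate a final color, you can interpolate the polynomial's input and evaluate the polynomial per fragment. A sketch, simplified to one local coordinate per cell; how the coefficients reach the shader (uniforms, a texture, ...) and the colormap() lookup are assumptions of mine:

    // Fragment shader sketch: evaluate a per-cell cubic in an interpolated
    // local coordinate, then map the resulting scalar through the color map.
    varying float v_localCoord;   // interpolated cell coordinate
    uniform vec4  u_coeffs;       // cubic coefficients c0..c3 for this cell

    vec3 colormap(float s);       // assumed defined elsewhere in the program

    void main()
    {
        float t = v_localCoord;
        // Horner evaluation of c0 + c1*t + c2*t^2 + c3*t^3
        float s = u_coeffs.x + t*(u_coeffs.y + t*(u_coeffs.z + t*u_coeffs.w));
        gl_FragColor = vec4(colormap(s), 1.0);
    }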
Edit
I found a paper dealing with that exact problem: http://mgarland.org/files/papers/perpixel.pdf