Texture mapping of the cessna.osg via GLSL shaders - opengl

What I am trying to do should be pretty simple, I guess, but I am not getting what I want (maybe because I am something of a newbie with these tools). I just want to map a texture image (typical bricks, for instance) onto the existing cessna.osg via vertex and fragment shaders. For the task I chose a very simple vertex/fragment shader pair which, in my limited understanding, should work, but it doesn't.
Here is the code:
#include <osg/PositionAttitudeTransform>
#include <osgDB/ReadFile>
#include <osgDB/FileUtils>
#include <osgGA/TrackballManipulator>
#include <osg/Program>
#include <osg/Shader>
#include <osgViewer/Viewer>
#include <iostream>
#include <osg/Texture2D>
static const char* VertexShader = {
"varying vec2 texCoords;\n"
"void main()\n"
"{\n"
" texCoords = gl_MultiTexCoord0.st;\n"
" gl_Position = ftransform();\n"
"}\n"
};
static const char* FragmentShader = {
"varying vec2 texCoords;\n"
"uniform sampler2D tex;\n"
"void main()\n"
"{\n"
" gl_FragColor = texture2D(tex, texCoords);\n"
"}\n"
};
int main(int argc, char **argv)
{
    // Assemble the scene graph
    osg::ref_ptr<osg::Node> model = osgDB::readNodeFile("cessna.osg");
    if (!model)
        return 1; // bail out if the model could not be loaded

    // Attach the shader program to the model's state set
    osg::ref_ptr<osg::StateSet> ss = model->getOrCreateStateSet();
    osg::ref_ptr<osg::Program> program(new osg::Program);
    osg::ref_ptr<osg::Shader> vShader(new osg::Shader(osg::Shader::VERTEX, VertexShader));
    osg::ref_ptr<osg::Shader> fShader(new osg::Shader(osg::Shader::FRAGMENT, FragmentShader));
    program->addShader(vShader.get());
    program->addShader(fShader.get());
    ss->setAttributeAndModes(program.get());

    // Set up texture unit 0 with the brick image
    osg::ref_ptr<osg::Texture2D> bodyTexture = new osg::Texture2D;
    bodyTexture->setImage(osgDB::readImageFile("Images/Brick-Norman-Brown.TGA"));
    bodyTexture->setWrap(osg::Texture::WRAP_S, osg::Texture::REPEAT);
    bodyTexture->setWrap(osg::Texture::WRAP_T, osg::Texture::REPEAT);
    bodyTexture->setFilter(osg::Texture::MIN_FILTER, osg::Texture::LINEAR);
    bodyTexture->setFilter(osg::Texture::MAG_FILTER, osg::Texture::LINEAR);
    ss->setTextureAttributeAndModes(0, bodyTexture.get());
    ss->addUniform(new osg::Uniform("tex", 0)); // sampler "tex" reads unit 0

    // View the scene
    osgViewer::Viewer viewer;
    viewer.setSceneData(model.get());
    viewer.setUpViewInWindow(0, 0, 512, 384);
    return viewer.run();
}
What I get is the cessna model in a single flat color (the dominant color of the texture image), but never an actually textured cessna.
My apologies if the question is very stupid, but I would really appreciate any hint that helps me keep moving.
Thanks

Your code looks good, but the cessna.osg model you're using does not have any UV (texture) coordinates.
You can inspect the text version of the model (cessna.osgt) in a text editor and you'll see only vertex coordinates, no texture coordinates.
Run your test with any other model that has texture coordinates; the osg-data repository contains several, such as skydome.osgt.
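If you want to stay with cessna.osg anyway, one workaround (a sketch, not the only option) is to stop relying on gl_MultiTexCoord0 and derive UVs in the vertex shader from the object-space position, e.g. a simple planar projection; the 0.1 tiling factor below is an arbitrary choice of mine:
static const char* PlanarVertexShader = {
    "varying vec2 texCoords;\n"
    "void main()\n"
    "{\n"
    "    // planar projection: reuse the model's x/y as UVs\n"
    "    texCoords = gl_Vertex.xy * 0.1;\n"
    "    gl_Position = ftransform();\n"
    "}\n"
};
This textures any mesh regardless of whether UVs are stored in the file, at the cost of visible stretching on faces perpendicular to the projection plane.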

Related

Qt3D/C++ - How to use the frame graph to achieve outlines?

I made a simple viewer that lets you import an .obj file. What I want to achieve is: when the user selects the model, shaders draw the model's outline in a different color.
I represent the .obj file as a QEntity with a custom QMaterial, a custom QPickerObject, and a QMesh.
With my custom QMaterial I implemented simple flat shading (coloring based on the face normal):
// My custom QMaterial:
explicit CustomizedMaterial(Qt3DCore::QNode *parent = nullptr) : QMaterial(parent)
{
    // Create effect, technique, render pass and shader
    Qt3DRender::QEffect *effect = new Qt3DRender::QEffect();
    Qt3DRender::QTechnique *gl3Technique = new Qt3DRender::QTechnique();
    Qt3DRender::QRenderPass *gl3Pass = new Qt3DRender::QRenderPass();
    Qt3DRender::QShaderProgram *glShader = new Qt3DRender::QShaderProgram();
    QByteArray ver(
        "#version 330\n"
        "out vec3 vViewPos;\n"
        "in vec3 vertexPosition;\n"
        "in vec3 vertexNormal;\n"
        "uniform mat4 modelView;\n"
        "uniform mat3 modelViewNormal;\n"
        "uniform mat4 mvp;\n"
        "void main()\n"
        "{\n"
        "    vec4 pos = vec4(vertexPosition, 1.0);\n"
        "    vec4 mpos = modelView * pos;\n"
        "    gl_Position = mvp * vec4(vertexPosition, 1.0);\n"
        "    vViewPos = -mpos.xyz;\n"
        "}\n");
    QByteArray frag(
        "#version 330\n"
        "vec3 normals(vec3 pos)\n"
        "{\n"
        "    vec3 fdx = dFdx(pos);\n"
        "    vec3 fdy = dFdy(pos);\n"
        "    return normalize(cross(fdx, fdy));\n"
        "}\n"
        "in vec3 vViewPos;\n"
        "out vec4 fragColor;\n"
        "void main()\n"
        "{\n"
        "    vec3 normal = normals(vViewPos);\n"
        "    vec3 gray = vec3(0.9, 0.9, 0.9);\n"
        "    float theta = dot(normal, vec3(0, 0, 1)) / length(normal);\n"
        "    fragColor = vec4(gray * theta, 1.0);\n"
        "}\n");
    glShader->setVertexShaderCode(ver);
    glShader->setFragmentShaderCode(frag);
    // Set the shader on the render pass
    gl3Pass->setShaderProgram(glShader);
    // Filter key
    Qt3DRender::QFilterKey *m_filterKey = new Qt3DRender::QFilterKey(this);
    m_filterKey->setName(QStringLiteral("renderingStyle"));
    m_filterKey->setValue(QStringLiteral("forward"));
    // Add the pass to the technique
    gl3Technique->addRenderPass(gl3Pass);
    // Set the targeted GL version for the technique
    gl3Technique->graphicsApiFilter()->setApi(Qt3DRender::QGraphicsApiFilter::OpenGL);
    gl3Technique->graphicsApiFilter()->setMajorVersion(3);
    gl3Technique->graphicsApiFilter()->setMinorVersion(2);
    gl3Technique->graphicsApiFilter()->setProfile(Qt3DRender::QGraphicsApiFilter::CoreProfile);
    // Add the filter key
    gl3Technique->addFilterKey(m_filterKey);
    // Add the technique to the effect
    effect->addTechnique(gl3Technique);
    // Set the effect on the material
    setEffect(effect);
}
From my searching I think the easiest way is a two-render-pass technique, but sadly there is no documentation or example for Qt3D/C++ showing how to do it. Can someone help?
Thanks in advance.
Yes, unfortunately there is not much information in this regard. It looks like Qt put more effort and resources into QML than into C++ (the examples run about 5 to 1 in favor of QML).
OK, I managed to make a custom shader work. I played with your code and changed just a few things:
I moved the configuration of the QTechnique to immediately after its creation, and I changed the order in which the technique is initialized:
gl3Technique->graphicsApiFilter()->setProfile(Qt3DRender::QGraphicsApiFilter::CoreProfile);
gl3Technique->graphicsApiFilter()->setApi(Qt3DRender::QGraphicsApiFilter::OpenGL);
gl3Technique->graphicsApiFilter()->setMajorVersion(3);
gl3Technique->graphicsApiFilter()->setMinorVersion(1);
I added a QFilterKey to the technique:
Qt3DRender::QFilterKey *filterkey = new Qt3DRender::QFilterKey(this);
filterkey->setName(QStringLiteral("renderingStyle"));
filterkey->setValue(QStringLiteral("forward"));
and I loaded the shaders from resources the same way as shown in the example QML code:
glShader->setVertexShaderCode(Qt3DRender::QShaderProgram::loadSource(QUrl(QStringLiteral("qrc:/MyShader/simpleColor.vert"))));
glShader->setFragmentShaderCode(Qt3DRender::QShaderProgram::loadSource(QUrl(QStringLiteral("qrc:/MyShader/simpleColor.frag"))));
I did not investigate which of those changes is the decisive one, but it is working.
After that I found another confirmation of the same approach in this post:
How to make color of a section be different on a 3D object
from @AdaRaider and @user3405291
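For the original two-pass outline question, a second QRenderPass in the same technique should do it. Below is a minimal, untested sketch under the same setup as above; the 0.02 offset, the orange color, and the pass ordering are my assumptions. Pass 1 draws the silhouette by pushing vertices out along their normals and culling front faces, pass 2 is the existing shaded pass drawn on top:
// outline pass: enlarged model in a flat color
Qt3DRender::QRenderPass *outlinePass = new Qt3DRender::QRenderPass();
Qt3DRender::QShaderProgram *outlineShader = new Qt3DRender::QShaderProgram();
outlineShader->setVertexShaderCode(QByteArray(
    "#version 330\n"
    "in vec3 vertexPosition;\n"
    "in vec3 vertexNormal;\n"
    "uniform mat4 mvp;\n"
    "void main()\n"
    "{\n"
    "    // fatten the silhouette by offsetting along the normal\n"
    "    gl_Position = mvp * vec4(vertexPosition + vertexNormal * 0.02, 1.0);\n"
    "}\n"));
outlineShader->setFragmentShaderCode(QByteArray(
    "#version 330\n"
    "out vec4 fragColor;\n"
    "void main() { fragColor = vec4(1.0, 0.5, 0.0, 1.0); }\n"));
outlinePass->setShaderProgram(outlineShader);
// cull front faces so only the enlarged back-facing shell remains visible
Qt3DRender::QCullFace *cullFront = new Qt3DRender::QCullFace();
cullFront->setMode(Qt3DRender::QCullFace::Front);
outlinePass->addRenderState(cullFront);
// add the outline pass before the normal shading pass
gl3Technique->addRenderPass(outlinePass);
gl3Technique->addRenderPass(gl3Pass);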

How to draw red lines when a fragment shader is used for the texture?

I am writing a simple video player using OpenGL. I used Qt and followed its basic texture examples.
The vertex and fragment shaders are here:
QOpenGLShader *vshader = new QOpenGLShader(QOpenGLShader::Vertex, this);
const char *vsrc =
"attribute highp vec4 vertex;\n"
"attribute mediump vec4 texCoord;\n"
"varying mediump vec4 texc;\n"
"uniform mediump mat4 matrix;\n"
"void main(void)\n"
"{\n"
" gl_Position = matrix * vertex;\n"
" texc = texCoord;\n"
"}\n";
vshader->compileSourceCode(vsrc);
QOpenGLShader *fshader = new QOpenGLShader(QOpenGLShader::Fragment, this);
const char *fsrc =
"uniform sampler2D texture;\n"
"varying mediump vec4 texc;\n"
"void main(void)\n"
"{\n"
" gl_FragColor = texture2D(texture, texc.st);\n"
"}\n";
fshader->compileSourceCode(fsrc);
And I did this to display an image:
glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, texture_cv.cols, texture_cv.rows, GL_RGB, GL_UNSIGNED_BYTE, texture_cv.data);
//then draw a quad
...
Now, after this, how can I draw several red lines on the screen? Since I am using the fragment shader (I am very new to shaders), I cannot turn off the texture.
By far the easiest solution is to use a different shader program for drawing your red lines. Since it just draws a solid color, it will be very easy. The fragment shader could be something like:
void main()
{
    gl_FragColor = vec4(1.0, 0.0, 0.0, 1.0);
}
The vertex shader will be very similar to what you have, except that it does not need to produce texture coordinates. You might even be able to use the existing vertex shader.
It is very common to use multiple shader programs for rendering. You have a shader program for each different type of rendering, and switch between them with glUseProgram().
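For the Qt wrappers used in the question, that could look like the following sketch (the names lineProgram and lineVertexCount, and the line vertex setup, are mine):
// a second, color-only program for the red lines
QOpenGLShaderProgram *lineProgram = new QOpenGLShaderProgram(this);
lineProgram->addShaderFromSourceCode(QOpenGLShader::Vertex,
    "attribute highp vec4 vertex;\n"
    "uniform mediump mat4 matrix;\n"
    "void main(void) { gl_Position = matrix * vertex; }\n");
lineProgram->addShaderFromSourceCode(QOpenGLShader::Fragment,
    "void main(void) { gl_FragColor = vec4(1.0, 0.0, 0.0, 1.0); }\n");
lineProgram->link();
// per frame: draw the textured quad with your existing texture program first,
// then switch programs and draw the lines on top
lineProgram->bind();                    // bind() calls glUseProgram() internally
lineProgram->setUniformValue("matrix", matrix);
// ... point the "vertex" attribute at your line endpoints, then:
// glDrawArrays(GL_LINES, 0, lineVertexCount);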

Libgdx shader, render and draw confusion

In my libGDX Scene2D stage I am trying to make an actor flash white.
With OpenGL ES 2.0 I understood I needed to use a shader to achieve this, and I have a problem implementing it.
My goal is to color an actor solid white on a specific game event. The problem is that on this event everything in the stage becomes white (all actors and the background) instead of just the selected actor's texture.
My World class is where the stage is created. It is rendered like this:
public void render() {
    Gdx.gl.glClearColor(1, 1, 1, 1);
    Gdx.gl.glClear(GL20.GL_COLOR_BUFFER_BIT);
    update();
    stage.act(Gdx.graphics.getDeltaTime());
    stage.draw();
}
In the stage I use actors defined in a MyActor class. This class has its own
draw method, where I am trying to color the actor solid white with the shader. Here is the draw method:
public void draw(Batch batch, float alpha) {
    if (flashing) {
        batch.setShader(shader);
    }
    batch.flush();
    Color color = getColor();
    batch.setColor(color.r, color.g, color.b, color.a * alpha);
    batch.draw(tr, getX(), getY(), side/2, side/2, side, side, getScaleX(), getScaleY(), 0);
}
Obviously I am doing something wrong with the batch: instead of coloring this one actor white, it colors everything, and it even stays white after the boolean flashing becomes false.
Here is how I have set up my shader, in the actor class MyActor:
ShaderProgram shader;
String vertexShader =
    "attribute vec4 a_position;\n" +
    "attribute vec4 a_color;\n" +
    "attribute vec2 a_texCoord0;\n" +
    "uniform mat4 u_projTrans;\n" +
    "varying vec4 v_color;\n" +
    "varying vec2 v_texCoords;\n" +
    "void main() {\n" +
    "    v_color = a_color;\n" +
    "    v_texCoords = a_texCoord0;\n" +
    "    gl_Position = u_projTrans * a_position;\n" +
    "}";
String fragmentShader =
    "#ifdef GL_ES\n" +
    "precision mediump float;\n" +
    "#endif\n" +
    "varying vec4 v_color;\n" +
    "varying vec2 v_texCoords;\n" +
    "uniform sampler2D u_texture;\n" +
    "uniform float grayscale;\n" +
    "void main()\n" +
    "{\n" +
    "    vec4 texColor = texture2D(u_texture, v_texCoords);\n" +
    "    float gray = dot(texColor.rgb, vec3(5, 5, 5));\n" +
    "    texColor.rgb = mix(vec3(gray), texColor.rgb, grayscale);\n" +
    "    gl_FragColor = v_color * texColor;\n" +
    "}";
Then it is initialized with:
shader = new ShaderProgram(vertexShader, fragmentShader);
What is it that I am doing wrong?
I have to admit I probably don't understand correctly how the batch of MyActor, the drawing method and the rendering methods work together!
Thanks for your time
The problem here is that you never set the shader back to the default shader, so your SpriteBatch always uses the "solid white" shader.
As the Javadoc of setShader(ShaderProgram shader) says, you can reset the shader by calling setShader(null).
Another possibility would be to use a second texture for the solid white and then simply say:
Texture t = flashing ? whiteTexture : defaultTexture;
and draw the texture t.
Just a suggestion: do not use batch.flush() unless it is absolutely necessary, because flush() issues a draw call to the GPU, which costs some performance.
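Putting both hints together, a corrected draw method could look like this sketch (setShader(null) restores SpriteBatch's default shader, and setShader() already flushes internally, so the explicit flush() can go):
public void draw(Batch batch, float alpha) {
    if (flashing) {
        batch.setShader(shader);       // switch to the flash shader
    }
    Color color = getColor();
    batch.setColor(color.r, color.g, color.b, color.a * alpha);
    batch.draw(tr, getX(), getY(), side/2, side/2, side, side,
               getScaleX(), getScaleY(), 0);
    if (flashing) {
        batch.setShader(null);         // back to the default shader
    }
}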

OpenGL Rotation with vertices not working

I am trying to implement a rotation on my vertices with shaders. Here is the code of my shader:
"#version 150 core\n"
"in vec2 position;"
"in vec3 color;"
"out vec3 Color;"
"uniform mat4 rotation;"
"void main() {"
" Color = color;"
" gl_Position = rotation*vec4(position, 0.0, 2.0);"
"}";
I am using it with a quaternion. Here is the code that produces the matrix and uploads it to the shader:
glm::quat rotation(x,0.0,0.0,0.5);
x+=0.001;
ctm = glm::mat4_cast(rotation);
GLint matrix_loc;
// look up the location of the "rotation" uniform in the shader
matrix_loc = glGetUniformLocation(shaderProgram, "rotation");
if (matrix_loc == -1)
    std::cout << "uniform location for rotation not found: " << matrix_loc << std::endl;
// upload the local matrix to the shader
glUniformMatrix4fv(matrix_loc, 1, GL_FALSE, glm::value_ptr(ctm));
But as it rotates, the object gets bigger and bigger. I know I don't need to call glGetUniformLocation on every iteration of my loop; this is just test code. As far as I know, glUniformMatrix4fv is what should make the rotation happen. After these calls I simply draw my vertex array.
Given that it still draws, rotation in the shader is probably a valid matrix. If it were an issue with the uniform, it would probably be all zeroes and nothing would draw.
As @genpfault says, ctm needs to be initialized:
ctm = glm::mat4_cast(rotation);
See: Converting glm quaternion to rotation matrix and using it with opengl
Also, shouldn't the 2.0 in vec4(position, 0.0, 2.0) be a 1.0? In clip space the w component divides x, y and z, so w = 2.0 shrinks everything by half. More importantly for the growing object: glm::mat4_cast assumes a unit quaternion, and since x increases every frame the quaternion's norm keeps growing, which bakes an ever-increasing scale into the matrix. Normalize the quaternion before converting it.
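A sketch of the combined fix, reusing the question's variables (and assuming <glm/gtc/quaternion.hpp> is included so glm::normalize works on quaternions):
glm::quat rotation(x, 0.0f, 0.0f, 0.5f);
x += 0.001f;
// a unit quaternion makes mat4_cast produce a pure rotation, with no scale
ctm = glm::mat4_cast(glm::normalize(rotation));
glUniformMatrix4fv(matrix_loc, 1, GL_FALSE, glm::value_ptr(ctm));
together with gl_Position = rotation * vec4(position, 0.0, 1.0); in the shader.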

OpenGL luminance to color mapping?

I was wondering if there is a way to process OpenGL texture buffers so that a buffer of grayscale values is converted to RGB values on the fly through a formula of my choosing.
I already have a function like the following, which works well but outputs a 3-component color vector.
rgb convertToColor(float value);
I am very new to OpenGL and was wondering what kind of shader I should use and where it should go. My program currently cycles frames like this:
program1.bind();
program1.setUniformValue("texture", 0);
program1.enableAttributeArray(vertexAttr1);
program1.enableAttributeArray(vertexTexr1);
program1.setAttributeArray(vertexAttr1, vertices.constData());
program1.setAttributeArray(vertexTexr1, texCoords.constData());
glBindTexture(GL_TEXTURE_2D, textures[0]);
glTexSubImage2D(GL_TEXTURE_2D,0,0,0,widthGL,heightGL, GL_LUMINANCE,GL_UNSIGNED_BYTE, &displayBuff[displayStart]);
glDrawArrays(GL_TRIANGLES, 0, vertices.size());
glBindTexture(GL_TEXTURE_2D,0);
//...
Shader
QGLShader *fshader1 = new QGLShader(QGLShader::Fragment, this);
const char *fsrc1 =
"uniform sampler2D texture;\n"
"varying mediump vec4 texc;\n"
"void main(void)\n"
"{\n"
" gl_FragColor = texture2D(texture, texc.st);\n"
"}\n";
I am trying to recreate effects like MATLAB's imagesc, which maps scalar data to colors through a colormap.
Something like this would work:
"uniform sampler2D texture;\n"
"uniform sampler1D mappingTexture;\n"
"varying mediump vec4 texc;\n"
"void main(void)\n"
"{\n"
" gl_FragColor = texture1D(mappingTexture, texture2D(texture, texc.st).s);\n"
"}\n";
where mappingTexture is a 1D texture that maps grayscale to color.
Of course, you could also write a function that calculates the RGB color from the grayscale value, but (depending on your hardware) it might be faster to just do a texture lookup.
It is not clear to me how to bind two textures to an OpenGL program at once.
Bind each texture to its own texture unit, and point each sampler uniform at the index of that unit.
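A sketch of that wiring, reusing program1 and textures[0] from the question; colorMapTex is my name for the new texture, and colorMapData (a 256-entry RGB byte array) is a hypothetical placeholder:
// upload the 1D colormap on texture unit 1
GLuint colorMapTex;
glGenTextures(1, &colorMapTex);
glActiveTexture(GL_TEXTURE1);
glBindTexture(GL_TEXTURE_1D, colorMapTex);
glTexImage1D(GL_TEXTURE_1D, 0, GL_RGB, 256, 0, GL_RGB, GL_UNSIGNED_BYTE, colorMapData);
glTexParameteri(GL_TEXTURE_1D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_1D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
// the grayscale frame stays on unit 0
glActiveTexture(GL_TEXTURE0);
glBindTexture(GL_TEXTURE_2D, textures[0]);
// each sampler uniform receives the *unit index* it should read from
program1.setUniformValue("texture", 0);        // sampler2D -> unit 0
program1.setUniformValue("mappingTexture", 1); // sampler1D -> unit 1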
It looks like you already have a program loaded and set up to read the texture (program1 in your code). Assuming the vertex shader already passes the fragment shader texture coordinates for the lookup ("texcoord" in the program below), you should be able to change the fragment shader to something like this:
uniform sampler2D texture;  // this is the grayscale texture
varying vec2 texcoord;      // passed from your vertex shader
void main()
{
    float luminance = texture2D(texture, texcoord).r; // read the luminance
    gl_FragColor = convertToColor(luminance);         // run your function
}
This reads the luminance texture and calls your function, which converts a luminance value to a color. If your function only returns a 3-component RGB vector, you can change the last line to:
gl_FragColor = vec4(convertToColor(luminance), 1.0);