Accessing textures in PIXI.Filter fragment shader - glsl

I'm trying to pass an image texture to my fragment shader (PIXI.Filter) but it won’t draw.
const loader = PIXI.Loader.shared;
const KEY = 'testImg';

loader.add(KEY, testImg);
loader.load((loader, resources) => {
    const uniforms = {
        u_resolution: [this.app.screen.width, this.app.screen.height],
        u_texture0: resources[KEY].texture
    };
    const shader = new PIXI.Filter('', contourShader.frag, uniforms);
    this.app.stage.filters = [shader];
    console.log(resources[KEY].texture.valid); // true
});
varying vec2 vTextureCoord;
uniform sampler2D u_texture0;

void main(void) {
    vec4 color = texture2D(u_texture0, vTextureCoord);
    gl_FragColor = color;
}
uSampler works fine for accessing the previous pass, but ultimately I'll want to draw stuff to a buffer and pass that to the shader as a texture uniform.
It seems like you can only access uSampler in shaders invoked by the PIXI.Filter class, and no other textures?

Related

How to bind a vertex buffer to a uniform array in Gfx-rs?

I am trying to pass a list of uniforms to a vertex shader using gfx-rs. The data is defined as follows:
gfx_defines! {
    vertex Vertex { ... }

    constant MyConst {
        valoo: i32 = "my_val",
    }

    pipeline pipe {
        my_const: gfx::ConstantBuffer<MyConst> = "my_const",
        vbuf: gfx::VertexBuffer<Vertex> = (),
        out: gfx::RenderTarget<ColorFormat> = "Target0",
    }
}
The vertex shader is as follows:
#version 150 core

struct MyConst
{
    uint my_val;
};

in vec2 a_Pos;
in vec3 a_Color;

uniform MyConst my_const[];

out vec4 v_Color;

void main() {
    MyConst cc = my_const[0];
    v_Color = vec4(a_Color, 1.0);
    gl_Position = vec4(a_Pos, 0.0, 1.0);
}
When I introduce the first line in main(), the application crashes with the error:
thread 'main' panicked at 'called `Result::unwrap()` on an `Err` value: DescriptorInit(GlobalConstant("my_const[0].my_val", None))'
Full code
How to bind a vertex buffer to a uniform array [...]
In OpenGL a vertex buffer cannot be "bound" to a uniform array.
A vertex attribute can address a named vertex buffer object (stored in the state vector of the Vertex Array Object), but a uniform can't. See Vertex Specification.
If you want to bind some kind of buffer to some kind of uniform, then you have to use a Uniform Buffer Object, which has been available since OpenGL 3.1 (GLSL version 1.40).
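For illustration, a minimal sketch of the UBO route, written in raw OpenGL/GLSL terms (the block name MyConst mirrors the question; program, ubo and value are placeholder host-side names):

// GLSL 1.40 side:
//     layout(std140) uniform MyConst { uint my_val; };

// C/C++ side: create a buffer and attach it to the block's binding point.
GLuint ubo;
glGenBuffers(1, &ubo);
glBindBuffer(GL_UNIFORM_BUFFER, ubo);
GLuint value = 42;
glBufferData(GL_UNIFORM_BUFFER, sizeof(value), &value, GL_DYNAMIC_DRAW);

GLuint blockIndex = glGetUniformBlockIndex(program, "MyConst"); // find the uniform block
glUniformBlockBinding(program, blockIndex, 0);                  // block -> binding point 0
glBindBufferBase(GL_UNIFORM_BUFFER, 0, ubo);                    // buffer -> binding point 0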
Or you can use a Shader Storage Buffer Object, where the last element of the buffer can be an array of variable size. SSBOs have been available since OpenGL 4.3 (GLSL version 4.30), or earlier via the extension ARB_shader_storage_buffer_object.
e.g.:
layout(std430, binding = 0) buffer MyConst
{
    uint my_const[];
};
See also Using Shader Storage Buffer Objects (SSBOs)
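On the application side, the buffer that backs such a block is attached to the binding point from the layout qualifier. As a rough sketch in raw OpenGL (ssbo, data and dataSize are placeholders):

// Attach a buffer object to SSBO binding point 0, matching "binding = 0" in the shader.
GLuint ssbo;
glGenBuffers(1, &ssbo);
glBindBuffer(GL_SHADER_STORAGE_BUFFER, ssbo);
glBufferData(GL_SHADER_STORAGE_BUFFER, dataSize, data, GL_DYNAMIC_DRAW); // the trailing array may be any size
glBindBufferBase(GL_SHADER_STORAGE_BUFFER, 0, ssbo);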

libgdx: mesh color not making it to vertex shader

I've implemented my own shader (see below), which I use to render meshes in a ModelBatch. The shader program seems to be bound correctly, as I can see geometry on the screen in the correct locations, so the camera transform is also set up correctly.
public class ModelShader implements Shader {
    ShaderProgram program;
    Camera camera;
    RenderContext context;
    int u_projTrans;
    int u_worldTrans;

    public void init() {
        String vert = Gdx.files.internal("first_vertex.glsl").readString();
        String frag = Gdx.files.internal("first_fragment.glsl").readString();
        program = new ShaderProgram(vert, frag);
        if (!program.isCompiled()) {
            throw new GdxRuntimeException(program.getLog());
        }
        u_projTrans = program.getUniformLocation("u_projTrans");
        u_worldTrans = program.getUniformLocation("u_worldTrans");
    }

    public void dispose() { ... }

    public void begin(Camera camera, RenderContext context) {
        this.camera = camera;
        this.context = context;
        program.begin();
        program.setUniformMatrix(u_projTrans, camera.combined);
        context.setDepthTest(GL20.GL_LEQUAL);
        context.setCullFace(GL20.GL_BACK);
    }

    public void render(Renderable renderable) {
        program.setUniformMatrix(u_worldTrans, renderable.worldTransform);
        renderable.meshPart.render(program);
    }

    public void end() {
        program.end();
    }

    public int compareTo(Shader other) { ... }

    public boolean canRender(Renderable instance) {
        return true;
    }
}
The only problem I'm having now is that the color isn't coming through. I have some meshes I set up as below, with a diffuse color.
model = modelBuilder.createBox(1f, SIZE, SIZE, new Material(ColorAttribute.createDiffuse(Color.MAGENTA)), Usage.Position | Usage.Normal);
box = new ModelInstance(model);
I've verified that the color from the vertex shader is making it to the fragment shader, but when I set the vertex shader's varying to a_color, the scene is all red, which makes me think the color isn't coming through at all. I had valid colors with these meshes when I was using the built-in shader, so I'm doing something wrong.
attribute vec3 a_position;
attribute vec4 a_color;

uniform mat4 u_projTrans;
uniform mat4 u_worldTrans;

varying vec4 vColor;

void main() {
    vColor = a_color; // hard-coding a color will show up on screen
    gl_Position = u_projTrans * u_worldTrans * vec4(a_position, 1.0);
}
Is the a_color the wrong attribute to use here? Am I supposed to bind something manually or is that all done by the meshPart.render() call?

OpenGL Fragment Shaders - Changing a fixed color

At the moment I have a simple fragment shader which returns one color (red). If I want to change it to a different RGBA color from C code, how should I be doing that?
Is it possible to change an attribute within the fragment shader from C directly, or should I be changing a solid color attribute in my vertex shader and then passing that color to the fragment shader? I'm drawing single solid-colour rectangles - nothing special.
void main()
{
    gl_FragColor = vec4(1.0, 0.0, 0.0, 1.0);
}
If you are talking about generating the shader at runtime, then you COULD use the C string formatting functions to insert the color into the "gl_FragColor..." line.
I would not recommend doing this, since it is unnecessary work. The standard way to do this is to use uniforms, like so:
// fragment shader:
uniform vec3 my_color; // A UNIFORM

void main()
{
    gl_FragColor.rgb = my_color;
    gl_FragColor.a = 1.0; // the alpha component
}

// your rendering code:
glUseProgram(SHADER_ID);
....
GLint color_location = glGetUniformLocation(SHADER_ID, "my_color");
float color[3] = {r, g, b};
glUniform3fv(color_location, 1, color);
....
glDrawArrays(....);

Weird y-position offset using custom frag shader (Cocos2d-x)

I'm trying to mask a sprite, so I wrote a simple fragment shader that renders only the pixels that are not hidden under another texture (the mask). The problem is that my texture seems to have its y-coordinate offset after passing through the shader.
This is the init method of the sprite (GroundZone) I want to mask:
bool GroundZone::initWithSize(Size size) {
    // [...]

    // Setup the mask of the sprite
    m_mask = RenderTexture::create(textureWidth, textureHeight);
    m_mask->retain();
    m_mask->setKeepMatrix(true);
    Texture2D *maskTexture = m_mask->getSprite()->getTexture();
    maskTexture->setAliasTexParameters(); // Disable linear interpolation on the mask

    // Load the custom frag shader with a default vert shader as the sprite's program
    FileUtils *fileUtils = FileUtils::getInstance();
    string vertexSource = ccPositionTextureA8Color_vert;
    string fragmentSource = fileUtils->getStringFromFile(
        fileUtils->fullPathForFilename("CustomShader_AlphaMask_frag.fsh"));
    GLProgram *shader = new GLProgram;
    shader->initWithByteArrays(vertexSource.c_str(), fragmentSource.c_str());
    shader->bindAttribLocation(GLProgram::ATTRIBUTE_NAME_POSITION, GLProgram::VERTEX_ATTRIB_POSITION);
    shader->bindAttribLocation(GLProgram::ATTRIBUTE_NAME_TEX_COORD, GLProgram::VERTEX_ATTRIB_TEX_COORDS);
    shader->link();
    CHECK_GL_ERROR_DEBUG();
    shader->updateUniforms();
    CHECK_GL_ERROR_DEBUG();

    int maskTexUniformLoc = shader->getUniformLocationForName("u_alphaMaskTexture");
    shader->setUniformLocationWith1i(maskTexUniformLoc, 1);

    this->setShaderProgram(shader);
    shader->release();

    // [...]
}
These are the custom drawing methods for actually drawing the mask over the sprite:
You need to know that m_mask is modified externally by another class; the onDraw() method only renders it.
void GroundZone::draw(Renderer *renderer, const kmMat4 &transform, bool transformUpdated) {
    m_renderCommand.init(_globalZOrder);
    m_renderCommand.func = CC_CALLBACK_0(GroundZone::onDraw, this, transform, transformUpdated);
    renderer->addCommand(&m_renderCommand);

    Sprite::draw(renderer, transform, transformUpdated);
}

void GroundZone::onDraw(const kmMat4 &transform, bool transformUpdated) {
    GLProgram *shader = this->getShaderProgram();
    shader->use();

    glActiveTexture(GL_TEXTURE1);
    glBindTexture(GL_TEXTURE_2D, m_mask->getSprite()->getTexture()->getName());
    glActiveTexture(GL_TEXTURE0);
}
Below is the method (located in another class, GroundLayer) that modifies the mask by drawing a line from point start to point end. Both points are in Cocos2d coordinates (the origin (0,0) is at the bottom-left).
void GroundLayer::drawTunnel(Point start, Point end) {
    // To dig a line, we need first to get the texture of the zone we will be digging into. Then we get the
    // relative position of the start and end point in the zone's node space. Finally we use the custom shader to
    // draw a mask over the existing texture.
    for (auto it = _children.begin(); it != _children.end(); it++) {
        GroundZone *zone = static_cast<GroundZone *>(*it);
        Point nodeStart = zone->convertToNodeSpace(start);
        Point nodeEnd = zone->convertToNodeSpace(end);

        // Now that we have our two points converted to node space, it's easy to draw a mask that contains a line
        // going from the start point to the end point and that is then applied over the current texture.
        Size groundZoneSize = zone->getContentSize();

        RenderTexture *rt = zone->getMask();
        rt->begin(); {
            // Draw a line going from start and going to end in the texture, the line will act as a mask over the
            // existing texture
            DrawNode *line = DrawNode::create();
            line->retain();
            line->drawSegment(nodeStart, nodeEnd, 20, Color4F::RED);
            line->visit();
        } rt->end();
    }
}
Finally, here's the custom shader I wrote.
#ifdef GL_ES
precision mediump float;
#endif

varying vec2 v_texCoord;

uniform sampler2D u_texture;
uniform sampler2D u_alphaMaskTexture;

void main() {
    float maskAlpha = texture2D(u_alphaMaskTexture, v_texCoord).a;
    float texAlpha = texture2D(u_texture, v_texCoord).a;
    float blendAlpha = (1.0 - maskAlpha) * texAlpha; // Show only where mask is invisible

    vec3 texColor = texture2D(u_texture, v_texCoord).rgb;
    gl_FragColor = vec4(texColor, blendAlpha);
    return;
}
I've got a problem with the y coordinates: it seems that once it has passed through my custom shader, the sprite's texture is not in the right place.
Without custom shader (the sprite is the brown thing):
With custom shader:
What's going on here? Thanks :)
Found the solution: the vert shader should not use the MVP matrix, so I loaded ccPositionTextureColor_noMVP_vert instead of ccPositionTextureA8Color_vert.
In your vert shader (.vsh), your main method should look something like this:
attribute vec4 a_position;
attribute vec2 a_texCoord;
attribute vec4 a_color;

varying vec4 v_fragmentColor;
varying vec2 v_texCoord;

void main()
{
    // CC_PMatrix is the projection matrix, whereas CC_MVPMatrix is the model-view-projection matrix.
    // Since in 2D we are using an orthographic camera, CC_PMatrix is enough for the calculation.
    //gl_Position = CC_MVPMatrix * a_position;
    gl_Position = CC_PMatrix * a_position;

    v_fragmentColor = a_color;
    v_texCoord = a_texCoord;
}
Note that we are using CC_PMatrix instead of CC_MVPMatrix.

Combining two textures in a fragment shader

I'm working on implementing deferred shading in my game. I have rendered the diffuse textures to a render target, and I have the lighting rendered to another render target. I know both of these are fine because I can render them straight to the screen with no problems. What I want to do is combine the diffuse map and the light map in a shader to create a final image. Here is my current fragment shader, which results in a black screen.
#version 110

uniform sampler2D diffuseMap;
uniform sampler2D lightingMap;

void main()
{
    vec4 color = texture(diffuseMap, gl_TexCoord[0].st);
    vec4 lighting = texture(lightingMap, gl_TexCoord[0].st);
    vec4 finalColor = color;
    gl_FragColor = finalColor;
}
Shouldn't this result in the same thing as just drawing the diffuse map straight to the screen?
I set the sampler2D uniforms with this method:
void ShaderProgram::setUniformTexture(const std::string& name, GLint t) {
    GLint var = getUniformLocation(name);
    glUniform1i(var, t);
}

GLint ShaderProgram::getUniformLocation(const std::string& name) {
    if (mUniformValues.find(name) != mUniformValues.end()) {
        return mUniformValues[name];
    }
    GLint var = glGetUniformLocation(mProgram, name.c_str());
    mUniformValues[name] = var;
    return var;
}
EDIT: Some more information. Here is the code where I use the shader. I set the two textures and draw a blank square for the shader to use. I know for sure my render targets are working because, as I said before, I can draw them fine using the same getTextureId() as I do here.
graphics->useShader(mLightingCombinedShader);
mLightingCombinedShader->setUniformTexture("diffuseMap", mDiffuse->getTextureId());
mLightingCombinedShader->setUniformTexture("lightingMap", mLightMap->getTextureId());
graphics->drawPrimitive(mScreenRect, 0, 0);
graphics->clearShader();

void GraphicsDevice::useShader(ShaderProgram* p) {
    glUseProgram(p->getId());
}

void GraphicsDevice::clearShader() {
    glUseProgram(0);
}
And the vertex shader:
#version 110

varying vec2 texCoord;

void main()
{
    texCoord = gl_MultiTexCoord0.xy;
    gl_Position = ftransform();
}
In GLSL version 110 you should use:
texture2D(diffuseMap, gl_TexCoord[0].st); // etc.
instead of just the texture function.
And then to combine the textures, just multiply the colours together, i.e.
gl_FragColor = color * lighting;
glUniform1i(var, t);
The glUniform functions affect the program that is currently in use, i.e. the last program that glUseProgram was called on. If you want to set a uniform for a specific program, you have to use that program first.
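As a rough sketch of that ordering in raw OpenGL (programId and diffuseTextureId are placeholder names):

// glUniform* always targets the currently bound program, so bind it first.
glUseProgram(programId);

GLint diffuseLoc = glGetUniformLocation(programId, "diffuseMap");
glUniform1i(diffuseLoc, 0); // sampler uniforms take a texture unit index, not a texture object id

// Put the texture on that unit so the sampler can read from it.
glActiveTexture(GL_TEXTURE0);
glBindTexture(GL_TEXTURE_2D, diffuseTextureId);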
The problem ended up being that I didn't enable the texture coordinates for the screen rectangle I was drawing.
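With this fixed-function-style setup (the vertex shader reads gl_MultiTexCoord0), enabling them would amount to something like the following sketch (texCoords and the draw call are placeholders):

// Supply texture coordinates for the screen rectangle and enable the client state;
// otherwise gl_MultiTexCoord0 never receives per-vertex values.
GLfloat texCoords[] = { 0.0f, 0.0f,  1.0f, 0.0f,  1.0f, 1.0f,  0.0f, 1.0f };

glEnableClientState(GL_TEXTURE_COORD_ARRAY);
glTexCoordPointer(2, GL_FLOAT, 0, texCoords);

// ... set up the vertex positions and issue the draw call, e.g. glDrawArrays(GL_TRIANGLE_FAN, 0, 4) ...

glDisableClientState(GL_TEXTURE_COORD_ARRAY);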