I am trying to pass a value from Processing to a GLSL shader file, but Processing produces an error saying it cannot find a uniform float that I have declared in the shader. However, Processing does read my other uniform floats. Below is the relevant part of my program.
GLSL Shader File:
#ifdef GL_ES
precision mediump float;
precision mediump int;
#endif
varying vec2 surfacePosition;
const float checkSize = 20.0;
uniform float mouseX;
uniform float mouseY;
uniform float size;
uniform float time;
void main()
{
    // other code
}
Processing:
void draw()
{
    background(255);
    myShader.set("mouseX", map(mouseX, 0, width, 0, 200));
    myShader.set("mouseY", map(mouseY, 0, height, 200, 0));
    myShader.set("counter", myCounter);
    myShader.set("size", 600);
    shader(myShader);
    translate(width/2, height/2);
    shape(sphere);
    myCounter += 0.05;
}
Processing produces the following error message:
The shader doesn't have a uniform called "size"
Processing successfully reads the uniform variables mouseX, mouseY and time, but not size. Why is this?
I guess '600' is handled as an integer value. Try casting it to a float manually; that should solve your problem.
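For example, either of these forces the float overload of PShader.set() on your existing myShader object:
myShader.set("size", 600.0f);      // float literal
myShader.set("size", (float) 600); // explicit cast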
Another possible problem is that you are not actually using the variable 'size' anywhere in the shader; in that case the compiler simply optimizes the uniform away.
It is hard to pin down the problem without seeing the code inside your shader.
I have been struggling with this for two days now, and it seems like such a simple task, but I cannot find the right solution.
With p5.js I am creating a WebGL instance and sending uniforms to the shader.
Now I want to change the global variable behind the uniform and have the shader interpolate from the old value to the new value.
In the example below, after the button is clicked, it should not change the color instantly, but do small increments to get to the new color.
#ifdef GL_ES
precision mediump float;
#endif
uniform vec2 iResolution;
uniform float iTime;
uniform vec3 uSky;
void main()
{
    // Normalized pixel coordinates (from 0 to 1)
    vec2 uv = gl_FragCoord.xy/iResolution.xy;
    uv -= 0.5;
    uv.x *= iResolution.x/iResolution.y;
    vec3 sky = uSky;
    //sky = mix(sky, uSky, 0.001);
    // ^- This needs to be incremented up to 1 every time the uniform changes
    // Output to screen
    gl_FragColor = vec4(sky, 1.0);
}
https://glitch.com/~shader-prb
As Rabbid76 commented, I cannot change the uniform value from within the shader.
The solution was to lerp from the host side.
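In p5.js terms the host-side lerp looks roughly like this (a sketch with placeholder names; currentSky, targetSky and theShader are not from the original project, and theShader is assumed to have been created with createShader() or loadShader()):
let theShader;
let currentSky = [0.2, 0.4, 0.8]; // value actually sent to the shader
let targetSky  = [0.2, 0.4, 0.8]; // value the click/button switches to

function draw() {
  // move the running value a small step toward the target every frame
  for (let i = 0; i < 3; i++) {
    currentSky[i] = lerp(currentSky[i], targetSky[i], 0.05);
  }
  shader(theShader);
  theShader.setUniform('uSky', currentSky);
  theShader.setUniform('iResolution', [width, height]);
  theShader.setUniform('iTime', millis() / 1000.0);
  rect(-width / 2, -height / 2, width, height); // geometry for the fragment shader to cover (WEBGL mode is centered)
}

function mousePressed() {
  // stands in for the button click in the original sketch
  targetSky = [random(), random(), random()];
}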
I am trying to implement a Streak shader, which is described here:
http://www.chrisoat.com/papers/Oat-SteerableStreakFilter.pdf
Short explanation: it samples points along a 1D kernel in a given direction; the kernel step grows exponentially with each pass; color values are weighted by their distance from the sampled point and summed. The result is a smooth tail/smear/light-streak effect in that direction. Here is the frag shader:
precision highp float;
uniform sampler2D u_texture;
varying vec2 v_texCoord;
uniform float u_Pass;
const float kernelSize = 4.0;
const float atten = 0.95;
vec4 streak(in float pass, in vec2 texCoord, in vec2 dir, in vec2 pixelStep) {
    float kernelStep = pow(kernelSize, pass - 1.0);
    vec4 color = vec4(0.0);
    for(int i = 0; i < 4; i++) {
        float sampleNum = float(i);
        float weight = pow(atten, kernelStep * sampleNum);
        vec2 sampleTexCoord = texCoord + ((sampleNum * kernelStep) * (dir * pixelStep));
        vec4 texColor = texture2D(u_texture, sampleTexCoord) * weight;
        color += texColor;
    }
    return color;
}

void main() {
    vec2 iResolution = vec2(512.0, 512.0);
    vec2 pixelStep = vec2(1.0, 1.0) / iResolution.xy;
    vec2 dir = vec2(1.0, 0.0);
    float pass = u_Pass;
    vec4 streakColor = streak(pass, v_texCoord, dir, pixelStep);
    gl_FragColor = vec4(streakColor.rgb, 1.0);
}
It was going to be used for a starfield type of effect. Here is the implementation on ShaderToy, which works fine:
https://www.shadertoy.com/view/ll2BRG
(Note: disregard the first shader in Buffer A; it just filters out the dim colors in the input texture to emulate a star field, since AFAIK ShaderToy doesn't allow uploading custom textures.)
But when I use the same shader in my own code and render using ping-pong framebuffers, it looks different. Here is my own implementation ported over to WebGL:
https://jsfiddle.net/1b68eLdr/87755/
I basically create two 512x512 buffers, ping-pong the shader 4 times (increasing the kernel step at each iteration according to the algorithm), and render the final iteration to the screen.
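The actual code is in the fiddle, but schematically the pass loop is something like this (placeholder names; framebuffers, textures, uPassLocation and drawFullScreenQuad are not the fiddle's real identifiers):
// textures[read] initially holds the star-field input
let read = 0, write = 1;
for (let pass = 1; pass <= 4; pass++) {
    // last pass goes to the canvas, earlier passes to the offscreen target
    gl.bindFramebuffer(gl.FRAMEBUFFER, pass < 4 ? framebuffers[write] : null);
    gl.bindTexture(gl.TEXTURE_2D, textures[read]); // read the previous pass's result
    gl.uniform1f(uPassLocation, pass);             // u_Pass drives kernelStep in the shader
    drawFullScreenQuad();                          // runs the streak shader over the whole target
    [read, write] = [write, read];                 // swap roles for the next pass
}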
The problem is visible banding, and my streaks/tails seem to lose brightness much faster. (Note: the image is somewhat inaccurate; the lengths of the streaks are the same/correct, it is the color values that are wrong.)
I have been struggling with this for a while in desktop OpenGL / LWJGL, so I ported it over to WebGL/JavaScript and uploaded it to JSFiddle in the hope that someone can spot the problem. I suspect it is either the texture coordinates or the framebuffer configuration, since the shaders are exactly the same.
The reason it works on ShaderToy is that it uses a floating-point render target.
Simply use gl.FLOAT as the type of your framebuffer texture and the issue is fixed (I verified it by making that modification to your JSFiddle).
So do this in your createBackingTexture():
// Just request the extension (MUST be done).
gl.getExtension('OES_texture_float');
gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, this._width, this._height, 0, gl.RGBA, gl.FLOAT, null);
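One extra caveat from my side (not something your fiddle strictly needs): getExtension() returns null on devices without float-texture support, and in WebGL 1 linear filtering of float textures requires a separate extension, so a more defensive version (assuming the backing texture is still bound at this point) could be:
if (!gl.getExtension('OES_texture_float')) {
    throw new Error('OES_texture_float is not supported on this device');
}
if (!gl.getExtension('OES_texture_float_linear')) {
    // Float textures cannot be filtered linearly without this extension,
    // so fall back to nearest filtering on the currently bound texture.
    gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.NEAREST);
    gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.NEAREST);
}
gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, this._width, this._height, 0, gl.RGBA, gl.FLOAT, null);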
I'm trying to implement a shader for grass in cocos2d-x. The shader works fine on a texture loaded with Sprite::create(), and it looks like this:
http://i.stack.imgur.com/Rv4rd.png
The problem is that if I use Sprite::createWithSpriteFrameName() and apply the same shader, the offsets look wrong when calculating the height, and the grass bends to a much larger degree, as if the shader were using the height of the full texture from the .plist file:
http://i.stack.imgur.com/of6Ku.png
Here is the shader code:
VSH
attribute vec4 a_position;
attribute vec2 a_texCoord;
attribute vec4 a_color;
#ifdef GL_ES
varying lowp vec4 v_fragmentColor;
varying mediump vec2 v_texCoord;
#else
varying vec4 v_fragmentColor;
varying vec2 v_texCoord;
#endif
void main()
{
    gl_Position = CC_PMatrix * a_position;
    v_fragmentColor = a_color;
    v_texCoord = a_texCoord;
}
FSH
#ifdef GL_ES
precision mediump float;
#endif
varying vec2 v_texCoord;
uniform float speed;
uniform float bendFactor;
void main()
{
    float height = 1.0 - v_texCoord.y;
    float offset = pow(height, 2.5);
    offset *= (sin(CC_Time[1] * speed) * bendFactor);
    gl_FragColor = texture2D(CC_Texture0, fract(vec2(v_texCoord.x + offset, v_texCoord.y))).rgba;
}
If what is happening is not clear, I can provide some videos. Thank you.
EDIT
Here is the code used to generate the grass sprite:
// Smaller grass
auto grass2 = Sprite::createWithSpriteFrameName("grass2.png");
grass2->setAnchorPoint(Vec2(0.5f, 0));
grass2->setPosition(Vec2(230, footer1->getContentSize().height * 0.25f));
// Apply "grass" shader
grass2->setGLProgramState(mat->getTechniqueByName("grass")->getPassByIndex(0)->getGLProgramState()->clone());
grass2->getGLProgramState()->setUniformFloat("speed", RandomHelper::random_real(0.5f, 3.0f));
grass2->getGLProgramState()->setUniformFloat("bendFactor", RandomHelper::random_real(0.1f, 0.2f));
It's hard to tell what's happening without seeing more of your code...
If I had to guess, I would say the problem is related to trimming in TexturePacker.
If you set TrimMode=Trim, the transparent border is stripped from the sprite. This makes the sprite smaller, and cocos2d-x then renders only that smaller portion, compensating for the difference between the original and the trimmed sprite with an offset vector.
I suggest you either disable trimming for the sprite or try polygon trimming.
The problem was with TexturePacker trimming, but also with the offsets in v_texCoord.
The solution was to calculate the offsets in cocos2d-x and pass them to the shader.
I calculated the offsets using the following code:
Rect grass2Offset(
    grass2->getTextureRect().origin.x / grass2->getTexture()->getContentSize().width,
    grass2->getTextureRect().origin.y / grass2->getTexture()->getContentSize().height,
    grass2->getTextureRect().size.width / grass2->getTexture()->getContentSize().width,
    grass2->getTextureRect().size.height / grass2->getTexture()->getContentSize().height
);
Next, I pass the height offset and scale to the shader as uniforms:
grass2->getGLProgramState()->setUniformFloat("heightOffset", grass2Offset.origin.y);
grass2->getGLProgramState()->setUniformFloat("heightScale", 1 / grass2Offset.size.height);
Finally, the shader uses the offset like this:
#ifdef GL_ES
precision mediump float;
#endif
varying vec2 v_texCoord;
uniform float speed;
uniform float bendFactor;
uniform float heightOffset;
uniform float heightScale;
void main()
{
    float height = 1.0 - (v_texCoord.y - heightOffset) * heightScale;
    float offset = pow(height, 2.5);
    offset *= (sin(CC_Time[1] * speed) * bendFactor);
    gl_FragColor = texture2D(CC_Texture0, fract(vec2(v_texCoord.x + offset, v_texCoord.y))).rgba;
}
Using OpenGL 3.2 / GLSL 150 with OpenFrameworks v0.8.3
I'm trying to implement shaders in my programs. My program successfully loads the correct .frag and .vert files, but I get glitched visuals and this error:
[ error ] ofShader: setupShaderFromSource(): GL_FRAGMENT_SHADER shader failed to compile
[ error ] ofShader: GL_FRAGMENT_SHADER shader reports:
ERROR: 0:39: Use of undeclared identifier 'gl_FragColor'
I read this SO answer explaining that gl_FragColor is not supported in GLSL 300, but I'm (pretty sure I'm) not using that version. Regardless, when I replace gl_FragColor with an outputColor variable, the screen just appears black with no error.
Why isn't my shader rendering as expected? I have a feeling it is either my .vert file, a fundamental misunderstanding of how shapes are drawn from within shaders, or a versioning problem.
My simplified program:
.h
#pragma once
#include "ofMain.h" //includes all openGL libs/reqs
#include "GL/glew.h"
#include "ofxGLSLSandbox.h" //addon lib for runtime shader editing capability
class ofApp : public ofBaseApp{
public:
    void setup();
    void draw();

    ofxGLSLSandbox *glslSandbox; // an object from the addon lib
};
.cpp
#include "ofApp.h"
//--------------------------------------------------------------
void ofApp::setup(){
    // create new ofxGLSLSandbox instance
    glslSandbox = new ofxGLSLSandbox();
    // setup shader width and height
    glslSandbox->setResolution(800, 480);
    // load fragment shader file
    glslSandbox->loadFile("shader"); // shorthand for loading both .frag and .vert, as they are named "shader.frag" and "shader.vert" and placed in the correct dir
}
//--------------------------------------------------------------
void ofApp::draw(){
    glslSandbox->draw();
}
.vert (just meant to be a pass-through... if that makes sense)
#version 150
uniform mat4 modelViewProjectionMatrix;
in vec4 position;
void main(){
    gl_Position = modelViewProjectionMatrix * position;
}
.frag (see 3rd interactive code block down this page for intended result)
#version 150
#ifdef GL_ES
precision mediump float;
#endif
out vec4 outputColor;
uniform vec2 u_resolution;
uniform vec2 u_mouse;
uniform float u_time;
float circle(in vec2 _st, in float _radius){
    vec2 dist = _st-vec2(0.5);
    return 1.-smoothstep(_radius-(_radius*0.01),
                         _radius+(_radius*0.01),
                         dot(dist,dist)*4.0);
}

void main(){
    vec2 st = gl_FragCoord.xy/u_resolution.xy;
    vec3 color = vec3(circle(st,0.9));
    outputColor = vec4( color, 1.0 );
}
I don't see the uniform variables getting set anywhere, which means they take their default values. In particular, your matrix is all zeroes, so every vertex collapses to the origin and nothing visible is drawn.
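As a rough sketch, with a plain ofShader member instead of the ofxGLSLSandbox addon (whose uniform API I haven't checked), the uniforms would be uploaded every frame like this; under the programmable renderer, begin() also takes care of modelViewProjectionMatrix:
// in ofApp::setup(), assuming an ofShader member called "shader"
shader.load("shader"); // loads bin/data/shader.vert and bin/data/shader.frag

// in ofApp::draw()
shader.begin(); // the programmable renderer uploads modelViewProjectionMatrix here
shader.setUniform2f("u_resolution", ofGetWidth(), ofGetHeight());
shader.setUniform2f("u_mouse", ofGetMouseX(), ofGetMouseY());
shader.setUniform1f("u_time", ofGetElapsedTimef());
ofRect(0, 0, ofGetWidth(), ofGetHeight()); // geometry for the fragment shader to cover
shader.end();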
I'm drawing a simple textured quad (2 triangles) using a one-dimensional texture that holds 512 values ranging from 0 to 1. I'm using RGBA_32F on an NVIDIA GeForce GT 750M, with GL_LINEAR interpolation and GL_CLAMP_TO_EDGE.
I draw the quad using the following shader:
varying vec2 v_texcoord;
uniform sampler1D u_texture;
void main()
{
    float x = v_texcoord.x;
    float v = 512.0 * abs(texture1D(u_texture,x).r - x);
    gl_FragColor = vec4(v,v,v,1);
}
Basically, I'm displaying the difference between the texture values and the fragment coordinates. I was hoping to get a black quad (no difference), but here is what I get instead:
To narrow down the problem, I tried generating a one-dimensional texture with only two values (0 and 1) and displaying it using:
varying vec2 v_texcoord;
uniform sampler1D u_texture;
void main()
{
    float x = v_texcoord.x;
    float v = texture1D(u_texture,x).r;
    gl_FragColor = vec4(v,v,v,1);
}
and then I get:
Obviously this is not a linear interpolation from 0 to 1. The result seems to be split into three areas: black, interpolated, and white. I tried different wrapping modes without success (but with different results). Any idea what I'm doing wrong here?
After searching a bit more, it seems the texture coordinate needs a small adjustment depending on the texture size:
varying vec2 v_texcoord;
uniform sampler1D u_texture;
uniform float u_texture_shape;
void main()
{
    float epsilon = 1.0/u_texture_shape;
    float x = epsilon/2.0 + (1.0-epsilon)*v_texcoord.x;
    float v = 512.0*abs(texture1D(u_texture,x).r - v_texcoord.x);
    gl_FragColor = vec4(v,v,v,1);
}
I guess this is related to the wrapping mode, but I could not find information on how wrapping is handled at the GPU level.
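For reference, this is standard GL_LINEAR behaviour rather than a wrapping quirk: with N texels, the value stored in texel i sits at coordinate (i + 0.5)/N, and with GL_CLAMP_TO_EDGE anything outside the first and last texel centers samples a constant edge value, which is exactly what produced the flat black and white bands in the two-texel test. The epsilon adjustment above remaps the coordinate so that 0 and 1 land on the first and last texel centers, i.e. x = 0.5/N + (1 - 1/N) * t.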