WebGL error - INVALID_VALUE: shaderSource: string not ASCII - glsl

Hi, I am trying to use WebGL in my React app with three.js and glslify.
I tried the code below but got the error: INVALID_VALUE: shaderSource: string not ASCII.
I tried to find a way to solve it, but none of the results I found fit my problem.
How can I solve it?
glslify(`
precision mediump float;
#define LUT_FLIP_Y
varying vec2 vUv;
uniform sampler2D tDiffuse;
uniform sampler2D tLookup;
#pragma glslify: lut = require('glsl-lut')
void main () {
    gl_FragColor = texture2D(tDiffuse, vUv);
    gl_FragColor.rgb = lut(gl_FragColor, tLookup).rgb;
}
`)

Related

Fragment UV flip GLSL shader: wondering how I can replace the if statements with working math

//This has been tested and works in AGK classic
#ifdef GL_ES
precision mediump float;
precision mediump int;
#endif
#define PROCESSING_TEXTURE_SHADER
varying mediump vec2 uvVarying;
uniform sampler2D texture0;
uniform vec2 rot; //where rot is a vector passed to the shader from my AGK program
void main(void)
{
    vec2 p = uvVarying;
    if (rot.x == 1.0) { p.x = rot.x - p.x; }
    if (rot.y == 1.0) { p.y = rot.y - p.y; }
    vec3 col = texture2D(texture0, p).rgb;
    gl_FragColor = vec4(col, 1.0);
}
I want the same result without the if statements.
I.e. when rot.x == 0.0, unconditionally applying p.x = rot.x - p.x gives the wrong result (and the same for the second component, rot.y).
I'm looking for a simple math workaround that removes the if statements, for performance.
If you want to remove the if statements, and your rot.x and rot.y values can only ever be 0.0 or 1.0, then I would suggest trying the mix function:
vec2 p = mix(uvVarying, rot - uvVarying, rot);
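Since mix(a, b, t) computes a * (1.0 - t) + b * t per component, a component of rot equal to 1.0 selects the flipped coordinate and 0.0 keeps the original. Dropped into the shader from the question, the branch-free version would look like this (a sketch, untested in AGK):
#ifdef GL_ES
precision mediump float;
#endif
varying mediump vec2 uvVarying;
uniform sampler2D texture0;
uniform vec2 rot; // each component must be exactly 0.0 or 1.0
void main(void)
{
    // where rot is 1.0: p = rot - uv (the flip); where rot is 0.0: p = uv (unchanged)
    vec2 p = mix(uvVarying, rot - uvVarying, rot);
    vec3 col = texture2D(texture0, p).rgb;
    gl_FragColor = vec4(col, 1.0);
}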

Function to add mix with alpha

#version 140
in vec2 textureCoords;
out vec4 out_Color;
float alpha = 0.5;
uniform sampler2D guiTexture;
void main(void){
    out_Color = texture(guiTexture, textureCoords);
}
I am pretty (very) new to GLSL.
I basically want to add a transparency value (a float) to the code above (don't bother running it, I just had to include it). The float should become the alpha (4th) component of the out_Color variable. However, since the existing code already produces all 4 components, I am not sure how to do this. Is there a function that will let me do this?
You should take a look at pretty much any basic GLSL tutorial
out_Color = vec4(texture(guiTexture,textureCoords).rgb, alpha);
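Put together, the whole shader would read as below; here alpha is promoted to a uniform so the application can set it per draw, which is my assumption rather than part of the original code:
#version 140
in vec2 textureCoords;
out vec4 out_Color;
uniform sampler2D guiTexture;
uniform float alpha; // assumed: set from the application, e.g. 0.5
void main(void){
    // take the RGB from the texture and substitute our own alpha
    out_Color = vec4(texture(guiTexture, textureCoords).rgb, alpha);
}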

Qt 5 with QOpenGLTexture and 16 bits integers images

For a while I've been using RGB images in 32 bits floating point precision in textures with QOpenGLTexture. I had no trouble with it.
Originally those images have an unsigned short data type, and I'd like to keep this data type when sending the data to OpenGL (by the way, does that actually save any memory at all?). After many attempts, I can't get QOpenGLTexture to display the image; all I end up with is a black image.
Below is how I set up QOpenGLTexture. The parts that used floating point, which have worked so far, are commented out; the part that assumes images in 16-bit unsigned integers is right below them, uncommented. I'm using OpenGL 3.3, GLSL 330, core profile, on a MacBook Pro Retina with Iris graphics.
QOpenGLTexture *oglt = new QOpenGLTexture(QOpenGLTexture::Target2D);
oglt->setMinificationFilter(QOpenGLTexture::NearestMipMapNearest);
oglt->setMagnificationFilter(QOpenGLTexture::NearestMipMapNearest);
//oglt->setFormat(QOpenGLTexture::RGB32F); // works
oglt->setFormat(QOpenGLTexture::RGB16U);
oglt->setSize(naxis1, naxis2);
oglt->setMipLevels(10);
//oglt->allocateStorage(QOpenGLTexture::RGB, QOpenGLTexture::Float32); // works
//oglt->setData(QOpenGLTexture::RGB, QOpenGLTexture::Float32, tempImageRGB.data); // works
oglt->allocateStorage(QOpenGLTexture::RGB_Integer, QOpenGLTexture::UInt16);
oglt->setData(QOpenGLTexture::RGB_Integer, QOpenGLTexture::UInt16, tempImageRGB.data);
So, in just these lines above, is there something wrong?
My data in tempImageRGB.data are within [0, 65535] when I use UInt16. When I use QOpenGLTexture::Float32, the values in tempImageRGB.data are already normalized, so they are within [0, 1].
Then, here is my fragment shader:
#version 330 core
in mediump vec2 TexCoord;
out vec4 color;
uniform mediump sampler2D ourTexture;
void main()
{
    mediump vec3 textureColor = texture(ourTexture, TexCoord).rgb;
    color = vec4(textureColor, 1.0);
}
What am I missing?
It seems I fixed the problem by simply not using NearestMipMapNearest as the magnification filter; things work if I only use it for minification. In general that makes sense (mipmaps are never sampled during magnification, so only Nearest and Linear are meaningful there), but I don't understand why I had no problem using NearestMipMapNearest for both magnification and minification in the floating-point case.
So, the code works after simply changing 'sampler2D' to 'usampler2D' in the shader and changing setMagnificationFilter(QOpenGLTexture::NearestMipMapNearest) to setMagnificationFilter(QOpenGLTexture::Nearest). The minification filter does not need to change. In addition, I did not need to set the mip levels explicitly (it works with and without), so oglt->setMipLevels(10) can be removed.
To be clear, here is the corrected code:
QOpenGLTexture *oglt = new QOpenGLTexture(QOpenGLTexture::Target2D);
oglt->setMinificationFilter(QOpenGLTexture::NearestMipMapNearest);
oglt->setMagnificationFilter(QOpenGLTexture::Nearest);
//oglt->setFormat(QOpenGLTexture::RGB32F); // works
oglt->setFormat(QOpenGLTexture::RGB16U); // now works with integer images (unsigned)
oglt->setSize(naxis1, naxis2);
//oglt->allocateStorage(QOpenGLTexture::RGB, QOpenGLTexture::Float32); // works
//oglt->setData(QOpenGLTexture::RGB, QOpenGLTexture::Float32, tempImageRGB.data); // works
oglt->allocateStorage(QOpenGLTexture::RGB_Integer, QOpenGLTexture::UInt16); // now works with integer images (unsigned)
oglt->setData(QOpenGLTexture::RGB_Integer, QOpenGLTexture::UInt16, tempImageRGB.data); // now works with integer images (unsigned)
The fragment shader becomes simply:
#version 330 core
in mediump vec2 TexCoord;
out vec4 color;
uniform mediump usampler2D ourTexture;
void main()
{
    mediump vec3 textureColor = texture(ourTexture, TexCoord).rgb;
    color = vec4(textureColor, 1.0);
}
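One caveat worth adding: texture() on a usampler2D returns the raw unsigned integer values in [0, 65535] rather than normalized floats, so if the integer path ever renders washed out, dividing by 65535.0 restores the [0, 1] range that the Float32 path already had. A sketch of that variant:
#version 330 core
in vec2 TexCoord;
out vec4 color;
uniform usampler2D ourTexture;
void main()
{
    // texture() yields a uvec4 with raw 16-bit values; normalize to [0, 1]
    vec3 textureColor = vec3(texture(ourTexture, TexCoord).rgb) / 65535.0;
    color = vec4(textureColor, 1.0);
}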

Cocos2d-x shader is using invalid offset on texturepacker imported spriteframe

I'm trying to implement a shader for grass in cocos2d-x. The shader works OK on a texture loaded with Sprite::create(), and it looks like this:
http://i.stack.imgur.com/Rv4rd.png
The problem is that if I use Sprite::createWithSpriteFrameName() and apply the same shader, the offsets seem to be wrong when calculating the height; the grass also sways to a larger degree, as if the shader were using the height of the full texture from the .plist file:
http://i.stack.imgur.com/of6Ku.png
Here is the shader code:
VSH
attribute vec4 a_position;
attribute vec2 a_texCoord;
attribute vec4 a_color;
#ifdef GL_ES
varying lowp vec4 v_fragmentColor;
varying mediump vec2 v_texCoord;
#else
varying vec4 v_fragmentColor;
varying vec2 v_texCoord;
#endif
void main()
{
    gl_Position = CC_PMatrix * a_position;
    v_fragmentColor = a_color;
    v_texCoord = a_texCoord;
}
FSH
#ifdef GL_ES
precision mediump float;
#endif
varying vec2 v_texCoord;
uniform float speed;
uniform float bendFactor;
void main()
{
    float height = 1.0 - v_texCoord.y;
    float offset = pow(height, 2.5);
    offset *= (sin(CC_Time[1] * speed) * bendFactor);
    gl_FragColor = texture2D(CC_Texture0, fract(vec2(v_texCoord.x + offset, v_texCoord.y))).rgba;
}
If what is happening is not clear I can provide some videos. Thank you.
EDIT
Here is the code used to generate the grass sprite:
// Smaller grass
auto grass2 = Sprite::createWithSpriteFrameName("grass2.png");
grass2->setAnchorPoint(Vec2(0.5f, 0));
grass2->setPosition(Vec2(230, footer1->getContentSize().height * 0.25f));
// Apply "grass" shader
grass2->setGLProgramState(mat->getTechniqueByName("grass")->getPassByIndex(0)->getGLProgramState()->clone());
grass2->getGLProgramState()->setUniformFloat("speed", RandomHelper::random_real(0.5f, 3.0f));
grass2->getGLProgramState()->setUniformFloat("bendFactor", RandomHelper::random_real(0.1f, 0.2f));
It's hard to tell what's happening without seeing more of your code...
If I had to guess, I would say that the problem is related to trimming in TexturePacker.
If you set TrimMode=Trim, the sprite is stripped of its transparent border, which makes the sprite smaller. Cocos2d-x then renders only this smaller portion of the sprite, compensating for the difference between the original and the trimmed sprite with an offset vector.
I propose that you either try not to trim the sprite or try polygon trimming.
The problem was with TexturePacker trimming, but also with the offsets in v_texCoord.
The solution was to calculate the offsets in cocos2d-x and pass them to the shader.
I calculated the offsets using the following code:
Rect grass2Offset(
    grass2->getTextureRect().origin.x / grass2->getTexture()->getContentSize().width,
    grass2->getTextureRect().origin.y / grass2->getTexture()->getContentSize().height,
    grass2->getTextureRect().size.width / grass2->getTexture()->getContentSize().width,
    grass2->getTextureRect().size.height / grass2->getTexture()->getContentSize().height
);
Next, I pass the height offset and scale to the shader as uniforms:
grass2->getGLProgramState()->setUniformFloat("heightOffset", grass2Offset.origin.y);
grass2->getGLProgramState()->setUniformFloat("heightScale", 1 / grass2Offset.size.height);
Last, the shader is using the offset like this:
#ifdef GL_ES
precision mediump float;
#endif
varying vec2 v_texCoord;
uniform float speed;
uniform float bendFactor;
uniform float heightOffset;
uniform float heightScale;
void main()
{
    float height = 1.0 - (v_texCoord.y - heightOffset) * heightScale;
    float offset = pow(height, 2.5);
    offset *= (sin(CC_Time[1] * speed) * bendFactor);
    gl_FragColor = texture2D(CC_Texture0, fract(vec2(v_texCoord.x + offset, v_texCoord.y))).rgba;
}

no uniform with name 'u_proj' in shader

I wrote a pair of shaders to display textures in greyscale instead of full color. I used these shaders with libGDX's built-in SpriteBatch class and it worked. Then, when I tried to use them with the built-in SpriteCache class, it didn't work. I looked at the SpriteCache code and saw that it sets some different uniforms, which I tried to take into account, but I seem to have gone wrong somewhere.
The SpriteCache class in libGDX sets the following uniforms:
customShader.setUniformMatrix("u_proj", projectionMatrix);
customShader.setUniformMatrix("u_trans", transformMatrix);
customShader.setUniformMatrix("u_projTrans", combinedMatrix);
customShader.setUniformi("u_texture", 0);
This is my vertex shader:
attribute vec4 a_position;
attribute vec4 a_color;
attribute vec2 a_texCoord0;
uniform mat4 u_proj;
uniform mat4 u_projTrans;
uniform mat4 u_trans;
varying vec4 v_color;
varying vec2 v_texCoords;
void main() {
    v_color = a_color;
    v_texCoords = a_texCoord0;
    gl_Position = a_position * u_proj * u_trans;
}
and this is the fragment shader:
varying vec4 v_color;
varying vec2 v_texCoords;
uniform sampler2D u_texture;
uniform u_projTrans;
void main() {
    vec4 color = texture2D(u_texture, v_texCoords).rgba;
    float gray = (color.r + color.g + color.b) / 3.0;
    vec3 grayscale = vec3(gray + 0 * u_projTrans[0][0]);
    gl_FragColor = vec4(grayscale, color.a);
}
The error I get is:
Exception in thread "LWJGL Application" java.lang.IllegalArgumentException: no uniform with name 'u_proj' in shader
...
at com.badlogic.gdx.backends.lwjgl.LwjglApplication.mainLoop(LwjglApplication.java:206)
at com.badlogic.gdx.backends.lwjgl.LwjglApplication$1.run(LwjglApplication.java:114)
Do any of you guys know why this isn't working? There is a uniform with the name u_proj in my vertex shader.
Thank you all!
What Reto Koradi said was true: I had forgotten to put the mat4 type before u_projTrans, and that helped me.
Then, what Tenfour04 said was a huge help too! I hadn't known about:
if (!shader.isCompiled()) throw new GdxRuntimeException("Couldn't compile shader: " + shader.getLog());
What helped me most in the long run was finding out that GLSL, when compiling, does away with unused uniforms, and that unless you trick the compiler into thinking an unused uniform is actually used, the shader will compile and then crash at runtime when you set that uniform.
In libGDX there is a static "pedantic" variable that you can set. If it is set to false, the application won't crash when values are sent to uniforms the shader isn't using; they will simply be ignored. The code in my libGDX program looked something like this:
ShaderProgram.pedantic = false;
Thanks for your help, all! I hope this can help someone in the future.
Make sure that you check the success of your shader compilation/linking. Your fragment shader will not compile:
uniform u_projTrans;
This variable declaration needs a type. It should be:
uniform mat4 u_projTrans;
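For reference, the asker's fragment shader with just that declaration fixed would be as below (the dummy factor is written as 0.0 here so it does not rely on implicit int-to-float conversion):
varying vec4 v_color;
varying vec2 v_texCoords;
uniform sampler2D u_texture;
uniform mat4 u_projTrans; // the missing type added
void main() {
    vec4 color = texture2D(u_texture, v_texCoords);
    float gray = (color.r + color.g + color.b) / 3.0;
    // the 0.0 * u_projTrans term is a no-op that keeps the otherwise
    // unused uniform from being optimized away by the GLSL compiler
    vec3 grayscale = vec3(gray + 0.0 * u_projTrans[0][0]);
    gl_FragColor = vec4(grayscale, color.a);
}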
You can use the following calls to test for errors while setting up your shader programs:
glGetShaderiv(shaderId, GL_COMPILE_STATUS, ...);
glGetShaderInfoLog(shaderId, ...);
glGetProgramiv(programId, GL_LINK_STATUS, ...);
glGetProgramInfoLog(programId, ...);