I just made an account on shadertoy.com today; I am new to OpenGL and only know a bit of JavaScript and Processing. I am trying to use a bitwise XOR on two integers and I get the following error:
'^' : bit-wise operator supported in GLSL ES 3.00 and above only.
I googled and found this. I inserted #version 300 es at the top of the "Image" tab thingy and was rewarded for my cleverness with the following error:
'version' : #version directive must occur before anything else, except for comments and white space.
So, my real question is where should I place the #version directive in Shadertoy?
Post scriptum: my experiment is here.
That's because ShaderToy (probably) inserts the version directive for you, and GLSL requires it to appear at the very start of the shader string. Keep in mind that what you see in their GLSL editor is not the whole picture of what is submitted to the GLSL compiler: they add more code under the hood which you don't see, and #version is one of those hidden things you have no control over.
Also, it looks like ShaderToy doesn't support WebGL2 (and thus GLSL ES 3.00, which DOES support the XOR operator), so you could contact the owners of the site and ask them when they plan to add support.
If you want to use the XOR operator, try opening the same shader in Chrome or some other browser that supports WebGL2. See if your browser supports WebGL2 here.
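For context, here is a minimal sketch of what a host application typically does before compiling: it glues its own preamble (the #version directive first, then its built-in uniforms) in front of the code you type, and hands the combined string to the compiler. That is why a #version you add yourself ends up in the middle of the source and triggers the error above. The preamble contents and uniform names below are only an assumption about what a host like ShaderToy might prepend, not its actual code; the sketch assumes an OpenGL ES 3.0 context, but the idea is the same for WebGL2 or desktop GL.

#include <GLES3/gl3.h>   // assumes an OpenGL ES 3.0 context and headers
#include <string>

GLuint compileFragmentShader(const std::string& userCode)
{
    // The host's hidden preamble: #version must be the very first thing in the string.
    const std::string preamble =
        "#version 300 es\n"
        "precision highp float;\n"
        "uniform vec3 iResolution;\n"   // illustrative built-ins, not ShaderToy's exact set
        "uniform float iTime;\n";

    const std::string full = preamble + userCode;   // user code is appended *after* the preamble
    const char* src = full.c_str();

    GLuint shader = glCreateShader(GL_FRAGMENT_SHADER);
    glShaderSource(shader, 1, &src, nullptr);
    glCompileShader(shader);
    return shader;
}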
I am using QtCreator 4.12 as a generic C++ IDE, installed from my distribution's package manager, so this is a generic question about QtCreator usage, not related to Qt in particular, nor to building QtCreator from source.
Like any IDE, QtCreator highlights potential errors as you write code.
For example, in a .cpp file, if I write int x = 0 and press Enter, the 0 will be underlined in red, with a tooltip telling me that I forgot the ; at the end of the line.
This is described in the QtCreator documentation, but I couldn't find anything in that documentation about GLSL.
My actual project is a C++/OpenGL game, and I'm editing my GLSL shaders within QtCreator.
Reading the answer to this question, I learned that all the texture*D() functions have been deprecated since OpenGL 3.3 and should be replaced with texture(), which infers the texture dimension from the sampler type, so I decided to update my shaders.
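For reference, the change is really just the sampling function (the snippets below are a made-up minimal example, not my actual shaders, shown here as C++ raw strings):

const char* legacyFrag = R"(
    #version 120
    uniform sampler2D diffuseMap;
    varying vec2 uv;
    void main() { gl_FragColor = texture2D(diffuseMap, uv); }   // deprecated form
)";

const char* modernFrag = R"(
    #version 330 core
    uniform sampler2D diffuseMap;
    in vec2 uv;
    out vec4 fragColor;
    void main() { fragColor = texture(diffuseMap, uv); }        // modern form: texture() infers the dimension
)";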
Within QtCreator, when I use the texture() function, the whole line gets underlined in red, with a tooltip saying expression too complex, whereas when I use texture2D() (or texture1D(), etc.), the line isn't underlined, as shown in the following pictures:
deprecated GLSL:
non-deprecated GLSL:
This doesn't prevent my shaders from working as designed, so there's no real problem here, but it's really distracting.
I don't know anything about the syntax error checking mechanism beyond what is written in the linked documentation page, and I'm looking for a way to make it accept GLSL 3.3+. I would accept an answer telling me how to silence this specific false positive as a workaround, or how to deactivate syntax error checking for .glsl files altogether, but I would really prefer to understand how to tweak the error checking mechanism so it accepts modern GLSL as it does legacy GLSL.
In the end I wrote a bug report: QTCREATORBUG-24068.
There's a patch addressing the issue, which I was able to test. It will be merged into Qt Creator 4.14.
I am developing a 3D engine that is designed to support any given graphics API. I would like your feedback on how I'm planning to manage the shader files:
I thought about creating a class that contains a few string members: the directory and the file names (both vertex and fragment), something like this:
#include <string>

class ShaderFile : public SerializableAsset
{
public:
    std::string nameID;       // identifier
    std::string directory;
    std::string vertexName;
    std::string fragmentName;
};
The user would be able to set these variables in the editor. Then, my engine would load the shader files like this:
void createShader(RenderAPI api)
{
    ShaderFile shaderFile; // get it from some place
    std::string vertexPath   = shaderFile.directory + shaderFile.vertexName   + api.name + api.extension;
    std::string fragmentPath = shaderFile.directory + shaderFile.fragmentName + api.name + api.extension;
    // create shader...
}
This would produce a path like Project/Assets/Shaders/standardVulkan.spv.
Am I thinking in the right direction, or is this a completely idiotic approach? Any feedback is appreciated.
It's an interesting idea and we've actually done exactly this, but we discovered some things along the way that are not easy to deal with:
If you take a deeper look at shader APIs, you'll find that although they are close to offering the same capabilities on paper, they often do not support features in the same way and have to be managed differently; by extension, so do the shaders. The driver implementation is key here and sometimes differs considerably when it comes to managing internal state (synchronization and buffer handling).
Flexibility
You'll find that OpenGL is flexible in the way it handles attributes and uniforms, whereas DirectX is more focused on minimizing uploads to the hardware by binding them in blocks according to your render pass configuration, usually on a per-object/per-frame/per-pass basis. While you can mimic the flexible style by creating tiny blocks, this obviously gives different performance.
Obviously, there are multiple ways to handle binds, buffer objects, and shaders, even within a single API. Also, querying shader variable names and bind points is not that flexible in DirectX, and some of the parameters need to be set from code. In Vulkan, binding shader attributes and uniforms is even more generalized: you can configure the bind points completely as you wish.
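To make the contrast concrete, here is a hedged OpenGL-side sketch (program, uniform and block names are made up; it assumes a 3.3+ context and a loader such as GLEW or glad): setting an individual uniform by name versus binding a whole uniform block, which is the style that DirectX constant buffers and Vulkan descriptor sets push you towards.

#include <GL/glew.h>

struct PerFrameData { float viewProj[16]; float cameraPos[4]; };   // hypothetical layout

// (a) OpenGL's flexible path: look up one uniform by name and set it.
// Assumes the program is currently bound with glUseProgram(prog).
void setCameraUniform(GLuint prog, const float cam[3])
{
    GLint loc = glGetUniformLocation(prog, "uCameraPos");          // name is illustrative
    glUniform3f(loc, cam[0], cam[1], cam[2]);
}

// (b) Block-style path: everything for the pass lives in one buffer bound to a known point.
void bindPerFrameBlock(GLuint prog, GLuint ubo, const PerFrameData& data)
{
    GLuint blockIndex = glGetUniformBlockIndex(prog, "PerFrame");  // block name is illustrative
    glUniformBlockBinding(prog, blockIndex, 0);                    // route the block to binding point 0
    glBindBufferBase(GL_UNIFORM_BUFFER, 0, ubo);                   // bind the UBO to that point
    glBufferSubData(GL_UNIFORM_BUFFER, 0, sizeof(data), &data);    // one upload for the whole pass
}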
Versioning
Another topic is everything that has to do with GLSL/HLSL shader versioning: you may need to write different shaders for hardware that only supports lower shader models. If you're writing unique shaders and not going for the uber-shader approach (and to a large extent even IF you use that approach), this can get complicated if it ties too tightly into your design, and given the number of permutations it might be unrealistic.
Extensions
OpenGL and Vulkan extensions can be 'queried' from within the shader, while other APIs such as DirectX require setting this from the code side. Still, within the same compute capability you can have extensions that only work on NVIDIA hardware, or that are ARB-approved but not core, etc. This is really quite messy and in most cases application specific.
Deprecation
Considerable parts of these APIs get deprecated all the time. This can be problematic if your engine expects those features to remain in place, especially if you want to deal with multiple APIs that support that feature.
Compilation & Caching
Most APIs by now support some form of offline compilation whose output can be loaded later. Compilation takes a considerable amount of time, so caching it makes sense. Since the compiled shader code is unique to the hardware you have, you need to do this exercise for each platform the code should run on, either the first time the app needs the shader or in some other clever way in your production pipeline. Your filename would in that case be replaced by a hash so that the shader can be retrieved from the cache. But this also means the cache needs a timestamp (or a hash of the source) so it can detect new versions of the source shader: if the shader source changes, the cache entry needs to be rebuilt, and so on. :)
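As an illustration of that last point, here is a hedged sketch of such a cache (the file layout, hashing choice and function names are assumptions, not a fixed design): the key is a hash of the shader source plus the target platform, so a changed source automatically misses the cache and forces a rebuild, which plays the role of the timestamp check mentioned above.

#include <filesystem>
#include <fstream>
#include <functional>
#include <iterator>
#include <optional>
#include <string>
#include <vector>

namespace fs = std::filesystem;

static std::string cacheKey(const std::string& source, const std::string& platform)
{
    // std::hash is fine for a sketch; a real pipeline would use a stable hash (e.g. SHA-1).
    return std::to_string(std::hash<std::string>{}(source + "|" + platform));
}

std::optional<std::vector<char>> loadCachedBinary(const std::string& source,
                                                  const std::string& platform,
                                                  const fs::path& cacheDir)
{
    fs::path entry = cacheDir / (cacheKey(source, platform) + ".bin");
    if (!fs::exists(entry))
        return std::nullopt;   // never compiled for this source/platform combination
    std::ifstream in(entry, std::ios::binary);
    return std::vector<char>(std::istreambuf_iterator<char>(in), {});
}

void storeCachedBinary(const std::string& source, const std::string& platform,
                       const fs::path& cacheDir, const std::vector<char>& binary)
{
    fs::create_directories(cacheDir);
    std::ofstream out(cacheDir / (cacheKey(source, platform) + ".bin"), std::ios::binary);
    out.write(binary.data(), static_cast<std::streamsize>(binary.size()));
}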
Long story short
If you aim for maximum flexibility in every API, you'll end up adding a useless layer to your engine that in the best case simply duplicates the underlying calls. If you aim for a generalized API, you'll quickly get trapped in the versioning story, which is not at all synchronized between the different APIs in terms of extensions, deprecation and driver implementation support.
Back when the Internet was expensive and slow, website authors used all sorts of HTML/JavaScript compression tools that would remove whitespace and shorten variable names.
Is there such a tool for GLSL shaders? I was going to write one myself, but then I realized there should already be such a tool out there; yet I was unable to find one.
There is GLSL-unit, which does this.
There is also a browser version at this link.
I've just discovered the OpenGL Shader Builder in Apple's developer tools. It seems mighty useful. The only trouble is that it seems to insist on *.vs and *.fs in the save dialog, whereas I normally use *.vert and *.frag respectively for my shader file extensions. Is there any way to change these defaults? (A hacky solution would be acceptable.)
I first ended up using my custom #include preprocessor (very simple to make!)
I made dummy wrapper shaders:
test.vert
#include test.vs
test.frag
#include test.fs
It's a semi-terrible hack and definitely something you don't want to commit to svn/git/whatever if you are sharing the code, but it's great for fast turnaround when tweaking.
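For anyone curious, a preprocessor like that can be as small as this (a sketch: the directive syntax, error handling and include paths are kept deliberately trivial here). Every line that starts with "#include" is replaced, recursively, with the contents of the named file.

#include <fstream>
#include <sstream>
#include <string>

std::string preprocessShader(const std::string& path)
{
    std::ifstream file(path);
    std::ostringstream out;
    std::string line;
    const std::string directive = "#include ";
    while (std::getline(file, line))
    {
        if (line.rfind(directive, 0) == 0)                           // line begins with "#include "
            out << preprocessShader(line.substr(directive.size())) << '\n';
        else
            out << line << '\n';
    }
    return out.str();                                                // expanded source, ready to hand to the compiler
}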
As more people ended up using several different file extensions, I added more ways to load shaders.
LoadShader("MyShader", <List of preprocessors>)
LoadShader("vertex shader", "fragment shader", ... , <List of preprocessors>)
(Simplified here. What is really passed in is a structure)
The first function would do the following:
Look for text files with [".vert", ".vs", etc.] for the vertex shader
Look for text files with [".frag", ".fs", etc.] for the fragment shader
and so on.
The user can define what file extensions they want to support.
The second function just uses exact file names. In the first function the possible combinations are limited to preprocessor variations; in the second you can combine any shader files you want and also make variations with preprocessors.
This is of course simplified, but shows the general idea; a sketch of the extension lookup follows below.
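A sketch of that lookup (the extension lists and paths here are illustrative assumptions; real code would also report a miss):

#include <filesystem>
#include <string>
#include <vector>

std::string findShaderFile(const std::string& baseName,
                           const std::vector<std::string>& extensions)
{
    for (const std::string& ext : extensions)
    {
        const std::string candidate = baseName + ext;
        if (std::filesystem::exists(candidate))
            return candidate;                    // first matching extension wins
    }
    return {};                                   // nothing found
}

// e.g. findShaderFile("MyShader", {".vert", ".vs"}) for the vertex stage,
//      findShaderFile("MyShader", {".frag", ".fs"}) for the fragment stage.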
I've just been wondering how to bundle my GLSL shader source files (for OpenGL ES (iOS) / OpenGL with GLUT (Mac/Windows)) with my application. As pure text files, they would be easily changeable by every user of my software, and I'm afraid of undefined behavior...
On iOS I simply use Xcode's "Copy Bundle Resources" for my shaders (and then retrieve them from the application bundle) - is there a similar possibility with Visual Studio?
Or is there an even better cross-platform way to do this?
GLSL shaders are pure text files (or text snippets, whichever way you want to look at it). There is no way (apart from digitally signing your shaders and refusing to run if the signature does not match) to prevent a user from trivially modifying your shaders in a text editor. (Of course you could make them somewhat unreadable by rot13-encoding them or by putting them all into a .zip file and renaming it to something else; this will not stop someone determined to find your shaders, but it will probably deter 90% of average users.)
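For what it's worth, the rot13-style scrambling mentioned above is only a couple of lines (a toy sketch; it deters casual editing but is in no way protection, since applying the same function again restores the text):

#include <string>

std::string rot13(std::string text)
{
    // Shift letters by 13 places within their case; everything else passes through unchanged.
    for (char& c : text)
    {
        if (c >= 'a' && c <= 'z') c = static_cast<char>('a' + (c - 'a' + 13) % 26);
        else if (c >= 'A' && c <= 'Z') c = static_cast<char>('A' + (c - 'A' + 13) % 26);
    }
    return text;   // rot13 is its own inverse: run it again after loading to get the GLSL back
}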
But then again, if people do edit your shaders and that results in undefined behaviour... bad luck for them. You know, there is a certain faction of people who feels urged to edit everything that is human readable and editable. Fine, it's their problem if they break their install. You can't prevent people from being stupid.
There is the shader binary extension in recent versions of OpenGL, but it is not intended to be used in a way that would solve your problem: it is merely a caching mechanism to speed up compile/link times on subsequent runs, and it is not suited for distributing "shader binaries".
Just so you know, even on OS X the shaders are still "pure text": the application bundle is a normal directory containing a Resources/ folder where your shaders are placed.