I'm using Visual C++ 2012 and DirectX 10 and have encountered some problems.
I found that in VS 2012 I can add .hlsl files directly and choose which type of shader to create, such as a vertex or pixel shader, but I can only add one type of shader at a time. Previously, in VS 2010, I could just create and load a .fx file that contained both the vertex and pixel shader plus the technique. In the present situation I can add separate .hlsl files, but I have no idea where to put the technique. I also noticed that the 2012 version cannot find D3D10CreateEffectFromFile. How do shaders work in VS 2012? If I just want to use one vertex shader and one pixel shader, how can I do that?
It has been a long time since I worked with DirectX, but I think this doesn't depend on the Visual Studio version, or really on the DirectX version either. In both DX10 and DX11 you can create a .fx file that contains different types of shaders along with a technique (though in DirectX 11 you must add the Effects framework to your project). In DX11 you can also use plain .hlsl files, which I believe can contain all types of shaders; when compiling, you pass the entry-point name and target profile of your specific shader to the compile function, and I think this is the preferred approach in DX11.
As for D3D10CreateEffectFromFile: the file-loading helper is actually D3DX10CreateEffectFromFile, which lives in the D3DX library (d3dx10.h), so check that you are including and linking the right headers. Again, it has been a long time since I worked with DX, so my answer may be incorrect.
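If you only need one vertex shader and one pixel shader, you can skip the effects framework entirely and compile the entry points yourself. A minimal D3D10 sketch, assuming a file "Shaders.hlsl" with entry points VSMain and PSMain and an already-created ID3D10Device* (names are illustrative, error handling trimmed):

```cpp
// Compile a vertex and a pixel shader from one .hlsl file (no technique needed).
ID3D10Blob *vsBlob = nullptr, *psBlob = nullptr, *errors = nullptr;

D3DX10CompileFromFile(L"Shaders.hlsl", nullptr, nullptr, "VSMain", "vs_4_0",
                      0, 0, nullptr, &vsBlob, &errors, nullptr);
D3DX10CompileFromFile(L"Shaders.hlsl", nullptr, nullptr, "PSMain", "ps_4_0",
                      0, 0, nullptr, &psBlob, &errors, nullptr);

ID3D10VertexShader* vs = nullptr;
ID3D10PixelShader*  ps = nullptr;
device->CreateVertexShader(vsBlob->GetBufferPointer(), vsBlob->GetBufferSize(), &vs);
device->CreatePixelShader(psBlob->GetBufferPointer(), psBlob->GetBufferSize(), &ps);

// At draw time, instead of technique->GetPassByIndex(0)->Apply(0):
device->VSSetShader(vs);
device->PSSetShader(ps);
```

The technique block only matters to the effects framework; with this approach the pipeline state it used to set (shaders, render states) is set directly on the device instead.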
I am piecing together information on how to import models following a bunch of different tutorials, because there isn't a resource that explains how to do this in a simple way.
So far I can import and display my model, but it is rendered with Gouraud shading if I use DirectX::BasicEffect.
This makes my mechanical models appear very badly shaded, because they were exported from SolidWorks.
I want to try switching to DirectX::DGSLEffect, but I need to load a shader, and I don't have any shaders to test with.
Can anyone point me to where I can find a pre-compiled *.cso shader?
I need one that can preferably do flat shading, but I can settle for any, just to test that my application is working right.
I know that I could compile a shader myself from .hlsl files, but that is too many steps; I just need something that works right now so I can continue further.
Thank you,
-D
I want to point out that I am not making a video game, but a mechanical visualization tool, and I do not need to know how to write shaders or do any advanced features of DirectX at this time. Please do not recommend reading books as that would be counter-productive to the amount of time I can work on this project. I would like to learn more in the future, but not for this application. Thanks,
EDIT:
It looks like I can compile with the following line, but now I can't find any HLSL files.
ID3DBlob* PS_Buffer = nullptr;
ID3DBlob* PS_Errors = nullptr; // capture compile errors instead of passing 0
D3DCompileFromFile(L"PixelShader.hlsl", 0, 0, "main", "ps_5_0", 0, 0, &PS_Buffer, &PS_Errors);
CSO files are compiled HLSL shaders. Visual Studio can generate them from HLSL files, and the DGSL pipeline can create pixel shaders using a visual language.
The DirectX Tool Kit BasicEffect system is quite flexible, so you can override the parameters in code or provide your own material information. See the wiki for more information.
If you want to use the Visual Studio content pipeline, you should look at Working with 3-D Assets for Games and Apps for the details. Visual Studio converts WaveFront obj or Autodesk fbx files and can include DGSL materials as part of that pipeline. The DirectX Tool Kit DGSLEffect provides a built-in vertex shader that will work with the DGSL shader designer output.
If you'd like an example of using the DGSLEffect pipeline, look at the tutorial Creating custom shaders with DGSL.
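If you let Visual Studio compile your .hlsl files to .cso at build time, loading the result at runtime is just reading the bytes and handing them to the device. A sketch, assuming a valid ID3D11Device* named device and a build-generated "PixelShader.cso" (names are illustrative):

```cpp
#include <fstream>
#include <vector>

// Read a compiled shader object (.cso) produced by Visual Studio's HLSL build step.
std::ifstream file("PixelShader.cso", std::ios::binary);
std::vector<char> bytecode((std::istreambuf_iterator<char>(file)),
                           std::istreambuf_iterator<char>());

ID3D11PixelShader* ps = nullptr;
device->CreatePixelShader(bytecode.data(), bytecode.size(), nullptr, &ps);
// ps can then be handed to the DirectX Tool Kit DGSLEffect.
```

Note that the .cso path is relative to the working directory at runtime, which is typically not the same as the source directory; Visual Studio usually drops the .cso next to the .exe.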
I have been working on a small hobby project for over a year now. It's written in C++, using SDL (with SDL_image and SDL_mixer) and Minizip, and uses OpenGL for rendering.
Until July of this year I had been maintaining and testing both the x64 and x86 versions of the code. Both compiled without any changes to the original code and ran exactly the same.
However, around August I moved the code up to SDL 2.0 from 1.2.15 and started maintaining and testing only the x64 version. Now, when I try to build an x86 version, I get the problem below.
(Screenshots: correct output vs. incorrect output.)
Things I have tried:
gDebugger: both versions of the code create the same type of context; however, the accumulation buffer is 64 bits in both, and I cannot find a way to disable it.
Dr. Memory: no alarming memory or heap corruption.
Checking contexts on creation: both versions create the same context value in SDL, and both report "No OpenGL context has been made current" even after calling SDL_GL_MakeCurrent; yet the x64 version works, the x86 debug build gives a black screen, and the x86 release build gives the output above.
Both the x64 and x86 versions are built from the exact same code, which used to compile and work properly before SDL 2.0. I am not sure if it's a bug in SDL or something I did wrong. Let me know if you need more information on this.
Update:
I am using pure GL 1.1 code only, so no shaders or VBOs; just glVertexPointer and the associated glColorPointer and glTexCoordPointer functions. All arrays are declared with GL types, and the GL functions are given pointers to client memory. All textures are rendered as quads.
GLfloat vertex_array_3f[12];
// Initialize the array...
glEnableClientState(GL_VERTEX_ARRAY); // required, or glDrawArrays won't read the pointer
glVertexPointer(3, GL_FLOAT, 0, vertex_array_3f);
// Set the color and texture-coordinate pointers (and enable their client states)
glDrawArrays(GL_QUADS, 0, 4);
glDisableClientState(GL_VERTEX_ARRAY);
The context type I am requesting is 2.1, but instead I get a backward-compatible context. This doesn't cause any issues in the x64 version.
I also changed over from VS2010 Express to VS2012 Express during the same period, but I do remember it compiling successfully for x86 under VS2012 Express.
edit: Has anyone experienced anything like this before? I am doing some testing in the meantime; if I find anything I will post the findings below.
I have an issue with a webgl shader that I've determined is related to ANGLE because it only occurs on Windows in Firefox or Chrome, and it doesn't happen if I force opengl (chrome --use-gl=desktop).
I've created a jsfiddle that shows ANGLE-generated HLSL of my custom shader. (for hlsl conversion to work in this jsfiddle, you must run chrome with --enable-privileged-webgl-extensions, or just see my gist of the output)
So I have working GLSL, and the generated HLSL compiles but doesn't do the same thing. The symptom is that on Windows the vertices appear in the correct initial locations, but they do not move when I change the uniform jed. I can't find the bug in the generated code.
Any tips for debugging problems like this?
Hard to say based on the information given (there's no original GLSL). It's not hard to imagine this being fixed by the July 4 revisions to ANGLE, however. I would update first.
I'm working on an OpenGL project on Windows, using GLEW to provide the functionality the provided Windows headers lack. For shader support I'm using NVIDIA's Cg. All the documentation and code samples I have read indicate that the following is the correct method for loading and using shaders, and I've implemented things this way in my code:
Create a Cg context with cgCreateContext.
Get the latest vertex and pixel shader profiles using cgGLGetLatestProfile with CG_GL_VERTEX and CG_GL_FRAGMENT, respectively, and use cgGLSetOptimalOptions to set up the optimal compiler options for both profiles.
Using these profiles and shaders you have written, create shader programs using cgCreateProgramFromFile.
Load the shader programs using cgGLLoadProgram.
Then, each frame, for an object that uses a given shader:
Bind the desired shader(s) (vertex and/or pixel) using cgGLBindProgram.
Enable the profile(s) for the desired shader(s) using cgGLEnableProfile.
Retrieve and set any needed uniform shader parameters using cgGetNamedParameter and the various parameter setting functions.
Render your object normally.
Clean up by calling cgGLDisableProfile.
However, things get strange. When using a single shader everything works just fine, but the act of loading a second shader with cgGLLoadProgram seems to make objects using the first one cease to render. Switching the draw order seems to resolve the issue, but that's hardly a fix. This problem occurs on both my laptop and my partner's (fairly recent machines with Intel integrated graphics).
I tested the same code on my desktop with a GeForce GTX 260, and everything worked fine. I would just write this off as my laptop GPU not getting along with Cg, but I've successfully built and run programs that use several Cg shaders simultaneously on my laptop using the OGRE graphics engine (unfortunately the assignment I'm currently working on is for a computer graphics class, so I can't just use OGRE).
In conclusion, I'm stumped. What is OGRE doing that my code is not? Am I using Cg improperly?
You have to call cgGLEnableProfile before you call cgGLBindProgram. From your question it appears you do it the other way around.
From the Cg documentation for cgGLBindProgram:
cgGLBindProgram binds a program to the current state. The program must have been loaded with cgGLLoadProgram before it can be bound. Also, the profile of the program must be enabled for the binding to work. This may be done with the cgGLEnableProfile function.
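Putting the documented order together, a per-frame sketch might look like this (variable names are illustrative, and the uniform is just an example):

```cpp
// Per-frame use of an already-loaded Cg program, in the order the docs require.
cgGLEnableProfile(vertexProfile);   // 1. enable the profile FIRST
cgGLBindProgram(vertexProgram);     // 2. then bind the loaded program

// 3. set any needed uniform parameters
CGparameter mvp = cgGetNamedParameter(vertexProgram, "modelViewProj");
cgGLSetStateMatrixParameter(mvp, CG_GL_MODELVIEW_PROJECTION_MATRIX,
                            CG_GL_MATRIX_IDENTITY);

// 4. ... render the object normally ...

cgGLDisableProfile(vertexProfile);  // 5. clean up
```

Doing the bind before the enable is easy to get away with on some drivers, which may be why the code appeared to work with a single shader.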
I created a Visual C++ project as a Win32 console application, and I successfully drew a triangle.
I'm using Visual studio 2010.
I wonder why there is no glOrthof function, only glOrtho.
Is that a matter of OpenGL version?
I used to use glOrthof function when I developed a game on the Android platform.
In desktop OpenGL there is only the one version of glOrtho, which takes GLdouble values. In OpenGL ES there are two versions, one taking floats (glOrthof / GLfloat) and one taking fixed-point values (glOrthox / GLfixed); ES needed the suffixed names because early ES hardware often lacked double (and even float) support.
One might argue that desktop OpenGL should also offer a GLfloat variant, but glOrtho dates back to the original API, and since it is deprecated/removed in modern core-profile OpenGL, there is little reason to add one now.
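For reference, the two calls differ only in parameter type; a typical desktop usage (values here are just an example 800x600 pixel-space projection):

```cpp
// Desktop GL: glOrtho takes GLdouble parameters.
glMatrixMode(GL_PROJECTION);
glLoadIdentity();
glOrtho(0.0, 800.0, 600.0, 0.0, -1.0, 1.0);  // left, right, bottom, top, near, far

// OpenGL ES 1.x equivalent:
// glOrthof(0.0f, 800.0f, 600.0f, 0.0f, -1.0f, 1.0f);
```

Since GLdouble arguments accept float literals via implicit conversion, code written against glOrthof usually ports to desktop glOrtho by just renaming the call.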