Adding geometry shader causes "shader linkage error" - c++

I'm adding a geometry shader (a very simple one) to my DirectX 11 program. I've already got vertex and pixel shaders written, and they work just as expected: no errors, no warnings. The shaders are simple, too. The vertex shader is:
cbuffer PerApplication : register(b0)
{
    matrix projectionMatrix;
}

cbuffer PerFrame : register(b1)
{
    matrix viewMatrix;
}

cbuffer PerObject : register(b2)
{
    matrix worldMatrix;
}

struct VertexShaderInput
{
    float4 position : POSITION;
    float4 color : COLOR;
};

struct VertexShaderOutput
{
    float4 color : COLOR;
    float4 position : SV_POSITION;
};

// entry point
VertexShaderOutput SimpleVertexShader(VertexShaderInput IN)
{
    VertexShaderOutput OUT;
    matrix mvp = mul(projectionMatrix, mul(viewMatrix, worldMatrix));
    OUT.color = IN.color;
    OUT.position = mul(mvp, IN.position);
    return OUT;
}
The pixel shader is:
struct PixelShaderInput
{
    float4 color : COLOR;
};

float4 SimplePixelShader(PixelShaderInput IN) : SV_TARGET
{
    return IN.color;
}
Well, as I've said, that's working pretty well. Then I add a geometry shader, which doesn't actually do anything: it just takes a triangle and emits the same triangle. The geometry shader is:
struct VertexInput
{
    float4 color : COLOR;
    float4 position : POSITIONT;
};

struct VertexOutput
{
    float4 color : COLOR;
    float4 position : SV_Position;
};

[maxvertexcount(3)]
void SimpleGeometryShader(triangle VertexInput input[3], inout TriangleStream<VertexOutput> stream)
{
    VertexOutput v1 = { input[0].color, input[0].position };
    stream.Append(v1);
    VertexOutput v2 = { input[1].color, input[1].position };
    stream.Append(v2);
    VertexOutput v3 = { input[2].color, input[2].position };
    stream.Append(v3);
    stream.RestartStrip();
}
Doing this also requires changing the vertex shader, which now returns
struct VertexShaderOutput
{
    float4 color : COLOR;
    float4 position : POSITIONT; // I'm not returning SV_Position from the vertex shader anymore.
};
And the program itself works as expected; I see what I expect to see. But there are now two D3D11 errors:
D3D11 ERROR: ID3D11DeviceContext::DrawIndexed: Vertex Shader - Geometry Shader linkage error: Signatures between stages are incompatible. The input stage requires Semantic/Index (POSITIONT,0) as input, but it is not provided by the output stage. [ EXECUTION ERROR #342: DEVICE_SHADER_LINKAGE_SEMANTICNAME_NOT_FOUND]
D3D11 ERROR: ID3D11DeviceContext::DrawIndexed: Geometry Shader - Pixel Shader linkage error: Signatures between stages are incompatible. The input stage requires Semantic/Index (TEXCOORD,0) as input, but it is not provided by the output stage. [ EXECUTION ERROR #342: DEVICE_SHADER_LINKAGE_SEMANTICNAME_NOT_FOUND]
Both are pretty strange. The vertex shader clearly returns a POSITIONT, and COLOR and POSITIONT are in the same order. What's my mistake?

The reason for the problem was that, besides rendering the scene with the shaders I provided, I was also drawing text using a library. Obviously, that involved its own vertex and pixel shaders with their own input and output signatures. Setting the geometry shader to null before drawing the text solved my problem.
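For reference, a minimal sketch of what that fix looks like in the draw loop (context is the immediate ID3D11DeviceContext*; DrawScene and RenderText are hypothetical stand-ins for the scene pass and the text library's draw call):

// Scene pass: VS -> GS -> PS, using the shaders above.
context->GSSetShader(geometryShader, nullptr, 0);
DrawScene(context);   // hypothetical scene-draw helper

// Text pass: the library binds its own VS/PS. Unbind the GS first,
// otherwise the library's VS output must match the GS input signature,
// which is exactly what the two linkage errors above complain about.
context->GSSetShader(nullptr, nullptr, 0);
RenderText(context);  // hypothetical text-library call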

Related

Geometry Shader must have a max vertex count DirectX 11

I am trying to add a geometry shader to my DirectX 11 project in C++. There are no examples of this anywhere I look; there are millions of tutorials on OpenGL, but nothing on geometry shaders in DirectX. I just wrote a basic shader, below, but I get the following error when trying to build it:
error X3514: 'LightGeometryShader' must have a max vertex count
Can anyone please advise on what this shader is missing so that it will compile?
////////////////////////////////////////////////////////////////////////////////
// Filename: light.gs
////////////////////////////////////////////////////////////////////////////////

//////////////
// TYPEDEFS //
//////////////
struct GeometryInputType
{
    float4 position : POSITION;
    float2 tex : TEXCOORD0;
    float3 normal : NORMAL;
};

struct PixelInputType
{
    float4 position : SV_POSITION;
    float2 tex : TEXCOORD0;
    float3 normal : NORMAL;
};

////////////////////////////////////////////////////////////////////////////////
// Geometry Shader
////////////////////////////////////////////////////////////////////////////////
PixelInputType LightGeometryShader(GeometryInputType input)
{
    PixelInputType output;
    output = input;
    return output;
}
A geometry shader is not necessarily a 1:1 function, which is why you have to provide a maximum vertex count. See Microsoft Docs.
[maxvertexcount(3)]
void LightGeometryShader(triangle GeometryInputType input[3],
                         inout TriangleStream<PixelInputType> outStream)
{
    PixelInputType output;
    for (int v = 0; v < 3; v++)
    {
        output.position = input[v].position;
        output.tex = input[v].tex;
        output.normal = input[v].normal;
        outStream.Append(output);
    }
}
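For completeness, the C++ side then compiles the file with a gs profile and binds the shader before drawing; a minimal sketch (error handling omitted, and the device/context variable names are assumptions):

#include <d3dcompiler.h>
#pragma comment(lib, "d3dcompiler.lib")

ID3DBlob* gsBlob = nullptr;
ID3DBlob* errors = nullptr;
// Note the "gs_5_0" target profile and the entry point name.
D3DCompileFromFile(L"light.gs", nullptr, nullptr, "LightGeometryShader",
                   "gs_5_0", 0, 0, &gsBlob, &errors);

ID3D11GeometryShader* gs = nullptr;
device->CreateGeometryShader(gsBlob->GetBufferPointer(),
                             gsBlob->GetBufferSize(), nullptr, &gs);
context->GSSetShader(gs, nullptr, 0);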
The geometry shader was introduced with Direct3D 10, so the bulk of the developer samples were in the legacy DirectX SDK at the time. You can find the latest copy of these samples, buildable without the legacy DirectX SDK, on GitHub.

Error: Invalid vs_2_0 output semantic

It says: Invalid vs_2_0 output semantic 'SV_Target'. So for some reason Visual Studio 2017 is compiling my pixel shader as if it were a vertex shader. But in the Properties panel I specified it to be ps_5_0. Is there something I'm missing that should be specified?
Vertex shader:
cbuffer ConstantBuffer : register(b0)
{
    matrix World;
    matrix View;
    matrix Projection;
}

struct Input
{
    float3 Pos : POSITION;
    float4 Color : COLOR;
};

struct VS_OUTPUT
{
    float4 Pos : SV_POSITION;
    float4 Color : COLOR0;
};

VS_OUTPUT main(Input input)
{
    VS_OUTPUT output = (VS_OUTPUT)0;
    output.Pos = mul(input.Pos, World);
    output.Pos = mul(output.Pos, View);
    output.Pos = mul(output.Pos, Projection);
    output.Color = input.Color;
    return output;
}
Pixel shader:
struct VS_OUTPUT
{
    float4 Pos : SV_POSITION;
    float4 Color : COLOR0;
};

float4 main(VS_OUTPUT input) : SV_Target
{
    return input.Color;
}
And here are my settings for the pixel shader.
I hope someone can help me.
Open the Property page for the .hlsl file, and under HLSL Compiler > General > Shader Type, select Pixel Shader.
And don't forget to set this property for both the Debug and Release configurations.
You can resolve the issue by modifying the shader properties:
Properties -> HLSL Compiler -> General -> Shader Type -> Pixel Shader (/ps)
OR
Configuration Properties -> General -> Excluded from Build = Yes (and compile the shader yourself).
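If you take the Excluded from Build route, you have to compile the shader yourself with the target profile spelled out, which is precisely what was missing here (the file was being compiled with a vs_2_0 profile). A minimal runtime sketch (the file name is an assumption; the entry point main comes from the question; error handling omitted):

#include <d3dcompiler.h>
#pragma comment(lib, "d3dcompiler.lib")

ID3DBlob* psBlob = nullptr;
ID3DBlob* errors = nullptr;
// "ps_5_0" is the target profile; passed explicitly here, the file
// can never be compiled as a vertex shader by mistake.
D3DCompileFromFile(L"PixelShader.hlsl", nullptr, nullptr, "main",
                   "ps_5_0", 0, 0, &psBlob, &errors);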

HLSL error X3000: unrecognized identifier

I don't have any experience with HLSL and can't figure out how to fix this error. Here is my SimpleVertexShader.hlsl file:
cbuffer PerApplication : register(b0)
{
    matrix projectionMatrix;
}

cbuffer PerFrame : register(b1)
{
    matrix viewMatrix;
}

cbuffer PerObject : register(b2)
{
    matrix worldMatrix;
}

struct AppData
{
    float3 position : POSITION;
    float3 color : COLOR;
};

struct VertexShaderOuput
{
    float4 color : COLOR;
    float4 position : SV_POSITION;
};

VertexShaderOuput SimpleVertexShader(AppData IN)
{
    VertexShaderOutput OUT;
    matrix mvp = mul(projectionMatrix, mul(viewMatrix, worldMatrix));
    OUT.position = mul(mvp, float4(IN.position, 1.0f));
    OUT.color = float4(IN.color, 1.0f);
    return OUT;
}
I get an error X3000: unrecognized identifier 'VertexShaderOutput', as well as unrecognized identifier 'OUT'. I am using Visual Studio 2013, and under the Properties for this file I have these settings:
HLSL Compiler>Entrypoint Name: SimpleVertexShader
Shader Type: Vertex Shader (/vs)
Shader Model: Shader Model 4 (/4_0).
HLSL Compiler>Output Files>Header Variable Name: g_vs
Object File Name: $(OutDir)%(Filename)_d.cso.
(the _d is only under the Debug configuration.)
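The error is a typo: the struct is declared as VertexShaderOuput (missing the second 't'), but the function body declares OUT as VertexShaderOutput, which doesn't exist; that is the first unrecognized identifier, and because OUT's declaration fails, OUT itself becomes the second. Spelling the name the same way in both places (for example, renaming the struct and the function's return type to VertexShaderOutput) fixes both errors.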

Reading & Writing to textures in HLSL 5.0 (deferred shading)

I am trying to achieve deferred shading in DirectX 11, C++. I have managed to create the G-buffer and render my scene to it (checked with GPU PerfStudio). I am having difficulty with the final lighting stage: I am not able to read from the textures (diffuse, normal, specular) using the coordinates returned by SV_Position.
This is the pixel shader used to render light as shapes.
Texture2D<float4> Diffuse : register(t0);
Texture2D<float4> Normal : register(t1);
Texture2D<float4> Position : register(t2);

cbuffer MaterialBuffer : register(b1)
{
    float4 ambient;
    float4 diffuse;
    float4 specular;
}

//--------------------------------------------------------------------------------------
struct VS_OUTPUT
{
    float4 Pos : SV_POSITION;
    float4 PosVS : POSITION;
    float4 Color : COLOR0;
    float4 normal : NORMAL;
};

float4 main(VS_OUTPUT input) : SV_TARGET
{
    //return Diffuse[screenPosition.xy]+Normal[screenPosition.xy]+Position[screenPosition.xy];
    //return float4(1.0f, 1.0f, 1.0f, 1.0f);
    //--------------------------------------------------------------------------------------
    // Problematic line
    float4 b = Diffuse.Load(int3(input.Pos.xy, 0));
    //--------------------------------------------------------------------------------------
    return b;
}
I have checked with GPU PerfStudio that the input textures are properly bound.
The code above returns the color I used to clear the texture. (From my debugging I have found that it returns the value at pixel location 0,0.)
If I replace the problematic line with
float4 b = Diffuse.Load(int3(350, 300, 0));
then it renders the value at pixel location 350,300, in the proper shape of the light.
Thanks
Have you tried creating the device with the debug flag D3D11_CREATE_DEVICE_DEBUG and looking at the output log? You may have a signature mismatch between the vertex and pixel stages, which would explain why the SV_Position semantic does not behave correctly.
I solved the problem: I was using the same z-buffer for rendering the light geometries that I had used previously for the G-buffer pass.
Thank you for your response.
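In practice that means the lighting pass must not depth-test against the depth written during the G-buffer pass. A minimal sketch of one way to set that up (assuming context, backBufferRTV, and the three G-buffer SRVs already exist; the names are illustrative):

// Lighting pass: draw the light geometry to the back buffer with no
// depth-stencil view bound, so the G-buffer depth can't reject pixels.
// (A separate depth buffer, or a read-only DSV, also works.)
context->OMSetRenderTargets(1, &backBufferRTV, nullptr);

ID3D11ShaderResourceView* srvs[3] = { diffuseSRV, normalSRV, positionSRV };
context->PSSetShaderResources(0, 3, srvs);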

Getting the color of a vertex in HLSL?

I have the following vertex and pixel shaders:
struct VS_INPUT
{
    float4 Position : POSITION0;
    float2 TexCoord : TEXCOORD0;
    float4 Color : TEXCOORD1;
};

struct VS_OUTPUT
{
    float4 Position : POSITION0;
    float4 Color : COLOR0;
    float2 TexCoord : TEXCOORD0;
};

float4x4 projview_matrix;

VS_OUTPUT vs_main(VS_INPUT Input)
{
    VS_OUTPUT Output;
    Output.Position = mul(Input.Position, projview_matrix);
    Output.Color = Input.Color;
    Output.TexCoord = Input.TexCoord;
    return Output;
}
The pixel shader is:
texture tex;
sampler2D s = sampler_state
{
    texture = <tex>;
};

float4 ps_main(VS_OUTPUT Input) : COLOR0
{
    float4 pixel = tex2D(s, Input.TexCoord.xy);
    return pixel;
}
This is for a 2D game. The vertices of the quads contain tinting colors that I want to use to tint the bitmap. How can I obtain the color of the current vertex so I can multiply it in the pixel shader by the current pixel's color?
Thanks
In your pixel shader, do:
float4 pixel = tex2D(s, Input.TexCoord.xy) * Input.Color;
The Input.Color value will be linearly interpolated across your plane for you, just like Input.TexCoord is. To blend two color vectors together, you simply multiply them together. It may also be advisable to do:
float4 pixel = tex2D(s, Input.TexCoord.xy) * Input.Color;
pixel = saturate(pixel);
The saturate() function will clamp each component of your color to the range 0.0 to 1.0, which may avoid possible display artifacts.