HLSL: SV_Position, why/how from float3 to float4? - c++

I'm just at the very very beginning of learning shaders/hlsl etc., so please excuse the probably stupid question.
I'm following Microsoft's DirectX Tutorials (Tutorial (link), Code (link)). As far as I understand, they're defining POSITION as a 3-element array of float values:
// Define the input layout
D3D11_INPUT_ELEMENT_DESC layout[] =
{
{ "POSITION", 0, DXGI_FORMAT_R32G32B32_FLOAT, 0, 0, D3D11_INPUT_PER_VERTEX_DATA, 0 },
{ "COLOR", 0, DXGI_FORMAT_R32G32B32A32_FLOAT, 0, 12, D3D11_INPUT_PER_VERTEX_DATA, 0 },
};
Makes sense of course, each vertex position has 3 float values: x, y, and z. But now, looking at the vertex shader, the position is suddenly of type float4, not float3:
//--------------------------------------------------------------------------------------
// Vertex Shader
//--------------------------------------------------------------------------------------
VS_OUTPUT VS( float4 Pos : POSITION, float4 Color : COLOR )
{
VS_OUTPUT output = (VS_OUTPUT)0;
output.Pos = mul( Pos, World );
output.Pos = mul( output.Pos, View );
output.Pos = mul( output.Pos, Projection );
output.Color = Color;
return output;
}
I'm aware a float4 is basically a homogeneous coordinate and needed for the transformations. As this is a position, I'd expect the fourth value of Pos (Pos.w, if you will) to be 1.
But how exactly does this conversion work? I've just defined POSITION to be 3 floats in C++ code, and now I'm suddenly using a float4 in my vertex shader.
In my naivety, I would have expected one of two things to happen:
Either: Pos is initialized as a float4 with all elements zero, and then the first 3 elements are filled with the vertex coordinates. But this would result in the fourth coordinate / w = 0, instead of 1.
Or: Since I've defined the "COLOR" input with AlignedByteOffset=12, i.e. to start at a byte offset of 12, I could've imagined that Pos[0] = first four bytes (vertex position x), Pos[1] = next 4 bytes (vertex.y), Pos[2] = next 4 bytes (vertex.z), and Pos[3] = next 4 bytes - which would be the first element of COLOR.
Why does neither of these alternatives/errors happen? How, and why, does DirectX convert my float3 coordinates automatically to a float4 with w=1?
Thanks!

The default values for vertex attribute components that are missing are (0,0,0,1).
MSDN confirms that this is the case for D3D8 and D3D9, but I can't find an equivalent page confirming that the behaviour continued in 10 and 11. From experience, though, I can say that it did: missing X, Y and Z components are replaced with 0, and a missing W component is replaced with 1.
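If you'd rather not rely on that implicit (0,0,0,1) expansion at all, you can make w explicit on the C++ side instead. A minimal sketch, not from the tutorial, assuming the tutorial's SimpleVertex layout and using DirectXMath (note the COLOR offset moves from 12 to 16):
#include <d3d11.h>
#include <DirectXMath.h>
using namespace DirectX;
// Hypothetical variant of the tutorial's vertex: the position is stored as a
// full float4 with w = 1.0f, so the shader's "float4 Pos : POSITION" receives
// exactly what was written and nothing relies on the implicit expansion.
struct SimpleVertex
{
    XMFLOAT4 Pos;   // x, y, z, w (set w to 1.0f when filling the buffer)
    XMFLOAT4 Color;
};
D3D11_INPUT_ELEMENT_DESC layout[] =
{
    // 4-component position format; COLOR's AlignedByteOffset moves from 12 to 16
    { "POSITION", 0, DXGI_FORMAT_R32G32B32A32_FLOAT, 0,  0, D3D11_INPUT_PER_VERTEX_DATA, 0 },
    { "COLOR",    0, DXGI_FORMAT_R32G32B32A32_FLOAT, 0, 16, D3D11_INPUT_PER_VERTEX_DATA, 0 },
};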

Related

How to draw a dotted pattern 3D line with tessellation in DirectX 11?

I need to draw lines in DirectX 11 that show a dotted pattern, like dotted-pen drawing in GDI. I know tessellation will put more vertices in between on each line. Can anybody make it clear to me how I can get a dotted-pattern line drawn in DirectX 11?
You can tile a small texture in screen space, in the pixel shader. Here's how the pixel shader might look:
Texture2D<float4> patternTexture : register(t0);
static const uint2 patternSize = uint2( 8, 8 );
float4 main( float4 screenSpace : SV_Position ) : SV_Target
{
// Convert position to pixels
const uint2 px = (uint2)screenSpace.xy;
// Tile the pattern texture.
// patternSize is a compile-time constant;
// if it's a power of 2, the `%` compiles into a bitwise AND, which is much faster.
const uint2 readPosition = px % patternSize;
// Read from the pattern texture
return patternTexture.Load( uint3( readPosition, 0 ) );
}
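Not part of the original answer, but for completeness, here is a rough C++-side sketch of how such an 8x8 pattern texture might be created and bound to register t0 (the device/context pointers and the helper name are assumptions):
#include <d3d11.h>
#include <cstdint>
// Create an immutable 8x8 RGBA8 pattern texture and bind it to slot t0,
// matching "patternTexture : register(t0)" in the shader above.
HRESULT CreateAndBindPatternTexture(ID3D11Device* device, ID3D11DeviceContext* context,
                                    ID3D11ShaderResourceView** outSrv)
{
    uint32_t pixels[8 * 8];
    for (int y = 0; y < 8; ++y)
        for (int x = 0; x < 8; ++x)   // simple 4x4 checker as a placeholder pattern
            pixels[y * 8 + x] = ((x / 4) ^ (y / 4)) ? 0xFFFFFFFFu : 0xFF000000u;
    D3D11_TEXTURE2D_DESC desc = {};
    desc.Width = 8;
    desc.Height = 8;
    desc.MipLevels = 1;
    desc.ArraySize = 1;
    desc.Format = DXGI_FORMAT_R8G8B8A8_UNORM;
    desc.SampleDesc.Count = 1;
    desc.Usage = D3D11_USAGE_IMMUTABLE;
    desc.BindFlags = D3D11_BIND_SHADER_RESOURCE;
    D3D11_SUBRESOURCE_DATA init = {};
    init.pSysMem = pixels;
    init.SysMemPitch = 8 * sizeof(uint32_t);
    ID3D11Texture2D* tex = nullptr;
    HRESULT hr = device->CreateTexture2D(&desc, &init, &tex);
    if (FAILED(hr)) return hr;
    hr = device->CreateShaderResourceView(tex, nullptr, outSrv);
    tex->Release();
    if (FAILED(hr)) return hr;
    context->PSSetShaderResources(0, 1, outSrv);   // slot 0 == register(t0)
    return S_OK;
}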
Or you can generate the pattern at runtime, without reading a texture. Here's a pixel shader that skips every other pixel:
float4 main( float4 color: COLOR0, float4 screenSpace : SV_Position ) : SV_Target
{
// Discard every second pixel
const uint2 px = ((uint2)screenSpace.xy) & uint2( 1, 1 );
if( 0 != ( px.x ^ px.y ) )
return color;
discard;
return float4( 0, 0, 0, 0 );
}

TRIANGLESTRIP works as required with DrawIndexed why not with Draw?

I was trying to create a triangle-drawing function with user-defined coordinates using DirectX 11:
void triangle(float anchorx, float anchory, float x1, float y1, float x2, float y2, XMFLOAT4 _color)
{
XMMATRIX scale;
XMMATRIX translate;
XMMATRIX world;
simplevertex framevertices[3] = { XMFLOAT3(anchorx, ui::height - anchory, 1.0f), XMFLOAT2(0.0f, 0.0f),
XMFLOAT3(x1, ui::height - y1, 1.0f), XMFLOAT2(1.0f, 0.0f),
XMFLOAT3(x2, ui::height - y2, 1.0f), XMFLOAT2(1.0f, 1.0f) };
world = XMMatrixIdentity();
dx11::generalcb.world = XMMatrixTranspose(world);// XMMatrixIdentity();
dx11::generalcb.fillcolor = _color;
dx11::generalcb.projection = XMMatrixOrthographicOffCenterLH( 0.0f, ui::width, 0.0f, ui::height, 0.01f, 100.0f );
// copy the vertices into the buffer
D3D11_MAPPED_SUBRESOURCE ms;
dx11::context->Map(dx11::vertexbuffers::trianglevertexbuffer, NULL, D3D11_MAP_WRITE_DISCARD, NULL, &ms);
memcpy(ms.pData, framevertices, sizeof(simplevertex) * 3);
dx11::context->Unmap(dx11::vertexbuffers::trianglevertexbuffer, NULL);
dx11::context->VSSetShader(dx11::shaders::simplevertexshader, NULL, 0);
dx11::context->IASetVertexBuffers(0, 1, &dx11::vertexbuffers::trianglevertexbuffer, &dx11::verticestride, &dx11::verticeoffset);
dx11::context->IASetInputLayout(dx11::shaders::simplevertexshaderlayout);
dx11::context->PSSetShader(dx11::shaders::panelpixelshader, NULL, 0);
dx11::context->UpdateSubresource(dx11::general_cb, 0, NULL, &dx11::generalcb, 0, 0);
dx11::context->IASetIndexBuffer(dx11::indexbuffers::triangleindexbuffer, DXGI_FORMAT_R16_UINT, 0);
dx11::context->IASetPrimitiveTopology(D3D11_PRIMITIVE_TOPOLOGY_TRIANGLESTRIP);
dx11::context->DrawIndexed(4, 0, 0);
//dx11::context->Draw(3, 0);
};
Subtraction on the Y axis when filling the pos.y data:
XMFLOAT3(anchorx, ui::height - anchory, 1.0f)
XMFLOAT3(x1, ui::height - y1, 1.0f)
XMFLOAT3(x2, ui::height - y2, 1.0f)
My orthographic projection puts the coordinate origin at the bottom-left of the screen, so I subtract the passed Y coordinate from the window height to get the proper position on the Y axis. I'm not sure this can affect my problem, because it worked well with all the rest of the primitives (rectangles, textures, filled rectangles, lines and circles).
Index order defined in index buffer:
unsigned short triangleindices[6] = { 0, 1, 2, 0, 2, 3 };
I'm actually using this index buffer to render rectangles, so it's created to render 2 triangles forming a quad; I didn't bother to create a separate triangle index buffer.
trianglevertexbuffer contains an array of 4 simplevertex:
//A VERTEX STRUCT
struct simplevertex
{
XMFLOAT3 pos; // pixel coordinates
XMFLOAT2 uv; // corresponding texture coordinates for this vertex
};
I am not using the UV data in the function above; I just fill it with placeholder values, because the color is passed via the constant buffer. As you can see, I also memcpy only the first 3 elements of the array into the vertex buffer, since a triangle requires only 3.
VERTEX SHADER:
// SIMPLE VERTEX SHADER
cbuffer buffer : register(b0) // constant buffer, set up as 0 in set constant buffers command
{
matrix world; // world matrix
matrix projection; // projection matrix, is orthographic for now
float4 bordercolor;
float4 fillcolor;
float blendindex;
};
// simple vertex shader
float4 main(float4 input : POSITION) : SV_POSITION
{
float4 output = (float4)0; // setting variable fields to zero, may be skipped?
output = mul(input, world); // multiplying vertex shader output by world matrix passed in the constant buffer
output = mul(output, projection); // multiplying on by projection passed in the constant buffer (2d for now), resulting in final output data(.pos)
return output;
}
PIXEL SHADER:
// SIMPLE PIXEL SHADER RETURNING COLOR PASSED IN CONSTANT BUFFER
cbuffer buffer : register(b0) // constant buffer, set up as 0 in set constant buffers command
{
matrix world; // world matrix
matrix projection; // projection matrix, is orthographic for now
float4 bordercolor;
float4 fillcolor;
float blendindex;
};
// pixel shader returning preset color in constant buffer
float4 main(float4 input : SV_POSITION) : SV_Target
{
return fillcolor; // and we just return the color passed to constant buffer
}
Layout used in the function:
{ "POSITION", 0, DXGI_FORMAT_R32G32B32_FLOAT, 0, 0, D3D11_INPUT_PER_VERTEX_DATA, 0 };
Then I added the following lines to the rendering loop:
if (ui::keys[VK_F1]) tools::triangle(700, 100, 400, 300, 850, 700, colors::blue);
if (ui::keys[VK_F2]) tools::triangle(400, 300, 700, 100, 850, 700, colors::blue);
if (ui::keys[VK_F3]) tools::triangle(700, 100, 850, 700, 400, 300, colors::blue);
if (ui::keys[VK_F4]) tools::triangle(850, 700, 400, 300, 700, 100, colors::blue);
if (ui::keys[VK_F5]) tools::triangle(850, 700, 700, 100, 400, 300, colors::blue);
if (ui::keys[VK_F6]) tools::triangle(400, 300, 850, 700, 700, 100, colors::blue);
The point of this setup is to allow any order of coordinates to form a triangle for rendering, and that is actually the point of this question: with some coordinate orders the triangle was not rendered, so I came to the conclusion that it comes down to the TOPOLOGY. As you can see, at the moment I have:
dx11::context->IASetPrimitiveTopology(D3D11_PRIMITIVE_TOPOLOGY_TRIANGLESTRIP);
dx11::context->DrawIndexed(4, 0, 0);
This is the only combination that draws all the triangles, but I honestly don't understand how that happens; from what I know, STRIP topology is used with the context->Draw function, while LIST works with an index buffer setup. I tried the following next:
dx11::context->IASetPrimitiveTopology(D3D11_PRIMITIVE_TOPOLOGY_TRIANGLELIST);
dx11::context->DrawIndexed(4, 0, 0);
Triangles F1, F5 and F6 were not drawn. Alright, next:
dx11::context->IASetPrimitiveTopology(D3D11_PRIMITIVE_TOPOLOGY_TRIANGLESTRIP);
//dx11::context->DrawIndexed(4, 0, 0);
dx11::context->Draw(3, 0);
Same story, F1, F5 and F6 are not rendered.
I cannot understand what is going on; you may find the code a bit primitive, but I only want to know why I get a working result only with the combination of STRIP topology and the DrawIndexed function. I hope I provided enough information; sorry if not, I'll correct it on demand. Thank you :)
Got it.
First of all, in my rasterizer settings I had rasterizerState.CullMode = D3D11_CULL_FRONT (I don't remember why; maybe I just copy-pasted someone else's code). I changed it to D3D11_CULL_NONE and everything worked as intended with:
dx11::context->IASetPrimitiveTopology(D3D11_PRIMITIVE_TOPOLOGY_TRIANGLESTRIP);
dx11::context->Draw(3, 0);
Second, I found out that whether a primitive is front-facing or back-facing depends on the winding direction: if the vertices are drawn counter-clockwise (vertex 2 is to the right of vertex 1 at the start), the primitive is considered to be facing the view; if they go clockwise, then we see its "back" instead.
So I decided to keep D3D11_CULL_NONE instead of adding logic to fix the vertex order; I don't know for sure whether D3D11_CULL_NONE costs performance, though.
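For reference, a minimal sketch of what that rasterizer-state change looks like in code (the device/context names are assumed, everything else is left at the defaults):
#include <d3d11.h>
// Create and bind a rasterizer state that never culls, so the winding order
// of the triangle's vertices no longer matters.
HRESULT SetNoCullRasterizer(ID3D11Device* device, ID3D11DeviceContext* context)
{
    D3D11_RASTERIZER_DESC rd = {};
    rd.FillMode = D3D11_FILL_SOLID;
    rd.CullMode = D3D11_CULL_NONE;      // was D3D11_CULL_FRONT
    rd.FrontCounterClockwise = FALSE;   // D3D11 default: clockwise winding is front-facing
    rd.DepthClipEnable = TRUE;
    ID3D11RasterizerState* state = nullptr;
    HRESULT hr = device->CreateRasterizerState(&rd, &state);
    if (FAILED(hr)) return hr;
    context->RSSetState(state);
    state->Release();   // the context holds its own reference
    return S_OK;
}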

D3D11/C++ Inaccuracies in uv interpolation in pixel shader. How to avoid?

I'm trying to draw a quad with a texture onto the screen such that texels and pixels perfectly align. Sounds pretty easy. I draw 2 triangles (as TRIANGLE_LIST, so 6 vertices) using these shaders:
struct VSOutput
{
float4 position : SV_POSITION;
float2 uv : TEXCOORD0;
};
VSOutput VS_Draw(uint index : SV_VertexId)
{
uint vertexIndex = index % 6;
// compute face in [0,0]-[1,1] space
float2 vertex = 0;
switch (vertexIndex)
{
case 0: vertex = float2(0, 0); break;
case 1: vertex = float2(1, 0); break;
case 2: vertex = float2(0, 1); break;
case 3: vertex = float2(0, 1); break;
case 4: vertex = float2(1, 0); break;
case 5: vertex = float2(1, 1); break;
}
// compute uv
float2 uv = vertex;
// scale to size
vertex = vertex * (float2)outputSize;
vertex = vertex + topLeftPos;
// convert to screen space
VSOutput output;
output.position = float4(vertex / (float2)outputSize * float2(2.0f, -2.0f) + float2(-1.0f, 1.0f), 0, 1);
output.uv = uv;
return output;
}
float4 PS_Draw(VSOutput input) : SV_TARGET
{
uint2 pixelPos = (uint2)(input.uv * (float2)outputSize);
// output checker of 4x4
return (((pixelPos.x >> 2) & 1) ^ ((pixelPos.y >> 2) & 1) != 0) ? float4(0, 1, 1, 0) : float4(1, 1, 0, 0);
}
where outputSize and topLeftPos are constants and expressed in pixel units.
Now for outputSize = (102,12) and topLeftPos=(0,0) I get (what I would expect):
link to image (as I'm not allowed to post images)
But for outputSize = (102,12) and topLeftPos=(0,0.5) I get: Output for x=0, y=0.5
link to image (as I'm not allowed to post images)
As you can see, there is a UV discontinuity where the 2 triangles connect (the interpolation of the UV is inaccurate). This basically happens (in x and y) only at positions around .5 (actually, below .49 it correctly snaps to texel 0 and above .51 it correctly snaps to texel 1, but in between I get this artifact).
Now, for the purpose I need this for, it is essential to have pixel-perfect mapping. Can anyone enlighten me as to why this happens?
There are two things you need to consider to understand what is happening:
Pixel corners in window space have integer coordinates, and pixel centers in window space have half-integer coordinates.
When a triangle is rasterized, D3D11 interpolates all attributes at pixel centers.
So what is happening is that when topLeftPos=(0,0), the value of input.uv * (float2)outputSize is always half-integer, and it is consistently rounded down to the closest integer. When topLeftPos=(0,0.5), however, (input.uv * (float2)outputSize).y should always be exactly an integer; due to floating-point precision, it is sometimes slightly less than that integer, and in that case the truncation rounds it down one texel too far. This is where you see your stretched squares.
So if you want perfect mapping, your source square should be aligned with the pixel boundaries and not pixel centers.
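To illustrate how sensitive the truncating cast in PS_Draw is to that precision issue, here's a tiny CPU-side example; the 4.9999995f value is invented purely for demonstration, since the actual interpolated values depend on the rasterizer:
#include <cstdint>
#include <cstdio>
int main()
{
    // PS_Draw computes (uint2)(input.uv * (float2)outputSize). If that product
    // should be exactly 5.0 but interpolation delivers a value one ulp below it,
    // the truncating cast lands in texel 4 instead of texel 5.
    float exact       = 5.0f;
    float slightlyLow = 4.9999995f;   // invented value, roughly 1 ulp below 5.0f
    printf("%u %u\n", (uint32_t)exact, (uint32_t)slightlyLow);   // prints "5 4"
    return 0;
}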

Hresult : An undetermined error occurred

I'm trying to load a shader file using the D3DX11CompileFromFile() function, and I get the HRESULT error "An undetermined error occurred" (I'm using DXGetErrorDescription() to debug). The weird thing is that I can't even find that error listed in the MSDN documentation of HRESULT. I'm a bit unsure what the problem could be, because lately my Visual Studio 2010 C++ Express has been acting up, giving me loads of errors unrelated to the code. Has anyone else encountered this error, and if so, what was the problem?
I have only made a few small changes to the program since it last worked perfectly. I'm following a tutorial too, so one would expect the code to be correct, but seeing as I am a newbie, I will list the areas changed since it last worked, just in case there is some silly error.
.fx -file:
VS_OUTPUT VS(float4 inPos : POSITION, float4 inColor : COLOR)
{
VS_OUTPUT output;
output.Pos = inPos;
output.Color = inColor;
return output;
}
float4 PS(VS_OUTPUT input) : SV_TARGET
{
return input.Color;
}
The other code that has been changed:
struct Vertex //Overloaded Vertex Structure
{
Vertex(){}
Vertex(float x, float y, float z, float cr, float cg, float cb, float ca)
: pos(x,y,z), color(cr,cg,cb,ca){}
XMFLOAT3 pos;
XMFLOAT4 color;
};
//the layout, one element for each variable in the vertex struct
D3D11_INPUT_ELEMENT_DESC layout[] =
{
{"POSITION", 0, DXGI_FORMAT_R32G32B32_FLOAT, 0, 0, D3D11_INPUT_PER_VERTEX_DATA, 0 },
{"COLOR", 0, DXGI_FORMAT_R32G32B32A32_FLOAT, 0, 12, D3D11_INPUT_PER_VERTEX_DATA, 0 },
};
UINT numElements = ARRAYSIZE(layout); //the number of elements
I also added the new parameters to the vertex structs in the places where I use them.
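One thing worth checking (a suggestion, not from the original thread): when D3DX11CompileFromFile() fails, the actual compiler diagnostics usually end up in the error blob rather than in the HRESULT. A sketch of dumping it, where the file name, entry point and profile are placeholders for your own:
#include <windows.h>
#include <d3dx11.h>
// Compile an .fx/.hlsl file and print any compiler errors to the debug output.
ID3DBlob* CompileWithDiagnostics()
{
    ID3DBlob* shaderBlob = nullptr;
    ID3DBlob* errorBlob  = nullptr;
    HRESULT hr = D3DX11CompileFromFile(TEXT("Effects.fx"), nullptr, nullptr,
                                       "VS", "vs_4_0", 0, 0,
                                       nullptr, &shaderBlob, &errorBlob, nullptr);
    if (FAILED(hr))
    {
        if (errorBlob)   // the real syntax/semantic error message ends up here
        {
            OutputDebugStringA((const char*)errorBlob->GetBufferPointer());
            errorBlob->Release();
        }
        return nullptr;
    }
    if (errorBlob) errorBlob->Release();   // may contain warnings even on success
    return shaderBlob;
}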

DirectX Clip space texture coordinates

Okay first up I am using:
DirectX 10
C++
Okay, this is a bit of a bizarre one to me. I wouldn't usually ask the question, but I've been forced to by circumstance. I have two triangles (not a quad, for reasons I won't go into!) covering the full screen, aligned to the screen by virtue of not being transformed.
In the DirectX vertex declaration I am passing a 3-component float (Pos x,y,z) and a 2-component float (Texcoord x,y). Texcoord z is reserved for texture2d arrays, which I'm currently defaulting to 0 in the pixel shader.
I wrote this to achieve the simple task:
float fStartX = -1.0f;
float fEndX = 1.0f;
float fStartY = 1.0f;
float fEndY = -1.0f;
float fStartU = 0.0f;
float fEndU = 1.0f;
float fStartV = 0.0f;
float fEndV = 1.0f;
vmvUIVerts.push_back(CreateVertex(fStartX, fStartY, 0, fStartU, fStartV));
vmvUIVerts.push_back(CreateVertex(fEndX, fStartY, 0, fEndU, fStartV));
vmvUIVerts.push_back(CreateVertex(fEndX, fEndY, 0, fEndU, fEndV));
vmvUIVerts.push_back(CreateVertex(fStartX, fStartY, 0, fStartU, fStartV));
vmvUIVerts.push_back(CreateVertex(fEndX, fEndY, 0, fEndU, fEndV));
vmvUIVerts.push_back(CreateVertex(fStartX, fEndY, 0, fStartU, fEndV));
IA Layout: (Update)
D3D10_INPUT_ELEMENT_DESC ieDesc[2] = {
{ "POSITION", 0, DXGI_FORMAT_R32G32B32_FLOAT, 0, 0, D3D10_INPUT_PER_VERTEX_DATA, 0 },
{ "TEXCOORD", 0, DXGI_FORMAT_R32G32B32_FLOAT, 0,12, D3D10_INPUT_PER_VERTEX_DATA, 0 }
};
Data reaches the vertex shader in the following format: (Update)
struct VS_INPUT
{
float3 fPos :POSITION;
float3 fTexcoord :TEXCOORD0;
}
Within my vertex and pixel shaders not a lot is happening for this particular draw call; the pixel shader does most of the work, sampling from a texture using the specified UV coordinates. However, this isn't working quite as expected; it appears that I am getting only 1 pixel of the sampled texture.
The workaround, in the pixel shader, was to do the following: (Update)
sampler s0 : register(s0);
Texture2DArray<float4> meshTex : register(t0);
float4 psMain(in VS_OUTPUT vOut) : SV_TARGET
{
float4 Color;
vOut.fTexcoord.z = 0;
vOut.fTexcoord.x = vOut.fPos.x * 0.5f;
vOut.fTexcoord.y = vOut.fPos.y * 0.5f;
vOut.fTexcoord.x += 0.5f;
vOut.fTexcoord.y += 0.5f;
Color = meshTex.Sample( s0, vOut.fTexcoord );
Color.a = 1.0f;
return Color;
}
It is also worth noting that this worked with the following VS output struct defined in the shaders:
struct VS_OUTPUT
{
float4 fPos :POSITION0; // SV_POSITION won't work in this case
float3 fTexcoord :TEXCOORD0;
}
Now I have a texture that's stretched to fit the entire screen (both triangles already cover it), but why did the texture UVs not get used as expected?
To clarify: I am using a point sampler and have tried both clamping and wrapping the UVs.
I was a bit curious and found the solution/workaround mentioned above; however, I'd prefer not to have to use it, if anyone knows why this is happening.
What semantics are you specifying for your vertex-type? Are they properly aligned with your vertices and also your shader? If you are using a D3DXVECTOR4, D3DXVECTOR3 setup (as shown in your VS code) this could be a problem if your CreateVertex() returns a D3DXVECTOR3, D3DXVECTOR2 struct.
It would be reassuring to see your pixel-shader code too.
Okay, well, for one, texture coordinates outside of the 0..1 range get clamped. I made the mistake of assuming that because clip space goes from -1 to +1, the texture coordinates would too. This is not the case; they still go from 0.0 to 1.0.
The reason the code in the pixel shader worked is that it was using the clip-space x, y, z coordinates to overwrite those texture coordinates; an oversight on my part. However, the pixel shader code results in the texture being stretched across a full-screen 'quad', so it might be useful to someone somewhere ;)
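For reference, the remap that workaround relies on is just a scale and offset from clip space [-1, +1] to texture space [0, 1]. A small illustrative sketch of that arithmetic (in C++ rather than HLSL; depending on your texture orientation the V component may need a sign flip):
#include <cstdio>
// Remap clip-space XY in [-1, +1] to texture coordinates in [0, 1],
// mirroring the arithmetic used in the pixel-shader workaround above.
void ClipToUV(float clipX, float clipY, float& u, float& v)
{
    u = clipX * 0.5f + 0.5f;
    v = clipY * 0.5f + 0.5f;
}
int main()
{
    float u, v;
    ClipToUV(-1.0f, -1.0f, u, v);   // one corner of the full-screen triangle pair
    printf("u=%f v=%f\n", u, v);    // prints "u=0.000000 v=0.000000"
    return 0;
}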