How to declare stream output in GS in HLSL?

I want to implement the stream output stage in DX9 using HLSL. Stream output buffers are fed from geometry shaders. How do I implement this stream output logic in the GS when writing HLSL?

DX9 doesn't support geometry shaders or stream output; both were introduced with Direct3D 10.

Related

Geometry shader and Draw calls

I am working on an old code base that uses geometry shaders. It uses glProgramParameteriEXT(...) to enable and specify the input/output of the GS. The code in question renders curves.
The input to the GS (set via the above-mentioned method, not with a layout qualifier in the GLSL) is GL_LINES_ADJACENCY and the output is GL_TRIANGLE_STRIP. In the actual GLSL geometry shader, the code calls EmitVertex() twice, as would be expected for GL_LINES. Then, at draw time, glDrawElements(GL_LINES, ...) is used. Isn't the draw call supposed to match what the GS outputs, which is GL_TRIANGLE_STRIP? I am not too familiar with geometry shaders, so I must be missing something. If that is the case, how does OpenGL figure out to draw triangles while it is told to draw lines?
Thanks!
The geometry shader generates primitives from input primitives. The draw call determines the input primitive type; the GS determines the output primitive type. OpenGL "figures out" that it is drawing triangles because that's what the GS says to draw. The GS is part of OpenGL; it's not some foreign entity that gets in OpenGL's way.
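For illustration, a minimal sketch of the legacy EXT_geometry_shader4 wiring; prog, indexCount, and the vertices-out count of 4 are illustrative assumptions, not values from the code base in question:

    // Assumes an extension loader (e.g. GLEW) and a program object "prog".
    // These parameters take effect when the program is (re)linked.
    glProgramParameteriEXT(prog, GL_GEOMETRY_INPUT_TYPE_EXT,  GL_LINES_ADJACENCY_EXT);
    glProgramParameteriEXT(prog, GL_GEOMETRY_OUTPUT_TYPE_EXT, GL_TRIANGLE_STRIP);
    glProgramParameteriEXT(prog, GL_GEOMETRY_VERTICES_OUT_EXT, 4);
    glLinkProgram(prog);

    // The draw mode feeds the GS *input*; the rasterizer consumes the
    // GS *output* (triangle strips here), whatever the draw mode was.
    glDrawElements(GL_LINES_ADJACENCY_EXT, indexCount, GL_UNSIGNED_INT, nullptr);

In modern GLSL (1.50+) the same information is declared in the shader itself, with layout(lines_adjacency) in; and layout(triangle_strip, max_vertices = 4) out;.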

Binding OpenGL texture to OpenCL buffer

I am working on a project that requires some rendering with OpenGL and then passing the output texture to OpenCL for post-processing. The problem is that our kernels work with buffers, not images, and the final output should also be a buffer, so changing the kernels to work with image2d instead of buffers is not an option.
Of course, mapping an OpenGL buffer/texture to the same type in OpenCL is an easy task, but it seems there is no direct way to map OpenGL output (either texture or renderbuffer objects) to an OpenCL buffer without additional steps/memory allocation, such as copying GL texture data to a PBO or a CL image to a buffer, etc. The ability to bind GL buffer objects as framebuffer output would be nice, but I haven't found anything like this so far. I thought about GL_TEXTURE_BUFFER as a rendering target, but OpenGL prohibits using it with a framebuffer.
So, the question is: is there any way to directly render with OpenGL into a vertex buffer object, and if not, what is the most efficient (time/memory) way to convert an OpenGL texture into an OpenCL buffer?
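For reference, a hedged sketch of one of the copy paths mentioned above: share the GL texture with CL as an image, then copy it into a CL buffer entirely on the device. It assumes an interop context created with cl_khr_gl_sharing, OpenCL 1.2, an RGBA8 texture, and illustrative names (ctx, queue, glTex, width, height):

    #include <CL/cl.h>
    #include <CL/cl_gl.h>

    cl_int err;
    cl_mem clImage = clCreateFromGLTexture(ctx, CL_MEM_READ_ONLY,
                                           GL_TEXTURE_2D, 0, glTex, &err);
    cl_mem clBuf = clCreateBuffer(ctx, CL_MEM_READ_WRITE,
                                  width * height * 4, nullptr, &err);

    glFinish();  // ensure GL rendering into the texture has completed
    clEnqueueAcquireGLObjects(queue, 1, &clImage, 0, nullptr, nullptr);

    // Device-side copy: CL image -> CL buffer, no round trip to the host.
    size_t origin[3] = {0, 0, 0};
    size_t region[3] = {width, height, 1};
    clEnqueueCopyImageToBuffer(queue, clImage, clBuf, origin, region,
                               0, 0, nullptr, nullptr);

    clEnqueueReleaseGLObjects(queue, 1, &clImage, 0, nullptr, nullptr);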

Reading back generated vertices and fragments in OpenGL

Is there any way to read the vertices and fragments generated by the vertex and fragment shaders back from server (GPU) space to client space?
Are there specific functions to do this, or some method by which it is done?
If so, what is the function call or method?
Yes.
You can read the vertex shader output with transform feedback.
You can read the fragment shader output with glReadPixels().
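A minimal sketch of both readback paths, assuming desktop GL 3.0+, a loader such as GLEW, and a vertex shader that declares an out variable named capturedPos (the names and counts are illustrative):

    #include <vector>

    // 1) Vertex-shader output via transform feedback.
    const char* varyings[] = { "capturedPos" };
    glTransformFeedbackVaryings(prog, 1, varyings, GL_INTERLEAVED_ATTRIBS);
    glLinkProgram(prog);  // the varying list only takes effect at link time

    GLuint tfbo;
    glGenBuffers(1, &tfbo);
    glBindBuffer(GL_TRANSFORM_FEEDBACK_BUFFER, tfbo);
    glBufferData(GL_TRANSFORM_FEEDBACK_BUFFER,
                 vertexCount * 4 * sizeof(float), nullptr, GL_STATIC_READ);
    glBindBufferBase(GL_TRANSFORM_FEEDBACK_BUFFER, 0, tfbo);

    glBeginTransformFeedback(GL_POINTS);   // must match the draw mode
    glDrawArrays(GL_POINTS, 0, vertexCount);
    glEndTransformFeedback();

    std::vector<float> verts(vertexCount * 4);  // back to client space
    glGetBufferSubData(GL_TRANSFORM_FEEDBACK_BUFFER, 0,
                       verts.size() * sizeof(float), verts.data());

    // 2) Fragment-shader output: read the framebuffer after rendering.
    std::vector<unsigned char> pixels(width * height * 4);
    glReadPixels(0, 0, width, height, GL_RGBA, GL_UNSIGNED_BYTE, pixels.data());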

D3D11 Writing to buffer in geometry shader

I have some working OpenGL code that I was asked to port to Direct3D 11.
In my code I am using Shader Storage Buffer Objects (SSBOs) to read and write data in a geometry shader.
I am pretty new to Direct3D programming. Thanks to Google I've been able to identify the D3D equivalent of SSBOs: RWStructuredBuffer (I think).
The problem is that I am not at all sure I can use them in a geometry shader in D3D11, which, from what I understand, can generally only use up to four "stream outs" (are these some sort of transform feedback buffer?).
The question is: is there any way with D3D11/11.1 to do what I'm doing in OpenGL (that is, writing to SSBOs from the geometry shader)?
UPDATE:
I just found this page: http://msdn.microsoft.com/en-us/library/windows/desktop/hh404562%28v=vs.85%29.aspx
If I understand the section "Use UAVs at every pipeline stage" correctly, it seems that accessing such buffers is allowed in all shader stages.
Then I discovered that D3D11.1 is available only on Windows 8, but some features were also ported to Windows 7.
Is this part of Direct3D included in those features available on Windows 7?
RWBuffers are not related to the geometry shader outputting geometry; they are found mostly in compute shaders and, to a lesser extent, in pixel shaders. As you spotted, the other stages need D3D 11.1 and Windows 8.
What you are looking for is stream output. The API to bind buffers to the output of the geometry shader stage is ID3D11DeviceContext::SOSetTargets, and the buffers need to be created with the flag D3D11_BIND_STREAM_OUTPUT.
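For concreteness, a minimal sketch of that setup. Note the stream-output layout is declared on the API side at shader-creation time, not in HLSL; gsBytecode, gsSize, and maxVertices are assumed names:

    #include <d3d11.h>

    // Declare which GS output components get captured (API-side, not HLSL).
    D3D11_SO_DECLARATION_ENTRY soDecl[] = {
        // stream, semantic, index, start component, component count, slot
        { 0, "SV_POSITION", 0, 0, 4, 0 },
        { 0, "TEXCOORD",    0, 0, 2, 0 },
    };
    UINT strides[] = { (4 + 2) * sizeof(float) };  // bytes per captured vertex

    ID3D11GeometryShader* gs = nullptr;
    device->CreateGeometryShaderWithStreamOutput(
        gsBytecode, gsSize, soDecl, ARRAYSIZE(soDecl),
        strides, ARRAYSIZE(strides),
        D3D11_SO_NO_RASTERIZED_STREAM,  // or 0 to also rasterize stream 0
        nullptr, &gs);

    // The capture buffer must be created with the stream-output bind flag.
    D3D11_BUFFER_DESC bd = {};
    bd.ByteWidth = maxVertices * strides[0];
    bd.Usage = D3D11_USAGE_DEFAULT;
    bd.BindFlags = D3D11_BIND_STREAM_OUTPUT | D3D11_BIND_VERTEX_BUFFER;
    ID3D11Buffer* soBuffer = nullptr;
    device->CreateBuffer(&bd, nullptr, &soBuffer);

    UINT offset = 0;
    context->SOSetTargets(1, &soBuffer, &offset);
    // ... draw; the GS output is now captured into soBuffer ...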
Also, outputting geometry with a geometry shader was an addition in D3D10; in D3D11 it is often possible to achieve something at least as efficient, and simpler, with compute shaders. That's not absolute advice, of course.
The geometry shader is processed once per assembled primitive and can generate zero or more primitives as a result.
The output of the geometry shader can be redirected to an output buffer instead of being passed on for rasterization.
See this overview diagram of the pipeline and this description of the pipeline stages.
A geometry shader has access to other resources, bound via the GSSetShaderResources method on the device context. However, these are generally resources that are "fixed" at shader execution time such as constants and textures. The data that varies for each execution of the geometry shader is the input primitive to the shader.
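As an illustration, binding such a read-only resource to the GS stage looks like this (srv is an assumed ID3D11ShaderResourceView* for a texture or buffer):

    ID3D11ShaderResourceView* srvs[] = { srv };
    context->GSSetShaderResources(0 /* start slot */, 1, srvs);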
I've just been pointed to this page:
http://nvidia.custhelp.com/app/answers/detail/a_id/3196/~/fermi-and-kepler-directx-api-support
In short, NVIDIA does not support the feature on cards older than Maxwell.
This pretty much answers my question. :/

OpenGL Y420 Shaders

I am working on OpenGL shaders. My current shaders take textures in RGB24 format and display them. I want to take Y420 (YUV 4:2:0) as input and convert it to RGB24 at the fragment-shader level. Please guide me on how to proceed.
The capability to render YUV via OpenGL depends on the platform. Most platforms expose YUV streaming-texture capability via extensions, for example GL_OES_EGL_image_external: http://www.khronos.org/registry/gles/extensions/OES/OES_EGL_image_external.txt
For eglImage-based streaming, you can refer to TEST16 in the sgxperf codebase at:
https://github.com/prabindh/sgxperf/blob/master/sgxperf_gles20_vg.cpp
Additionally, when you use these extensions, it is NOT necessary to do any conversions in the shader. The sampler (HW) already does the conversion to RGB before you process it in the shader.
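To make that concrete, a minimal sketch of the GLES2 side, assuming an EGLImageKHR named yuvImage already wraps the Y420 buffer (all names are illustrative):

    #include <GLES2/gl2.h>
    #include <GLES2/gl2ext.h>

    // Fragment shader: no YUV math needed, the external sampler
    // hardware delivers RGB (requires GL_OES_EGL_image_external).
    static const char* kFragSrc =
        "#extension GL_OES_EGL_image_external : require\n"
        "precision mediump float;\n"
        "varying vec2 vTexCoord;\n"
        "uniform samplerExternalOES uTex;\n"
        "void main() { gl_FragColor = texture2D(uTex, vTexCoord); }\n";

    GLuint tex;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_EXTERNAL_OES, tex);
    // Bind the EGLImage as the texture's storage (on real platforms this
    // function pointer is usually fetched via eglGetProcAddress).
    glEGLImageTargetTexture2DOES(GL_TEXTURE_EXTERNAL_OES,
                                 (GLeglImageOES)yuvImage);
    glTexParameteri(GL_TEXTURE_EXTERNAL_OES, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_EXTERNAL_OES, GL_TEXTURE_MAG_FILTER, GL_LINEAR);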
For developing a complete application, though, you typically need additional mechanisms such as synchronisation with the display, etc. If your vendor provides a GStreamer sink that integrates GL streaming functionality, that would be the best option.