DirectX11 Non-Solid wireframe - c++

I'm attempting to draw a 100x100 grid in DirectX11 using C++, but I'm running into the following issue:
The grid is drawn in wireframe, using a rasterizer with its fill mode set to 'D3D11_FILL_WIREFRAME' and a 'D3D11_PRIMITIVE_TOPOLOGY_LINELIST' topology. The lines of the wireframe appear staggered rather than straight, with some parts of the wireframe missing.
As I don't know what this issue is called, I'm not sure what I should be searching for, so any help is appreciated.

This is a classic problem of 'aliasing'.
With Direct3D 11, you can either use MSAA, which is a general anti-aliasing option, or use a dedicated line-smoothing algorithm if you are not using MSAA. See the D3D11_RASTERIZER_DESC fields AntialiasedLineEnable and MultisampleEnable.
See Aliasing and Multisample anti-aliasing
UPDATE: I've added MSAA and the AA mode to the DirectX Tool Kit tutorial.
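For illustration, here is a minimal sketch (not from the original answer) of a wireframe rasterizer state with the line anti-aliasing flag turned on; the helper function name is my own.

```cpp
// Sketch: wireframe rasterizer state with D3D11's line anti-aliasing enabled.
// Assumes the caller already has an ID3D11Device from its own initialization code.
#include <d3d11.h>
#include <wrl/client.h>

Microsoft::WRL::ComPtr<ID3D11RasterizerState> CreateWireframeLineAAState(ID3D11Device* device)
{
    D3D11_RASTERIZER_DESC rd = {};
    rd.FillMode = D3D11_FILL_WIREFRAME;
    rd.CullMode = D3D11_CULL_NONE;
    rd.DepthClipEnable = TRUE;
    // On a non-MSAA render target, AntialiasedLineEnable smooths line primitives.
    // On an MSAA render target, set MultisampleEnable = TRUE instead.
    rd.AntialiasedLineEnable = TRUE;
    rd.MultisampleEnable = FALSE;

    Microsoft::WRL::ComPtr<ID3D11RasterizerState> state;
    device->CreateRasterizerState(&rd, state.GetAddressOf());
    return state; // bind with context->RSSetState(state.Get()) before drawing the grid
}
```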

I stumbled across the solution to this, and it wasn't what I expected at all. Since the window I created had borders, the window's client area was smaller than the window itself. When creating my graphics device, I was initializing it with the window's size rather than the window's client size, and somehow this caused the issue above.
I fixed this by implementing a 'GetClientSize' method for the window, using 'GetClientRect'.
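A minimal sketch of such a helper, assuming a plain Win32 HWND; GetClientRect returns the client area directly, excluding borders and the title bar.

```cpp
// Sketch: query the window's client area so the swap chain / viewport match it,
// instead of the full window rectangle (which includes borders and the title bar).
#include <windows.h>

void GetClientSize(HWND hwnd, UINT& width, UINT& height)
{
    RECT rc = {};
    GetClientRect(hwnd, &rc);          // client area only
    width  = static_cast<UINT>(rc.right - rc.left);
    height = static_cast<UINT>(rc.bottom - rc.top);
}
```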

Related

What is 'multisamples per pixel' in DirectX11's DXGI_SAMPLE_DESC?

I was reading the documentation for DXGI_SWAP_CHAIN_DESC and I came across DXGI_SAMPLE_DESC:
Count
Type: UINT
The number of multisamples per pixel.
Now, what exactly are 'multisamples per pixel'?
DXGI_SAMPLE_DESC, as you surmised, is for specifying Multi-Sample Anti-Aliasing (MSAA).
That said, you should be aware that the SwapChain support for MSAA is not something you should use anymore. As such, just always set DXGI_SWAP_CHAIN_DESC.SampleDesc.Count = 1; and DXGI_SWAP_CHAIN_DESC.SampleDesc.Quality = 0;.
Instead, to use MSAA you should explicitly create your own MSAA render target and explicitly resolve it yourself before presenting the result to the single-sample SwapChain. For details on why and how, see this blog post series.
Note that you can use MSAA SwapChains for DirectX 11 with the older DXGI_SWAP_EFFECT_DISCARD and DXGI_SWAP_EFFECT_SEQUENTIAL flip-effects, and the DirectX 11 runtime will do the resolve automatically. Per the blog post, this is NOT supported for DirectX 12 or the use of modern DXGI_SWAP_EFFECT_FLIP_DISCARD or DXGI_SWAP_EFFECT_FLIP_SEQUENTIAL swap effects. This is really a 'toy' setup as any production rendering will do additional processing after the resolve from multi-sample to single-sample before putting it into the swapchain for display.
As you are likely new to DirectX 11, you may want to look at DirectX Tool Kit. I have a tutorial that covers MSAA.
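To make the render-target-plus-resolve approach described above concrete, here is a minimal Direct3D 11 sketch under stated assumptions: a 4x MSAA, BGRA8 target the same size as the swap chain, and a swap chain whose own SampleDesc is Count = 1, Quality = 0. All names are placeholders, not part of the original answer.

```cpp
// Sketch: dedicated MSAA render target, resolved into the single-sample backbuffer.
#include <d3d11.h>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

HRESULT CreateMsaaTarget(ID3D11Device* device, UINT width, UINT height,
                         ComPtr<ID3D11Texture2D>& msaaTex,
                         ComPtr<ID3D11RenderTargetView>& msaaRTV)
{
    D3D11_TEXTURE2D_DESC desc = {};
    desc.Width = width;
    desc.Height = height;
    desc.MipLevels = 1;
    desc.ArraySize = 1;
    desc.Format = DXGI_FORMAT_B8G8R8A8_UNORM;
    desc.SampleDesc.Count = 4;     // 4x MSAA; verify with CheckMultisampleQualityLevels
    desc.SampleDesc.Quality = 0;
    desc.Usage = D3D11_USAGE_DEFAULT;
    desc.BindFlags = D3D11_BIND_RENDER_TARGET;

    HRESULT hr = device->CreateTexture2D(&desc, nullptr, &msaaTex);
    if (FAILED(hr)) return hr;
    return device->CreateRenderTargetView(msaaTex.Get(), nullptr, &msaaRTV);
}

// Each frame: render the scene into msaaRTV, then resolve into the backbuffer
// (created with SampleDesc.Count = 1) right before Present().
void ResolveToBackbuffer(ID3D11DeviceContext* context, IDXGISwapChain* swapChain,
                         ID3D11Texture2D* msaaTex)
{
    ComPtr<ID3D11Texture2D> backbuffer;
    HRESULT hr = swapChain->GetBuffer(0, __uuidof(ID3D11Texture2D),
                                      reinterpret_cast<void**>(backbuffer.GetAddressOf()));
    if (SUCCEEDED(hr))
    {
        context->ResolveSubresource(backbuffer.Get(), 0, msaaTex, 0,
                                    DXGI_FORMAT_B8G8R8A8_UNORM);
    }
}
```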
OK, it's anti-aliasing (my bad, I didn't know that).
Anti-aliasing is a technique used to get rid of the 'jaggies' that form on screen. Since pixels are rectangular, curved or diagonal edges come out as small stair-steps. Anti-aliasing smooths these out so the shape appears to have clean, rounded edges.

How can I implement MSAA in DX12?

I searched many other questions and samples, but I still can't understand what I must do.
What I know about this process is:
Create a render target for MSAA, separate from the swap chain's back buffer.
Draw everything (like meshes) onto the MSAA render target.
Copy the contents of the MSAA render target to the current back buffer using the ResolveSubresource function.
Is this the right process, or is there a part I left out?
These samples demonstrate using MSAA with DirectX12:
https://github.com/microsoft/Xbox-ATG-Samples/tree/master/PCSamples/IntroGraphics/SimpleMSAA_PC12
https://github.com/microsoft/Xbox-ATG-Samples/tree/master/UWPSamples/IntroGraphics/SimpleMSAA_UWP12
I also cover this (among other topics) in this blog series.
Per the comments, you can also find MSAA covered in the DirectX Tool Kit for DX12 tutorials.
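The three steps listed in the question match what those samples do. As a non-authoritative sketch of just the resolve step in D3D12, assuming the d3dx12.h helper header from the samples, a recording command list, an MSAA render target currently in the RENDER_TARGET state, and a back buffer currently in the PRESENT state:

```cpp
// Sketch: resolve an MSAA render target into the swap chain's back buffer in D3D12.
#include <d3d12.h>
#include "d3dx12.h"   // CD3DX12_RESOURCE_BARRIER helper from the DirectX samples

void ResolveMsaa(ID3D12GraphicsCommandList* cmdList,
                 ID3D12Resource* msaaTarget, ID3D12Resource* backBuffer,
                 DXGI_FORMAT format)
{
    // Move both resources into the states ResolveSubresource requires.
    D3D12_RESOURCE_BARRIER toResolve[2] = {
        CD3DX12_RESOURCE_BARRIER::Transition(msaaTarget,
            D3D12_RESOURCE_STATE_RENDER_TARGET, D3D12_RESOURCE_STATE_RESOLVE_SOURCE),
        CD3DX12_RESOURCE_BARRIER::Transition(backBuffer,
            D3D12_RESOURCE_STATE_PRESENT, D3D12_RESOURCE_STATE_RESOLVE_DEST),
    };
    cmdList->ResourceBarrier(2, toResolve);

    cmdList->ResolveSubresource(backBuffer, 0, msaaTarget, 0, format);

    // Return the back buffer to PRESENT for the swap chain; the MSAA target would
    // be transitioned back to RENDER_TARGET before the next frame's draws.
    D3D12_RESOURCE_BARRIER toPresent = CD3DX12_RESOURCE_BARRIER::Transition(
        backBuffer, D3D12_RESOURCE_STATE_RESOLVE_DEST, D3D12_RESOURCE_STATE_PRESENT);
    cmdList->ResourceBarrier(1, &toPresent);
}
```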

Font Outline Bézier Spline Contours and DirectWrite

I'm considering using DirectWrite for a project that will come in both a DirectX 11 version and an OpenGL 3.1+ version.
From what I understand, DirectWrite uses Direct2D, which sits on top of Direct3D 10.1 (until DirectX 11.1 is released). This means that to use DirectWrite with Direct3D 11 I would currently have to create both a Direct3D 10.1 device and a Direct3D 11 device and share resources between the two, which seems to come with some synchronization overhead. Another problem is that I won't be able to render the text directly to the D3D11 back buffer with this setup, right...?
Also, I have no idea if it's even possible to combine DirectWrite with OpenGL in any practical sense... My guess is not.
Sooo... I'm also considering writing my own font renderer, and I would like to render fonts from their Bézier spline outlines for resolution independence. I know about the GetGlyphOutline() function, but it seems to be in the process of being deprecated and '...should not be used in new applications' according to the MSDN library. Looking at DirectWrite's reference pages on MSDN, I can't see any way to get the same Bézier spline information that GetGlyphOutline() provides. You can get the outline information wrapped in an ID2D1SimplifiedGeometrySink, but I can't see how to extract the raw Bézier curve control points from it; you can only use it for drawing with D2D (D3D 10.1), which I'm not that interested in at this point.
Is there a way to get the font outline contours using a non-deprecated method, DirectWrite or otherwise?
I'm not that familiar with either DirectWrite or Direct2D, as you can probably tell. I'm trying to figure out which direction to take: whether it's worth going down the DirectWrite/D2D road, writing my own font renderer, or some other brilliant idea :). Any suggestions?
PS I'm currently developing for the Win7 platform and will migrate to Win8 when it is released.
Fortunately you have it a little backwards: Direct2D uses DirectWrite, not the other way around. You can technically use DirectWrite on its own (see IDWriteBitmapRenderTarget). If you're at the prototyping stage, it may be easier to have Direct2D do software rendering into an IWICBitmap created through IWICImagingFactory::CreateBitmap() (or wrap a bitmap you've already created by implementing IWICBitmap yourself). You create the target with ID2D1Factory::CreateWicBitmapRenderTarget(), call ID2D1RenderTarget::BeginDraw(), create your IDWriteTextFormat and/or IDWriteTextLayout via IDWriteFactory, call DrawText() or DrawTextLayout() on the render target, and then call EndDraw(). Finally, copy the result into a hardware texture and draw it however you like.
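A minimal sketch of that flow, under assumptions not stated in the answer: COM already initialized via CoInitializeEx, a fixed 256x64 bitmap, the Segoe UI font, and hypothetical function and variable names.

```cpp
// Sketch: software-render text with DirectWrite + Direct2D into a WIC bitmap.
#include <windows.h>
#include <wrl/client.h>
#include <d2d1.h>
#include <dwrite.h>
#include <wincodec.h>
using Microsoft::WRL::ComPtr;

HRESULT RenderTextToWicBitmap(ComPtr<IWICBitmap>& outBitmap)
{
    ComPtr<IWICImagingFactory> wic;
    HRESULT hr = CoCreateInstance(CLSID_WICImagingFactory, nullptr,
                                  CLSCTX_INPROC_SERVER, IID_PPV_ARGS(&wic));
    if (FAILED(hr)) return hr;

    // Premultiplied BGRA bitmap that Direct2D can render into.
    hr = wic->CreateBitmap(256, 64, GUID_WICPixelFormat32bppPBGRA,
                           WICBitmapCacheOnDemand, &outBitmap);
    if (FAILED(hr)) return hr;

    ComPtr<ID2D1Factory> d2d;
    hr = D2D1CreateFactory(D2D1_FACTORY_TYPE_SINGLE_THREADED, d2d.GetAddressOf());
    if (FAILED(hr)) return hr;

    ComPtr<ID2D1RenderTarget> rt;
    hr = d2d->CreateWicBitmapRenderTarget(outBitmap.Get(),
                                          D2D1::RenderTargetProperties(), &rt);
    if (FAILED(hr)) return hr;

    ComPtr<IDWriteFactory> dwrite;
    hr = DWriteCreateFactory(DWRITE_FACTORY_TYPE_SHARED, __uuidof(IDWriteFactory),
                             reinterpret_cast<IUnknown**>(dwrite.GetAddressOf()));
    if (FAILED(hr)) return hr;

    ComPtr<IDWriteTextFormat> format;
    hr = dwrite->CreateTextFormat(L"Segoe UI", nullptr, DWRITE_FONT_WEIGHT_NORMAL,
                                  DWRITE_FONT_STYLE_NORMAL, DWRITE_FONT_STRETCH_NORMAL,
                                  24.0f, L"en-us", &format);
    if (FAILED(hr)) return hr;

    ComPtr<ID2D1SolidColorBrush> brush;
    hr = rt->CreateSolidColorBrush(D2D1::ColorF(D2D1::ColorF::White), &brush);
    if (FAILED(hr)) return hr;

    rt->BeginDraw();
    rt->Clear(D2D1::ColorF(0, 0.0f));                       // transparent background
    rt->DrawText(L"Hello", 5, format.Get(),
                 D2D1::RectF(0, 0, 256, 64), brush.Get());
    return rt->EndDraw();
    // outBitmap can now be locked/copied into a D3D11 or OpenGL texture.
}
```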

OpenGL transparency effects look quite awful on Meego

We've been drawing several half-transparent 3D cubes in a scene with OpenGL. They display very well on Windows 7 and Fedora 15, but look quite awful on the Meego system.
This is what it looks like on my Fedora 15 system:
This is what it looks like on Meego (we changed the color of the lines ourselves; otherwise the cubes would look even more pathetic):
The effect is implemented simply by using the normal glColor4f function and making the geometry transparent by setting the alpha value. How can it end up like this?
Both freeglut and openglut have been tried on the Meego system and failed to display any better.
I've even tried using an engine like Irrlicht to implement this instead, but then there is nothing but black on the screen when the zBuffer argument of beginScene is set to false (and it looks normal when it's true, but that is not what we want).
This should not be a problem with the graphics card or the driver, because we've seen a 3D game with a transparent ball running on the very same netbook and system.
We've failed to find the reason ourselves. Could anyone help explain why this might be happening?
It sounds as if you may be relying on default settings (or behavior), which may be different between platforms.
Are you explicitly setting any of OpenGL's blend properties, such as glBlendFunc? If you are, it may help to post the relevant code that does this.
One of the comments mentioned sorting your transparent objects. If you aren't, that's something you might want to consider to achieve more accurate results. In either case, that behavior should be the same from platform to platform so I would have guessed that's not your issue.
Edit:
One other thought. Are you setting glCullFace? It could be that your transparent faces are being culled because of your vertex winding.
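As a non-authoritative sketch of what 'explicitly setting the blend properties' could look like for sorted, semi-transparent geometry (the exact function and values are my assumption, not the poster's code):

```cpp
// Sketch: set blend, depth, and cull state explicitly instead of relying on
// platform defaults, before drawing back-to-front sorted transparent cubes.
#include <GL/gl.h>

void setupTransparencyState()
{
    glEnable(GL_BLEND);
    glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA); // standard "over" blending
    glDepthMask(GL_FALSE);    // read depth but don't write it for transparent passes
    glDisable(GL_CULL_FACE);  // rules out winding-order culling while debugging
}
```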
Both freeglut and openglut have been tried on the Meego system and failed to display any better.
Those are just simple windowing frameworks and have no effect whatsoever on the OpenGL execution.
Somewhere in your blending code you're messing up. From the looks of the correct rendering I'd say your blend function there is glBlendFunc(GL_ONE, GL_ONE), while on Meego it's something like glBlendFunc(GL_SRC_ALPHA, GL_ONE).

Enable antialiasing using Xlib

I'm trying to develop a custom set of libraries for creating GUIs on Linux, with, you know, widgets, buttons, etc. So I'm now learning to create user interfaces using X11 and its Xlib. I've gotten to the point of having a nice window of a specified size, at a specified position, with a specified background color, and the ability to draw points, rectangles, and arcs. However, as soon as I drew my first circle I was disappointed to find that it is not antialiased: I can see every single pixel as a square.
Now the question is simple: is there any way to tell X to antialias everything before drawing? Or do I have to avoid XDrawArc and use a custom function that calls XDrawPoint for each point of the circle? Or is there a third solution?
Thanks in advance.
The short answer is "no". Xlib doesn't do anti-aliasing.
The longer answer is "you can use a higher level API such as Cairo Graphics". It's not necessary to roll your own.
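For example, here is a minimal sketch of drawing an antialiased circle onto an X11 window through Cairo's Xlib surface; the window/display setup and the function name are assumptions, not part of the answer.

```cpp
// Sketch: antialiased circle on an existing X11 window via Cairo
// (build with: pkg-config --cflags --libs cairo x11).
#include <cairo/cairo.h>
#include <cairo/cairo-xlib.h>
#include <X11/Xlib.h>

void drawAntialiasedCircle(Display* dpy, Window win, int width, int height)
{
    cairo_surface_t* surface = cairo_xlib_surface_create(
        dpy, win, DefaultVisual(dpy, DefaultScreen(dpy)), width, height);
    cairo_t* cr = cairo_create(surface);

    cairo_set_antialias(cr, CAIRO_ANTIALIAS_DEFAULT);  // antialiasing is on by default
    cairo_set_source_rgb(cr, 0.2, 0.4, 0.8);
    cairo_set_line_width(cr, 2.0);
    cairo_arc(cr, width / 2.0, height / 2.0, 50.0, 0.0, 2.0 * 3.14159265358979);
    cairo_stroke(cr);

    cairo_destroy(cr);
    cairo_surface_destroy(surface);
}
```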
What you encountered are the limitations of the X11 core protocol; technically it would be perfectly possible to add antialiasing to it, but that didn't happen.
Instead there's the XRender extension, that provides nice antialiased primitives. You'll also want to look into Xft to render antialiased text using vector fonts.
You can roll your own antialiasing algorithm. You have the only 2 primitives you need: 1) a function to draw TrueColor points (namely, xcb_poly_point(), if you're using XCB), and 2) for loops.