Direct3D 11 Access is Denied - c++

I am trying to create a 3D simulation; the code is here and the vertex shader is here. As you can imagine, the pixel shader is just a straightforward function that passes the input straight to the output. Please forgive the lack of comments and sanity; I'm new to D3D.
The problem is that it runs fine on my PC, but on my laptop all I get is "Access is Denied", which is a problem since I need the sim to be portable. My laptop has DirectX 12 installed, with DirectDraw and Direct3D acceleration enabled. The chip type is an AMD Radeon Graphics Processor and its name is AMD Radeon Vega 3 Graphics. The laptop is brand new, so I don't expect to need a driver update (please correct me if I am wrong).
In addition, in its early stages the program managed to render a cube; the problem started when I added a very rudimentary object importer.
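For reference, a hedged diagnostic sketch (not the asker's actual code): wrapping every HRESULT and enabling the debug layer usually reveals which call actually returns E_ACCESSDENIED (0x80070005). Since the trouble began with the object importer, one guess is that the failing call is a file open or shader compile on the laptop rather than device creation itself. CHECK_HR and createDevice are illustrative names only.

    #include <windows.h>
    #include <d3d11.h>
    #include <cstdio>

    // Report and propagate the first failing HRESULT, naming the call.
    #define CHECK_HR(expr)                                                    \
        do {                                                                  \
            HRESULT _hr = (expr);                                             \
            if (FAILED(_hr)) {                                                \
                fprintf(stderr, "%s failed with 0x%08lX\n", #expr,            \
                        static_cast<unsigned long>(_hr));                     \
                return _hr;                                                   \
            }                                                                 \
        } while (0)

    HRESULT createDevice(ID3D11Device** device, ID3D11DeviceContext** context)
    {
        UINT flags = 0;
    #if defined(_DEBUG)
        flags |= D3D11_CREATE_DEVICE_DEBUG;   // debug layer prints the real reason
    #endif
        CHECK_HR(D3D11CreateDevice(nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, flags,
                                   nullptr, 0, D3D11_SDK_VERSION,
                                   device, nullptr, context));
        return S_OK;
    }

Applying the same wrapper to the importer's file reads and shader compiles would show whether the error comes from those paths instead.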

Related

Tessellation shaders not working with UWP DirectX 11 on Xbox Series X|S

I ported a DirectX 11 application to UWP to deploy it on Xbox Series X|S, and hardware tessellation shaders are not working when running the app on Xbox (tested on retail Xbox Series X and Series S in dev mode). The rendered geometry doesn't show up in the viewport, but no errors are thrown. Running the same app locally on my PC renders the tessellated geometry without any issues. After reading through this blogpost and the follow-up, I made sure my application is running in game mode and that a DirectX 11 Feature Level 11.0 context is created (creating a 10.1 context produces errors when trying to use tessellation primitives, as expected). Rendering statistics from the app suggest that there are vertex shader invocations but no hull shader invocations afterwards, and hull shader invocations but no domain shader invocations afterwards.
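For reference, a minimal sketch (assuming plain D3D11CreateDevice usage, not the asker's actual setup) of explicitly requesting Feature Level 11.0 so the hull and domain shader stages are guaranteed to exist; createFl11Device is an illustrative name.

    #include <windows.h>
    #include <d3d11.h>
    #include <wrl/client.h>
    using Microsoft::WRL::ComPtr;

    HRESULT createFl11Device(ComPtr<ID3D11Device>& device,
                             ComPtr<ID3D11DeviceContext>& context)
    {
        // Ask for 11.0 only; creation fails outright if the hardware can't do it.
        const D3D_FEATURE_LEVEL requested[] = { D3D_FEATURE_LEVEL_11_0 };
        D3D_FEATURE_LEVEL obtained = {};

        HRESULT hr = D3D11CreateDevice(
            nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr,
            0,                                   // flags, e.g. D3D11_CREATE_DEVICE_DEBUG
            requested, ARRAYSIZE(requested),
            D3D11_SDK_VERSION,
            &device, &obtained, &context);

        if (SUCCEEDED(hr) && obtained < D3D_FEATURE_LEVEL_11_0)
            return E_FAIL;                       // tessellation stages not available
        return hr;
    }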
Because I assumed there was some sort of subtle bug in the tessellation implementation of my app, I next tried the SimpleBezierUWP sample app from Microsoft. The result is the same: on PC it renders just fine, but when running on Xbox the geometry is missing. This applies to both the DX11 and DX12 versions of that sample app.
To recreate this bug just download the SimpleBezierUWP sample, build it and deploy it to a retail Xbox Series X or S in devmode.
So, has anyone successfully used tessellation shaders in a UWP application on Xbox Series X or S? Is it not supported after all, even though DirectX Feature Level 11.0 is? Or are there special requirements for writing hull and domain shaders for this specific hardware that I was not able to find out about from publicly available sources?
Thanks!

DirectX game on laptops with two video adapters where the wrong one is connected to the output

I'm having a problem with a DirectX 11 game I'm developing on laptops with two video cards. The normal case I'm running into (and the one I have on my own laptop) is a weak Intel card and a powerful Nvidia card. Obviously I want the Nvidia one, and I've already got the game enumerating the adapters and figuring out the correct one to create the device interface for.
The problem is that the Nvidia adapter doesn't have an output: when you call EnumOutputs on its IDXGIAdapter interface you don't find any. This makes sense because the laptop only has one screen and it's attached to the Intel adapter (you can find it by calling EnumOutputs on the Intel IDXGIAdapter interface).
But this seemingly makes it impossible to create a fullscreen swap chain for that device: IDXGIFactory::CreateSwapChain fails when given the Nvidia device and fullscreen settings, even when I'm certain the other mode parameters are valid.
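For illustration, a small sketch of the enumeration described above, assuming DXGI 1.1 and the WRL ComPtr helper: it walks the adapters and counts the outputs each one owns, which is where the discrete GPU typically comes up empty on these laptops. listAdapters is an illustrative name.

    #include <windows.h>
    #include <dxgi.h>
    #include <wrl/client.h>
    #include <cstdio>
    using Microsoft::WRL::ComPtr;

    void listAdapters()
    {
        ComPtr<IDXGIFactory1> factory;
        if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory))))
            return;

        ComPtr<IDXGIAdapter1> adapter;
        for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
            DXGI_ADAPTER_DESC1 desc;
            adapter->GetDesc1(&desc);

            // Count the outputs physically wired to this adapter.
            ComPtr<IDXGIOutput> output;
            UINT outputs = 0;
            for (UINT j = 0; adapter->EnumOutputs(j, &output) != DXGI_ERROR_NOT_FOUND; ++j)
                ++outputs;

            wprintf(L"Adapter %u: %s, outputs: %u\n", i, desc.Description, outputs);
        }
    }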
It seems like other games are figuring out a way around this. From my Steam list, for example, Half-Life 2 seems to run in fullscreen mode. However, Stardew Valley runs in borderless windowed mode, which I could do, but that has its own issues.
I'm aware that it's possible to change the laptop's settings so the Nvidia card is the dominant one, but I need this to work on customers' laptops, where I can't expect them to deal with all that.
One potential solution might be to create a device for both adapters and then create the swap chain on the Intel one, passing the rendered frame across as a shared resource (https://learn.microsoft.com/en-us/windows/desktop/api/d3d11/nf-d3d11-id3d11device-opensharedresource). I'm not even sure if that's possible, though; the docs are vague.
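Purely as a sketch of that idea, under the assumption that cross-adapter sharing works this way on the target machines: render on the discrete device into a texture created with D3D11_RESOURCE_MISC_SHARED, then open it on the Intel device via OpenSharedResource and copy it into the back buffer of a swap chain created on the Intel adapter. createSharedTarget and the device parameter names are illustrative.

    #include <windows.h>
    #include <d3d11.h>
    #include <wrl/client.h>
    using Microsoft::WRL::ComPtr;

    HRESULT createSharedTarget(ID3D11Device* nvidiaDevice, ID3D11Device* intelDevice,
                               UINT width, UINT height,
                               ComPtr<ID3D11Texture2D>& nvidiaSide,
                               ComPtr<ID3D11Texture2D>& intelSide)
    {
        D3D11_TEXTURE2D_DESC desc = {};
        desc.Width = width;
        desc.Height = height;
        desc.MipLevels = 1;
        desc.ArraySize = 1;
        desc.Format = DXGI_FORMAT_B8G8R8A8_UNORM;
        desc.SampleDesc.Count = 1;
        desc.Usage = D3D11_USAGE_DEFAULT;
        desc.BindFlags = D3D11_BIND_RENDER_TARGET | D3D11_BIND_SHADER_RESOURCE;
        desc.MiscFlags = D3D11_RESOURCE_MISC_SHARED;   // shareable across devices

        HRESULT hr = nvidiaDevice->CreateTexture2D(&desc, nullptr, &nvidiaSide);
        if (FAILED(hr)) return hr;

        // Fetch the shared handle from the DXGI side of the texture.
        ComPtr<IDXGIResource> dxgiResource;
        hr = nvidiaSide.As(&dxgiResource);
        if (FAILED(hr)) return hr;

        HANDLE sharedHandle = nullptr;
        hr = dxgiResource->GetSharedHandle(&sharedHandle);
        if (FAILED(hr)) return hr;

        // Open the same texture on the Intel device; from there it can be copied
        // into the back buffer of the Intel swap chain each frame.
        return intelDevice->OpenSharedResource(sharedHandle,
                                               __uuidof(ID3D11Texture2D),
                                               reinterpret_cast<void**>(intelSide.GetAddressOf()));
    }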
Before I go down a difficult and potentially dead-end path, though, I'm wondering if anyone knows the solution.

Can EGL application run in console mode?

I want to implement an OpenGL application which generates images that I then view via a webpage.
The application is intended to run on a Linux server which has no display and no X Windows, but does have a GPU.
I know that EGL can use pixmaps or pbuffers as render targets.
But the function eglGetDisplay worries me; it sounds like I still need an attached display to make it work?
Does EGL work without a display, X Windows, or Wayland?
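For context, a minimal sketch of the pbuffer path the question mentions, assuming an EGLDisplay has already been obtained and initialized somehow; the 1024x768 size and the makePbufferContext name are arbitrary.

    #include <EGL/egl.h>

    EGLContext makePbufferContext(EGLDisplay dpy, EGLSurface* outSurface)
    {
        // Ask for a config that supports offscreen pbuffer surfaces and desktop GL.
        static const EGLint configAttribs[] = {
            EGL_SURFACE_TYPE,    EGL_PBUFFER_BIT,
            EGL_RED_SIZE, 8, EGL_GREEN_SIZE, 8, EGL_BLUE_SIZE, 8,
            EGL_DEPTH_SIZE, 24,
            EGL_RENDERABLE_TYPE, EGL_OPENGL_BIT,
            EGL_NONE
        };
        static const EGLint pbufferAttribs[] = {
            EGL_WIDTH, 1024, EGL_HEIGHT, 768, EGL_NONE
        };

        EGLConfig config;
        EGLint numConfigs = 0;
        eglChooseConfig(dpy, configAttribs, &config, 1, &numConfigs);

        *outSurface = eglCreatePbufferSurface(dpy, config, pbufferAttribs);

        eglBindAPI(EGL_OPENGL_API);
        EGLContext ctx = eglCreateContext(dpy, config, EGL_NO_CONTEXT, NULL);
        eglMakeCurrent(dpy, *outSurface, *outSurface, ctx);
        return ctx;
    }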
This is a recurring question. TL;DR: with the current Linux graphics driver model it is impossible to use the GPU with traditional drivers without running an X server. If the GPU is supported by KMS+DRM+DRI you can do it. (EDIT:) Also, in 2016 Nvidia finally introduced truly headless OpenGL support in their drivers through EGL.
The long story is that, technically, GPUs are perfectly capable of rendering to an offscreen buffer without a display being attached or a graphics server running. However, due to the history of graphics driver and environment development, this has not been possible with traditional drivers for a long time. The assumption back then (when graphics was first introduced to Linux) was: "The graphics device is there to deliver a picture to a screen." That a graphics card could be used as an accelerating coprocessor was not even a glimmer of an idea.
Add to this that, until a few years ago, the Linux kernel itself had no idea how to talk to graphics devices (other than a dumb framebuffer somewhere in the system's address space). The X server was what talked to GPUs, so you needed it to run. And the early X server developers made the assumption that there is a person sitting between keyboard and chair.
So what are your options:
Short term, if you're using an NVidia GPU: just start an X server. You don't need a full-blown desktop environment; you can even save yourself the trouble of starting a window manager. Just have the X server claim the VT and be active. There is now also support for headless OpenGL contexts through EGL in the Nvidia drivers (a sketch of that path follows after these options).
If you're using an AMD or Intel GPU, you can talk directly to it, either through EGL or using KMS (Google for something called kmscube; when trying it, make sure you switch away from your X server to a text VT first, otherwise you'll crash the X server). I've not tried it yet, but it should be possible to adjust the kmscube example so that it uses the GPU to render into an offscreen buffer, without switching the VT to graphics mode or producing any graphics output on the display framebuffer at all.
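A hedged sketch of the headless EGL path mentioned in the first option, assuming the driver exposes EGL_EXT_device_enumeration and EGL_EXT_platform_device (the extension entry points must be fetched through eglGetProcAddress); openHeadlessDisplay is an illustrative name.

    #include <EGL/egl.h>
    #include <EGL/eglext.h>
    #include <cstdio>

    EGLDisplay openHeadlessDisplay()
    {
        PFNEGLQUERYDEVICESEXTPROC queryDevices =
            (PFNEGLQUERYDEVICESEXTPROC)eglGetProcAddress("eglQueryDevicesEXT");
        PFNEGLGETPLATFORMDISPLAYEXTPROC getPlatformDisplay =
            (PFNEGLGETPLATFORMDISPLAYEXTPROC)eglGetProcAddress("eglGetPlatformDisplayEXT");
        if (!queryDevices || !getPlatformDisplay)
            return EGL_NO_DISPLAY;   // extensions not available

        // Enumerate the GPUs known to EGL, no X server or Wayland involved.
        EGLDeviceEXT devices[8];
        EGLint numDevices = 0;
        queryDevices(8, devices, &numDevices);
        if (numDevices == 0)
            return EGL_NO_DISPLAY;

        // First device chosen for simplicity; a real application might inspect them.
        EGLDisplay dpy = getPlatformDisplay(EGL_PLATFORM_DEVICE_EXT, devices[0], NULL);
        EGLint major = 0, minor = 0;
        if (dpy == EGL_NO_DISPLAY || !eglInitialize(dpy, &major, &minor))
            return EGL_NO_DISPLAY;

        printf("Headless EGL %d.%d initialized\n", major, minor);
        return dpy;
    }

From here, a pbuffer surface and context can be created exactly as in the sketch under the question above.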
As datenwolf said, you can create a framebuffer without using X on AMD and Intel GPUs. I am using an AMD graphics card with EGL, and I am able to create a framebuffer and draw to it. You can achieve this with the Mesa library by configuring it without X.

How to do stereoscopic 3D with OpenGL on GTX 560 and later?

I am using the open source haptics and 3D graphics library Chai3D running on Windows 7. I have rewritten the library to do stereoscopic 3D with Nvidia nvision. I am using OpenGL with GLUT, and using glutInitDisplayMode(GLUT_RGB | GLUT_DEPTH | GLUT_DOUBLE | GLUT_STEREO) to initialize the display mode. It works great on Quadro cards, but on GTX 560m and GTX 580 cards it says the pixel format is unsupported. I know the monitors are capable of displaying the 3D, and I know the cards are capable of rendering it. I have tried adjusting the resolution of the screen and everything else I can think of, but nothing seems to work. I have read in various places that stereoscopic 3D with OpenGL only works in fullscreen mode. So, the only possible reason for this error I can think of is that I am starting in windowed mode. How would I force the application to start in fullscreen mode with 3D enabled? Can anyone provide a code example of quad buffer stereoscopic 3D using OpenGL that works on the later GTX model cards?
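For reference, here is a minimal sketch of classic quad-buffer stereo with GLUT, under the assumption that the driver actually grants a GLUT_STEREO pixel format (which, as the answers below explain, GeForce drivers typically refuse); the triangle scene, eye offset, and game-mode string are placeholder values.

    #include <GL/glut.h>

    // Hypothetical scene: a single triangle, shifted horizontally per eye.
    static void drawScene(float eyeOffset)
    {
        glMatrixMode(GL_MODELVIEW);
        glLoadIdentity();
        glTranslatef(eyeOffset, 0.0f, 0.0f);
        glBegin(GL_TRIANGLES);
        glVertex3f(-0.5f, -0.5f, 0.0f);
        glVertex3f( 0.5f, -0.5f, 0.0f);
        glVertex3f( 0.0f,  0.5f, 0.0f);
        glEnd();
    }

    static void display(void)
    {
        glDrawBuffer(GL_BACK_LEFT);                              // left eye
        glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
        drawScene(-0.03f);

        glDrawBuffer(GL_BACK_RIGHT);                             // right eye
        glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
        drawScene(+0.03f);

        glutSwapBuffers();   // swaps both left and right back buffers
    }

    int main(int argc, char** argv)
    {
        glutInit(&argc, argv);
        glutInitDisplayMode(GLUT_RGB | GLUT_DEPTH | GLUT_DOUBLE | GLUT_STEREO);
        // Game mode is GLUT's fullscreen path; the mode string is an example value.
        glutGameModeString("1920x1080:32@60");
        if (glutGameModeGet(GLUT_GAME_MODE_POSSIBLE))
            glutEnterGameMode();
        else
            glutCreateWindow("stereo test");   // fails here if the pixel format is unsupported
        glutDisplayFunc(display);
        glutMainLoop();
        return 0;
    }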
What you experience has no technical reason; it is simply NVidia product policy. Quadbuffer stereo is considered a professional feature, so NVidia offers it only on their Quadro cards, even though the GeForce GPUs could do it as well. This is not a recent development; it was already like this back in 1999. For example, I had (well, still have) a GeForce2 Ultra back then. Technically it was the very same chip as the Quadro; the only difference was the PCI ID reported back to the system. One could trick the driver into thinking you had a Quadro by tinkering with the PCI IDs (either by patching the driver or by soldering an additional resistor onto the graphics card PCB).
The stereoscopic 3D hack for Direct3D was already supported by my GeForce2 back then. The driver duplicated the rendering commands, but applied a translation to the modelview matrix and a skew to the projection matrix. These days it's implemented with a shader and multi-rendertarget trick.
The NVision3D API does allow you to blit images for specific eyes (this is meant for movie players and image viewers). But it also allows you to emulate quadbuffer stereo: instead of the GL_BACK_LEFT and GL_BACK_RIGHT buffers, create two framebuffer objects, which you bind and use as if they were the quadbuffer stereo buffers. Then, after rendering, you blit the resulting images (as textures) to the NVision3D API.
With as little as 50 lines of management code you can build a program that works seamlessly on both NVision3D and quadbuffer stereo. What NVidia does is pointless; they should just stop it and properly support quadbuffer stereo pixel formats on consumer GPUs as well.
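A rough sketch of the FBO-based emulation described above, assuming GLEW for extension loading and an already-current GL context; the two offscreen targets stand in for GL_BACK_LEFT/GL_BACK_RIGHT, and the final hand-off of the textures to NVision3D (or to a real quadbuffer when one is available) is omitted because it depends on that proprietary API. All names are illustrative.

    #include <GL/glew.h>   // glewInit() assumed to have been called already

    GLuint gEyeFbo[2], gEyeTex[2], gEyeDepth[2];

    void createEyeTargets(int width, int height)
    {
        glGenFramebuffers(2, gEyeFbo);
        glGenTextures(2, gEyeTex);
        glGenRenderbuffers(2, gEyeDepth);

        for (int eye = 0; eye < 2; ++eye) {
            // Color target the scene is rendered into for this eye.
            glBindTexture(GL_TEXTURE_2D, gEyeTex[eye]);
            glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, width, height, 0,
                         GL_RGBA, GL_UNSIGNED_BYTE, NULL);
            glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);

            // Matching depth buffer.
            glBindRenderbuffer(GL_RENDERBUFFER, gEyeDepth[eye]);
            glRenderbufferStorage(GL_RENDERBUFFER, GL_DEPTH_COMPONENT24, width, height);

            glBindFramebuffer(GL_FRAMEBUFFER, gEyeFbo[eye]);
            glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                                   GL_TEXTURE_2D, gEyeTex[eye], 0);
            glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT,
                                      GL_RENDERBUFFER, gEyeDepth[eye]);
        }
        glBindFramebuffer(GL_FRAMEBUFFER, 0);
    }

    // Drop-in replacement for glDrawBuffer(GL_BACK_LEFT / GL_BACK_RIGHT).
    void bindEye(int eye)   // 0 = left, 1 = right
    {
        glBindFramebuffer(GL_FRAMEBUFFER, gEyeFbo[eye]);
    }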
Simple: you can't. Not the way you're trying to do it.
There is a difference between having a pre-existing program do things with stereoscopic glasses and doing what you're trying to do. What you are attempting to do is use the built-in stereo support of OpenGL: the ability to create a stereoscopic framebuffer, where you can render to the left and right framebuffers arbitrarily.
NVIDIA does not allow that with their non-Quadro cards. There are hacks in the driver that will force stereo on applications through nVision and the control panel, but NVIDIA's GeForce drivers do not allow you to create stereoscopic framebuffers.
And before you ask, no, I have no idea why NVIDIA doesn't let you control stereo.
Since I was looking into this issue for my own game, I found this link where somebody hacked the USB protocol: http://users.csc.calpoly.edu/~zwood/teaching/csc572/final11/rsomers/
I didn't follow it through, but at the time I was researching this it didn't look too hard to make use of that information. So you might have to implement your own code to support it in your app, which should be possible. Unfortunately, a generic solution would be harder, because then you would have to hack the driver or somehow hook into the OpenGL library and intercept the calls.

OpenGL rendering glitch

I'm working on a 3D game project with a bunch of people. The project runs fine on all of their machines but mine. On my computer the skybox texture intermittently disappears and the rendering goes awfully bad.
We've all been working on Windows XP with Visual Studio 2008, and the only significant difference between my machine and my co-workers' is that mine has an Nvidia 9400 GT graphics card, which, I guess, is the thing to blame.
Here's a screenshot, skybox-less. Is there any setting in OpenGL or the Nvidia video manager that I can tweak to avoid this?
Sometimes this can depend on the texture size. Could you try with a texture whose dimensions are powers of two?
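A small sketch of that suggestion, assuming an RGBA image already in memory: round the skybox texture up to power-of-two dimensions with GLU before uploading it. uploadPowerOfTwoTexture and nextPowerOfTwo are illustrative helpers.

    #include <windows.h>   // required before the GL headers on Windows
    #include <GL/glu.h>
    #include <vector>

    static int nextPowerOfTwo(int v)
    {
        int p = 1;
        while (p < v) p <<= 1;
        return p;
    }

    void uploadPowerOfTwoTexture(const unsigned char* pixels, int width, int height)
    {
        int potW = nextPowerOfTwo(width);
        int potH = nextPowerOfTwo(height);

        if (potW == width && potH == height) {
            // Already power-of-two, upload as-is.
            glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0,
                         GL_RGBA, GL_UNSIGNED_BYTE, pixels);
            return;
        }

        // Rescale to the nearest power-of-two size with GLU, then upload.
        std::vector<unsigned char> scaled(static_cast<size_t>(potW) * potH * 4);
        gluScaleImage(GL_RGBA, width, height, GL_UNSIGNED_BYTE, pixels,
                      potW, potH, GL_UNSIGNED_BYTE, scaled.data());
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, potW, potH, 0,
                     GL_RGBA, GL_UNSIGNED_BYTE, scaled.data());
    }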