I am making an app that creates an image from a 3D scene.
I use the GLFW and GLEW libraries.
I want to call this app from a web service.
My app runs well when I launch it from the .exe file, but when it is launched by IIS7 it crashes when glCreateShader is called, and it seems that glfwInit fails.
I put the .dll path in the environment variables.
Any ideas?
The OpenGL implementations you usually find on a computer assume a GPU to be available. Network services such as web servers are generally run in an environment configuration that does not give access to a GPU, so OpenGL is not available there either.
Furthermore, often for security reasons, all API functions that deal with the creation of UI elements (like windows and device contexts) are disabled as well.
Update:
You could drop GLFW and use OSMesa to create a purely offscreen, windowless OpenGL context that rasterizes with a CPU-only implementation. OSMesa has to be custom built and linked into your program, and when doing so it will not be able to fall back (effortlessly) to a GPU-accelerated OpenGL implementation.
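To give a rough idea of what that looks like, here is a minimal sketch of an OSMesa-based setup; the buffer size, pixel format and error handling are placeholders, not taken from your code:

```cpp
// Minimal sketch: CPU-only offscreen rendering with OSMesa instead of GLFW.
// Requires linking against a custom-built OSMesa; width/height are placeholders.
#include <GL/osmesa.h>
#include <GL/gl.h>
#include <vector>
#include <cstdio>

int main()
{
    const int width = 1024, height = 768;
    std::vector<unsigned char> buffer(width * height * 4);   // RGBA target in system memory

    // Create a context with a 24-bit depth buffer and no stencil/accum buffers.
    OSMesaContext ctx = OSMesaCreateContextExt(OSMESA_RGBA, 24, 0, 0, nullptr);
    if (!ctx || !OSMesaMakeCurrent(ctx, buffer.data(), GL_UNSIGNED_BYTE, width, height)) {
        std::fprintf(stderr, "OSMesa context creation failed\n");
        return 1;
    }

    // ... usual GL calls: compile shaders, draw the scene, glFinish() ...
    // The finished image now lives in 'buffer' and can be written to a file.

    OSMesaDestroyContext(ctx);
    return 0;
}
```

If I recall correctly, newer Mesa builds also provide OSMesaCreateContextAttribs in case you need to request a specific context version or profile.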
I am on Windows 10 and my GPU is a GTX 880M.
When I use vkEnumerateInstanceExtensionProperties to get the supported extensions, all I get is:
"VK_KHR_surface"
"VK_KHR_win32_surface"
"VK_EXT_debug_report"
However, I have no "VK_KHR_swapchain", and if I try to enable "VK_KHR_swapchain" at instance creation, it just hangs.
Without "VK_KHR_swapchain", however, I can't create a swap chain; my debug callback from the validation layer gets called with this message:
"Attemped to call vkCreateSwapchainKHR() but its required extension
VK_KHR_swapchain has not been enabled\n"
I can run games with Vulkan enabled just fine, as well as run the Cube demo from the Vulkan SDK, so there has to be some way I can create a Swapchain and render, right?
Or is there some kind of hack that has to be employed when a GPU doesn't have that extension?
VK_KHR_swapchain is a device extension, not an instance extension, so you need to enable it when creating the device instead.
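A minimal sketch of what that looks like; physicalDevice and queueCreateInfo are assumed to come from your existing setup code:

```cpp
#include <vulkan/vulkan.h>

// Sketch: VK_KHR_swapchain goes into VkDeviceCreateInfo, not VkInstanceCreateInfo.
VkDevice createDeviceWithSwapchainExt(VkPhysicalDevice physicalDevice,
                                      const VkDeviceQueueCreateInfo& queueCreateInfo)
{
    const char* deviceExtensions[] = { VK_KHR_SWAPCHAIN_EXTENSION_NAME };

    VkDeviceCreateInfo deviceInfo      = {};
    deviceInfo.sType                   = VK_STRUCTURE_TYPE_DEVICE_CREATE_INFO;
    deviceInfo.queueCreateInfoCount    = 1;
    deviceInfo.pQueueCreateInfos       = &queueCreateInfo;
    deviceInfo.enabledExtensionCount   = 1;
    deviceInfo.ppEnabledExtensionNames = deviceExtensions;

    VkDevice device = VK_NULL_HANDLE;
    if (vkCreateDevice(physicalDevice, &deviceInfo, nullptr, &device) != VK_SUCCESS) {
        // handle the error in whatever way fits your code base
    }
    return device;
}
```

Note that device-level extensions are reported by vkEnumerateDeviceExtensionProperties (which takes a VkPhysicalDevice), so that is where VK_KHR_swapchain will show up; vkEnumerateInstanceExtensionProperties only lists instance-level extensions such as the surface extensions you got.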
I would like to integrate Ogre3D with DirectX and C++ on the HoloLens.
Is it possible to do so?
What are the steps to convert the rendering engine, i.e. to get what is rendered to the frame buffer into the HoloLens buffer?
As RCYR mentioned, to run on the HoloLens you are currently required to use UWP.
Running Ogre in a UWP app
There is a wiki entry which shows how to get an OGRE app running in UWP. First you can try to build a simple UWP app without any calls to the HoloLens API at all. Note that you can run ordinary 2D UWP apps, which are not made exclusively for the HoloLens, on the device in a windowed view (see the Mixed Reality documentation for more details about 2D views vs. immersive views).
You can check your UWP app by using the HoloLens Emulator. It is integrated with Visual Studio.
If you just wanted to create a windowed app running on the HoloLens, you are done by now.
Setting up an immersive view
But more likely you wanted to create an immersive view to display holograms. There are really helpful samples available at the UWP samples repository. I would recommend you look at the HolographicSpatialMapping sample.
Basically the sample shows how to:
create a HolographicSpace (core class for immersive views)
initialize the Direct3D device (can be done by Ogre as long as the adapter supports the HoloLens)
register for camera added/removed events and create resources (buffers and views) for each camera
use the SpatialCoordinateSystem and locate the viewer
get a HolographicFrame, render some content and present
There are a lot of basic functions that you can just copy and paste from the CameraResources and DeviceResources classes in the sample.
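To give an impression of the flow, here is a rough sketch of the per-frame part in C++/WinRT syntax; the sample itself uses C++/CX and wraps most of this in the classes mentioned above, and member names like m_holographicSpace are placeholders:

```cpp
#include <winrt/Windows.Graphics.Holographic.h>
#include <winrt/Windows.UI.Core.h>

using namespace winrt::Windows::Graphics::Holographic;
using namespace winrt::Windows::UI::Core;

// In IFrameworkView::SetWindow: create the HolographicSpace for the app's CoreWindow.
// m_holographicSpace = HolographicSpace::CreateForCoreWindow(window);

void RenderOneFrame(HolographicSpace const& holographicSpace)
{
    // Ask the system for the next frame and its pose prediction.
    HolographicFrame frame = holographicSpace.CreateNextFrame();
    HolographicFramePrediction prediction = frame.CurrentPrediction();

    // One render pass per camera.
    for (HolographicCameraPose const& pose : prediction.CameraPoses())
    {
        // Look up the render target you created for pose.HolographicCamera() in the
        // camera-added handler, set view/projection matrices from the pose, and let
        // your engine (Ogre) draw the scene into that Direct3D target.
    }

    // Present using the same prediction the frame was rendered with.
    frame.PresentUsingCurrentPrediction();
}
```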
For development you should use the HoloLens Emulator (mentioned above) and the Visual Studio Graphics Debugger, which is fully supported with the HoloLens Emulator, so you can easily debug what is going on in Direct3D.
Is there a way to start an application with OpenGL >= 3 on a remote machine?
Local and remote machine run on Linux.
More precisely, I have the following problem:
I have an application that uses Qt for GUI stuff and OpenGL for 3D rendering.
I want to start this application on several remote machines because the program does some very time-consuming computation.
Thus, I created a version of my program that does not raise a window. I use QGuiApplication, QOffscreenSurface, and a framebuffer object as the render target.
BUT: When I start the application on a remote machine (ssh -Y remotemachine01 myapp), I only get OpenGL version 2.1.2. When I start the application locally on the same machine, I get OpenGL 4.4. I suppose the X forwarding is the problem.
So I need a way to avoid X forwarding.
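For reference, the offscreen setup looks roughly like this, trimmed down to just creating the context and printing which OpenGL version it actually gets:

```cpp
// Trimmed sketch of the headless setup: create an offscreen context and report
// the version/renderer it ends up with (the real program renders into an FBO instead).
#include <QGuiApplication>
#include <QOffscreenSurface>
#include <QOpenGLContext>
#include <QOpenGLFunctions>
#include <QDebug>

int main(int argc, char** argv)
{
    QGuiApplication app(argc, argv);

    QOpenGLContext context;
    if (!context.create()) {
        qWarning() << "Failed to create an OpenGL context";
        return 1;
    }

    QOffscreenSurface surface;
    surface.setFormat(context.format());
    surface.create();
    context.makeCurrent(&surface);

    QOpenGLFunctions* f = context.functions();
    qDebug() << "GL_VERSION: " << reinterpret_cast<const char*>(f->glGetString(GL_VERSION));
    qDebug() << "GL_RENDERER:" << reinterpret_cast<const char*>(f->glGetString(GL_RENDERER));
    // Over "ssh -Y" this reports the indirect GLX limit (2.1.x);
    // run locally it reports the driver's full version (4.4 here).
    return 0;
}
```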
Right now there's no clean solution, sorry.
GLX (the OpenGL extension to X11 which does the forwarding stuff) is only specified up to OpenGL-2.1, hence your inability to forward an OpenGL-3 context. This is actually a ridiculous situation, because the "OpenGL-3 way" is much better suited for indirect rendering than old-fashioned OpenGL-2.1 and earlier. Khronos really needs to get their act together and specify GLX-3.
Your best bet would be either to fall back to a software renderer on the remote side combined with some form of X compression, or to use Xpra backed by an on-GPU X11 server; however, that works for only a single user at a time.
In the not too far future the upcoming Linux graphics driver models will allow remote GPU rendering by multiple users sharing graphics resources. But we're not there yet.
I'm trying to find a solution for setting up an OpenGL build server. My preference would be a virtual or cloud server, but as far as I can see those only go up to OpenGL 3.0/3.1 using software rendering. I have a server running Windows, but my tests are Linux-specific and I'd have to run them in a VM, which as far as I know also only supports OpenGL 3.1.
So, is it possible to set up an OpenGL 4 build/unit-test server?
The OpenGL specification does not include any pixel-perfect guarantees. This means your tests may fail just by switching to another GPU or even to another version of the driver.
So you have to be specific: test not the result of the rendering, but the result of the math that immediately precedes the submission of the primitives to the API.
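For illustration, a sketch of such a test using GLM; the helper buildMVP and the tolerances are made up for the example, not part of any particular project:

```cpp
// Sketch: unit-test the math that feeds the API instead of the rendered pixels.
#include <glm/glm.hpp>
#include <glm/gtc/matrix_transform.hpp>
#include <cassert>
#include <cmath>

// Hypothetical helper under test: builds the MVP matrix the renderer would upload.
glm::mat4 buildMVP(const glm::vec3& cameraPos, float aspect)
{
    glm::mat4 projection = glm::perspective(glm::radians(60.0f), aspect, 0.1f, 100.0f);
    glm::mat4 view       = glm::lookAt(cameraPos, glm::vec3(0.0f), glm::vec3(0.0f, 1.0f, 0.0f));
    glm::mat4 model      = glm::mat4(1.0f);
    return projection * view * model;
}

int main()
{
    // A vertex at the origin, seen from (0,0,5), must project to the center of the screen.
    glm::vec4 clip = buildMVP(glm::vec3(0.0f, 0.0f, 5.0f), 16.0f / 9.0f) *
                     glm::vec4(0.0f, 0.0f, 0.0f, 1.0f);
    glm::vec3 ndc  = glm::vec3(clip) / clip.w;

    assert(std::fabs(ndc.x) < 1e-5f);
    assert(std::fabs(ndc.y) < 1e-5f);
    return 0;   // this kind of test runs on a headless build server, no GPU or GL context needed
}
```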
We are having some weird problems using Java3D over Windows Remote Desktop. The remote machine is a virtualized server, which can't use the (physical) server's graphics card. When I run the app, the following error pops up:
Unable to create DirectX D3D context.
Neither Hardware and Software Renderer are available.
Please update your video card drivers
and get the latest DirectX available at http://microsoft.com/directx
After switching to OpenGL (starting the JVM with -Dj3d.rend=ogl), the same error appears! What could be happening? How can I fall back to software rendering, either with OpenGL or DirectX, when the error appears?
EDIT: I've already tried using another OpenGL vendor, using Mesa3D's DLLs instead of the native ones, but it made no difference. I also installed the DirectX SDK and tried to start Java3D with the reference driver (-Dj3d.d3ddevice=reference), but it didn't work either.
The same error appears because if OpenGL fails, Java3D tries to use DirectX. If that fails too, the popup is shown.
I didn't manage to solve it earlier because, instead of trying to change things on the remote server, I tried to emulate the problem on my own machine by disabling the video driver. I still don't know why the two problems aren't equivalent, but after I returned to working on the server and put DirectX's d3dref9.dll in Java's \bin, it worked.
Now I have an entirely new problem, as the JVM can't find the DLL if I place it in java.library.path or Tomcat's \bin :) Problems just never cease.
Try the following:
Under Windows:
First, open the Display Properties pane by right-clicking on the desktop and choosing the Properties item in the menu. In that pane, go to the Settings tab and click the Advanced button. Then, in the Troubleshoot tab of the pane that opens, check that the Hardware acceleration slider is set to its maximum (Full), confirm your choice, and try to run your program again.
If the previous operation didn't resolve your problem, update the OpenGL and DirectX drivers of your graphics card to the latest available ones, and try to run your program again.