Amazon Web Services GPU G2

Today I got set up with an AWS GPU G2 instance (g2.2xlarge). I wanted to test out the 3D hardware capability that is offered, as described here:
http://aws.amazon.com/ec2/instance-types/
Features:
- High-frequency Intel Xeon E5-2670 (Sandy Bridge) processors
- High-performance NVIDIA GPU with 1,536 CUDA cores and 4 GB of video memory
- On-board hardware video encoder designed to support up to eight real-time HD video streams (720p at 30 fps) or up to four real-time FHD video streams (1080p at 30 fps)
- Support for low-latency frame capture and encoding for either the full operating system or select render targets, enabling high-quality interactive streaming experiences
But when I tried running 3DMark 2011 to try things out, I got the error "No DXGI adapters found".
I also noticed that dxdiag says no hardware acceleration is available.
So I'm a bit puzzled as to why I don't see the NVIDIA GPU with its 1,500+ CUDA cores.
Also, it would be great if Azure offered 3D compute capabilities.

To answer my own question: some setup is required before the GPU can be used. One needs to install the NVIDIA GRID K520 driver as well as the latest CUDA toolkit. Then install a VNC server on the instance and open the relevant ports in the AWS instance's security group. Finally, install a VNC client on your local PC, and that should give you access to the GPU.
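As a quick sanity check after the driver and toolkit install, a sketch along the lines below can confirm the GPU is actually visible before moving on to VNC. It is only a sketch: it assumes a 64-bit Windows instance where the NVIDIA driver has installed nvcuda.dll (the CUDA driver library), and the functions used are the standard CUDA driver API calls.

```python
import ctypes

def cuda_devices():
    """List GPUs visible to the CUDA driver (installed with the GRID K520 driver)."""
    cuda = ctypes.WinDLL("nvcuda.dll")          # raises OSError if the driver is missing
    if cuda.cuInit(0) != 0:                     # CUDA_SUCCESS == 0
        return []
    count = ctypes.c_int()
    cuda.cuDeviceGetCount(ctypes.byref(count))
    names = []
    for ordinal in range(count.value):
        device = ctypes.c_int()
        cuda.cuDeviceGet(ctypes.byref(device), ordinal)
        buf = ctypes.create_string_buffer(100)
        cuda.cuDeviceGetName(buf, len(buf), device)
        names.append(buf.value.decode())
    return names

if __name__ == "__main__":
    print(cuda_devices() or "No CUDA device visible - check the driver install")
```

If this lists the GRID K520, the remaining pieces (VNC server, security-group ports, VNC client) are just remote-access plumbing.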
Thanks

Related

How can I run Parsec streaming service on a GCP instance without a GPU? (Error Code 15000)

I am attempting to run Android Studio on a GCP n1-standard-4 instance, following this article. Everything works fine until it comes to accessing the instance: Chrome RDP gives poor resolution, and I would prefer to use something better, namely Parsec. When I try to connect to the instance, I get error 15000, 'The host encoder failed to initialize'. I do not have a GPU attached to this instance, so could this be the problem?
Have a look at the Parsec documentation:
Error Codes - 22006 and 15000 (Unable To Initialize Encoder On Your Server): This is an issue on the host's PC. Below are the things that can cause it.
Check that your GPU is supported! This is the most common issue. If your GPU is not supported, none of these solutions will help you, and you will require a new GPU or PC to be able to host. If your GPU is supported, continue on below.
As you can see, Parsec requires a supported GPU with supported drivers.
To solve your issue, attach a supported GPU, and also make sure you're using a supported OS and drivers.
In addition, have a look at the documentation on GPUs on Compute Engine and Adding or removing GPUs if you decide to use a specific GPU.
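Once a GPU and the NVIDIA driver are in place on the instance, a quick way to see what the driver actually exposes (and therefore what to compare against Parsec's supported-hardware list) is to query it. The sketch below is illustrative and assumes the driver's nvidia-smi utility is on the PATH.

```python
import subprocess

def attached_gpus() -> list[str]:
    """Return the names of GPUs the NVIDIA driver can see on this instance."""
    try:
        out = subprocess.run(
            ["nvidia-smi", "--query-gpu=name", "--format=csv,noheader"],
            capture_output=True, text=True, check=True,
        )
    except (FileNotFoundError, subprocess.CalledProcessError):
        return []  # no driver installed or no GPU attached
    return [line.strip() for line in out.stdout.splitlines() if line.strip()]

if __name__ == "__main__":
    gpus = attached_gpus()
    if gpus:
        print("Attached GPUs:", ", ".join(gpus))
    else:
        print("No GPU visible - this matches Parsec's error 15000 (no host encoder)")
```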

Enable NVIDIA VGA in Google GCP Instance

Hi, I created a Google Compute Engine instance with a Tesla T4 GPU. I want to be able to run an application that renders graphics, so I need the NVIDIA GPU for display.
OS: Ubuntu 18.04
I tried installing the NVIDIA proprietary drivers, but I am still unable to see any valid NVIDIA VGA device; the compute instance I created doesn't seem to be exposing the VGA function of the NVIDIA card.
When I run lshw -c display, it only shows the NVIDIA device as a 3D accelerator and not as a VGA-compatible device.
Hence my question: how do I enable VGA capability for the NVIDIA card on Google Cloud Platform?
Google supports virtual display devices only on Windows instances that use any of the Windows images.
Virtual display devices are not compatible with instances running on the Sandy Bridge CPU platform.
Here are the instructions to enable a virtual display on GCP instances.
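For reference, the virtual display device is enabled with a flag at instance-creation time. A rough sketch of driving that from Python is below; the instance name, zone, and image family are placeholders, it assumes an installed and authenticated gcloud CLI, and the flag name should be checked against the instructions above for your gcloud version.

```python
import subprocess

# Placeholder values - substitute your own instance name, zone, and Windows image.
INSTANCE = "display-test-vm"
ZONE = "us-central1-a"

# --enable-display-device attaches GCP's virtual display device at creation time;
# per the answer above, it is only supported for Windows images.
subprocess.run(
    [
        "gcloud", "compute", "instances", "create", INSTANCE,
        "--zone", ZONE,
        "--image-family", "windows-2019",
        "--image-project", "windows-cloud",
        "--enable-display-device",
    ],
    check=True,
)
```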

Does Virtualbox support Intel Quick Sync Video?

I want to make use of hardware acceleration for decoding an H.264-encoded MP4 file. However, since I am using macOS, I cannot use Intel Quick Sync Video, which only supports Linux and Windows.
Does VirtualBox support Intel Quick Sync Video? If I install a guest Linux distribution in VirtualBox, does Intel Quick Sync Video work?
Quick Sync is Intel's technology, but I think what you are really asking about is GPU virtualization support in VirtualBox. Whether GPU virtualization is possible also depends on your hardware: your system needs to support VT-d or GVT-d, and your virtualization software needs to support GPU passthrough.
QEMU, Xen, and VMware should support VT-d, but as far as I know VirtualBox does not support it yet.
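If you do go the passthrough route with one of those hypervisors on a Linux host, the host needs VT-d enabled in firmware and an active IOMMU in the kernel. A rough check, assuming the standard sysfs layout on mainstream distributions, is whether any IOMMU groups have been created:

```python
from pathlib import Path

def iommu_active() -> bool:
    """True if the Linux kernel has populated IOMMU groups (VT-d / AMD-Vi enabled)."""
    groups = Path("/sys/kernel/iommu_groups")
    return groups.is_dir() and any(groups.iterdir())

if __name__ == "__main__":
    if iommu_active():
        print("IOMMU is active - GPU passthrough is at least possible on this host")
    else:
        print("No IOMMU groups - enable VT-d in firmware and boot with intel_iommu=on")
```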

Enable Hot Add/Remove of CPU and RAM

How do I enable the "CPU hot add and remove" option for a virtual machine? Is any OS-specific configuration required, or is a NUMA architecture required?
Yes, it is OS-specific.
Linux: Newer Linux kernels have support for CPU hot add and CPU hot remove. To verify this, check the documentation for your Linux distribution in the documentation directory of the distribution source code. The documentation contains directions for special boot-time switches related to CPU hot plug, as well as how to dynamically bring CPUs online and offline (a small example of doing this via sysfs is sketched after the links below).
Windows: Windows Server 2012 (Standard and Datacenter Edition) and Windows Server 2008 Datacenter Edition support CPU hot add, but not CPU hot remove.
Note: Windows Server 2008 Standard and Enterprise Editions do not support CPU hot add.
Check and confirm that the virtual machines are using hardware version 7 or later.
For more information regarding hot add, refer to these links:
VMware KB
Hot Plug Settings
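On the Linux side, once hot add is enabled for the VM and the hypervisor has added a vCPU, the new CPU shows up under /sys/devices/system/cpu and can be brought online by writing 1 to its online file. A minimal sketch, assuming the standard sysfs layout and root privileges:

```python
from pathlib import Path

def online_all_cpus() -> None:
    """Bring any offline (e.g. freshly hot-added) vCPUs online via sysfs.

    Must run as root. cpu0 usually has no 'online' file because it cannot be offlined.
    """
    for online in Path("/sys/devices/system/cpu").glob("cpu[0-9]*/online"):
        if online.read_text().strip() == "0":
            online.write_text("1")
            print(f"brought {online.parent.name} online")

if __name__ == "__main__":
    online_all_cpus()
```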

Running an MS unittest that requires a GPU from TFS Build

We have a series of unit tests that require an NVIDIA GPU for execution. These tests currently fail (I think) because TFS Build runs as a Windows service, and Windows (Windows 7 and later) does not allow Windows services to access GPUs. Any ideas on how to get around this problem?
You are correct in that the MS Test Execution Engine on a build server does run as a service (just like the MSBuild process) and that services by default cannot access the GPU because of the "Session 0 Isolation" concept that was introduced in Windows Vista.
From what I've researched, the only official way to get around this is to purchase an NVIDIA Tesla card and have it run in "Tesla Compute Cluster" (TCC) mode, which allows services to access the GPU for compute work (like CUDA). There is indirect evidence that some Quadro cards also support TCC mode, but I have not found anything official stating which ones do.
I have a question up on Nvidia's forums asking about an inexpensive card for this exact scenario but it does not have any replies as of yet.
EDIT:
I just acquired an NVIDIA Quadro K2200 and can confirm that it does indeed support TCC mode and works great for running CUDA unit tests on my build server during the build process.
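As a hedged sketch of how to confirm the card is actually in TCC mode on the build agent (and therefore reachable from the session-0 build service), the driver's nvidia-smi tool can report the current driver model. This assumes nvidia-smi is on the PATH and that your driver version supports the driver_model.current query field.

```python
import subprocess

def driver_models() -> list[str]:
    """Return the current driver model (WDDM or TCC) reported for each NVIDIA GPU."""
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=driver_model.current", "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    )
    return [line.strip() for line in out.stdout.splitlines() if line.strip()]

if __name__ == "__main__":
    models = driver_models()
    if any(m.upper() == "TCC" for m in models):
        print("A GPU is in TCC mode - CUDA is reachable from services like the build agent")
    else:
        print("No GPU in TCC mode - CUDA tests launched by the build service will not see a GPU")
```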