I want to position an image (.png) in my window using OpenGL ES. I have done that, but now I want to set the transparency of that image.
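A minimal sketch of one common approach, assuming OpenGL ES 1.x (fixed function) and that the PNG is already loaded into an RGBA texture; textureId and the quad's vertex/texcoord arrays are placeholders for whatever the existing code uses:

    // Enable blending and fade the whole textured quad via the current color.
    glEnable(GL_TEXTURE_2D);
    glBindTexture(GL_TEXTURE_2D, textureId);

    glEnable(GL_BLEND);
    glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);  // standard "over" blending

    // GL_MODULATE multiplies the texture's color/alpha by the current color,
    // so an alpha of 0.5f renders the image at 50% opacity.
    glTexEnvf(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_MODULATE);
    glColor4f(1.0f, 1.0f, 1.0f, 0.5f);

    glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);  // the quad set up earlier

    glDisable(GL_BLEND);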
Related
I am trying to understand Signed Distance Field (SDF) font rendering; from what I understand, it is used to preserve quality when zooming or rotating rendered fonts. Is there any difference in quality between a TTF font rendered onto a texture and a font rendered via SDF if the font is never zoomed or rotated? For example, in a 2D game, would there be any improvement from rendering via SDF versus a TTF font rendered onto a texture (other than reduced memory usage)?
I'm trying to create a display with a complex OpenGL image and some spinboxes on top of the image. Using http://doc.qt.digia.com/qq/qq26-openglcanvas.html I'm able to build a two-layer object (inheriting from QGraphicsScene) with a simple OpenGL image as the background and the controls in the foreground.
So now I'm trying to display my true OpenGL image as the background. This image is created by:
A quad mapped onto a structure,
Some small 2D objects, represented by 2D textures with an alpha channel and specific shaders, drawn on top of the quad (higher z value),
Some polylines.
With this image I get some strange behavior: the 2D textured objects are drawn with a white background. Some experiments seem to indicate that, while this complex OpenGL image is being drawn, the alpha channel is disabled.
I tried different configurations for the QGLWidget used as the viewport of the QGraphicsView, but without result.
So I need help to create this OpenGL image with the correct transparency effects.
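One thing worth checking, assuming the OpenGL drawing happens in the scene's drawBackground() as in the linked article: QPainter can reset the GL state between frames, so blending has to be re-enabled explicitly before the textured objects are drawn. A rough sketch (MyScene derives from QGraphicsScene, and drawQuadAndOverlays() is a hypothetical placeholder for the quad, the textured 2D objects, and the polylines):

    void MyScene::drawBackground(QPainter *painter, const QRectF &rect)
    {
        Q_UNUSED(rect);
        painter->beginNativePainting();   // hand the context over to raw OpenGL

        glEnable(GL_BLEND);
        glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);

        drawQuadAndOverlays();            // quad, alpha-textured 2D objects, polylines

        painter->endNativePainting();
    }

    // When creating the view, the QGLWidget viewport can also request an alpha channel:
    // view->setViewport(new QGLWidget(QGLFormat(QGL::SampleBuffers | QGL::AlphaChannel)));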
I'm looking for a tutorial on how to draw a .PNG using DirectX 10, but I'm having no luck. Does anyone know where I can find more information on this? I want to create a 2D game.
You can load a PNG as a texture and render it the same way you render other images. For the transparency you can use alpha blending, which you enable with blend states (the Direct3D 10 counterpart of render states). (Tutorial only googled, not tested.)
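For reference, a rough sketch of what enabling standard alpha blending looks like with a Direct3D 10 blend state (device is assumed to be your ID3D10Device*):

    D3D10_BLEND_DESC blendDesc = {};
    blendDesc.BlendEnable[0]           = TRUE;
    blendDesc.SrcBlend                 = D3D10_BLEND_SRC_ALPHA;
    blendDesc.DestBlend                = D3D10_BLEND_INV_SRC_ALPHA;
    blendDesc.BlendOp                  = D3D10_BLEND_OP_ADD;
    blendDesc.SrcBlendAlpha            = D3D10_BLEND_ONE;
    blendDesc.DestBlendAlpha           = D3D10_BLEND_ZERO;
    blendDesc.BlendOpAlpha             = D3D10_BLEND_OP_ADD;
    blendDesc.RenderTargetWriteMask[0] = D3D10_COLOR_WRITE_ENABLE_ALL;

    ID3D10BlendState *blendState = nullptr;
    device->CreateBlendState(&blendDesc, &blendState);

    // Bind it before drawing the sprite quad.
    float blendFactor[4] = { 0.0f, 0.0f, 0.0f, 0.0f };
    device->OMSetBlendState(blendState, blendFactor, 0xffffffff);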
You can use D3DX10CreateShaderResourceViewFromFile to create a shader-resource view from a PNG file. This tutorial explains how it works; a loading sketch follows the steps below.
In short:
Create a shader to draw a rectangle
Create a vertex layout
Create a vertex buffer that stores the vertices of the rectangle
Create a sampler to sample the texture
Create a texture object
Draw the scene
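As a sketch of the loading step only (assuming a Unicode build, an ID3D10Device* called device, and a hypothetical sprite.png; requires d3dx10.h and linking against d3dx10.lib):

    #include <d3dx10.h>

    ID3D10ShaderResourceView *textureView = nullptr;
    HRESULT hr = D3DX10CreateShaderResourceViewFromFile(
        device,          // the D3D10 device
        L"sprite.png",   // hypothetical path to the PNG
        nullptr,         // default image load info
        nullptr,         // no thread pump: load synchronously
        &textureView,
        nullptr);
    if (FAILED(hr)) {
        // handle the error (missing file, unsupported format, ...)
    }

    // Later, bind the view for the pixel shader (slot 0 assumed by the shader):
    // device->PSSetShaderResources(0, 1, &textureView);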
I've been playing with LWJGL a little, as a bit of a step up from Pygame. I'm trying to render a sprite, and I was wondering whether LWJGL has a function similar to Pygame's colorkey, which lets you define a color in an image that will be rendered as transparent. Or do you have to use an alpha channel in OpenGL?
OpenGL doesn't have any built-in color keying support. You'll need to either manually swap your key for alpha on the CPU, or use a custom shader that replaces it on the fly.
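A sketch of the CPU-side option (shown in C++ for brevity; the same loop works on the decoded ByteBuffer in LWJGL): after decoding the image to RGBA, zero the alpha of every pixel matching the key color, then upload the buffer as a normal RGBA texture and draw with GL_BLEND enabled.

    #include <cstddef>
    #include <cstdint>

    // Replace the color key with transparency in an RGBA8 pixel buffer.
    void applyColorKey(std::uint8_t *rgba, std::size_t pixelCount,
                       std::uint8_t keyR, std::uint8_t keyG, std::uint8_t keyB)
    {
        for (std::size_t i = 0; i < pixelCount; ++i) {
            std::uint8_t *px = rgba + i * 4;
            if (px[0] == keyR && px[1] == keyG && px[2] == keyB) {
                px[3] = 0;  // fully transparent
            }
        }
    }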
Right now I'm drawing a cube with OpenGL on Windows, using a WGL context. I have blending enabled, so my cube looks semi-transparent. Basically the background == the clear color (black). I'd like to be able to save the image in raw RGBA format, which I can then turn into a PNG. I basically want the cube to blend against a null background (0, 0, 0, 0). How can I save the OpenGL output with the background color as (0, 0, 0, 0) (transparent), without using a color key (like 255, 0, 255)?
Thanks
Just draw the cube, setting the clear color to (0, 0, 0, 0), and save the output using glReadPixels.
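A minimal sketch of that read-back (drawCube() stands in for the existing drawing code; note that the window's pixel format needs alpha bits, otherwise the read-back alpha will always be fully opaque):

    #include <cstddef>
    #include <vector>
    #include <windows.h>   // must come before GL/gl.h on Windows
    #include <GL/gl.h>

    void drawCube();  // hypothetical: the existing blended cube drawing

    std::vector<unsigned char> captureFrame(int width, int height)
    {
        glClearColor(0.0f, 0.0f, 0.0f, 0.0f);   // fully transparent black
        glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

        drawCube();

        std::vector<unsigned char> pixels(static_cast<std::size_t>(width) * height * 4);
        glPixelStorei(GL_PACK_ALIGNMENT, 1);
        glReadPixels(0, 0, width, height, GL_RGBA, GL_UNSIGNED_BYTE, pixels.data());
        // Rows are returned bottom-to-top; flip vertically before writing the PNG.
        return pixels;
    }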