I'm rendering a partially transparent rectangle over my game (D3D9), eventually to be part of a GUI, and I want to blur the contents behind it. I understand the best way to do this would be with a shader, but that's basically the extent of what I know. Many of the implementations I've found seem far more complex than I should need, or they blur an image, whereas I'm blurring content that's already been drawn. A similar question has been asked here for C# but received fairly vague answers.
Rather than drawing the objects you want to blur directly to the screen, draw them to a texture bound as the render target. Then bind that texture as input to a blur pixel shader (D3D9 predates compute shaders) and write the result to another render target. Finally, you can draw the blurred result as a "full-screen quad": basically a 2D rect that covers the entire screen and carries your scene texture. This article is from 2003, so the source code might not be useful, but it covers the basic idea.
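The blur pass itself is just a convolution over the scene texture. Here is a minimal sketch in Python rather than HLSL, using a separable Gaussian with an illustrative 5-tap kernel (the radius and sigma are arbitrary choices, not values from the original post); the same weights would be hard-coded in the pixel shader, one pass horizontal and one vertical:

```python
import math

def gaussian_weights(radius, sigma):
    """Normalized 1D Gaussian kernel of size 2*radius + 1."""
    raw = [math.exp(-(i * i) / (2.0 * sigma * sigma))
           for i in range(-radius, radius + 1)]
    total = sum(raw)
    return [w / total for w in raw]

def blur_1d(row, weights):
    """Convolve one row of pixel values, clamping at the edges."""
    radius = len(weights) // 2
    out = []
    for x in range(len(row)):
        acc = 0.0
        for k, w in enumerate(weights):
            sx = min(max(x + k - radius, 0), len(row) - 1)
            acc += row[sx] * w
        out.append(acc)
    return out

def blur_2d(image, weights):
    """Horizontal pass, then vertical pass (the Gaussian is separable)."""
    horiz = [blur_1d(row, weights) for row in image]
    cols = [blur_1d(list(col), weights) for col in zip(*horiz)]
    return [list(row) for row in zip(*cols)]
```

In the real pipeline each pass is one full-screen quad draw, sampling the previous pass's render target; splitting into two 1D passes turns an N×N kernel into 2N texture fetches per pixel.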
I started learning shaders by playing around on ShaderToy.com. For a project I want to make, I need an arbitrary glow filter in WebGL (not bloom). I want to calculate an alpha value that I can then use to draw a colored glow, or to drive some animated texture like fire, etc.
So far I thought of a few ideas:
Averaging alpha across some area near each pixel - obviously slow
Going in a circle around each pixel in one loop, then over distance in another, to calculate alpha based on how close the shape is to this pixel - probably just as slow
Blurring the entire shape - sounds like overkill, since I just need the alpha
Are there other ideas for approaching this? All I can find are gaussian blur techniques from bloom-like filters.
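The distance-based idea from the list above can be sketched in Python: alpha falls off linearly with distance to the nearest covered pixel. This is brute force, purely to illustrate the falloff math (the grid and radius are arbitrary), not how you would write it in a fragment shader:

```python
import math

def distance_glow_alpha(mask, radius):
    """Alpha from distance to the nearest covered pixel (brute force).

    mask[y][x] is 1 where the shape is, 0 elsewhere; alpha falls off
    linearly from 1 at the shape to 0 at `radius`.
    """
    inside = [(y, x) for y, row in enumerate(mask)
              for x, v in enumerate(row) if v]
    h, w = len(mask), len(mask[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            d = min(math.hypot(y - iy, x - ix) for iy, ix in inside)
            out[y][x] = max(0.0, 1.0 - d / radius)
    return out
```

A practical version would precompute this as a distance field (e.g. a signed distance field texture) so the per-pixel shader work is a single lookup.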
Please see this NVIDIA document on the simple glow effect.
The basic idea is to:
render the scene into the back buffer
activate the effect
render some elements of the scene into an FBO
compute the glow effect
bind the final FBO as a texture and blend the effect with the previously rendered scene in the back buffer
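The steps above can be sketched end to end on a toy "image" (a 2D list of brightness values). The box filter here is a cheap stand-in for whatever blur the glow pass actually uses, and the threshold and sizes are illustrative choices, not values from the NVIDIA document:

```python
def extract_glow_sources(scene, threshold):
    """Isolate the elements that should glow (here: bright pixels)."""
    return [[v if v >= threshold else 0.0 for v in row] for row in scene]

def box_blur(image, radius):
    """Cheap stand-in for the real blur used to spread the glow."""
    h, w = len(image), len(image[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            acc, n = 0.0, 0
            for dy in range(-radius, radius + 1):
                for dx in range(-radius, radius + 1):
                    yy, xx = y + dy, x + dx
                    if 0 <= yy < h and 0 <= xx < w:
                        acc += image[yy][xx]
                        n += 1
            out[y][x] = acc / n
    return out

def additive_blend(scene, glow):
    """Blend the glow FBO over the back buffer, clamped to 1.0."""
    return [[min(s + g, 1.0) for s, g in zip(srow, grow)]
            for srow, grow in zip(scene, glow)]
```

On the GPU the extract and blur steps each become a full-screen pass into an FBO, and the final step is an additive blend back into the back buffer.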
I want to create an app in Cocos2d/Cocos2d-x in which I have an image that is not visible, but when I move my finger over the device it starts drawing: only the part of the image where I move my finger is drawn.
Thanks in Advance
There are two ways I can think of drawing an image.
The first way would be like a brush. You would use a RenderTexture and draw/visit a brush sprite into it. If you only need to draw with solid colors (which can have opacity), you could also use the primitive draw commands (drawCircle, drawPoly, drawSegment). You will need a high rate of touch tracking, and will likely want to draw segments or Bezier curves between touch positions to catch fast movements.
http://discuss.cocos2d-x.org/t/using-rendertexture-to-render-one-sprite-multiple-times/16332/3
http://discuss.cocos2d-x.org/t/freehand-drawing-app-with-cocos2d-x-v3-3-using-rendertexture/17567/9
Searching on how other drawing games work would be useful.
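The segment interpolation mentioned above can be sketched like this: given two consecutive touch positions, emit evenly spaced stamp positions so a fast swipe still leaves a continuous stroke. The spacing value is an arbitrary choice; each returned point is where you would draw/visit one brush sprite into the RenderTexture:

```python
import math

def stamp_positions(p0, p1, spacing):
    """Return brush-stamp points from p0 to p1, no more than `spacing` apart.

    Interpolating like this fills the gaps that appear when the finger
    moves faster than touch events arrive.
    """
    dx, dy = p1[0] - p0[0], p1[1] - p0[1]
    dist = math.hypot(dx, dy)
    steps = max(1, int(math.ceil(dist / spacing)))
    return [(p0[0] + dx * i / steps, p0[1] + dy * i / steps)
            for i in range(steps + 1)]
```

In the Cocos2d-x version you would call this between onTouchMoved events and draw one brush sprite per point inside RenderTexture begin/end.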
The second way I can envision it is similar to revealing, except using an inverse mask: you would draw a set image, but reveal that image by drawing.
http://www.raywenderlich.com/4428/how-to-mask-a-sprite-with-cocos2d-2-0
There are more elaborate ways to handle the drawing into the RenderTexture in order to have the brush design tile correctly and repeat based on a given size, but that'll involve making something closer to an image editing tool.
I have a very simple OpenGL (3.2) setup: no lighting, a perspective projection, and a simple shader program (it applies the projection transformation and uses texture2D to read the color from the texture).
The camera is looking down the negative z-axis and I draw a few walls and pillars on the x-y-plane with a texture (http://i43.tinypic.com/2ryszlz.png).
Now I'm moving the camera in the x-y-plane and this is what it looks like:
http://i.imgur.com/VCrNcly.gif.
My question is now: How do I handle the flickering of the wall texture?
When the camera faces the walls at a shallow angle, the texture is compressed on screen, so one screen pixel actually covers several texels of the texture, but only one of them is chosen for display. From the information I have access to in the shaders, I don't see how to perform an operation that interpolates the required color.
As this looks like a problem nearly every 3D application should have, the solution is probably pretty simple (I hope?).
I can't view the images, but from what you are describing, you are looking for MIPMAPPING. Please google it; it's a very simple and very widely used technique, and you will be able to enable it by adding one or two lines to your program. Good luck.
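For intuition, this is roughly what gets precomputed when mipmapping is enabled: each mip level averages 2×2 texel blocks of the level above, so a minified wall samples from a level where the averaging has already been done instead of picking one arbitrary texel. A toy grayscale sketch (not real GL code):

```python
def next_mip_level(image):
    """Halve a power-of-two grayscale image by averaging 2x2 texel blocks."""
    h, w = len(image), len(image[0])
    return [[(image[y][x] + image[y][x + 1] +
              image[y + 1][x] + image[y + 1][x + 1]) / 4.0
             for x in range(0, w, 2)]
            for y in range(0, h, 2)]

def build_mip_chain(image):
    """Full chain down to 1x1, as glGenerateMipmap would produce."""
    chain = [image]
    while len(chain[-1]) > 1 or len(chain[-1][0]) > 1:
        chain.append(next_mip_level(chain[-1]))
    return chain
```

In OpenGL 3.2 itself the two lines are essentially glGenerateMipmap(GL_TEXTURE_2D) after uploading the texture, plus setting GL_TEXTURE_MIN_FILTER to GL_LINEAR_MIPMAP_LINEAR so the sampler blends between levels.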
We are working on porting some software from Windows to MacOS.
When we bring up a texture with an alpha channel, fully opaque pixels work as expected, and fully transparent pixels work as expected (you can see the wall behind).
However, semi-transparent pixels (opacity > 0% and < 100%) render poorly: you can see through the wall behind, and the skybox shows through both the texture and the wall behind it.
I know you will likely need more information, and I will be happy to provide it. I am not looking for a quick-fix solution; I have simply run out of ideas and need someone else to take a guess at what's wrong.
I will post the solution, and the correct answer goes to whoever points me that way.
The texture is not placed right on the wall; it is placed on a static mesh close to the wall.
(Unable to post images as this is my first question here)
You are sorting transparent objects by depth, yes? I gather from your question that the answer is no.
You cannot just render transparent objects the way you do opaque ones. Your renderer is just a fancy triangle drawer; as such, it has no real concept of objects, or even of transparency. You achieve transparency by blending the transparent pixels with whatever happens to be in the framebuffer at the time you draw the transparent triangle.
The renderer simply cannot know what you intend to draw behind the triangle later. Therefore, the general method for transparent objects is to:
Render all opaque objects first.
Render transparent objects sorted back-to-front. Also, turn off depth writes (depth tests are fine).
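The two steps above amount to a per-frame ordering pass. A minimal sketch, with hypothetical objects carrying a precomputed camera-space distance (the tuple layout is illustrative, not from any particular engine):

```python
def draw_order(objects):
    """Opaque objects first, then transparent ones sorted back-to-front.

    `objects` is a list of (name, camera_distance, is_transparent) tuples.
    Opaque geometry can go in any order because the depth buffer resolves
    it; transparent geometry must be blended farthest-first so each blend
    sees the correct colors already in the framebuffer.
    """
    opaque = [o for o in objects if not o[2]]
    transparent = sorted((o for o in objects if o[2]),
                         key=lambda o: o[1], reverse=True)  # farthest first
    return opaque + transparent
```

Depth writes stay off during the transparent pass so a near transparent surface doesn't occlude a farther one that is drawn after it in the same frame; depth testing stays on so opaque geometry still hides transparent surfaces behind it.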
This might not be an answer, but it might be useful.
What about making an object with the transparent texture applied in Maya/3ds Max, exporting it as FBX, and importing it into Unreal?
Is it possible to create a GLSL shader that gets any object to be surrounded by a glowing effect?
Let's say I have a 3D cube, and if it's selected, the cube should be surrounded by a blue glowing effect. Any hints?
Well, there are several ways of doing this. If each object is also represented in a winged-edge format, then it is trivial to calculate the silhouette and then extrude it to generate a glow. This, however, is very much a CPU method.
For a GPU method, you could try rendering to an offscreen buffer with the stencil set to increment. If you then blur the image (writing only to pixels where the stencil is non-zero), you will get a blur around the edge of the object, which can then be drawn into the main scene with alpha blending. This is more of a blur than a glow, but it would be relatively easy to re-jig the brightness so that it renders as a glow.
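The GPU method above essentially blurs the object's silhouette. Here is the idea on a binary mask, where the blurred coverage just outside the shape becomes the glow's alpha; this is a toy stand-in for the stencil-restricted blur, with the mask and radius chosen purely for illustration:

```python
def glow_alpha(mask, radius):
    """Blur a binary silhouette mask; keep the result only outside the shape.

    mask[y][x] is 1 inside the object, 0 outside. The returned values play
    the role of the alpha blended (tinted blue, say) into the main scene
    around the object's edge.
    """
    h, w = len(mask), len(mask[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            if mask[y][x]:
                continue  # inside the shape: the object itself covers it
            acc, n = 0.0, 0
            for dy in range(-radius, radius + 1):
                for dx in range(-radius, radius + 1):
                    yy, xx = y + dy, x + dx
                    if 0 <= yy < h and 0 <= xx < w:
                        acc += mask[yy][xx]
                        n += 1
            out[y][x] = acc / n
    return out
```

Because the halo falls to zero with distance from the silhouette, scaling it up before blending is the "re-jig the brightness" step that turns the blur into a visible glow.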
There are plenty of other methods too ... here are a couple of links for you to look through:
http://http.developer.nvidia.com/GPUGems/gpugems_ch21.html
http://www.codeproject.com/KB/directx/stencilbufferglowspart1.aspx?display=Mobile
Have a hunt around on Google, because there is lots of information out there :)