Is it possible to smooth a grid of values displayed as color using methods such as bicubic or linear interpolation with SDL? Currently I use SDL_FillRect to fast fill rectangles. Here is a reference image of interpolation in MATLAB to get an idea of what I want to achieve:
There is no built-in system in SDL to do what you want. You could easily do this using a shader and OpenGL, but you mentioned using SDL_FillRect so I assume you are using an SDL_Surface which suggests you're not using OpenGL. In that case you can easily plot the pixels and calculate the colours however you like.
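To make "calculate the colours however you like" concrete, here is a minimal sketch of the bilinear interpolation math you would evaluate per output pixel before plotting it to the surface. This is plain Python with illustrative names (`bilerp`, `grid`), not an SDL API; bicubic works the same way with a larger neighbourhood:

```python
def bilerp(grid, gx, gy):
    """Bilinearly interpolate a 2D grid of scalar values at
    fractional grid coordinates (gx, gy)."""
    x0, y0 = int(gx), int(gy)
    x1 = min(x0 + 1, len(grid[0]) - 1)  # clamp at the grid border
    y1 = min(y0 + 1, len(grid) - 1)
    fx, fy = gx - x0, gy - y0           # fractional position in the cell
    top = grid[y0][x0] * (1 - fx) + grid[y0][x1] * fx
    bot = grid[y1][x0] * (1 - fx) + grid[y1][x1] * fx
    return top * (1 - fy) + bot * fy

# Map each output pixel back into grid space, interpolate, then
# convert the value to a colour and plot it (e.g. per-pixel writes
# to the SDL_Surface instead of SDL_FillRect).
grid = [[0.0, 1.0],
        [1.0, 0.0]]
print(bilerp(grid, 0.5, 0.5))  # centre of the cell -> 0.5
```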
I'm trying to create 2d drop shadow filter using glsl.
What I'm doing now is: first render my pixels into a texture, then apply a Gaussian blur filter to it, then draw it to the main framebuffer with a tinted colour, then draw the actual pixels on top.
The result is quite nice, but the performance is low. So is there a simpler way to create a drop shadow using GLSL, since the shadow doesn't need all the colour components of the actual pixel, only the alpha value?
If the shapes are constant, you could precompute the drop shadow. If not, it is simply a matter of making your blur shader more efficient. There is a lot of information available about this online, such as Rideout's article (archived) or Ivan Kuckir's article (archived).
The trick is to minimize the number of texture2D calls and to use built-in features like linear interpolation.
Also refer to Fastest Gaussian blur implementation
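The linear-sampling trick from those articles folds each pair of adjacent discrete Gaussian taps into a single bilinearly filtered fetch, roughly halving the texture2D calls. A sketch of the weight/offset precomputation (function name illustrative; the input below is the common 9-tap one-sided kernel used in Rideout's article):

```python
def linear_taps(weights):
    """weights: one-sided discrete kernel [w0, w1, w2, ...] where w0
    is the centre tap. Collapses pairs (w[i], w[i+1]) into single
    taps sampled between the two texels, so hardware linear
    filtering does the mixing: w = w1 + w2, off = (o1*w1 + o2*w2)/w."""
    offsets, combined = [0.0], [weights[0]]
    for i in range(1, len(weights) - 1, 2):
        w = weights[i] + weights[i + 1]
        offsets.append((i * weights[i] + (i + 1) * weights[i + 1]) / w)
        combined.append(w)
    return offsets, combined

offsets, weights = linear_taps(
    [0.2270270270, 0.1945945946, 0.1216216216, 0.0540540541, 0.0162162162])
print([round(o, 4) for o in offsets])  # [0.0, 1.3846, 3.2308]
```

The resulting offsets are in texel units; in the shader you would scale them by 1/textureSize and sample at those fractional positions.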
I started learning shaders, playing around on ShaderToy.com. For a project I want to make, I need to create an arbitrary glow filter on WebGL (not Bloom). I want to calculate alpha that I can then use to draw a color glow or use for some animated texture like fire etc.
So far I thought of a few ideas:
Averaging alpha across some area near each pixel - obviously slow
Going in a circle around each pixel in one loop, then over distance in another, to calculate alpha based on how close the shape is to this pixel - probably just as slow
Blur the entire shape - sounds like overkill since I just need the alpha
Are there other ideas for approaching this? All I can find are gaussian blur techniques from bloom-like filters.
See this NVIDIA document on the simple glow effect.
The basic idea is to
render the scene in the back buffer
activate the effect
render some elements of the scene into an FBO
compute the Glow effect
bind the final FBO as a texture and blend this effect with the previously rendered scene in the back buffer
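The final blend step is typically additive blending (glBlendFunc(GL_ONE, GL_ONE)); per channel it is just a clamped sum of the scene colour and the blurred glow colour. A minimal sketch of that composite, with illustrative values in [0, 1]:

```python
def additive_blend(scene, glow):
    """Additively composite a blurred glow over the scene,
    channel by channel, clamping to the displayable range."""
    return [min(s + g, 1.0) for s, g in zip(scene, glow)]

# A dim channel brightens; an already-bright channel saturates.
print(additive_blend([0.25, 0.9], [0.5, 0.5]))  # [0.75, 1.0]
```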
I can get the histogram of an opengl texture using the glGetHistogram() function.
Similar to the OpenCV histogram function, where a second OpenCV matrix can be given as a mask, I have an OpenGL Texture and a binary mask (either as alpha channel or as a separate texture), and I would like to get a histogram of all the pixels in the image that are not masked.
Is this possible somehow?
glGetHistogram has been deprecated since OpenGL 3.1 anyway (it was part of the optional imaging subset).
Using compute shaders or occlusion queries would be a better idea.
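Whichever route you take, the masked-histogram logic itself is simple: bin a pixel only when its mask value is set (in a compute shader this would be an atomicAdd into an SSBO; on the CPU you could do it after a glReadPixels). A sketch of that logic in plain Python, with illustrative names:

```python
def masked_histogram(pixels, mask, bins=256):
    """Histogram of 8-bit pixel values, counting only pixels whose
    binary mask entry is nonzero (1 = include, 0 = ignore)."""
    hist = [0] * bins
    for value, keep in zip(pixels, mask):
        if keep:
            hist[value] += 1
    return hist

h = masked_histogram([0, 10, 10, 255], [1, 0, 1, 1])
print(h[0], h[10], h[255])  # 1 1 1  (one of the two 10s is masked out)
```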
I have researched and the methods used to make a blooming effects are usually based on having a sharp and blurred image conjoined to give the glow effect. But I want to know how I can make gl_lines(or any line) have brightness. Since in my game I am randomly generating a simple 2D terrain, I wish to make the terrain line segments glow.
Use a fragment shader to calculate the distance from a fragment to the edge and colour the fragment with the appropriate colour value. You can use a simple control curve to control the radius and intensity falloff of the glow (like in Photoshop). It can also be tuned to act like a wireframe visualization. The idea is that you don't really rasterize points into lines with a draw call; you just shade each pixel based on its distance from the corresponding edge.
Compared to a blur pass you get, first, better performance, and second, per-pixel control over the glow: you can have a non-uniform glow, which you cannot get with a blur because a blur is not aware of the actual line geometry and just blindly works on pixels, whereas with edge-distance detection you use the actual geometry data as input without flattening it down to pixels. You can also have things like gradient glows, e.g. a glow colour that changes with the radius.
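The per-fragment computation can be sketched like this: distance from the pixel to the nearest point on the segment, pushed through a control curve (here a smoothstep falloff; the radius and curve are illustrative knobs, not fixed by the answer). In a real shader this runs in GLSL with the segment endpoints passed as uniforms:

```python
import math

def dist_to_segment(px, py, ax, ay, bx, by):
    """Distance from point (px, py) to segment (ax, ay)-(bx, by)."""
    abx, aby = bx - ax, by - ay
    t = ((px - ax) * abx + (py - ay) * aby) / (abx * abx + aby * aby)
    t = max(0.0, min(1.0, t))           # clamp projection onto the segment
    cx, cy = ax + t * abx, ay + t * aby
    return math.hypot(px - cx, py - cy)

def glow(px, py, seg, radius=4.0):
    d = dist_to_segment(px, py, *seg)
    t = max(0.0, 1.0 - d / radius)      # linear falloff, 0 at the radius
    return t * t * (3.0 - 2.0 * t)      # smoothstep for a soft edge

seg = (0.0, 0.0, 10.0, 0.0)
print(glow(5.0, 0.0, seg))  # on the line -> 1.0
print(glow(5.0, 4.0, seg))  # at the glow radius -> 0.0
```

Swapping the smoothstep for another curve (e.g. a power falloff) is exactly the "control curve" knob mentioned above.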
I want to render a fire effect in OpenGL based on a particle simulation. I have hundreds of particles which have a position and a temperature (and therefore a color) as well as with all their other properties. Simply rendering a solidSphere using glut doesn't look very realistic, as the particles are spread too wide. How can I draw the fire based on the particles information?
If you are just trying to create a realistic fire effect, I would use some kind of pre-existing library as recommended in other answers. But it seems to me that you are after a display of the simulation itself.
A direct solution worth trying might be to replace your current spheres with billboards (i.e. graphic images that always face the camera) which are solid white in the middle and fade to transparent towards the edges - obviously positioning and colouring the images according to your particles.
A better solution, I feel, is to approach the flame as a set of 2D grids on which you can control the transparency and colour of each vertex. You could do this in OpenGL by constructing a plane from quads and using your particle system to calculate (via interpolation from the nearest particles you have) the colour and transparency of each vertex. OpenGL will interpolate each pixel between vertices for you and give you a smooth-looking picture of the 'average particles in the area'.
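One simple way to do that per-vertex interpolation is inverse-distance weighting over the nearby particles: closer particles contribute more to a vertex's colour/temperature. A sketch under that assumption (the particle format and the 1/d weight are illustrative choices, not prescribed by the answer):

```python
import math

def vertex_value(vx, vy, particles, eps=1e-6):
    """Inverse-distance-weighted value at grid vertex (vx, vy).
    particles: list of (x, y, value), e.g. value = temperature."""
    wsum, vsum = 0.0, 0.0
    for x, y, value in particles:
        w = 1.0 / (math.hypot(vx - x, vy - y) + eps)  # eps avoids div by 0
        wsum += w
        vsum += w * value
    return vsum / wsum

particles = [(0.0, 0.0, 100.0), (2.0, 0.0, 200.0)]
print(round(vertex_value(1.0, 0.0, particles)))  # midway -> 150
```

In practice you would also cap the search to the k nearest particles (or a radius) so the cost per vertex stays bounded.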
You probably want to use a particle system to render a fire effect, here's a NeHe tutorial on how to do just that: http://nehe.gamedev.net/data/lessons/lesson.asp?lesson=19