Coloring objects in ray tracing - C++

I'm trying to render a scene. So far, I have intersected my rays with the objects in the scene, and if there is an intersection, I set a random color at that intersection.
What I need to do next is color the pixels according to their values. I have watched more than ten tutorials and read several websites about coloring pixels. However, the file I'm reading does not contain colors for the objects. Instead it has the following:
An ambient light, with an RGB color
A point light, with an RGB color and a position
Objects have ambient, diffuse, specular, and mirror reflectance (each in RGB) and a Phong exponent (a single value).
Also, I know that the intensity of the light received is inversely proportional to the square of the distance (as the distance becomes larger, less light reaches an object).
If I had the color of the object, I could use the algorithm below:
Color3 trace(..)
{
    ...
    Color3 ambient = object.color * 0.3;
    Color3 phong = phongModel(..);   // or simply object.color
    Color3 reflection = trace(..);
    return ambient + phong + reflection;
}
as stated in: How to compute reflected color?
I don't have the color of the object, just reflectance values. How can I calculate the color of the object?

However, the file I'm reading does not contain colors for the objects.
Instead it has the following:
...
Objects have ambient, diffuse, specular, and mirror reflectance (each in RGB) and a Phong exponent (a single value).
If you have RGB values for all of these, then that is the color of the object.
Multiply the incoming light by how strongly the object reflects it at the given angles, and the result is the color by which the object is perceived.
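To make that concrete, here is a minimal sketch of the idea in GLSL-style notation. All of the names (ka, kd, ks, q, and so on) are hypothetical placeholders for the values read from the scene file, and the division by the squared distance applies the inverse-square falloff mentioned in the question:
// ka, kd, ks = ambient/diffuse/specular reflectance (RGB), q = Phong exponent
vec3 shade(vec3 ka, vec3 kd, vec3 ks, float q,
           vec3 ambientLight, vec3 lightColor, vec3 lightPos,
           vec3 p, vec3 n, vec3 eye)
{
    vec3  toLight = lightPos - p;
    float d2      = dot(toLight, toLight); // squared distance to the light
    vec3  l       = normalize(toLight);
    vec3  v       = normalize(eye - p);    // direction toward the viewer
    vec3  r       = reflect(-l, n);        // light direction mirrored about the normal

    vec3 ambient  = ka * ambientLight;
    vec3 diffuse  = kd * max(dot(n, l), 0.0);
    vec3 specular = ks * pow(max(dot(r, v), 0.0), q);

    // Incoming point-light energy falls off with the square of the distance.
    return ambient + (diffuse + specular) * lightColor / d2;
}
In other words, the diffuse reflectance kd plays the role of object.color in the trace() snippet above; there is no separate "object color" to look for.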

Related

Showing Red Light on Green Surfaces LWJGL

Hey everyone, I'm working with lighting in a 2D tile-based game and have run into a problem with my lighting calculations. In my game I take grayscale images and then color them using shaders with whatever color I like, whether that be green (rgb = (0,1,0)), red (rgb = (1,0,0)), or any other color. I then apply my lighting calculations to that textured and colored pixel. The lighting works fine when the light is white (rgb = (1,1,1)), but when it is, say, red or green, it won't show the way I want it to. I know why this is happening, of course: realistically, a pure red light in a pure green room would reflect no red light, so the room would remain dark. What I really want is to see a red light appear over a green surface. So my question is: how can I show a red light clearly on a green surface (or really any color on any surface)?
This is the code for my fragment shader, where attenuation is simply the attenuation for the light, lightColor is obviously the light's RGB value, distance is the distance from the given vertex to that light (calculated in the vertex shader), and finally color is the RGB value that is applied to the texture.
Thanks in advance for your help!
vec3 totalDiffuse = vec3(0.0);
for(int i = 0; i < 4; i++)
{
    float attFactor = attenuation[i].x + (attenuation[i].y * distance[i]) + (attenuation[i].z * distance[i] * distance[i]);
    totalDiffuse = totalDiffuse + (lightColor[i] / attFactor);
}
totalDiffuse = max(totalDiffuse, 0.2);
out_Color = texture(textureSampler, pass_textureCoords) * vec4(color, alpha) * vec4(totalDiffuse, 1.0);
And here is an image of what a pure red light currently looks like on a surface. It should be inside the white circle, and you may be able to see it is affecting the water a little because I give the water a small red component:
Light Demo Image
One possibility would be to change the light calculation.
Calculate grayscale values of the light color and the surface color. Multiply the surface color by the grayscale of the light color, multiply the light color by the grayscale of the surface color, and finally sum the two products:
vec4 texCol = texture(textureSampler, pass_textureCoords);
float grayTex = dot(texCol.rgb, vec3(0.2126, 0.7152, 0.0722)); // grayscale of the surface color
float grayCol = dot(color.rgb, vec3(0.2126, 0.7152, 0.0722));  // grayscale of the light color
vec3 mixCol = texCol.rgb * grayCol + color.rgb * grayTex;
out_Color = vec4(mixCol * totalDiffuse, texCol.a * alpha);
Note, this algorithm emphasizes the color of the light at the expense of the color of the surface, but that is what you wanted by bathing a green area in red light. Of course, it contradicts the desire to illuminate an area in its own color: if the light is white, then the surface will also shine white.
If you want some light sources with the effect described above, but other sources with the original effect from the question, then I recommend introducing a parameter that mixes the two effects:
uniform float u_lightTint;

void main()
{
    .....
    vec3 mixCol = texCol.rgb * grayCol + color.rgb * grayTex;
    mixCol = mix(texCol.rgb * color.rgb, mixCol.rgb, u_lightTint);
    out_Color = vec4(mixCol * totalDiffuse, texCol.a * alpha);
}
If u_lightTint is set to 1.0, then the "new" light calculation is used; if it is set to 0.0, then the original light calculation is used. The two algorithms are interpolated linearly by u_lightTint.
Alternatively, the u_lightTint parameter can be encoded in the alpha channel of the light color:
mixCol = mix(texCol.rgb * color.rgb, mixCol.rgb, color.a);
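Note that this assumes the light color is then supplied as a vec4 whose alpha channel carries the per-light tint factor. Under that assumption, a sketch of how the main() above would change (names carried over from the answer):
uniform vec4 color; // rgb = light color, a = per-light tint factor (assumption)

void main()
{
    .....
    vec3 mixCol = texCol.rgb * grayCol + color.rgb * grayTex;
    mixCol = mix(texCol.rgb * color.rgb, mixCol.rgb, color.a);
    out_Color = vec4(mixCol * totalDiffuse, texCol.a * alpha);
}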

Passing scene information to the graphics card through a buffer to render reflections

I am trying to render reflections of objects in my scene onto another model in the same scene. Here is the bit of code in my fragment function that handles specular lighting. (I am doing this in Metal, but just a general explanation of how it's done would be helpful; I can apply it to my application from there.)
//Specular Color
float3 reflection = reflect(light.direction, unitNormal);
float specularFactor = pow(saturate(-dot(reflection, unitEye)), vIn.shininess);
float3 specularColor = float3(1.0, 0, 0) * vIn.specularIntensity * specularFactor;
color = color * float4(ambientColor + diffuseColor + specularColor,1);
unitEye is just the unit vector of the eye position (camera position). The line at the bottom just adds together my three colors (ambient, diffuse, and specular). From what I understand, my current specular lighting would be converted into my reflection lighting. For example, float3(1.0, 0, 0) would just be whatever color is being reflected onto my object. Here is an image of what I'm trying to accomplish for clarification.

How does the coordinate system work for 3D textures in OpenGL?

I am attempting to write to and read from a 3D texture, but it seems my mapping is wrong. I have used RenderDoc to check the textures and they look OK.
A random layer of this volumetric texture looks like:
So just some blue to denote absence and some green values to denote presence.
The coordinates I use when writing to each layer are calculated in the vertex shader as:
pos.x = (2.f*pos.x-width+2)/(width-2);
pos.y = (2.f*pos.y-depth+2)/(depth-2);
pos.z -= level;
pos.z *= 1.f/voxel_size;
gl_Position = pos;
Since the texture itself looks OK, it seems these coordinates achieve my goal.
It's important to note that right now voxel_size is 1 and the scale of the texture is supposed to be 1:1 with the scene dimensions. In essence, each texel in the texture represents a 1x1x1 voxel in the scene.
Next I attempt to fetch the texture values as follows:
vec3 pos = vertexPos;
pos.x = (2.f*pos.x-width+2)/(width-2);
pos.y = (2.f*pos.y-depth+2)/(depth-2);
pos.z *= 1.f/(4*16);
outColor = texture(voxel_map, pos);
Where vertexPos is the global vertex position in the scene. The z coordinate may be completely wrong, however (I am not sure whether I am supposed to normalize the depth component or not), but that is not the only issue. If you look at the final result:
There is a horizontal scale problem. Since each texel represents a voxel, the color of a cube should always be a fixed color. But as you can see, I am getting multiple colors for a single cube on the top faces, so my horizontal scale is wrong.
What am I doing wrong when fetching the texels from the texture?

How to render a radial field in OpenGL?

How would I render a 2D radial field in OpenGL? I know I can render it pixel by pixel, but I'm wondering if there are more efficient solutions. I don't mind if it requires OpenGL 3+ functionality.
How familiar are you with shaders? I'm thinking an easy-ish answer would be to render a quad and then write a fragment shader that colors the quad based on how far each pixel is from the center.
Pseudocode:
vertex shader:
vec2 center = vec2((x1+x2)/2, (y1+y2)/2); // pass this to the fragment shader
fragment shader:
float dist = distance(pos, center); // "pos" is the interpolated position of the fragment, passed in from the vertex shader
// Now that we have the distance between each fragment and the center, we can do all kinds of stuff:
gl_FragColor = vec4(1, 1, 1, dist); // assuming you're drawing a unit square, this makes each pixel's transparency vary smoothly from 1 (right next to the center) to 0 (at the edge of the square)
gl_FragColor = vec4(dist, dist, dist, 1.0); // vary each pixel's color from black at the center to white at the edge
// etc, etc
Let me know if you need more detail
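For what it's worth, here is one way the pseudocode might be fleshed out into complete GLSL. The quad spanning [-1, 1] and the uniform name center are assumptions, not part of the original answer:
vertex shader:
#version 330 core
layout(location = 0) in vec2 in_Position; // quad corners, e.g. spanning [-1, 1]
out vec2 pos;                             // interpolated position for the fragment shader

void main()
{
    pos = in_Position;
    gl_Position = vec4(in_Position, 0.0, 1.0);
}
fragment shader:
#version 330 core
in vec2 pos;
out vec4 out_Color;
uniform vec2 center = vec2(0.0, 0.0); // center of the radial field (assumed)

void main()
{
    float dist = distance(pos, center);
    out_Color = vec4(vec3(dist), 1.0); // black at the center, brighter with distance
}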

Blending lightmap with diffuse texture

I'm using C++, OpenGL 4.0 and the GLSL shading language.
I'm wondering how to correctly blend a diffuse texture with a lightmap texture.
Let's assume that we have a room. Every object has a diffuse texture and a lightmap. On every forum like gamedev.net or Stack Overflow, people say that those textures should be multiplied, and in most cases that gives good results. But sometimes an object is very close to a light source (for example, a white bulb). For close objects this light source generates a white lightmap. And when we multiply the diffuse texture with a white lightmap, we just get the original diffuse texture color back.
But if a light source is close to some object, then the color of the light should be dominant.
That means that if a strong white light is close to a red wall, then some part of this wall should be white, not red!
I think I need something more than just one lightmap. A lightmap doesn't carry information about light intensity, which means that the brightest possible color is just the unmodified diffuse color.
Maybe I should have two textures - a shadowmap and a lightmap? Then the equation would look like this:
vec3 color = shadowmapColor * diffuseTextureColor + lightmapColor;
Is it good approach?
Generally speaking, if you're still using lightmaps, you are probably also not using HDR rendering. And without that, what you want is not particularly reasonable. Unless your light map provides the light intensity as an HDR floating-point value (perhaps in a GL_R11F_G11F_B10F or GL_RGBA16F format), this is not going to work very well.
And of course, you'll have to do the usual stuff that you do with HDR, such as tone mapping and so forth.
Lastly, your additive equation makes no sense. If the lightmap color represents the diffuse interaction between the light and the surface, then simply adding the lightmap color doesn't mean anything. The standard diffuse lighting equation is C * (dot(N, L) * I * D), where I is the light intensity, D is the distance attenuation factor, and C is the diffuse color. The value from the lightmap is presumably the parenthesized quantity, so adding it doesn't make sense.
It still needs to be multiplied with the surface's diffuse color. Any over-brightening will be due to the effective intensity of the light as a function of D.
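In shader terms, the multiply this describes looks like the following; the sampler and coordinate names are assumptions for illustration, not part of the original:
// The lightmap texel is presumed to hold dot(N, L) * I * D for its light(s).
vec3 C        = texture(DiffuseMap, uv).rgb;       // surface diffuse color
vec3 lightVal = texture(LightMap, lightmapUV).rgb; // precomputed light contribution
vec3 color    = C * lightVal;                      // scale the surface color - don't add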
What you need is the distance (or, to save some sqrt-ing, the squared distance) from the light source to the fragment being illuminated. Then you can, in the simplest case, interpolate linearly between the lightmap and light source contributions:
The distance is a simple calculation which can be done per vertex in your vertex shader:
in vec4 VertexPosition;     // let's assume world space for simplicity
uniform vec4 LightPosition; // world space - might also be part of a uniform block etc.
out float LightDistance;    // pass the distance to the fragment shader

// other stuff you need here ....

void main()
{
    // do stuff
    LightDistance = length(VertexPosition - LightPosition);
}
In your fragment shader, you use the distance to compute an interpolation factor between the light source and lightmap contributions:
in float LightDistance;
const float MAX_DISTANCE = 10.0;
uniform sampler2D LightMap;
// other stuff ...
out vec4 FragColor;

void main()
{
    vec4 LightContribution;
    // calculate illumination (including shadow map evaluation) here
    // and store it in LightContribution
    vec4 LightMapContribution = texture(LightMap, /* tex coords here */);
    // The following DistanceFactor maps distances in the range [0, MAX_DISTANCE] to
    // [0, 1]. The idea is that at LightDistance >= MAX_DISTANCE, the light source
    // doesn't contribute anymore.
    float DistanceFactor = min(1.0, LightDistance / MAX_DISTANCE);
    // linearly interpolate between LightContribution and LightMapContribution
    vec4 FinalContribution = mix(LightContribution, LightMapContribution, DistanceFactor);
    FragColor = WhatEverColor * vec4(FinalContribution.xyz, 1.0);
}
HTH.
EDIT: To factor in Nicol Bolas' remarks, I assume that the LightMap stores the contribution encoded as an RGB color, with a value per channel. If you actually have a single-channel lightmap which only stores monochromatic contributions, you'll have to either use the surface color, use the color of the light source, or reduce the light source contribution to a single channel.
EDIT2: Although this works mathematically, it's definitely not physically sound. You might need some correction of the final contribution to make it at least physically plausible. If you're only aiming for effect, you can simply play around with correction factors until you're satisfied with the result.