Black Planes in Blender - opengl

I'm a Blender noob trying to render an object, but certain parts of it keep coming out black and I'm not sure why. My image is here, which might help:
A few extra details:
The shelves I've been trying to render are just a collection of planes which I've aligned to form shelves
None of them have a material or texture on them
Despite this, the very furthest plane (the back of the shelves) and the front one (which says "laundry") render properly in white, but the others render black
I've tried adding material/texture to all of them, but they still come out black
My light source is set to "Sun" and sits a little bit behind the camera and shines directly onto the shelves
Is this a lighting issue? If so, any suggestions for how to fix it?

Your "normals" are flipped. In other words, each surface has a "front" and "back" side. The backs of yours are facing in (which makes sense, when you consider the method you created the volume...you're looking at the inside surfaces).
To fix your issue, select the dark faces and do Mesh -> Normals -> Flip.

It looks like they are not getting any light on their faces because of their angle relative to the light source. Try adding a point light just in front of your shelf objects to see if that changes the outcome.

Flipping the normals worked for me, but only after switching from Blender Render to Cycles. Hope that helps someone else.

Related

Algorithm to correct wrong normal direction on 3D model

I am new to 3D math, but I am facing a problem: when I import a model into a CAD program (single-sided lighting / OpenSceneGraph), a lot of the mesh faces render black!
When I flip the normals manually for each of those faces, I get the correct material and texture.
My question is: if I know the vertex table and normal table for every mesh in the model, can I write an algorithm that corrects all the wrong normal directions automatically? I mean, it must detect the wrong normals without any help from the user and correct them.
I thought about an idea that needs image processing. I know nothing about image processing, so perhaps you can help me with where to start to achieve this:
First, I will assume every black face has a wrong normal.
Second, I will direct a light from the camera to the mesh, and if all of a face's pixels are black, then flip its normal.
And do this for all meshes.
I think I will have a speed issue, but that's all I have come up with.
Thanks.
The red plane and the black one are in the same model, and both of them should be red, but as I mentioned, the black one's normals are flipped.
OSG has a visitor class in osgUtil that can calculate all the vertex normals for you, called SmoothingVisitor. You just need to point it at your model after you've loaded it, e.g.:
#include <osgUtil/SmoothingVisitor>

// recalculate vertex normals with the smoothing visitor
osgUtil::SmoothingVisitor sv;
loadedModel->accept(sv);
This will clobber existing vertex normals, or create new ones if they don't exist. Just looking quickly at it, it looks like it assumes your geometry is triangles, not quads as you've drawn, but I could be wrong.
I suspect something is up with the indexing; if not, then it is probably better to ignore the normals table entirely. I'm not really sure about the details, but I assume it is one normal per vertex, so the approach would involve calculating polygon normals, which you can do with:
normalize(cross_product(vert0 - vert1, vert0 - vert2));
After that, averaging the normals of the polygons sharing each vertex should do.
Calculating the dot product of the freshly calculated normal and the original normal would reveal whether the original normal is way off.
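A minimal C++ sketch of that approach, assuming an indexed triangle list and GLM for the vector math; fixNormals and its parameters are made up for illustration:

#include <glm/glm.hpp>
#include <cstdint>
#include <vector>

// Recompute a face normal per triangle, accumulate it onto each of the
// triangle's vertices, then flip stored normals that disagree with the
// recomputed direction.
void fixNormals(const std::vector<glm::vec3>& verts,
                const std::vector<uint32_t>& indices,  // 3 per triangle
                std::vector<glm::vec3>& normals)       // 1 per vertex, in/out
{
    std::vector<glm::vec3> accum(verts.size(), glm::vec3(0.0f));

    for (size_t i = 0; i + 2 < indices.size(); i += 3) {
        uint32_t a = indices[i], b = indices[i + 1], c = indices[i + 2];
        // Unnormalized cross product: larger triangles weight the average more.
        glm::vec3 faceN = glm::cross(verts[b] - verts[a], verts[c] - verts[a]);
        accum[a] += faceN;
        accum[b] += faceN;
        accum[c] += faceN;
    }

    for (size_t v = 0; v < normals.size(); ++v) {
        if (glm::length(accum[v]) == 0.0f) continue;  // isolated vertex
        glm::vec3 recomputed = glm::normalize(accum[v]);
        // A negative dot product means the stored normal points the wrong way.
        if (glm::dot(recomputed, normals[v]) < 0.0f)
            normals[v] = -normals[v];
    }
}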
Anyway, Stack Overflow isn't really the right site for your problem; there are other Stack Exchange sites that specialize in this kind of question. There is also too little information about the issue, so it isn't easy to help.

Rendering Point Sprites across cameras in cube maps

I'm rendering a particle system of vertices, which are tessellated into quads in a geometry shader, textured/rendered as point sprites, and then scaled in size depending on how far away they are from the camera. I'm trying to render out every frame of my scene into cube maps, so essentially I place six cameras in my scene, point one along each cube-face direction, and save an image per face.
My point sprites are of varying sizes. When they near the border of one camera's view, they can (if they are large enough) appear in two cameras simultaneously. Since point sprites always face the camera, they are not continuous along the seam when I wrap my cube map back into 3D space. This is especially noticeable when the points are close to the camera, as they are larger and stretch further into both camera views. I'm also doing some alpha blending, so this may be contributing to the problem as well.
I don't think I can just cull points near the edge of each camera's view, because when I put everything back into 3D I'd expect strange areas where the cloud is more sparsely populated. Another thought I had was to blur the edges of each view, but I think this too would give me a weird blurry zone back in 3D space. I feel like I could manually edit the frames in Photoshop so they look OK, but that would be kind of a pain since it's an animation at 30fps.
The image attached is a detail from the cube map. You can see the horizontal seam where the sprites are not lining up correctly, and a slightly less noticeable vertical one on the right side of the image. I'm sure that my camera settings are correct, because I've used this same camera setup in other scenes and my cubemaps look fine.
Anyone have ideas?
I'm doing this in openFrameworks / openGL fwiw.
Instead of facing each camera, make them face the shared origin of the cameras? I'm not sure this fixes everything, but intuitively I'd say it should look close to OK. Maybe this is already what you do; I have no idea.
(I'd like for this to be a comment, but no reputation)
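A rough C++ sketch of that idea, assuming GLM for the vector math (openFrameworks bundles it); makeSpriteQuad and its parameters are made up for illustration. The key point is that the quad's orientation depends only on the shared cube-map origin, so the sprite looks identical in every face's render:

#include <glm/glm.hpp>
#include <cmath>

struct Quad { glm::vec3 corner[4]; };

// Build a sprite quad that faces the shared origin of the six cube-map
// cameras instead of any single camera's view plane.
Quad makeSpriteQuad(const glm::vec3& particlePos,
                    const glm::vec3& cubeOrigin,
                    float halfSize)
{
    // Normal runs from the cube-map origin out to the particle.
    glm::vec3 normal = glm::normalize(particlePos - cubeOrigin);

    // Pick a helper axis that is not parallel to the normal, to avoid a
    // degenerate cross product, then build an orthonormal basis.
    glm::vec3 helper = std::abs(normal.y) < 0.99f ? glm::vec3(0, 1, 0)
                                                  : glm::vec3(1, 0, 0);
    glm::vec3 right = glm::normalize(glm::cross(helper, normal));
    glm::vec3 up    = glm::cross(normal, right);

    Quad q;
    q.corner[0] = particlePos + (-right - up) * halfSize;
    q.corner[1] = particlePos + ( right - up) * halfSize;
    q.corner[2] = particlePos + ( right + up) * halfSize;
    q.corner[3] = particlePos + (-right + up) * halfSize;
    return q;
}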

"Weird" Artifacts With Blending

This is what I'm seeing:
To provide some perspective for the image, the torus is behind the model. The model is transparent. Those lines appear on the model. I don't want those lines to appear.
Can someone explain what I'm seeing? I don't know what to search for. I tried:
Weird lines
Line artifacts
Artifacts
etc. etc. but I could find nothing relevant. I understand that my question is vague, but, if someone could name my problem, I think I can identify the problematic code!
If you render transparencies, you need to keep a few things in mind.
Normally you render in OpenGL with z-buffer testing and writing enabled.
When a face is rendered, OpenGL determines which of its pixels are visible by testing them against the z-buffer. If a pixel is visible, it is drawn with the current blending settings and its z value is written into the z-buffer; if not, it is discarded.
If you don't render your faces in the correct z-order (back to front, as seen along the view direction), they are simply rendered in the order they arrive in the pipeline.
The artifacts appear when, for some areas, the pixels of the back faces are rendered before the overlapping pixels of the front faces, while for other areas the pixels of the front faces are rendered before those of the back faces. So for some areas of your object you get a blend of background - backface - frontface, and for other areas you get only background - frontface.
I know that explanation is not perfectly accurate, but I hope you get what I mean. Otherwise, feel free to ask.
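A sketch of what the correct ordering looks like in OpenGL state terms; the two draw functions are hypothetical placeholders, and disabling depth writes during the transparent pass is the usual companion to the back-to-front sort:

#include <GL/gl.h>

void drawOpaqueObjects();                 // placeholder
void drawTransparentObjectsBackToFront(); // placeholder

void drawFrame()
{
    glEnable(GL_DEPTH_TEST);

    // Pass 1: opaque geometry with depth testing and depth writes enabled.
    glDepthMask(GL_TRUE);
    glDisable(GL_BLEND);
    drawOpaqueObjects();

    // Pass 2: transparent geometry, sorted back to front. Keep the depth
    // test on so opaque surfaces still occlude it, but turn depth writes
    // off so transparent faces don't hide each other.
    glDepthMask(GL_FALSE);
    glEnable(GL_BLEND);
    glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
    drawTransparentObjectsBackToFront();

    glDepthMask(GL_TRUE);  // restore depth writes for the next frame's clear
}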

Texture Transparency with Unreal (See through walls)

We are working on porting some software from Windows to MacOS.
When we bring up a texture with an alpha channel, pixels that are fully opaque work as expected, and pixels that are fully transparent work as expected (you can see the wall behind).
However, pixels that are semi-transparent (opacity >0% and <100%) render poorly: you can see through the wall behind, so the skybox shows through both the texture and the wall behind it.
I know you will likely need more information, and I will be happy to provide it. I am not looking for a quick fix; I really have just run out of ideas and need someone else to take a guess at what's wrong.
I will post the solution, and the correct answer goes to whoever pointed me that way.
The texture is not placed directly on the wall; it is placed on a static mesh close to the wall.
(Unable to post images as this is my first question here)
You are sorting transparent objects by depth, yes? I gather from your question that the answer is no.
You cannot just render transparent objects the way you do opaque ones. Your renderer is just a fancy triangle drawer. As such, it has no real concept of objects, or even transparency. You achieve transparency by blending the transparent pixels with whatever happens to be in the framebuffer at the time you draw the transparent triangle.
It simply cannot know what it is you intend to draw behind the triangle later. Therefore, the general method for transparent objects is to:
Render all opaque objects first.
Render transparent objects sorted back-to-front. Also, turn off depth writes (depth tests are fine).
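Since that recipe is engine-agnostic, here's a minimal C++ sketch of what "sorted back-to-front" means in practice; TransparentObject and sortBackToFront are hypothetical names:

#include <glm/glm.hpp>
#include <algorithm>
#include <vector>

struct TransparentObject {
    glm::vec3 position;   // e.g. the object's center
    // ... mesh, material, etc.
};

// Order transparent objects by distance from the camera, farthest first,
// so each one blends over everything already drawn behind it.
void sortBackToFront(std::vector<TransparentObject>& objects,
                     const glm::vec3& cameraPos)
{
    std::sort(objects.begin(), objects.end(),
              [&](const TransparentObject& a, const TransparentObject& b) {
                  glm::vec3 da = a.position - cameraPos;
                  glm::vec3 db = b.position - cameraPos;
                  // Compare squared distances; no sqrt needed for ordering.
                  return glm::dot(da, da) > glm::dot(db, db);
              });
}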
This might not be an answer, but it might be useful.
What about making an object with a transparent texture applied in Maya/3ds Max, exporting it as FBX, and importing it into Unreal?

OpenGL Spotlight shining through from rear-face

I have a Spotlight source in OpenGL, pointing towards a texture mapped sphere.
I rotate the light source with the sphere, such that if I rotate the sphere to the 'non-light' side, that side should be dark.
The odd part is that the spotlight seems to be shining through my sphere (it's solid, with no gaps between triangles). The light seems to be 'leaking' through to the other side.
Any thoughts on why this is happening?
Screenshots:
Front view, low light to emphasize the problem
Back view, notice the round area that is 'shining through'
It's really hard to tell from the images, but:
Check if GL_LIGHT_MODEL_TWO_SIDE is being set (two-sided lighting), but more importantly, have a look at the normals of the sphere you are rendering.
Edit: Also, change the background colour to something lighter. Oh, and make sure you aren't rendering with alpha blending turned on (maybe it's a polygon sorting issue).
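For reference, a sketch of how that flag is toggled in fixed-function OpenGL; whether (and where) it's being set is the thing to check:

#include <GL/gl.h>

// With GL_LIGHT_MODEL_TWO_SIDE enabled, back faces are lit as well, which
// can look like light leaking through closed geometry.
void setupLightingModel()
{
    glLightModeli(GL_LIGHT_MODEL_TWO_SIDE, GL_FALSE);  // GL_TRUE lights back faces
}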
OK, I'm a noob - I was specifying my normals, but not calling glEnableClientState(GL_NORMAL_ARRAY). Hence all normals were facing one direction (I think that's the default, no?).
Anyway - a lesson learned - always go back over the basics.
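For anyone hitting the same thing, a sketch of the client-state setup the fix amounts to; drawSphere and its parameters are hypothetical:

#include <GL/gl.h>

void drawSphere(const float* vertices, const float* normals, int vertexCount)
{
    glEnableClientState(GL_VERTEX_ARRAY);
    glEnableClientState(GL_NORMAL_ARRAY);   // the call that was missing

    glVertexPointer(3, GL_FLOAT, 0, vertices);
    glNormalPointer(GL_FLOAT, 0, normals);  // ignored unless GL_NORMAL_ARRAY is enabled

    glDrawArrays(GL_TRIANGLES, 0, vertexCount);

    glDisableClientState(GL_NORMAL_ARRAY);
    glDisableClientState(GL_VERTEX_ARRAY);
}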