How to clip only one object (not the whole scene) using glClipPlane? - opengl

The question is as in the title. If glClipPlane is not the right tool, please suggest another way to clip a single 3D object (such as a sphere or torus).
Thanks in advance!

For glClipPlane to take effect, you have to enable the corresponding clip plane with glEnable().
If you want to clip just one object, your code should look like this:
//draw some stuff
GLdouble planeEq[4] = { 0.0, 1.0, 0.0, 0.0 }; //plane equation Ax + By + Cz + D = 0
glClipPlane(GL_CLIP_PLANE0, planeEq);
glEnable(GL_CLIP_PLANE0);
//draw clipped object
glDisable(GL_CLIP_PLANE0);
//draw some stuff
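For a fuller picture, here is a minimal self-contained sketch, assuming GLUT is available; the torus/teapot objects and the plane equation are arbitrary choices, not from the question. Only the torus is clipped, against the plane y = 0:
#include <GL/glut.h>

//Minimal sketch: only the torus is clipped, the teapot is drawn whole.
void display()
{
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    glMatrixMode(GL_MODELVIEW);
    glLoadIdentity();
    glTranslatef(0.0f, 0.0f, -5.0f);

    //Plane equation Ax + By + Cz + D = 0; points on the negative side are
    //discarded, so this keeps the half-space y >= 0.
    const GLdouble keepUpperHalf[4] = { 0.0, 1.0, 0.0, 0.0 };
    glClipPlane(GL_CLIP_PLANE0, keepUpperHalf); //interpreted in the current modelview space
    glEnable(GL_CLIP_PLANE0);
    glutSolidTorus(0.3, 1.0, 32, 32);           //only this object is clipped
    glDisable(GL_CLIP_PLANE0);

    glutSolidTeapot(0.8);                       //drawn unclipped
    glutSwapBuffers();
}

int main(int argc, char** argv)
{
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB | GLUT_DEPTH);
    glutCreateWindow("clip one object");
    glEnable(GL_DEPTH_TEST);
    glMatrixMode(GL_PROJECTION);
    gluPerspective(60.0, 1.0, 0.1, 100.0);
    glutDisplayFunc(display);
    glutMainLoop();
    return 0;
}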

Related

Drawing a 3d object on a 2D screen plane without angle of view

I have a problem: I need to draw a 3D object in such a way that I can move it along the screen plane and rotate it, while the viewing angle stays as if I were always looking at it head-on from one fixed point.
I use the GLM library for working with matrices. I tried to use glm::ortho, but I cannot work with the z coordinate that way, so the model does not rotate properly. If I use glm::perspective, the model looks the way I need only in the center of the screen. What I mean is, for example, a character model shown in a window of a game: no matter where you move that window on the screen, you see the model straight on, not as if you were looking at it out of the corner of your eye.
I apologize, I do not know how to explain it in plain language; I hope it is understandable.
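For reference, a minimal sketch (my own, not from the question) of the orthographic setup being described, assuming GLM; the function name, screen-size parameters, and near/far values are placeholders. The key point is that glm::ortho still needs a real zNear/zFar range so geometry rotated out of the screen plane is not clipped away:
#include <glm/glm.hpp>
#include <glm/gtc/matrix_transform.hpp>

//Sketch only: an orthographic projection with a non-degenerate depth range keeps
//the object looking the same wherever it sits on screen, while still letting it
//rotate in 3D.
glm::mat4 makeOrthoMVP(float screenW, float screenH,
                       const glm::vec2& screenPos, float angle)
{
    //The depth range must bracket the object, otherwise rotated geometry is clipped.
    glm::mat4 proj = glm::ortho(0.0f, screenW, screenH, 0.0f, -100.0f, 100.0f);

    glm::mat4 model = glm::translate(glm::mat4(1.0f), glm::vec3(screenPos, 0.0f));
    model = glm::rotate(model, angle, glm::vec3(0.0f, 1.0f, 0.0f));

    return proj * model; //with glm::perspective the object would skew away from
                         //the screen center; ortho keeps it head-on everywhere
}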

draw texture with polygon shape

I am trying to draw a texture clipped to a polygon shape. What is the logic, and how can I draw a texture inside a dynamically defined polygon?
I am working on cutting a rectangle, and I am going to apply the texture to the sliced shape (it may be any shape).
Can anyone assist me?
I don't think you can achieve this using only cocos2d classes; you need to draw with OpenGL directly. You already have the polygon vertices, so triangulate your polygon (here is an example) and draw the resulting triangles with your texture set.
You can also use LevelHelper (unfortunately not free :( ), which has a built-in cutting engine: link
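To illustrate that approach, here is a minimal sketch (not from the answer) that draws an already-triangulated polygon with a bound texture using client-side vertex arrays; the TexturedVertex struct, the precomputed texture coordinates, and the function name are assumptions:
#include <GL/gl.h> //on iOS/cocos2d this would be the OpenGL ES 1.1 header instead
#include <vector>

struct TexturedVertex
{
    float x, y;  //polygon-space position
    float u, v;  //texture coordinate
};

//Sketch: draw an already-triangulated polygon (3 vertices per triangle) with a
//bound texture. Texture coordinates are assumed precomputed, e.g. by scaling
//each vertex position into [0,1] over the original rectangle.
void drawTexturedPolygon(const std::vector<TexturedVertex>& triangleVerts, GLuint textureId)
{
    glEnable(GL_TEXTURE_2D);
    glBindTexture(GL_TEXTURE_2D, textureId);

    glEnableClientState(GL_VERTEX_ARRAY);
    glEnableClientState(GL_TEXTURE_COORD_ARRAY);

    glVertexPointer(2, GL_FLOAT, sizeof(TexturedVertex), &triangleVerts[0].x);
    glTexCoordPointer(2, GL_FLOAT, sizeof(TexturedVertex), &triangleVerts[0].u);

    glDrawArrays(GL_TRIANGLES, 0, (GLsizei)triangleVerts.size());

    glDisableClientState(GL_TEXTURE_COORD_ARRAY);
    glDisableClientState(GL_VERTEX_ARRAY);
}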

Rotating Sprite Object Directx9

I am making a little game in C++ with DirectX 9. I read some tutorials and I can draw my sprite object, move it, etc. But now I want to rotate it. I tried:
void D3DGraphics::DrawSprite(LPDIRECT3DTEXTURE9 &texture, ID3DXSprite* pSprite, D3DXVECTOR3* pos, D3DXVECTOR3* dim)
{
    pDevice->Clear(0, NULL, D3DCLEAR_STENCIL, D3DCOLOR_XRGB(0, 0, 0), 0.0f, 0);
    D3DXMATRIX matrix;
    D3DXMatrixRotationX(&matrix, 0.05f);
    pSprite->Begin(D3DXSPRITE_ALPHABLEND);
    hresult = pSprite->SetTransform(&matrix);
    hresult = pSprite->Draw(texture, NULL, dim, pos, 0xFFFFFFFF);
    pSprite->End();
}
When I remove the SetTransform part, it works perfectly. I checked the HRESULTs and they all returned S_OK. Any idea?
I think the main problem is that you are using a 3D transformation for a 2D use case. Try D3DXMatrixTransformation2D (doc) to build the matrix instead. There are additional problems as well: I presume the name dim stands for dimension, but that parameter of Draw() (doc) sets the center of the sprite, not its size. Finally, a full-screen clear of your stencil buffer without ever using it is unnecessarily expensive, though I don't know the rest of your code.
Hope that helps :)
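To make that concrete, here is a small sketch of building a 2D transform with D3DXMatrixTransformation2D that rotates a sprite around its own center; the helper name and the center/position parameters are illustrative, not from the question:
#include <d3dx9.h>

//Sketch: rotate a sprite about its own center in screen space.
//spriteCenter, position, and angle are placeholder values.
void SetSpriteRotation(ID3DXSprite* pSprite, float angle,
                       const D3DXVECTOR2& spriteCenter,
                       const D3DXVECTOR2& position)
{
    D3DXMATRIX transform;
    D3DXMatrixTransformation2D(&transform,
                               NULL, 0.0f, NULL,  //no scaling
                               &spriteCenter,     //rotate around the sprite's center
                               angle,             //rotation in radians
                               &position);        //final translation on screen
    pSprite->SetTransform(&transform);            //call before Draw()
}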
I think it does not make much sense to rotate the sprite around the X axis: a sprite normally comes with a 2D texture and is used for billboarding, so rotating it around the X axis just flips the picture over, which is not the effect you want. I have tried rotating it around the Z axis instead, and that works well, since the texture usually lies in the XY plane.
If you want to make a spinning ball, why not use a 3D mesh? A sprite is not a good choice for that.

3D spheres and adding textures in OpenGL

I have been asked to make a 3D sphere and add textures to it so that it looks like the different planets in the Solar System. However, 3ds Max was not mentioned as mandatory.
So, how can I make 3D spheres using OpenGL and add textures to them? Using glutsphere, or am I supposed to use some other method? And how do I do the textures?
The obvious route would be gluSphere (note, it's glu, not glut) with gluQuadricTexture to get the texturing done.
I am not sure if glutSolidSphere has texture coordinates (as far as I can remember they were not correct, or not existent). I remember that this was a great resource to get me started on the subject though:
http://paulbourke.net/texture_colour/texturemap/
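For reference, a minimal sketch of the gluSphere route, assuming the planet texture has already been loaded and bound and GL_TEXTURE_2D is enabled; the radius and tessellation values are arbitrary:
#include <GL/glu.h>

//Sketch: draw a textured sphere with a GLU quadric. Assumes GL_TEXTURE_2D is
//enabled and the planet texture is already bound.
void drawTexturedSphere(double radius)
{
    GLUquadric* quad = gluNewQuadric();
    gluQuadricNormals(quad, GLU_SMOOTH); //per-vertex normals for lighting
    gluQuadricTexture(quad, GL_TRUE);    //generate texture coordinates
    gluSphere(quad, radius, 64, 64);     //64 slices, 64 stacks
    gluDeleteQuadric(quad);
}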
EDIT:
I just remembered that subdividing an icosahedron gives a better sphere. Also texture coordinates are easier to implement that way:
see here:
http://www.gamedev.net/topic/116312-request-for-help-texture-mapping-a-subdivided-icosahedron/
and
http://www.sulaco.co.za/drawing_icosahedron_tutorial.htm
and
http://student.ulb.ac.be/~claugero/sphere/

Per-model local rotation breaks openGL Lighting

I'm having trouble with OpenGL lighting. My issue is this: when the object has zero rotation, the lighting is fine; otherwise the lighting still works, but it rotates with the object instead of staying fixed with respect to the scene.
Sounds simple, right? The OpenGL FAQ has some simple advice on this: coordinates passed to glLightfv(GL_LIGHT0, GL_POSITION...) are multiplied by the current MODELVIEW matrix. So I must be calling this at the wrong place... except I'm not. I've copied the MODELVIEW matrix into a variable to debug, and it stays the same regardless of how my object is rotated. So it has to be something else, but I'm at a loss as to what.
I draw the model using glDrawArrays, and position my model within the world using glMultMatrixf on a matrix built from a rotation quaternion and a translation. All of this takes place within glPushMatrix/glPopMatrix, so it shouldn't have any side effect on the light.
A cut down version of my rendering process looks like this:
//Setup our camera
glMatrixMode(GL_MODELVIEW);
glLoadIdentity();
cameraMatrix = translate(Vector3D(Pos.mX,Pos.mY,Pos.mZ)) * camRot.QuatToMatrix();
glMultMatrixf((GLfloat*)&cameraMatrix);
//Position the light now
glEnable(GL_LIGHTING);
glEnable(GL_LIGHT0);
GLfloat lp[4] = {lightPos.mX, lightPos.mY, lightPos.mZ, 1.0f};
glLightfv(GL_LIGHT0, GL_POSITION,(GLfloat*) lp);
//Loop, doing this for each model: (mRot, mPos, and mi are model member variables)
matrix = translate(Vector3D(mPos.mX,mPos.mY,mPos.mZ)) * mRot.QuatToMatrix();
glPushMatrix();
glMultMatrixf((GLfloat*)&matrix);
glBindBuffer(GL_ARRAY_BUFFER, mi->mVertexBufHandle); //Bind the model VBO.
glDrawArrays(GL_TRIANGLES, 0, mi->verts); //Draw the object
glPopMatrix();
I thought the normals might be messed up, but when I render them out they look fine. Is there anything else that might affect OpenGL lighting? The FAQ mentions:
If your light source is part of a light fixture, you also may need to specify a modeling transform, so the light position is in the same location as the surrounding fixture geometry.
I took this to mean that you'd need to translate the light into the scene, kind of a no-brainer... but does it mean something else?
It might be minor, but in this line:
glLightfv(GL_LIGHT0, GL_POSITION,(GLfloat*) &lp);
remove the & (address-of operator); lp already decays to the address of the array.
This was a while back, but I did eventually figure out the problem. The issue I thought I was having was that the light's position was being transformed incorrectly. Picture this: the light is located at (0,0,0), but then I translate and rotate my mesh. If that had been the case, I'd have to do as suggested in the other answers and make certain my glLightfv calls were placed at the right point in the frame.
The actual problem turned out to be much simpler, yet much more insidious. It turns out I wasn't setting up glNormalPointer correctly, so it was being fed garbage data. While debugging, I'd render the normals to check that they were correct, but when doing so I'd manually draw them based on the positions I'd calculated. A recommendation to future debuggers: when drawing your debug normal rays, make sure you feed the debug function the same data that OpenGL gets. In my case, this would mean pointing my normal-ray draw function's glVertexPointer at the same place as the model's glNormalPointer.
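To illustrate the kind of fix being described, here is a hedged sketch of keeping glVertexPointer and glNormalPointer consistent over one interleaved VBO; the Vertex layout, handle names, and function name are assumptions, not the poster's actual code:
#include <GL/gl.h>
#include <cstddef>

//Assumed interleaved layout: position followed by normal, tightly packed.
struct Vertex
{
    float px, py, pz; //position
    float nx, ny, nz; //normal
};

//Sketch: both pointers must describe the same buffer layout, otherwise the
//fixed-function lighting is fed garbage normals (the bug described above).
void bindMeshForDrawing(GLuint vbo, int vertexCount)
{
    glBindBuffer(GL_ARRAY_BUFFER, vbo); //assumes GL 1.5+ / a loader, as in the question's code

    glEnableClientState(GL_VERTEX_ARRAY);
    glVertexPointer(3, GL_FLOAT, sizeof(Vertex),
                    (const void*)offsetof(Vertex, px));

    glEnableClientState(GL_NORMAL_ARRAY);
    glNormalPointer(GL_FLOAT, sizeof(Vertex),           //unlike glVertexPointer there is no
                    (const void*)offsetof(Vertex, nx)); //size argument: normals are always 3 floats

    glDrawArrays(GL_TRIANGLES, 0, vertexCount);
}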
Basically, an OpenGL light position behaves like a vertex. So in your code it's transformed by cameraMatrix, while your meshes are transformed by cameraMatrix * matrix. Now, it looks like both cameraMatrix and matrix contain mRot.QuatToMatrix(), that is: there is a single rotation matrix there, and the light gets rotated once while the objects get rotated twice. It doesn't look right to me, unless your actual code is different; the mRot matrix you use for each mesh should be its own, e.g. mRot[meshIndex].