Object rotation using keys - c++

I want to rotate a cube using keys. This is part of the code. When I press the LEFT key, the cube rotates left, and so on. My goal is to rotate the cube all the way around, so I have to rotate it about both the X and Y axes, which is what causes the problem.
I have defined mat4 rotation; and use it to assign a rotation when I press and hold a key. While I hold the key, the cube rotates, for example to the left. Then I release the key and the object snaps back to its initial position (the camera appears to reset, since the object itself is not moving). I think this problem is caused by the auto rotateMat = rotation; line, which is defined below the key functions.
What am I doing wrong?
mat4 rotation; //global
if(keysPressed[GLFW_KEY_LEFT]){
    timer -= delta;
    rotation = rotate(mat4{}, timer * 0.5f, {0, 1, 0});
}
if(keysPressed[GLFW_KEY_RIGHT]){
    timer += delta;
    rotation = rotate(mat4{}, timer * 0.5f, {0, 1, 0});
}
if(keysPressed[GLFW_KEY_UP]){
    timer += delta;
    rotation = rotate(mat4{}, timer * 0.5f, {1, 0, 0});
}
if(keysPressed[GLFW_KEY_DOWN]){
    timer -= delta;
    rotation = rotate(mat4{}, timer * 0.5f, {1, 0, 0});
}
...
program.setUniform("ModelMatrix", rotation * cubeMat);
cube.render();
UPDATE:
The problem was solved by making the matrix a global variable instead of a local one.

There are multiple ways such an interaction can be implemented. One of the simpler ones is to compute a relative rotation in every frame instead of a global one and compose it with the current rotation:
For this, one has to store the accumulated rotation in a global variable:
//Global variable
mat4 total_rotate;
And calculate the relative rotation in every frame:
//In the function
mat4 rotation;
if(keysPressed[GLFW_KEY_LEFT]){
    rotation = rotate(mat4{}, delta, {0, 1, 0});
}
if(keysPressed[GLFW_KEY_RIGHT]){
    rotation = rotate(mat4{}, -delta, {0, 1, 0});
}
if(keysPressed[GLFW_KEY_UP]){
    rotation = rotate(mat4{}, delta, {1, 0, 0});
}
if(keysPressed[GLFW_KEY_DOWN]){
    rotation = rotate(mat4{}, -delta, {1, 0, 0});
}
total_rotate = total_rotate * rotation;
...
program.setUniform("ModelMatrix", total_rotate * cubeMat);
cube.render();
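A side note, not from the original answer: with column-vector matrices like GLM's, the side on which the per-frame rotation is multiplied determines the frame it is applied in.
//Post-multiplying applies the per-frame rotation about the cube's current local axes:
total_rotate = total_rotate * rotation;
//Pre-multiplying would instead apply it about the fixed world axes:
//total_rotate = rotation * total_rotate;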
As an alternative, you could store the two rotation angles instead and calculate the matrix in every frame:
//Global variables
float rot_x = 0.0f, rot_y = 0.0f;
//In every frame
if(keysPressed[GLFW_KEY_LEFT]){
    rot_y += delta;
}
if(keysPressed[GLFW_KEY_RIGHT]){
    rot_y -= delta;
}
//Same for rot_x with UP and DOWN
auto rotation = rotate(rotate(mat4{}, rot_y, {0, 1, 0}), rot_x, {1, 0, 0});
...
program.setUniform("ModelMatrix", rotation * cubeMat);
cube.render();

Related

Incorrect render of a cube mesh in DirectX 11

I am practicing DirectX 11 following Frank Luna's book.
I have implemented a demo that renders a cube, but the result is not correct.
https://i.imgur.com/2uSkEiq.gif
As I hope you can see from the image (I apologize for the low quality), it seems like the camera is "trapped" inside the cube even when I move it away. There is also a camera frustum clipping problem.
I think the problem is therefore in the definition of the projection matrix.
Here is the cube vertices definition.
std::vector<Vertex> vertices =
{
    {XMFLOAT3(-1, -1, -1), XMFLOAT4(1, 1, 1, 1)},
    {XMFLOAT3(-1, +1, -1), XMFLOAT4(0, 0, 0, 1)},
    {XMFLOAT3(+1, +1, -1), XMFLOAT4(1, 0, 0, 1)},
    {XMFLOAT3(+1, -1, -1), XMFLOAT4(0, 1, 0, 1)},
    {XMFLOAT3(-1, -1, +1), XMFLOAT4(0, 0, 1, 1)},
    {XMFLOAT3(-1, +1, +1), XMFLOAT4(1, 1, 0, 1)},
    {XMFLOAT3(+1, +1, +1), XMFLOAT4(0, 1, 1, 1)},
    {XMFLOAT3(+1, -1, +1), XMFLOAT4(1, 0, 1, 1)},
};
Here is how I calculate the view and projection matrices.
void TestApp::OnResize()
{
    D3DApp::OnResize();
    mProj = XMMatrixPerspectiveFovLH(XM_PIDIV4, AspectRatio(), 1, 1000);
}
void TestApp::UpdateScene(float dt)
{
    float x = mRadius * std::sin(mPhi) * std::cos(mTheta);
    float y = mRadius * std::cos(mPhi);
    float z = mRadius * std::sin(mPhi) * std::sin(mTheta);
    XMVECTOR EyePosition = XMVectorSet(x, y, z, 1);
    XMVECTOR FocusPosition = XMVectorZero();
    XMVECTOR UpDirection = XMVectorSet(0, 1, 0, 0);
    mView = XMMatrixLookAtLH(EyePosition, FocusPosition, UpDirection);
}
And here is how I update the camera position on mouse move.
glfwSetCursorPosCallback(mMainWindow, [](GLFWwindow* window, double xpos, double ypos)
{
    TestApp* app = reinterpret_cast<TestApp*>(glfwGetWindowUserPointer(window));
    if (glfwGetMouseButton(window, GLFW_MOUSE_BUTTON_LEFT) == GLFW_PRESS)
    {
        float dx = 0.25f * XMConvertToRadians(xpos - app->mLastMousePos.x);
        float dy = 0.25f * XMConvertToRadians(ypos - app->mLastMousePos.y);
        app->mTheta += dx;
        app->mPhi += dy;
        app->mPhi = std::clamp(app->mPhi, 0.1f, XM_PI - 0.1f);
    }
    else if (glfwGetMouseButton(window, GLFW_MOUSE_BUTTON_RIGHT) == GLFW_PRESS)
    {
        float dx = 0.05f * XMConvertToRadians(xpos - app->mLastMousePos.x);
        float dy = 0.05f * XMConvertToRadians(ypos - app->mLastMousePos.y);
        app->mRadius += (dx - dy);
        app->mRadius = std::clamp(app->mRadius, 3.f, 15.f);
    }
    app->mLastMousePos = XMFLOAT2(xpos, ypos);
});
Thanks.
The root problem here was a mismatch between the constant buffer layout and the CPU-side update.
HLSL defaults to column-major matrix packing, per the Microsoft Docs, while DirectXMath uses row-major matrices, so you have to transpose the matrix when updating the constant buffer.
Alternatively, you can declare the HLSL matrix with the row_major keyword, use #pragma pack_matrix(row_major), or compile with the /Zpr switch.
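A minimal sketch of the CPU-side transpose, assuming a constant-buffer struct and names such as mObjectCB and mImmediateContext that are not part of the original code:
struct ObjectConstants
{
    XMFLOAT4X4 WorldViewProj; // stored transposed so HLSL's default column-major packing reads it correctly
};
// ...
XMMATRIX world = XMMatrixIdentity();
XMMATRIX worldViewProj = world * mView * mProj;   // row-major DirectXMath product
ObjectConstants cb;
XMStoreFloat4x4(&cb.WorldViewProj, XMMatrixTranspose(worldViewProj));
mImmediateContext->UpdateSubresource(mObjectCB, 0, nullptr, &cb, 0, 0);
The HLSL-side alternative is to declare the matrix as row_major float4x4 gWorldViewProj; and skip the transpose.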

Convert OpenGL coord system to custom and back

I have a task to draw 3D objects on the ground using OpenGL. I use the OpenGL left-handed coordinate system where the Y axis is up. But the 3D objects and the camera orbiting them should use a different coordinate system with the following properties:
XY plane is a ground plane;
Z-axis is up;
Y-axis is "North";
X-axis is "East";
The azimuth (or horizontal) angle is [0, 360] degrees;
The elevation (or vertical) angle is [0, 90] degrees from XY plane.
The end user uses azimuth and elevation to rotate the camera around some center point. So I wrote the following code to convert from spherical coordinates to a quaternion:
//polar: x - radius, y - horizontal angle, z - vertical
QQuaternion CoordinateSystemGround::fromPolarToQuat(const QVector3D& polar) const {
    QQuaternion dest = QQuaternion::fromEulerAngles({0, polar.y(), polar.z()});
    //convert user coord system back to OpenGL by rotation around X-axis
    QQuaternion orig = QQuaternion::fromAxisAndAngle(1, 0, 0, -90);
    return dest * orig;
}
and back:
QVector3D CoordinateSystemGround::fromQuatToPolar(const QQuaternion& q) const {
    //convert OpenGL coord system to destination by rotation around X-axis
    QQuaternion dest = QQuaternion::fromAxisAndAngle(1, 0, 0, 90);
    QQuaternion out = q * dest;
    QVector3D euler = out.toEulerAngles();
    float hor = euler.y();
    if(hor < 0.0f)
        hor += 360.0f;
    float ver = euler.z();
    if(ver > 90.0f)
        ver = 90.0f;
    else if(ver < 0.0f)
        ver = 0.0f;
    //x changes later
    return QVector3D(0, hor, ver);
}
But it doesn't work correctly. I suspect the fromPolarToQuat conversion has a mistake somewhere, but I can't figure out where.
It seems I found the solution. So, to get the polar angles from a quaternion:
QVector3D CoordinateSystemGround::fromQuatToPolar(const QQuaternion& q) const {
    QQuaternion coord1 = QQuaternion::fromAxisAndAngle(0, 1, 0, -180);
    QQuaternion coord2 = QQuaternion::fromAxisAndAngle(1, 0, 0, -90);
    QQuaternion out = coord1 * coord2 * q;
    QVector3D euler = out.toEulerAngles();
    float hor = euler.y();
    if(hor < 0.0f)
        hor += 360.0f;
    float ver = euler.x();
    return QVector3D(0, hor, -ver);
}
And back:
QQuaternion CoordinateSystemGround::fromPolarToQuat(const QVector3D& polar) const {
    QQuaternion dest = QQuaternion::fromEulerAngles({-polar.z(), polar.y(), 0});
    QQuaternion coord1 = QQuaternion::fromAxisAndAngle(1, 0, 0, 90);
    QQuaternion coord2 = QQuaternion::fromAxisAndAngle(0, 1, 0, 180);
    return coord1 * coord2 * dest;
}
Not sure it's an optimal solution, but it works as it should.
Edited
After some research I found a few mistakes and made an optimized and, I hope, correct version of the conversion:
QVector3D CoordinateSystemGround::fromQuatToPolar(const QQuaternion& q) const {
    // back to the OpenGL coord system: just multiply by the inverted destination coord system
    QQuaternion dest = QQuaternion::fromAxes({-1, 0, 0}, {0, 0, 1}, {0, 1, 0}).inverted();
    QVector3D euler = (dest * q).toEulerAngles();
    float hor = euler.y();
    if(hor < 0.0f)
        hor += 360.0f;
    float ver = euler.x();
    return QVector3D(0, hor, -ver);
}
And back:
QQuaternion CoordinateSystemGround::fromPolarToQuat(const QVector3D& polar) const {
    //just the rotation as if we were in the OpenGL coord system
    QQuaternion orig = QQuaternion::fromEulerAngles({-polar.z(), polar.y(), 0});
    //and then multiply by the destination coord system
    QQuaternion dest = QQuaternion::fromAxes({-1, 0, 0}, {0, 0, 1}, {0, 1, 0});
    return dest * orig;
}
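A quick way to sanity-check the round trip (the ground instance name and the angles below are made up for illustration, not from the original post):
QVector3D polar(0.0f, 45.0f, 30.0f);            // azimuth 45 degrees, elevation 30 degrees
QQuaternion q = ground.fromPolarToQuat(polar);
QVector3D back = ground.fromQuatToPolar(q);     // should come out close to (0, 45, 30)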

How do I apply a local/relative rotation to a transform?

I'm writing a simple physics component for my C++ engine, using the GLM math library, and any rotations I apply are done in world space, i.e. each rotation is applied about the global X, Y, and Z axes, no matter which way the object is facing. I am applying a torque to my object and using that to calculate a rotation amount for each axis.
I add the torque via a call to the AddTorque function, which uses the object's transform to apply it in a relative direction. For example, for pitching the object (rotation around the X axis):
AddTorque(m_transform[0] * 50.0f);
Here is the calculation code itself, where the rotation is not local/relative (m_acceleration, m_velocity, m_torque, etc. are all glm::vec3, and the transform is a glm::mat4):
void PhysicsComponent::Update(float a_deltaTime)
{
    ///////////
    // rotation
    glm::mat4 transform = m_parent->GetTransform();
    glm::vec3 rotVec = glm::vec3(0, 0, 0);
    m_angularAcceleration = m_torque / m_momentOfInertia;
    m_angularVelocity += m_angularAcceleration * a_deltaTime;
    rotVec += m_angularVelocity * a_deltaTime;
    transform = glm::rotate(transform, rotVec.x, glm::vec3(1, 0, 0));
    transform = glm::rotate(transform, rotVec.y, glm::vec3(0, 1, 0));
    transform = glm::rotate(transform, rotVec.z, glm::vec3(0, 0, 1));
    ///////////
    // position
    glm::vec3 position = transform[3].xyz;
    m_acceleration = m_force / m_mass;
    m_velocity += m_acceleration * a_deltaTime;
    position += m_velocity * a_deltaTime;
    transform[3].xyz = position;
    m_parent->SetTransform(transform);
    m_force.x = 0;
    m_force.y = 0;
    m_force.z = 0;
    m_torque.x = 0;
    m_torque.y = 0;
    m_torque.z = 0;
    rotVec = glm::vec3(0,0,0);
}
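As a general, hedged note on the underlying issue (not from the original post): with GLM's column-vector convention, the side on which a rotation is multiplied decides whether its axis is interpreted in the object's local frame or in the world frame.
#include <glm/glm.hpp>
#include <glm/gtc/matrix_transform.hpp>

// transform * R : the axis is interpreted in the object's local frame
glm::mat4 rotateLocal(const glm::mat4& transform, float angle, const glm::vec3& axis)
{
    return glm::rotate(transform, angle, axis);
}

// R * transform : the axis is interpreted in the world frame
glm::mat4 rotateWorld(const glm::mat4& transform, float angle, const glm::vec3& axis)
{
    return glm::rotate(glm::mat4(1.0f), angle, axis) * transform;
}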

OpenGL Matrix Camera controls, local rotation not functioning properly

So I'm trying to figure out how to manually create a camera class that creates a local frame for camera transformations. I've created a player object based on the OpenGL SuperBible's GLFrame class.
I have keyboard keys mapped to the MoveUp, MoveRight and MoveForward functions, and the horizontal and vertical mouse movements are mapped to the xRot variable and the RotateLocalY function. This is done to create an FPS-style camera.
The problem, however, is in RotateLocalY. Translation works fine, and so does the vertical mouse movement, but the horizontal movement scales all my objects down or up in a weird way. Besides the scaling, the rotation also seems to restrict itself to 180 degrees and rotates around the world origin (0, 0, 0) instead of my player's local position.
I figured that the scaling had something to do with normalizing vectors, but the GLFrame class (which I used for reference) never normalizes any vectors and it works just fine. Normalizing most of my vectors only solved the scaling; all the other problems were still there, so I'm guessing one piece of code is causing all of these problems.
I can't seem to figure out where the problem lies, I'll post all the appropriate code here and a screenshot to show the scaling.
Player object
Player::Player()
{
location[0] = 0.0f; location[1] = 0.0f; location[2] = 0.0f;
up[0] = 0.0f; up[1] = 1.0f; up[2] = 0.0f;
forward[0] = 0.0f; forward[1] = 0.0f; forward[2] = -1.0f;
}
// Does all the camera transformation. Should be called before scene rendering!
void Player::ApplyTransform()
{
M3DMatrix44f cameraMatrix;
this->getTransformationMatrix(cameraMatrix);
glRotatef(xAngle, 1.0f, 0.0f, 0.0f);
glMultMatrixf(cameraMatrix);
}
void Player::MoveForward(GLfloat delta)
{
location[0] += forward[0] * delta;
location[1] += forward[1] * delta;
location[2] += forward[2] * delta;
}
void Player::MoveUp(GLfloat delta)
{
location[0] += up[0] * delta;
location[1] += up[1] * delta;
location[2] += up[2] * delta;
}
void Player::MoveRight(GLfloat delta)
{
// Get X axis vector first via cross product
M3DVector3f xAxis;
m3dCrossProduct(xAxis, up, forward);
location[0] += xAxis[0] * delta;
location[1] += xAxis[1] * delta;
location[2] += xAxis[2] * delta;
}
void Player::RotateLocalY(GLfloat angle)
{
// Calculate a rotation matrix first
M3DMatrix44f rotationMatrix;
// Rotate around the up vector
m3dRotationMatrix44(rotationMatrix, angle, up[0], up[1], up[2]); // Use up vector to get correct rotations even with multiple rotations used.
// Get new forward vector out of the rotation matrix
M3DVector3f newForward;
newForward[0] = rotationMatrix[0] * forward[0] + rotationMatrix[4] * forward[1] + rotationMatrix[8] * forward[2];
newForward[1] = rotationMatrix[1] * forward[1] + rotationMatrix[5] * forward[1] + rotationMatrix[9] * forward[2];
newForward[2] = rotationMatrix[2] * forward[2] + rotationMatrix[6] * forward[1] + rotationMatrix[10] * forward[2];
m3dCopyVector3(forward, newForward);
}
void Player::getTransformationMatrix(M3DMatrix44f matrix)
{
// Get Z axis (Z axis is reversed with camera transformations)
M3DVector3f zAxis;
zAxis[0] = -forward[0];
zAxis[1] = -forward[1];
zAxis[2] = -forward[2];
// Get X axis
M3DVector3f xAxis;
m3dCrossProduct(xAxis, up, zAxis);
// Fill in X column in transformation matrix
m3dSetMatrixColumn44(matrix, xAxis, 0); // first column
matrix[3] = 0.0f; // Set 4th value to 0
// Fill in the Y column
m3dSetMatrixColumn44(matrix, up, 1); // 2nd column
matrix[7] = 0.0f;
// Fill in the Z column
m3dSetMatrixColumn44(matrix, zAxis, 2); // 3rd column
matrix[11] = 0.0f;
// Do the translation
M3DVector3f negativeLocation; // Required for camera transform (right handed OpenGL system. Looking down negative Z axis)
negativeLocation[0] = -location[0];
negativeLocation[1] = -location[1];
negativeLocation[2] = -location[2];
m3dSetMatrixColumn44(matrix, negativeLocation, 3); // 4th column
matrix[15] = 1.0f;
}
Player object header
class Player
{
public:
    //////////////////////////////////////
    // Variables
    M3DVector3f location;
    M3DVector3f up;
    M3DVector3f forward;
    GLfloat xAngle; // Used for FPS divided X angle rotation (can't combine yaw and pitch since we'll also get a Roll which we don't want for FPS)
    /////////////////////////////////////
    // Functions
    Player();
    void ApplyTransform();
    void MoveForward(GLfloat delta);
    void MoveUp(GLfloat delta);
    void MoveRight(GLfloat delta);
    void RotateLocalY(GLfloat angle); // Only need rotation on local axis for FPS camera style. Then a translation on world X axis. (done in apply transform)
private:
    void getTransformationMatrix(M3DMatrix44f matrix);
};
Applying transformations
// Clear screen
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
glLoadIdentity();
// Apply camera transforms
player.ApplyTransform();
// Set up lights
...
// Use shaders
...
// Render the scene
RenderScene();
// Do post rendering operations
glutSwapBuffers();
and mouse
float mouseSensitivity = 500.0f;
float horizontal = (width / 2) - mouseX;
float vertical = (height / 2) - mouseY;
horizontal /= mouseSensitivity;
vertical /= (mouseSensitivity / 25);
player.xAngle += -vertical;
player.RotateLocalY(horizontal);
glutWarpPointer((width / 2), (height / 2));
Honestly, I think you are taking a way too complicated approach to your problem. There are many ways to create a camera. My favorite is using an R3 vector and a quaternion, but you could also work with an R3 vector and two floats (pitch and yaw).
The setup with two angles is simple:
glLoadIdentity();
glTranslatef(-pos[0], -pos[1], -pos[2]);
glRotatef(-yaw, 0.0f, 0.0f, 1.0f);
glRotatef(-pitch, 0.0f, 1.0f, 0.0f);
The tricky part now is moving the camera. You must do something along the lines of:
float ds = speed * dt;
position += transform_y(pitch, transform_z(yaw, Vector3(ds, 0, 0)));
How to do those transforms I would have to look up, but you could do it with a rotation matrix.
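A sketch of that movement step, substituting GLM for the hypothetical transform_y / transform_z helpers (angles in radians, z-up as in the glRotatef calls above, and assuming position is a glm::vec3):
glm::vec4 step(ds, 0.0f, 0.0f, 0.0f);                                  // forward step in the camera's local frame
glm::mat4 rotYaw   = glm::rotate(glm::mat4(1.0f), yaw,   glm::vec3(0, 0, 1));
glm::mat4 rotPitch = glm::rotate(glm::mat4(1.0f), pitch, glm::vec3(0, 1, 0));
position += glm::vec3(rotPitch * rotYaw * step);                       // world-space displacement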
Rotation is trivial, just add or subtract from the pitch and yaw values.
I like using a quaternion for the orientation because it is general, and thus you have a camera (or any entity, really) that is independent of any movement scheme. In this case you have a camera that looks like so:
class Camera
{
public:
    // lots of stuff omitted
    void setup();
    void move_local(Vector3f value);
    void rotate(float dy, float dz);
private:
    mx::Vector3f position;
    mx::Quaternionf orientation;
};
Then the setup code shamelessly uses gluLookAt; you could build a transformation matrix out of it instead, but I never got that to work right.
void Camera::setup()
{
    // projection related stuff
    mx::Vector3f eye = position;
    mx::Vector3f forward = mx::transform(orientation, mx::Vector3f(1, 0, 0));
    mx::Vector3f center = eye + forward;
    mx::Vector3f up = mx::transform(orientation, mx::Vector3f(0, 0, 1));
    gluLookAt(eye(0), eye(1), eye(2), center(0), center(1), center(2), up(0), up(1), up(2));
}
Moving the camera in local frame is also simple:
void Camera::move_local(Vector3f value)
{
    position += mx::transform(orientation, value);
}
The rotation is also straightforward.
void Camera::rotate(float dy, float dz)
{
    mx::Quaternionf o = orientation;
    o = mx::axis_angle_to_quaternion(dz, mx::Vector3f(0, 0, 1)) * o;
    o = o * mx::axis_angle_to_quaternion(dy, mx::Vector3f(0, 1, 0));
    orientation = o;
}
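A hypothetical usage sketch (the signs and the camera variable are guesses, not from the answer), wiring the mouse deltas from the question into this interface:
camera.rotate(-vertical, horizontal);               // dy = pitch delta, dz = yaw delta
camera.move_local(mx::Vector3f(ds, 0, 0));          // step along the local forward (+X) axis
camera.setup();                                     // then apply the view before rendering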
(Shameless plug):
If you are asking what math library I use, it is mathex. I wrote it...

Camera rotation in OpenGL not using glRotate glLookAt

I am trying to write my own rotation function for a camera in OpenGL, but I can't get it to work. My camera is mainly from flipcode, with some minor changes:
Camera code:
Camera::Camera(float x, float y, float z) {
    memset(Transform, 0, 16*sizeof(float));
    Transform[0] = 1.0f;
    Transform[5] = 1.0f;
    Transform[10] = 1.0f;
    Transform[15] = 1.0f;
    Transform[12] = x; Transform[13] = y; Transform[14] = z;
    Left=&Transform[0];
    Up=&Transform[4];
    Forward=&Transform[8];
    Position=&Transform[12];
    old_x = 0;
    old_y = 0;
}
The view is set before every rendered frame:
void Camera::setView() {
    glMatrixMode(GL_MODELVIEW);
    glLoadIdentity();
    float viewmatrix[16] = { //Remove the three - for non-inverted z-axis
        Transform[0], Transform[4], -Transform[8], 0,
        Transform[1], Transform[5], -Transform[9], 0,
        Transform[2], Transform[6], -Transform[10], 0,
        -(Transform[0]*Transform[12] +
          Transform[1]*Transform[13] +
          Transform[2]*Transform[14]),
        -(Transform[4]*Transform[12] +
          Transform[5]*Transform[13] +
          Transform[6]*Transform[14]),
        //add a - like above for non-inverted z-axis
        (Transform[8]*Transform[12] +
         Transform[9]*Transform[13] +
         Transform[10]*Transform[14]), 1 };
    glLoadMatrixf(viewmatrix);
}
Now to my problem, the rotation. Consider for example rotation around the y-axis. This is the rotation matrix stack:
// deg is the angle; it is not working with either degrees or radians
void Camera::rotateLocal_y(float deg){
    glMatrixMode(GL_MODELVIEW);
    glPushMatrix();
    glLoadMatrixf(Transform);
    rotateMatrixf_y(Transform, deg);
    glGetFloatv(GL_MODELVIEW_MATRIX, Transform);
    glPopMatrix();
}
So next I am going to show the rotation function:
//rotate a matrix around y axis
void rotateMatrixf_y(float *aMatrix, float angle){
// x y z t
float rotMatrix[] = {cos(angle),0,-1*sin(angle),0, 0, 1, 0, 0, sin(angle), 0, cos(angle), 0, 0, 0, 0, 1};
multMatrixMatrix(rotMatrix,aMatrix);
}
And finally the matrix multiplication function:
void multMatrixMatrix(float* m_a, float* m_b){
float m_c[16] = {m_a[0]*m_b[0]+m_a[4]*m_b[1]+m_a[8]*m_b[2]+m_a[12]*m_b[3],
m_a[0]*m_b[4]+m_a[4]*m_b[5]+m_a[8]*m_b[6]+m_a[12]*m_b[7],
m_a[0]*m_b[8]+m_a[4]*m_b[9]+m_a[8]*m_b[10]+m_a[12]*m_b[11],
m_a[0]*m_b[12]+m_a[4]*m_b[13]+m_a[8]*m_b[14]+m_a[12]*m_b[15],
m_a[1]*m_b[0]+m_a[5]*m_b[1]+m_a[9]*m_b[2]+m_a[13]*m_b[3],
m_a[1]*m_b[4]+m_a[5]*m_b[5]+m_a[9]*m_b[6]+m_a[13]*m_b[7],
m_a[1]*m_b[8]+m_a[5]*m_b[9]+m_a[9]*m_b[10]+m_a[13]*m_b[11],
m_a[1]*m_b[12]+m_a[5]*m_b[13]+m_a[9]*m_b[14]+m_a[13]*m_b[15],
m_a[2]*m_b[0]+m_a[6]*m_b[1]+m_a[10]*m_b[2]+m_a[14]*m_b[3],
m_a[2]*m_b[4]+m_a[6]*m_b[5]+m_a[10]*m_b[6]+m_a[14]*m_b[7],
m_a[2]*m_b[8]+m_a[6]*m_b[9]+m_a[10]*m_b[10]+m_a[14]*m_b[11],
m_a[2]*m_b[12]+m_a[6]*m_b[13]+m_a[10]*m_b[14]+m_a[14]*m_b[15],
m_a[3]*m_b[0]+m_a[7]*m_b[1]+m_a[11]*m_b[2]+m_a[15]*m_b[3],
m_a[3]*m_b[4]+m_a[7]*m_b[5]+m_a[11]*m_b[6]+m_a[15]*m_b[7],
m_a[3]*m_b[8]+m_a[7]*m_b[9]+m_a[11]*m_b[10]+m_a[15]*m_b[11],
m_a[3]*m_b[12]+m_a[7]*m_b[13]+m_a[11]*m_b[14]+m_a[15]*m_b[15]
};
m_b = m_c;
}
I thought this must be it, but it seems as if something is fundamentally wrong. It is not moving at all; the camera is set up properly. The method order is: cam.rotate, then cam.setView.
Flipcode's original rotate function:
void Camera::rotateLoc(float deg, float x, float y, float z) {
    glMatrixMode(GL_MODELVIEW);
    glPushMatrix();
    glLoadMatrixf(Transform);
    glRotatef(deg, x, y, z);
    glGetFloatv(GL_MODELVIEW_MATRIX, Transform);
    glPopMatrix();
}
Your code is pretty messy and incomplete.
I think your problem is here :
glPushMatrix();
glLoadMatrixf(Transform); // give the Transform matrix to GL (why?)
rotateMatrixf_y(Transform, deg); // modify the Transform matrix
glGetFloatv(GL_MODELVIEW_MATRIX, Transform); // (3) retrieve the original Transform matrix
glPopMatrix();
(3) just undoes whatever changes 'rotateMatrixf_y' made to 'Transform', because it overwrites Transform with the matrix read back from GL.
The flipcode code you added uses OpenGL to update the Transform matrix, by calling 'glRotatef' and reading back the result, which is fine. In your method, you should just remove every reference to OpenGL and keep only the call to rotateMatrixf_y, which does all the work on its own.
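A minimal sketch of that simplification (assuming multMatrixMatrix is also changed to actually copy its result into m_b, e.g. with memcpy(m_b, m_c, 16*sizeof(float)); instead of the pointer assignment m_b = m_c;):
void Camera::rotateLocal_y(float deg){
    // no GL matrix stack involved; rotateMatrixf_y modifies Transform in place
    rotateMatrixf_y(Transform, deg);
}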
Do you really understand what the GL matrix stack is for? You should perhaps go back to basics by either using only GL functions or only your own, but get to know why it works either way before mixing the two.