I am trying to rotate a bitmap by invoking:
XFORM form;
double angle = -30.0;
form.eM11 = (FLOAT)cos(angle * 2 * M_PI / 360);
form.eM12 = (FLOAT)sin(angle * 2 * M_PI / 360);
form.eM21 = (FLOAT)-sin(angle * 2 * M_PI / 360);
form.eM22 = (FLOAT)cos(angle * 2 * M_PI / 360);
form.eDx = -SIZE / 2;
form.eDy = -SIZE / 2;
SetWorldTransform(hdc, &form);
The bitmap rotates around its center, but the rotation point is not in the middle of the HDC. If I try to add another transformation to translate the bitmap right after this one, then I don't see the rotation effect at all. Is there a way to combine these transformations? Something similar to matrix multiplication should work, but I have no idea how to achieve that.
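For what it's worth, GDI can do that matrix multiplication for you: CombineTransform() multiplies two XFORM structures, and ModifyWorldTransform() composes an XFORM with the transform already set on the DC. A minimal sketch (dcW and dcH are placeholders for the target DC's dimensions, and SetGraphicsMode(hdc, GM_ADVANCED) must already be in effect, as SetWorldTransform requires):
XFORM rot = form;                                     // the rotation built above
XFORM move = { 1.0f, 0.0f,                            // identity matrix...
               0.0f, 1.0f,
               (FLOAT)(dcW / 2), (FLOAT)(dcH / 2) };  // ...plus a translation
XFORM combined;
CombineTransform(&combined, &rot, &move);             // applies rot first, then move
SetWorldTransform(hdc, &combined);
Alternatively, ModifyWorldTransform(hdc, &move, MWT_RIGHTMULTIPLY) appends the translation after whatever world transform the DC already has.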
I have a 3D, stereoscopic rendering application that currently uses parallel stereoscopy by simply shifting the camera to the side for each of the Left and Right views. It does work, but recently I felt it could be much improved if I had an off-axis option. I got a semi-working algorithm for glm::frustum() to allow for this, but I'm having trouble as soon as I switch to it from glm::perspective().
I followed the only GL guide I could find, Simple, Low-Cost Stereographics, which said to replace my existing glm::perspective() with two glm::frustum() calls, one per eye:
//OFF-AXIS STEREO
if (myAbj.stereoOffsetAxis) {
    glm::vec3 targ0_stored = i->targO;

    if (myAbj.stereoLR == 0)
    {
        float sgn = -1.f * (float)myAbj.stereoSwitchLR;
        float eyeSep = myAbj.stereoSep;
        float focalLength = 50.f;
        float eyeOff = sgn * (eyeSep / 2.f) * (myAbj.selCamLi->nearClip->val_f / focalLength);
        float top = myAbj.selCamLi->nearClip->val_f * tan(myAbj.selCamLi->fov->val_f / 2.f);
        float right = myAbj.aspect * top;

        myAbj.selCamLi->PM = glm::frustum(-right - eyeOff, right - eyeOff, -top, top, myAbj.selCamLi->nearClip->val_f, myAbj.selCamLi->farClip->val_f);

        i->targO += myAbj.selCamLi->rightO * myAbj.stereoSep * (float)myAbj.stereoSwitchLR;
        VMup(i);
        i->targO = targ0_stored;
    }

    if (myAbj.stereoLR == 1)
    {
        float sgn = 1.f * (float)myAbj.stereoSwitchLR;
        float eyeSep = myAbj.stereoSep;
        float focalLength = 50.f;
        float eyeOff = sgn * (eyeSep / 2.f) * (myAbj.selCamLi->nearClip->val_f / focalLength);
        float top = myAbj.selCamLi->nearClip->val_f * tan(myAbj.selCamLi->fov->val_f / 2.f);
        float right = myAbj.aspect * top;

        myAbj.selCamLi->PM = glm::frustum(-right - eyeOff, right - eyeOff, -top, top, myAbj.selCamLi->nearClip->val_f, myAbj.selCamLi->farClip->val_f);

        i->targO += myAbj.selCamLi->rightO * -myAbj.stereoSep * (float)myAbj.stereoSwitchLR;
        VMup(i);
        i->targO = targ0_stored;
    }
}
Using this equation, my view matrix is rotated 180 degrees on the Z axis. The bigger issue, however, is a large number of black dots and flickering on my objects. When I move the camera close enough, the flickering stops. Even when I minimize the scene, the issue is still there.
Why is this flickering happening and what can I do to prevent it? It is ruining my scenes.
My near clip was causing the problem: the flickering was z-fighting from lost depth-buffer precision. The near clip couldn't be set to the same low value that glm::perspective() was using - it needed to be a little larger.
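Since the two eye branches above differ only in the sign and the shift direction, the per-eye projection can be factored into one helper. A minimal sketch, assuming GLM and that fov is already in radians (the parameter names stand in for the myAbj fields used above):
#include <cmath>
#include <glm/glm.hpp>
#include <glm/gtc/matrix_transform.hpp>

// sgn = -1 for the left eye, +1 for the right eye (before any L/R switch).
glm::mat4 offAxisFrustum(float sgn, float eyeSep, float focalLength, float fov,
                         float aspect, float nearClip, float farClip)
{
    float top = nearClip * std::tan(fov / 2.f);  // half-height of the near plane
    float right = aspect * top;                  // half-width of the near plane
    // Horizontal frustum shift, scaled from the focal plane down to the near plane.
    float eyeOff = sgn * (eyeSep / 2.f) * (nearClip / focalLength);
    return glm::frustum(-right - eyeOff, right - eyeOff, -top, top, nearClip, farClip);
}
And, per the fix above, keep nearClip from being too small, or depth precision suffers.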
I have my own 3D engine implementation around OpenGL (C++) and it has worked fine for everything these last few years.
But today I stumbled upon a problem. I have this scene with spheres (planets around a sun, orbit rings and things like that, very, very simple) and I want to ray-pick them with the mouse.
As long as the camera/view matrix is identity, picking works. When the camera is rotated and then moved, the picking goes completely haywire. I have been searching for a solution for a while now, so now I'm asking you guys.
This is the code (summarized for this question):
mat4f mProj = createPerspective(PI / 4.0f, float(res.x) / float(res.y), 0.1f, 100.0f);
mat4f mCamera = createTranslation(-1.5f, 3, -34.0f) * createRotationZ(20.0f * PI / 180.0f) * createRotationX(20.0f * PI / 180.0f);
// ... render the scene, using shaders that transform vertices with: gl_Position = mProj * mCamera * aPosition;
mat4f mUnproject = (mProj * mCamera).getInverse();
vec2f mouseClip(2.0f * float(coord.x) / float(res.x) - 1.0f, 1.0f - 2.0f * float(coord.y) / float(res.y));
vec3f rayOrigin = (mUnproject * vec4f(mouseClip, 0, 1)).xyz();
vec3f rayDir = (mUnproject * vec4f(mouseClip, -1, 1)).xyz();
// In a loop over all planets:
mat4f mObject = createRotationY(planet.angle) * createTranslation(planet.distance, 0, 0);
vec3f planetPos = mObject.transformCoord(vec3f(0, 0, 0));
float R = planet.distance;
float a = rayDir.dot(rayDir);
float b = 2 * rayDir.x * (rayOrigin.x - planetPos.x) + 2 * rayDir.y * (rayOrigin.y - planetPos.y) + 2 * rayDir.z * (rayOrigin.z - planetPos.z);
float c = planetPos.dot(planetPos) + rayOrigin.dot(rayOrigin) - 2 * planetPos.dot(rayOrigin) - R * R;
float d = b * b - 4 * a * c;
if (d >= 0)
HIT!
So when I use identity for mCamera, everything works fine; even when I use only rotation for mCamera, it works fine. It is when I start using the translation that it goes completely wrong.
Does anyone know where I am going wrong?
BDL's answer was spot on and put me back on the right track. Indeed, when transforming coordinates myself, I forgot to do the perspective divide. After writing so much shader code where the GPU does this for you, you forget about these things.
It is logical that this only caused issues when the camera moved and not when it was at (0, 0, 0): in that case the translation part of the transformation matrices stayed 0 and the w component of the coordinates was unaffected.
I immediately wrote transformCoord and transformNormal implementations in my matrix classes to prevent this error from happening again.
Also, the ray origin and direction were incorrect, although I don't fully understand why yet. I now take the origin from my (inverted) camera matrix and calculate the direction the same way as before, but then subtract the camera position from it to make it a direction vector. I normalize it; I don't think that is strictly necessary in this case, but it makes the numbers more readable when debugging.
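For reference, a minimal sketch of what such a transformCoord might look like (this assumes the author's own mat4f/vec4f types with a matrix-vector operator*; the key point is the divide by w):
// Transform a 3D point by a 4x4 matrix, including the perspective divide.
vec3f mat4f::transformCoord(const vec3f& p) const
{
    vec4f r = (*this) * vec4f(p.x, p.y, p.z, 1.0f); // w = 1 for points
    return vec3f(r.x / r.w, r.y / r.w, r.z / r.w);  // the divide the GPU normally does for you
}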
This works:
vec2f mouseClip(2.0f * float(coord.x) / float(res.x) - 1.0f, 1.0f - 2.0f * float(coord.y) / float(res.y));
mat4f mUnproject = (mProj * mCamera).getInverse();
mat4f mInvCamera = mCamera.getInverse();
vec3f rayOrigin(mInvCamera.m[12], mInvCamera.m[13], mInvCamera.m[14]);
vec3f rayDir = (mUnproject.transformCoord(vec3f(mouseClip, 1)) - rayOrigin).normalized();
... per planet
vec3f planetPos = mObject.transformCoord(vec3f(0, 0, 0));
float a = rayDir.dot(rayDir);
float b = 2 * rayDir.x * (rayOrigin.x - planetPos.x) + 2 * rayDir.y * (rayOrigin.y - planetPos.y) + 2 * rayDir.z * (rayOrigin.z - planetPos.z);
float c = planetPos.dot(planetPos) + rayOrigin.dot(rayOrigin) - 2 * planetPos.dot(rayOrigin) - 0.4f * 0.4f;
float d = b * b - 4 * a * c;
if (d >= 0)
... HIT!
I am trying to create a camera that works like a flight-simulator camera (because I'm making a flight simulator) - I want to be able to perform pitch, yaw and roll, as well as translations. The translations work perfectly, but the rotations are causing me a very big headache.
The rotations are calculated using quaternions, using GLM, like so:
// Build the per-axis rotation quaternions
glm::fquat pitchQuat(cos(TO_RADIANS(pitch / 2.0f)), m_rightVector * (float)sin(TO_RADIANS(pitch / 2.0f)));
glm::fquat yawQuat(cos(TO_RADIANS(yaw / 2.0f)), m_upVector * (float)sin(TO_RADIANS(yaw / 2.0f)));
glm::fquat rollQuat(cos(TO_RADIANS(roll / 2.0f)), m_direction * (float)sin(TO_RADIANS(roll / 2.0f)));
m_rotation = yawQuat * pitchQuat * rollQuat;
m_direction = m_rotation * m_direction * glm::conjugate(m_rotation);
m_upVector = m_rotation * m_upVector * glm::conjugate(m_rotation);
m_rightVector = glm::cross(m_direction, m_upVector);
If I calculate pitch or roll or yaw alone, everything works fine, however, as soon as I introduce another rotation everything goes wrong. This video should be enough to show you what is going wrong:
https://www.youtube.com/watch?v=jlklem6t68I&feature=youtu.be
The translations work fine; the rotations, not so much. When I move the mouse in a circular motion - which is yaw and pitch rotations - the house slowly begins to flip upside-down. You may have noticed in the video that rotating also causes the house to stretch, which is likewise unwanted.
I cannot figure out what is wrong. Can anyone explain how I can create a camera with pitch, yaw and roll working?
If it is of any help, the view matrix is calculated using:
m_viewMatrix = glm::lookAt(m_position, m_position + m_direction, m_upVector);
And the projection matrix is calculated using:
float t = tan(fov * 3.14159 / 360.0) * nPlane;
float r = aspectRatio * t;
float l = aspectRatio * -t;
m_projectionMatrix[0][0] = (2 * nPlane) / (r - l);
m_projectionMatrix[0][2] = (r + l) / (r - l);
m_projectionMatrix[1][1] = (2 * nPlane) / (t + t);
m_projectionMatrix[1][2] = (t - t) / (t + t);
m_projectionMatrix[2][2] = (nPlane + fPlane) / (nPlane - fPlane);
m_projectionMatrix[2][3] = (2 * fPlane * nPlane) / (nPlane - fPlane);
m_projectionMatrix[3][2] = -1;
If you would like to see all my code for the camera class, you can find it in my Google Drive:
http://goo.gl/FFMPa0
What you're experiencing is normal for a quaternion camera. If you want to avoid the problem, simply use a fixed up-vector when yawing, but beware that there is a singularity when you look straight up; you may want to handle that case explicitly. And always reconstruct your view matrix, crossing the view direction with this fixed up-vector to get the right vector.
EDIT:
Here's a step-by-step:
First, calculate your rotation around the static up-vector, while your direction is extracted from the view as usual (and right is the cross product of direction and up):
// Quaternions
glm::fquat pitchQuat(cos(TO_RADIANS(pitch / 2.0f)), glm::normalize(glm::cross(m_direction, glm::vec3(0,1,0))) * (float)sin(TO_RADIANS(pitch / 2.0f)));
glm::fquat yawQuat(cos(TO_RADIANS(yaw / 2.0f)), glm::vec3(0,1,0) * (float)sin(TO_RADIANS(yaw / 2.0f)));
glm::fquat rollQuat(cos(TO_RADIANS(roll / 2.0f)), m_direction * (float)sin(TO_RADIANS(roll / 2.0f)));
m_rotation = yawQuat * pitchQuat * rollQuat;
Then reconstruct your view matrix using the quaternion, as shown here (glm::lookAt will do too).
And, of course, repeat this step every frame.
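A minimal sketch of the whole step, assuming GLM's gtc/quaternion.hpp and the member names from the question (TO_RADIANS is the question's degrees-to-radians macro):
// Yaw around the fixed world up-vector, pitch around the current right vector.
const glm::vec3 worldUp(0.f, 1.f, 0.f);
glm::vec3 right = glm::normalize(glm::cross(m_direction, worldUp));

glm::fquat pitchQuat = glm::angleAxis((float)TO_RADIANS(pitch), right);
glm::fquat yawQuat   = glm::angleAxis((float)TO_RADIANS(yaw), worldUp);
glm::fquat rollQuat  = glm::angleAxis((float)TO_RADIANS(roll), glm::normalize(m_direction));
glm::fquat rotation  = yawQuat * pitchQuat * rollQuat;

// Rotate the basis vectors, then re-orthogonalize to stop numerical drift.
m_direction = glm::normalize(rotation * m_direction);
m_upVector  = glm::normalize(rotation * m_upVector);
right       = glm::normalize(glm::cross(m_direction, m_upVector));
m_upVector  = glm::cross(right, m_direction);

m_viewMatrix = glm::lookAt(m_position, m_position + m_direction, m_upVector);
Yawing around worldUp instead of the rotated m_upVector is what prevents the gradual flip.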
I am trying to rotate an OpenGL scene using a trackball. The problem I am having is that I get rotations opposite to the direction of my swipe on the screen. Here is the snippet of code:
prevPoint.y = viewPortHeight - prevPoint.y;
currentPoint.y = viewPortHeight - currentPoint.y;

prevPoint.x = prevPoint.x - centerx;
prevPoint.y = prevPoint.y - centery;
currentPoint.x = currentPoint.x - centerx;
currentPoint.y = currentPoint.y - centery;

double angle = 0;
if (prevPoint.x == currentPoint.x && prevPoint.y == currentPoint.y) {
    return;
}

double d, z, radius;
if (viewPortWidth > viewPortHeight) {
    radius = viewPortHeight * 0.5;
} else {
    radius = viewPortWidth * 0.5;
}

d = prevPoint.x * prevPoint.x + prevPoint.y * prevPoint.y;
if (d <= radius * radius * 0.5) {   /* Inside sphere */
    z = sqrt(radius * radius - d);
} else {                            /* On hyperbola */
    z = (radius * radius * 0.5) / sqrt(d);
}
Vector refVector1(prevPoint.x, prevPoint.y, z);
refVector1.normalize();

d = currentPoint.x * currentPoint.x + currentPoint.y * currentPoint.y;
if (d <= radius * radius * 0.5) {   /* Inside sphere */
    z = sqrt(radius * radius - d);
} else {                            /* On hyperbola */
    z = (radius * radius * 0.5) / sqrt(d);
}
Vector refVector2(currentPoint.x, currentPoint.y, z);
refVector2.normalize();

Vector axisOfRotation = refVector1.cross(refVector2);
axisOfRotation.normalize();
angle = acos(refVector1 * refVector2);
I recommend artificially setting prevPoint and currentPoint to (0,0) and (0,1) and then stepping through the code (with a debugger or with your eyes) to see if each part makes sense to you, and whether the axis and angle of rotation at the end of the block are what you expect.
If they are what you expect, then I'm guessing the error is in the logic that occurs after that, i.e. where you take the angle and axis, convert them to a matrix, and multiply it in to move the model. A number of convention choices happen in that pipeline, any of which, if swapped, can lead to the type of bug you're having (see the sketch after this list):
Whether the formula assumes the angle is winding left or right handedly around the axis.
Whether the transformation is meant to rotate an object in the world or meant to rotate the camera.
Whether the matrix is meant to operate by multiplication on the left or right.
Whether rows or columns of matrices are contiguous in memory.
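For example, a minimal sketch with GLM (assumed here; the question's own math library would behave the same) showing where a single convention flip reverses the perceived direction:
#include <glm/glm.hpp>
#include <glm/gtc/matrix_transform.hpp>

// Build the incremental trackball rotation from the axis and angle computed above.
glm::mat4 trackballDelta(const glm::vec3& axis, float angleRadians)
{
    // Negating the angle (or transposing the result) reverses the on-screen
    // direction - one common source of "rotation feels inverted".
    return glm::rotate(glm::mat4(1.0f), angleRadians, axis);
}

// Accumulating as model = delta * model rotates about world axes, while
// model = model * delta rotates about the object's local axes; swapping
// these two is the other common source of inverted or odd rotation.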
I am currently trying to get my virtual trackball to work from any angle. When I am looking at the scene down the z axis, it seems to work fine: I hold my mouse button down, move the mouse up, and the rotation moves accordingly.
Now, if I change my viewing angle / camera position and try to move my mouse, the rotation still occurs as if I were looking down the z axis. I cannot come up with a good way to get this to work.
Here is the code:
void Renderer::mouseMoveEvent(QMouseEvent *e)
{
    // Get coordinates
    int x = e->x();
    int y = e->y();

    if (isLeftButtonPressed)
    {
        // Project current screen coordinates onto the hemisphere
        Point sphere = projScreenCoord(x, y);

        // Find the axis by taking the cross product of the current and previous hemisphere points
        axis = Point::cross(previousPoint, sphere);

        // The angle can be found from the magnitude of the cross product
        double length = sqrt(axis.x * axis.x + axis.y * axis.y + axis.z * axis.z);

        // Normalize the axis
        axis = axis / length;

        double lengthPrev = sqrt(previousPoint.x * previousPoint.x + previousPoint.y * previousPoint.y + previousPoint.z * previousPoint.z);
        double lengthCur = sqrt(sphere.x * sphere.x + sphere.y * sphere.y + sphere.z * sphere.z);
        angle = asin(length / (lengthPrev * lengthCur));

        // Convert into degrees
        angle = angle * 180 / M_PI;

        // 'add' this rotation matrix to our 'total' rotation matrix
        glPushMatrix();   // save the old matrix so we don't mess anything up
        glLoadIdentity();
        glRotatef(angle, axis[0], axis[1], axis[2]);  // our newly calculated rotation
        glMultMatrixf(rotmatrix);                     // our previous rotation matrix
        glGetFloatv(GL_MODELVIEW_MATRIX, (GLfloat*)rotmatrix);  // let OpenGL do the matrix multiply for us, then store the result
        glPopMatrix();    // return the modelview matrix to its old value
    }
}
// Project screen coordinates onto a unit hemisphere
Point Renderer::projScreenCoord(int x, int y)
{
    // Find the projected x and y coordinates in [-1, 1]
    double xSphere = ((double)x / width) * 2.0 - 1.0;
    double ySphere = (1 - ((double)y / height)) * 2.0 - 1.0;
    double temp = 1.0 - xSphere * xSphere - ySphere * ySphere;

    // Guard against taking the square root of a negative number
    double zSphere;
    if (temp < 0) { zSphere = 0.0; }
    else          { zSphere = sqrt(temp); }

    // Return the point on the hemisphere
    return Point(xSphere, ySphere, zSphere);
}
I am still fairly new at this. Sorry for the trouble and thanks for all the help =)
The usual way involves quaternions, e.g. the classic trackball sample code originally from SGI.
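As for the view-dependence itself: the axis computed by projScreenCoord() lives in camera space, so it has to be rotated into world space before it is applied to the model. A minimal sketch of the idea, assuming GLM instead of the fixed-function calls above (viewMatrix is the current camera matrix, assumed to contain no scaling):
#include <glm/glm.hpp>
#include <glm/gtc/matrix_transform.hpp>

// axisCam: trackball rotation axis in camera space; angleDeg: angle in degrees.
glm::mat4 applyTrackball(const glm::mat4& model, const glm::mat4& viewMatrix,
                         const glm::vec3& axisCam, float angleDeg)
{
    // For a pure rotation, the transpose is the inverse: it takes
    // camera-space vectors back into world space.
    glm::mat3 camToWorld = glm::transpose(glm::mat3(viewMatrix));
    glm::vec3 axisWorld = glm::normalize(camToWorld * axisCam);
    return glm::rotate(glm::mat4(1.0f), glm::radians(angleDeg), axisWorld) * model;
}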