LWJGL - OpenGL - Camera rotation

I tried to create a first-person camera. I took a piece of code from my XNA project and tried to convert it to Java (LWJGL).
Matrix4f mx = Matrix4f.rotate(pitch, new Vector3f(1, 0, 0), new Matrix4f(), null);
Matrix4f my = Matrix4f.rotate(yaw, new Vector3f(0, 1, 0), new Matrix4f(), null);
Matrix4f rotationMatrix = Matrix4f.mul(mx, my, null);
Vector4f tmp = Matrix4f.transform(rotationMatrix, new Vector4f(0, 0, -1, 0), null);
target = new Vector3f(position.x + tmp.x, position.y + tmp.y, position.z + tmp.z);
GLU.gluLookAt(this.position.x, this.position.y, this.position.z,
        this.target.x, this.target.y, this.target.z, 0, 1.0f, 0);
Pitch and yaw increase/decrease as I move the mouse. At the start of the game it works fine, but after some movement I can't look down or up any more. Could this happen because my position is < 0?
The XNA code, which works perfectly:
Matrix rotationMatrix = Matrix.CreateRotationX(pitch) * Matrix.CreateRotationY(yaw);
camera.Target = Vector3.Transform(Vector3.Forward, rotationMatrix) + camera.Position;
this.viewMatrix = Matrix.CreateLookAt(this.position, this.target, Vector3.Up);
EDIT
So I figured out that LWJGL and XNA do something different when rotating a matrix.
Java:
Matrix4f mx = Matrix4f.rotate(1.2345f, new Vector3f(1,0,0),new Matrix4f(),null);
Result:
{1.0 0.0 0.0 0.0} {0.0 0.3299931 -0.9439833 0.0} {0.0 0.9439833 0.3299931 0.0} {0.0 0.0 0.0 1.0}
XNA:
Matrix.CreateRotationX(1.2345f)
Result:
{1,0,0,0} {0,0.3299931,0.9439833,0} {0,-0.9439833,0.3299931,0} {0,0,0,1}
The difference is -0.9439833 versus +0.9439833 ... can someone explain to me why the results are different? Isn't it the same function?
Thank you!
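The two matrices are transposes of each other: XNA builds matrices for the row-vector convention (v * M), while LWJGL's Matrix4f follows OpenGL's column-vector convention (M * v), so the same rotation ends up stored transposed. Since a rotation matrix is orthogonal, its transpose is also its inverse, which is why mixing the two conventions flips the apparent rotation direction. A minimal check, using the same LWJGL calls as above (a sketch, not a definitive test):

Matrix4f lwjgl = Matrix4f.rotate(1.2345f, new Vector3f(1, 0, 0), new Matrix4f(), null);
// Transposing the LWJGL result should reproduce XNA's CreateRotationX(1.2345f).
Matrix4f asXna = lwjgl.transpose(new Matrix4f());
System.out.println(asXna); // the 0.9439833 entries swap places, matching the XNA output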

Oh, I see. OK, that explains the difference. I finally solved my problem: the issue was the transform method in LWJGL. This is my final working code. Maybe it could be optimized?
public static Matrix4f createRotationX(float pitch) {
    // new Matrix4f() starts out as the identity, so only the four rotation
    // entries need to be written (laid out the way XNA stores the rotation).
    Matrix4f mX = new Matrix4f();
    mX.m11 = (float) Math.cos(pitch);
    mX.m21 = (float) Math.sin(pitch);
    mX.m12 = (float) -Math.sin(pitch);
    mX.m22 = (float) Math.cos(pitch);
    return mX;
}
public static Matrix4f createRotationY(float yaw) {
    // Same idea for yaw: start from the identity and set the four Y-rotation entries.
    Matrix4f mY = new Matrix4f();
    mY.m00 = (float) Math.cos(yaw);
    mY.m20 = (float) -Math.sin(yaw);
    mY.m02 = (float) Math.sin(yaw);
    mY.m22 = (float) Math.cos(yaw);
    return mY;
}
public static Vector3f getNewTarget(Vector3f position, float pitch, float yaw) {
    Matrix4f mx = MathUtil.createRotationX(pitch);
    Matrix4f my = MathUtil.createRotationY(yaw);
    Matrix4f rotationMatrix = Matrix4f.mul(mx, my, null);
    // Embed the forward vector (0, 0, -1) in a matrix (only def.m20 = -1 is
    // non-zero) so a matrix-matrix multiply stands in for the vector transform.
    Matrix4f def = new Matrix4f();
    def.m00 = 0;
    def.m11 = 0;
    def.m22 = 0;
    def.m33 = 0;
    def.m20 = -1;
    Matrix4f v4 = Matrix4f.mul(def, rotationMatrix, null);
    return new Vector3f(v4.m00 + position.x, v4.m10 + position.y, v4.m20 + position.z);
}
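If you want to trim it, one possible simplification (same math, just noting that the def multiply above only ever produces -rotationMatrix.m02, -m12 and -m22, so those entries can be read off directly):

public static Vector3f getNewTarget(Vector3f position, float pitch, float yaw) {
    Matrix4f r = Matrix4f.mul(MathUtil.createRotationX(pitch),
            MathUtil.createRotationY(yaw), null);
    // mul(def, r) above picks out exactly these three entries.
    return new Vector3f(position.x - r.m02, position.y - r.m12, position.z - r.m22);
}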

Related

Centering the object into the 3D space with Direct3D

The idea is to present a drawn 3D object "centered" on the screen. After loading the object with WaveFrontReader I got an array of vertices:
float bmin[3], bmax[3];
bmin[0] = bmin[1] = bmin[2] = std::numeric_limits<float>::max();
bmax[0] = bmax[1] = bmax[2] = -std::numeric_limits<float>::max();
for (int k = 0; k < 3; k++)
{
    for (auto& v : objx->wfr.vertices)
    {
        if (k == 0)
        {
            bmin[k] = std::min(v.position.x, bmin[k]);
            bmax[k] = std::max(v.position.x, bmax[k]);
        }
        if (k == 1)
        {
            bmin[k] = std::min(v.position.y, bmin[k]);
            bmax[k] = std::max(v.position.y, bmax[k]);
        }
        if (k == 2)
        {
            bmin[k] = std::min(v.position.z, bmin[k]);
            bmax[k] = std::max(v.position.z, bmax[k]);
        }
    }
}
I got the idea from the Viewer in TinyObjLoader (which uses OpenGL though), and then:
float maxExtent = 0.5f * (bmax[0] - bmin[0]);
if (maxExtent < 0.5f * (bmax[1] - bmin[1])) {
    maxExtent = 0.5f * (bmax[1] - bmin[1]);
}
if (maxExtent < 0.5f * (bmax[2] - bmin[2])) {
    maxExtent = 0.5f * (bmax[2] - bmin[2]);
}
_3dp.scale[0] = maxExtent;
_3dp.scale[1] = maxExtent;
_3dp.scale[2] = maxExtent;
_3dp.translation[0] = -0.5 * (bmax[0] + bmin[0]);
_3dp.translation[1] = -0.5 * (bmax[1] + bmin[1]);
_3dp.translation[2] = -0.5 * (bmax[2] + bmin[2]);
However, this doesn't work. With an object like this spider, whose vertex coordinates don't extend past ±100, the above formula gives a scale of about 100x; yet with the current view at (0, 0, 0) the object is too close, and I have to set the Z translation manually to something like 50000 to view it in a full box with a D3D11_VIEWPORT viewport = { 0.0f, 0.0f, w, h, 0.0f, 1.0f };. The Y is not centered either.
Is there a proper algorithm to center the object in view?
Thanks a lot
You can actually change the position of the camera itself rather than the objects; OpenGL tutorials recommend adjusting the camera position. In games, the camera (which captures the viewpoint from which the rendered objects are viewed) is not placed in the middle of the scene but further away, so you can see everything going on in the view/scene.
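For reference, if I read the TinyObjLoader viewer the question borrows from correctly, it scales by 1 / maxExtent rather than by maxExtent, which would explain a 100-unit model being blown up instead of shrunk. A minimal sketch of that fit-to-unit-cube bookkeeping (plain Java with made-up bounds; the math is API-agnostic):

// Example bounds, standing in for the bmin/bmax computed from the mesh.
float[] bmin = { -80f, -5f, -60f };
float[] bmax = {  80f, 40f,  60f };
float maxExtent = 0.5f * Math.max(bmax[0] - bmin[0],
        Math.max(bmax[1] - bmin[1], bmax[2] - bmin[2]));
float scale = 1.0f / maxExtent;          // shrink into roughly [-1, 1]^3, don't enlarge
float tx = -0.5f * (bmax[0] + bmin[0]);  // move the bounding-box center to the origin
float ty = -0.5f * (bmax[1] + bmin[1]);
float tz = -0.5f * (bmax[2] + bmin[2]);
// Apply the translation first, then the scale, so the object ends up
// centered at the origin at unit size.
System.out.printf("scale=%f translate=(%f, %f, %f)%n", scale, tx, ty, tz);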

gluPartialDisk in osg/osgEarth

I've been trying to create an OpenGL gluDisk-like object in osgEarth. So far I've attempted the following (edited; this is the correct answer):
void ViewDriver::drawCircZone(double lat, double lon, double innerRadius, double outerRadius, QColor color, double beginAngle, double endAngle){
    GeometryFactory g;
    osg::ref_ptr<osgEarth::Geometry> outerCircleGeom = g.createArc(osg::Vec3d(lat, lon, 0), convertFromMetersToMercDeg(outerRadius), beginAngle, endAngle);
    osg::ref_ptr<osgEarth::Geometry> innerCircleGeom = g.createArc(osg::Vec3d(lat, lon, 0), convertFromMetersToMercDeg(innerRadius), beginAngle, endAngle);
    osg::Vec3dArray* outerCircArray = outerCircleGeom->createVec3dArray();
    osg::Vec3dArray* innerCircArray = innerCircleGeom->createVec3dArray();
    Vec3dVector* diskVec = new Vec3dVector;
    for(int i = 0; i < outerCircArray->size() - 1; i++){
        diskVec->push_back((*outerCircArray)[i]);
    }
    //This is important for closing the shape and not giving it a Pac-Man-like mouth
    diskVec->push_back((*outerCircArray)[0]);
    //This is how you make a "hole", by iterating backwards
    for(int i = innerCircArray->size() - 1; i >= 0; i--){
        diskVec->push_back((*innerCircArray)[i]);
    }
    osg::ref_ptr<osgEarth::Symbology::Ring> diskRing = new Ring(diskVec);
    diskRing->close();
    osg::ref_ptr<Feature> circFeature = new Feature(diskRing, view->getMapViewer()->geoSRS);
    Style circStyle;
    circStyle.getOrCreate<PolygonSymbol>()->outline() = true;
    circStyle.getOrCreate<PolygonSymbol>()->fill()->color() = Color(color.red()/255.0, color.green()/255.0, color.blue()/255.0, 1.0);
    circStyle.getOrCreate<AltitudeSymbol>()->clamping() = AltitudeSymbol::CLAMP_RELATIVE_TO_TERRAIN;
    circStyle.getOrCreate<AltitudeSymbol>()->technique() = AltitudeSymbol::TECHNIQUE_DRAPE;
    osg::ref_ptr<FeatureNode> circNode = new FeatureNode(circFeature, circStyle);
    circNode->setDynamic(true);
    view->getMapNode()->addChild(circNode);
}
What stumped me originally is that I don't have a lot of graphics knowledge. Somewhere I read that outlines should be drawn in a clockwise direction; points added in a counter-clockwise direction will "cut out" a hole when combined with clockwise-drawn points. When I first tested that method I was filling the "hole" outline with the smaller circle's points in a clockwise direction, which is why it didn't work.
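The winding rule is easier to see with a toy example; here is a minimal, self-contained Java sketch (made-up coordinates) of the same trick of appending the inner ring in reverse order:

import java.util.ArrayList;
import java.util.List;

public class DonutRing {
    public static void main(String[] args) {
        double[][] outer = { {0, 0}, {10, 0}, {10, 10}, {0, 10} }; // outer square
        double[][] inner = { {4, 4}, {6, 4}, {6, 6}, {4, 6} };     // inner square, same winding
        List<double[]> ring = new ArrayList<>();
        for (double[] p : outer) ring.add(p);
        ring.add(outer[0]); // close the outer loop first (no Pac-Man mouth)
        for (int i = inner.length - 1; i >= 0; i--) {
            ring.add(inner[i]); // reversed: the opposite winding cuts the hole
        }
        // 'ring' now traces the outer boundary one way and the inner boundary
        // the other way, which polygon fillers treat as a disk with a hole.
    }
}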

What's the equivalent of pmdGet3DCoordinates for Intel RealSense 3D camera (SR300)?

I am trying to port code I wrote for a PMD camera to an Intel RealSense 3D camera (SR300). However, I can't find a method in the RealSense SDK that does the same thing. Is there a hack for this, or what do you suggest?
pmdGet3DCoordinates(PMDHandle hnd, float * data, size_t size);
An example usage is:
float * cartesianDist = new float[nRows*nCols*3];
res = pmdGet3DCoordinates(hnd, cartesianDist, nCols*nRows*3*sizeof(float));
I need this function in order to create an xyz map as in the code below:
int res = pmdGet3DCoordinates(hnd, dists, 3 * numPixels * sizeof(float));
xyzMap = cv::Mat(xyzMap.size(), xyzMap.type(), dists);
I have written the following code so far; I don't know if it makes any sense in terms of the Intel RealSense SDK. Please feel free to comment.
void SR300Camera::fillInZCoords()
{
    //int res = pmdGet3DCoordinates(hnd, dists, 3 * numPixels * sizeof(float)); //store x,y,z coordinates dists (type: float*)
    ////float * zCoords = new float[1]; //store z-Coordinates of dists in zCoords
    //xyzMap = cv::Mat(xyzMap.size(), xyzMap.type(), dists);
    Projection *projection = pp->QueryCaptureManager()->QueryDevice()->CreateProjection();
    std::vector<Point3DF32> vertices;
    vertices.resize(bufferSize.width * bufferSize.height);
    projection->QueryVertices(sample->depth, &vertices[0]);
    xyzBuffer.clear();
    for (int i = 0; i < bufferSize.width*bufferSize.height; i++) {
        //cv::Point3f p;
        //p.x = vertices[i].x;
        //p.y = vertices[i].y;
        //p.z = vertices[i].z;
        //xyzBuffer.push_back(p);
        xyzBuffer.push_back(cv::Point3f(vertices[i].x, vertices[i].y, vertices[i].z));
    }
    xyzMap = cv::Mat(xyzBuffer);
    projection->Release();
}
sts = projection->QueryVertices(depthMap, &pos3D[0]); is an almost exact equivalent and does the job of converting the depth UV map to a real-world xyz map:
Status sts = sm->AcquireFrame(true);
if (sts < STATUS_NO_ERROR) {
    if (sts == Status::STATUS_STREAM_CONFIG_CHANGED) {
        wprintf_s(L"Stream configuration was changed, re-initializing\n");
        sm->Close();
    }
}
sample = sm->QuerySample();
PXCImage *depthMap = sample->depth;
renderd.RenderFrame(sample->depth);
PXCImage::ImageData depthImage;
depthMap->AcquireAccess(PXCImage::ACCESS_READ, &depthImage);
PXCImage::ImageInfo imgInfo = depthMap->QueryInfo();
int depth_stride = depthImage.pitches[0] / sizeof(pxcU16);
PXCProjection * projection = device->CreateProjection();
pxcU16 *dpixels = (pxcU16*)depthImage.planes[0];
unsigned int dpitch = depthImage.pitches[0] / sizeof(pxcU16);
PXCPoint3DF32 *pos3D = new PXCPoint3DF32[num_pixels];
sts = projection->QueryVertices(depthMap, &pos3D[0]);
if (sts < Status::STATUS_NO_ERROR) {
    wprintf_s(L"Projection was unsuccessful!\n");
    sm->Close();
}
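If you ever need to sanity-check what QueryVertices returns (or port to an SDK without it), the underlying math is the standard pinhole back-projection. A minimal sketch, assuming intrinsics fx, fy, cx, cy in pixels (these are assumptions, not values from the SDK) and depth already in the unit you want out:

/** Back-projects one depth pixel (u, v, depth) to camera-space XYZ. */
static float[] backProject(int u, int v, float depth,
                           float fx, float fy, float cx, float cy) {
    float x = (u - cx) * depth / fx; // offset from the principal point, scaled by depth
    float y = (v - cy) * depth / fy;
    return new float[] { x, y, depth };
}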

What is the correct way to create a vertex and index buffer from a physx cloth object

I'm trying to actually RENDER the cloth I created to the screen in DirectX11.
I used the PhysX API to create a cloth object and tried to create the vertex and index buffer accordingly. As far as I know the cloth object should be okay.
Here's my code. Please note that this is in a custom engine (from school), so some things might look weird (like the gameContext object, for example), but you should be able to follow the code.
I used Frank D. Luna's book Introduction to 3D Game Programming with DirectX 10 as a reference for the buffers.
// create regular mesh
PxU32 resolution = 20;
PxU32 numParticles = resolution*resolution;
PxU32 numTriangles = 2*(resolution-1)*(resolution-1);
// create cloth particles
PxClothParticle* particles = new PxClothParticle[numParticles];
PxVec3 center(0.5f, 0.3f, 0.0f);
PxVec3 delta = 1.0f/(resolution-1) * PxVec3(15.0f, 15.0f, 15.0f);
PxClothParticle* pIt = particles;
for(PxU32 i=0; i<resolution; ++i)
{
    for(PxU32 j=0; j<resolution; ++j, ++pIt)
    {
        pIt->invWeight = j+1<resolution ? 1.0f : 0.0f;
        pIt->pos = delta.multiply(PxVec3(PxReal(i), PxReal(j), -PxReal(j))) - center;
    }
}
// create triangles
PxU32* triangles = new PxU32[3*numTriangles];
PxU32* iIt = triangles;
for(PxU32 i=0; i<resolution-1; ++i)
{
    for(PxU32 j=0; j<resolution-1; ++j)
    {
        PxU32 odd = j&1u, even = 1-odd;
        *iIt++ = i*resolution + (j+odd);
        *iIt++ = (i+odd)*resolution + (j+1);
        *iIt++ = (i+1)*resolution + (j+even);
        *iIt++ = (i+1)*resolution + (j+even);
        *iIt++ = (i+even)*resolution + j;
        *iIt++ = i*resolution + (j+odd);
    }
}
// create fabric from mesh
PxClothMeshDesc meshDesc;
meshDesc.points.count = numParticles;
meshDesc.points.stride = sizeof(PxClothParticle);
meshDesc.points.data = particles;
meshDesc.invMasses.count = numParticles;
meshDesc.invMasses.stride = sizeof(PxClothParticle);
meshDesc.invMasses.data = &particles->invWeight;
meshDesc.triangles.count = numTriangles;
meshDesc.triangles.stride = 3*sizeof(PxU32);
meshDesc.triangles.data = triangles;
// cook fabric
PxClothFabric* fabric = PxClothFabricCreate(*PhysxManager::GetInstance()->GetPhysics(), meshDesc, PxVec3(0, 1, 0));
//delete[] triangles;
// create cloth
PxTransform gPose = PxTransform(PxVec3(0,1,0));
gCloth = PhysxManager::GetInstance()->GetPhysics()->createCloth(gPose, *fabric, particles, PxClothFlags(0));
fabric->release();
//delete[] particles;
// 240 iterations per second (4 per 60 Hz frame)
gCloth->setSolverFrequency(240.0f);
GetPhysxProxy()->GetPhysxScene()->addActor(*gCloth);
// CREATE VERTEX BUFFER
D3D11_BUFFER_DESC bufferDescriptor = {};
bufferDescriptor.Usage = D3D11_USAGE_DEFAULT;
bufferDescriptor.ByteWidth = sizeof( PxClothParticle* ) * gCloth->getNbParticles();
bufferDescriptor.BindFlags = D3D11_BIND_VERTEX_BUFFER;
bufferDescriptor.CPUAccessFlags = 0;
bufferDescriptor.MiscFlags = 0;
D3D11_SUBRESOURCE_DATA initData = {};
initData.pSysMem = particles;
gameContext.pDevice->CreateBuffer(&bufferDescriptor, &initData, &m_pVertexBuffer);
// BUILD INDEX BUFFER
D3D11_BUFFER_DESC bd = {};
bd.Usage = D3D11_USAGE_IMMUTABLE;
bd.ByteWidth = sizeof(PxU32) * sizeof(triangles);
bd.BindFlags = D3D11_BIND_INDEX_BUFFER;
bd.CPUAccessFlags = 0;
bd.MiscFlags = 0;
D3D11_SUBRESOURCE_DATA initData2 = {};
initData2.pSysMem = triangles;
gameContext.pDevice->CreateBuffer(&bd, &initData2, &m_pIndexBuffer);
When this is done I run this code in the "draw" part of the engine:
// Set vertex buffer(s)
UINT offset = 0;
UINT vertexBufferStride = sizeof(PxClothParticle*);
gameContext.pDeviceContext->IASetVertexBuffers( 0, 1, &m_pVertexBuffer, &vertexBufferStride, &offset );
// Set index buffer
gameContext.pDeviceContext->IASetIndexBuffer(m_pIndexBuffer, DXGI_FORMAT_R32_UINT, 0);
// Set primitive topology
gameContext.pDeviceContext->IASetPrimitiveTopology( D3D10_PRIMITIVE_TOPOLOGY_TRIANGLELIST );
auto mat = new DiffuseMaterial();
mat->Initialize(gameContext);
mat->SetDiffuseTexture(L"./Resources/Textures/Chair_Dark.dds");
gameContext.pMaterialManager->AddMaterial(mat, 3);
ID3DX11EffectTechnique* pTechnique = mat->GetDefaultTechnique();
D3DX11_TECHNIQUE_DESC techDesc;
pTechnique->GetDesc( &techDesc );
for( UINT p = 0; p < techDesc.Passes; ++p )
{
    pTechnique->GetPassByIndex(p)->Apply(0, gameContext.pDeviceContext);
    gameContext.pDeviceContext->DrawIndexed(gCloth->getNbParticles(), 0, 0 );
}
I think there's something obvious that I'm just totally missing (DirectX isn't my strongest area of programming). Every comment or answer is much appreciated.
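Not a verified fix, but two things worth double-checking in the buffer setup above: sizeof(PxClothParticle*) and sizeof(triangles) measure a pointer, not the data it points to, and DrawIndexed is passed the particle count rather than the index count. A quick bookkeeping sketch of the sizes involved (Java, just for the arithmetic; the per-particle byte size is an assumption, not an SDK-confirmed value):

int resolution   = 20;
int numParticles = resolution * resolution;                 // 400 vertices
int numTriangles = 2 * (resolution - 1) * (resolution - 1); // 722 triangles
int numIndices   = 3 * numTriangles;                        // 2166 indices

int bytesPerParticle  = 16;                               // assumed: PxVec3 (12) + float invWeight (4)
int vertexBufferBytes = numParticles * bytesPerParticle;  // size of the data, not of a pointer
int indexBufferBytes  = numIndices * 4;                   // PxU32 is 4 bytes
// DrawIndexed would conventionally be given numIndices (2166),
// not the particle count (400).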

3D picking lwjgl

I have written some code to perform 3D picking that for some reason doesn't work entirely correctly. (I'm using LWJGL, just so you know.)
This is what the code looks like:
if(Mouse.getEventButton() == 1) {
    if (!Mouse.getEventButtonState()) {
        Camera.get().generateViewMatrix();
        // note: this maps Mouse.getX() from [0, 800] to [-1, -0.5] before the aspect scale
        float screenSpaceX = ((Mouse.getX()/800f/2f)-1.0f)*Camera.get().getAspectRatio();
        float screenSpaceY = 1.0f-(2*((600-Mouse.getY())/600f));
        float displacementRate = (float)Math.tan(Camera.get().getFovy()/2);
        screenSpaceX *= displacementRate;
        screenSpaceY *= displacementRate;
        Vector4f cameraSpaceNear = new Vector4f((float) (screenSpaceX * Camera.get().getNear()), (float) (screenSpaceY * Camera.get().getNear()), (float) (-Camera.get().getNear()), 1);
        Vector4f cameraSpaceFar = new Vector4f((float) (screenSpaceX * Camera.get().getFar()), (float) (screenSpaceY * Camera.get().getFar()), (float) (-Camera.get().getFar()), 1);
        Matrix4f tmpView = new Matrix4f();
        Camera.get().getViewMatrix().transpose(tmpView);
        Matrix4f invertedViewMatrix = (Matrix4f)tmpView.invert();
        Vector4f worldSpaceNear = new Vector4f();
        Matrix4f.transform(invertedViewMatrix, cameraSpaceNear, worldSpaceNear);
        Vector4f worldSpaceFar = new Vector4f();
        Matrix4f.transform(invertedViewMatrix, cameraSpaceFar, worldSpaceFar);
        Vector3f rayPosition = new Vector3f(worldSpaceNear.x, worldSpaceNear.y, worldSpaceNear.z);
        Vector3f rayDirection = new Vector3f(worldSpaceFar.x - worldSpaceNear.x, worldSpaceFar.y - worldSpaceNear.y, worldSpaceFar.z - worldSpaceNear.z);
        rayDirection.normalise();
        Ray clickRay = new Ray(rayPosition, rayDirection);
        Vector tMin = new Vector(), tMax = new Vector(), tempPoint;
        float largestEnteringValue, smallestExitingValue, temp, closestEnteringValue = Camera.get().getFar()+0.1f;
        Drawable closestDrawableHit = null;
        for(Drawable d : this.worldModel.getDrawableThings()) {
            // Calculate AABB for each object... needs to be moved later...
            firstVertex = true;
            for(Surface surface : d.getSurfaces()) {
                for(Vertex v : surface.getVertices()) {
                    worldPosition.x = (v.x+d.getPosition().x)*d.getScale().x;
                    worldPosition.y = (v.y+d.getPosition().y)*d.getScale().y;
                    worldPosition.z = (v.z+d.getPosition().z)*d.getScale().z;
                    worldPosition = worldPosition.rotate(d.getRotation());
                    if (firstVertex) {
                        maxX = worldPosition.x; maxY = worldPosition.y; maxZ = worldPosition.z;
                        minX = worldPosition.x; minY = worldPosition.y; minZ = worldPosition.z;
                        firstVertex = false;
                    } else {
                        if (worldPosition.x > maxX) {
                            maxX = worldPosition.x;
                        }
                        if (worldPosition.x < minX) {
                            minX = worldPosition.x;
                        }
                        if (worldPosition.y > maxY) {
                            maxY = worldPosition.y;
                        }
                        if (worldPosition.y < minY) {
                            minY = worldPosition.y;
                        }
                        if (worldPosition.z > maxZ) {
                            maxZ = worldPosition.z;
                        }
                        if (worldPosition.z < minZ) {
                            minZ = worldPosition.z;
                        }
                    }
                }
            }
            // ray/slabs intersection test...
            // clickRay.getOrigin().x + clickRay.getDirection().x * f = minX
            // clickRay.getOrigin().x - minX = -clickRay.getDirection().x * f
            // clickRay.getOrigin().x/-clickRay.getDirection().x - minX/-clickRay.getDirection().x = f
            // -clickRay.getOrigin().x/clickRay.getDirection().x + minX/clickRay.getDirection().x = f
            largestEnteringValue = -clickRay.getOrigin().x/clickRay.getDirection().x + minX/clickRay.getDirection().x;
            temp = -clickRay.getOrigin().y/clickRay.getDirection().y + minY/clickRay.getDirection().y;
            if(largestEnteringValue < temp) {
                largestEnteringValue = temp;
            }
            temp = -clickRay.getOrigin().z/clickRay.getDirection().z + minZ/clickRay.getDirection().z;
            if(largestEnteringValue < temp) {
                largestEnteringValue = temp;
            }
            smallestExitingValue = -clickRay.getOrigin().x/clickRay.getDirection().x + maxX/clickRay.getDirection().x;
            temp = -clickRay.getOrigin().y/clickRay.getDirection().y + maxY/clickRay.getDirection().y;
            if(smallestExitingValue > temp) {
                smallestExitingValue = temp;
            }
            temp = -clickRay.getOrigin().z/clickRay.getDirection().z + maxZ/clickRay.getDirection().z;
            if(smallestExitingValue < temp) { // note: '<' here, while the matching y check above uses '>'
                smallestExitingValue = temp;
            }
            if(largestEnteringValue > smallestExitingValue) {
                //System.out.println("Miss!");
            } else {
                if (largestEnteringValue < closestEnteringValue) {
                    closestEnteringValue = largestEnteringValue;
                    closestDrawableHit = d;
                }
            }
        }
        if(closestDrawableHit != null) {
            System.out.println("Hit at: (" + clickRay.setDistance(closestEnteringValue).x + ", " + clickRay.getCurrentPosition().y + ", " + clickRay.getCurrentPosition().z + ")");
            this.worldModel.removeDrawableThing(closestDrawableHit);
        }
    }
}
I just don't understand what's wrong. The ray is being cast, and I do hit things that get removed, but the results are very strange: sometimes it removes the thing I'm clicking on, sometimes it removes things that aren't even close to where I clicked, and sometimes it removes nothing at all.
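One suspicion, stated as a suspicion and not a verified fix: the usual window-to-NDC mapping is 2 * x / width - 1, whereas the code above computes (Mouse.getX() / 800f / 2f) - 1, which only covers [-1, -0.5] instead of [-1, 1]. The screenSpaceY line already simplifies to the usual form (2 * y / 600 - 1). A minimal sketch of the conventional mapping, assuming LWJGL 2's bottom-left mouse origin:

/** Conventional window-to-NDC mapping (bottom-left origin, up is positive). */
static float[] toNdc(float mouseX, float mouseY, float width, float height) {
    float x = 2f * mouseX / width - 1f;   // [0, width]  -> [-1, 1]
    float y = 2f * mouseY / height - 1f;  // [0, height] -> [-1, 1]
    return new float[] { x, y };
}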
Edit:
Okay, so I have continued searching for errors, and by debugging the ray (painting small dots where it travels) I can now see that there is something obviously wrong with the ray I'm sending out: it has its origin near the world center and always shoots toward the same position no matter where I point my camera.
My initial thought is that there might be an error in the way I calculate my view matrix (since it's not possible to get the view matrix from the gluLookAt method in LWJGL, I have to build it myself, and I guess that's where the problem is)...
Edit2:
This is how I calculate it currently:
private double[][] viewMatrixDouble = {{0,0,0,0}, {0,0,0,0}, {0,0,0,0}, {0,0,0,1}};

public Vector getCameraDirectionVector() {
    Vector actualEye = this.getActualEyePosition();
    return new Vector(lookAt.x-actualEye.x, lookAt.y-actualEye.y, lookAt.z-actualEye.z);
}

public Vector getActualEyePosition() {
    return eye.rotate(this.getRotation());
}

public void generateViewMatrix() {
    Vector cameraDirectionVector = getCameraDirectionVector().normalize();
    Vector side = Vector.cross(cameraDirectionVector, this.upVector).normalize();
    Vector up = Vector.cross(side, cameraDirectionVector);
    viewMatrixDouble[0][0] = side.x; viewMatrixDouble[0][1] = up.x; viewMatrixDouble[0][2] = -cameraDirectionVector.x;
    viewMatrixDouble[1][0] = side.y; viewMatrixDouble[1][1] = up.y; viewMatrixDouble[1][2] = -cameraDirectionVector.y;
    viewMatrixDouble[2][0] = side.z; viewMatrixDouble[2][1] = up.z; viewMatrixDouble[2][2] = -cameraDirectionVector.z;
    /*
    Vector actualEyePosition = this.getActualEyePosition();
    Vector zaxis = new Vector(this.lookAt.x - actualEyePosition.x, this.lookAt.y - actualEyePosition.y, this.lookAt.z - actualEyePosition.z).normalize();
    Vector xaxis = Vector.cross(upVector, zaxis).normalize();
    Vector yaxis = Vector.cross(zaxis, xaxis);
    viewMatrixDouble[0][0] = xaxis.x; viewMatrixDouble[0][1] = yaxis.x; viewMatrixDouble[0][2] = zaxis.x;
    viewMatrixDouble[1][0] = xaxis.y; viewMatrixDouble[1][1] = yaxis.y; viewMatrixDouble[1][2] = zaxis.y;
    viewMatrixDouble[2][0] = xaxis.z; viewMatrixDouble[2][1] = yaxis.z; viewMatrixDouble[2][2] = zaxis.z;
    viewMatrixDouble[3][0] = -Vector.dot(xaxis, actualEyePosition); viewMatrixDouble[3][1] = -Vector.dot(yaxis, actualEyePosition); viewMatrixDouble[3][2] = -Vector.dot(zaxis, actualEyePosition);
    */
    viewMatrix = new Matrix4f();
    viewMatrix.load(getViewMatrixAsFloatBuffer());
}
I would be very grateful if anyone could verify whether this is right or wrong, and if it's wrong, point me to the right way of doing it.
I have read a lot of threads and documentation about this, but I can't seem to wrap my head around it.
I just don't understand what's wrong: the rays are being cast and I do hit stuff that gets removed, but things are not disappearing where I click on the screen.
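One concrete difference between the active code and the commented-out variant above: the active path never writes the translation entries (viewMatrixDouble[3][0..2] stay 0), so the view matrix rotates but never accounts for the eye position, which would fit the symptom of rays always originating near the world center. A sketch of the complete construction, using the question's own Vector helpers (treat it as a reference, not a verified drop-in):

public void generateViewMatrix() {
    Vector cameraDirectionVector = getCameraDirectionVector().normalize();
    Vector side = Vector.cross(cameraDirectionVector, this.upVector).normalize();
    Vector up = Vector.cross(side, cameraDirectionVector);
    Vector eye = this.getActualEyePosition();
    viewMatrixDouble[0][0] = side.x; viewMatrixDouble[0][1] = up.x; viewMatrixDouble[0][2] = -cameraDirectionVector.x;
    viewMatrixDouble[1][0] = side.y; viewMatrixDouble[1][1] = up.y; viewMatrixDouble[1][2] = -cameraDirectionVector.y;
    viewMatrixDouble[2][0] = side.z; viewMatrixDouble[2][1] = up.z; viewMatrixDouble[2][2] = -cameraDirectionVector.z;
    // gluLookAt's translation terms, which the active code leaves at zero:
    viewMatrixDouble[3][0] = -Vector.dot(side, eye);
    viewMatrixDouble[3][1] = -Vector.dot(up, eye);
    viewMatrixDouble[3][2] = Vector.dot(cameraDirectionVector, eye);
    viewMatrix = new Matrix4f();
    viewMatrix.load(getViewMatrixAsFloatBuffer());
}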
OpenGL is not a scene graph, it's a drawing library. So after removing something from your internal representation you must redraw the scene, and your code seems to be missing a call to whatever function triggers that redraw.
Okay, so I finally solved it with the help of the guys at gamedev and a friend; here is a link to the answer where I have posted the code!