I'm using the PhysX 3.3 SDK in an application and almost everything is working fine. However, I'm trying to develop a feature that lets users click on various actors in the scene and modify their properties.
My approach so far has been to use the PhysX raycast to query the scene and use RaycastHit to return a pointer to the actor. This works fine for rigid bodies, but for cloth actors, the hit is invariably null.
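Roughly, the query I'm running looks like this (PhysX 3.3 scene-query API; the helper name, the max distance, and how I build the ray from the mouse click are just illustrative):

#include <PxPhysicsAPI.h>
using namespace physx;

// Returns the actor under the ray, or NULL if nothing (blocking) was hit.
// 'origin' and 'unitDir' come from unprojecting the mouse click.
PxRigidActor* pickActor(PxScene* scene, const PxVec3& origin, const PxVec3& unitDir)
{
    PxRaycastBuffer hit;                           // blocking hit only
    if (scene->raycast(origin, unitDir, 1000.0f, hit) && hit.hasBlock)
        return hit.block.actor;                    // works fine for rigid bodies
    return NULL;                                   // cloth always ends up here
}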
This suggests that PhysX can't raycast against cloth objects, but other than my own tests, I haven't seen anything in the docs (or on the interwebs) that says this definitively.
Any thoughts?
I'm currently in the process of developing a very basic 3D OpenGL game in C++ as part of a small college project. We don't get a lot of guidance from the teachers, however, and we have only very limited documentation and a small timeframe, so I'm a little lost at the moment.
My game is a tank battle on an orthogonal plane that pretty much looks exactly like the image I sketched below. Each tank (A and B) can be controlled by a different player, and each one can shoot projectiles, which are supposed to influence the other tank's score upon collision.
My question is, what would be the simplest way of effectively implementing collisions for the tanks? (Tank vs tank, tank vs map boundaries and tank vs any kind of parallelepipedic object like the one in the center of the picture - and the same thing but applied to the projectiles shot from the tank turrets).
Ideally this would not require an external physics engine, but one is acceptable if it can be integrated easily. At the moment, I'm using only the GLUT library.
Download and integrate Box2D (http://box2d.org) into your project.
Unless your project is to implement a physics engine, don't bother doing it yourself. Your time will be much better spent learning how to integrate libraries and how proper physics engines work.
You can then easily use a box collider for your tanks, a circle for projectiles, and four edge shapes for your perimeter. You can create callbacks to notify you when a projectile has collided with another tank.
You will have to use forces and torques to move and rotate your tanks, rather than just updating their positions. But you would probably have to do that anyway if you were going to implement the physics yourself.
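A minimal sketch of that setup might look like this (Box2D 2.4.x C++ API; the HitListener class, sizes, and force values are just illustrative). In BeginContact you would inspect the fixtures' user data to recognise a projectile hitting a tank and update the score.

#include <box2d/box2d.h>
#include <cstdio>

// Illustrative contact listener that fires whenever two fixtures touch.
class HitListener : public b2ContactListener {
    void BeginContact(b2Contact* contact) override {
        // In a real game, check the fixtures' user data to tell
        // projectile-vs-tank apart from other contacts, then update the score.
        std::printf("collision between two fixtures\n");
    }
};

int main() {
    b2World world(b2Vec2(0.0f, 0.0f));            // top-down game: no gravity
    HitListener listener;
    world.SetContactListener(&listener);

    // Tank: dynamic body with a box fixture.
    b2BodyDef tankDef;
    tankDef.type = b2_dynamicBody;
    tankDef.position.Set(0.0f, 0.0f);
    b2Body* tank = world.CreateBody(&tankDef);
    b2PolygonShape tankBox;
    tankBox.SetAsBox(1.0f, 0.5f);                 // half-width, half-height
    tank->CreateFixture(&tankBox, 1.0f);          // density 1

    // Projectile: dynamic body with a circle fixture.
    b2BodyDef shotDef;
    shotDef.type = b2_dynamicBody;
    shotDef.position.Set(5.0f, 0.0f);
    shotDef.bullet = true;                        // better collision handling for fast objects
    b2Body* shot = world.CreateBody(&shotDef);
    b2CircleShape shotShape;
    shotShape.m_radius = 0.1f;
    shot->CreateFixture(&shotShape, 1.0f);

    // Map boundary: a static body with edge fixtures (one side shown).
    b2BodyDef wallDef;
    b2Body* wall = world.CreateBody(&wallDef);
    b2EdgeShape edge;
    edge.SetTwoSided(b2Vec2(-20.0f, -20.0f), b2Vec2(20.0f, -20.0f));
    wall->CreateFixture(&edge, 0.0f);

    // Drive the tank with forces/torques instead of setting its position.
    tank->ApplyForceToCenter(b2Vec2(10.0f, 0.0f), true);
    tank->ApplyTorque(2.0f, true);

    // Fixed-timestep simulation loop (60 Hz).
    for (int i = 0; i < 60; ++i)
        world.Step(1.0f / 60.0f, 8, 3);           // velocity/position iterations

    std::printf("tank position: (%f, %f)\n",
                tank->GetPosition().x, tank->GetPosition().y);
}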
I am looking for a way to create an updating texture from an iOS view in an OpenGL context. I have seen Render contents of UIView as an OpenGL texture, but this is quite slow, as it requires the whole view to be rerendered every time it changes. This means web views in particular are hit very hard, as the whole page needs blitting around. This breaks animations in web views and makes everything very static. I was wondering if there is a technique using some other APIs in iOS that would enable a link to be created between a view and a texture (much like video textures do).
This seems to be a fundamental requirement of OS display composition, but it feels like it always happens under the covers and is not exposed to developers. Perhaps I am wrong!
Bonus points for anyone that can tell me if any other OSes support this feature.
Take a look at the RosyWriter sample project from Apple.
It uses CVOpenGLESTextureCache to improve the performance of rendering camera frames as OpenGL textures.
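The core of that approach looks roughly like this. It is only a sketch in an Objective-C++ (.mm) file, assuming you already have an EAGLContext and receive a BGRA CVPixelBufferRef for each new frame; InitTextureCache, MakeTextureFromPixelBuffer and EndFrame are made-up helper names.

#import <CoreVideo/CoreVideo.h>
#import <CoreVideo/CVOpenGLESTextureCache.h>
#import <OpenGLES/ES2/gl.h>
#import <OpenGLES/ES2/glext.h>

static CVOpenGLESTextureCacheRef gTextureCache = NULL;

// Create the cache once, bound to the rendering context.
bool InitTextureCache(CVEAGLContext eaglContext)
{
    CVReturn err = CVOpenGLESTextureCacheCreate(kCFAllocatorDefault, NULL,
                                                eaglContext, NULL, &gTextureCache);
    return err == kCVReturnSuccess;
}

// Wrap a BGRA pixel buffer as a GL texture without copying the pixels.
GLuint MakeTextureFromPixelBuffer(CVPixelBufferRef pixelBuffer,
                                  CVOpenGLESTextureRef *outTexture)
{
    CVReturn err = CVOpenGLESTextureCacheCreateTextureFromImage(
        kCFAllocatorDefault, gTextureCache, pixelBuffer, NULL,
        GL_TEXTURE_2D, GL_RGBA,
        (GLsizei)CVPixelBufferGetWidth(pixelBuffer),
        (GLsizei)CVPixelBufferGetHeight(pixelBuffer),
        GL_BGRA_EXT, GL_UNSIGNED_BYTE, 0, outTexture);
    if (err != kCVReturnSuccess)
        return 0;

    GLuint name = CVOpenGLESTextureGetName(*outTexture);
    glBindTexture(CVOpenGLESTextureGetTarget(*outTexture), name);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    return name;
}

// After drawing, release the texture and flush so recycled buffers don't pile up.
void EndFrame(CVOpenGLESTextureRef texture)
{
    if (texture)
        CFRelease(texture);
    CVOpenGLESTextureCacheFlush(gTextureCache, 0);
}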
I've created a ParticleSystem in Ogre so that my object emits, say, a lot of stars.
My question is: how can I make these stars interact with the environment and with the other objects in the scene? And, more importantly, can I do this with a ParticleSystem?
Any help will be appreciated!
Update
I'm trying to use this inside my particle file:
affector DeflectorPlane {
    plane_point   0 0 0    // a point on the deflecting plane
    plane_normal  0 1 0    // plane normal; particles bounce off this side
    bounce        1.0      // 1.0 = fully elastic bounce
}
A DeflectorPlane supports, as the name suggests, only a single plane off which particles can bounce (see the entry in the Ogre manual).
Having particles bounce off arbitrary surfaces involves a lot of heavy collision detection and is therefore a task that belongs to a physics engine, not a rendering engine; hence Ogre3D has no out-of-the-box support for this requirement.
There are, however, four existing Ogre3D physics-engine wrappers to choose from: Newton, Bullet, PhysX and ODE. Each wrapper has its own dedicated section in the Ogre Addons forum with further information and links.
I'm new to graphics, and I have to make a model of a building for an assignment using only GLUT or OpenGL.
Basically, a model of the school building (only the exterior portion) has to be made, and I have no clue where to start. Up to now I have drawn polygons and other shapes using GLUT, but nothing involving multiple shapes. All my drawing so far has used lines, points, or polygons and mathematics.
Could you please give me an idea of how to go about it?
Update: I just want to know what steps I can follow to get it done. Some reference links would be awesome!
You could use a modeling program to create your model, export it in a format such as COLLADA, and then load that into OpenGL.
The problem with hand-coding a complex object like that is that it takes a great number of lines of code just to define the vertices of the object.
People usually build complex 3D objects in 3D modeling software such as Maya, 3ds Max, or Blender, and then export them in a format that can be read into an OpenGL application.
Think about what you want your building to look like, and think about what kind of triangles you need to render in order to make that. You can either draw the entire thing in some sort of modelling software, and then import it into OpenGL, or you can come up with the triangles/textures yourself and do it by hand in OpenGL.
The exterior of the building will probably have a similar texture on the whole thing (brick, etc), and then there will be windows, doors, and a roof. Maybe some sort of sign that says "School Building". Take this all into account, what exactly you want your building to look like, and then think about what textures you will need to draw these things.
For example, say you're doing a brick building that is in the shape of a box, with a door and a few windows. I'd use one texture for the brick, and first draw an entire wall of brick. Then, I'd use a grey/blue looking texture for the window, and draw it over the brick wall. Then I'd do the same (different texture) for the door.
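To make that concrete, here is a rough sketch of the idea in the legacy fixed-function OpenGL typically used with GLUT. The texture IDs are assumed to have been created elsewhere (glGenTextures/glTexImage2D), and the function names and sizes are just illustrative:

#include <GL/glut.h>

// Draw one axis-aligned textured quad in the z = depth plane.
static void drawTexturedQuad(GLuint textureId, float x, float y, float depth,
                             float width, float height)
{
    glBindTexture(GL_TEXTURE_2D, textureId);
    glBegin(GL_QUADS);
    glTexCoord2f(0.0f, 0.0f); glVertex3f(x,         y,          depth);
    glTexCoord2f(1.0f, 0.0f); glVertex3f(x + width, y,          depth);
    glTexCoord2f(1.0f, 1.0f); glVertex3f(x + width, y + height, depth);
    glTexCoord2f(0.0f, 1.0f); glVertex3f(x,         y + height, depth);
    glEnd();
}

// One wall of the building: brick first, then a window and a door on top of it.
void drawFrontWall(GLuint brickTexture, GLuint windowTexture, GLuint doorTexture)
{
    glEnable(GL_TEXTURE_2D);
    drawTexturedQuad(brickTexture,  0.0f, 0.0f, 0.00f, 10.0f, 4.0f);
    drawTexturedQuad(windowTexture, 2.0f, 1.5f, 0.01f,  1.5f, 1.5f);  // slightly in front to avoid z-fighting
    drawTexturedQuad(doorTexture,   6.0f, 0.0f, 0.01f,  1.2f, 2.5f);
    glDisable(GL_TEXTURE_2D);
}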
Just think about the design, and then just try things out - experiment. Good luck!
I once had a similar homework assignment. I did it by creating the models with Google SketchUp, then exporting them to a .3ds file and using my own program to render it.
I chose Google SketchUp because it's the easiest to use among the tools I tried. Plus, they had a discount for students. You could also use Blender, which is free but takes too much time to learn, IMHO. 3ds Max is too expensive just for a homework assignment.
To load the model into my program, I used the Assimp library.
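Loading a .3ds file with Assimp only takes a few lines. A rough sketch (the file name and post-processing flags are just examples):

#include <assimp/Importer.hpp>
#include <assimp/scene.h>
#include <assimp/postprocess.h>
#include <cstdio>

int main()
{
    Assimp::Importer importer;
    const aiScene* scene = importer.ReadFile(
        "school.3ds",
        aiProcess_Triangulate | aiProcess_GenSmoothNormals | aiProcess_FlipUVs);
    if (!scene) {
        std::printf("load failed: %s\n", importer.GetErrorString());
        return 1;
    }

    // Each mesh holds vertices/normals plus indexed triangle faces;
    // feed these into your OpenGL draw calls or vertex buffers.
    for (unsigned m = 0; m < scene->mNumMeshes; ++m) {
        const aiMesh* mesh = scene->mMeshes[m];
        std::printf("mesh %u: %u vertices, %u faces\n",
                    m, mesh->mNumVertices, mesh->mNumFaces);
    }
    return 0;
}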
I want to build a game with some simple effects.
I want to add the warping effect that you see in games like geometry wars and geodefence. I know how to implement this effect in OpenGL ES. Would I be able to add this to a Cocos2D created app?
I want to have a 3D model that only moves on a 2D plane. It may rotate. First, can I add OpenGL shading to the model? Second, can I have Box2D physics applied to it like it was a 2D sprite?
That's about it. Those are the main features I'm hoping to add to a Cocos2D application, and I'm trying to figure out whether I can before spending a lot of time learning the engine.
1) Yes, you can intermix Cocos2D and OpenGL ES - you can override CCNode's "draw" method and do just about anything you'd like in there (such as rotating, scaling, etc. of the texture in OpenGL).
2) Yes, you can add the model and you can shade it. If you create the Box2D body fixtures for the model and treat it as if it were a 2D sprite (with a set width/height), then yes, you can use Box2D - but understand that it will only react within the 2D physics world and won't have any depth applied to it.
It should be noted, though, that while these things are possible, you will still need to implement the code to do so on your own.