I am trying to parse line items from an .mtl file and use the values as parameters to OpenGL functions.
I could apply the ambient (Ka), specular (Ks) and diffuse (Kd) values using glMaterialfv, but I don't know what to do with the Ni (optical density), d (dissolve) and illum (illumination model) values given in the .mtl file.
Which OpenGL functions should be used with these values?
Any help with these line items?
....
Ni 1.000000
d 1.000000
illum 2
...
Dissolve means transparency: 1.0 means a fully opaque object, 0.0 means fully transparent. You can control the rendering of transparent objects using functions like glBlendFunc().
For a full definition of mtl files, including illum, please see http://people.sc.fsu.edu/~jburkardt/data/mtl/mtl.html.
Ni (the index of refraction, i.e. optical density) has no counterpart in fixed-function OpenGL and can be ignored.
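For d, a minimal fixed-function sketch could look like this (kd and d are assumed to be the values your parser read from the .mtl file):
glEnable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
// Put the dissolve value into the alpha channel of the diffuse material;
// fixed-function lighting takes the fragment alpha from GL_DIFFUSE.
GLfloat diffuse[4] = { kd[0], kd[1], kd[2], d };
glMaterialfv(GL_FRONT, GL_DIFFUSE, diffuse);
Remember to draw transparent objects after the opaque ones, sorted back to front.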
According to my recollection, once an object or a scene was described in POV-Ray, there was no output other than the generated rendering. In other words, once exported into *.pov, you were no longer able to convert it into another 3D file format.
Hence I was very surprised to learn about pov2mesh, which aims to generate a point cloud that, with the help of meshlab, is eventually suitable for 3D printing.
As I have a number of scenes defined only as *.pov, describing molecules (so, spheres and sticks) and colour-coded molecular surfaces from computation, I wonder if there is a way to convert / rewrite such a scene into a format like VRML 2.0, preserving both their shape and colour.
Performing the computation again and saving the result directly as VRML is not an option, as besides the binary output understood by the software, the only save formats offered are *.png and *.pov.
Or is there a POV-Ray editor that is able to understand a *.pov produced by other software and offers to export the scene as *.vrml (or a different 3D file format)?
I don't think there is an editor that converts from .pov to .vrml, but both formats are text based. Since your .pov file is only made of spheres and cylinders, you could convert it by hand, or write a simple program to do it for you. Here is a red sphere in POV-Ray (http://www.povray.org/documentation/view/3.6.2/283/):
sphere {
    <0, 0, 0>, 1
    pigment {
        color rgb <1, 0, 0>
    }
}
I don't know much about VRML, but this should be the equivalent (found here: https://www.siggraph.org/special-projects/com97/vrmlexample1.html):
Shape {
    appearance Appearance {
        material Material {
            diffuseColor 1.0 0.0 0.0
            transparency 0.0
        }
    }
    geometry Sphere {
        radius 1.0
    }
}
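If you want to automate the conversion, here is a rough sketch of such a program. It assumes every sphere has been flattened onto a single line of the exact form shown above (sphere{ <x, y, z>, r pigment{ color rgb <r, g, b> } }); real POV-Ray syntax is far more flexible, so treat this as a starting point, not a finished converter:
#include <cstdio>
#include <fstream>
#include <iostream>
#include <string>

int main(int argc, char** argv)
{
    if (argc < 2) return 1;
    std::ifstream in(argv[1]);
    std::string line;
    std::cout << "#VRML V2.0 utf8\n";
    while (std::getline(in, line)) {
        double x, y, z, r, cr, cg, cb;
        // Assumed one-sphere-per-line layout; adapt to your exporter's output.
        if (std::sscanf(line.c_str(),
                " sphere{ <%lf, %lf, %lf>, %lf pigment{ color rgb <%lf, %lf, %lf> } }",
                &x, &y, &z, &r, &cr, &cg, &cb) == 7) {
            // VRML's Sphere node has no position of its own, so wrap it in a Transform.
            std::cout << "Transform {\n"
                      << "  translation " << x << ' ' << y << ' ' << z << "\n"
                      << "  children Shape {\n"
                      << "    appearance Appearance { material Material { diffuseColor "
                      << cr << ' ' << cg << ' ' << cb << " } }\n"
                      << "    geometry Sphere { radius " << r << " }\n"
                      << "  }\n"
                      << "}\n";
        }
    }
}
Cylinders can be handled the same way, although VRML's Cylinder is aligned to its local Y axis, so you would also need a rotation in the Transform.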
I would very much like to create a repeating texture on a 3D object.
I tried exporting from Maya to .obj. The material file (.mtl) looks like this:
newmtl lambert10SG
illum 4
Kd 0.00 0.00 0.00
Ka 0.00 0.00 0.00
Tf 1.00 1.00 1.00
map_Kd -s 0.1 0.1 grass.jpg
Ni 1.00
the line "map_Kd -s 0.1 0.1 grass.jpg" should indicate that the texture is repeating. However this doesn't work at all. The texture doesn't show until I remove "-s 0.1 0.1". Then it gets stretched.
I tried exporting to .fbx and then convert to .c3b. Same result. Texture gets stretched.
Then I tried creating my own texture. I know that in OpenGL I would have to set texture coordinates to >1 to make the texture repeat itself. These seems to be equivalent to maxS and maxT in the texture(?).
This is my texture setup:
cocos2d::Image *textImage = new (std::nothrow) cocos2d::Image();
textImage->initWithImageFile("grass.jpg");
cocos2d::Texture2D *texture = new (std::nothrow) cocos2d::Texture2D();
texture->initWithImage(textImage);

// Wrap the texture in both directions instead of clamping it.
cocos2d::Texture2D::TexParams texParam;
texParam.wrapS = GL_REPEAT;
texParam.wrapT = GL_REPEAT;
texParam.minFilter = GL_LINEAR;
texParam.magFilter = GL_LINEAR;
texture->setTexParameters(texParam);

// Intended to make the texture repeat 10 times in each direction.
texture->setMaxS(10.0f);
texture->setMaxT(10.0f);

sprite->getMesh()->setTexture(texture);
Texture is still stretching.
From searching the internet, it seems I would be able to set texture coordinates on a 2D sprite in Cocos with the setTextureRect function. However, this doesn't seem to exist for Sprite3D.
Any ideas will be very much appreciated!
UPDATE:
I managed to get the texture tiling by editing the .obj file manually.
Obviously the CCObjLoader doesn't understand this line in the material file (.mtl):
map_Kd -s 0.1 0.1 grass.jpg
Removing "-s 0.1 0.1" makes the loader recognize the texture (still stretched, though).
After this I had to manually change all vt coordinates in the .obj file by multiplying them by 10. Still the texture didn't repeat, until I changed the texture parameters to GL_REPEAT instead of GL_CLAMP_TO_EDGE.
cocos2d::Texture2D::TexParams texParam;
texParam.wrapS = GL_REPEAT;
texParam.wrapT = GL_REPEAT;
texParam.minFilter = GL_LINEAR;
texParam.magFilter = GL_LINEAR;
sprite->getMesh()->getTexture()->setTexParameters(texParam);
This is not a solution to my problem as such, as I would need the app to recognize automatically when a texture should repeat.
I haven't yet deciphered where texture coordinates are kept in the cocos2d structure, hence I haven't been able to change them after the sprite has been loaded. A solution could be to fix the objLoader, but such a patch would not survive cocos updates well. Or maybe write a small .obj file fixer. Neither of these seems an ideal solution...
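For what it's worth, such an .obj fixer is at least easy to script. A minimal sketch, assuming the tiling factor is known up front (10 here, the inverse of the "-s 0.1 0.1" option) and with hypothetical file names:
#include <fstream>
#include <sstream>
#include <string>

int main()
{
    std::ifstream in("model.obj");        // hypothetical input file
    std::ofstream out("model_fixed.obj"); // hypothetical output file
    const float scale = 10.0f;            // inverse of the -s 0.1 0.1 option
    std::string line;
    while (std::getline(in, line)) {
        if (line.rfind("vt ", 0) == 0) {  // texture coordinate line
            std::istringstream ss(line.substr(3));
            float u = 0.0f, v = 0.0f;
            ss >> u >> v;                 // note: vt may also carry a third value
            out << "vt " << u * scale << " " << v * scale << "\n";
        } else {
            out << line << "\n";          // copy everything else through
        }
    }
}
Combined with GL_REPEAT this reproduces the manual fix, but it doesn't solve the underlying loader limitation.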
I'm trying to import *.x files into my engine and animate them using OpenGL (without shaders for now, but that isn't really relevant right now). I've found the format reference at MSDN, but it doesn't help much with this problem.
So, basically, I've created a file containing a simple animation of a demon-like being with 7 bones (main, 2 for the tail, and 4 for the legs), of which only 2 (the ones in the right leg) are animated at the moment. I've tested the mesh in the DXViewer, and it seems to work perfectly there, so the problem must be on my side of the code.
When I export the mesh, I get a file containing lots of information, from which there are 3 important places for the skeletal animation (all the below matrices are for the RLeg2 bone):
SkinWeights - matrixOffset
-0.361238, -0.932141, -0.024957, 0.000000,
0.081428, -0.004872, -0.996669, 0.000000,
0.928913, -0.362066, 0.077663, 0.000000,
0.139213, -0.057892, -0.009323, 1.000000
FrameTransformMatrix
0.913144, 0.000000, -0.407637, 0.000000,
0.069999, 0.985146, 0.156804, 0.000000,
0.401582, -0.171719, 0.899580, 0.000000,
0.000000, -0.000000, 0.398344, 1.000000
AnimationKey matrix in bind pose
0.913144, 0.000000, -0.407637, 0.000000,
0.069999, 0.985146, 0.156804, 0.000000,
0.401582, -0.171719, 0.899580, 0.000000,
0.000000, -0.000000, 0.398344, 1.000000
My question is: what exactly do I do with these matrices? I've found an equation on the Newcastle University site (http://research.ncl.ac.uk/game/mastersdegree/graphicsforgames/skeletalanimation/), but there is only one matrix there. The question is, how do I combine these matrices to get the vertex transform matrix?
This post does not pretend to be a full answer, but rather a set of helpful links.
How to get all information needed for animation
The question is how you import your mesh, and why you do it that way. You can fight with .x meshes for months, but that doesn't make much sense, because .x is a very basic, old and really not good enough format. You won't find many fans of the .x format on StackOverflow. =)
The .x file stores animation data in a tricky way. It was intended to be loaded via a set of D3DX*() functions. But to get bones and weights from it manually, you must preprocess the loaded data. That is a lot of things to code. Here is a big post explaining how:
Loading and displaying .X files without DirectX
A good way to do things is to just switch to some mesh loading library. The most popular and universal one is Assimp. At least look at their docs and/or source code to see how they handle loading and preprocessing, and what they have as output. Also, here is a good explanation:
Tutorial 38 - Skeletal Animation With Assimp
So, with Assimp you can stop fighting and begin animating right now. And maybe later, once you have an idea of how it all works, you can write your own loader.
When you've got all information needed for animation
Skeletal animation is a basic topic that is explained in detail all around the web.
You can find a basic vertex shader for animation here:
OpenGL Wiki: Skeletal Animation
And here is an explanation of how it all works (but implemented in fixed-function style): Basic Bones System
Hope it helps!
Since Drop provided links that talk about the problem and give clues on how to solve it, but don't quite provide a simple answer, I feel obliged to leave the solution here, in case someone else stumbles on the same problem.
To get the new vertex position in "bind pose"
v'(i) = v(i) * Σ( transform(bone) * W(bone, i) )
where:
v'(i) - new vertex position,
v(i) - old vertex position, and
W(bone, i) - weight of the transformation
(and of course Σ is the sum over all the bones in the skeleton).
The transform(bone) is equal to sw(bone) * cM(bone), where sw is the matrix found inside the SkinWeights tag, and cM(bone) is calculated using a recursive function:
Matrix cM(const Bone* bone)
{
    if (bone->parent)
        return bone->localTransform * cM(bone->parent);
    else
        return bone->localTransform;
}
The localTransform is the matrix located inside the FrameTransformMatrix tag.
To get the new vertex position in a certain animation frame
Do the exact same operation as mentioned above, but instead of the matrix in FrameTransformMatrix, use one of the matrices inside the appropriate AnimationKey tag. Note that while an animation is playing, the matrix inside the FrameTransformMatrix tag becomes unused, which means you'll probably end up ignoring it most of the time.
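Putting it all together, the per-vertex loop might look roughly like this (Matrix, Vector3, Vertex and Bone are placeholder types standing in for your own math and scene classes, and the row-vector multiplication order follows the convention above):
// Sketch of v'(i) = v(i) * Σ( transform(bone) * W(bone, i) ).
Vector3 skinVertex(const Vector3& v, const Vertex& vert)
{
    Vector3 result(0.0f, 0.0f, 0.0f);
    for (size_t k = 0; k < vert.bones.size(); ++k) {
        const Bone* bone = vert.bones[k];
        // skinOffset is sw(bone) from the SkinWeights tag; cM() is the
        // recursive function above (using AnimationKey matrices during
        // playback instead of the FrameTransformMatrix values).
        Matrix transform = bone->skinOffset * cM(bone);
        result += (v * transform) * vert.weights[k];
    }
    return result;
}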
I was studying Perlin's noise through some examples at http://dindinx.net/OpenGL/index.php?menu=exemples&submenu=shaders and couldn't help noticing that his make3DNoiseTexture() in perlin.c uses noise3(ni) instead of PerlinNoise3D(...).
Now why is that? Isn't Perlin's noise supposed to be a summation of different noise frequencies and amplitudes?
Question 2 is: what do ni, inci, incj, inck stand for? Why use ni instead of x,y coordinates? And why is ni incremented with
ni[0]+=inci;
inci = 1.0 / (Noise3DTexSize / frequency);
I see Hugo Elias created his Perlin2D with x,y coordinates, and so does PerlinNoise3D(...).
Thanks in advance :)
I now understand why, and I'm going to answer my own question in the hope that it helps other people.
Perlin's noise is actually a synthesis of gradient noises. In its production process, for each corner of the lattice cell containing the input point, we compute the dot product of the randomly generated gradient vector at that corner with the vector pointing from the corner to the input point itself.
Now, if the input point were a whole number, such as the x,y,z coordinates of a texture you want to create, the dot product would always return 0, which would give you flat noise. So instead we use ni, stepped by inci, incj, inck, as an alternative index. Yep, just an index, nothing else.
Now returning to question 1, there are two methods to implement Perlin's noise:
1. Calculate the noise values separately and store them in the RGBA slots of the texture.
2. Synthesize the noise beforehand and store it in one of the RGBA slots of the texture.
noise3(ni) is the actual implementation of method 1, while PerlinNoise3D(...) suggests the latter.
In my personal opinion, method 1 is much better, because you have much more flexibility over how you use each octave in your shaders.
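For comparison, the up-front synthesis of method 2 is just the classic octave sum. A minimal sketch, assuming the noise3() primitive from the reference perlin.c (which takes a 3-component float array and returns values roughly in [-1, 1]):
// Sum several octaves of gradient noise: each octave varies twice as fast
// and contributes half as much as the previous one.
float fbm3(float x, float y, float z, int octaves)
{
    float sum = 0.0f, amplitude = 1.0f, frequency = 1.0f, norm = 0.0f;
    for (int i = 0; i < octaves; ++i) {
        float p[3] = { x * frequency, y * frequency, z * frequency };
        sum += amplitude * noise3(p);
        norm += amplitude;
        amplitude *= 0.5f;
        frequency *= 2.0f;
    }
    return sum / norm; // normalize back into noise3()'s output range
}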
My guess on the reason for using noise3(ni) in make3DNoiseTexture() instead of PerlinNoise3D(...) is that when you use that noise texture in your shader, you want to be able to replicate and modify the functionality of PerlinNoise3D(...) directly in the shader.
My guess for the reasoning behind ni, inci, incj, inck is that using the x,y,z of the volume directly doesn't give a good result, so by scaling the noise with the frequency instead, it is possible to adjust the resolution of the noise independently of the volume size.
I have created an OBJ loader that can import .obj files exported from 3ds Max into my simple OpenGL viewer / app. At the heart of this is a Vector3.h written with the help of some tutorials.
It worked great on a couple of models I used, but the one I want to work with has something different that wasn't accounted for: it has 4 points in some of its faces instead of 3. Here is a sample of the lines I am working with:
g Box02
usemtl Wood_Bark
s 4
f 1/1/1 2/2/1 3/3/1 4/4/2
f 4/1/3 3/2/3 5/3/3
The first 'f' line has 4 vertices I am interested in. My Vector3.h takes X, Y, Z. In the other models I had, all lines were like the second 'f' line, with only 3 elements. I am getting a vertex out of range, and when I went to check where it was happening, I saw it was on this line, so I assumed it was because there is more data on the line than can be handled. Here is the entire Vector3.h:
http://pastebin.com/dgGSBSFe
And this is the line of code that fails. vertices is a vector of Vector3:
tempVertices.push_back ( vertices[--vertex] );
My question is: what is the 4th point? How would you account for that in something like my Vector3.h file? It seems like I need to create a Vector4.h and ignore the 4th value when there are only 3 on the line. But I would like to know more about what I am dealing with, and any tips on how to handle it. Is the 4th element an alpha or something? How should it be used, or should it be used at all in my calculations in Vector3.h?
A face with four points is called a quad. Usually if you want to render it you should break it up into two triangles.
So for example, you have this:
 ___
|   |
|   |
|___|
You need to turn it into this:
 ___
|\  |
| \ |
|__\|
Assuming the vertices go counter-clockwise (the default in OpenGL), starting from the upper left, you could make two triangles. The first triangle's vertices would be the quad's first, second, and third vertices. The second triangle's vertices would be the quad's third, fourth, and first vertices.
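In loader code this is usually done as a triangle fan, which also handles faces with more than four vertices. A sketch, with faceIndices and triangles as illustrative names for the parsed index list of one 'f' line and your output index buffer:
// Fan-triangulate one face: (0,1,2), (0,2,3), etc. This is valid for any
// convex polygon, including the quads in this file.
for (size_t i = 1; i + 1 < faceIndices.size(); ++i) {
    triangles.push_back(faceIndices[0]);
    triangles.push_back(faceIndices[i]);
    triangles.push_back(faceIndices[i + 1]);
}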
Why don't you export as triangles only? Convert to Editable Mesh in 3ds Max, make all edges visible, and export. Or simply use the appropriate OBJ export option.