I am using Qt3D (5.11), and I am experiencing asserts when I try to set a QParameter value to be used by a custom fragment shader. Here is the section of code that doesn't seem to be working:
auto entity = new Qt3DRender::QEntity( mRootEntity );
auto material = new Qt3DRender::QMaterial( entity );
// Set up the custom geometry and material for the entity, which works
// fine in tests as long as the fragment shader does not use texture mapping
auto image = new Qt3DRender::QTextureImage( entity );
image->setSource( QString( "qrc:/image.png" ) );
auto texture = new Qt3DRender::QTexture2D( entity );
texture->addTextureImage( image );
material->addParameter( new QParameter( "imageTexture", texture, entity ) );
I've included only the bits of code needed for this question.
Is this a valid way to set a simple texture parameter? If not, what am I missing to set a simple image?
Note that qrc:/image.png is a 256x256 image that I have used elsewhere in this project without a problem.
The code compiles fine, but when I run it I get an assert with the following message: ASSERT: "texture" in file texture\textureimage.cpp, line 94
I am using VS2017 on Windows 10 with Qt 5.11.
I stumbled upon the issue. Parenting the QTextureImage to entity leads to the assert. Leaving off the parent completely (effectively setting it to nullptr) or parenting it to texture fixes the issue.
Here is some working code:
auto texture = new Qt3DRender::QTexture2D( entity );
auto image = new Qt3DRender::QTextureImage( texture );
image->setSource( QString( "qrc:/image.png" ) );
texture->addTextureImage( image );
material->addParameter( new QParameter( "imageTexture", texture, entity ) );
This is probably a bug? If someone knows why the QTextureImage cannot be safely parented to entity, please add a comment.
I am coding a 2D game using DirectX11 and DirectXTK.
I wrote a Framework class that initializes both the window displayed for the game and DirectX. These initializations work correctly. Then I decided to draw some backgrounds and other elements in the window, but after a while it exits on an exception. I wrapped the code in a try{ ... } catch(){ } block, which tells me that "Texture cannot be null". However, I could not find which texture it is talking about, even by debugging and checking all the values.
I decided to separate the different elements I was drawing in the window to see where the problem might come from, so now I have three draw methods:
Draw(DWORD &elapsedTime);
DrawBackground(DWORD &elapsedTime);
DrawCharacter(DWORD &elapsedTime);
The Draw(DWORD &elapsedTime) method calls both the DrawBackground() and DrawCharacter() methods.
Here is my Draw method:
void Framework::Draw(DWORD * elapsedTime)
{
    // Clearing the Back Buffer
    immediateContext->ClearRenderTargetView(renderTargetView, Colors::Aquamarine);

    // Clearing the depth buffer to max depth (1.0)
    immediateContext->ClearDepthStencilView(depthStencilView, D3D11_CLEAR_DEPTH, 1.0f, 0); // immediateContext is a ID3D11DeviceContext*

    CommonStates states(d3dDevice); // d3dDevice is a ID3D11Device*
    sprites.reset(new SpriteBatch(immediateContext));
    sprites->Begin(SpriteSortMode_Deferred, states.NonPremultiplied());

    DrawBackground1(elapsedTime);
    DrawCharacter(elapsedTime);

    sprites->End();

    // Presenting the back buffer to the front buffer
    swapChain->Present(0, 0);
}
By debugging I am almost sure that the exception comes from both DrawBackground() and DrawCharacter(). Indeed, when I comment them out in the Draw method I get no error, but as soon as I put one back in, the exception is thrown after displaying what I want for a few seconds.
Here is the DrawBackground() method, for example:
void Framework::DrawBackground1(DWORD * elpasedTime)
{
    RECT *try1 = new RECT();
    try1->bottom = 0; try1->left = 0; try1->right = (int)WIDTH; try1->bottom = (int)HEIGHT;

    ID3D11ShaderResourceView * texture2 = nullptr;
    ID3D11ShaderResourceView * textureRV = nullptr;
    CreateDDSTextureFromFile(d3dDevice, L"../Images/backgrounds/set2_background.dds", nullptr, &textureRV);
    CreateDDSTextureFromFile(d3dDevice, L"../Images/backgrounds/set3_tiles.dds", nullptr, &texture2);

    sprites->Draw(textureRV, XMFLOAT2(0, 0), try1, Colors::White);
    sprites->Draw(texture2, XMFLOAT2(0, 0), try1, Colors::CornflowerBlue);
}
So as soon as I uncomment this method (or DrawCharacter(), which follows the same steps), the window displays what I expect for a few seconds, but then I get the exception "Texture cannot be null". I also noticed that DrawCharacter() keeps the window displaying correctly for longer than DrawBackground(), whose textures are much bigger than the character's.
I'm not sure if this information is useful, but maybe this is linked to the size of the textures?
Do you notice anything I did wrong in this code? Why would a texture be considered null when it does display for a while? I've been looking for answers for a few hours now; some help would be amazing, please!
Thank you
I noticed that you create two new ID3D11ShaderResourceView objects every frame without Release-ing the old ones. You could try creating the shader resource views only once and storing them as member (or global) variables, or you could try ->Release()-ing them after the sprites->Draw(...) calls.
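For illustration, here is a minimal sketch of the load-once approach. The member name mBackgroundSRV, the LoadTextures() function, and the use of Microsoft::WRL::ComPtr with DirectXTK's DDSTextureLoader header are my assumptions, not part of the original code; it reuses the existing d3dDevice, sprites, WIDTH, and HEIGHT from the question. Loading once also avoids re-reading the files from disk every frame.

#include <wrl/client.h>         // Microsoft::WRL::ComPtr
#include "DDSTextureLoader.h"   // DirectX::CreateDDSTextureFromFile (DirectXTK)

using Microsoft::WRL::ComPtr;

// Hypothetical Framework member, shown at file scope for brevity.
// Created once during initialization and reused every frame; ComPtr
// releases the view automatically when it goes out of scope.
ComPtr<ID3D11ShaderResourceView> mBackgroundSRV;

void Framework::LoadTextures()
{
    // Load the DDS file once instead of in every Draw call.
    DirectX::CreateDDSTextureFromFile(
        d3dDevice,
        L"../Images/backgrounds/set2_background.dds",
        nullptr,
        mBackgroundSRV.GetAddressOf());
}

void Framework::DrawBackground1(DWORD * elapsedTime)
{
    // Region of the texture to draw, as in the original code: left, top, right, bottom.
    RECT region = { 0, 0, (LONG)WIDTH, (LONG)HEIGHT };

    // Reuse the cached shader resource view every frame.
    sprites->Draw(mBackgroundSRV.Get(), XMFLOAT2(0, 0), &region, Colors::White);
}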
I use VTK 6.2 with C++ (gcc 4.7.2) on Linux and I have the following VTK pipeline set up (please ignore the implementation details and focus on the pipeline: cone->filter->mapper->actor):
// cone/initialize
vtkConeSource cone;
// add cone(s) to filter
vtkAppendFilter filter;
filter.AddInputData(cone.GetOutput());
// add filter to mapper
vtkDataSetMapper mapper;
mapper.SetInputData(filter.GetOutput());
// actor
vtkActor actor;
actor.SetMapper(mapper);
The scene renders fine.
The Problem
I want to update the original data (i.e. the cones) and the actor to be rendered correctly.
How do I access the original cone data if I have just the actors? And does updating that data guarantee that the actors will be updated too? When I kept track of the original data (via pointers; the whole implementation uses vtkSmartPointer) and then changed some of its attributes, the pipeline did not update. Shouldn't it update automatically?
(When I change the actors themselves, e.g. their visibility, the scene renders fine.)
Forgive me, I am not a VTK expert and the pipelines are confusing. Maybe one approach would be to simplify my pipeline.
Thanks
[update]
According to this answer to a similar post, the original data (the vtkConeSource output) is converted to a vtkUnstructuredGrid when it is added to the vtkAppendFilter, so even if I keep track of the original data, changing it is useless.
VTK pipelines are demand-driven pipelines. They do not update automatically, even if one of the elements of the pipeline is modified. We need to explicitly call the Update() function on the last vtkAlgorithm (or derived-class object) in the pipeline to update the entire pipeline. When connecting two objects derived from vtkAlgorithm, the correct way to set up the pipeline is to use
currAlgoObj->SetInputConnection( prevAlgoObj->GetOutputPort() )
instead of
currAlgoObj->SetInputData( prevAlgoObj->GetOutput() )
Then we can update the pipeline through the pointer to the actor object by calling actor->GetMapper()->Update(), as shown in the example below.
In this example, we create a cone from a cone source, pass it through a vtkAppendFilter, then change the height of the original cone source and render it in another window to see the updated cone. (You will have to close the first render window to see the updated cone in the second window.)
#include <vtkConeSource.h>
#include <vtkDataSetMapper.h>
#include <vtkActor.h>
#include <vtkRenderer.h>
#include <vtkRenderWindow.h>
#include <vtkRenderWindowInteractor.h>
#include <vtkSmartPointer.h>
#include <vtkAppendFilter.h>
int main(int, char *[])
{
  // Set up the data pipeline
  auto cone = vtkSmartPointer<vtkConeSource>::New();
  cone->SetHeight( 1.0 );
  auto appf = vtkSmartPointer<vtkAppendFilter>::New();
  appf->SetInputConnection( cone->GetOutputPort() );
  auto coneMapper = vtkSmartPointer<vtkDataSetMapper>::New();
  coneMapper->SetInputConnection( appf->GetOutputPort() );
  auto coneActor = vtkSmartPointer<vtkActor>::New();
  coneActor->SetMapper( coneMapper );
  // We need to update the pipeline otherwise nothing will be rendered
  coneActor->GetMapper()->Update();

  // Connect to the rendering portion of the pipeline
  auto renderer = vtkSmartPointer<vtkRenderer>::New();
  renderer->AddActor( coneActor );
  renderer->SetBackground( 0.1, 0.2, 0.4 );
  auto renderWindow = vtkSmartPointer<vtkRenderWindow>::New();
  renderWindow->SetSize( 200, 200 );
  renderWindow->AddRenderer( renderer );
  auto renderWindowInteractor =
      vtkSmartPointer<vtkRenderWindowInteractor>::New();
  renderWindowInteractor->SetRenderWindow( renderWindow );
  renderWindowInteractor->Start();

  // Change cone property
  cone->SetHeight( 10.0 );
  // Update the pipeline using the actor object
  coneActor->GetMapper()->Update();

  auto renderer2 = vtkSmartPointer<vtkRenderer>::New();
  renderer2->AddActor( coneActor );
  renderer2->SetBackground( 0.1, 0.2, 0.4 );
  auto renderWindow2 = vtkSmartPointer<vtkRenderWindow>::New();
  renderWindow2->SetSize( 200, 200 );
  renderWindow2->AddRenderer( renderer2 );
  auto renderWindowInteractor2 =
      vtkSmartPointer<vtkRenderWindowInteractor>::New();
  renderWindowInteractor2->SetRenderWindow( renderWindow2 );
  renderWindowInteractor2->Start();

  return EXIT_SUCCESS;
}
I am trying to load a texture via OpenGL for a 2D platformer, and the code seems to be crashing on this exact part, but my lack of knowledge of C++ and OpenGL seems to be the problem. Please help!
bool Texture::LoadTextureFromFile( std::string path )
{
    //Texture loading success
    bool textureLoaded = false;

    //Generate and set current image ID
    GLuint imgID = 0;
    glGenTextures( 1, &imgID );
    glBindTexture( GL_TEXTURE_2D, imgID );

    //Load image
    GLboolean success = glLoadTextures( path.c_str() );

    //Image loaded successfully
    if( success == GL_TRUE )
    {
        //Convert image to RGBA
        // success = ilConvertImage( IL_RGBA, IL_UNSIGNED_BYTE );
        if( success == GL_TRUE )
        {
            //Create texture from file pixels
            textureLoaded = LoadTextureFromPixels32
                ( (GLuint*)glGetDoublev, (GLuint*)glGetIntegerv( GL_TEXTURE_WIDTH ), GLuint*(glGetIntegerv( GL_TEXTURE_HEIGHT )) );
        }

        //Delete file from memory
        glDeleteTextures( 1, &imgID );
    }

    //Report error
    if( !textureLoaded )
    {
        printf( "Unable to load %s\n", path.c_str() );
    }

    return textureLoaded;
}
There are a few possible issues.
Firstly, where does glLoadTextures() come from, and what does it do? It's not part of the OpenGL specification, and it's not clear to me what it does. It's possible that this function does the entire texture loading for you, and the code below it just screws things up.
Next, your first parameter to LoadTextureFromPixels32() is glGetDoublev. You're passing it a pointer to the function glGetDoublev, which is definitely NOT right. I assume that parameter expects a pointer to the loaded image data.
Finally, your code deletes the texture you just created with glDeleteTextures(). That makes no sense: your texture object's ID is stored in imgID, which is a local variable, so by deleting it the texture is gone.
The normal procedure for creating a texture is:
Load the texture data (using something like SDL_image, or another image loading library)
Create the texture object using glGenTextures(); NOTE: This is your handle to the texture; you MUST store this for future use
Bind it with glBindTexture()
Load in the image data with glTexImage2D()
That's it.
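To make those steps concrete, here is a minimal sketch of that procedure. The image decoding with the stb_image library, the LoadTexture function name, and the forced-RGBA format are my assumptions (not from the question); it illustrates the general flow rather than the asker's LoadTextureFromFile:

#include <string>
#include <cstdio>
#include <GL/gl.h>
#define STB_IMAGE_IMPLEMENTATION
#include "stb_image.h"   // assumption: stb_image is used to decode the file

// Returns the texture handle, or 0 on failure. Store the handle: you need it
// to bind the texture later, and to glDeleteTextures() it when you are done.
GLuint LoadTexture( const std::string& path )
{
    // 1. Load the pixel data with an image library (here stb_image).
    int width = 0, height = 0, channels = 0;
    unsigned char* pixels = stbi_load( path.c_str(), &width, &height, &channels, 4 ); // force RGBA
    if( !pixels )
    {
        printf( "Unable to load %s\n", path.c_str() );
        return 0;
    }

    // 2. Create the texture object and 3. bind it.
    GLuint texID = 0;
    glGenTextures( 1, &texID );
    glBindTexture( GL_TEXTURE_2D, texID );

    // 4. Upload the image data and set basic filtering.
    glTexImage2D( GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0,
                  GL_RGBA, GL_UNSIGNED_BYTE, pixels );
    glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR );
    glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR );

    // The CPU-side copy is no longer needed; the texture object itself must NOT be deleted here.
    stbi_image_free( pixels );
    return texID;
}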
I am new to C++/Cinder and I am trying to import a 3ds .obj file into Cinder and apply a simple texture. I really can't find any simple tutorials on how to do this, and it seems to be slightly different from freeglut.
gl::Texture sTexture;
sTexture = gl::Texture(loadImage(loadAsset("texture.jpg")));
cinder::TriMesh mySphere;
ObjLoader loader( loadFile( "mySphere/sphere.obj" ) );
loader.load( &mySphere );
gl::draw( mySphere );
I understand that mySphere contains the texture coordinates as a vector and that I need to bind the texture to the object, but I can't find a clear example of how. Everything I have tried has left me with a white circle.
Thanks.
Found my solution. I was using sTexture.bind(), but sTexture.enableAndBind() is needed, presumably because it also enables the texture's target (GL_TEXTURE_2D) rather than just binding it.
gl::Texture sTexture;
sTexture = gl::Texture(loadImage(loadAsset("texture.jpg")));
sTexture.enableAndBind();
cinder::TriMesh mySphere;
ObjLoader loader( loadFile( "mySphere/sphere.obj" ) );
loader.load( &mySphere );
gl::draw( mySphere );
sTexture.unbind();
I'm working with Java3D under Eclipse Indigo on Windows. After finally modifying the StlLoader example and ObjLoad classes to get my STL files to load, I get a result that looks like the image below (judging from other questions, these are definitely bad normal vectors). Does anybody know why I might be having this problem? I am using SolidWorks to save the STL as an ASCII file, and I am using a modification of the code for loading STL files given on java3d.org, although I have only changed some appearance properties, fixed broken imports, etc. I have confirmed that the facet normals put into "normList" below definitely match those from the file.
Example of result: [screenshot not included]
Snippet of StlFile.java from http://www.java3d.org:
private SceneBase makeScene()
{
    // Create Scene to pass back
    SceneBase scene = new SceneBase();
    BranchGroup group = new BranchGroup();
    scene.setSceneGroup(group);

    // Store the scene info on a GeometryInfo
    GeometryInfo gi = new GeometryInfo(GeometryInfo.TRIANGLE_STRIP_ARRAY);

    // Convert ArrayLists to arrays: only needed if file was not binary
    if(this.Ascii)
    {
        coordArray = objectToPoint3Array(coordList);
        normArray = objectToVectorArray(normList);
    }

    gi.setCoordinates(coordArray);
    gi.setNormals(normArray);
    gi.setStripCounts(stripCounts);

    // Setting the Material Appearance
    Appearance app = new Appearance();

    // Coloring Attributes
    ColoringAttributes catt = new ColoringAttributes();
    catt.setShadeModel( ColoringAttributes.NICEST );
    app.setColoringAttributes(catt);

    Material mat = new Material(new Color3f(0.6f, 0.6f, 0.6f), // ambient
                                new Color3f(0, 0, 0),          // emissive
                                new Color3f(0.6f, 0.6f, 0.6f), // diffuse
                                new Color3f(0.6f, 0.6f, 0.6f), // specular
                                10);                           // shininess
    app.setMaterial(mat);

    // Put geometry into Shape3D
    Shape3D shape = new Shape3D(gi.getGeometryArray(), app);
    group.addChild(shape);
    scene.addNamedObject(objectName, shape);

    return scene;
} // end of makeScene
If some areas on the surface are really black (0x000000), I would guess some of the normals are actually pointing inward, into the model, rather than outward.
You may check whether the vertices v1, v2, v3 of each triangle are defined in right-hand order (just test whether det(v1, v2, v3) > 0) and reorder the points accordingly. Alternatively, detect the "opposite" normals and multiply them by -1.
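The loader in the question is Java, but the fix itself is language-agnostic. As a rough illustration of the second suggestion, here is a small C++ sketch that flips any stored facet normal that disagrees with the normal implied by the triangle's winding order; the minimal Vec3 type, the flat vertex/normal arrays, and the assumption that the winding order in the file is trustworthy are all mine, not part of the original loader:

#include <cstddef>

struct Vec3 { double x, y, z; };   // minimal stand-in for the loader's vector type

static Vec3 sub(const Vec3& a, const Vec3& b)   { return { a.x - b.x, a.y - b.y, a.z - b.z }; }
static Vec3 cross(const Vec3& a, const Vec3& b) { return { a.y * b.z - a.z * b.y,
                                                           a.z * b.x - a.x * b.z,
                                                           a.x * b.y - a.y * b.x }; }
static double dot(const Vec3& a, const Vec3& b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

// For each triangle, compare the stored facet normal with the normal implied
// by the winding order (v1, v2, v3). If they point in opposite directions,
// flip the stored normal so the lighting sees an outward-facing surface.
void fixOppositeNormals(const Vec3* vertices, Vec3* normals, std::size_t triangleCount)
{
    for (std::size_t i = 0; i < triangleCount; ++i)
    {
        const Vec3& v1 = vertices[3 * i + 0];
        const Vec3& v2 = vertices[3 * i + 1];
        const Vec3& v3 = vertices[3 * i + 2];

        Vec3 winding = cross(sub(v2, v1), sub(v3, v1));   // normal implied by vertex order

        if (dot(winding, normals[i]) < 0.0)
        {
            // Stored normal disagrees with the winding: multiply it by -1.
            normals[i].x = -normals[i].x;
            normals[i].y = -normals[i].y;
            normals[i].z = -normals[i].z;
        }
    }
}

In the Java3D case, the same loop could be run over coordList and normList before they are handed to the GeometryInfo.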