Transparent SwapChainPanel (CompositeMode) - no opaque DirectX drawing? - C++

I'm making an augmented reality app (C++) on Windows 10 (Universal Windows app / WinRT) with OpenCV.
Problem:
I want to have a transparent SwapChainPanel background so the content behind it (the webcam stream) is visible, but with opaque 3D models (e.g. a cone).
What I tried:
From my research it seems that setting the CompositeMode of the SwapChainPanel to "MinBlend" should do it - and it does, but I still want my 3D objects to be opaque. In fact, I want my objects to be semi-transparent, but always visible. The "MinBlend" mode is more suited to text highlighting than to overlaying something with semi-transparent models (dark areas are not overlaid, see the pictures).
Image: standard DirectX cube (opaque background and model)
Image: DirectX cube overlaid
Do you have any suggestions? Is it possible?
Background:
I'm making an augmented reality Windows 10 app with OpenCV. To get the current pixel data of my webcam stream I'm using the Windows 10 methods mediaCapture->GetPreviewFrameAsync(videoFrame) together with SoftwareBitmap->LockBuffer to access the bytes in the memory buffer. The bytes are processed with OpenCV functions, and after processing is complete I set up a WriteableBitmap to show the modified webcam stream in my XAML UI element. Because I already have classes to draw my DirectX objects and manipulate them with touch input, I want to use DirectX to overlay the webcam preview with my objects.
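For reference, a rough C++/CX sketch of that capture path (error handling omitted; the 640x480 size is just a placeholder):

// Rough sketch: grab one preview frame and get raw byte access for OpenCV.
#include <MemoryBuffer.h>   // IMemoryBufferByteAccess
#include <ppltasks.h>
#include <wrl/client.h>

using namespace Windows::Media;
using namespace Windows::Media::Capture;
using namespace Windows::Graphics::Imaging;
using namespace concurrency;

void GrabPreviewFrame(MediaCapture^ mediaCapture)
{
    auto videoFrame = ref new VideoFrame(BitmapPixelFormat::Bgra8, 640, 480);

    create_task(mediaCapture->GetPreviewFrameAsync(videoFrame))
        .then([](VideoFrame^ currentFrame)
    {
        SoftwareBitmap^ bitmap = currentFrame->SoftwareBitmap;
        BitmapBuffer^ buffer = bitmap->LockBuffer(BitmapBufferAccessMode::ReadWrite);
        auto reference = buffer->CreateReference();

        Microsoft::WRL::ComPtr<IMemoryBufferByteAccess> byteAccess;
        reinterpret_cast<IInspectable*>(reference)->QueryInterface(IID_PPV_ARGS(&byteAccess));

        BYTE* data = nullptr;
        UINT32 capacity = 0;
        byteAccess->GetBuffer(&data, &capacity);

        // data now points at the BGRA pixels; wrap it in a cv::Mat and process it with OpenCV here.
    });
}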
Sorry for not linking the methods used and the images; I don't have enough reputation.
Edit: Maybe an alternative would be to create a texture from my pixel data and set up a fullscreen rectangle on my SwapChainPanel that functions as a background. Then I would have to update the texture data every frame.

According to the official Windows Store samples, changing the CompositeMode is the only way to make XAML content behind a SwapChainPanel visible.
But this workaround works: I've created a Texture2D and update its contents every frame:
/*----- Update background texture with new video frame -----*/
// image = OpenCV cv::Mat containing the pixel data of the current frame
if (image.data != nullptr)
{
    D3D11_MAPPED_SUBRESOURCE mappedResource;
    ZeroMemory(&mappedResource, sizeof(D3D11_MAPPED_SUBRESOURCE));
    // Disable GPU access to the data.
    auto m_d3dContext = m_deviceResources->GetD3DDeviceContext();
    m_d3dContext->Map(m_pTexture, 0, D3D11_MAP_WRITE_DISCARD, 0, &mappedResource);
    // Update the texture (assumes mappedResource.RowPitch == 4 * image.cols).
    memcpy(mappedResource.pData, image.data, 4 * image.rows * image.cols);
    // Re-enable GPU access to the data.
    m_d3dContext->Unmap(m_pTexture, 0);
}
First I draw a fullscreen rectangle (SpriteBatch from DirectXTK) with this texture. After that I can draw anything else on top.
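For completeness, a rough sketch of the texture creation and the background pass (member names like m_pTexture above are my own; the texture has to be created with dynamic usage and CPU write access for the Map/Unmap call shown earlier to work):

#include <d3d11.h>
#include <SpriteBatch.h>   // DirectXTK

// One-time setup: a dynamic BGRA texture the CPU can rewrite every frame.
void CreateBackgroundTexture(ID3D11Device* device, UINT width, UINT height,
                             ID3D11Texture2D** texture, ID3D11ShaderResourceView** srv)
{
    D3D11_TEXTURE2D_DESC desc = {};
    desc.Width = width;                          // match image.cols
    desc.Height = height;                        // match image.rows
    desc.MipLevels = 1;
    desc.ArraySize = 1;
    desc.Format = DXGI_FORMAT_B8G8R8A8_UNORM;    // 4 bytes per pixel, matches the cv::Mat data
    desc.SampleDesc.Count = 1;
    desc.Usage = D3D11_USAGE_DYNAMIC;            // required for Map with WRITE_DISCARD
    desc.BindFlags = D3D11_BIND_SHADER_RESOURCE;
    desc.CPUAccessFlags = D3D11_CPU_ACCESS_WRITE;

    device->CreateTexture2D(&desc, nullptr, texture);
    device->CreateShaderResourceView(*texture, nullptr, srv);
}

// Every frame: draw the webcam texture as a fullscreen background, then the 3D content on top.
void DrawBackground(DirectX::SpriteBatch* spriteBatch,
                    ID3D11ShaderResourceView* srv, const RECT& fullscreenRect)
{
    spriteBatch->Begin();
    spriteBatch->Draw(srv, fullscreenRect);
    spriteBatch->End();
    // ... render the semi-transparent 3D models here ...
}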

Related

How to access individual frames of an animated GIF loaded into a ID3D11ShaderResourceView?

I used CreateWICTextureFromFile() from DirectXTK to load an animated GIF texture.
ID3D11Resource* Resource;
ID3D11ShaderResourceView* View;
hr = CreateWICTextureFromFile(d3dDevice, L"sample.gif",
                              &Resource, &View);
Then I displayed it on an ImageButton using the Dear ImGui library:
ImGui::ImageButton((void*)View, ImVec2(width, height));
But it only displays a still image (the first frame of the GIF file).
I think I have to give it the texture of each frame separately. But I don't know how. Can you give me a hint?
The CreateWICTextureFromFile function in DirectX Tool Kit (a.k.a. the 'light-weight' version in the WICTextureLoader module) only loads a single 2D texture, not multi-frame images like animated GIF or TIFF.
The DirectXTex function LoadFromWICFile can load multi-frame images if you give it the WIC_FLAGS_ALL_FRAMES flag. Because the library is focused on DirectX resources, it will resize them all to match the first image's size.
That said, what WIC is going to return to you is a bunch of raw frames. You have to query the metadata from WIC to actually get the animation information, and it's a little complicated to reconstruct. I have a simple implementation in the DirectXTex texassemble tool you can reference here. I focused on converting the animated GIF into a 'flip-book' style 2D texture array, which is quite a bit larger.
The sample I referenced can be found on GitHub
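As a rough sketch (assuming D3D11, and not the texassemble code itself), loading every frame with DirectXTex and creating one shader resource view per frame could look like this; the GIF timing and disposal metadata is ignored here:

// Rough sketch: load all GIF frames and create one SRV per frame.
#include <d3d11.h>
#include <DirectXTex.h>
#include <vector>
#include <wrl/client.h>

using Microsoft::WRL::ComPtr;

std::vector<ComPtr<ID3D11ShaderResourceView>> LoadGifFrames(ID3D11Device* device,
                                                            const wchar_t* path)
{
    DirectX::TexMetadata metadata = {};
    DirectX::ScratchImage frames;

    // WIC_FLAGS_ALL_FRAMES loads every frame, resized to the first frame's size.
    DirectX::LoadFromWICFile(path, DirectX::WIC_FLAGS_ALL_FRAMES, &metadata, frames);

    // metadata.arraySize is the number of frames; create one SRV per frame so the UI
    // can cycle through them (e.g. pass views[current].Get() to ImGui::ImageButton).
    DirectX::TexMetadata single = metadata;
    single.arraySize = 1;
    single.mipLevels = 1;

    std::vector<ComPtr<ID3D11ShaderResourceView>> views;
    for (size_t i = 0; i < metadata.arraySize; ++i)
    {
        const DirectX::Image* frame = frames.GetImage(0, i, 0);
        ComPtr<ID3D11ShaderResourceView> srv;
        DirectX::CreateShaderResourceView(device, frame, 1, single, &srv);
        views.push_back(srv);
    }
    return views;
}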

Can the Texture Data of a Qt 6.1 QSGTexture be Changed Dynamically?

I am updating my Qt 5.15 graphics code from OpenGL-dependent to the new RHI format in Qt 6.1. One requirement is to change the texture data on a 2D model frequently (20 times per second). This was fairly easy to accomplish in OpenGL by simply uploading new data to an already allocated texture. I don't see how this is possible from the Qt 6 documentation. There is QSGDynamicTexture, but I don't see anything in the API that allows actually changing the texture data. It provides updateTexture() to override, but how can you update the actual texture data in this method?
I've also looked at using QSGTextureProvider, or recreating the texture when needed, but recreating the texture object every time seems very inefficient.
Is it possible to update the raw texture data of a QSGTexture or similar in Qt 6.1?
Update
The following code demonstrates a working method. It draws an updated image correctly, but I'd like to eliminate the QImage use and the recreation of the QSGTexture. Updating the QSGTexture data directly would achieve that, if it's possible.
{ // construction of custom QSGNode
    QSGGeometryNode* node = new QSGGeometryNode;
    QSGGeometry* geometry = new QSGGeometry(QSGGeometry::defaultAttributes_TexturedPoint2D(), 0);
    // geometry vertices defined later
    QSGTextureMaterial* material = new QSGTextureMaterial;
    // blue image for testing
    QImage image(100, 100, QImage::Format_ARGB32);
    image.fill(Qt::blue);
    QSGTexture* texture = window->createTextureFromImage(image);
    material->setTexture(texture);
    node->setGeometry(geometry);
    node->setFlag(QSGNode::OwnsGeometry);
    node->setMaterial(material);
    node->setFlag(QSGNode::OwnsMaterial);
}
...
{ // new image data - called in QQuickItem::updatePaintNode
    QImage image = QImage((const uchar*)data, imageWidth, imageHeight, QImage::Format_ARGB32);
    QSGTexture* texture = window->createTextureFromImage(image);
    delete material->texture();
    material->setTexture(texture);
    node->markDirty(QSGNode::DirtyMaterial);
}

DirectX — is there an analogue of DirectDraw surface Flip()?

I'm building an application that draws an anaglyph (stereo image) on a 200 Hz screen based on two provided pictures (NOT a 3D model), so the speed and timing integrity of redrawing are very important. I've achieved the best results with DirectDraw surfaces and their Flip() (switching the current surface's image to the secondary one):
(void) lpddsPrimary->Flip(nullptr, DDFLIP_WAIT);
But DirectDraw is very outdated, and I'm looking for a way to reimplement this functionality with modern DirectX libraries. However, I really don't want to create a quad, draw the picture as its texture, and calculate 3D projection matrices just to output 2D images.
I would be really grateful for any snippet of how this can possibly be done with DirectX. Thanks in advance.
For your purposes you can use DXGI and avoid D3D drawing completely. You don't say how you get the data into the backbuffer, but DXGI allows you to create a swap chain, flip it (Present), and access the surfaces (e.g. lock them - it's called Map now). For stereo 3D you need the "1" versions, e.g. IDXGISwapChain1. See http://msdn.microsoft.com/en-us/library/windows/desktop/bb205075(v=vs.85).aspx.
Note that IDXGISwapChain1 extends IDXGISwapChain, and some vital methods such as GetBuffer are in the base interface.
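One possible sketch (a D3D11 device is still needed to create the swap chain, but nothing is drawn with it; the backbuffer is simply filled with the prepared 2D image each frame and Present() plays the role of the old Flip()). The names and the BGRA format are just assumptions:

#include <d3d11.h>
#include <dxgi1_2.h>
#include <wrl/client.h>

using Microsoft::WRL::ComPtr;

ComPtr<ID3D11Device>        device;
ComPtr<ID3D11DeviceContext> context;
ComPtr<IDXGISwapChain1>     swapChain;

void CreateFlipSwapChain(HWND hwnd, UINT width, UINT height)
{
    D3D11CreateDevice(nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
                      nullptr, 0, D3D11_SDK_VERSION, &device, nullptr, &context);

    ComPtr<IDXGIDevice>   dxgiDevice;
    ComPtr<IDXGIAdapter>  adapter;
    ComPtr<IDXGIFactory2> factory;
    device.As(&dxgiDevice);
    dxgiDevice->GetAdapter(&adapter);
    adapter->GetParent(IID_PPV_ARGS(&factory));

    DXGI_SWAP_CHAIN_DESC1 desc = {};
    desc.Width = width;
    desc.Height = height;
    desc.Format = DXGI_FORMAT_B8G8R8A8_UNORM;
    desc.SampleDesc.Count = 1;
    desc.BufferUsage = DXGI_USAGE_RENDER_TARGET_OUTPUT;
    desc.BufferCount = 2;
    desc.SwapEffect = DXGI_SWAP_EFFECT_FLIP_SEQUENTIAL;   // flip model, like DirectDraw page flipping

    factory->CreateSwapChainForHwnd(device.Get(), hwnd, &desc, nullptr, nullptr, &swapChain);
}

void PresentFrame(const void* pixels, UINT width, UINT height)
{
    ComPtr<ID3D11Texture2D> backBuffer;
    swapChain->GetBuffer(0, IID_PPV_ARGS(&backBuffer));

    // Copy the prepared 2D image into the backbuffer (row pitch = width * 4 for BGRA).
    context->UpdateSubresource(backBuffer.Get(), 0, nullptr, pixels, width * 4, 0);

    swapChain->Present(1, 0);   // the equivalent of lpddsPrimary->Flip(nullptr, DDFLIP_WAIT)
}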

OpenGL off-screen rendering

I am doing off-screen rendering using an OpenGL FBO and GLUT on Mac OS X 10.6. The program involves movement of multiple 3D objects.
The program seems to be working fine, except that I am required to include an option where the off-screen buffer contents are not swapped to the on-screen buffer, so nothing is visible on the screen. I want to know whether the program works as it should in this mode when nothing is seen on screen - i.e. that the 3D movements etc. work fine as usual. Is there a utility that can read the off-screen buffer and display it on screen while my process runs separately?
Alternatively, are there other ways to achieve this? That is, to hide the on-screen window while rendering off-screen using the FBO.
I appreciate any comments/suggestions. I hope my question is clear.
gDEBugger for Mac should be able to display the FBO content with no additional effort on your side; at least the Windows version does so just fine. A 7-day trial version is available.
I would copy the off-screen buffer into shared memory. Then an external application continuously reads the shared memory contents, updates a texture, and displays it on the screen.
That's it.
I've used this a lot, even with off-screen rendering, but I don't have a handy example. :(
I would advise storing additional information at the beginning of the shared memory (width, height, pixel type, an incremental integer to know whether the image has changed since the last read...).
After this header, store the pixel data generated by your application, whose size depends on the width, height, and pixel size.
I would also advise using glReadPixels to store the pixel data, passing the mapped shared memory as the destination. The remote application can use that data to update a texture.
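A rough sketch of that layout (POSIX shared memory on macOS; the segment name "/fbo_preview" and the header fields are just examples):

#include <OpenGL/gl.h>
#include <sys/mman.h>
#include <fcntl.h>
#include <unistd.h>
#include <cstdint>

// Small header stored at the start of the shared memory, followed by the raw pixels.
struct FrameHeader {
    uint32_t width;
    uint32_t height;
    uint32_t bytesPerPixel;   // 4 for GL_RGBA / GL_UNSIGNED_BYTE
    uint32_t frameCounter;    // the reader checks this to detect a new frame
};

void* OpenSharedFrame(uint32_t width, uint32_t height)
{
    size_t totalSize = sizeof(FrameHeader) + size_t(width) * height * 4;
    int fd = shm_open("/fbo_preview", O_CREAT | O_RDWR, 0666);
    ftruncate(fd, totalSize);
    void* mem = mmap(nullptr, totalSize, PROT_READ | PROT_WRITE, MAP_SHARED, fd, 0);
    close(fd);
    return mem;
}

void PublishFrame(void* shared, uint32_t width, uint32_t height)
{
    auto* header = static_cast<FrameHeader*>(shared);
    auto* pixels = reinterpret_cast<uint8_t*>(header + 1);

    header->width = width;
    header->height = height;
    header->bytesPerPixel = 4;

    // Read the currently bound FBO's color attachment straight into the shared memory.
    glReadPixels(0, 0, width, height, GL_RGBA, GL_UNSIGNED_BYTE, pixels);

    header->frameCounter++;   // signal the viewer process that a new frame is ready
}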

DirectX9 Texture of arbitrary size (non 2^n)

I'm relatively new to DirectX and have to work on an existing C++ DX9 application. The app does tracking on camera images and displays some DirectDraw (i.e. 2D) content. The camera has an aspect ratio of 4:3 (always) and the screen is undefined.
I want to load a texture and use it as a mask, so that tracking and displaying of the content are only done within the masked area of the texture. Therefore I'd like to load a texture that has exactly the same size as the camera images.
I've done all the steps to load the texture, but when I call GetDesc() the Width and Height fields of the D3DSURFACE_DESC struct are rounded up to the next power-of-2 size. I don't care that the actual memory used for the texture is optimized for the graphics card, but I did not find any way to get the dimensions of the original image file on the hard disk.
I am (and have been, without success) looking for a way to load the image into the computer's RAM only (no graphics card required) without adding a new dependency to the code. Otherwise I'd have to use OpenCV (which might be a good idea anyway when it comes to tracking), but for the moment I'm still trying to avoid including OpenCV.
Thanks for your hints,
Norbert
Use D3DXCreateTextureFromFileEx with parameters 3 and 4 (Width and Height) set to D3DX_DEFAULT_NONPOW2.
After that, you can use
D3DSURFACE_DESC Desc;
m_Sprite->GetLevelDesc(0, &Desc);
to fetch the height & width.
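A rough sketch of the call (the file name is just an example; the device also has to support non-power-of-2 textures for the original size to be kept):

#include <d3dx9.h>

IDirect3DTexture9* LoadNonPow2Texture(IDirect3DDevice9* device)
{
    IDirect3DTexture9* texture = nullptr;
    D3DXIMAGE_INFO info = {};

    HRESULT hr = D3DXCreateTextureFromFileEx(
        device,
        TEXT("mask.png"),       // example file
        D3DX_DEFAULT_NONPOW2,   // width: keep the file's width
        D3DX_DEFAULT_NONPOW2,   // height: keep the file's height
        1,                      // no mip chain
        0,                      // usage
        D3DFMT_UNKNOWN,         // take the format from the file
        D3DPOOL_MANAGED,
        D3DX_FILTER_NONE,
        D3DX_FILTER_NONE,
        0,                      // no color key
        &info,                  // the original file dimensions end up here as well
        nullptr,
        &texture);

    if (SUCCEEDED(hr))
    {
        D3DSURFACE_DESC desc;
        texture->GetLevelDesc(0, &desc);   // now reports the file's real width/height
    }
    return texture;
}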
D3DXGetImageInfoFromFile may be what you are looking for.
I'm assuming you are using D3DX because I don't think Direct3D automatically resizes any textures.
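A rough sketch (the file name is an example); this reads only the image header from disk, so no device or texture is involved:

#include <d3dx9.h>

// Read the image dimensions as stored on disk, without creating any texture.
D3DXIMAGE_INFO info = {};
if (SUCCEEDED(D3DXGetImageInfoFromFile(TEXT("mask.png"), &info)))
{
    UINT originalWidth  = info.Width;
    UINT originalHeight = info.Height;
}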