WriteConsoleOutputCharacter in C++: loads of ?s on console

So I am following this tutorial here https://www.youtube.com/watch?v=xW8skO7MFYw and I am new to accessing the console buffer and the like. I started debugging when I realized my console looked like this https://ibb.co/896tGGL when I ran it, and I narrowed it down to this line of code:
WriteConsoleOutputCharacter(hConsole, screen, nScreenWidth * nScreenheight, { 0,0 }, &dwBytesWritten);
The values used in this call are set to:
HANDLE hConsole = CreateConsoleScreenBuffer(GENERIC_READ | GENERIC_WRITE, 0, NULL, CONSOLE_TEXTMODE_BUFFER, NULL);
wchar_t* screen = new wchar_t[nScreenWidth * nScreenheight];
int nScreenWidth = 120;
int nScreenheight = 40;
DWORD dwBytesWritten = 0;

All the parts of your code that say
screen[y * nScreenWidth] = something;
should be
screen[y * nScreenWidth + x] = something;
And of course you need a loop to set the x coordinate.
If you look carefully you can actually see the # on the left edge of your screen. That's because you left the x coordinate out of your calculations.
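For illustration, a minimal sketch of the corrected fill, using the variable names from the question (the L'#' cell value is just a placeholder):
// Fill every cell, addressing each one by BOTH its y and x coordinate.
for (int y = 0; y < nScreenheight; y++)
{
    for (int x = 0; x < nScreenWidth; x++)
    {
        screen[y * nScreenWidth + x] = L'#'; // placeholder cell value
    }
}
WriteConsoleOutputCharacter(hConsole, screen, nScreenWidth * nScreenheight, { 0,0 }, &dwBytesWritten);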

Related

WriteConsoleOutputCharacterW - Where are my new lines?

I'm trying to learn how to use a screen buffer, and I made a mistake that I do not understand. These are the settings for my screen buffer:
wchar_t* screen = new wchar_t[nScreenWidth * nScreenHeight];
for (int i = 0; i < nScreenWidth * nScreenHeight; i++) {
    screen[i] = L' ';
}
HANDLE hConsole = CreateConsoleScreenBuffer(GENERIC_READ | GENERIC_WRITE, 0, NULL, CONSOLE_TEXTMODE_BUFFER, NULL);
SetConsoleActiveScreenBuffer(hConsole);
DWORD dwBytesWritten = 0;
WriteConsoleOutputCharacterW(hConsole, screen, (nScreenWidth * nScreenHeight), { 0,0 }, &dwBytesWritten);
I manage to print my 2D array to it, but weirdly it is lying flat in my terminal window (see the linked print screen).
Small print screen of my failed 2D array
It's as if all the new lines have been removed. This is the loop that prints my 2D array to "screen".
int g = 0;
while (g < 100) {
    WriteConsoleOutputCharacterW(hConsole, screen, (nScreenWidth * nScreenHeight), { 0,0 }, &dwBytesWritten);
    for (int i = 0; i < field.difficulty; i++) {
        std::this_thread::sleep_for(std::chrono::milliseconds(50));
    }
    for (int y = 0; y < field.nFieldHeight; y++) {
        for (int x = 0; x < field.nFieldWidth; x++) {
            screen[(y + 2) * field.nFieldWidth + (x + 2)] = field.matrix[x][y];
        }
    }
}
Is it possible that I need to write to a coordinate in the screen buffer every time I print a character?
By default the console window is resizable, and this causes the output to wrap. You can prevent this by using the following:
// Get console window
HWND hwndWindow = GetConsoleWindow();
// Prevent resize & maximize
LONG lFlags = GetWindowLong(hwndWindow, GWL_STYLE) & ~WS_MAXIMIZEBOX & ~WS_SIZEBOX & ~WS_HSCROLL;
SetWindowLong(hwndWindow, GWL_STYLE, lFlags);
// Get console handle
HANDLE hConsole = GetStdHandle(STD_OUTPUT_HANDLE);
// Set window and buffer size
_SMALL_RECT consoleRect = { 0, 0, SCREEN_W - 1, SCREEN_H - 1 };
SetConsoleScreenBufferSize(hConsole, { SCREEN_W, SCREEN_H });
SetConsoleWindowInfo(hConsole, TRUE, &consoleRect);
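One caveat worth adding (my note, not part of the original answer): SetConsoleScreenBufferSize fails if the requested buffer is smaller than the current window, so when shrinking, collapse the window first and restore it afterwards:
// Shrink the window before the buffer, then restore it to the final size.
SMALL_RECT minRect = { 0, 0, 1, 1 };
SetConsoleWindowInfo(hConsole, TRUE, &minRect);
SetConsoleScreenBufferSize(hConsole, { SCREEN_W, SCREEN_H });
SetConsoleWindowInfo(hConsole, TRUE, &consoleRect);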

C++, Windows: (sometimes) white screen while taking an application screenshot

I have an application function that triggers screenshot capture of said application's window.
It goes like this:
void PlatformWindow::captureScreenshot()
{
    WIN32Window *window = (WIN32Window*)&g_window;
    if (window) {
        HWND handle = window->getWindow();
        if (handle) {
            RECT client_rect = { 0 };
            GetClientRect(handle, &client_rect);
            int width = client_rect.right - client_rect.left;
            int height = client_rect.bottom - client_rect.top;
            HDC hdcScreen = GetDC(handle);
            HDC hdc = CreateCompatibleDC(hdcScreen);
            HBITMAP hbmp = CreateCompatibleBitmap(hdcScreen, width, height);
            SelectObject(hdc, hbmp);
            BitBlt(hdc, 0, 0, width, height, hdcScreen, 0, 0, SRCCOPY);
            BITMAPINFO bmp_info = { 0 };
            bmp_info.bmiHeader.biSize = sizeof(bmp_info.bmiHeader);
            bmp_info.bmiHeader.biWidth = width;
            bmp_info.bmiHeader.biHeight = height;
            bmp_info.bmiHeader.biPlanes = 1;
            bmp_info.bmiHeader.biBitCount = 24;
            bmp_info.bmiHeader.biCompression = BI_RGB;
            // Each scan line of a 24-bpp DIB is padded to a 4-byte boundary
            int bmp_padding = (width * 3) % 4;
            if (bmp_padding != 0) bmp_padding = 4 - bmp_padding;
            BYTE *bmp_pixels = new BYTE[(width * 3 + bmp_padding) * height];
            GetDIBits(hdc, hbmp, 0, height, bmp_pixels, &bmp_info, DIB_RGB_COLORS);
            BITMAPFILEHEADER bmfHeader;
            // Make the screenshot name from the current time
            time_t currentTime = std::time(NULL);
            std::ostringstream oss;
            auto tm = *std::localtime(&currentTime);
            oss << std::put_time(&tm, "%d-%m-%Y_%H-%M");
            auto time_string = oss.str();
            unsigned int id = 0;
            std::string name = "screens\\" + time_string + "_0.bmp";
            // Loop over indexes until an unused file name is found
            while (true) {
                name = "screens\\" + time_string + "_" + std::to_string(id) + ".bmp";
                if (!file_exists(name)) {
                    break;
                }
                id++;
            }
            LPSTR fileName = const_cast<char *>(name.c_str());
            HANDLE bmp_file_handle = CreateFile(fileName, GENERIC_WRITE, 0, NULL, CREATE_ALWAYS, FILE_ATTRIBUTE_NORMAL, NULL);
            // Add the size of the headers to the size of the bitmap to get the total file size
            DWORD dwSizeofDIB = (width * 3 + bmp_padding) * height + sizeof(BITMAPFILEHEADER) + sizeof(BITMAPINFOHEADER);
            // Offset to where the actual bitmap bits start
            bmfHeader.bfOffBits = (DWORD)sizeof(BITMAPFILEHEADER) + (DWORD)sizeof(BITMAPINFOHEADER);
            // Size of the file
            bmfHeader.bfSize = dwSizeofDIB;
            // bfType must always be BM for bitmaps
            bmfHeader.bfType = 0x4D42; // BM
            DWORD dwBytesWritten = 0;
            WriteFile(bmp_file_handle, (LPSTR)&bmfHeader, sizeof(BITMAPFILEHEADER), &dwBytesWritten, NULL);
            WriteFile(bmp_file_handle, (LPSTR)&bmp_info.bmiHeader, sizeof(BITMAPINFOHEADER), &dwBytesWritten, NULL);
            WriteFile(bmp_file_handle, (LPSTR)bmp_pixels, (width * 3 + bmp_padding) * height, &dwBytesWritten, NULL);
            // Close the handle of the file that was created
            CloseHandle(bmp_file_handle);
            DeleteDC(hdc);
            DeleteObject(hbmp);
            ReleaseDC(handle, hdcScreen); // release against the window it was obtained from
            delete[] bmp_pixels;
        }
    }
}
And it works fine on several machines (Windows 10, XP and so on).
There is, however, a rare case on Windows 7 (and maybe others, I don't know if it's just bad luck or whatever) that makes the screenshots blank. Just all white.
I ran some diagnostics and am pretty confident that it captures the right window for sure, but somehow it does not capture the pixels well.
I dug deeper and found out that whenever I set this option in Windows -> Performance Options -> "Adjust for best performance", it suddenly starts to work and a screenshot is successfully taken (no more white screen, which is great).
What I am wondering right now is whether I can somehow make my code better to cover those situations, since forcing the user to change his Windows options is not an ideal scenario.
EDIT:
I found out the specific option that makes it work: if I disable desktop composition, it works just fine.
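A possible workaround worth testing (my sketch, not a confirmed fix): PrintWindow asks the window to render itself into the memory DC via WM_PRINT, which can behave better under desktop composition than a plain BitBlt from the window DC:
// Instead of: BitBlt(hdc, 0, 0, width, height, hdcScreen, 0, 0, SRCCOPY);
if (!PrintWindow(handle, hdc, PW_CLIENTONLY)) // PW_CLIENTONLY matches the client-rect-sized bitmap
{
    // Fall back to the original capture path if PrintWindow fails.
    BitBlt(hdc, 0, 0, width, height, hdcScreen, 0, 0, SRCCOPY);
}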

C++ console application using double buffer: size limitation

# include "DISPLAY.h"
DoubleBuffer::DoubleBuffer()
{
COORD size = { WINDOW_X_SIZE , WINDOW_Y_SIZE };
SMALL_RECT rect;
rect.Left = 0;
rect.Right = WINDOW_X_SIZE - 1;
rect.Top = 0;
rect.Bottom = WINDOW_Y_SIZE - 1;
hBuffer[0] = CreateConsoleScreenBuffer(GENERIC_READ | GENERIC_WRITE, 0, NULL, CONSOLE_TEXTMODE_BUFFER, NULL);
SetConsoleScreenBufferSize(hBuffer[0], size);
SetConsoleWindowInfo(hBuffer[0], TRUE, &rect);
hBuffer[1] = CreateConsoleScreenBuffer(GENERIC_READ | GENERIC_WRITE, 0, NULL, CONSOLE_TEXTMODE_BUFFER, NULL);
SetConsoleScreenBufferSize(hBuffer[1], size);
SetConsoleWindowInfo(hBuffer[1], TRUE, &rect);
CONSOLE_CURSOR_INFO cursorinfo;
cursorinfo.dwSize = 1;
cursorinfo.bVisible = FALSE;
SetConsoleCursorInfo(hBuffer[0], &cursorinfo);
SetConsoleCursorInfo(hBuffer[1], &cursorinfo);
nBufferIndex = 0;
}
void DoubleBuffer::WriteBuffer(int x, int y, char *string)
{
DWORD dw;
COORD startposition = { x,y };
SetConsoleCursorPosition(hBuffer[nBufferIndex], startposition);
WriteFile(hBuffer[nBufferIndex], string, strlen(string), &dw, NULL);
}
void DoubleBuffer::FlippBuffer()
{
Sleep(1000);
SetConsoleActiveScreenBuffer(hBuffer[nBufferIndex]);
nBufferIndex = !nBufferIndex;
}
void DoubleBuffer::ClearBuffer()
{
COORD coord = { 0,0 };
DWORD dw;
FillConsoleOutputCharacter(hBuffer[nBufferIndex], ' ', WINDOW_X_SIZE*WINDOW_Y_SIZE, coord, &dw);
}
void DoubleBuffer::ReleaseBuffer()
{
CloseHandle(hBuffer[0]);
CloseHandle(hBuffer[1]);
}
I used this code to construct a double buffer. The entire game map, which is printed by the 'WriteBuffer' function, is designed as a 1D char array.
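For reference, a minimal usage sketch of the class above (my sketch, assuming DISPLAY.h declares DoubleBuffer exactly as shown):
// Draw each frame into the inactive buffer, then flip it to the front.
DoubleBuffer display;
char row[] = "DDDDDDDDDD";
for (int frame = 0; frame < 5; ++frame)
{
    display.ClearBuffer();          // wipe the back buffer
    display.WriteBuffer(2, 2, row); // draw at column 2, row 2
    display.FlippBuffer();          // make the back buffer visible and swap
}
display.ReleaseBuffer();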
But a big problem is that if the length of the string goes over 80, then the console just shows 80 characters in one line regardless of the console window size.
I mean, for example, the input string is a 90-character array filled with 'D'
char map_1d[90];
memset(map_1d, 'D', sizeof(map_1d));
and the console window size is 5*20.
Then no matter what the window size is, it shows like this:
DDDDD (.. the remaining 70 characters are hidden ..)
DDDDD (.. the remaining 5 characters are hidden ..)
As I pull the side bar of the window to its maximum, it shows like this:
DDDDDDDDDDDDDDDD ... 80 characters ... DDDDDDDDDDDDDDDDD
DDDDDDDDDD
So when I use this code, I cannot adjust the game map size (concretely, its horizontal size), because whatever I do, it just shows 80 characters in a line!
Can I fix the problem, or should I apply another double buffering method?

Wrong colors when using StretchDIBits

I am having trouble with the StretchDIBits function.
I want to draw a bitmap made from a buffer. However, the colors I define in the buffer are different from the result on screen.
I have read the documentation and I played with the biCompression (BI_RGB and BI_BITFIELDS) and biClrUsed (0 / 3) parameters of the BITMAPINFOHEADER. I can see some differences depending on their values, but the result is still different from what I am expecting.
Here is the code I am using (it can be inserted in the OnDraw method of a template SDI project to demonstrate the problem).
void CTestStretchDIBitsView::OnDraw(CDC* /*pDC*/)
{
    ...
    CClientDC dc(this);
    CRect rect;
    GetClientRect(&rect);
    DWORD* pBuffer = new DWORD[500 * 500];
    memset(pBuffer, RGB(255, 255, 0), 500 * 500 * sizeof(DWORD));
    LPBITMAPINFO pBmpInfo = (LPBITMAPINFO) new BYTE[sizeof(BITMAPINFOHEADER) + 256 * sizeof(RGBQUAD)];
    pBmpInfo->bmiHeader.biSize = sizeof(BITMAPINFOHEADER);
    pBmpInfo->bmiHeader.biWidth = 500;
    pBmpInfo->bmiHeader.biHeight = 500;
    pBmpInfo->bmiHeader.biPlanes = 1;
    pBmpInfo->bmiHeader.biBitCount = 32;
    pBmpInfo->bmiHeader.biCompression = BI_BITFIELDS;
    pBmpInfo->bmiHeader.biSizeImage = 500 * 500;
    pBmpInfo->bmiHeader.biXPelsPerMeter = 0;
    pBmpInfo->bmiHeader.biYPelsPerMeter = 0;
    pBmpInfo->bmiHeader.biClrUsed = 0;
    pBmpInfo->bmiHeader.biClrImportant = 0;
    SetStretchBltMode(dc.m_hDC, STRETCH_DELETESCANS);
    StretchDIBits(dc.m_hDC,
                  0, rect.Height(),             // destination origin
                  rect.Width(), -rect.Height(), // negative height flips the image
                  0, 0,                         // source origin
                  500, 500,                     // source size
                  pBuffer,
                  pBmpInfo,
                  DIB_RGB_COLORS,
                  SRCCOPY);
    delete[] pBmpInfo;
    delete[] pBuffer;
}
You have to use the following mode
SetStretchBltMode(hdcWindow, HALFTONE);
instead of
SetStretchBltMode(dc.m_hDC, STRETCH_DELETESCANS);
because halftone is the best quality mode, according to my research.
The problem didn't come from the StretchDIBits function, though, but from the initialization of the buffer used as the bitmap here: memset was misused. memset fills memory one byte at a time with the low byte of its value argument, so it cannot write a 32-bit color into each DWORD pixel.
With an initialization such as:
int Color = RGB(255, 0, 0);
for (int i = 0; i < 500 * 500; i++)
    pBuffer[i] = Color;
I get a perfectly blue image, as expected: a 32-bpp DIB stores each pixel as BGRX, so the COLORREF RGB(255, 0, 0), which keeps red in its low byte, lands in the DIB's blue channel.
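Equivalently (my suggestion, not part of the original answer), std::fill from <algorithm> expresses the same per-DWORD initialization, which memset cannot do:
#include <algorithm>

// Fill all 500 * 500 pixels with one 32-bit value in a single call.
std::fill(pBuffer, pBuffer + 500 * 500, (DWORD)Color);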

DirectX 11 BitBlt Alternative

I have the following function which I am trying to integrate into my DirectX 11 application. When I am using DirectX 9 everything works fine, but when converting to DirectX 11 I get a blue screen of death error at the BitBlt line (I must be doing something wrong with the HDCs?). I was wondering what the best way would be to convert this code to DirectX 11 compatible surfaces instead of HDCs.
Here is the function:
void CFlashDXPlayer::DrawFrame(HDC dc)
{
    if (m_dirtyFlag)
    {
        IViewObject* pViewObject = NULL;
        m_flashInterface->QueryInterface(IID_IViewObject, (LPVOID*) &pViewObject);
        if (pViewObject != NULL)
        {
            // Combine regions
            HRGN unionRgn, first, second = NULL;
            unionRgn = CreateRectRgnIndirect(&m_dirtyRects[0]);
            if (m_dirtyRects.size() >= 2)
                second = CreateRectRgn(0, 0, 1, 1);
            for (std::vector<RECT>::iterator it = m_dirtyRects.begin() + 1; it != m_dirtyRects.end(); ++it)
            {
                // Fill combined region
                first = unionRgn;
                SetRectRgn(second, it->left, it->top, it->right, it->bottom);
                unionRgn = CreateRectRgn(0, 0, 1, 1);
                CombineRgn(unionRgn, first, second, RGN_OR);
                DeleteObject(first);
            }
            if (second)
                DeleteObject(second);
            RECT clipRgnRect; GetRgnBox(unionRgn, &clipRgnRect);
            RECTL clipRect = { 0, 0, m_width, m_height };
            // Fill background
            if (m_transpMode != TMODE_FULL_ALPHA)
            {
                // Set clip region
                SelectClipRgn(dc, unionRgn);
                COLORREF fillColor = GetBackgroundColor();
                HBRUSH fillColorBrush = CreateSolidBrush(fillColor);
                FillRgn(dc, unionRgn, fillColorBrush);
                DeleteObject(fillColorBrush);
                // Draw to main buffer
                HRESULT hr = pViewObject->Draw(DVASPECT_TRANSPARENT, 1, NULL, NULL, NULL, dc, &clipRect, &clipRect, NULL, 0);
                assert(SUCCEEDED(hr));
            }
            else
            {
                if (m_alphaBlackDC == NULL)
                {
                    // Create memory buffers
                    BITMAPINFOHEADER bih = {0};
                    bih.biSize = sizeof(BITMAPINFOHEADER);
                    bih.biBitCount = 32;
                    bih.biCompression = BI_RGB;
                    bih.biPlanes = 1;
                    bih.biWidth = LONG(m_width);
                    bih.biHeight = -LONG(m_height);
                    m_alphaBlackDC = CreateCompatibleDC(dc);
                    m_alphaBlackBitmap = CreateDIBSection(m_alphaBlackDC, (BITMAPINFO*)&bih, DIB_RGB_COLORS, (void**)&m_alphaBlackBuffer, 0, 0);
                    SelectObject(m_alphaBlackDC, m_alphaBlackBitmap);
                    m_alphaWhiteDC = CreateCompatibleDC(dc);
                    m_alphaWhiteBitmap = CreateDIBSection(m_alphaWhiteDC, (BITMAPINFO*)&bih, DIB_RGB_COLORS, (void**)&m_alphaWhiteBuffer, 0, 0);
                    SelectObject(m_alphaWhiteDC, m_alphaWhiteBitmap);
                }
                HRESULT hr;
                HBRUSH fillColorBrush;
                // Render the frame twice - against black and against white backgrounds - to calculate alpha
                SelectClipRgn(m_alphaBlackDC, unionRgn);
                COLORREF blackColor = 0x00000000;
                fillColorBrush = CreateSolidBrush(blackColor);
                FillRgn(m_alphaBlackDC, unionRgn, fillColorBrush);
                DeleteObject(fillColorBrush);
                hr = pViewObject->Draw(DVASPECT_TRANSPARENT, 1, NULL, NULL, NULL, m_alphaBlackDC, &clipRect, &clipRect, NULL, 0);
                assert(SUCCEEDED(hr));
                // White background
                SelectClipRgn(m_alphaWhiteDC, unionRgn);
                COLORREF whiteColor = 0x00FFFFFF;
                fillColorBrush = CreateSolidBrush(whiteColor);
                FillRgn(m_alphaWhiteDC, unionRgn, fillColorBrush);
                DeleteObject(fillColorBrush);
                hr = pViewObject->Draw(DVASPECT_TRANSPARENT, 1, NULL, NULL, NULL, m_alphaWhiteDC, &clipRect, &clipRect, NULL, 0);
                assert(SUCCEEDED(hr));
                // Combine alpha: a pixel with alpha A renders as C*A over black and
                // C*A + (255 - A) over white, so A = 255 - (whiteRed - blackRed)
                for (LONG y = clipRgnRect.top; y < clipRgnRect.bottom; ++y)
                {
                    int offset = y * m_width * 4 + clipRgnRect.left * 4;
                    for (LONG x = clipRgnRect.left; x < clipRgnRect.right; ++x)
                    {
                        BYTE blackRed = m_alphaBlackBuffer[offset];
                        BYTE whiteRed = m_alphaWhiteBuffer[offset];
                        m_alphaBlackBuffer[offset + 3] = 255 - (whiteRed - blackRed);
                        offset += 4;
                    }
                }
                // Blit the result to the target DC
                BitBlt(dc, clipRgnRect.left, clipRgnRect.top,
                       clipRgnRect.right - clipRgnRect.left,
                       clipRgnRect.bottom - clipRgnRect.top,
                       m_alphaBlackDC, clipRgnRect.left, clipRgnRect.top, SRCCOPY);
            }
            DeleteObject(unionRgn);
            pViewObject->Release();
        }
        m_dirtyFlag = false;
        m_dirtyRects.clear();
        m_dirtyUnionRect.left = m_dirtyUnionRect.top = LONG_MAX;
        m_dirtyUnionRect.right = m_dirtyUnionRect.bottom = -LONG_MAX;
    }
}
The HDC I am passing to this function is created in the following manner:
D3D11_TEXTURE2D_DESC textureDesc;
ZeroMemory(&textureDesc, sizeof(textureDesc));
textureDesc.Width = width;
textureDesc.Height = height;
textureDesc.MipLevels = 1;
textureDesc.ArraySize = 1;
textureDesc.Format = DXGI_FORMAT_B8G8R8A8_UNORM;
textureDesc.SampleDesc.Count = 1;
textureDesc.Usage = D3D11_USAGE_DEFAULT;
textureDesc.BindFlags = D3D11_BIND_SHADER_RESOURCE | D3D11_BIND_RENDER_TARGET;
textureDesc.MiscFlags = D3D11_RESOURCE_MISC_GDI_COMPATIBLE;
HRESULT hr = device->CreateTexture2D(&textureDesc, NULL, &m_flashTexture);
HRESULT hResult;
HDC hDC;
IDXGISurface1 *pSurface = NULL;
hResult = m_flashTexture->QueryInterface(__uuidof(IDXGISurface1), (void**)&pSurface);
hResult = pSurface->GetDC(TRUE, &hDC);
assert(SUCCEEDED(hResult));
m_flashPlayer->DrawFrame(hDC);
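One detail the snippet doesn't show (my observation, an assumption rather than a confirmed fix for the crash): a DC obtained from IDXGISurface1::GetDC has to be handed back with ReleaseDC before Direct3D 11 uses the texture again, along these lines:
// ... after m_flashPlayer->DrawFrame(hDC) returns:
pSurface->ReleaseDC(NULL); // NULL marks the whole surface as dirty
pSurface->Release();       // drop the QueryInterface reference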
Any ideas what I am doing wrong? I can't seem to figure out what is going on and why this causes a blue screen when the DirectX 9 version doesn't. Is there a better way to do this?
(Also, I've tried updating my drivers and they are all up to date.)
Thank you for the help.
Turns out that this was indeed a driver issue. It works without a problem when I run with my graphics card set to the Radeon in my laptop, but when I have it on switchable it still crashes for some reason, even though it should be selecting my Radeon. I need to have the graphics mode fixed. Weird, but at least it's not my program, I guess.
Can't really tell from code inspection; I haven't noticed anything blatantly wrong. There should certainly not be any BSOD - that part is a driver bug. What hardware/driver are you running on?
A common reason for driver crashes, though, is illegally writing to some memory area, often by blitting outside of your DC memory. I'd double-check that your regions are not out of bounds and that m_alphaBlackDC is the same size as dc, as sketched below.
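To make that check concrete, here is a hedged sketch (assuming m_width and m_height describe the DIB sections backing the memory DCs) that clamps the dirty rectangle before the alpha loop and the BitBlt:
#include <algorithm>

// Keep the combined dirty rect inside the off-screen buffers.
clipRgnRect.left   = std::max(0L, clipRgnRect.left);
clipRgnRect.top    = std::max(0L, clipRgnRect.top);
clipRgnRect.right  = std::min((LONG)m_width,  clipRgnRect.right);
clipRgnRect.bottom = std::min((LONG)m_height, clipRgnRect.bottom);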
I would also highly, highly recommend testing on another non-related GPU (that doesn't share the same hardware architecture).