AddFontResource + SetCurrentConsoleFontEx are not changing the console font - c++

I'm trying to change the console font to a custom one, but this specific piece of code doesn't seem to accomplish anything, even though it is what I came up with while searching for a solution online. I tested just the SetCurrentConsoleFontEx call with this custom font by installing it and adding it to the console registry by hand, and it worked properly.
#include <iostream>
#include <Windows.h>

int main()
{
    std::cout << "Default font" << std::endl;
    system("pause");

    HANDLE m_stdOut = GetStdHandle(STD_OUTPUT_HANDLE);

    AddFontResourceEx(L"Iosevka.ttf", FR_PRIVATE, 0);
    SendNotifyMessage(HWND_BROADCAST, WM_FONTCHANGE, 0, 0);

    CONSOLE_FONT_INFOEX cfie;
    ZeroMemory(&cfie, sizeof(cfie));
    cfie.cbSize = sizeof(cfie);
    cfie.dwFontSize.Y = 21;
    lstrcpyW(cfie.FaceName, L"Iosevka");
    SetCurrentConsoleFontEx(m_stdOut, false, &cfie);

    std::cout << "Custom font" << std::endl;

    RemoveFontResource(L"Iosevka.ttf");
    system("pause");
    return 0;
}

You are calling AddFontResourceEx() with the FR_PRIVATE flag, which means the font is available only to your process.
Unfortunately, the console window is not part of your process (GetWindowThreadProcessId() lies in this regard!). It is hosted by a system process ("csrss.exe" before Win 7, "conhost.exe" since then).
See: Windows Command-Line: Inside the Windows Console
To make the font available to the console, you have to remove the FR_PRIVATE flag or install the font permanently.
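For example, a minimal sketch of the first option, keeping the question's assumptions that the font file ships next to the executable and that the face name is "Iosevka": register the font without FR_PRIVATE so the console host process can see it, then select it.
#include <Windows.h>
#include <iostream>

int main()
{
    // Register the font for this session without FR_PRIVATE,
    // so the console host process can also see it.
    if (AddFontResourceExW(L"Iosevka.ttf", 0, 0) == 0)
    {
        std::cerr << "AddFontResourceEx failed\n";
        return 1;
    }
    SendNotifyMessageW(HWND_BROADCAST, WM_FONTCHANGE, 0, 0);

    CONSOLE_FONT_INFOEX cfie = {};
    cfie.cbSize = sizeof(cfie);
    cfie.dwFontSize.Y = 21;
    wcscpy_s(cfie.FaceName, L"Iosevka");
    SetCurrentConsoleFontEx(GetStdHandle(STD_OUTPUT_HANDLE), FALSE, &cfie);

    std::cout << "Custom font" << std::endl;
    std::cin.get();   // keep the window open to see the new font

    // Matching removal call for the Ex variant.
    RemoveFontResourceExW(L"Iosevka.ttf", 0, 0);
    return 0;
}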

Related

How can I use SetCurrentConsoleFontEx with fonts loaded by AddFontMemResourceEx?

My main goal is to set the console font to Unifont using Windows API functions. I can successfully do this by calling AddFontResource with the filename. I would rather use AddFontMemResourceEx, because then I can load the font from a resource.
I tested a variety of calls to load the font, and here are the results:
AddFontResource(filename); //works
AddFontResourceEx(filename, FR_PRIVATE, NULL); //does not work
AddFontResourceEx(filename, FR_NOT_ENUM, NULL); //works
AddFontMemResourceEx(valid_pointer, data_size, NULL, &font_count); //does not work
I tested these with two fonts, Unifont and Fira Code (the only other font I could find that would display in the terminal).
I wrote this program to cut out possible issues with loading the font as a resource.
#include <windows.h>
#include <iostream>
#include <fstream>
#include <filesystem>

int main() {
    std::string filename = "unifont-13.0.04.ttf";
    std::wstring fontname = L"Unifont";

    auto fileSize = std::filesystem::file_size(filename);
    std::ifstream in(filename, std::ifstream::binary);
    char* data = new char[fileSize];
    in.read(data, fileSize);
    in.close();

    DWORD loadedFonts = 0;
    AddFontMemResourceEx(reinterpret_cast<PVOID>(data), fileSize, NULL, &loadedFonts);

    CONSOLE_FONT_INFOEX cfi;
    cfi.cbSize = sizeof(cfi);
    cfi.nFont = 0;
    cfi.dwFontSize.X = 0;
    cfi.dwFontSize.Y = 16;
    cfi.FontFamily = FF_DONTCARE;
    cfi.FontWeight = FW_NORMAL;
    wcscpy_s(cfi.FaceName, fontname.c_str());
    SetCurrentConsoleFontEx(GetStdHandle(STD_OUTPUT_HANDLE), FALSE, &cfi);

    std::wcout << "I found " << loadedFonts << " fonts.";
    std::wcin.ignore();
    delete[] data;
}
With Unifont, AddFontMemResourceEx sets loadedFonts to two, indicating it did find and load two fonts and did not fail.
Worth noting, the HANDLE returned by AddFontMemResourceEx is entirely useless for anything except calling RemoveFontMemResourceEx, which you don't even need to do in most cases because the fonts are unloaded when the process ends.
Why does SetCurrentConsoleFontEx work with AddFontResource but not with AddFontMemResourceEx?
From the documentation:
This function allows an application to get a font that is embedded in a document or a webpage. A font that is added by AddFontMemResourceEx is always private to the process that made the call and is not enumerable.
For the rest, please refer to zett42's answer:
You are calling AddFontResourceEx() with the FR_PRIVATE flag, which means the font is available only to your process. Unfortunately, the console window is not part of your process (GetWindowThreadProcessId() lies in this regard!). It is hosted by a system process ("csrss.exe" before Win 7, "conhost.exe" since then).

I2C error when using the Windows Monitor Configuration Functions

I'm attempting to get/set the brightness level of the monitor through the Windows API. I've tried both the Low-Level Monitor Configuration Functions and the High-Level Monitor Configuration Functions, but they both seem to be breaking in the same place. In both cases I have no problem getting the HMONITOR handle and getting the physical monitor handle from the HMONITOR, but once I try to query the DDC/CI capabilities, I get an error saying "An error occurred while transmitting data to the device on the I2C bus."
The particular functions that cause this error are GetMonitorCapabilities for the high-level functions and GetCapabilitiesStringLength for the low-level functions. They both cause the same error.
This leads me to believe that maybe my monitor doesn't support DDC/CI, but I know my laptop's monitor brightness can be changed through the control panel, so it must be controlled through software somehow. Also I can successfully use the WMI classes in a PowerShell script to get/set the brightness as described on this page. Most things I've read suggest that most modern laptop screens do support DDC/CI.
Is there any way to find out what is causing this error or to get more information about it? I'm currently working in C++ in Visual Studio 2013 on Windows 7. I could probably use WMI in my C++ program if I can't get this current method working, but I thought I would ask here first.
Here's the code I currently have:
#include "stdafx.h"
#include <windows.h>
#include <highlevelmonitorconfigurationapi.h>
#include <lowlevelmonitorconfigurationapi.h>
#include <physicalmonitorenumerationapi.h>
#include <iostream>
#include <string>
int _tmain(int argc, _TCHAR* argv[])
{
DWORD minBrightness, curBrightness, maxBrightness;
HWND curWin = GetConsoleWindow();
if (curWin == NULL) {
std::cout << "Problem getting a handle to the window." << std::endl;
return 1;
}
// Call MonitorFromWindow to get the HMONITOR handle
HMONITOR curMon = MonitorFromWindow(curWin, MONITOR_DEFAULTTONULL);
if (curMon == NULL) {
std::cout << "Problem getting the display monitor" << std::endl;
return 1;
}
// Call GetNumberOfPhysicalMonitorsFromHMONITOR to get the needed array size
DWORD monitorCount;
if (!GetNumberOfPhysicalMonitorsFromHMONITOR(curMon, &monitorCount)) {
std::cout << "Problem getting the number of physical monitors" << std::endl;
return 1;
}
// Call GetPhysicalMonitorsFromHMONITOR to get a handle to the physical monitor
LPPHYSICAL_MONITOR physicalMonitors = (LPPHYSICAL_MONITOR)malloc(monitorCount*sizeof(PHYSICAL_MONITOR));
if (physicalMonitors == NULL) {
std::cout << "Unable to malloc the physical monitor array." << std::endl;
return 1;
}
if (!GetPhysicalMonitorsFromHMONITOR(curMon, monitorCount, physicalMonitors)) {
std::cout << "Problem getting the physical monitors." << std::endl;
return 1;
}
std::cout << "Num Monitors: " << monitorCount << std::endl; // This prints '1' as expected.
wprintf(L"%s\n", physicalMonitors[0].szPhysicalMonitorDescription); // This prints "Generic PnP Monitor" as expected
// Call GetMonitorCapabilities to find out which functions it supports
DWORD monCaps;
DWORD monColorTemps;
// The following function call fails with the error "An error occurred while transmitting data to the device on the I2C bus."
if (!GetMonitorCapabilities(physicalMonitors[0].hPhysicalMonitor, &monCaps, &monColorTemps)) {
std::cout << "Problem getting the monitor's capabilities." << std::endl;
DWORD errNum = GetLastError();
DWORD flags = FORMAT_MESSAGE_ALLOCATE_BUFFER | FORMAT_MESSAGE_FROM_SYSTEM | FORMAT_MESSAGE_IGNORE_INSERTS;
LPVOID buffer;
FormatMessage(flags, NULL, errNum, MAKELANGID(LANG_NEUTRAL, SUBLANG_DEFAULT), (LPWSTR)&buffer, 0, NULL);
wprintf(L"%s\n", buffer);
return 1;
}
// Same error when I use GetCapabilitiesStringLength(...) here.
// More code that is currently never reached...
return 0;
}
Edit: Also I should note that physicalMonitors[0].hPhysicalMonitor is 0, even though the monitor count and text description are valid and the GetPhysicalMonitorsFromHMONITOR function returns successfully. Any thoughts on why this might be?
This is a "wonky hardware" problem, the I2C bus it talks about is the logical interconnect between the video adapter and the display monitor. Primarily useful for plug & play. Underlying error code is 0xC01E0582, STATUS_GRAPHICS_I2C_ERROR_TRANSMITTING_DATA. It is generated by a the DxgkDdiI2CTransmitDataToDisplay() helper function in the video miniport driver. It is the vendor's video driver job to configure it, providing the functions that tickle the bus and to implement the IOCTL underlying GetMonitorCapabilities().
Clearly you are device driver land here, there isn't anything you can do about this failure in your C++ program. You can randomly spin the wheel of fortune by looking for a driver update from the video adapter manufacturer. But non-zero odds that the monitor is at fault here. Try another one first.
I know it's late to reply, but I thought you should know.
The problem you are facing is that DDC/CI is disabled on your monitor. Go to the monitor's settings and check whether DDC/CI is disabled; if it is, enable it and run your code again. It should work. If you cannot find a DDC/CI option (some monitors have a separate button for enabling/disabling DDC/CI; for example, BenQ's T71W monitor has a dedicated down-arrow button for it), refer to your monitor's manual or contact the manufacturer.
Hope that helps, and sorry for the late reply.
Best of luck. :)
As I read the original question, the poster wanted to control a laptop display using DDC/CI. Laptop displays do not support DDC/CI. They provide a stripped-down I2C bus sufficient to read the EDID at slave address 0x50, but that's it.

Controlling Windows Screen Orientation in C++/Qt/Windows

I am looking for a solution to control the screen orientation from within my application.
1. Qt program compiled with Visual C++ 2013 (Express)
2. Nvidia (if this matters)
I do not just want to control the orientation of the window because this will fail to change the orientation of any onscreen keyboard applications running.
Thank you
This can be done using ChangeDisplaySettings from the Windows API.
https://msdn.microsoft.com/en-us/library/dd183411%28VS.85%29.aspx
example:
#include <Windows.h>
#include <iostream>

int main()
{
    DEVMODE mode;
    ZeroMemory(&mode, sizeof(mode));
    mode.dmSize = sizeof(mode);   // must be set before calling EnumDisplaySettings
    // First get the settings for the "current" (primary) display.
    EnumDisplaySettings(NULL, ENUM_CURRENT_SETTINGS, &mode);
    if (mode.dmFields & DM_DISPLAYORIENTATION)   // test the flag with bitwise AND, not OR
    {
        mode.dmDisplayOrientation = DMDO_180;
        LONG r = ChangeDisplaySettings(&mode, 0);
        std::cout << "result: " << r;
    }
    return 0;
}
Look here for info on DEVMODE: https://msdn.microsoft.com/en-us/library/dd183565%28v=vs.85%29.aspx
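One caveat worth noting: when rotating to portrait (DMDO_90 or DMDO_270), the width and height of the mode have to be swapped as well, otherwise the call is likely to fail with DISP_CHANGE_BADMODE. A small sketch of that case, under the same assumptions as above:
#include <Windows.h>
#include <iostream>
#include <utility>   // std::swap

int main()
{
    DEVMODE mode;
    ZeroMemory(&mode, sizeof(mode));
    mode.dmSize = sizeof(mode);
    EnumDisplaySettings(NULL, ENUM_CURRENT_SETTINGS, &mode);

    // For 90/270-degree rotations the pixel dimensions must be swapped,
    // otherwise ChangeDisplaySettings rejects the mode.
    std::swap(mode.dmPelsWidth, mode.dmPelsHeight);
    mode.dmDisplayOrientation = DMDO_90;
    mode.dmFields = DM_DISPLAYORIENTATION | DM_PELSWIDTH | DM_PELSHEIGHT;

    LONG r = ChangeDisplaySettings(&mode, 0);
    std::cout << "result: " << r;   // DISP_CHANGE_SUCCESSFUL (0) on success
    return 0;
}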
This can be done using pyautogui.hotkey
from pyautogui import hotkey
hotkey('ctrl','Alt','down')

C++ Finding Index for Font temporarily added to System Font Table with AddFontResource() to use in Console

I am trying to temporarily install a font to use in the win32 console with
int AddFontResource(LPCTSTR lpszFilename);
and
BOOL WINAPI SetConsoleFont(HANDLE hOutput, DWORD fontIndex)
I got hold of this function from this site.
Although both functions seem to work fine, I have no idea how to find the added font's index to use with SetConsoleFont.
AddFontResource returns no index value or key to the temporary font.
Here is my relevant code:
#include "Level.h"
#include "ConsoleFont.h" //acquired from above mentioned site
#include <Windows.h>
//-------------------------------------------------------------------------------
void init();
void cleanup();
int main()
{
FileManager *pFileManager = new FileManager(); //unrelated
Level *lvl1 = new Level("filename",pFileManager); //unrelated
///TEMPORARY PLANNING
// using add font resource. how can i get this fonts index value?
int err = AddFontResource(L"Files/gamefont.fnt");
if (err == 0)
{
MessageBox(NULL,L"loading font failed",L"Error",0);
}
else
{
wchar_t message[100];
swprintf_s(message,100,L"AddFontResource returned: %d",err);
MessageBox(NULL,LPTSTR(message),L"error",0);
}
SendMessage(HWND_BROADCAST, WM_FONTCHANGE,0,0);
//acquiring handle to current active screen buffer
HANDLE tempHandle = GetStdHandle(STD_OUTPUT_HANDLE);
if (tempHandle == INVALID_HANDLE_VALUE)
{
MessageBox(NULL,L"Failed to aquire Screen Buffer handle",L"Error",0);
}
//I dont know what to set this to. this is the crux of the problem.
DWORD fontIndex = 1;
if (FALSE == SetConsoleFont(tempHandle,fontIndex))
{
MessageBox(NULL,L"loading console font failed",L"Error",0);
}
//draws a house when in correct font
std::cout<<"!!!!!!!!#\n"
<<"!!!!!!!!!\n"
<<"! !! !! !\n"
<<"!!!!!!!!!\n"
<<"! !! !! !\n"
<<"!!!!!!!!!\n"
<<"! !! !! !\n"
<<"!!!!!!!!!\n"
<<"! !! !! !#\n"
<<"!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!"<<std::endl;
///PLANNING OVERS
bool quit = false;
while(!quit)
{
//still to be implemented
}
err = RemoveFontResource(L"Files/gamefont.fnt");
if (err==0)
{
MessageBox(NULL,L"removing font failed",L"Error",0);
}
return 0;
}
I don't know how to go about finding my new font's index value or even if this is possible using my current method.
If someone knows or has a better method please help me out.
Any help or hints are appreciated. It must be possible to use a custom font in the Win32 console without fiddling with the registry. I'm sure of it :)
Unfortunately, you have entered the dark world of undocumented Win APIs. There is no documentation (or at least I could never find any) for looking up the console font table. You can try the function GetNumberOfConsoleFonts() to see what is returned; I think the font at index 10 is Lucida Console. You'll have to play around a little. Also, this may not work on the OS version you have. It worked for me on XP; I never had to try it on anything else, and honestly, I never got it fully working on XP either.
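For reference, SetConsoleFont and GetNumberOfConsoleFonts are not declared in the SDK headers; the ConsoleFont.h mentioned in the question resolves them from kernel32.dll at run time. A minimal sketch of that approach, assuming the undocumented signatures shown in the question (DWORD GetNumberOfConsoleFonts(void) and BOOL SetConsoleFont(HANDLE, DWORD)):
#include <Windows.h>
#include <iostream>

// Undocumented console font functions, resolved at run time from kernel32.dll.
// The signatures below are assumptions taken from the question / ConsoleFont.h.
typedef DWORD (WINAPI *GetNumberOfConsoleFonts_t)(void);
typedef BOOL  (WINAPI *SetConsoleFont_t)(HANDLE hOutput, DWORD fontIndex);

int main()
{
    HMODULE kernel32 = GetModuleHandleW(L"kernel32.dll");
    auto getCount = reinterpret_cast<GetNumberOfConsoleFonts_t>(
        GetProcAddress(kernel32, "GetNumberOfConsoleFonts"));
    auto setFont = reinterpret_cast<SetConsoleFont_t>(
        GetProcAddress(kernel32, "SetConsoleFont"));

    if (!getCount || !setFont)
    {
        std::cout << "Undocumented console font API not available on this OS.\n";
        return 1;
    }

    DWORD count = getCount();
    std::cout << "Console font table has " << count << " entries.\n";

    // Try an index and see which font the console switches to.
    setFont(GetStdHandle(STD_OUTPUT_HANDLE), 5);
    return 0;
}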
As for the registry:
Font registry entries are here:
HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows NT\CurrentVersion\Fonts
Console registry entries are here:
HKEY_CURRENT_USER\Console
If you end up modifying the registry, the changes may not be reflected immediately. You need to either restart the console or broadcast a special WM_* message (sorry, I don't remember the name).
It would be great if you can find a solution/workaround :)
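If I recall correctly, the list of TrueType faces the legacy console offers lives under HKLM\SOFTWARE\Microsoft\Windows NT\CurrentVersion\Console\TrueTypeFont: each string value there is named "0", "00", "000", and so on, and its data is a font face name. A rough sketch of adding an entry (requires elevation, assumes the font itself is already installed, and uses "gamefont" purely as a placeholder face name; in my experience a new console session, sometimes a logoff, is needed before the entry shows up):
#include <Windows.h>
#include <iostream>

int main()
{
    // Sketch: register an extra face name for the legacy console font picker.
    // "000" assumes "0" and "00" are already taken; pick the next free name.
    HKEY key;
    LONG rc = RegCreateKeyExW(HKEY_LOCAL_MACHINE,
        L"SOFTWARE\\Microsoft\\Windows NT\\CurrentVersion\\Console\\TrueTypeFont",
        0, NULL, 0, KEY_SET_VALUE, NULL, &key, NULL);
    if (rc != ERROR_SUCCESS)
    {
        std::cout << "RegCreateKeyEx failed: " << rc << "\n";   // likely access denied without elevation
        return 1;
    }

    const wchar_t faceName[] = L"gamefont";   // placeholder face name
    rc = RegSetValueExW(key, L"000", 0, REG_SZ,
        reinterpret_cast<const BYTE*>(faceName), sizeof(faceName));
    RegCloseKey(key);

    std::cout << (rc == ERROR_SUCCESS ? "Added console font entry.\n"
                                      : "RegSetValueEx failed.\n");
    return 0;
}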
int err = AddFontResource(L"Files/gamefont.fnt");
if (err == 0)
{
    MessageBox(NULL, L"loading font failed", L"Error", 0);
}
else
{
    wchar_t message[100];
    swprintf_s(message, 100, L"AddFontResource returned: %d", err);
    MessageBox(NULL, LPTSTR(message), L"error", 0);
}
This is wrong: AddFontResource returns the number of fonts added, not an error code and not a font index, so the else branch (which reports the successful return value as an error) doesn't make sense.
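For clarity, a small self-contained sketch of how that branch might report the return value as the count it actually is (using the W variants explicitly so it builds regardless of the UNICODE setting):
#include <Windows.h>

int main()
{
    int err = AddFontResourceW(L"Files/gamefont.fnt");
    if (err == 0)
    {
        MessageBoxW(NULL, L"loading font failed", L"Error", MB_OK);
    }
    else
    {
        // err is the number of fonts added, not an error and not a font index
        wchar_t message[100];
        swprintf_s(message, 100, L"AddFontResource added %d font(s)", err);
        MessageBoxW(NULL, message, L"Info", MB_OK);
    }
    return 0;
}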

MS CryptoAPI doesn't work on Windows XP with CryptAcquireContext()

I wrote some code using the Microsoft CryptoAPI to calculate a SHA-1 and got the compiled exe working on Windows 7, Win Server 2008, Win Server 2003. However, when I run it under Windows XP SP3, it does not work.
I narrowed down the failure to the CryptAcquireContext() call.
I did notice that a previous post talked about the XP faulty naming of "… (Prototype)" and that it must be accounted for by using the WinXP-specific macro MS_ENH_RSA_AES_PROV_XP.
I made the XP-specific code modifications and it still doesn't work. (bResult is 0 (FALSE) on Win XP; on all other platforms bResult is 1 (TRUE).)
I checked MS_ENH_RSA_AES_PROV_XP against the actual key and string values I see in regedit.exe, so everything looks like it is set up to work, but no success.
Have I overlooked something to make it work on Windows XP?
I've pasted the shortest possible example that illustrates the issue. I used VS2010 C++.
// based on examples from http://msdn.microsoft.com/en-us/library/ms867086.aspx
#include "windows.h"
#include "wincrypt.h"
#include <iostream>
#include <iomanip>  // for setw()

void main()
{
    BOOL bResult;
    HCRYPTPROV hProv;

    // Attempt to acquire a handle to the default key container.
    bResult = CryptAcquireContext(
        &hProv,        // Variable to hold returned handle.
        NULL,          // Use default key container.
        MS_DEF_PROV,   // Use default CSP.
        PROV_RSA_FULL, // Type of provider to acquire.
        0);            // No special action.
    std::cout << "line: " << std::setw(4) << __LINE__ << "; " << "bResult = " << bResult << std::endl;

    if (!bResult) { // try Windows XP provider name
        bResult = CryptAcquireContext(
            &hProv,                  // Variable to hold returned handle.
            NULL,                    // Use default key container.
            MS_ENH_RSA_AES_PROV_XP,  // Windows XP specific instead of using default CSP.
            PROV_RSA_AES,            // Type of provider to acquire.
            0);                      // No special action.
        std::cout << "line: " << std::setw(4) << __LINE__ << "; " << "bResult = " << bResult << std::endl;
    }

    if (bResult)
        CryptReleaseContext(hProv, 0);
}
Windows 7: success (bResult = 1).
Windows XP: failure (bResult = 0).
In your CryptAcquireContext code, it appears you are missing the parameter to get a context without a specific container set. You need to pass the CRYPT_VERIFYCONTEXT option in CryptAcquireContext.
Windows 7 might be working around this.
http://msdn.microsoft.com/en-us/library/windows/desktop/aa379886(v=vs.85).aspx
For further diagnosis, the result of GetLastError() would be needed.
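A minimal sketch of the suggested change, applied to the question's second (XP) call; the only difference is the CRYPT_VERIFYCONTEXT flag in the last parameter, which requests an ephemeral context without a named key container (all that hashing needs):
#include <windows.h>
#include <wincrypt.h>
#include <iostream>

int main()
{
    HCRYPTPROV hProv = 0;
    // CRYPT_VERIFYCONTEXT asks for a context without a named key container,
    // which is sufficient for hash operations such as SHA-1.
    BOOL bResult = CryptAcquireContext(
        &hProv,
        NULL,
        MS_ENH_RSA_AES_PROV_XP,   // XP-specific provider name, as in the question
        PROV_RSA_AES,
        CRYPT_VERIFYCONTEXT);
    DWORD lastError = bResult ? 0 : GetLastError();   // capture immediately on failure

    std::cout << "bResult = " << bResult << std::endl;
    if (!bResult)
        std::cout << "GetLastError() = 0x" << std::hex << lastError << std::endl;
    else
        CryptReleaseContext(hProv, 0);
    return 0;
}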