I'm having some issues utilizing the built-in function PlaySound. I continuously receive two errors, the first being:
argument of type "const char *" is incompatible with parameter of type "LPCWSTR",
and the second being:
'BOOL PlaySoundW(LPCWSTR,HMODULE,DWORD)': cannot convert argument 1 from 'const char [35]' to 'LPCWSTR'.
I can't seem to resolve these issues on my own, and would like some help with figuring out how to get rid of the errors. Here's a section of my source code, including what I believe to be causing the error.
#include <iostream>
#include <string>
#include <iomanip>
#include <dos.h>
#include <windows.h>
#include <playsoundapi.h>
#include <mmsystem.h>
using namespace std;
int main()
{
PlaySound("C:\\Users\\Cristian\\Desktop\\cafe.mp3", NULL, SND_FILENAME | SND_ASYNC);
return 0;
}
If I am using the PlaySound function incorrectly, please point me in the right direction.
LPCWSTR is a typedef for const wchar_t * - so you need to use a wide-character wchar_t string L"" instead of a normal char string "".
const wchar_t* path = L"C:\\Users\\Cristian\\Desktop\\cafe.mp3";
PlaySound( path , NULL, SND_FILENAME | SND_ASYNC );
The old-school Win32 way would be to use TCHAR with optional #define UNICODE but this is considered an anachronism as the "ANSI" Win32 functions don't support UCS-2/UTF-16 (and MBCS does not refer to UTF-8, surprisingly).
Note that you probably want to use SND_SYNC instead of SND_ASYNC, because otherwise your program will terminate before the sound finishes playing.
Finally, PlaySound does not support MP3 files - only Wave files - so your code won't work regardless.
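Putting those pieces together, here is a minimal sketch that should work, assuming a .wav file exists at the (hypothetical) path below:
#include <windows.h>
#include <mmsystem.h>
#pragma comment(lib, "winmm.lib") // PlaySound lives in winmm.lib
int main()
{
    // Hypothetical path - substitute a real .wav file on your machine.
    const wchar_t* path = L"C:\\Users\\Cristian\\Desktop\\cafe.wav";
    // SND_SYNC blocks until playback finishes, so the program
    // doesn't exit before the sound is heard.
    PlaySoundW(path, NULL, SND_FILENAME | SND_SYNC);
    return 0;
}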
To play MP3 files in Win32 you need to use either:
MCI (Media Control Interface - an ancient API from the Win3x days, yet surprisingly the simplest, needing only two function calls; a fuller sketch follows this list):
mciSendString("open \"fileName.mp3\" type mpegvideo alias mp3", NULL, 0, NULL);
mciSendString("play mp3", NULL, 0, NULL);
DirectShow - This is the official Windows multimedia API, but it's based on COM and requires you to create a component-graph (file parser, decoders, output devices, etc) so it has a steep learning curve. See here for the minimum code required - almost 60 lines ( https://msdn.microsoft.com/en-us/library/windows/desktop/dd389098.aspx )
Windows Vista introduced MediaFoundation to replace DirectShow, but in my experience it's not much better than DirectShow in terms of programmer-ergonomics: https://msdn.microsoft.com/en-us/library/windows/desktop/ms703190(v=vs.85).aspx
For Windows 10, there is a WinRT API for playback - but I haven't done much research and I don't know if you can call it from "real" Win32 programs or if it's reserved only for sandboxed UWP applications: https://learn.microsoft.com/en-us/windows/uwp/audio-video-camera/media-playback
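Expanding on the MCI option above, here is a minimal sketch; the file name is hypothetical, and the ...A entry point is called explicitly so plain "..." strings work whether or not UNICODE is defined:
#include <windows.h>
#include <mmsystem.h>
#include <iostream>
#pragma comment(lib, "winmm.lib") // mciSendString lives in winmm.lib
int main()
{
    // Hypothetical file name - substitute a real MP3 on your machine.
    if (mciSendStringA("open \"C:\\music\\song.mp3\" type mpegvideo alias mp3",
                       NULL, 0, NULL) != 0)
    {
        std::cout << "open failed\n";
        return 1;
    }
    // "wait" makes the call block until playback finishes.
    mciSendStringA("play mp3 wait", NULL, 0, NULL);
    mciSendStringA("close mp3", NULL, 0, NULL);
    return 0;
}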
Related
How do I get the AppData folder in Windows using C?
I know that for C# you use Environment.SpecialFolder.ApplicationData
Use SHGetSpecialFolderPath with a CSIDL set to the desired folder (probably CSIDL_APPDATA or CSIDL_LOCAL_APPDATA).
You can also use the newer SHGetFolderPath() and SHGetKnownFolderPath() functions.
There's also SHGetKnownFolderIDList() and if you like COM there's IKnownFolder::GetPath().
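For illustration, a minimal sketch of the SHGetKnownFolderPath route (Vista and later, compiled as C); FOLDERID_RoamingAppData identifies the roaming AppData folder:
#include <windows.h>
#include <shlobj.h>   // SHGetKnownFolderPath, FOLDERID_RoamingAppData
#include <objbase.h>  // CoTaskMemFree
#include <stdio.h>
int main(void)
{
    PWSTR path = NULL;
    // Ask the shell for the roaming AppData folder.
    HRESULT hr = SHGetKnownFolderPath(&FOLDERID_RoamingAppData, 0, NULL, &path);
    if (SUCCEEDED(hr))
    {
        wprintf(L"%ls\n", path);
        CoTaskMemFree(path); // the shell allocates the string; the caller frees it
    }
    return 0;
}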
If I recall correctly it should just be
#include <stdlib.h>
getenv("APPDATA");
Edit: Just double-checked, works fine!
Using the %APPDATA% environment variable will probably work most of the time. However, if you want to do this the official Windows way, you should use the SHGetFolderPath function, passing the CSIDL value CSIDL_APPDATA or CSIDL_LOCAL_APPDATA, depending on your needs.
This is what the Environment.GetFolderPath() method is using in .NET.
EDIT: Joey correctly points out that this has been replaced by SHGetKnownFolderPath in Windows Vista. News to me :-).
You might use these functions:
#include <stdlib.h>
char *getenv(
const char *varname
);
wchar_t *_wgetenv(
const wchar_t *varname
);
Like so:
#include <stdio.h>
#include <stdlib.h>
char *appData = getenv("APPDATA");
if (appData != NULL) // getenv returns NULL if the variable is not set
    printf("%s\n", appData);
Sample code from MSDN:
TCHAR szPath[MAX_PATH];
if (SUCCEEDED(SHGetFolderPath(NULL,
CSIDL_APPDATA | CSIDL_FLAG_CREATE,
NULL,
0,
szPath)))
{
PathAppend(szPath, TEXT("MySettings.xml"));
HANDLE hFile = CreateFile(szPath, ...);
}
CSIDL_APPDATA = username\Application Data. In Windows 10 it is: username\AppData\Roaming
CSIDL_FLAG_CREATE = combine with CSIDL_ value to force folder creation in SHGetFolderPath()
You can also use:
CSIDL_LOCAL_APPDATA = username\Local Settings\Application Data (non roaming)
OK, I have a problem in my program. I asked a question about it before, but no one really understood my problem, so I'm trying a new approach this time.
If you're curious, this is my problem:
My program takes arguments in the form of char* argv[], and I'm having trouble making a pointer to whatever is in argv[1] using LPWSTR, since that type only points to const wchar_t* objects.
This is a new thing I'm trying in order to solve my problem (I've tried multiple things, but I need to know how to do what I'm thinking, or whether it's even possible).
Basically, my idea is to #define some sort of function that takes whatever is in argv[1] and defines a const wchar_t* with that value.
Something like this:
#define path ((const wchar_t*)argv[1])
I'm not sure that is the correct way (or even if it is possible) to do what I want to do...
If you have any better way of solving my problem, please (please) tell me how and help me out; I've been thinking about this for a really long time!
Explanation of my program:
I'm making a program that receives arguments. The arguments are the names of drives, for example "F:". It then uses the function CreateFile with the drive letter. If you go here and look at the first parameter of the function, I think you will see what I mean... the problem is that for me to make an LPWSTR, I would need a const wchar_t* object... I hope my question is clear this time; last time people didn't really understand what I was trying to do.
Regardless, thanks!
EDIT 1: here are some lines of my program (this is how I have to do it for it to work, without arguments) (I used a fixed value here)
#include <windows.h>
int main()
{
HANDLE device;
device = CreateFile(L"\\\\.\\F:", // Drive to open
GENERIC_READ | GENERIC_WRITE, // Access mode
FILE_SHARE_READ | FILE_SHARE_WRITE, // Share Mode
NULL, // Security Descriptor
OPEN_EXISTING, // How to create
0, // File attributes
NULL);
}
This is with arguments (doesn't work)
int main(int argc, char* argv[])
{
HANDLE device;
device = CreateFile(argv[1], // Drive to open
GENERIC_READ | GENERIC_WRITE, // Access mode
FILE_SHARE_READ | FILE_SHARE_WRITE, // Share Mode
NULL, // Security Descriptor
OPEN_EXISTING, // How to create
0, // File attributes
NULL); // Handle to template
}
^ this shows what I'm trying to do
EDIT 2: I changed CreateFile to CreateFileA, and these are the error codes it gives me (drive D is a USB stick, not a hard drive).
So unless I'm typing the path the wrong way, it always gives me errors. I think I'll try another way to solve the problem, but if someone knows why those errors are happening, please tell me!
This is a completely different question, and has nothing to do with wchar_t.
In your first snippet you passed "\\\\.\\F:" (AKA \\.\F: once we remove the C escaping); in your tries from the command line you never passed this path, but instead, respectively:
D - so it tried to open a file named D in the current directory, and it didn't find it (error 2, aka ERROR_FILE_NOT_FOUND);
D:\ - the root directory, which cannot be opened with CreateFile (error 3, aka ERROR_PATH_NOT_FOUND);
D: - the current directory on the drive D:, which again cannot be opened with CreateFile (error 5, aka ERROR_ACCESS_DENIED);
\D: - a file named "D:" in the root of the current drive, which cannot be created given that D: is not a valid file name (error 123, aka ERROR_INVALID_NAME).
To open a drive as a device, you must use the \\.\X: path (where X is the drive letter); you cannot just throw whatever floats in your mind and hope that it'll work. Call your program from the command line passing "\\.\D:" and it'll work fine.
Of course if you want to keep it simpler for the user you can accept just the drive letter on the command line and write some code to create the string required by CreateFile based on it.
if(argc < 2) { // argv[1] must exist
    printf("Not enough arguments\n");
    return 1;
}
const char *drive = argv[1];
char d = drive[0];
// accept both `d` and `d:` as argument for the drive
if(!((d>='a' && d<='z') || (d>='A' && d<='Z')) ||
   (drive[1]!=0 && (drive[1]!=':' || drive[2]!=0))) {
    printf("Invalid drive specifier: `%s`\n", drive);
    return 2;
}
char path[]="\\\\.\\X:";
path[4] = d; // patch in the drive letter
// now you can use path as argument to CreateFileA
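From here, a short follow-on sketch of the actual open; CreateFileA is called explicitly, since path is a plain char string:
HANDLE device = CreateFileA(path, // e.g. "\\.\D:"
                            GENERIC_READ | GENERIC_WRITE,
                            FILE_SHARE_READ | FILE_SHARE_WRITE,
                            NULL, OPEN_EXISTING, 0, NULL);
if(device == INVALID_HANDLE_VALUE) {
    printf("CreateFileA failed: error %lu\n", GetLastError());
    return 3;
}
// ... use the handle ...
CloseHandle(device);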
What follows was the original answer, which is still valid but it addresses a completely different problem, unrelated to the actual problem OP is experiencing
You cannot make an LPWSTR point to a char *, especially not by brutally casting the pointer - casting a pointer just makes the compiler shut up; it doesn't change the fact that what you are pointing at is not a wchar_t string. If you want to pass a char * to a function expecting a wchar_t *, you have to perform an actual conversion of the pointed-to data.
Now, you have several possible solutions:
you can use _wmain and receive your command line arguments directly as wide characters;
you can convert your local-encoding strings to UTF-16 strings with a function such as MultiByteToWideChar; this can be encapsulated in a function returning a std::wstring (see the sketch after this list);
you can just invoke the ANSI version of the API and let it deal with it; almost all Win32 APIs have both an ANSI and a Unicode version, suffixed with A and W (CreateFile is just a macro that expands to CreateFileA or CreateFileW depending on the UNICODE macro). So, you can use CreateFileA and pass it your string as-is.
The last two solutions are not great, because using local-encoding strings as command line arguments prevents your program from opening files whose names contain arbitrary Unicode characters. On the other hand, using wchar_t almost everywhere is quite a pain, since wide strings "infect" virtually every string-processing corner of your application. The correct (IMHO) way out is to use UTF-8 everywhere and convert on the fly when talking to the operating system; see here for details.
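A minimal sketch of that second option, assuming the input is in the local ANSI code page (error handling kept to a minimum):
#include <windows.h>
#include <string>
// Convert a local-code-page string to UTF-16.
std::wstring widen(const char* s)
{
    // First call asks how many wchar_ts are needed (including the terminator).
    int len = MultiByteToWideChar(CP_ACP, 0, s, -1, NULL, 0);
    if (len <= 0) return std::wstring();
    std::wstring out(len, L'\0');
    MultiByteToWideChar(CP_ACP, 0, s, -1, &out[0], len);
    out.resize(len - 1); // drop the embedded null terminator
    return out;
}
// Usage: CreateFileW(widen(argv[1]).c_str(), ...);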
#include <iostream>
#include <windows.h>
using namespace std;
int main(){
LPWSTR test = L"c:/aizen.png";
int result = SystemParametersInfo(SPI_SETDESKWALLPAPER, 0, test, SPIF_UPDATEINIFILE);
if(result)
cout << "Wallpaper set!";
else
cout << "Error: " << GetLastError();
cin >> result;
return 0;
}
The code is simply meant to change the background wallpaper, but I keep getting Error: 2, which means "file not found". However, the file is there! I'm using Microsoft Visual Studio 2010 and I've tried running as admin, checking case sensitivity, changing slashes, etc. What am I doing wrong?
Error 2 is File not found.
First, make sure that aizen.png is actually located in the root folder of drive C:\ (which on Vista and above is not likely, given that non-admin users don't typically have write access there).
If the file is indeed there, the problem is most likely that you're not properly escaping backslashes:
LPWSTR test = L"c:\\aizen.png";
The problem is you are passing a UNICODE string - LPWSTR - to an API that takes ANSI.
Nearly all Win32 APIs (all that take strings, at any rate) come in two versions: one that ends in ...A for ANSI (8-bit characters), and one that ends in ...W for wide-char, aka Unicode (technically not 'real' Unicode, but that's more than is worth getting into in this reply).
If you have UNICODE #defined at compilation time, then the plain unadorned version gets #defined as the ...W version; otherwise it gets #defined as the ...A version. Take a look at winuser.h, and you'll see:
WINUSERAPI
BOOL
WINAPI
SystemParametersInfoA(
__in UINT uiAction,
__in UINT uiParam,
__inout_opt PVOID pvParam,
__in UINT fWinIni);
WINUSERAPI
BOOL
WINAPI
SystemParametersInfoW(
__in UINT uiAction,
__in UINT uiParam,
__inout_opt PVOID pvParam,
__in UINT fWinIni);
#ifdef UNICODE
#define SystemParametersInfo SystemParametersInfoW
#else
#define SystemParametersInfo SystemParametersInfoA
#endif // !UNICODE
Note that Windows has two SystemParametersInfo functions; the W one expects wide LPWSTR and the A one expect plain LPSTRs; and whether you have UNICODE defined or not selects which is the 'default' one. (You can always add the A or W manually to call either explicitly.)
What's likely happening in your original code is that because you do not have UNICODE defined, you end up using the ...A version, which expects an ANSI string, but you're passing in a UNICODE string - so it doesn't work.
The "bit of a change" you made to get it working is more than just a bit: you're now passing an ANSI string to the ...A version of the API so it works fine:
int result = SystemParametersInfo(SPI_SETDESKWALLPAPER, 0, (void*)"c:/aizen.jpg", SPIF_UPDATEINIFILE);
Alternatively, you could call the W version explicitly with a LPWSTR:
int result = SystemParametersInfoW(SPI_SETDESKWALLPAPER, 0, L"c:\\aizen.jpg", SPIF_UPDATEINIFILE);
Or, you could define UNICODE at the start of your app, use L"..." strings, and the plain versions of the APIs - just add #define UNICODE at the top of your original app before #include <windows.h>. (UNICODE is more usually defined in a makefile or in compiler settings instead of being defined explicitly in code, so if you're new to Win32 programming, it can come as something of a surprise feature.)
Note that LPWSTR is not deprecated; if anything, it's the opposite; typical Win32 practice since XP or so has been to use W-flavor strings across the board, so it's effectively the plain "..." strings that are considered 'deprecated' on Win32. (For example, many COM APIs use only wide strings.)
Most other functions have some protection against this; if you accidentally try to pass an ANSI string to, say, SetWindowTextW, you'll get a compile-time error, because the LPSTR you are passing in doesn't match the expected LPWSTR type the function is expecting. But SystemParametersInfo is tricky; it takes a void* for the data parameter, so it will accept [almost] anything at compile time, and it's only when you call the function at runtime that you'll hit the error.
--
This, by the way, is what David Heffernan pointed out in the answer to your question the first time you posted it -
Some possible causes spring to mind:
...
You have an ANSI/Unicode encoding mismatch.
It's very weird; it looks like if you compile with the MinGW C++ compiler, it actually compiles with a bit of a change:
int main(){
int result = SystemParametersInfo(SPI_SETDESKWALLPAPER, 0, (void*)"c:/aizen.jpg", SPIF_UPDATEINIFILE);
This would work, and apparently LPWSTR is deprecated...
Visual Studio was probably having privilege issues. Run it as Admin and try again.
I am currently writing an application which requires me to call GetWindowText on arbitrary windows and store that data to a file for later processing. Long story short, I noticed that my tool was failing on Battlefield 3, and I narrowed the problem down to the following character in its window title:
http://www.fileformat.info/info/unicode/char/2122/index.htm
So I created a little test app which just does the following:
std::wcout << L"\u2122";
Lo and behold, that breaks output to the console window for the remainder of the program.
Why is the MSVC STL choking on this character (and I assume others) when APIs like MessageBoxW etc display it just fine?
How can I get those characters printed to my file?
Tested on both VC10 and VC11 under Windows 7 x64.
Sorry for the poorly constructed post, I'm tearing my hair out here.
Thanks.
EDIT:
Minimal test case
#include <fstream>
#include <iostream>
int main()
{
{
std::wofstream test_file("test.txt");
test_file << L"\u2122";
}
std::wcout << L"\u2122";
}
Expected result: '™' character printed to console and file.
Observed result: File is created but is empty. No output to console.
I have confirmed that the font I'm using for my console is capable of displaying the character in question, and the file is definitely empty (0 bytes in size).
EDIT:
Further debugging shows that the 'failbit' and 'badbit' are set in the stream(s).
EDIT:
I have also tried using Boost.Locale and I am having the same issue even with the new locale imbued globally and explicitly to all standard streams.
To write into a file, you have to set the locale correctly. For example, if you want to write the characters as UTF-8, you have to add
const std::locale utf8_locale
= std::locale(std::locale(), new std::codecvt_utf8<wchar_t>());
test_file.imbue(utf8_locale);
You have to add these 2 include files
#include <codecvt>
#include <locale>
To write to the console you have to set the console to the correct mode (this is Windows-specific) by adding
_setmode(_fileno(stdout), _O_U8TEXT);
(in case you want to use UTF-8).
For this you have to add these 2 include files:
#include <fcntl.h>
#include <io.h>
Furthermore, you have to make sure that you are using a font that supports Unicode (such as, for example, Lucida Console). You can change the font in the properties of your console window.
The complete program now looks like this:
#include <fstream>
#include <iostream>
#include <codecvt>
#include <locale>
#include <fcntl.h>
#include <io.h>
int main()
{
const std::locale utf8_locale = std::locale(std::locale(),
new std::codecvt_utf8<wchar_t>());
{
std::wofstream test_file("c:\\temp\\test.txt");
test_file.imbue(utf8_locale);
test_file << L"\u2122";
}
_setmode(_fileno(stdout), _O_U8TEXT);
std::wcout << L"\u2122";
}
Are you always using std::wcout or are you sometimes using std::cout? Mixing these won't work. Of course, the error description "choking" doesn't say at all what problem you are observing. I'd suspect that this is a different problem to the one using files, however.
As there is no real description of the problem, it takes somewhat of a crystal ball followed by a shot in the dark to hit the problem... Since you want to get Unicode characters from your file, make sure that the file stream you are using uses a std::locale whose std::codecvt<...> facet actually converts to a suitable Unicode encoding.
I just tested GCC (versions 4.4 through 4.7) and MSVC 10, which all exhibit this problem.
Equally broken is wprintf, which fares no better than the C++ stream API.
I also tested the raw Win32 API to make sure nothing else was causing the failure, and this works:
#include <windows.h>
int main()
{
HANDLE out = GetStdHandle(STD_OUTPUT_HANDLE); // not named stdout - that's a macro in <stdio.h>
DWORD n;
WriteConsoleW(out, L"\u03B2", 1, &n, NULL);
}
Which writes β to the console (if you set cmd's font to something like Lucida Console).
Conclusion: wchar_t output is horribly broken in both large C++ Standard library implementations.
Although the wide character streams take Unicode as input, that's not what they produce as output - the characters go through a conversion. If a character can't be represented in the encoding that it's converting to, the output fails.
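A tiny sketch of that failure mode, and of recovering the stream, under the default "C" locale:
#include <iostream>
int main()
{
    std::wcout << L"\u2122"; // the conversion fails and failbit is set
    if (std::wcout.fail())
    {
        std::wcout.clear(); // reset the error state
        std::wcout << L"stream usable again\n"; // output works once more
    }
}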
I am just learning C++ and am trying to write a small program to change the desktop wallpaper. Using the documentation here, I wrote this program:
#include <windows.h>
#include <stdio.h>
#pragma comment(lib, "user32.lib")
void main(){
BOOL success = SystemParametersInfo(
SPI_SETDESKWALLPAPER, //iuAction
0, //uiParam
"C:\\test.jpg", //pvParam
SPIF_SENDCHANGE //fWinIni
);
if (success){
printf("Success!\n");
}else
printf("Failure =(\n");
}
The program always fails when I try to specify a file path for pvParam. It will correctly clear the wallpaper if I set pvParam to "". What am I doing wrong?
Thanks
-Abhorsen
In addition to Dennis' comment about JPEG files, it also matters whether or not you compile with UNICODE in effect. If you do, then you'll have to specify the file as L"C:\\test.jpg". Note the L in front of the string; that makes it a wide string. Or use SystemParametersInfoA(), note the A (but it's archaic).
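For instance (a minimal sketch; the path is hypothetical, and see the JPEG caveat below):
BOOL ok = SystemParametersInfoW(SPI_SETDESKWALLPAPER, 0,
                                (PVOID)L"C:\\test.jpg", SPIF_SENDCHANGE);
// or, with the ANSI entry point and a narrow string:
BOOL ok2 = SystemParametersInfoA(SPI_SETDESKWALLPAPER, 0,
                                 (PVOID)"C:\\test.jpg", SPIF_SENDCHANGE);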
Depending on the OS version, pvParam might not work.
If you are using Windows XP coupled with a JPEG file you are trying to assign as a wallpaper, notice the comment in the docs:
Windows Server 2003 and Windows XP/2000: The pvParam parameter cannot specify a .jpg file.