I used GetLastInputInfo to check the time of the last input from the mouse and keyboard.
On my desktop PC it works correctly, but when I run the program on my laptop it does not: I can see LASTINPUTINFO changing every 10-15 seconds even when I am idle.
I then wrote a test program that watches all input from the mouse and keyboard and stores the last input time from each device, but that time does not change while I am idle.
How can I find out what (device or program) is generating the activity that updates the LASTINPUTINFO struct?
You can use Raw Input to see if the activity is coming from the actual mouse/keyboard itself. If it is, you might have a faulty device driver, or a driver that is running some kind of internal timer to generate a steady flow of input events.
If GetLastInputInfo() updates without any Raw Input activity being reported, then a running app is most likely using an input injection API like mouse_event(), keybd_event(), or SendInput(). You would have to hook those directly to find out which app is calling them.
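A minimal sketch of that Raw Input check, assuming a desktop app (the class name, the 5-second timer and the console output are arbitrary choices): register for keyboard and mouse raw input with RIDEV_INPUTSINK, remember when the last WM_INPUT arrived, and periodically compare that against GetLastInputInfo(). If GetLastInputInfo() keeps advancing while no WM_INPUT is seen, the activity is being injected by software rather than coming from a physical device.

#include <windows.h>
#include <stdio.h>

static DWORD g_lastRawInput = 0;          // tick count of the last WM_INPUT we saw

LRESULT CALLBACK WndProc(HWND hwnd, UINT msg, WPARAM wp, LPARAM lp)
{
    switch (msg)
    {
    case WM_INPUT:                        // hardware-level input arrived
        g_lastRawInput = GetTickCount();
        break;                            // still let DefWindowProc do its cleanup
    case WM_TIMER:                        // periodic comparison
    {
        LASTINPUTINFO lii = { sizeof(lii) };
        GetLastInputInfo(&lii);
        printf("GetLastInputInfo tick: %lu, last WM_INPUT tick: %lu\n",
               (unsigned long)lii.dwTime, (unsigned long)g_lastRawInput);
        break;
    }
    }
    return DefWindowProcW(hwnd, msg, wp, lp);
}

int main()
{
    HINSTANCE hInst = GetModuleHandleW(NULL);

    WNDCLASSW wc = {};
    wc.lpfnWndProc   = WndProc;
    wc.hInstance     = hInst;
    wc.lpszClassName = L"RawInputMonitor";   // hypothetical class name
    RegisterClassW(&wc);

    // Hidden window that only exists to receive WM_INPUT and WM_TIMER.
    HWND hwnd = CreateWindowW(L"RawInputMonitor", L"", 0, 0, 0, 0, 0,
                              NULL, NULL, hInst, NULL);

    // Usage page 0x01 (generic desktop): usage 0x06 = keyboard, 0x02 = mouse.
    // RIDEV_INPUTSINK delivers input even while the window is in the background.
    RAWINPUTDEVICE rid[2] = {
        { 0x01, 0x06, RIDEV_INPUTSINK, hwnd },
        { 0x01, 0x02, RIDEV_INPUTSINK, hwnd },
    };
    RegisterRawInputDevices(rid, 2, sizeof(RAWINPUTDEVICE));

    SetTimer(hwnd, 1, 5000, NULL);           // compare every 5 seconds (arbitrary)

    MSG msg;
    while (GetMessageW(&msg, NULL, 0, 0))
    {
        TranslateMessage(&msg);
        DispatchMessageW(&msg);
    }
    return 0;
}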
We are building an experiment with PyGaze, using PsychoPy to handle all the screen-related work and the keyboard input. Since we are using multiple timers to trigger events, we are trying to use threading so that the different tasks do not block each other. We are therefore also planning to create an event handler for all inputs, including the keyboard, an eye tracker, and other hardware.
But if we try to run the event handler within its own thread, we can't get any keyboard input. Everything runs without any errors, but there is simply no keyboard input coming through to the psychopy.event.getKeys() call.
PsychoPy itself is initialised in the main thread, since otherwise it causes problems with OpenGL and pyglet.
Is there any way to get this setup running or is it fundamentally not compatible with PsychoPy?
I am making an RPG game with C++/x86 asm. My question is related to the C++ component. In C++/win32 how would I detect if the computer is shutting down or turning off, or whatever else - so that I can save the game session. My game has a 'save' option, but if the user or another program decides to shut off the computer how can I detect this with some kind of API hook so that I can instantly save the game session to the text file.
Now please don't suggest an option by creating a thread to run passively as I want to keep the file size to a minimum, so if you can suggest some sort of WM_ hook that would be great. I'd refer to MSDN but I don't want to be searching for hours through their WM directory.
You can handle session saving in response to the WM_ENDSESSION message. Note that after your message handler returns, your process may be terminated at any time, so you need to do the saving directly inside the message handler, and not just set a flag for some later code to act on, because that later code might never get to execute.
A comment suggests the WM_QUERYENDSESSION message. This has a slightly different meaning: it gives applications the chance to complain about the session ending, and gives the user a chance to not log off / shut down the system. Based on your question, you have no intention of preventing any such thing, so WM_ENDSESSION seems like a better match to me.
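In a window procedure that could look like the sketch below, where SaveGameSession() is just a placeholder for whatever save routine the game already has:

#include <windows.h>

void SaveGameSession();   // hypothetical: your existing save-to-file routine

LRESULT CALLBACK WndProc(HWND hwnd, UINT msg, WPARAM wParam, LPARAM lParam)
{
    switch (msg)
    {
    case WM_QUERYENDSESSION:
        // No reason to object to the logoff/shutdown, so agree to it.
        return TRUE;

    case WM_ENDSESSION:
        if (wParam)   // TRUE means the session really is ending
        {
            // Save right here, before returning: the process may be
            // terminated at any point after this handler returns.
            SaveGameSession();
        }
        return 0;

    default:
        return DefWindowProc(hwnd, msg, wParam, lParam);
    }
}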
// Create the mastering voice (the output stage)...
m_audioEngine->CreateMasteringVoice(
    &m_masteringVoice,
    XAUDIO2_DEFAULT_CHANNELS,
    sampleRate,
    0,
    NULL);

// ...and a source voice that uses the callback handler.
m_audioEngine->CreateSourceVoice(
    &implData->sourceVoice,
    format,
    0,
    XAUDIO2_DEFAULT_FREQ_RATIO,
    reinterpret_cast<IXAudio2VoiceCallback*>(&implData->callbackHander),
    nullptr,
    nullptr);
The above code always seems to run fine when I have my earphones plugged in.
If I start my game without earphones, the call sometimes (not always) fails. It always returns the same HRESULT: 0x88890017.
Any ideas?
If I put a breakpoint directly after this, it seems not to throw an error... Does this task run asynchronously?
EDIT:
My IXAudio2SourceVoice keeps getting lost at random, and that is why my program crashes. What can cause it to be lost? It only happens when earphones are not plugged in while the XAudio2 objects are being created.
What does this error mean?
This error code is known as AUDCLNT_E_CPUUSAGE_EXCEEDED and occurs when the audio engine is taking too long to process audio packets. This typically happens when the CPU usage of the audio engine exceeds a certain threshold; the engine will fail to create new streams while its CPU usage is above that threshold.
Resolving: The User
CPU usage is subject to several factors: the processing power of your CPU, the number of channels you're using, and the audio device enhancements enabled at the system level. Possible solutions are to ensure a decent CPU (check the minimum system requirements), to lower the number of channels in the application/game settings, or to disable some system-level audio device enhancements in your operating system. For the latter, check your Task Manager for CPU usage; if one of the suspicious processes is "audiodg.exe", go into the Sound control panel, double-click each of your playback devices in turn, go to the Enhancements tab, and check the "Disable all enhancements" box. This should lower the required CPU usage and solve your problem.
Resolving: The Coder
Keep in mind that the more your audio code is doing, the more CPU cycles it will require. If you have an IXAudio2 device created with a ton of effect processors in the chain, 1000 SubmixVoices and hundreds of SourceVoices, that's asking for trouble. Before you point your finger at the CPU or at the system-level audio device enhancements, make sure it isn't simply your own code being inefficient.
Your big friend here is IXAudio2::GetPerformanceData, which queries the engine and fills in an XAUDIO2_PERFORMANCE_DATA structure for you. This gives you information about the CPU cycles being used, so chances are good you can catch this condition before the error actually occurs. When you detect heavy CPU usage, or when the error does occur, it isn't necessarily a reason to have things fail in your game/engine/framework. You could retry, adjust the number of SubmixVoices, choose not to create a SourceVoice, or temporarily suspend audio/switch to a null device and inform the user.
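As a rough sketch (the function name and the 10% threshold are made-up example values, not anything official), a check like this could be run before creating additional voices:

#include <xaudio2.h>

// Returns true when the XAudio2 engine looks too busy to safely create new voices.
bool AudioEngineIsUnderHeavyLoad(IXAudio2* audioEngine)
{
    XAUDIO2_PERFORMANCE_DATA perf = {};
    audioEngine->GetPerformanceData(&perf);

    // Fraction of the CPU time measured since the last query that was spent
    // inside the audio engine.
    double audioShare = 0.0;
    if (perf.TotalCyclesSinceLastQuery != 0)
    {
        audioShare = static_cast<double>(perf.AudioCyclesSinceLastQuery) /
                     static_cast<double>(perf.TotalCyclesSinceLastQuery);
    }

    // A rising glitch count is another sign that the engine cannot keep up.
    return audioShare > 0.10 || perf.GlitchesSinceEngineStarted > 0;
}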
You could set up an event or callback to report heavy CPU usage in the audio engine. The application can then suggest that the user lower the number of channels in the settings (or adjust things automatically) or turn off some system-level audio device enhancements.
In my project I have a train that moves until it reaches a particular point; the movement is driven by glutTimerFunc. I now want the train to start again from the location where I click the mouse and move to a particular destination.
The problem is that the timer keeps running even after the train reaches that point, so even when I reinitialise the starting point it does not work (the train continues from its previous location).
I need to stop the old timer and start a new timer for the train's new route.
The API documentation has the following to say:
There is no support for canceling a registered callback. Instead, ignore a callback based on its value parameter when it is triggered.
So, add a boolean (or use the value parameter) to your software and ignore the callback when it fires after it is no longer wanted. In the long run, however, it would be better to use a clock-based approach rather than an event-driven timer and do your timed updates manually every time the main loop runs: measure the time since the last update and decide whether to perform one or more update ticks. This is how physics and other time-based simulations are handled in most professional software; the event-driven model sets you up to miss timed events or handle them excessively late.
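A minimal sketch of the ignore-the-stale-callback approach, using the value parameter as a generation counter instead of a plain boolean (the positions, the stop point and the 16 ms interval are made-up example values):

#include <GL/glut.h>

static int   g_trainGeneration = 0;   // bumped every time the train is (re)started
static float g_trainX = 0.0f;         // hypothetical train position
static float g_trainStopX = 100.0f;   // hypothetical stop point

void trainTimer(int generation)
{
    if (generation != g_trainGeneration)
        return;                        // a stale, "cancelled" timer - do nothing

    g_trainX += 1.0f;                  // advance the train one step
    glutPostRedisplay();

    if (g_trainX < g_trainStopX)       // keep ticking until the stop point
        glutTimerFunc(16, trainTimer, generation);
}

void startTrainAt(float x)             // e.g. call this from the mouse handler
{
    ++g_trainGeneration;               // invalidates any timer still in flight
    g_trainX = x;
    glutTimerFunc(16, trainTimer, g_trainGeneration);
}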
Welcome to the world of game engines and actors.
My recommendation is that you don't try to do this by turning glutTimerFunc on or off directly. The timer function should be the top-level "heartbeat" for the entire program, and its job is just to tell every object that has behaviour - an "actor" - that it should update itself. The train should have its own internal state that knows where it is and whether it should be moving or not.
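A sketch of that structure, with made-up names and a fixed 16 ms heartbeat just for illustration:

#include <GL/glut.h>

struct Train                              // an "actor": owns its own state
{
    float x = 0.0f;                       // current position
    float targetX = 0.0f;                 // where it should stop
    bool  moving = false;

    void startTowards(float from, float to) { x = from; targetX = to; moving = true; }

    void update()                         // called once per heartbeat tick
    {
        if (!moving) return;
        x += 1.0f;                        // move one step
        if (x >= targetX) moving = false; // reached the stop point
    }
};

static Train g_train;

void heartbeat(int)
{
    g_train.update();                     // tell every actor to update itself
    glutPostRedisplay();
    glutTimerFunc(16, heartbeat, 0);      // the heartbeat itself never stops
}

The mouse callback then only needs to call g_train.startTowards(clickX, destinationX); the timer itself is never started or stopped.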
I'm working on a Windows Mobile 6.5 application that has a dialog box that displays input from a camera and has a button to save a snapshot of the stream. The camera API recommends calling the function that updates the view of the stream when the application is idle, via the Windows message loop, but doesn't get any more specific than that. After much Googling, I still can't find anything helpful about actually implementing something like this.
Does anyone know how this might be achieved?
You'll have to implement a message loop, not using the conventional GetMessage, which blocks until a message exists in the thread's message queue [1], but rather using PeekMessage, which returns FALSE if no message exists [1].
If it returns FALSE, you do your idle processing. Note that you should divide your idle processing into small enough chunks that it doesn't make your app unresponsive.
This is also a classic alternative to threading on a single CPU or core.
[1] or needs to be synthesized (paint or timer messages)
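A minimal sketch of such a loop, where UpdateCameraPreview() is just a stand-in for the camera API call the question refers to:

#include <windows.h>

void UpdateCameraPreview();   // hypothetical: refreshes the camera stream view

int RunMessageLoop()
{
    MSG msg;
    for (;;)
    {
        // Drain every pending message without blocking.
        while (PeekMessage(&msg, NULL, 0, 0, PM_REMOVE))
        {
            if (msg.message == WM_QUIT)
                return (int)msg.wParam;
            TranslateMessage(&msg);
            DispatchMessage(&msg);
        }

        // The queue is empty: the application is "idle", so do one small
        // chunk of idle work, then go back to checking for messages.
        UpdateCameraPreview();
    }
}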