Firefox interacts with my timers somehow?! This is crazy ^^ (C++)

I have a problem with the timers in my Win32 C++ DirectX game (a little demo). I am using timeGetTime to get the current time and use the delta time for animation playback and game logic (I also use a constant for normalization when animating). I sometimes use a tick rate as low as 2 milliseconds, and I use only ULONGs when working with time. The game works just fine, but if I start Mozilla Firefox and then start the game, everything plays very fast (the animations and the game logic), like fast-forwarding. The strange thing is that a few timers seem unaffected by this. Does anyone have a clue? What is the connection between Firefox and my timers? After exiting Firefox it stays the same, but after some undefined time or a PC restart it goes back to normal. Any suggestions are appreciated, even long shots. Thank you.

Yes, the default precision of timeGetTime is 5 ms or more.
This can be altered with the timeBeginPeriod and timeEndPeriod functions.
Altering the precision of timeGetTime affects all running applications.
My guess is that Firefox calls those functions, and that is what affects your application.
Switch to the QueryPerformanceCounter/QueryPerformanceFrequency functions instead; they provide high-resolution timing and are not subject to the issues of timeGetTime.
EDIT:
A couple of links that show you how to use the functions.
http://www.programmersheaven.com/mb/windows/311148/311148/using-queryperformancecounter/
And also note the remarks on the MSDN page:
http://msdn.microsoft.com/en-us/library/ms644904(v=vs.85).aspx
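For illustration, here is a minimal sketch of delta timing with those two calls (the loop is a stand-in for a game loop; error handling is omitted):

#include <windows.h>
#include <cstdio>

int main()
{
    LARGE_INTEGER freq, prev, now;
    QueryPerformanceFrequency(&freq);   // counts per second, fixed at system boot
    QueryPerformanceCounter(&prev);

    for (int frame = 0; frame < 100; ++frame)   // stand-in for the game loop
    {
        QueryPerformanceCounter(&now);
        // Elapsed seconds since the previous iteration.
        double dt = double(now.QuadPart - prev.QuadPart) / double(freq.QuadPart);
        prev = now;
        printf("delta: %f s\n", dt);    // drive animations/logic with dt instead
    }
    return 0;
}

Unlike timeGetTime, the performance counter's resolution is not altered by other processes calling timeBeginPeriod.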

Related

Native C++ WebRTC client on WiFi reduces video quality, but calls from Chrome are fine

I have a pet project with WebRTC audio-video calls. Currently, calls from the browser work exactly as expected.
I'm also trying a C++/Qt client based on native C++ WebRTC (it is just a peer_connection_client example with modified signalling).
It works, but I've found an issue: when I make a call from one PC to another, the video quality degrades drastically until the bandwidth settles around 250-300 kbps (or 500 kbps total, I'm not sure).
As I said, there is a web version working on the same signalling, and I tested it on the same PCs. The result really surprised me: no quality reduction and around 2000 kbps of network load.
Moreover, if I connect the PC to my router with a cable, the quality is fine and the load is 2000 kbps, as expected.
I suppose the problem lies somewhere around the WiFi bandwidth estimator, but I can't work out how to control its behavior.
Does anyone have ideas on how I can improve the quality and make WebRTC use 2000 kbps rather than 500 kbps?
Thanks in advance,
Br,
Sergey
While moving forward, I've found out that I have to call rtc::Thread::ProcessMessages() periodically to avoid getting stuck on the signalling thread's events.
But after that a new problem occurred: "UDP send of XXX bytes failed with error 10035", which is described at https://groups.google.com/forum/#!topic/discuss-webrtc/wmYo7AU3evI.
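For anyone hitting the same stall, here is a rough sketch of the pump I mean, assuming a Qt event loop running on the thread that WebRTC considers its signalling thread (the header path and the 10 ms interval are assumptions, not taken from the example):

#include <QObject>
#include <QTimer>
#include "rtc_base/thread.h"   // path varies across WebRTC revisions

void installMessagePump(QObject* parent)
{
    auto* pump = new QTimer(parent);
    QObject::connect(pump, &QTimer::timeout, []() {
        // Handle any pending WebRTC messages without blocking (0 ms budget).
        rtc::Thread::Current()->ProcessMessages(0);
    });
    pump->start(10);   // arbitrary interval
}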

GUI loading issue on Windows compared to OS X

I use JUCE C++ 4.0.2 to build an audio plugin with a relatively heavy GUI. It takes 5 s to load the GUI in a DAW like Reaper on OS X, but it takes 10 times longer on Windows using the same DAW.
I eventually figured out that this is due to the Typeface::createSystemTypefaceFor function, which takes 100 ms per call on Windows. It was an issue on my side because I called it many times.
Has anyone faced the same issue?
Typeface::createSystemTypefaceFor is not designed to be called frequently; ideally you should call it once for each typeface and cache the results. Calling it frequently causes a performance hit that varies by platform, as you are experiencing.
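A minimal sketch of that caching, assuming the font is embedded via the Projucer-generated BinaryData (the font name here is hypothetical):

#include <JuceHeader.h>   // or "../JuceLibraryCode/JuceHeader.h" in older projects

juce::Typeface::Ptr getAppTypeface()
{
    // Created once on first call, then reused for the lifetime of the plugin,
    // so the ~100 ms Windows cost of createSystemTypefaceFor is paid only once.
    static juce::Typeface::Ptr typeface =
        juce::Typeface::createSystemTypefaceFor (BinaryData::MyFont_ttf,
                                                 BinaryData::MyFont_ttfSize);
    return typeface;
}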

DSSCL_EXCLUSIVE not giving exclusive audible output (DirectSound)

Very simple question. The MSDN documentation for the DirectSound API states that when my application is in focus, it will be the only audible program. This is exactly what I want to happen; however, when setting this flag and playing sound through my application, I can still hear the background music on my computer.
So the question is: why? Is it because the application playing the background music uses a different low-level API, and thus different mixing buffers? Or is there some other little trick I need to tweak in order to become the only audible application?
I asked a similar/related question here, with no response. But once again, if you don't know the answer to the specific DirectSound question but you know a way of becoming the only audible application with a different API, let me know!
Thanks. I'm on Windows XP 32-bit Professional, if it makes a difference.
A long time ago, the Windows developers realized that allowing one application to have total control of the audio system (thereby muting other apps) was a bad idea, and they subsequently deprecated many of these "exclusive" and foreground/background mode flags. I believe this behavior change goes all the way back to DirectX 7.1 (WinME) and then formally everywhere in DX 8. That was 10 years ago.
Imagine your video conferencing app becoming muted when you switched the foreground to an app that ran audio in some sort of exclusive mode. Not being able to reliably hear anyone when switching between apps is not a great experience.
As a matter of fact, prior to DX 8, many popular voice-comm apps for multiplayer gaming would continually sniff for the foreground window handle and use that for the call to SetCooperativeLevel so that they wouldn't get muted.
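For illustration only, that workaround looked roughly like this (a sketch, not a recommendation; on DX 8 and later DSSCL_EXCLUSIVE is treated the same as DSSCL_PRIORITY anyway):

#include <windows.h>
#include <dsound.h>

// Re-apply the cooperative level against whatever window currently has the
// foreground, as those pre-DX8 voice apps reportedly did to avoid being muted.
void RefreshCooperativeLevel(IDirectSound8* ds)
{
    HWND fg = GetForegroundWindow();
    if (fg != nullptr)
        ds->SetCooperativeLevel(fg, DSSCL_EXCLUSIVE);
}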
I think it would be interesting to know what it is you really want to do that makes you assume you need total control of the audio output.
On Vista and higher, there is the WASAPI API for low-level audio. I believe it has a concept of "exclusive" mode, but I don't know if it trumps other apps using the sound card. YMMV.

How do I detect desktop transition effects?

I want to minimise my application, take a screenshot of the current desktop, and return my application to its original state.
This has been working fine under Windows XP; however, in testing on different Vista machines, the minimise time of 200 milliseconds is no longer valid.
Is there a way to ask the operating system when it has finished these fancy effects, or to look up how long it has been given to perform the operation?
While I don't know of a way to do what you ask, I have a suggestion: instead of minimizing your application's window, why not hide it (with ShowWindow(SW_HIDE))? That will not be subject to the animation effects, so it should be pretty much instantaneous.
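A sketch of that approach (the actual screenshot code is elided):

#include <windows.h>

void CaptureDesktopBehind(HWND hwnd)
{
    ShowWindow(hwnd, SW_HIDE);   // hidden immediately, no minimize animation
    // ... grab the screen here, e.g. BitBlt from GetDC(nullptr) ...
    ShowWindow(hwnd, SW_SHOW);   // bring the window back
}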
Maybe instead of minimizing, you should bring the desktop to the front?
The closest I can find is SPI_GETUIEFFECTS, which tells you if such effects are enabled at all.
If enabled, you could of course use SPI_SETUIEFFECTS to turn them off. But that's a rather shotgun method - how would you restore them? It's probably better to temporarily turn off the ones that bother you most.
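The check itself is a one-liner; SystemParametersInfo fills in a BOOL:

#include <windows.h>

bool uiEffectsEnabled()
{
    BOOL enabled = FALSE;
    // SPI_GETUIEFFECTS reports the master switch for UI animation effects.
    SystemParametersInfo(SPI_GETUIEFFECTS, 0, &enabled, 0);
    return enabled != FALSE;
}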

Off screen rendering when laptop shuts screen down?

I have a lengthy number-crunching process which takes advantage of quite a bit of OpenGL off-screen rendering. It all works well, but when I leave it to work on its own while I go make a sandwich, I usually find that it has crashed by the time I come back.
I was able to determine that the crash occurs very close to the moment the laptop I'm using decides to turn off the screen to conserve energy. The crash itself is well inside the NVIDIA DLLs, so there is no hope of knowing what's going on.
The obvious solution is to turn off the power management feature that shuts down the screen and video card, but I'm looking for something more user friendly.
Is there a way to do this programmatically?
I know there's a SETI@home implementation which takes advantage of GPU processing. How does it keep the video card from going to sleep?
I'm not sure what OS you're on, but Windows sends a message when it is about to enter a new power state. You can listen for that and then either move processing to the CPU or deny the request to enter the lower-power state.
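On Windows that message is WM_POWERBROADCAST; here is a sketch of handling it in a window procedure (note that vetoing via PBT_APMQUERYSUSPEND only works before Vista):

#include <windows.h>

LRESULT CALLBACK WndProc(HWND hwnd, UINT msg, WPARAM wParam, LPARAM lParam)
{
    switch (msg)
    {
    case WM_POWERBROADCAST:
        if (wParam == PBT_APMQUERYSUSPEND)
            return BROADCAST_QUERY_DENY;   // deny the transition (pre-Vista only)
        if (wParam == PBT_APMSUSPEND)
        {
            // Suspend is happening regardless; pause the GPU work here.
        }
        return TRUE;
    }
    return DefWindowProc(hwnd, msg, wParam, lParam);
}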
For the benefit of Linux users encountering a similar issue, I thought I'd add that you can obtain similar notifications and inhibit power state changes using the DBUS API. An example script in Python, taken from the link, to inhibit power state changes:
#!/usr/bin/python
import dbus
import time

# Connect to the session bus and look up the PowerManagement service.
bus = dbus.Bus(dbus.Bus.TYPE_SESSION)
devobj = bus.get_object('org.freedesktop.PowerManagement',
                        '/org/freedesktop/PowerManagement')
dev = dbus.Interface(devobj, 'org.freedesktop.PowerManagement.Inhibit')

# Inhibit(app_name, reason) returns a cookie identifying this request.
cookie = dev.Inhibit('Nautilus', 'Copying files from /media/SANVOL')
time.sleep(10)           # power state changes are blocked during this window
dev.UnInhibit(cookie)    # release the inhibition
According to MSDN, there is an API that allows an application to tell Windows that it is still working and that Windows should not go to sleep or turn off the display.
The function is called SetThreadExecutionState (MSDN). It works for me, using the flags ES_SYSTEM_REQUIRED and ES_CONTINUOUS.
Note, however, that using this function does not stop the screen saver from running, which might interfere with your OpenGL app if the screen saver also uses OpenGL (or Direct3D).
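A minimal sketch of that call (add ES_DISPLAY_REQUIRED if the display itself must stay on):

#include <windows.h>

// Keep the system awake while the number-crunching runs.
// ES_CONTINUOUS makes the requirement persist until explicitly cleared.
void beginKeepAwake()
{
    SetThreadExecutionState(ES_CONTINUOUS | ES_SYSTEM_REQUIRED);
}

void endKeepAwake()
{
    SetThreadExecutionState(ES_CONTINUOUS);   // restore normal power management
}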