Windows Timer Resolution: Megawatts Wasted (randomascii.wordpress.com)
157 points by ceeK on July 10, 2013 | 41 comments



In the meantime, Linux has (by default) an adaptive timer tick and will soon be fully tickless [1]. In other words, there will be no fixed timer; the OS will calculate when the next wake-up should be scheduled and sleep until that time (or until an interrupt arrives).

At the same time, PowerTOP [2] will nicely show you which programs or drivers are responsible for waking up the computer and estimate how much power each program is consuming.

[1] https://lwn.net/Articles/549580/ [2] https://01.org/powertop/


According to the reddit thread at the bottom of the page, Windows 8 also has dynamic ticks.

http://arstechnica.com/information-technology/2012/10/better...


Yet Linux 3.6 still lasts only 1h15m at idle compared to Windows 7 at 4h45m on my laptop, and that's with all of powertop's optimisations enabled.

YMMV


What explains the abysmal power efficiency of Linux?


On consumer hardware like laptops, it is generally that the driver for the GPU is not able to put the hardware into its low power modes.

The power efficiency tends to be very good on server-class hardware (because large corporate users like Google tend to care a lot about it).


AMD has submitted dynamic power management for the 3.11 kernel, so hopefully this efficiency gap will start closing:

http://www.phoronix.com/scan.php?page=news_item&px=MTQwNTU


Try installing powertop, and see for yourself. It generally isn't the kernel.


Actually, while doing this is a good idea for individuals diagnosing their systems, there does seem to be a systemic issue. I hope I didn't come across as flippant.

I run TLP, turn off my Ethernet and wifi when not using them, and otherwise try to save power. Windows users do none of these things, yet they still seem to get okay battery life.

It does raise the question: if the problem isn't in the kernel but in userland, what is the issue, and can something be done?

I will gladly give money to someone who will work on figuring this out.


I have often wanted a tool like PowerTOP for Mac OS X. (And maybe also iPhone.) Is there one?


Mavericks (10.9) has it as part of Activity Monitor.


This is an interesting post. jQuery was fixed to use 13ms as the minimum animation interval some time ago. This seems like a legitimate Chrome bug to file, as the interval should be more deterministic. Chrome shouldn't take a 1ms tick unless it really needs it.

I wonder how much javascript code uses setTimeout(x,0) to push code to the end of the run loop.


Initially, Chrome attempted to allow setTimeout()s under the 15ms or so that was standard across browsers, which led to it winning some benchmarks and some accusations of foul play. The intent was pure -- why artificially clamp JavaScript timers to a Windows quirk? -- but eventually Chrome was changed to make timers behave like in other browsers. It appears that the spec now says 4ms is the minimum.

This bug (helpfully linked from MDN) has more of the story. https://code.google.com/p/chromium/issues/detail?id=792

I remember the Chrome timer code of years ago was careful to only adjust the interval when needed. From reading other bugs it looks like today's behavior is an accidental regression and will likely be fixed (until the next time it regresses).


> I remember the Chrome timer code of years ago was careful to only adjust the interval when needed. From reading other bugs it looks like today's behavior is an accidental regression and will likely be fixed (until the next time it regresses).

Indeed, although it seems the current behavior has been outstanding for some time:

https://code.google.com/p/chromium/issues/detail?id=153139

The original justification for lowering the resolution is an interesting read:

> At one point during our development, we were about to give up on using the high resolution timers, because they just seemed too scary. But then we discovered something. Using WinDbg to monitor Chrome, we discovered that every major multi-media browser plugin was already using this API. And this included Flash, Windows Media Player, and even QuickTime. Once we discovered this, we stopped worrying about Chrome's use of the API. After all – what percentage of the time is Flash open when your browser is open? I don't have an exact number, but it's a lot. And since this API effects the system globally, most browsers are already running in this mode. [1]

[1] http://www.belshe.com/2010/06/04/chrome-cranking-up-the-cloc...


> It appears that the spec now says 4ms is the minimum.

I was playing with Windows timers a little while ago and noticed that with IE11 open the timer interval sat at 15.6ms, occasionally changing to 4ms while the page was doing things. That was the first time I'd heard of a program calling timeBeginPeriod without setting it to 1ms. I hope it catches on.
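
For the curious, requesting it is a one-liner against winmm; here's a minimal sketch of a well-behaved caller (the 4ms value mirrors IE11's apparent behavior; the MSVC pragma and the Sleep placeholder are mine):

    #include <windows.h>
    #include <mmsystem.h>              // timeBeginPeriod / timeEndPeriod
    #pragma comment(lib, "winmm.lib")  // MSVC-style linking

    int main() {
        // Ask for a 4ms global tick instead of the ~15.6ms default. The OS
        // runs at the lowest interval any running process has requested.
        if (timeBeginPeriod(4) == TIMERR_NOERROR) {
            Sleep(10 * 1000);  // stand-in for timing-sensitive work
            // Every timeBeginPeriod must be paired with a timeEndPeriod of
            // the same value, or the raised resolution outlives the need.
            timeEndPeriod(4);
        }
        return 0;
    }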


It could use requestAnimationFrame in the future; then the JS won't need to specify an interval.

https://developer.mozilla.org/en-US/docs/Web/API/window.requ...


Use of requestAnimationFrame triggers high-precision timers in at least Chrome and Firefox...


> I wonder how much javascript code uses setTimeout(x,0) to push code to the end of the run loop.

Probably heaps, but why would that cause the system-wide timer to run more frequently? Even if it does need to it should only be temporary.


Any timer is constrained to the resolution set by the system timer. To go lower one would require something akin to a spin-loop, which defeats any gain by keeping the resolution high.
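
A quick way to observe this constraint on Windows (a sketch; the iteration count and output format are arbitrary) is to time Sleep(1) before and after raising the resolution; under the default ~15.6ms tick it returns an order of magnitude late:

    #include <windows.h>
    #include <mmsystem.h>
    #include <cstdio>
    #pragma comment(lib, "winmm.lib")

    // Average wall-clock duration of Sleep(1), in milliseconds.
    static double AverageSleep1(int iterations) {
        LARGE_INTEGER freq, start, end;
        QueryPerformanceFrequency(&freq);
        QueryPerformanceCounter(&start);
        for (int i = 0; i < iterations; ++i)
            Sleep(1);
        QueryPerformanceCounter(&end);
        return (end.QuadPart - start.QuadPart) * 1000.0 /
               freq.QuadPart / iterations;
    }

    int main() {
        printf("Sleep(1) at default resolution: ~%.2f ms\n", AverageSleep1(20));
        timeBeginPeriod(1);   // raise the global tick to 1ms
        printf("Sleep(1) at 1ms resolution:     ~%.2f ms\n", AverageSleep1(20));
        timeEndPeriod(1);     // put it back
        return 0;
    }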


setTimeout(,0) is a special case. It means "make this function run after the current call stack clears". That requires neither a timer nor a spin loop.


If browsers did that they would break the Web. They must clamp to some low delay (4 ms, I believe) or else some pages will lock up.

See https://bugzilla.mozilla.org/show_bug.cgi?id=123273


Right, but that precedent was only set by incorrect implementations existing in the first place. Using setTimeout(,0) recursively in lieu of (a then unavailable) requestAnimationFrame or a non-zero timeout period is in my mind equivalent to while(true){}/infinite tail recursion.


And yes, in the HTML5 spec (years later), it was standardized to a 4ms clamp, also because of the incorrect implementations prior.



Interesting. I had clockres on my machine but never bothered to learn what it does. In code where I wanted a better timer, I ended up using QueryPerformanceCounter/QueryPerformanceFrequency and rolling my own timer class, though that can be a bigger pain than just using the system timer.
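
The rolled-your-own part is only a few lines; something like this minimal sketch (class name is mine):

    #include <windows.h>

    // Elapsed-time measurement over QueryPerformanceCounter. Its precision
    // does not depend on the global timer resolution, unlike GetTickCount().
    class QpcTimer {
    public:
        QpcTimer() {
            QueryPerformanceFrequency(&frequency_);
            Reset();
        }
        void Reset() { QueryPerformanceCounter(&start_); }
        double ElapsedSeconds() const {
            LARGE_INTEGER now;
            QueryPerformanceCounter(&now);
            return static_cast<double>(now.QuadPart - start_.QuadPart) /
                   static_cast<double>(frequency_.QuadPart);
        }
    private:
        LARGE_INTEGER frequency_;
        LARGE_INTEGER start_;
    };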

On my machine, I got similar settings and found Chrome to be the sole offender, which is probably the worst kind of offender in many ways. Firefox and IE were clean, so Google is the outlier. Given that I always have Chrome open somewhere, while SQL Server or devenv is not always running, I suppose that's suboptimal. I wonder if they will change it.


Macs have a similar issue. Unexpected programs activate the dedicated GPU. Skype and Twitter used to do this. Power users run special utilities to force the dedicated GPU off, but the normal user has no idea why his Mac battery won't last.

On my PC the programs raising the resolution are Gtalk, Chrome and Skype. I run Visual Studio and SQL Server but they don't show up in powercfg.

quartz.dll is from DirectShow. A multimedia component is expected to require higher resolution. The fault is in the program calling into DirectShow.


Only if you have a discrete GPU, of course. The power management story on the 15" MBP is a bit of a shitshow; I expect that the Haswell updates will go integrated-only.


"Another common culprit on my machine is sqlservr.exe. I think this was installed by Visual Studio but I’m not sure. I’m not sure if it is being used or not."

Is this attitude still prevalent in the windows community? I thought things had improved on that front.

It's worth pointing out that "the highest frequency wins" is not an example of "tragedy of the commons."


"the windows community". Huh. Do you really think such a thing exists?

(I'm just kidding; of course it does. We meet monthly and talk about ways to snuff out free software, as is our way).


Um, what? Of course there's a windows community, in every sense of the word. There are magazines, conferences, and forums for windows programming and windows programmers. There are trends, fads, and innovations.

.. and there are practices that are commonly found in windows programming that are beyond the pale in other environments (loading your app into memory every time the machine starts, installing malware during setup, etc.).


I realize the phrase was a little awkward. As I was writing the comment I struggled with finding the appropriate phrase that would not sound diminutive. I thought awkward was preferable to diminutive.


"Windows ecosystem" is perhaps a better term.


It's fairly common among a lot of developers, sadly. Linux/OSS folks are less prone to it, but I've seen kitchen sink setups there as well.

An old colleague used to joke: "Developers should never have fast machines". Point being, they'll appreciate every spare CPU cycle and byte of memory available.


As the author says, a megawatt is a measure of power: energy wasted per second.

If we take his claim that 10 MW is being wasted, then the energy wasted at 10 MJ/s over a year is the energy of 5 small atomic bombs.

http://www.wolframalpha.com/input/?i=10MW+times+a+year
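
Checking the arithmetic, assuming "small" means roughly Hiroshima-sized (~15 kt of TNT, with 1 kt ≈ 4.184×10^12 J):

    10 MW × 1 year ≈ 10^7 J/s × 3.15×10^7 s ≈ 3.2×10^14 J
    one 15 kt bomb ≈ 15 × 4.184×10^12 J ≈ 6.3×10^13 J
    3.2×10^14 J ÷ 6.3×10^13 J ≈ 5 bombs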


I don't think "5 small atomic bombs per year" is a particularly relatable example. It's about the average electricity consumption of 7500 US homes - that seems more concrete to me. If you can save the equivalent of switching off 7500 homes by fixing a bug in your software, that's a pretty big impact for one person to make.


Considering the amount of energy we use, it's barely a drop in the ocean.



I'm not sure whether you're agreeing with me or not, and in a way that's why people shouldn't just use large numbers and then act like they've said something meaningful.

Ratios matter; large numbers without a relevant basis for comparison are just misleading. There are roughly 116 million households in the US; saving the energy of 7,500 of them is not a big change.

You're solving roughly 1/15,466th of the problem. And that's assuming that all the savings could even be applied to the US, which they most certainly couldn't.

That's not a big impact. Chances are that no one, even if they were looking, could notice the figurative needle move at the power stations serving the aggregated demand for a change that small.


That's true, but it's also a lot larger than the impact any one individual's actions usually have.


Lots more energy is lost during power conversion than this.

Scary numbers though even if they are ballpark figures.


The author provides source code in a non-monospace (!) slanted (!!!) font. I would have wasted a damn gigawatt to unsee that.


    if (comment.onTopic) {
      comment.post()
    } else {
      comment.delete()
    }





