
I'm not so sure. Going by this random Google hit (on 10 Gbps Ethernet NICs), the interface itself doesn't seem to make much difference: http://www.cl.cam.ac.uk/~acr31/pubs/sohan-10gbpower.pdf

I couldn't find anything on HDMI/4K, but this seems to suggest it doesn't make much difference (compared to what I assume the backlight and panel draw): http://en.wikipedia.org/wiki/Thunderbolt_(interface)

At any rate, even if we assume it takes four times as much power, and given that phones do just fine with 1080p+ displays, I doubt the signal is the problem. Maybe it's the RAM used for the video frames?
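
To put a rough number on the RAM angle, here's a back-of-envelope sketch in Python, assuming 60 Hz refresh and 4 bytes per pixel (both assumptions, as are the exact panel geometries), just comparing how much framebuffer traffic each resolution implies:

    # Rough framebuffer scanout bandwidth at an assumed 60 Hz refresh
    # with 4 bytes per pixel (32-bit RGBA).
    BYTES_PER_PIXEL = 4
    REFRESH_HZ = 60

    resolutions = {
        "1080p": (1920, 1080),
        "QHD+":  (3200, 1800),
        "4K":    (3840, 2160),
    }

    for name, (w, h) in resolutions.items():
        pixels = w * h
        mb_per_s = pixels * BYTES_PER_PIXEL * REFRESH_HZ / 1e6
        print(f"{name}: {pixels / 1e6:.1f} Mpix, ~{mb_per_s:.0f} MB/s")

By this crude measure, 4K scanout moves about four times the data of 1080p, which at least lines up with the "four times as much power" guess, even if it says nothing about where that power actually goes.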

I always assumed it was the screen itself that consumed the extra power on HiDPI displays (flipping more pixels), but maybe it's something else.




I was always under the impression that the backlight accounts for a lot of the power a display consumes, which is why LED-backlit displays use so much less power than CCFL ones. That would remain constant with resolution, right? If that's the case, I'd think that the actual GPU work (or the RAM, like you say) is where the power difference came from for me.

I could be wrong about how significant the backlight is on an LED display though.
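
Here's a toy model of that split, with purely hypothetical wattages (the 3 W backlight and 0.5 W per megapixel are illustrative placeholders, not measurements), just to show the shape of the argument:

    # Toy model: backlight draw is independent of resolution, while the
    # GPU/RAM "pixel work" term scales with pixel count. All wattages
    # here are made-up placeholders, not measured values.
    BACKLIGHT_W = 3.0            # assumed constant regardless of resolution
    PIXEL_WORK_W_PER_MPIX = 0.5  # assumed cost per megapixel pushed

    def display_power_watts(width, height):
        mpix = width * height / 1e6
        return BACKLIGHT_W + PIXEL_WORK_W_PER_MPIX * mpix

    for name, (w, h) in [("1080p", (1920, 1080)), ("QHD+", (3200, 1800))]:
        print(f"{name}: ~{display_power_watts(w, h):.1f} W")

Even with numbers this crude the backlight dominates, but the resolution-dependent term nearly triples between 1080p and QHD+, so the GPU/RAM explanation isn't implausible.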

I also thought that modern video cards (as in, anything from the last 15-20 years) optimize 2D graphics so that regular desktop use doesn't require pushing a full screen's worth of pixels to the GPU on every frame. That would seem to diminish the significance of the actual pixel count at QHD+.
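
If the 2D path really does repaint only the damaged region, the savings are easy to quantify. A minimal sketch, assuming a QHD+ panel, 32-bit pixels, and a cursor-sized dirty rectangle (all illustrative figures):

    # Bytes pushed for a full-screen repaint vs. a damage-region
    # (dirty rectangle) update. Panel and rectangle sizes are
    # illustrative assumptions.
    WIDTH, HEIGHT = 3200, 1800   # assumed QHD+ panel
    BYTES_PER_PIXEL = 4          # 32-bit RGBA

    full_repaint = WIDTH * HEIGHT * BYTES_PER_PIXEL

    dirty_w, dirty_h = 16, 32    # e.g. a blinking text cursor
    dirty_repaint = dirty_w * dirty_h * BYTES_PER_PIXEL

    print(f"full repaint: {full_repaint / 1e6:.1f} MB")
    print(f"dirty rect  : {dirty_repaint / 1e3:.1f} KB")
    print(f"savings     : {full_repaint / dirty_repaint:.0f}x")

Of course, dragging a window or playing a video dirties most of the screen, so this mainly helps the idle-desktop case.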

I don't have a good answer.



