The 15 hours of battery was only for the FHD (1920x1080) display. I scanned The Verge review, but it wasn't clear whether they tested battery life on the FHD or the QHD+ display.
Toss up some Terminus and vim, and the FHD model sounds like a great all-day coding machine.
On my Samsung ATIV (QHD+), I found that running the display at half the native resolution dramatically increased battery life and didn't seem to look any worse than a normal panel with a ~72 DPI native resolution.
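(Quick Python back-of-envelope on pixel density; I'm assuming a 13.3" diagonal for the ATIV Book 9 Plus, so adjust for whatever panel you have. The point is that even at half the native resolution you're still well above 72 DPI.)

    import math

    def ppi(width_px, height_px, diagonal_in):
        """Pixels per inch of a panel, given its resolution and diagonal size."""
        return math.hypot(width_px, height_px) / diagonal_in

    # Assuming a 13.3" panel:
    print(f"native 3200x1800: {ppi(3200, 1800, 13.3):.0f} PPI")  # ~276 PPI
    print(f"half   1600x900 : {ppi(1600,  900, 13.3):.0f} PPI")  # ~138 PPI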
Well... 1920x1080 @ 3 bytes per pixel @ 60Hz = ~356MB/s, while 3200x1800 @ 3 bytes per pixel @ 60Hz = ~989MB/s. In QHD+ mode your laptop is sending nearly a gigabyte of data to the display...per second. It might not be very power hungry to compute those pixels with a modern GPU, but even simply sending that much data requires a lot more power than sending "only" ~356MB/s.
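Here's the same arithmetic as a quick Python sketch, if anyone wants to check it (assuming 24-bit color and a 60Hz refresh, and ignoring blanking intervals and any link-level encoding overhead):

    def link_bandwidth(width, height, bytes_per_pixel=3, refresh_hz=60):
        """Raw bytes per second needed to push every pixel of every frame."""
        return width * height * bytes_per_pixel * refresh_hz

    fhd = link_bandwidth(1920, 1080)
    qhd = link_bandwidth(3200, 1800)
    print(f"FHD : {fhd / 2**20:5.0f} MiB/s")   # ~356 MiB/s
    print(f"QHD+: {qhd / 2**20:5.0f} MiB/s")   # ~989 MiB/s
    print(f"ratio: {qhd / fhd:.2f}x")          # ~2.78x more data for QHD+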
At any rate, even if we assume it takes 4 times as much power, and we know that phones do just fine with 1080p+ displays -- I doubt the signal is the problem. Maybe it's the RAM used for the video frames?
I always assumed it was the screen that consumed the power with HiDPI displays (flipping more pixels) -- but maybe it's something else.
I was always under the impression that the backlight is a lot of the power that displays consume, which is why LED-backlit displays use so much less power than CCFL ones. That would remain constant with resolution, right? If that's the case, I'd think that the actual GPU work (or the RAM, like you say) is where the power difference was for me.
I could be wrong about how significant the backlight is on an LED display though.
I also thought that modern video cards (as in from the last 15-20 years) optimized 2D graphics so that regular desktop use didn't require pushing a screen full of pixels to the GPU on every frame. That would seem to diminish the significance of the actual number of pixels at QHD+.
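Something like this toy sketch of damage tracking (not how any particular driver actually works, just to illustrate the idea): a compositor that only redraws the rectangles that changed touches a few hundred bytes when you type a character, not the whole QHD+ framebuffer.

    def bytes_redrawn(dirty_rects, bytes_per_pixel=3):
        """Bytes touched when only the damaged rectangles get redrawn."""
        return sum(w * h * bytes_per_pixel for (w, h) in dirty_rects)

    full_frame = bytes_redrawn([(3200, 1800)])  # whole QHD+ framebuffer
    one_glyph  = bytes_redrawn([(9, 18)])       # one Terminus-sized cell (hypothetical 9x18 glyph)
    half_pane  = bytes_redrawn([(1600, 900)])   # redrawing a half-screen editor pane

    print(f"full frame : {full_frame / 2**20:.1f} MiB")  # ~16.5 MiB
    print(f"one glyph  : {one_glyph} bytes")             # 486 bytes
    print(f"half pane  : {half_pane / 2**20:.1f} MiB")   # ~4.1 MiB

That only saves drawing work, though; barring something like panel self-refresh, the link still scans out the full frame every refresh, so the bandwidth numbers above don't change.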