RAM energy efficiency is extremely important because it is a significant part of the baseline power consumption. Power-hungry components like the CPU and GPU can be kept in their lowest power state most of the time, so it really comes down to RAM and the display. If you have a 60 Wh battery and want 15 hours of battery life, you need to get your idle power consumption under 4 watts. So even 0.5 W of baseline RAM consumption makes a huge difference.
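A quick back-of-the-envelope sketch of that arithmetic, using the numbers from above (the 0.5 W RAM draw is the illustrative figure from the comment, not a measured value):

```python
# Back-of-the-envelope: idle power budget for a target battery life.
battery_wh = 60.0      # battery capacity (Wh)
target_hours = 15.0    # desired battery life (h)

budget_w = battery_wh / target_hours          # 4.0 W total idle budget
print(f"idle budget: {budget_w:.1f} W")

# Hypothetical extra 0.5 W of baseline RAM draw on top of that budget:
ram_w = 0.5
runtime_h = battery_wh / (budget_w + ram_w)   # ~13.3 h instead of 15 h
print(f"with +{ram_w} W RAM draw: {runtime_h:.1f} h "
      f"({target_hours - runtime_h:.1f} h lost)")
```

That half a watt alone costs you about 1.7 hours of idle runtime.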
The bandwidth difference between system RAM and VRAM is roughly an order of magnitude: DDR4-3200 peaks at 25.6 GB/s per channel, while modern high-end GPUs peak at 500-900 GB/s depending on the model.
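Both numbers fall out of the same formula, peak bandwidth = transfer rate × bus width. A minimal sketch (the GDDR6 configuration below is a hypothetical example, not any specific card):

```python
# Peak theoretical bandwidth = transfer rate (MT/s) * bus width (bytes).
def peak_bandwidth_gbs(transfers_mts: float, bus_width_bits: int) -> float:
    return transfers_mts * 1e6 * (bus_width_bits / 8) / 1e9

# DDR4-3200, one 64-bit channel: 3200 MT/s * 8 B = 25.6 GB/s
print(peak_bandwidth_gbs(3200, 64))     # 25.6

# Hypothetical GDDR6 card: 14000 MT/s effective on a 320-bit bus
print(peak_bandwidth_gbs(14000, 320))   # 560.0
```

The GPU gets its order-of-magnitude win from a much wider bus and much faster signaling, and both of those cost power.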
VRAM consumes substantial power to deliver that performance. On many GPUs, the VRAM chips are actively cooled. System RAM doesn't even need passive heatsinks; it uses too little electricity to matter.
GPUs do use fairly fine-grained dynamic frequency scaling on their memory chips for precisely this reason: having a top-end GPU run at around 15-20 W under light loads simply wouldn't be possible otherwise.
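You can watch this happen on NVIDIA cards. A minimal sketch polling the memory clock, graphics clock, and board power through nvidia-smi's query interface (field names from recent drivers; exact availability and output format vary by driver version):

```python
import subprocess

def gpu_clocks_and_power() -> str:
    """Query current memory/graphics clocks and power draw via nvidia-smi."""
    return subprocess.run(
        ["nvidia-smi",
         "--query-gpu=clocks.mem,clocks.gr,power.draw",
         "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    ).stdout.strip()

if __name__ == "__main__":
    # On an idle desktop the memory clock typically sits far below its
    # gaming value; start a 3D workload and poll again to see it jump.
    print(gpu_clocks_and_power())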
Yeah, but at the time my card only clocked down halfway on its own, to 1200 MHz IIRC, while I could force it down to 800 MHz in desktop mode. In gaming it was closer to 3 GHz. This was before my NVIDIA GTX 970, some AMD model.
Just really blew my mind how much heat the VRAM contributed.