Hacker News

Sorry, but I've benchmarked this recently as part of a project I'm working on, and JPEG decoding with libjpeg-turbo consumed far more CPU than software H.264 decoding.

The encoding side is very expensive with H.264, but as I understand it, much of that encoding work goes into choosing the right reference frames, which is exactly what buys the higher compression and the faster decode.
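For what it's worth, here is a minimal sketch of the kind of decode benchmark being described, measuring only the JPEG side. It assumes Pillow is installed (which wraps libjpeg-turbo on most builds); the frame size, quality setting, and iteration count are made up for illustration, and a fair comparison would also need a software H.264 decode loop (e.g. via PyAV or ffmpeg), which is not shown.

```python
# Rough sketch: measure CPU cost of repeated JPEG decodes.
# Assumes Pillow; frame size and iteration count are arbitrary.
import io
import time

from PIL import Image

# Encode one in-memory 1280x720 test frame to JPEG.
buf = io.BytesIO()
Image.new("RGB", (1280, 720), (30, 120, 200)).save(buf, format="JPEG", quality=85)
jpeg_bytes = buf.getvalue()

# Time repeated decodes using CPU time, not wall-clock time.
iterations = 50
start = time.process_time()
for _ in range(iterations):
    img = Image.open(io.BytesIO(jpeg_bytes))
    img.load()  # force the actual decode; open() alone is lazy
cpu_seconds = time.process_time() - start

print(f"{iterations} decodes: {cpu_seconds:.3f}s CPU, "
      f"{cpu_seconds / iterations * 1000:.2f} ms/frame")
```

The same harness, pointed at a software H.264 decoder over the same frames, is what would make the CPU-vs-CPU comparison above concrete.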




You are comparing CPUs to GPUs right now.


He said software H.264, so CPU to CPU?


It may be different when comparing GPU to GPU because of differences in parallelization.


Is that the new apples to oranges?



