
2002. I remember this. Matrox was one of the major GPU players at the time. https://en.wikipedia.org/wiki/Matrox_Parhelia



So it looks like it supported 10-bit frame buffers, but not necessarily 10-bit output. A quick search suggests it only supported DVI, which couldn't carry 10-bit output. In the context of monitors, that essentially means 10-bit wasn't supported at the time. Otherwise you could claim you have 192-bit color by software-rendering at 64 bits per channel but outputting at 8 bits.
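To illustrate the point about internal precision versus output precision: a minimal sketch (the function name and values are hypothetical, not from any Matrox API) showing how 10-bit-per-channel framebuffer values collapse when sent over an 8-bit link.

```python
# Sketch: rendering internally at 10 bits per channel but outputting
# over an 8-bit link drops the two low-order bits, so the extra
# precision never reaches the monitor.

def truncate_to_8bit(value_10bit: int) -> int:
    # Drop the 2 least significant bits (1024 levels -> 256 levels).
    return value_10bit >> 2

# Two distinct 10-bit levels collapse to the same 8-bit output value.
print(truncate_to_8bit(512))  # -> 128
print(truncate_to_8bit(513))  # -> 128
```

The same collapse happens regardless of how wide the internal framebuffer is, which is why output bit depth, not framebuffer depth, is what matters for what the monitor actually displays.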




