
I still use a CRT, because the features I want are still ludicrously expensive in newer tech (although they are getting cheaper over time).

1. Arbitrary resolutions, great to run old software and games, even better to run new games at lower resolution to increase performance.

2. Arbitrary refresh rates.

3. Zero (literally) response time.

4. Awesome color range (many modern screens are still 12-bit, meanwhile Silicon Graphics had a CRT screen in 1995 that was 48-bit)

5. No need to fiddle with contrast and brightness all the time when I switch between mostly light and mostly dark content. For example, I gave up on my last attempt to use a flat panel because I couldn't play Witcher 3 and SuperHot one after the other: whenever I adjusted settings to make one game playable, the other became just splotches (the optimal settings for Witcher 3 turned SuperHot into an almost flat white screen, completely impossible to play).

6. For me, reading raster fonts on a CRT gives much less eyestrain and is more beautiful than the many fonts that need subpixel antialiasing on flat panels.

7. Those things are crazy resilient, I still have some working screens from the 80286 era (granted, the colors are getting weird now with aging phosphors), while some of my newer flat panels failed within 2 years with no repair possible.




>many modern screens are still 12-bit

Unless you're using a really cheap screen, most LCDs are 8 bits per color. The cheap ones use 6 bits + 2 bits FRC. 12 bits (presumably 4 bits per color) seems insanely low because it would mean only 16 possible shades per primary color.

>meanwhile Silicon Graphics had a CRT screen in 1995 that was 48-bit

Source for this? 10 bit color support in graphics cards only became available around 2015. I find it hard to believe that there was 48 bits (presumably 16 bits per color) back in 1995, when 256 color monitors were common. Moreover, is there even a point in using 16 bits per color? We've only switched to 10 bit because of HDR.
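
For reference, here is the arithmetic behind those per-channel figures, as a rough sketch (assuming the total bit depth is split evenly across 3 RGB channels):

    # Rough arithmetic behind the bit-depth figures above, assuming 3 RGB channels.
    def shades_per_channel(total_bits, channels=3):
        bits_per_channel = total_bits // channels
        return bits_per_channel, 2 ** bits_per_channel

    for total in (12, 24, 36, 48):
        bpc, shades = shades_per_channel(total)
        print(f"{total}-bit total = {bpc} bits/channel = {shades} shades per primary")
    # 12-bit total = 4 bits/channel = 16 shades (hence "insanely low")
    # 24-bit total = 8 bits/channel = 256 shades (the common LCD case)
    # 48-bit total = 16 bits/channel = 65536 shades (the disputed SGI figure)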

>7. Those things are crazy resilient, I still have some working screens from the 80286 era (granted, the colors are getting weird now with aging phosphors), while some of my newer flat panels failed within 2 years with no repair possible.

This sounds like standard bathtub curve/survivor bias to me.


I am talking about SGI workstations. Indeed, the 1995 ones didn't support 48-bit (without modification); instead it was "only" 12 bits per channel, 3 channels, thus 36-bit.

Here is a photo of John Carmack using such a workstation: https://external-preview.redd.it/EnhEls7GJgm9UxR8FE9Dc3FfH4X...

Then in 1997 they launched the "Octane" workstation line, which could output 4 channels of 12 bits, thus reaching 48-bit.

https://hardware.majix.org/computers/sgi.octane/

One of the purposes of these machines was to make HDR images (among other things).

Sadly, for THAT I don't have time to track down sources now; I am busy with something else.

As for a monitor that could support this stuff, one is the Sony GDM90W11, which could do 1900x1200.


> 10 bit color support in graphics cards only became available around 2015.

That's off by a decade.

> 256 color monitors

Is that a thing that exists?

> We've only switched to 10 bit because of HDR.

You can get clear banding on an 8 bit output, and 10 bit displays are used at the high end. 10-bit HDR isn't immune to banding, since most of the increased coding space goes into expanding the range. There's a good reason for 12 bit HDR to exist.
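
A quick way to see the banding argument: quantize a smooth ramp at a few bit depths and look at how coarse the steps get (a minimal sketch, not tied to any particular panel or transfer function):

    # Quantize a smooth 0..1 ramp to N bits per channel and count surviving levels.
    def quantize(value, bits):
        levels = 2 ** bits - 1
        return round(value * levels) / levels

    ramp = [i / 10000 for i in range(10001)]  # an idealized smooth gradient
    for bits in (6, 8, 10, 12):
        distinct = len({quantize(v, bits) for v in ramp})
        print(f"{bits}-bit: {distinct} levels, step size {1 / (2 ** bits - 1):.5f}")
    # Fewer levels means wider visible bands on slow gradients; HDR spreads the
    # same levels over a much larger brightness range, so 10-bit can still band.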


>That's off by a decade.

Mainstream support, at least. 10-bit HDR support for AMD cards was introduced with Fiji (2015), and for Nvidia it was introduced with Maxwell (2014).


2002. I remember this. Matrox was one of the major GPU players at the time. https://en.wikipedia.org/wiki/Matrox_Parhelia


So it looks like it supported 10 bit frame buffers, but not necessarily 10 bit output. A quick search suggests it only supported DVI, which didn't support 10 bit output. In the context of talking about monitors, this essentially means that 10 bit wasn't supported at the time. Otherwise you could claim you have 192 bit color by software rendering at 64 bits per color, but outputting at 8 bits.


> 1. Arbitrary resolutions

Consult the dot pitch of your monitor for the actual resolution. Everything is "scaled" to that resolution. Of course, the algorithm is physics, which is much better than cubic interpolation, so it does look slightly better, but a modern HiDPI LCD will provide a much finer dot pitch and thus a sharper, more accurate picture.
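
As a rough illustration of the dot-pitch point (the sizes and pitches below are made-up example figures, not measurements of any specific monitor):

    # Approximate "native" dot count implied by a CRT's phosphor pitch.
    # Example numbers only; real tubes vary and the beam spot size also matters.
    def approx_dots(visible_width_mm, visible_height_mm, dot_pitch_mm):
        return (round(visible_width_mm / dot_pitch_mm),
                round(visible_height_mm / dot_pitch_mm))

    # A hypothetical 19" CRT, ~0.26 mm pitch, ~365 x 275 mm visible area:
    print(approx_dots(365, 275, 0.26))  # -> (1404, 1058), i.e. roughly 1400x1050
    # A 27" 4K LCD spreads 3840 pixels over ~597 mm of width:
    print(round(597 / 3840, 3))         # -> 0.155 mm pixel pitch, noticeably finer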


That is exactly my point: CRTs don't scale things. A monochrome CRT can even draw pure vector graphics, having "infinite" resolution.


Color CRTs do have something akin to a native resolution too, defined by the phosphor arrangement, so they do "scale" things. It just happens that the scaling is naturally blurry and artifacts aren't noticeable.


When do you ever see a monochrome CRT these days?

I had one mumble years ago as my second screen. It was really nice.


You raise some very interesting points. I've appreciated the physical lightness and ease of positioning of LCDs, plus the absence of flyback whine. And they can go to higher resolutions than the scan circuitry in a CRT can physically manage.

But all those other things, yes the color resolution, the smoother fonts, the response time. I might have to swing by the recycler and pick up a nice "new" Viewsonic. :)


3. A 60 Hz CRT refresh rate is 16.67 milliseconds of delay. Interestingly, I connected a VT220 to my 56" 4K Samsung TV via a BNC-to-RCA cable, and by comparing the cursor blinking on both screens there's a very noticeable and visible delay, like a 1/4 second.

4. CRTs are analog. It's as many bits as your output hardware can make up.

5. CRT is still supreme for contrast (at least over LCD) despite all the tricks and such for LCDs.

7. That VT220 I mentioned above is from 1986. It's monochrome, but works great.


> CRTs are analog. It's as many bits as your output hardware can make up

Actually the horizontal resolution of a CRT is limited by: the dot pitch of the colour phosphors, the bandwidth of the amplifiers, and the electron beam spread.

The vertical resolution is limited by a combination of: electron beam scan rate, delay for horizontal flyback/retrace, delay for vertical flyback, desired refresh rate, and electron beam spread.

There are more details for the resolution limitations, but I think I covered the main ones.
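
To get a feel for the amplifier-bandwidth limit, a crude pixel-clock estimate (the ~25%/~5% blanking overheads below are ballpark assumptions, not real CRT timings):

    # Crude pixel-clock estimate for a CRT mode, with assumed blanking overheads.
    def pixel_clock_mhz(h, v, refresh_hz, h_blank=0.25, v_blank=0.05):
        total_pixels = h * (1 + h_blank) * v * (1 + v_blank)
        return total_pixels * refresh_hz / 1e6

    print(round(pixel_clock_mhz(1600, 1200, 85)))   # ~214 MHz
    print(round(pixel_clock_mhz(2048, 1536, 75)))   # ~310 MHz
    # The video amplifiers and cable have to pass signals on the order of the
    # pixel clock, which is one reason very high CRT modes start to look soft.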


I kept a CRT for quite a while but when I switched, I realized I didn't miss it.

1- True, if there is one thing I'd miss, that's it. At low resolutions, CRTs sometimes get really nice refresh rates too (I've seen up to 180 Hz).

2- Modern gaming monitors have FreeSync/G-Sync, which not only gives you arbitrary refresh rates but is also adaptive.

3- Also true, but not as significant as one might think. The monitor itself is zero latency, but what's behind it isn't. We are not "racing the beam" like on an Atari 2600 anymore; the image is not displayed before the rendering of a full frame is complete. The fastest monitors are at around 144 Hz, that's about 7 ms per frame. So going from a 1 ms gaming monitor to a 0 ms CRT, you only go down from 8 ms to 7 ms, to which you need to add the whole pipeline to get "from motion to photon" (see the back-of-envelope sketch after this list). In VR, where latency is critical, the total is about 20 ms today. More typical PC games are at about 50 ms.

4- CRTs are usually analog. They don't use "bits"; it is all your video card's job. Also, 48 bits is per pixel and 12 bits is per channel, an apples-to-oranges comparison. CRTs do have a nice contrast ratio though, which is good for color range. Something worth noting is that cheap LCDs are usually just 6-bit with dithering. True 8-bit is actually good, and I'm not sure you can actually tell a difference past 12 bits.

5- Never noticed that, maybe some driver problem. An interesting thing is that CRTs have a natural gamma curve that matches the sRGB standard (because sRGB was designed for CRTs). LCDs work differently and correction is required to match this standard, and if done wrong, you may have that kind of problem.

6- I hate text on CRTs. And unless you have an excellent monitor (and cable!), you have tradeoffs to make between sharpness, resolution, and refresh rate. And refresh rate is not just for games: below 60 Hz you get very annoying flickering; I wouldn't go below 75 Hz. And at max resolution it can start getting blurry: the electron beam is moving very fast and the analog circuitry may have trouble with sharp transitions, resulting in "ringing" artifacts and overall blurriness. On old games that blurriness becomes a feature though, giving you some sort of free antialiasing.

7- Some CRTs are crazy resilient, others not so much. Same thing for LCDs. And as you said, phosphors wear out; that's actually the reason why I let go of my last CRT (after about 10 years). My current LCD is 8 years old and still working great, in fact better than my CRT at the same age (because it doesn't have worn phosphors).
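
To put numbers on point 3, here is that back-of-envelope motion-to-photon sum (the 144 Hz and 1 ms figures come from the comment; the 40 ms pipeline figure is only an illustrative assumption):

    # Back-of-envelope motion-to-photon latency, echoing the arithmetic in point 3.
    def frame_time_ms(refresh_hz):
        return 1000 / refresh_hz

    pipeline_ms = 40.0    # input + game logic + GPU + scanout; illustrative only
    frame_ms = frame_time_ms(144)   # ~6.9 ms per frame at 144 Hz
    lcd_panel_ms = 1.0    # a "1 ms" gaming LCD
    crt_panel_ms = 0.0    # CRT response time, effectively zero

    print(round(pipeline_ms + frame_ms + lcd_panel_ms, 1))  # ~47.9 ms on the LCD
    print(round(pipeline_ms + frame_ms + crt_panel_ms, 1))  # ~46.9 ms on the CRT
    # The panel contributes about 1 ms out of roughly 50 ms total, which is why
    # the CRT's zero response time barely moves the end-to-end number.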


Do they even still make CRTs, or are you using a really old one?


Sadly, the CRT manufacturers decided to kill it early, back when it was still obviously superior to LCD and plasma; Sony was still selling CRTs faster than flat panels when it shut down its factories.

Many people today think that no one makes CRTs because no one buys them, but it is the opposite: you couldn't buy CRTs anymore because the manufacturers intentionally killed them.

There was even research ongoing at the time to make a slim flat panel CRT, but they cancelled that too.

CRTs, being analog, don't support DRM, and this contributed a lot to their rapid death (among other reasons).

People still use CRTs in some arcade machines, in the medical industry (for example to diagnose certain visual disorders, and some plasticity research I know of requires zero update lag, thus only possible with a CRT), and in industrial applications where flat panels are too fragile, but all of these are basically buying up existing CRTs, driving up the price.

There were some people trying to restart production, but... the companies that hold the patents refuse to sell them, the companies that know how to make them also refuse to sell the machinery, and the few independent attempts often failed due to regulations.

Also, in the USA someone invented a machine to recycle CRTs, and it ended up being shut down due to regulations too; so in the USA, because of regulation, instead of safely melting down the glass and lead, the law basically says to dump it all in landfills.


> medical industry (for example to diagnose certain visual disorders,

I think I can guess what this is about, but do tell.


Nice, I think I still have a Sony Trinitron somewhere.

I wonder if there is any market for a modern take on the CRT.


What CRT(s) are you using?


My personal workstation has a Samsung SyncMaster whose model number I don't have available right now. The maximum resolution is around what people call 2K, but at 60 Hz, which I don't like. My current resolution is 1600x1200 at 75 Hz.


Reminds me of a Dell CRT I used to use - 1600x1200@80Hz, and 21" IIRC. Every monitor I've had until fairly recently has felt like a compromise in some way compared to it, but taking a CRT to university was a non-starter.

HD-DVDs playing on my Xbox 360 looked glorious on it (the Xbox 360 supported 1600x1200 output, as did some games IIRC, and any widescreen content would just get letterboxed).



