TVs are all awful (mjg59.dreamwidth.org)
416 points by sciurus on Jan 3, 2012 | 160 comments



If you are a gamer, TVs are now also awful for playing games, due to the built-in lag between the TV receiving a signal and displaying it. On a CRT it was instantaneous, as game console data was sent basically straight to the electron gun -- nowadays there's a massive amount of processing before the signal is on screen.

This manifests primarily as a feeling of poor controls, or of the game not doing what you tried to do.

Good HDTVs will give you 100ms of lag or so; bad ones can be in excess of 400ms. With most games nowadays running at 30fps, that's between 3 and 13 game frames of lag.
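
To make the arithmetic concrete (a quick sketch, counting whole frames only):

    # One game frame at 30fps lasts 1000/30, i.e. about 33.3 ms
    frame_ms = 1000 / 30.0
    for lag_ms in (100, 400):
        print("%d ms of display lag is about %d frames" % (lag_ms, round(lag_ms / frame_ms)))
    # prints 3 frames for 100 ms and 12 frames for 400 ms
    # (call it 13 if you also count the frame currently being scanned out)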

The difference this makes is pretty absurd; as a game developer I have to fight with it constantly, and it is infuriating.


The best I've found so far for modern TVs is the most expensive Samsung model, the xxx8000 series - only two frames of lag (30ms). The slightly less expensive xxx6400 has three frames, or a bit over 40 ms. This was all measured in "Game mode".

It's funny because a very old Westinghouse 37" LCD TV had between zero and one frame's worth of lag.

Also of note, the Onkyo receiver I measured added two more frames of lag. So you should always connect the console directly.


Yes and yes, to both findings with both brands.


So 100 ms is about human eye-hand reaction time. Not that humans think in discrete chunks, but 100-400ms is like 1-4 extra responses, which makes gameplay quite different.

Also, I know people will say "Game Mode!" but obviously, as a dev you can't just assume your end users will do that.


Is that what "game mode" is? I had always thought it was just changing the preset of brightness/contrast/etc. but is it also trying to reduce latency?

(I'm not a gamer)


Usually it turns off all of the filters/post-processing steps and tries to mimic the response times of CRT displays more closely by sending the framebuffer to the screen as soon as possible.

This works to varying degrees of success on a per-model basis. Just check out forums discussing which television to buy when you want to play music/rhythm games (like Guitar Hero) to read a lot more details about specific displays.


Guitar Hero and most other rhythm games have a configuration option to put in a delay so that the game and sound match up with your screen again. For shooters that wouldn't help, of course, so there you need a fast screen.
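
The mechanics of those calibration options are simple enough; here's a sketch of the idea (the names and numbers are made up, not taken from any actual game):

    # Hypothetical hit-judging with a user-configured display latency offset.
    DISPLAY_LAG_MS = 100   # what the player dialed in during calibration
    HIT_WINDOW_MS = 80     # how far off a press can be and still count

    def is_hit(note_time_ms, press_time_ms):
        # The player reacts to what they see, which arrives DISPLAY_LAG_MS late,
        # so the judging window is shifted by the same amount.
        return abs(press_time_ms - (note_time_ms + DISPLAY_LAG_MS)) <= HIT_WINDOW_MS

    print(is_hit(5000, 5120))   # True: 120 ms "late" is only 20 ms off once lag is accounted for
    print(is_hit(5000, 5000))   # False: a press on the nominal note time is actually early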


It seems like this would also affect audio, depending on how you have it set up: if it runs through the TV, the TV could delay the audio by 100-400ms to compensate, but if it goes straight from the computer to the stereo there would be a noticeable lag in the video. Is there something obvious I'm missing why this isn't a problem?


Many home theater systems have an adjustable audio delay for exactly that reason.


It was a problem with my TV (a Sony Bravia LCD display) so I had to run the sound through the TV and into the speakers instead of directly into the speakers.


I think the only people that tend to notice the difference between 100ms and 400ms are the more... ardent players... and those guys better damn well know what game mode is.


I disagree. 400ms is noticeable lag to anyone, even compared against 100ms. Run this in bash, and imagine this lag in a game you're playing, as the time from when you hit a button on a controller to when you see the response on screen:

    $ sleep .1
    $ sleep .4


When I last talked to my mom, she was complaining to me that she's been getting worse scores on her WiiFit dance routine since upgrading to an HDTV from a CRT. I assume it must be HDTV lag since ... what else could it be?


Some rhythm games have a built-in calibration tool you can use - it'll offset the video/audio by a certain amount to compensate for HDTV lag. I know Guitar Hero/Rock Band have one, maybe WiiFit does too?


Why did you just step into such an egregious "your mom" setup? Was that your intent?


Another form of awfulness I've often seen: Sixteen bazillion settings in the menus, and nothing in the manual to indicate what combination of settings will make the TV stop post-processing the image.

One common offender is "Sharpness" settings. On many TVs, setting this to anything besides zero will make the TV apply a sharpening filter to the image, generating annoying halos around edges.

Other common offenders are the "color" (especially "tone" and "contrast") settings. These were meaningful for CRT TVs, or when decoding broadcast video, but applying them to digital input is simply ridiculous -- especially when there's no obvious neutral setting.


It doesn't help that TVs frequently ship with all of those settings optimized for standing out in a showroom next to a dozen other TVs trying to do the same, rather than actually looking good in a living room.


Another fun thing with processing that most people don't realize: almost all TVs will internally convert the input signal to YCbCr 4:2:2 for all this processing, even though data is practically always transmitted as 4:4:4 over HDMI and has to be converted back to RGB 4:4:4 to be displayed on the panel.

So if you have red/blue text (or anything with red/blue detail) looking awful, as if it were lower resolution than it is, you have the TV to thank for that.

EDIT: to clarify, this doesn't apply if the text is a part of a movie (then you blame the codec for that, not the TV)
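
If you want to see roughly what that 4:2:2 step does to fine red/blue detail, here's a toy numpy simulation (a crude model, not what any particular TV actually implements):

    import numpy as np

    # Toy model: full-res luma, half-horizontal-res chroma (4:2:2), then back to RGB.
    def rgb_to_ycbcr(rgb):
        r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
        y  =  0.299 * r + 0.587 * g + 0.114 * b
        cb = -0.169 * r - 0.331 * g + 0.500 * b
        cr =  0.500 * r - 0.419 * g - 0.081 * b
        return y, cb, cr

    def simulate_422(rgb):
        y, cb, cr = rgb_to_ycbcr(rgb.astype(float))
        # average each horizontal pair of chroma samples, then duplicate them back out
        cb = np.repeat((cb[:, 0::2] + cb[:, 1::2]) / 2.0, 2, axis=1)
        cr = np.repeat((cr[:, 0::2] + cr[:, 1::2]) / 2.0, 2, axis=1)
        r = y + 1.402 * cr
        g = y - 0.344 * cb - 0.714 * cr
        b = y + 1.772 * cb
        return np.clip(np.stack([r, g, b], axis=-1), 0, 255).round()

    # One-pixel-wide red stripes on black: the luma survives, the red/black edges smear.
    stripes = np.zeros((4, 8, 3))
    stripes[:, 0::2, 0] = 255
    print(simulate_422(stripes)[0])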


None of the broadcasts are 4:4:4, nor are the DVDs.

Hell, some of those HD broadcasts are 4:2:0 in order to reduce bandwidth.


Some? Practically every encoded video intended for end-users is 4:2:0 (and certainly everything in HDTV, Blu-ray, and DVD.)

I meant mainly for consoles, computers, and subtitles (though for subtitles it also depends on your DVD player) so I amended my post.


> and certainly everything in HDTV, Blu-ray, and DVD.

The HD broadcasts are 4:2:2, more often than not. There are a few stations that have their video servers down at 4:2:0, but many of those are scheduled to upgrade over the next 2 years.

Edit: The servers are 422, but the broadcasts are 420, as brigade points out. (The edit tag on my original comment has already expired. Grrr.)


ATSC only allows 4:2:0 in MPEG-2 (ATSC A/53 part 4 section 6.1.3) or H.264 high profile (which is restricted to 4:2:0).

I'm admittedly not familiar with other countries, but I would be incredibly surprised if any of them allowed 4:2:2.


Thank you, you are correct.

I'd assumed that the video settings of the server were what they were delivering, but there must be a down-sample on the way to the transmitter. (CableLabs is the other large distribution format that we encounter, which yields the same story.)


I used to write SW for TVs... Any HDTV worth its salt should have a "game mode"; if it doesn't, take it back and buy a nicer one.


Yeah but Game Mode still doesn't provide 0ms of image lag.


Of course not. Even the best LCD monitors are at 2ms lag. However, 'Game Mode' does get the lag low enough that even the most hardcore console gamer won't notice it. A PC gamer might notice when approaching SC/SC2 APM. I think the takeaway should be that a TV with game mode will be great for console games and PC gamers should steer clear of TVs as monitors.


That's not the same lag. I'm not talking about the time it takes for the TV to completely change pixels -- that lag happens after the TV has decided to put the image on screen, and is largely not a big deal -- as long as the image has started changing, the player can understand the new scene.

I'm talking about the time it takes for the TV to tell the pixels to change.


Right. Just the fact that it's digital means that it has to buffer at least a frame before it can be displayed.


I don't think a monitor needs to buffer the full image before you display it just because it's digital data. It really just depends on how the internal bandwidth is set up on the TV vs how the data is sent to it. (AKA if each pixel is updated in order or if each line of the TV is updated in parallel, etc.) Worst case, you might update the image left to right but receive the signal top to bottom.


This is especially bad for rhythm games like DDR, where I learned about the problem the hard way. Even the "game" mode still introduces way too much lag. So I had to switch to the old CRT when playing my PS2 DDR games and thus had to keep it around.


Uh, four tenths of a second to change the camera angle would make every 3d game completely and totally unplayable. Even one tenth would have everyone up in arms about how terrible every game is. Maybe you meant 10 and 40ms.


> Uh, four tenths of a second to change the camera angle would make every 3d game completely and totally unplayable. Even one tenth would have everyone up in arms about how terrible every game is. Maybe you meant 10 and 40ms.

No, as disturbing as it sounds, real TVs really do have hundreds of milliseconds of lag. Games that care deeply about precision timing, notably rhythm games like Guitar Hero, actually have HDTV lag settings to change the offsets between audio, video, and input, so that you can actually strum based on what you see and hear and not miss every note.


Citation? sites.google.com/site/hdtvlag/results shows some tests that show it's usually under 100ms for a wide range of TVs. That's bad but more believable. 400ms of latency is not believable. Such a TV would be unsuitable for real time games. Full stop.


Most people don't buy HDTVs to play real-time games; they buy them to watch TV. As long as the TV delays the audio to match the video people won't notice when watching TV; DVRs commonly introduce similar amounts of lag.

Looking at that site, they tested a relatively small number of TVs (only 14), and it sounds like they enabled "game mode" on any of the TVs that had it.

I'll certainly acknowledge that 400ms represents the most extreme case, but 100-200ms or more happens pretty regularly, especially when not using "game mode" or equivalent.


Most people don't play games, but I do. If even 200ms was common, it would be widely known among gamers that certain TVs are unsuitable for games. If you can't provide a citation, can you at least explain why you hold this belief? I'm obviously still incredulous.


Check out avsforums. Or check out high-end scalers and the research papers with them. DVDO even makes cheap scalers (I run a VPS 50 from a few years ago on one projector). You'll find more details than you can shake a stick at about the frame lags of various sets. AVSForum is a great place to confirm JoshTriplet's comments.

30 fps is just 33ms per frame. At least one frame of delay is a necessary baseline to create, transfer, and render the image. After that, you're adding delays. 4 frames of processing is 133 ms, 6 frames is 200 ms. This is in addition to the console frame rendering time, HDMI transfer from console to the TV's frame buffer, and the TV's display, assuming a direct connection.

Now, the more interesting thing about all this for me isn't that most TVs and most new home video receivers lag, it is the lag in your own perception and how your mind compensates. As an avid gamer, you learn to compensate for the equipment's lag, and certain studies show a kind of precognition with which top gamers seem to anticipate moves before they should occur. (The reaction happens after the event in the CPU, but before your brain and muscles should have been able to respond to it given normal lag time to perceive and react.) Your own perception of whether you can or can't play a game due to lag, especially if you are a gamer, is a bad source of data on this one.[1]

Fact is, home receivers have frame buffers, and home TVs have frame buffers. Most are two or three frames; many are as bad as four. This lets moving video, especially blocky video or video with mosquito noise, be cleaned up much more. These receivers and sets look the best, which is why this lag is still increasing. To an extent, the more frames buffered, the more temporal data there is to use to clean the frames.

Finally, as a gamer, forget FPSes; they're easy to anticipate. You lead your shots naturally and subconsciously. Instead, try something like Geometry Wars Evolved. With the frame lag, you will occasionally be dead in the console before you've seen it on your screen. All my high scores were set over component video to a Sony Trinitron. I don't come close on a modern set through a receiver. This is well known to the gamers in the GW:E forums.

1. Just one example of improved "anticipation" (as opposed to "reaction"), as I don't have time to look up the various citations... "A different study by Kuhlman and Beitel (1991) measured the anticipation of seven through nine year old children who were categorized as non-experienced, moderately experienced, or highly experienced video game players. The researchers found that children with extensive experience playing video games can more accurately and consistently anticipate the arrival of a stimulus." – http://clearinghouse.missouriwestern.edu/manuscripts/847.php


I know a huge number of people who connect their Xbox 360s and PlayStation 3s to their HDTV. Do you think people buy different TVs for their consoles?


That doesn't make them the primary target market for HDTVs, and it doesn't mean that some HDTV manufacturers won't do something stupid that affects gamers while still serving the needs of their primary target audience. In particular, the filters that cause video lag also implement more feature bullet points for their primary audience.


Most people who think they are getting a deal dropping $300 on an HDTV probably aren't hardcore gamers and aren't likely to notice.

There's a long laundry list of things that the vast majority of gamers won't notice directly that can affect gameplay. They have neither the vocabulary nor the technical expertise to identify or explain the things they experience.

I do agree with you though, 400ms is unsuitable for realtime games. But that doesn't stop people from ending up in that situation. However, one mitigating factor is that if you are buying those super cheap TVs, it is likely that you don't own a 360 or PS3 either.


It would also be unsuitable for movies! How can you watch a movie with the sound leading the video by 400ms!?

Then again I see a surprising amount of people watching stuff with the aspect ratio set wrong. And a surprising amount of broadcasters broadcasting things in the wrong aspect ratio, so maybe people would just be oblivious...


The TVs delay the sound to match the latency in the video processing. Very annoying if you have a computer attached with any sort of real time audio (microphone, guitar, etc).


That only helps if you're playing your audio through the TV. But no serious home theater setup is like that…


Any serious home theater setup allows you to delay the audio as well.


Most HDTVs have a setting which will skip a lot of processing and reduce lag. Alternatively, you can do research to find a TV without much lag.


Your best bet if you are worried by lag is a PC monitor. But I don't think I've ever seen an HDTV that was verified to be below 60-100ms of image lag.

Also, doing that research is nearly impossible and will in general result in a TV that's inferior for the majority of what you'd do with it.


I don't think I experience this problem. This may just be me, though; or, perhaps it's because everyone has these problems that we're on a level playing field.

Also, I disable the vast majority of these post-processing features the TV provides. I'm a purist and a photographer by hobby, and I can see all the 'fixing' that is done in in-TV post-processing, and it looks bad, less sharp, etc., to me. That may affect my experience... Or perhaps I just buy nice TVs (for the same reasons).


Disabling those features generally helps but does not remove the lag. It is pretty difficult to see it in action, as it's more of a subconscious issue.

A great litmus test is Assassin's Creed 2. I worked on that, and for all of development I had a CRT hooked to my dev kit. The game was always super responsive and I never missed a jump. And then, playing at home on my LCD TV... the difference was kind of insane. It suddenly felt kind of mushy, kind of like I was fighting the controls.

The reality is that we as developers do as much as possible to hide it, but it's still there if you know what to look for.


The way you're describing the feeling puts it into the 10-40ms range. In the 100-400ms range you'd immediately and consciously wonder WTF was wrong with the game.


You'd think that, but you are wrong. Games like Uncharted have a built-in 60ms of input lag due to multithreading of the game engine renderer, and that's before your TV, which, I guarantee you, likely adds another 60ms at least, unless you spent hours and hours researching and got the Uber Ultimate Gamer TV of the Month.

In fact, Assassin's Creed 2 has 100ms of built in lag due to renderer multithreading. That means it takes a minimum of 100ms for the game engine to render the first frame of a reaction to your input press.

It sounds crazy when it's all listed out, but I assure you it is not only real but something you deal with every time you play a game, even though you don't notice.
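
Adding it up, a rough budget looks something like this (every number here is illustrative, not a measurement of any specific game or TV):

    # Rough end-to-end budget, button press to photons (all numbers illustrative)
    budget_ms = {
        "input sampling / controller polling": 8,
        "game simulation frame (30fps)": 33,
        "pipelined render thread + GPU": 66,      # the "built-in" lag from multithreading
        "TV processing (typical, non-game-mode)": 60,
        "panel response": 8,
    }
    print(sum(budget_ms.values()), "ms")   # ~175 ms, and nobody consciously "sees" it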


TVs, too, should have open firmware. Not to make RMS happy, but to protect savvy consumers from monumentally idiotic or short-sighted decisions made during the design. You can't make it perfect; leave the door open so your customers can.

I expect Apple will solve this problem in their usual way: pick slightly less stupid settings and lock those in. In this case, the difference will be stunning and people will marvel at how those Apple TVs can look so good.


> Not to make RMS happy, but to protect savvy consumers from monumentally idiotic

Hrrm ... that was RMS's point, way back when, with some printer, if memory serves ...


I think that's exactly what your parent meant (or at least my interpretation of it). Some people might not (want to) believe it but there's actually a whole lot of very valid reasoning behind what RMS is advocating. We shouldn't desire open systems because RMS said so, we should do it because of what he said.


>I expect Apple will solve this problem in their usual way; pick slightly less stupid settings and lock those in.

As someone who's dealt with the inimitable joy of trying to get an MBP to work with an HDTV, Apple's "usual way" appears to be to code to the standards and to hell with anyone who breaks them (or the users stuck with non-compliant products). This appears to apply to wifi as well. There's been an open bug in OSX for years involving OSX assigning its own DHCP lease when it fails to negotiate one with the router. This results in the dreaded "Self-assigned IP" message, which is nigh-on impossible to rid yourself of short of voodoo dolls and wifi dances.


> This appears to apply to wifi as well. There's been an open bug in OSX for years involving OSX assigning its own DHCP lease when it fails to negotiate one with the router. This results in the dreaded "Self-assigned IP" message, which is nigh-on impossible to rid yourself of short of voodoo dolls and wifi dances.

Getting a bit off-topic here, but that's not a bug: it's assigning itself a valid zeroconf* address because the DHCP server is not responding.

* http://en.wikipedia.org/wiki/Zero_configuration_networking#A...


"code to the standards and to hell with anyone who breaks them"

I think bug in this case just means that the software does not work as intended even though it follows the spec.


You can simply ask Mac OS X to request another DHCP address when it comes up with a self-assigned IP. The self-assigned IP address is actually in a range specified in RFC 3927 [1]. This is mainly done so that if it joins a network without a DHCP server it can still communicate with other hosts on the network, just not the outside.

Also, in my experience Apple's DHCP agent will re-request an IP address after having assigned itself a self-assigned IP address. This generally takes about a minute or so; in that time the DHCP server can reply once again. I've never had issues with this at all.

[1] http://www.ietf.org/rfc/rfc3927.txt


I've had numerous issues on my university's network with the DHCP lease. They have a stupid VPN setup, so that may have helped fuck it up.


When it joins a what now? There are no wifi networks without DHCP servers. At least there are none in the consumer electronics market to which zeroconf is targeted.

Zeroconf is a historical mistake. It shouldn't be there any more; it does far more harm than good.


I think the use case was ad-hoc wi-fi networks.

But getting back to the original complaint, if not for zeroconf you'd have no IP address at all; I don't see how that's any better.


The problem is that OSX's behaviour (and not, notably, Windows') is to try once then fail as a "good enough" solution. It may indeed keep trying, but it's a problem that Windows never runs into in the first place.

I'll admit to having only a rudimentary knowledge of how wireless networks work, but it's better than 99% of users and I find OSX frustrating (in this regard). At the end of the day, the user doesn't care whether the router manufacturer isn't following the spec, or whether Apple's implementation is buggy. The simple fact is that it works on Windows but not on OSX, and that's a failure on Apple's part.


This behaviour is the same on Windows. If Windows doesn't receive a DHCP address it will assign one in the self-assigned range first. After about a minute or so it too will re-request an IP address from the DHCP server and, if it receives one, it will assign that to the interface.

What Windows does do wrong is that it then also drops the self-assigned IP address, which may already be in use for communication; this can cause issues with other hosts that are communicating with it over zeroconf.


Create a WiFi network on one Mac, join it with another Mac. No DHCP server is running, but thanks to the self-assigned IP fallback, the two Macs can still communicate with each other. This can be tremendously useful.


This depends of course on how you create the wireless network. If you create one by routing another network, such as by using Network Sharing, a DHCP server is set up, since the machine now acts as a router.

You are correct for Ad-Hoc networks.


This dates back to long before wifi was widely available--the original use case was ad-hoc wired networks, where people with backgrounds in old-school AppleTalk-based Mac networking were expecting any two Macs with a cable between them to be able to talk to each other with no supporting infrastructure.


I'm pretty sure that's what RMS's reasoning was as well. You make it sound like RMS just arbitrarily chose to desire openness.


RMS would disagree. Noonespecial is making an "open-source" argument: that open software is better than closed software, where 'better' is some combination of less buggy, better designed, more secure, etc. RMS specifically disagrees with this principle, right down to the semantics (preferring the term 'free software' to 'open source'). "Free software" is better because it promotes freedom, i.e. it cannot be used by the designer to oppress others.

In RMS's worldview, software freedom is an inherent good, not a derivative good. "Many eyes make all bugs shallow" is nice, but for him it's a side-benefit.

http://www.gnu.org/philosophy/free-software-for-freedom.html


Back in the standard definition days, I loved my HTPC and the tvtime software (http://tvtime.sf.net/). It was the next best thing to having an open firmware TV. It was also possible to get zero lag between the capture card and the video card by bypassing the CPU and having the capture card write directly to the video card's memory (using xawtv instead of tvtime).


Right. One of the rules is "be nice".

OK. This article is flawed and many of the comments are just as flawed. Having been involved in the design and manufacturing of LCD displays (down to writing all FPGA image processing code, scaling, deinterlacing, etc.) I think I can say that none of this is accurate if the intent is to apply it generally.

Caveat: If you buy a TV don't expect it to be a computer monitor. Most TV designs are just that: TV sets. They are made to do one thing reasonably well: Take a crappy satellite/cable/whatever signal and give you a reasonable image back.

EDID can be programmed with any resolution you want. Do you need 921 x 333 at 12 frames per second? No problem. There is no such thing as a resolution not being available in EDID. Standards are one thing, but the EDID mechanism isn't inherently limited by standards.

BTW, there are commercially available EDID modifier gadgets that allow you to modify the EDID readout from the monitor. So, the monitor says one thing and your computer (or whatever device) receives your programmed values.

If you need a TV that will play nice with a computer you need to find one that was explicitly designed to do so.

Most consumer TVs use one of a very few commercially available processor chips to do their image processing. With a few exceptions they all do the same kinds of things. And no, the signals are rarely converted to YCbCr 4:2:2 internally but for the absolute cheapest and crappiest of processors. All the good ones convert the input to a common internal integer RGB format. The nice ones might standardize at 12 bits per channel (36 bits total) internally. When I did custom FPGA video processing we went as far as 24 bits per channel in order to avoid truncation of calculated values until the very last moment. This can make a huge difference depending on the application.

In general terms "monitor mode", if you will, should be a mode that bypasses as much of the internal processing as possible. You can force this bypass by using and EDID modifier gadget programmed for the actual resolution and timings of the panel. In other words, open the back of the TV, get the panel model number, get the data-sheet and program the EDID modifier to output these values to your computer. The processor should push this straight to the panel and you get very little, if any, processing. Again, this does not work on all TVs. As I said before, they are designed to be TVs, not monitors.

That said, I've connected many computers to off-the-shelf, un-modified, consumer TVs via DVI and HDMI. I have yet to run into any real issues.


"EDID can be programmed with any resolution you want"

Indeed, as I said. I then went on to say that lots of 1366x768 devices don't provide a detailed timing block, so they're limited to what you can express in the standard timings - which limits you to horizontal resolutions that are multiples of 8, and so can't express 1366x768.
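
For the curious, that limitation comes from how a standard timing packs the horizontal resolution into a single byte; a quick sketch of the encoding:

    # EDID standard timing: horizontal active = (stored byte + 31) * 8,
    # so only multiples of 8 from 256 to 2288 are expressible.
    expressible = {(b + 31) * 8 for b in range(1, 256)}
    print(1360 in expressible)   # True  -- hence panels that advertise 1360x768
    print(1366 in expressible)   # False -- needs a detailed timing descriptor instead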

That's the only thing you appear to disagree with about the article, so "This article is flawed" doesn't seem entirely fair. I'll admit to not being a video expert - I write a lot of code that interfaces with firmware at all levels, including displays, and I work closely with people who are video experts, but I've never built a TV. I don't think I've misrepresented any facts or presented any gross inaccuracies, but where I have made mistakes I'd love to be corrected.


You and the article seem to be talking about two different things. The article is talking about how things are bad by default. You seem to be saying (e.g. with talk about the EDID gadget) that things can be made to work if enough effort is applied. That is scant comfort to someone who writes drivers to interface with televisions, as the author of the blog post seems to.

> I have yet to run into any real issues.

Possibly because of hacks like the one this guy pointed to in Linux?


> And no, the signals are rarely converted to YCbCr 4:2:2 internally but for the absolute cheapest and crappiest of processors.

Tell that to Sony, Samsung, LG, etc. who all do it by default (outside of game/PC modes) across most (all?) of their lineup nowadays.

Or are you implying that my HX909 uses the absolute cheapest and crappiest of processors?


> Caveat: If you buy a TV don't expect it to be a computer monitor. Most TV designs are just that: TV sets. They are made to do one thing reasonably well: Take a crappy satellite/cable/whatever signal and give you a reasonable image back.

...

> In general terms "monitor mode", if you will, should be a mode that bypasses as much of the internal processing as possible

...

> As I said before, they are designed to be TVs, not monitors.

I think this is 100% the problem. TVs these days essentially are big monitors, and they should be designed as such.

I believe that if a TV is taking content from a digital source (i.e. HDMI, DVI, DisplayPort, etc.), it should perform the bare minimum image processing possible. Things like scaling and gamma correction are necessary of course, but other operations like sharpening filters just destroy images and are unnecessary for digital inputs, and things like motion interpolation (e.g. MotionFlow) are ridiculous (seriously, the only thing it improves is text like rolling credits, but it introduces serious motion artefacts to actual video and does strange things to noise in images, making the image look like what I can only describe as 'slushy')...

This is just my opinion from being both an engineer but also having done a lot of film-making and photography work. It just appals me to see the default settings on these TVs degrading the images so much, and it takes a lot of searching through menus to set it to something sensible, which the vast majority of the population won't do...


Since you say you have been involved in the production of such things I have a question for you....

Many of the comments below discuss how people experience HDTV "lag"; I was wondering if you could shed any light on what causes this. Initially I was thinking that, if it's a commodity processor, then of course there is the potential for lag, but then you mentioned the FPGAs.

Now I am no fool; I know that it's going to take time to process things even with an FPGA. But you got me thinking: in the HDTV application surely an FPGA will be coupled with a commercial DSP chip; heck, for high volume I am sure the manufacturers will go the whole hog and get the FPGA netlist converted out into an ASIC or even a full-blown foundry chip.

Wouldn't the FPGA coupled with a DSP kill most of the lag outright?

So where, in your experience, is the lag?


Rephrasing beambot, some of the algorithms employed use knowledge of the next frame (or two or five) to filter the current frame. So for those it doesn't matter whether you use a FPGA, DSP, ASIC, commodity processor, whatever to do the filtering - you have to wait until you have the needed future frames to even start.


Ah, check, that makes perfect sense. So until I invent the Turing oracle we are stuck :)


Many DSP algorithms introduce delay (e.g. think about simple FIR digital filters, like a running averager). They are literally using a "tapped delay line" to perform the computation. There is a tradeoff between the amount of delay you're willing to accept and the "sharpness" of your filters' frequency response. I'm not an expert in TV systems, but this gives the general gist.
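
The simplest possible illustration (nothing TV-specific, just a toy running average):

    # A 5-tap running average can't emit output for sample n until it has seen
    # samples n..n+4; equivalently, a symmetric N-tap FIR delays by (N-1)/2 samples.
    def running_average(samples, taps=5):
        return [sum(samples[i:i + taps]) / float(taps)
                for i in range(len(samples) - taps + 1)]

    print(running_average([0, 0, 0, 10, 10, 10, 10, 10]))   # [4.0, 6.0, 8.0, 10.0]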


There are multiple sources of processing delay in a TV image processor. The two main sources are the scaling and de-interlacing blocks. Scaling is pretty benign. A cheap scaler will introduce somewhere on the order of four lines of delay. A more advanced scaler might do 16 or 32 lines. These are lines of video. If we are talking about 1080, then 16 lines is 16/1080th of a frame. In other words, not significant for TV applications.

The de-interlacing block is where most of the delay comes from. Interlaced video (standard definition, 1080i, etc.) only transmits half the lines per field. This means that the processor has to synthesize the missing pixels in order to have full frames to display. A really cheap de-interlacer can be done with just a few lines of delay. You wouldn't want this as it would introduce bad imaging artifacts. Pretty much all consumer TVs use a method called "motion adaptive de-interlacing" (MADI). There are many implementations of MADI. In general terms you use data from frames before and after the one you want to process in order to detect pixels that have moved. Those that did not move can simply be replicated or averaged into the missing slots. Anything that moved requires different treatment. The key here is that you need to store a couple of frames worth of video before you can start with MADI. That's your two frames of delay (or more).

There might be other delays introduced by other subsystems such as the receiver.
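
A toy version of the motion-adaptive idea, just to show why a future field is needed (a sketch, far simpler than any real implementation):

    # Fill in one missing interlaced pixel: weave (reuse the other field) where the
    # scene is static, bob (interpolate vertically) where it moved. Detecting motion
    # needs the *next* field as well as the previous one, hence the buffered frames.
    def deinterlace_pixel(prev_field, next_field, above, below, threshold=10):
        if abs(next_field - prev_field) < threshold:
            return (prev_field + next_field) / 2.0   # static: weave
        return (above + below) / 2.0                 # motion: bob

    print(deinterlace_pixel(100, 102, 90, 110))   # 101.0 (static)
    print(deinterlace_pixel(100, 200, 90, 110))   # 100.0 (moving)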


> If you need a TV that will play nice with a computer you need to find one that was explicitly designed to do so.

Any suggestions?


Here's a non-facetious, completely honest question from someone who just doesn't know why: why is it 2012 and my new TV and monitor each have about the same horizontal resolution as my CRT monitor from 1998? It's 14 years later, and I still only have about 2000 pixels to play with. I know the obvious answer is that everyone is just matching the resolution movies are sold at, but why can't I get a professional grade monitor with a "retina" quality display for my desk?


You can, phone any medical imaging shop. Keep in mind you'll pay professional grade prices for such a device.

Also, http://en.wikipedia.org/wiki/IBM_T220/T221_LCD_monitors

201 DPI for the T220/221.

If you think about it, they're pretty cheap too. Remember paying 2K for a monitor back in the 80s? It's basically the same price once you factor in inflation.


The T221 is a great monitor for coding -- check out the size of the "fixed" font on the wikipedia image. But with most configurations it only does 12, 24 or 48 Hz depending on how many DVI channels you can use. Also, since it is being driven as four separate 1920x1200 displays with xinerama, there tends to be tearing across the boundaries with movies or video since the refresh from the cards is not synchronized.

Other than that, I love mine and am considering a second one for dual head using eight DVI channels.


There are separate reasons for TVs and computers. For TVs, extra resolution just isn't all that important. Most people can't tell the difference between 480p and 720p, let alone 720p and 1080p.

For computers, there are concerns about display connector bandwidth and LCD fabrication yields, but they are surmountable. The real limiting factor has actually been the terrible state of DPI scaling support in desktop operating systems. Higher resolutions make the UI elements too small to see; it's as simple as that.

The future is bright, though. Apple realized that their uniquely tight control over the hardware and software environment on iOS would allow them to easily switch to a higher resolution display and scale the UI without any compatibility problems. Now that they've released the retina display there's pressure on competitors (including themselves with OS X) to follow suit.


Having worked with pathologists, radiologists, military air traffic controllers, and cinematographers, they seem fairly happy with BARCO 30" 2560 x 1600 (16:10) screens. I guess the question would be what high-stakes decision is going to benefit from higher resolution? Instead of using the zoom function in your imaging tool of choice, you break out a glass and metal loupe and hold it vertical? How is that a plus? I'm sure every imaging company on the planet is well aware of the Retina display and has plans for building out higher resolution manufacturing.

I just haven't met professionals who are sweating that particular detail of their workflow for several years. What industries are you seeing this problem in?


The human eye has much greater resolution at "typical" working distances than current PC displays, and especially TVs.

It's why I like my 1920x1200 14.1" Dell D830. You can find them for about $300 on Craigslist, and you can fit a module bay battery plus a 9-cell which, with an SSD and the integrated video, can get 6-9 hours depending on use.


Funny, I have one of those high-res Latitudes. It's been demoted to home juke box as my Cr-48 has proven to be a much more capable travel companion.

I actually spent a summer working with Edward Tufte, lived in a guest house on his premises. Got the job in part as a physics major who then went to medical school and knew at least enough about brains and eyeballs to be conversant. I think it's safe to say I'm a fanatic about transmitting as much data as possible in the eyespan. That said, what people are doing with the extra data is also non-trivial. We, the societal we, are not spending money in a vacuum: IBM discontinued the T221 monitor fleitz cited above.

My question stands: what industry is suffering mission failures due to lack of screen resolution?


My understanding is that there are a few things at play in this. The first is the screen makers, whose reasons for doing it you mention. The second is bandwidth to the display; my understanding is that that is the limiting factor here. At 2560x1600 you've got (2560 * 1600 pixels) * 24 bpp * (60 frames/second), which is about 5.9 Gigabits per second. At 3840x2400 (1.5x the linear resolution of the above) you've got about 13.3 Gigabits/s. At the moment HDMI has a bandwidth limit of 8.16 Gigabits/s after overhead, which isn't enough for the larger one. DisplayPort 1.0/1.1 also had a limit of around 8.6 Gigabits/s. DisplayPort 1.2, however, goes up to 17.28 Gb/s, so it should allow for the larger displays, but I've not seen many non-Apple DisplayPort monitors myself. DP 1.2 came out in December of 2009 according to Wikipedia, so it should have started to filter out to people by now, but I've not looked into that.
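
Spelling out that arithmetic (uncompressed video, ignoring blanking and link overhead):

    # bits per second = width * height * bits per pixel * refresh rate
    def gbits_per_sec(width, height, bpp=24, hz=60):
        return width * height * bpp * hz / 1e9

    print(round(gbits_per_sec(2560, 1600), 1))   # 5.9  -- fits under HDMI 1.3/1.4's 8.16
    print(round(gbits_per_sec(3840, 2400), 1))   # 13.3 -- needs DisplayPort 1.2's 17.28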


Most monitors that require dual-link DVI will also have DP inputs, since DP outputs are extremely common. That's every 30" and most 27" monitors.

Every laptop I've had in the last few years has had a DP output. I've read that this is because it's cheaper to license than HDMI. (But DP to HDMI conversion boxes are like $20 now.)


I can't say that I've seen that myself, I've got a 27" viewsonic monitor that only has HDMI, but it only does 1920x1080 so it wouldn't need a dual link dvi anyway. My laptop is the same way with a built in hdmi connector rather than a dvi.


A mixture of it becoming much more expensive to produce large high-density panels with an acceptable number of defects and the disproportionate increase in bandwidth and memory required to handle larger screens. Single-link DVI only supports a resolution of 1920x1200. HDMI didn't go above 2560x1600 until a couple of years ago. Displayport's still in that ballpark.


It's worse with laptops. In 2003, the highest resolution laptops were QXGA (2048x1536). In 2008, they were WUXGA (1920x1200). Now (with a couple WUXGA exceptions), they're FHD (1920x1080). I'm very frustrated with laptops mimicking TV resolutions. I use my laptop to make things (code), not to watch movies. I find additional vertical space more useful than horizontal and can't get a screen as tall as I used to be able to.

Note also that QXGA is the widest of these, despite the others being "widescreen".


Most notably, the new Thinkpad T-Series (Tx20) has 16:9 screens instead of 16:10. You think they got smaller? No, they have the same size, just a large part is covered in plastic where vertical screen space used to be. Did I mention that it looks retarded?


If you don't need color, a quick googling gives you a monstrously expensive display with 4096 x 2560 pixels: http://accessories.us.dell.com/sna/productdetail.aspx?sku=A5.... If you need color and are willing to live with just 3280 x 2048, you can spend a lot less on http://accessories.us.dell.com/sna/productdetail.aspx?sku=A5....

I'd be happy with 8 of these (http://barco.com/en/product/1219/specs) arranged in two 2x2 clusters on the sides of my webcam.

edit: Eizo has another cool one, color and 4096 x 2160: http://www.eizo.com/global/products/radiforce/rx840/index.ht.... I don't even want to know how much they cost.


According to http://www.geeks3d.com/20111208/eizo-radiforce-rx840-a-36-4-... the Eizo is about 21'000 Euros


My jaw literally dropped when I saw the price tag on the monochrome Barco. You could get several new cars for that much.


But they are insanely cool, aren't they?

It's the kind of money you spend to develop today against the kind of computer that will be mainstream a couple years from now. At least, that's one bet. Xerox bet correctly - GUIs, bit-mapped displays and object orientation got hot in the mid 80's but, nevertheless, they didn't collect their prize.


But better monitors won't be mainstream a couple years from now; there has been no improvement in the last five years, so why should the future be any different?


There is the persistent rumor Apple will equip their Macbooks and iPads with retina-like displays. Once they do, everyone will have to do it.


I strongly doubt that apple is going to make 300+ DPI screens 10+ inches across. I could always be wrong, but it seems like the amount of horsepower it would take just to draw the screen would destroy battery life. Maybe some super high end, three prong "Cinema Display", but definitely not a mobile device.


I guess as cheap monitors became good enough, demand for pro monitors plummeted, giving them diseconomies of non-scale. The good news is that you won't suffer buyer's remorse since prices don't go down.


Right, because AIUI there are huge fixed costs in setting up and running an LCD fabrication plant. So the biggest market wins.


I'm only postulating, but I think gains in flat-screen technology were put into quality factors other than resolution.


I used an HDTV once that had three different HDMI ports on it (0, 1, 2). Each port reported a slightly-different EDID for the same TV!

One of the HDMI ports reported this (extracted using SwitchResX):

    Established Timings:
    -------------------- 
    		720 x 400 @ 70Hz
    		640 x 480 @ 60Hz
    		800 x 600 @ 60Hz
    		1024 x 768 @ 60Hz
    
    Standard Timing Identification:
    ------------------------------- 
    	#0:	1280 x 1024 @ 60Hz 	(8180)
The two other HDMI ports reported this instead:

    Established Timings:
    -------------------- 
    		640 x 480 @ 60Hz
    
    Standard Timing Identification:
    ------------------------------- 
    
Almost all other EDID data matched, including additional timing data inside the EDID extension block, so I'm not certain these differences were that big of a deal. Nonetheless, it's weird when it's all coming from the same TV.


Is his final point (about overscan) still relevant? Years ago I used to hear HDTV enthusiasts urging everyone to check their TV settings, but in the last 2-3 years, I've only dealt with PCs hooked up to a few HDTVs (all with 1080p native resolution), and I haven't seen a single one that overscans a 1080p DVI or HDMI signal by default. The author acts like it's a certainty that your 1080p TV will by default overscan a 1080p signal.


It's not unusual for DVI input to be underscanned by default, but the HDMI spec says that 1080p should be overscanned. Disabling that requires your TV to advertise its feature set correctly (there must be a VCDB in the EDID, and it has to indicate that it supports underscanning of CE video modes) and your output device needs to send a type 2 infoframe with a request that the mode be underscanned. I really don't know how common it is for all of this to work without manual intervention - most of what I end up playing with is whatever's sitting in hotel rooms, and that's not necessarily a good representation of the current market.


Awesome to see you answering questions here. Let me state that your posts on pgo are better than anything tdwtf can deliver any day. Thanks a lot for looking into these pits (ACPI..?) so that I can be blissfully ignorant. I owe you gin.


I have a one-year-old Samsung LED-backlit LCD and it overscans from a unibody Mac Mini via HDMI into the TV's PC-specific HDMI port. I have to fix it manually with the overscan slider in OSX's display preference pane.


I also have a one-year-old Samsung and I managed to get it to not overscan for at least one input. Put your Mac Mini's HDMI into the HDMI 1 / DVI input and then select the input in the Source menu on the TV, press Tools on the remote, and choose PC as the name of the input. This magically turns off overscan. Hope it works!


Did that already; no joy. It's not a problem because of the overscan slider in OSX, but it was a huge annoyance before I upgraded the mini (you need the built-in HDMI to get the overscan adjustment).


Unfortunately you're most likely losing resolution in this case. If, for example, your TV is cutting off 3% of the image, then the video card is scaling your whole desktop down from 1920x1080 to ~1862x1048, which will result in more than 3% apparent resolution lost due to less-than-optimal interpolation.
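
For example (a quick check, assuming a 3% crop as above):

    # A 3% overscan crop means the driver pre-shrinks the desktop by the same amount
    overscan = 0.03
    width, height = 1920, 1080
    print(round(width * (1 - overscan)), "x", round(height * (1 - overscan)))   # 1862 x 1048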


I'm fine with that; the Mini is used as an HTPC, so I really need the overscan correction only so I can see the menus in VLC / Quicktime.


You should probably have a look for a way to disable overscan in your television's manual. My several year old Samsung DLP supports "PC mode", which disables overscan.


Yeah, same with our LG LED-LCD. It may have been possible to disable it in some menu, I can't quite remember now.


Two data points:

My four-year-old Samsung has a "Just Scan" mode that maps 1:1 to the panel's pixels. I've never had a problem with overscan over HDMI, DVI, or VGA.

When I was home over the holidays, I noticed that even my parents' Black Friday mystery-name TV from Menards offered a 1:1 pixel mode. This mode was strict enough to map a 480i picture into a little box in the middle of the screen.


+1 look for "Just Scan" as a display mode on your TV. On my LG and many Samsungs this does a 1:1 pixel mapping from your source input.


I have a Panasonic VT30 plasma display that was released May 2011 and I had to explicitly disable overscan when I plugged in my PC via HDMI.


I have a LED 6400 Samsung SmartTV, and when I plugged my PC into the HDMI port (I did not try DVI yet) I simply selected the Automatic video mode and no edges were cut.


I have a 6-month old 1080p TV that apparently has no way to disable overscan.

Poor purchasing decision I'll grant (though it is a decent TV), but my point is this area can still be a minefield.


Every single TV I've got right now overscans (two HDTVs); only one gives the option to disable it. I've even got a computer monitor that has only HDMI connectors and overscans by default.

It seems that it's a toss-up as to whether you will hit this or not, all depending on the brand.


While most newer TVs are smarter about overscanning signals from a PC, the issue has gone the other way - to compensate for TVs which overscan, AMD's Catalyst/Vision drivers automatically underscan their output when attached to an HDTV.


Relevant: Viewers are awful too.

Study: 18% of people can't tell if they're watching true HDTV content or not

http://techcrunch.com/2008/11/24/study-18-of-people-cant-tel...


Ugh, this. Comcast re-compresses some of their video streams so much that it almost looks like YouTube. The best video quality I've seen on cable has been HDNet, but many providers have dropped their channels because they can replace them with three low-quality channels.

Still, I've never talked to anyone who has noticed the difference, so I assume it's a rare gripe.


> Ugh, this. Comcast re-compresses some of their video streams so much that it almost looks like YouTube.

All the cable providers are using the same set of transmission boxes from the same vendors.

Usually, they are employing statistical compression: Take 16 channels and determine which video needs more bandwidth in real time. So if your favorite show is opposite talking heads, then it looks fine. But if you're opposite action-packed hunting in jungle scenes (lots of high-frequency content), you're gonna see it.


Except 20 or so channels on my dish network setup are nothing but direct sales garbage.

I think we're suffering because its economically more appealing to treat the TV as a sales machine instead of an entertainment machine. No wonder there's no bandwidth left over on satellite or over those fat DOCSIS connections. Carriers are too busy selling us "lose weight now" bullshit over providing the service we're actually buying.

Toss in its "Public Interest" channels which hold useless junk like religious programming and public access, we have about 50 channels of non-entertainment nonsense. Everything looks like shit because the bean counters and MBAs think Billy Grahame and "Look good in that dress for $19.99" should be in contention over the actual shows and movies I watch.


On satellite there are also the dozens or hundreds of local market broadcast channels. They lobbied hard against a de facto national ABC, CBS, etc.

One of the channels in my area is now encouraging people to pester Dish to pay them per-subscriber money. Meanwhile, Dish has audio dropouts and video artifacts aplenty.

(My impression is that many of the channels you label "Public Interest" are getting a free ride because they don't charge the carrier anything but allow the carrier to increase their advertised channel count.)


…and it's because of all the "useless junk" that they cover on those public gov't access channels that you don't have a single good choice of ISP or infomercial service.

Drop cable and lobby to end the Comcast/AT&T/Time Warner oligopoly.


The perhaps-surprising result is that OTA HD channels are probably higher bandwidth than most channels you'll receive on cable.


> are probably higher bandwidth

It's hard to compare. Yes they are higher bandwidth (15.6 Mbps or so OTA) but they are mpeg2-encoded. Many of the stat-mux cable boxes are mpg4 at a lower rate. So which is better? Depends entirely on what is being broadcast (and where in the stat-mux channels you're looking.)


MPEG-4 cable deployments in the USA are relatively small.


As long as I can see distracting digital compression artifacts on my SD television I have no intention of upgrading to an HD TV and HD cable package.

If anything, I'll drop cable altogether and go OTA.


Commercials for HDTV don't tell the whole story - unless you've got a big ass TV, you won't notice a difference, however if you do then the difference is huge: not because HD is magical fairy dust that enriches your viewing experience, but because non-HD channels become unwatchable.


Why is that a bad thing?


And then you, being technical, insist they need HDTV because it's better, and now their DVR holds 20 hours instead of 200 and they were happier before. True story.


I want to put a display on the wall and stand five feet away with a keyboard and mouse while working (mostly Emacs, web browser, and reading pdfs). What should I check to determine whether a TV would work well in this configuration (without overscan issues and the like)? Is the only safe thing to go to a physical store with the computer, set everything up, and check for artifacts?


Search AVSForum for 1:1 pixel mapping. http://www.avsforum.com/avs-vb/forumdisplay.php?f=166

(It used to be the case that cheaper HDTVs didn't have "reality enhancement" stuff, but Moore's Law has probably eliminated that.)


This is probably most dependent on the video card and drivers that you have available. A good video driver should allow you to adjust things such as the overscan %, centering, etc. to get the best picture possible.

I know from experience using an LG LCD TV with Windows 7 and ATI graphics that all drivers are not created equal. I was forced to downgrade my graphics driver when ATI decided to remove overscan settings from more recent releases. That being said, there are drivers out there that allow you to mold the picture to fit your screen.

Your best bet would be to get a high resolution PC monitor and just use that. The monitor will have much better quality for computer use. If you must get a TV I would say the primary concern is what your computer is capable of, more so than the TV.


The reason that a TV seemed attractive is that I could get a 46" TV for about the same price as a 27" monitor, thus letting me work from a step further from the display with similar perceived pixel/display size.


Why would any technologist reading the likes of this over and over and over, have any faith in future brain-computer interfaces, mind-uploading or similar?


Just remember to keep your eyes closed when doing a firmware update.


I have one of those terrible older 1366x768 TVs. This TV accepts input at 1080i and 1080p as well as 720p. What they don't usually tell you is that some of these TVs will up-convert a 720p signal to 1080 and then down-convert it back to 1366x768. So you're actually better off with a higher resolution signal.

Luckily I can get 1360x768 through the VGA port, but the TV only accepts HD resolutions over HDMI -- this is becoming more of a problem as many computers now come only with DVI-D or HDMI ports.


The VGA port is analog, meaning the signal from the computer gets converted to analog by your video card, then re-sampled to digital by your TV. This seems like a lousy compromise compared to a computer monitor connected to that same computer via DVI or HDMI.


It's only lousy if digital->analog->digital conversion itself is lousy -- and from everything I've read it's nearly impossible to see any difference.

My original intention was to connect the computer via a DVI-to-HDMI cable but with only HD resolutions available with no 1-to-1 pixel mapping this is a no-go.


This would be incredibly funny if it weren't true :-(

It reminds me of Joel Spolsky's rant about standards: http://www.joelonsoftware.com/items/2008/03/17.html -- the bit about headphone jacks in particular.


"and so because it's never possible to kill technology that's escaped into the wild we're stuck with it."

Such a general truth; it's why web devs have nightmares about old versions of IE.


My only complaint about my Samsung TV is that the input select menu takes about 45 seconds to dismiss itself. Most equipment like receivers that drives your TV expects that input select is instant and doesn't show a menu at all. So you press the button to switch inputs and it sticks text on your screen for almost a minute. Stupid.


Mine does that too, but changing the volume dismisses it.


Good to know. The actual remote for the TV has a cancel or return button to kill it, but the BluRay player remote we just picked up only has 5 TV buttons (power, source, volume up, volume down, and mute).


TVs also uniformly have one of the stupidest designs for input ports.

My parents' HDTV has 8 input ports (4 are HDMI). All of them are crammed on the side of the TV and it looks like crap mounted on a wall. Not to mention being a pain to add new stuff. Why can't the TV come with a box that lies horizontal in my cabinet with all the in and out ports and have one umbilical cord hooked into the bottom of the TV? I know you can buy boxes, but it just seems like they should start looking at the implications of flat screens sometime in this century, since they forgot to look in the last.


I'm surprised that when you pay more for a TV, you get more ports, even as it becomes more likely you only need a single HDMI to hook into your receiver.


The marginal cost (which is truly marginal) to add more ports can be easily absorbed by a more expensive TV.

Considering how many people use receivers, I'm surprised there aren't good HD monitors with one HDMI input and no audio support.


I would actually like to see the breakdown, because I am not sure it is the majority of people.


My intuition agrees with yours that less than half of HDTVs are attached to receivers. But I suspect it's over 10%, which should be enough to sustain a few specialized models.


The second comment on the target page has a nice explanation on why this weird resolution of 1366x768 is so popular.

Apparently individual screens are cut from larger sheets of pixels. Using the same vertical resolution for 4:3 screens (1024x768) and 16:9 screens (1366x768) makes it possible to cut them from the same sheet, pushing down the manufacturing costs.


The person who decided that HD Ready should be 1366x768 was a little bit insane.

And personally I have an even crazier problem because of that: when I send a signal over HDMI from a 1280x720 notebook to an HD Ready TV, it actually thinks the signal IS already HD Ready, stretches it by that ~6% difference, crops it by 6%, and stretches it again.

tl;dr: As a result I get roughly the ~1210x660 center part of the original 1280x720 signal stretched to 1366x768... Can't find any solution yet.


There are ways to achieve arbitrary upscaling without loss of information (e.g. FFT-based). It would be interesting if TVs out there utilize such methods or if they scale using some simple interpolation scheme.
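
For what it's worth, the FFT approach amounts to zero-padding in the frequency domain; a quick sketch in numpy (real scaler chips presumably use cheaper filter-based interpolation):

    import numpy as np

    # Upscale a 1-D signal by zero-padding its spectrum (exact for band-limited input).
    def fft_upscale(signal, factor):
        n = len(signal)
        spectrum = np.fft.rfft(signal)
        padded = np.zeros(n * factor // 2 + 1, dtype=complex)
        padded[:len(spectrum)] = spectrum
        return np.fft.irfft(padded, n * factor) * factor

    x = np.sin(np.linspace(0, 2 * np.pi, 16, endpoint=False))
    print(np.round(fft_upscale(x, 4)[:8], 3))   # the same sine, now sampled 4x as densely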


TVs have ASICs that do that, see this subthread: http://mjg59.dreamwidth.org/8705.html?thread=254465#cmt25446...

But it's only possible with a signal designed for perfect scaling. Computers send crisper signals that are designed for no scaling at all, and the transformations hurt those signals.


I thought this was going to be an alarmist article about getting rid of your TV and doing something else with your time.


The funny thing is, Matthew would write that article if he thought it would work.


That's exactly what I also thought when reading the title.


Television is the worst -- and best -- invention of man.

Apart from radio. That was genius.


Wow, I just read the etiquette on Hacker News and then clicked around to this topic. I then just happened to click on user "mrcharles", as I never knew people had profiles before.

Turns out he is a game designer too, and I read lots of interesting stuff. I love this site; it has great people on it!



