This race for more pixels is misguided. The change I most want to see is deeper pixels, at least 10bpp, preferably more. I'm getting really tired of all the banding I see on what should be smooth gradient images. If a landscape aspect image is P pixels wide then to display a gray scale gradient, black to white, you need
ceil(log2(P))
bits per pixel. So a "2K" display needs 11 bits, "4K" needs 12, etc. Then each column of pixels gets a distinct value and there is no banding.
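A quick sanity check of that rule (Python sketch; the bits_for_gradient helper and the width list are just for illustration):

    import math

    # Bits per channel needed so a full-width black-to-white ramp can give every
    # pixel column its own gray level, i.e. the ceil(log2(P)) rule above.
    def bits_for_gradient(width_px):
        return math.ceil(math.log2(width_px))

    for name, width in [("1080p", 1920), ("2K DCI", 2048), ("4K UHD", 3840), ("8K UHD", 7680)]:
        print(f"{name:>7} ({width} px wide): {bits_for_gradient(width)} bits/channel")
    # 1080p -> 11, 2K -> 11, 4K -> 12, 8K -> 13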
So sure, give me a 4K screen. I've seen them and they are sweet, but you MUST also increase the pixel depth at the same time, or there will be artifacts.
And don't get me started on the horrid compression artifacts on basic cable. A crime against quality imagery.
Dithering can help a lot when it comes to displaying large smooth gradients in lower bit depths. Obviously it has its downsides, though - for one, you need to actually implement it where smooth gradients are used, and when it comes to video games this basically never happens, so you end up with notable banding instead.
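To make the trade-off concrete, here's a rough sketch (assuming numpy; names are just illustrative) of quantizing a smooth ramp down to 6 bits with and without sub-LSB random dither:

    import numpy as np

    # A smooth 0..1 horizontal ramp, 1920 columns wide, standing in for a
    # high-bit-depth source image.
    width = 1920
    gradient = np.tile(np.linspace(0.0, 1.0, width), (100, 1))

    levels = 2 ** 6  # quantize to 6 bits/channel to exaggerate the banding

    # Straight quantization: runs of neighboring columns collapse to the same
    # level, which is exactly the banding being complained about.
    banded = np.round(gradient * (levels - 1)) / (levels - 1)

    # Add sub-LSB random noise before rounding: the local average stays correct,
    # so the hard band edges break up into noise the eye integrates away.
    noise = (np.random.rand(*gradient.shape) - 0.5) / (levels - 1)
    dithered = np.round((gradient + noise) * (levels - 1)) / (levels - 1)

    print("band width without dither:", width // levels, "columns per step")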
Dithering in the field of video is pretty common, though. But it has a pretty large problem there as well - since dithering is essentially noise, it requires a lot of bitrate to compress efficiently, and if you don't have bitrate to throw at your source, you're most likely going to kill it and just end up introducing banding. Blu-ray is pretty much the main avenue where you have enough bitrate to spare for proper dithering in 8-bit video. Anything less than that, though... well, let's just say that House of Cards on Netflix was suffering from banding a lot.
Banding is actually one of the biggest reasons why anime fansubs these days are generally encoded in 10-bit H.264. Anime tends to have a lot of large and smooth color surfaces and banding was pretty much the hardest thing to avoid with regular 8-bit video - 10-bit on the other hand makes it an almost total non-issue. And for non-10-bit displays, it moves the necessary dithering to the playback end, which is obviously a much nicer alternative since you don't have to compress any of that in the video itself. And beyond the gradients, 10-bit H.264 actually gives you better compression quality in general, which just makes it even better.
Now, it obviously comes with the downside of not being supported by hardware decoders anywhere, so you basically will need a decent CPU to decode 10-bit video. For fansubs and the people who make them this isn't that much of an issue though, since the advanced subtitles they use are also generally poorly-supported by hardware players, and this has been the case for a long time.
Next-generation video formats may actually bring higher bit depths to hardware decoders as well, though - H.265/HEVC has a Main 10 Profile intended for consumer applications.
This assumes infinite color discrimination abilities, though. I'm fairly certain that after a certain point, even if there are bands, most people would be unable to perceive them.
8 bits isn't "current", if you want "current" you get 10 bit panels. If you want "better than bargain basement" you get 10 bit panels.
At least where I live, one of the differences between the TVs that Costco sells and the ones that others sell is 8-bit (Costco) vs 10-bit (others) panels.
I understand about the panels, but what about the source material? Is color information on a blu-ray stored at 8 bits per channel or is it higher? If the source material is 8 bits per color channel, I still think you'll see banding on smooth gradients no matter how good the panel is. I should try some tests...
Note that the panels are very unlikely to be true 10-bit; they're almost certainly 8-bit with temporal dithering. And the cheap panels are 6-bit with temporal dithering to 8-bit.
That is a good question. And strangely not as straightforward as I might like. "RGB" is 8 bit but video is not encoded as RGB, it is encoded as YCbCr [1], and the conversion from chroma space to RGB space is fractional and depends on the Kr and Kb constants.
The math supports converting a YCbCr 'pixel' into a 30-bit RGB pixel if your conversion constants allow for it. What is unclear is when quantization noise from the compression of the chroma signal shows up as bit instability in the resulting conversion cycle.
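For the curious, a minimal sketch of that Y'CbCr to R'G'B' math using the BT.709 constants (Kr = 0.2126, Kb = 0.0722), assuming full-range normalized inputs for simplicity (real video is usually limited-range, which adds an offset/scale step I'm skipping):

    # Kr and Kb per BT.709; Kg falls out of the other two.
    KR, KB = 0.2126, 0.0722
    KG = 1.0 - KR - KB

    def ycbcr_to_rgb10(y, cb, cr):
        """y in [0, 1], cb/cr in [-0.5, 0.5]; returns 10-bit R'G'B' values."""
        r = y + 2.0 * (1.0 - KR) * cr
        b = y + 2.0 * (1.0 - KB) * cb
        g = (y - KR * r - KB * b) / KG
        clamp = lambda v: min(max(v, 0.0), 1.0)
        return tuple(round(clamp(c) * 1023) for c in (r, g, b))

    print(ycbcr_to_rgb10(0.5, 0.0, 0.0))  # mid gray -> (512, 512, 512)
    print(ycbcr_to_rgb10(0.5, 0.1, 0.2))  # fractional results want the extra bits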
For TV I have no idea, but for a computer, you can get 10 bit per channel output from most graphics cards these days. You still have the problem of the source material being 8bit per channel though.
The banding you see is from image and video compression, not a lack of bpp. Human beings can almost never differentiate 6 bpp from 8 bpp grayscale, and with colors 7 bpp/c is almost always sufficient.
This is true, but transmitting video at 8 bits per channel is problematic because it's too close to the threshold of perception. Any video processing you do on an 8 bpc signal is likely to introduce visible artifacts, and of course today's TVs go crazy with ridiculous video processing even at their default settings before people start playing with the brightness and contrast controls. We need to be transmitting video at 12 bpc so that video processing done at the display doesn't destroy the signal.
I tend to agree, but I would prefer to discourage television manufacturers from doing any more "improving" than they are already doing. If you're sending a signal to a display with the signal so little compressed that 12 bpc would be an improvement for post processing, then you're probably sending the signal straight from a Blu-ray player or desktop graphics card. In both of those cases, I'd rather my TV not do any post processing. (Brightness and contrast on TVs are analog operations and don't change with changes in bit depth. Higher brightness = turn the lights up, not a pointwise linear map.)
This has been in the HDMI spec for 7 years already and most high-end TVs support 10-12 bit input. And for lower-end TVs it doesn't matter anyway. Also, TVs often use at least 10 bits between internal processing steps.
Anyway, what you want is happening for 4k - Rec. 2020 mandates at least 10bit for 4k
Reading a few pages in, if you're not seeing it on the market, it's because the technology is patented (and it's the obvious approach of using an array of LED backlights with DSP correction for the optical scattering of the backlight filter).
As soon as there's the slightest amount of noise, it's impossible to discern between 8bpp vs 10bpp. And real-life applications will almost always have noise.
It's probably going to be like the processor wars. It'll keep going up until they reach a peak and then it'll switch to multi-cores (or in this case, expanding the number of bits).
32-bit is 8 bits per colour channel (24-bit) plus an 8-bit alpha (transparency) channel, not 10-and-a-bit bits per channel. The alpha isn't (normally) sent to the display device, it's used to render overlapping elements at the display generation level (video card). Good LCDs can easily render the full colour depth with headroom to spare. Higher-end graphics monitors are often 12-bit (the extra range is used for calibration/display accuracy, not necessarily for greater depth).
Honest question, does 4K and up really make sense for private use? I am pretty sure that from the distance you usually watch 1080p content, you won't see a difference to 4K content at all.
Of course for big presentations and conference setups I see the use case, just not for the consumer mainstream. 4K would also set back game console performance by a huge margin, if they ever adopt it.
So I am not really convinced that the mainstream will go up to 4K, 8K etc., as it also means that all production costs increase and the benefit for the consumer is marginal.
You probably won't on a 42" screen used as a TV. Maybe not even on a 50" screen used as a TV. Used as a monitor from short range, is another matter entirely.
But Amazon is selling 75" TVs in the UK, and 100"+ projectors. As screen sizes go up, people won't be too happy to see dpi drop again.
We are also becoming accustomed to higher DPI on our screens via tablets etc.. After I got my phone (1280x720 on a 4.7" screen) my 1920x1080 on my 17" laptop was suddenly noticeably grainy. My full HD 42" LED TV is no longer nearly as impressive as it seemed when I got it and compared it to my old TV.
My next phone will be a full HD 5"+ model, and I expect my next tablet to be in the 2500+xwhatever range or more for a 10", because anything else will look worse than my current phone. For a lot of content, sure, you won't notice much difference - I quite happily still watch my DVDs on my full HD TV.
In other words there are plenty of situations where we will notice the difference to 4K, even if we'd be perfectly happy with 1080p if we didn't have a higher DPI device to compare to.
Screen size isn't the only relevant factor. Given any size of screen, more resolution means that viewers can sit closer to the TV, thus filling more of their field of view with the screen.
> Honest question, does 4K and up really make sense for private use?
It's a good question, but my answer is definitely, Yes.
My ideal work station would be a room with a wall-sized 8K touch-screen monitor covering the wall in front of me. I would have an adjustable table with a comfy chair and treadmill looking over the table and across the room at the wall screen. Lower the table and sit in the chair. Raise the table and use it as a standing desk or step onto the treadmill. Step out from behind the table and pace around the room, examining the code and how it is running in the four browsers open in windows beside the code. Walk up to the screen and use the browsers via touch. Step back, look at the code, walk up to it and interchange two lines by dragging, then say, "Macro! Run!" and watch it go.
Walk back to the treadmill and turn up the speed for a victory lap as you watch your code finally doing what it is supposed to....
Have you used multi-monitor setups before? Anything more than about two screens, and I personally find myself getting really annoyed because I have to turn my head instead of moving my eyes. I feel like a full-wall monitor would be even worse.
I've been using a 39" Seiki 4K monitor for my application development 'needs' for about 2 weeks and I have to say 4K is a substantial step forward from 3 or 4 1080p screens. HDMI 2.0 is great because - especially for desktop use - we can increase from the 30Hz limit of HDMI 1.4.
Yes, I've used multiscreen setups before and always wanted more. I always have multiple apps running, which means that I have to flip from one to the other if I can't just turn my head. If it's paperwork, I tend to spread it out on my desk rather than keeping it all in one tidy stack and bringing each page to the top as I need it. I'm sure there are large screen setups I wouldn't like, but in general I'd rather turn my head (or my gaze) than repeatedly swap things to the front of the stack.
> Honest question, does 4K and up really make sense for private use ?
For big monitors I think it does. Even if 4k doesn't make sense, the increased bandwidth is a win as the previous HDMI spec couldn't drive a 2560x1440 or 2560x1600 monitor at 60hz, and those resolutions definitely make sense to me for computer monitors.
Yeah you are right, I'd definitely adore it for computer monitors. Still, the mainstream is stuck with 22 inch TN panel 1080p screens, which are miserable, but consumers don't seem to really care. There's only a handful of models that offer more than 1080p resolution in today's market.
That's interesting, because most of the high resolution monitors I've seen explicitly say you need to use DL-DVI to drive them at full resolution (for example the 27" Dell Ultrasharps.)
Many companies cheap out on the supporting electronics. In particular single-link DVI is electrically compatible with HDMI but is limited to ~1920 x 1200 @ 60Hz. Basically instead of properly implementing HDMI they route a single-link DVI transceiver through an HDMI connector and call it good.
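Roughly how that limit falls out, assuming single-link TMDS's 165 MHz pixel clock cap and a flat ~12% blanking overhead (a hand-wavy stand-in for CVT reduced blanking):

    # Why single-link DVI tops out around 1920x1200@60: one TMDS link is capped
    # at a 165 MHz pixel clock, and blanking intervals eat into that budget.
    SINGLE_LINK_MAX_HZ = 165e6
    BLANKING_OVERHEAD = 1.12  # assumption: ~12% extra clocks for blanking

    def pixel_clock(w, h, refresh=60.0):
        return w * h * refresh * BLANKING_OVERHEAD

    for w, h in [(1920, 1200), (2560, 1440), (2560, 1600)]:
        clk = pixel_clock(w, h)
        verdict = "fits" if clk <= SINGLE_LINK_MAX_HZ else "needs dual-link"
        print(f"{w}x{h}@60: ~{clk / 1e6:.0f} MHz -> {verdict}")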
We can digitally capture images at much, much higher resolution than we can hope to display them. Even my giant 27", 2560x1440 monitor is only 3.7 megapixels.
Given the importance of photography in modern culture, it's more than a little weird to be so unbalanced.
As someone who builds prototype displays, I can tell you, we can always use more bandwidth and more resolution, even if we just abuse it to play tricks. The interfaces we have today are horrible for high frame rate, 3D, and anything that needs to sync up.
I think it's a mistake to assume the existing sizes and viewing distances will remain the same into the future. Yes 1080p is just fine when the thing you're viewing takes up 10 degrees of your vision. However, as screens get larger and/or closer, it's actually pretty easy to tell the difference between 1080p and 4k.
I'm really looking forward to getting a 4k computer monitor in the near future, and a 4k tv that's >100 inches in the middle future.
Most certainly right, but I really think sizes of >50 inches won't be mainstream for a long time, if ever. That means huge devices will be low volume and high price. You can see this in the PC monitor space already, as there's basically nothing happening since most people seem to be happy with their 1080p 22-24inch screens. My two 24inch 1920x1200 Dell from 2006 are still better than most of the TN panel Full-HD crap you can buy today, which is pretty sad.
> My two 24inch 1920x1200 Dell from 2006 are still better than most of the TN panel Full-HD crap you can buy today
Agreed, I use 3 of the new ones. 16x10 is a MUCH better aspect ratio in my opinion. I really don't understand manufacturers' obsession with 16x9 given that most of what people do on computers involves scrolling vertically, not horizontally.
Thankfully Dell are still selling them. And at $260 for 24" 16x10 IPS, I'm sure they're now much cheaper than in 2006.
EDIT: Indeed, looks like prices have fallen about 70% [1], although I'm sure that was RRP
Well I recently saw a MacBook Retina and was shocked at how amazing it was. I dislike the OS, the case design, the keyboard -- everything about it. But wow, that screen. Painted on is a good description.
On game performance, I thought pixels were a fairly parallel problem? That is, the Xbox vNext could simply overcome it by not using a low-end video card like the XB1? Just add shaders and whatnot.
TBH no. But in the living room, sitting a couple of meters away from my 42 inch 1080p TV, I can hardly see a difference between 720p and 1080p content. It would make sense for computer monitors though!
120 fps is the minimum for me when it comes to playing games. Then again I was used to 120hz monitors on CRTs in the early 2000s with 0 input lag.
I would only ever consider 4k for normal desktop usage but there's no way I would pay anything for 60 fps and we all know 4k monitors will be stupidly expensive for at least 10 years.
It always seemed strange that monitors aim for 60fps. That seems like a recipe for input lag. They could aim for 90fps, because when they fail to meet that, at least they'll still achieve 60fps.
Then again, the audience of hardcore gamers who would notice that sort of thing is probably small, so I guess it makes economic sense.
This is also why most LCD manufacturers use low quality TN panels. A majority of people just can't tell the difference, don't care enough or they don't care about things like viewing angles.
LCD manufacturers in general are really crooked. They have been accused and found guilty of price fixing on multiple occasions.
They also love to release monitors with high quality panels initially so they get good reviews, then swap in garbage panels without letting consumers know and keep selling the same model at the same price.
Any proof of panel swapping happening in later models? Not that I couldn't see it happening, I would just like to see an article documenting such an event.
I don't actively follow it anymore but I remember there being a huge fiasco over Dell's 2007fp series. A few hardware communities labeled buying one as playing "panel roulette" or "panel lotto".
It went on for a few years after its initial release so it's still semi-current, although nowadays I doubt anyone would be buying that model.
I remember a 100ish page thread on hardforum documenting which panels lined up with what serial numbers. I actually played the roulette and bought one off a dude. I ended up getting an s-ips panel and I still use it as my second monitor. It's not 120hz but I don't game with it.
I'm sure there were/are more cases but this was the most popular one because the s-ips version of this monitor was amazing for its time. It was my first LCD purchase.
I use a Dell U2711. Best monitor I've ever owned. Reviews say noticeable input lag, but I've never noticed any and I'm a fairly hardc0r3 gam3r.
More to the point, I'm a graphics programmer / researcher, so monitors with low Delta E are important for my work. The Dell U2711 combined with an i1Display Pro can be calibrated to exceptional color fidelity.
I have an HP LP2475W and a Dell U2412M; generally, among the IPS class, the consensus seems to be that HPs have more outputs and a better UI. Some of them are wide-gamut, however, which turns off a few purchasers. Research as to whether or not wide-gamut is important/detrimental to you.
I'd say go for 24s or a 30; if you have the 27 you'll just wish you had the 30 :-p but definitely go IPS, they are fantastic. I haven't read much about the Korean monitors as they were nonexistent when I researched the market 4-5 years ago.
>It always seemed strange that monitors aim for 60fps. That seems like a recipe for input lag. They could aim for 90fps, because when they fail to meet that, at least they'll still achieve 60fps.
It's not like modern games have that much CPU time to prepare each 1/60 sec frame -- 16ms actually.
As for input lag, they can scan input at different frequencies and adapt the next frame accordingly, but it's not like the input is gonna show in any less than 1/60 for a graphically involved game.
In this context, input lag refers to a monitor phenomenon. Some monitors have terrible input lag. The game can't compensate since it's due to the monitor.
Yep, input lag in an LCD monitor context is the amount of time it takes for a frame to be displayed on the screen.
A lot of panels have a varied amount of input lag. Some of them are in the single digits of milliseconds. On the other side of the spectrum you could see upwards of 100ms.
It's really bad when you get something like 30ms to 70ms of random delay. You can't predict this type of randomness and since it happens every frame it makes for a very poor gaming experience.
Think about a competitive game like quake 3 where you might be playing at a LAN where you have a 5ms ping but your monitor is giving you 30-50ms of random input lag. You would be at a -massive- disadvantage if your opponent had a monitor with 2-7ms of input lag.
There is a world of difference between 30ms and 60ms at a competitive level.
Well, your eyes can't see above 24fps so it does not matter
</joke>
Honestly, I don't care after 60fps, even for games. Are 4k monitors (smaller screens) available? I only see huge 4k TVs, never heard of monitors. (It is not like I can afford one though :()
People who enjoy games at a competitive level don't just out grow it once they hit late 20s or 30s. I'm 33 by the way. I'm extremely confident that anyone playing a fast paced game at a highly competitive level would notice the difference between 60 and 120 fps+hz -- even in a best case scenario where the frame rate wasn't varied. Frame rate variance is one of the most noticeable things, but even a rock solid 60 fps+hz is choppy.
Look at how many grown men/women play or watch televised sports. There is no real difference between a 40 year old person watching an NFL game and a 40 year old person watching a Quake Live or Heroes of Newerth tournament.
I don't watch regular sports but I can see the appeal. I mean, look at a sport like basketball. It seems like a single player can almost carry a team by himself as long as the other players are somewhat competent. Watching someone who is so good that he makes "professionals" seem like amateurs is really entertaining.
You do not have the statistics to back that statement up.
People usually have less time for games once they have kids, but they have less time for basically everything. Super Mario Bros came out 28 years ago; it's existed for essentially the entire life of anyone younger than mid-30s, into people in their 40s for it being around since their childhood. And that still doesn't even cover people that grew up with an Atari 2600 (36 years ago) or grew up in the culture of arcades (Pong was 41 years ago).
Even getting past the "competitive level" that the GP is going on about, the comparison to spending the afternoon "watching the game" is exactly right, and it's really not that big a deal for most people.
(the best statistics you'll find, by the way, put the median age of gamers at 30--meaning every single person in the group you think solely cares about games is matched by a person older than that--though the methodology is so-so and it includes a wide variety of "gaming", including on the ever-present mobile device)
There's nothing stopping a competitive gamer from having a job or family. You also don't need to play 10 hours a day to remain competitive.
If you sleep 7 hours a day and work 40 hours a week there's still 9 hours a day to do whatever you want and free weekends assuming you work a typical 9-5 job which might be common but certainly isn't what everyone does.
How you choose to spend those remaining 9-10 hours a day is up to you. Some people would rather watch TV for 3 hours a night and sleep more so they lose almost half their free time because they enjoy sleeping in and watching some shows. Nothing wrong with that but not everyone spends their time doing that.
I think topics are mixed up in this thread and more strangely nobody is mentioning it!
1. The (120)fps you refer to is about the graphics card, not the monitor.
2. The (120)hz refresh rate of an LCD is completely different from a CRT's. BTW, I never heard of a 120hz high-res monitor. Mitsubishi or NEC maybe. Could you give the brand and model?
3. Monitor input lag will _always_ be there with LCDs. There are no 120hz or higher LCD panels; they do real-time interpolation with 60hz panels, and that's not natural if you look closely. Maybe, just maybe, OLED will improve response time or input lag as you refer to it.
There are tons of 120hz LCDs. Some of the Korean models that you can pick up for $350 can be manually set to run at 120hz but you have to be lucky. They run at 2560x1440.
Most 120hz LCDs run at 1080p though, and you can typically buy them at Newegg or wherever you usually get computer parts. You should be able to find dozens of models by Googling for "120hz LCD monitors".
Good point actually. Domains like vision research are again left in the cold. And then people sometimes wonder why oh why researchers like to stick to old standards and still use 20yr old fluor CRTs driven over VGA cables :P
Btw 3D using shutter glasses on a normal 120Hz LCD might look ok when playing games and whatnot but is in reality not: the amounts of crosstalk are beyond what can be used for scientific research unless you stick to a small area on the screen and tune everything specifically for that area.
It came out a little while ago, so it only supports 30fps at 4k, but this 39 inch 4k monitor is decently reviewed, and only $700: http://www.amazon.com/gp/aw/d/B00DOPGO2G
30 fps is even a bigger joke than 60. I think even the general public would get nauseous watching a game being played at a solid 30 fps, it's much different than watching a low fps movie.
If I were to drop $700 on a new display I would rather buy 2x $350 2560x1440 monitors that can run at 120hz at that resolution. They've been on the market at that price for quite a while now.
Back in 2001 or 2002 I bought a 21" 1600x1200 120hz CRT for $80 for comparison. It also had no input delay, perfect colors and the screen itself was flat. The only downside is it weighed like 80 pounds.
Most current-gen console games target 30FPS. An exception being Rage made by id software. Carmack was very proud that it ran at 60fps, but nobody else seemed to care.
It's a different story for PC gaming and competitive gaming though.
> If I were to drop $700 on a new display I would rather buy 2x $350 2560x1440 monitors that can run at 120hz at that resolution. They've been on the market at that price for quite a while now.
May I ask, which monitor are you referring to? A $350 monitor that can do 2560x1440 and run at 120hz? I need a monitor like that for my graphics research.
>30 fps is even a bigger joke than 60. I think even the general public would get nauseous watching a game being played at a solid 30 fps, it's much different than watching a low fps movie.
People after 4K displays in the current market aren't going for game machines...
If his math is correct then 8k, which is 4x the pixels of 4k, would need 4x that bandwidth, and even Intel's Thunderbolt 2.0 is "barely" at 20 gigabit/s, which means we can't go much further yet. It will be a while until we reach 80 gigabit/s.
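For a rough sense of scale, the raw pixel bandwidth (uncompressed 24 bpp at 60 Hz, ignoring blanking and line-code overhead) works out like this:

    # Raw pixel bandwidth, ignoring blanking and encoding overhead.
    def raw_gbps(w, h, bpp=24, fps=60):
        return w * h * bpp * fps / 1e9

    for name, (w, h) in [("1080p", (1920, 1080)), ("4K UHD", (3840, 2160)), ("8K UHD", (7680, 4320))]:
        print(f"{name:>7}: ~{raw_gbps(w, h):.1f} Gbit/s raw")
    # ~3.0, ~11.9, ~47.8 Gbit/s respectively -- so 8K/60 alone already blows past
    # HDMI 2.0's 18 Gbit/s and Thunderbolt 2's 20 Gbit/s, before any overhead.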
8k is also an OUTRAGEOUS amount of pixels to throw around. Unless you're eyeball to eyeball with a 40" display, your eye can't distinguish between 4k and 8k.
I'm not sure I understand the need for 8k. If you want to watch your TV at an optimal viewing angle of 20°~40° then 4k is more than enough to achieve "retina display" at the required viewing distance.
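A quick pixels-per-degree check backs that up, assuming the usual ~60 ppd "retina" rule of thumb and (roughly) spreading the pixels evenly across the viewing angle:

    RETINA_PPD = 60  # ~1 arcminute per pixel, the usual 20/20-vision threshold

    def pixels_per_degree(h_pixels, viewing_angle_deg):
        return h_pixels / viewing_angle_deg

    for name, px in [("1080p", 1920), ("4K", 3840), ("8K", 7680)]:
        for angle in (20, 30, 40):
            ppd = pixels_per_degree(px, angle)
            flag = "ok" if ppd >= RETINA_PPD else "below retina"
            print(f"{name} at {angle} deg: ~{ppd:.0f} ppd ({flag})")
    # 4K stays at or above ~60 ppd out to about a 64 degree viewing angle;
    # 8K only starts to matter for screens filling even more of your view.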
Anyone know how this compares to displayport? I've always heard (perhaps wrongly) of DP being technically better and patent free, so I wonder why HDMI still gets so much more attention...
I've found that the biggest problem with DisplayPort is that even if it is technically superior (unfortunately I'm not qualified to know whether it is or not), active DisplayPort/HDMI/DVI adapters are incredibly expensive.
Since HDMI and DVI came first, this limits the use of DisplayPort; manufacturers are unwilling to create a device that requires a $100 adapter to work with what most people own.
The failure to redesign the connection form factor is a major disappointment. Racking or repositioning a snug receiver with multiple HDMI inputs and outputs will almost guarantee that one will bend or break. This is especially true for low gauge runs over 50 feet.
The obvious positive side of this is that existing cables will support the new spec without being replaced... Says so right there in the doc.
I'd rather stick with the same form factor and cables than replace everything at some insane cost. Not to mention the awkwardness of supporting 2 types of inputs on a receiver/amp/switcher to maintain legacy compatibility.
Disappointing that this means we won't see 4k 120 hz screens any time soon, and that most consumer hardware in the next 5 years will have compatibility issues with them. I love my pixel density and refresh rate!
It's for things like Dolby Atmos[1]. Instead of having to be stuck with a premixed 5 or 7 channel format designed for an ideal that your setup probably is not, you instead include unmixed sound and where it would ideally be coming from. The receiver then has knowledge of speaker and seating positions and dynamically mixes the output to match the room.
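Conceptually the mixing step is simple. Here's a toy sketch of object-based rendering (distance-weighted panning onto whatever speakers exist, with constant-power normalization); it's only an illustration, not Dolby's actual renderer:

    import math

    # Each speaker gets a gain based on how close it is to the object's intended
    # position; gains are normalized so total power stays constant.
    def object_gains(obj_pos, speaker_positions):
        weights = [1.0 / (math.dist(obj_pos, spk) + 1e-3) for spk in speaker_positions]
        norm = math.sqrt(sum(w * w for w in weights))
        return [w / norm for w in weights]

    # Example: an object front-left and slightly above, rendered onto an
    # arbitrary 5-speaker layout measured in the listener's room.
    speakers = [(-1, 1, 0), (1, 1, 0), (0, 1.2, 0), (-1, -1, 0), (1, -1, 0)]
    print([round(g, 2) for g in object_gains((-0.5, 1.0, 0.3), speakers)])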
Interesting that a "virtual speaker set" won over higher-order Ambisonics (representing incoming audio as spherical harmonics). I wonder if it just boils down to easier engineering.
Really? I've heard some pretty amazing demos of ambisonics even with just two speakers - things which completely blow away anything I hear through mainstream productions with 5.1 and 7.1
I find this curious, as you should in theory be able to map your 30 speakers to SH coeffs if you are using high-enough order SH (although 4th degree SH needs 25 coefficients so I guess there's not enough savings)
Even uncompressed and being extremely generous with the bit depth/rate: 32 ch * 24 bits * 96 kHz = 73 Mbps. That's a drop in the bucket on an 18Gbps connection.
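Spelling that arithmetic out:

    # 32 channels of generously specced uncompressed audio vs the 18 Gbit/s link.
    channels, bit_depth, sample_rate = 32, 24, 96_000
    audio_mbps = channels * bit_depth * sample_rate / 1e6
    print(f"{audio_mbps:.1f} Mbit/s uncompressed")          # ~73.7 Mbit/s
    print(f"{audio_mbps / 18_000 * 100:.2f}% of HDMI 2.0")  # ~0.41%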
As mentioned, this is probably aimed at theaters. Also consider that you could send four separate 7.1 audio programs simultaneously over the same link, which would let any audio program switching happen at the receiving end instead.
Actually, the Engadget article isn't very specific about "32 audio channels". Possibly they mean 32 separate audio programs instead? Audio is so cheap bandwidth-wise that it's practically free these days.
Can we just have 3D sound now, with X, Y and Z coordinates to go along with the sound clips, and the speaker system routes the right sound to an arbitrary number of speakers? That works amazingly well for FPS games...
To me, hundred high-quality audio channels with lossless compression sounds not "INSANE" but "still less than the video". Playback processing might be tricky, streaming might be tricky, but actual storage for any movie or home cinema setup should be trivial. You can easily put a hundred audio channels on a fraction of a bluray disc, not counting any future improvements in storage.
>To me, hundred high-quality audio channels with lossless compression sounds not "INSANE" but "still less than the video".
Well, 100 channels of 16/44.1 is about 60 GB uncompressed for a 2-hour movie, or roughly 30 GB with ~50% lossless compression. So not really "less than the video".
And that's for 16/44.1 lossless. For 24/96 adjust accordingly.
(Of course not all 100 channels will have sound playing all the time -- except if we're talking about a musical or rock concert video. But it gives a taste of the max storage requirements.)
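The back-of-the-envelope math under those assumptions (16/44.1 source, roughly 2:1 "50%" lossless compression):

    channels, bit_depth, sample_rate = 100, 16, 44_100
    seconds = 2 * 3600
    raw_gb = channels * bit_depth * sample_rate * seconds / 8 / 1e9
    print(f"uncompressed: ~{raw_gb:.0f} GB")          # ~64 GB
    print(f"at ~2:1 lossless: ~{raw_gb / 2:.0f} GB")  # ~32 GB
    # 24/96 would be ~3.3x bigger; either way it's in the same ballpark as the
    # video stream on a Blu-ray rather than negligible next to it.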
In fairness, they also feature lossy audio, although that's less of a ratio: around 6:1 for a typical consumer DTS track, maybe 8:1 for AC-3/Dolby, compared to 5000:1 or more for video. (Uncompressed 6 channel 16/48k is ~4Mbit, typical DTS is 750kbit, occasionally 1.5Mbit, AC-3 is around 450-500kbit.)
I suspect not, but I can't find any details on whether this new spec provides for more power than the current standard so that low-power devices (Chromecast) don't need a separate power cord.
Hmm how will monster cables and the like sell their stuff when the standard makes clear that existing cables can already handle the greatly increased bandwidth? I look forward to the spin!
So the engineers pushing forward the video connection standard should wait for a giant monopoly to change its terrible ways before they make any progress? Come on.
On the source material, yes. But display devices may introduce some delay on the video due to scaling, de-interlacing, temporal noise reduction, and whatever other processing is done. The display device then needs to signal back to the amp how much delay it's introducing into the video so that the audio can be delayed by the same amount.
If you connect your source through a surround processor, it helps to be able to delay the audio to match the delay of the TV, which may vary depending on TV settings (e.g. game mode may be lower delay but with less picture processing). I assume this spec allows the TV to inform the surround processor continuously of the current picture delay.
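A minimal sketch of what the receiving end would do with that number: a FIFO delay line on the audio, resized whenever the display reports a new processing latency (illustrative only; the real signaling is defined by HDMI, and the class/method names here are made up):

    from collections import deque

    class AudioDelay:
        """Delay audio samples by the video latency the display reports."""
        def __init__(self, sample_rate=48_000):
            self.sample_rate = sample_rate
            self.buf = deque()
            self.delay_samples = 0

        def set_video_latency_ms(self, ms):
            # Called whenever the TV reports a new pipeline delay, e.g. after
            # switching in or out of game mode.
            self.delay_samples = int(ms * self.sample_rate / 1000)

        def push(self, sample):
            # Output silence until the buffer covers the delay, then emit
            # samples exactly delay_samples behind the input.
            self.buf.append(sample)
            return self.buf.popleft() if len(self.buf) > self.delay_samples else 0.0

    d = AudioDelay()
    d.set_video_latency_ms(50)   # display says its processing adds 50 ms
    delayed = [d.push(s) for s in range(3)]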
Once upon a time I hacked something like that into TVTime (http://tvtime.sf.net) that constantly updated a delay effect I'd modified on my SB Live 5.1's emu10k1 DSP.
This is a really nice feature for whole-house setups.
Different feeds can have slightly different syncs. This is especially true with a matrix operation that uses multiple screen feeds and a single audio feed.
I know "closed captions" are a special thing, but really, shouldn't the rendering device (TV decoder, GPU, software, whatever) be rendering the subtitles instead of the display device? I watch almost everything with subtitles, and can't imagine not being able to adjust subtitles (size/depth) and also enjoy the high-quality rendering my PC can perform.