I have been crying out about latency in computing for years. My earliest HN comment on the topic was in 2012, and I had been ranting about it for even longer outside of HN.
I really hope that in the later part of this decade we start focusing on latency instead of bandwidth: from Display, Input, Output and Network, down to OS and Software. I had always hoped VR would force those optimisations, VR being latency sensitive, as John Carmack was doing at the time. But VR never happened.
To me, the threshold of cringing is when web forms are laggy. This 50-billion-instructions-per-second box can be slower at rendering an input form than a VT220 terminal from 1983. When the machine's task is to, say, take appointment booking information, and it handles that task worse than hardware from decades ago could in some cases, that's when we know we've lost the plot.
>To me, the threshold of cringing is when web forms are laggy.
Same here; that was what ticked me off. I was reading back through HN threads I had saved, and one on the Remix framework was among them. People were cheering for a well-designed, animated and smooth webpage, because most similar designs tend to be janky rather than smooth. Coming into 2022, rendering a webpage on a modern CPU with a GPU-accelerated browser should be trivial. And yet that webpage was, true to most of the comments and my own experience, an outlier.
A lot of people continue to say that latency doesn't matter. It does. I am picky; I hate latency. I am latency sensitive. I can only wish that some day (again, maybe via VR or the Metaverse) we could kick-start latency computing, or Real Time Computing.
"To make the Metaverse, we need Real Time Computing" seems like a good message for VCs and the consumer market :)
Even worse are text fields that cannot keep up with typing, even on a brand-new processor.
I haven't used Facebook Messenger in years, but the last time I did, typing was at least O(n^2) and would bring any CPU to its knees if messages ran to hundreds of words.
Gumtree is another (it doesn't register characters if you type them too fast). Probably the exact same code, as they were acquired by Facebook.
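To make the O(n^2) claim concrete, here's a toy TypeScript sketch; it is not Messenger's or Gumtree's actual code, just an illustration of how a text box goes quadratic when every keystroke re-processes the whole draft (re-parsing it for links, emoji, mentions, and so on):

    // Stand-in for a handler that re-scans the entire draft on each keystroke.
    function processDraft(text: string): number {
      let work = 0;
      for (const _ch of text) work++; // pretend this is a full re-parse
      return work;
    }

    function simulateTyping(n: number): number {
      let draft = "";
      let totalWork = 0;
      for (let i = 0; i < n; i++) {
        draft += "x";                     // one keystroke
        totalWork += processDraft(draft); // naive handler re-scans everything
      }
      return totalWork;
    }

    console.log(simulateTyping(1_000));  // ~500,000 character scans
    console.log(simulateTyping(10_000)); // ~50,000,000: 10x the text, ~100x the work

A message hundreds of words long makes that quadratic blow-up noticeable on any CPU.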
Yes, or, even more annoyingly, when Slack lags as you type. I think I've had SSH connections to the South Pole with less latency than Slack sometimes has.
A big performance problem is live autocomplete on input: the website runs some kind of query on every character you type, which sometimes causes huge delays. On my phone I can sometimes wait a good 5-10 seconds after I've finished typing for the text to actually show up on screen.
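The usual mitigation is to debounce the input so a query only fires once the user pauses. A minimal sketch; the /api/suggest endpoint and fetchSuggestions are hypothetical stand-ins for whatever the real site calls:

    // Delay calling fn until waitMs has passed without another call.
    function debounce<T extends unknown[]>(fn: (...args: T) => void, waitMs: number) {
      let timer: ReturnType<typeof setTimeout> | undefined;
      return (...args: T): void => {
        if (timer !== undefined) clearTimeout(timer);
        timer = setTimeout(() => fn(...args), waitMs);
      };
    }

    async function fetchSuggestions(query: string): Promise<void> {
      // Hypothetical endpoint; a real page would render the results in a dropdown.
      const res = await fetch(`/api/suggest?q=${encodeURIComponent(query)}`);
      console.log(await res.json());
    }

    // At most one request per 250 ms pause instead of one per keystroke, e.g.:
    // input.addEventListener("input", (e) => onInput((e.target as HTMLInputElement).value));
    const onInput = debounce((q: string) => void fetchSuggestions(q), 250);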
Well, it certainly did not happen the way other technologies happened: smartphones, social networks, laptops, desktops... VR is definitely a slower and smaller market right now than those others were after a similar number of years.
I'm OK with VR being slow; there is no need to rush it. There are some amazing VR experiences: Half-Life: Alyx while standing; VirtaMate, despite being a disaster overall, can have some really interesting erotic setups; Beat Saber is the only music game I've ever really got into; and, IDK, some things are just not possible outside of VR, like SuperHot VR being a totally different everything from its normal version.
I would like there to be more quality content, sooner, but I think it's largely stuck in a grey area between hardware not being good enough yet and software being expensive to develop.
I used to pooh-pooh it, but I've come around to thinking that VR is in the cards, just slowly.
And I'd say VR is advancing way faster than desktops did, and maybe faster than the rest of your list too. You just don't remember the products that existed when those categories were still fairly niche, like the iPAQ, Amstrad and Friendster.
I lived through the beginning and was an early adopter of laptops as well as smartphones. I had a laptop when others had a 386 PC, and I had a smartphone when others had monochrome Nokias. I also saw desktops happen; not from the beginning, but it certainly looked different.
I'm not writing VR off, though. The boom can still happen, and I think it will; I just think it won't look anything like today's headsets. My money's on lasers-into-eyes (virtual retinal display) VR.
A transatlantic ping is waves traveling at the speed of light in fiber, or a large fraction of the speed of light in copper. To send a pixel to a screen you need to wait for the next vsync if you're not using a variable-refresh-rate display; that alone can be double-digit milliseconds (up to 16.7 ms at 60 Hz). Then you need to wait for the liquid crystals in the screen to physically move in response to the applied voltage. Moving things is slow: physical movement happens at speeds far below the speed of light unless you're putting in enough energy to vaporize a city.
If you rely on things that move extremely slowly compared to the light speed in fiber and the near light speed of electrical signals, waiting for a crystal to turn and align itself to a voltage is going to take an eternity in comparison.
I dunno, I don't think it's that obvious: the physical movement in an LCD is across tiny masses and distances (at molecular scale), while the network signal has to cross a distance that's many orders of magnitude larger. Comparing these two things without looking at actual numbers doesn't strike me as valid, even for approximations.
Furthermore, if I’m reading the RTINGS.com source posted below correctly, modern LCDs have smaller latencies than a transatlantic ping. So not only is it not obvious, it’s no longer universally true.
That's only half the story. If you compare a PAL NES game, which runs at a fixed 50 FPS, to a modern video game that runs at 50 FPS, the former is incredibly snappy whereas the latter would be deemed unplayably sluggish. The NES signal is analog, and if you update the video memory, the change more than likely shows up on screen the next frame; the worst possible delay is 20 milliseconds.
Digital displays and (hardware) pipeline-based rendering mean that in practice it often takes several frames between sending the instructions to draw Mario and the little guy showing up on the screen. Each frame still takes less than 20 milliseconds to render, but the latency is much higher; a latency of 100 ms isn't unheard of at 50 FPS with a modern rendering pipeline. This is why modern games feel so sluggish at low framerates.
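A rough back-of-the-envelope version of that claim (the stage count is assumed for illustration, not measured from any particular game):

    // At 50 FPS each frame lasts 20 ms; every pipeline stage that buffers a
    // whole frame (input sampling, simulation, GPU render queue, compositor,
    // display scanout) adds roughly another frame of latency.
    const frameMs = 1000 / 50;  // 20 ms per frame
    const bufferedStages = 5;   // assumed, for illustration only
    console.log(`~${frameMs * bufferedStages} ms from input to photons`); // ~100 ms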
Optimizing for latency is often counter-intuitive.
Polling, for example, can be an effective way to lower latency, e.g. Linux NAPI.
Increasing buffer sizes may also decrease latency, because data can be processed in bursts (see buffer starvation). But if data comes in at a higher rate than it can be processed, buffers fill up, and that is horrible for latency.
> Polling, for example, can be an effective way to lower latency
I had to learn this lesson the hard way recently.
An embedded system where I had to get data out of a couple of SPI controllers' RAM. I made a somewhat complicated ISR and listener system that woke up threads to do deferred interrupt handling… by the time it was all said and done, with a fair bit of code, I wasn't happy with it.
I figured out that data would usually be available every 2 ms, so I just rewrote the read routine to sleep for 2 ms and attempt another read. It "hits" 90% of the time and my average latency is almost nothing, 0.7 ms or so.
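The shape of that loop, sketched in TypeScript purely for illustration (the real thing runs on embedded hardware; readSpiBuffer() is a hypothetical stub standing in for the actual SPI controller read):

    // Fixed-interval polling: sleep ~2 ms, attempt a read, track the hit rate.
    const POLL_INTERVAL_MS = 2;

    function readSpiBuffer(): Uint8Array | null {
      // Hypothetical stub; on real hardware this would read the controller's RAM.
      return Math.random() < 0.9 ? new Uint8Array(16) : null;
    }

    const sleep = (ms: number) => new Promise<void>((resolve) => setTimeout(resolve, ms));

    async function pollLoop(): Promise<void> {
      let hits = 0;
      let polls = 0;
      for (;;) {
        await sleep(POLL_INTERVAL_MS);
        polls++;
        const data = readSpiBuffer();
        if (data !== null) {
          hits++; // handle(data) would go here
        }
        if (polls % 1000 === 0) {
          console.log(`hit rate: ${((hits / polls) * 100).toFixed(1)}%`);
        }
      }
    }

    void pollLoop();

Worst-case added latency is one poll interval, and on average it's a fraction of that, which is why the measured figure can come out well under 2 ms.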
Imgur has become so horrible that it loaded all the recommendations first and still had not loaded the main content before I closed the tab. I wish this were hosted somewhere else.
Right around the time that was published in 2009, I bought a new 17-inch MacBook Pro and a $350 Intel 80 GB 2.5" SATA SSD, and stuck the SSD into the bay formerly occupied by the DVD-RW, using a special holder. It worked great as a boot drive.
This is one of the reasons the hardcore retro gaming people are still paying big money for 32" Sony Trinitron analog CRT TVs and the like: ultra-low latency from image source to display.
If you have a good-condition Trinitron-like display these days (Mitsubishi Diamondtron, Sony PVM, etc.), it is appreciating in value every year.