
Over a gigabit network it's pretty much unnoticeable; I've been playing Portal 2 and Metro 2033 via in-home streaming and it hasn't been a problem.



Sorry, but a gigabit network simply brings more pixels at once; it does nothing to remove the latency. That's bandwidth, not speed. You still get latency. Whether you notice it or not is a totally different issue.


Your wireless game controller probably contributes more latency than your in-home network. Your television almost certainly does.

That said, latency is additive since everything happens in series, and once the sum crosses a certain threshold ("human perception") you begin to notice. So maybe you save a few ms on the network with wired vs. wireless, a few ms with a fancy "gamer" controller, or a few ms with a nice GPU-powered video encoding algorithm.

Shave enough of these things (components are only now being designed with this sort of obsessive attention to latency in mind), and eventually playing over the network adds less latency than playing directly on a machine from not that long ago.
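
To make the "additive" point concrete, here's a minimal sketch that just sums a hypothetical end-to-end budget; every millisecond figure below is an illustrative placeholder, not a measurement:

    # Illustrative latency budget: every stage adds in series, so the total is
    # what you actually perceive. All figures are made-up placeholders.
    budget_ms = {
        "wireless controller": 8.0,
        "game simulation + render": 16.7,  # one frame at ~60 fps
        "video encode (GPU)": 5.0,
        "network (gigabit LAN)": 0.2,
        "video decode": 5.0,
        "display processing (TV)": 30.0,
    }

    total = sum(budget_ms.values())
    print(f"end-to-end: {total:.1f} ms")  # shave any stage and watch the sum move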


Not quite true. Latency is also lower on a gigabit network: it doesn't send more packets in parallel, it clocks the same bits onto the wire faster. At any given link speed, it takes a fixed amount of time to serialize a given number of bytes onto the wire.

The minimum on-wire (serialization) latency at 100 Mbit is 0.12 ms per 1500-byte packet; on top of that, 1 Gbit cards typically also have lower latency internally.

(You can multiply that 0.12ms number by the number of intervening switches, plus one.)
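
If you want to play with the arithmetic, here's a minimal sketch of that serialization-delay calculation; the frame size and switch count are assumptions, and it ignores the NIC-internal latency mentioned above:

    # Serialization delay: time to clock one frame onto the wire at a given link
    # speed. Assumes store-and-forward switches, so each hop re-serializes the
    # full frame ("number of intervening switches, plus one").
    def serialization_delay_ms(frame_bytes, link_bits_per_sec):
        return frame_bytes * 8 / link_bits_per_sec * 1000

    FRAME_BYTES = 1500   # MTU-sized Ethernet frame (assumption)
    SWITCHES = 1         # intervening switches (assumption)

    for name, speed in [("100 Mbit", 100e6), ("1 Gbit", 1e9)]:
        per_hop = serialization_delay_ms(FRAME_BYTES, speed)
        total = per_hop * (SWITCHES + 1)
        print(f"{name}: {per_hop:.3f} ms per hop, {total:.3f} ms end to end")

which works out to 0.12 ms per hop at 100 Mbit and 0.012 ms at 1 Gbit.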


Interesting, I was not aware of that. Do you have data on how much latency 1 Gbit cards have, then? Is it significantly lower?


http://serverfault.com/questions/276651/network-latency-100m...

Here are some "real numbers" to give you a lead.





