Sorry, but a gigabit network simply brings more pixels at once; it does nothing to remove the latency. That's bandwidth, not latency. You still get latency. Whether you notice it or not is a totally different issue.
Your wireless game controller probably contributes more latency than your in-home network. Your television almost certainly does.
That said, latency is additive, since each stage happens in serial, and if the sum crosses a certain threshold ("human perception") you begin to notice. So maybe you save a few ms from the network with wired vs. wireless, a few ms with a fancy "gamer" controller, or a few ms with a nice GPU-powered video-encoding algorithm.
Shave enough of these stages (which are only now being engineered with this kind of latency budget in mind), and eventually playing over the network is less latency-prone than playing directly on a machine from not that long ago.
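To make the additive argument concrete, here is a back-of-the-envelope latency budget. All the per-stage figures below are illustrative assumptions for the sake of the arithmetic, not measurements of any particular setup:

```python
# Hypothetical end-to-end latency budget for game streaming (ms).
# Every number here is an assumed, illustrative value.
budget_ms = {
    "wireless controller": 8,
    "home network (wired LAN)": 1,
    "video encode (GPU)": 5,
    "video decode": 3,
    "TV display processing": 40,
}

# Stages happen in serial, so the delays simply add up.
total = sum(budget_ms.values())
print(f"total added latency: {total} ms")
```

With these made-up numbers the sum is 57 ms; the point is that the TV dominates, and shaving a few ms off any single stage only matters once the whole serial chain is close to the perception threshold.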
Not quite true. Latency is also lower on a gigabit network; it doesn't send more packets in parallel, it sends each packet faster, in serial. At any given link speed, it takes time for a given number of bytes to traverse the wire.
The minimum on-wire latency at 100 Mbit/s works out to 0.12 ms per 1500-byte packet; beyond that, 1 Gbit cards also typically have lower internal latency.
(You can multiply that 0.12ms number by the number of intervening switches, plus one.)
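The 0.12 ms figure is just the serialization delay: the time to clock a packet's bits onto the wire at the link rate. A quick sketch of the arithmetic:

```python
def serialization_delay_ms(packet_bytes: int, link_bits_per_sec: float) -> float:
    """Time (ms) to clock one packet onto the wire at a given link rate."""
    return packet_bytes * 8 / link_bits_per_sec * 1000

# 1500-byte frame on Fast Ethernet (100 Mbit/s): 0.12 ms
print(serialization_delay_ms(1500, 100e6))

# Same frame on gigabit Ethernet: 0.012 ms, ten times less.
print(serialization_delay_ms(1500, 1e9))

# A store-and-forward switch receives the whole frame before resending it,
# so the delay is paid once per switch plus once at the sender.
switches = 2
print(serialization_delay_ms(1500, 100e6) * (switches + 1))
```

This is only the on-wire component; propagation delay and NIC/driver overhead come on top of it.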