
The base unit is actually a single bit, so why aren't we talking about decabits instead of bytes, following the SI argument?



Networks are often measured in bits.

It's more about tradition. And sometimes a sprinkle of marketing lies (it's cheaper to make a 1024 GB SSD than a 1 TiB SSD).

Disk /dev/nvme0n1: 953.87 GiB, 1024209543168 bytes, 2000409264 sectors
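That fdisk line shows the gap directly. A quick sketch of the arithmetic (the byte count is the one from the output above; everything else is just unit conversion):

```python
# Byte count taken from the fdisk output above.
size_bytes = 1024209543168

gb = size_bytes / 10**9    # decimal gigabytes (GB), what the box says
gib = size_bytes / 2**30   # binary gibibytes (GiB), what the OS reports

print(f"{gb:.2f} GB")      # ~1024.21 GB, marketed as "1 TB"
print(f"{gib:.2f} GiB")    # ~953.87 GiB, matching the fdisk line
```

So the drive really is 1024 GB in decimal units, but falls well short of 1 TiB (1099.5 GB).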


Because deca isn't a good SI prefix, and only gets grandfathered in. Also because some people like weird units like moles. FWIW, in many places ISPs advertise speeds in megabits per second, no doubt to sound eight times faster than they are.


> advertise speeds in megabits per seconds

"bits per second" is what it always has been for computer communication.

When I got started, I got a 300 bits/second modem, which later got upgraded to 1200/75 bits/s and then 2400 bits/second.

Later on, we had 57.6 kbit/s modems, 64 kbit/s ISDN lines, 2 Mbit/s ADSL, etc.

All the speedtest websites I've seen also use Mbit/s.

But sure, if I'm downloading the latest Ubuntu distro, I want to know the current speed in Megabytes/s.
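The conversion between the two conventions is just the factor of eight bits per byte. A minimal sketch (the function name is my own, purely illustrative):

```python
def mbit_to_mbyte(mbit_per_s: float) -> float:
    """Convert a link speed in Mbit/s to MB/s (8 bits per byte)."""
    return mbit_per_s / 8

# A "100 Mbit/s" plan tops out around 12.5 MB/s in a download dialog.
print(mbit_to_mbyte(100))  # 12.5
```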



