BitTorrent can do random access just fine, especially with v2, which enables content verification in 16 KiB blocks. Clients tend to be throughput-oriented, which leads to sub-optimal latency, but that's not a limitation of the protocol.
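Roughly, a client can map any byte offset to a 16 KiB block and check that block against the file's merkle root as soon as it arrives. Here's a minimal sketch of that idea, assuming an illustrative proof format; the BEP 52 details (piece layers, zero-padding of the final blocks) are omitted:

```python
import hashlib

BLOCK_SIZE = 16384  # BitTorrent v2 merkle leaves are 16 KiB blocks


def block_index(byte_offset: int) -> int:
    """Map an absolute byte offset within a file to its 16 KiB block index."""
    return byte_offset // BLOCK_SIZE


def verify_block(block: bytes, proof: list[tuple[bytes, str]], root: bytes) -> bool:
    """Check one downloaded block against the file's merkle root.

    `proof` is an assumed representation: sibling hashes from leaf to root,
    each tagged with whether the sibling sits on the 'left' or 'right'.
    """
    node = hashlib.sha256(block).digest()  # v2 hashes blocks with SHA-256
    for sibling, side in proof:
        pair = sibling + node if side == "left" else node + sibling
        node = hashlib.sha256(pair).digest()
    return node == root
```

So a single 16 KiB block can be trusted and handed to the application immediately, without waiting for a whole piece the way v1 requires.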
Connection establishment in BitTorrent might carry marginally more overhead than HTTP, but not by much, and the cost is heavily amortized if you're making many range requests within the same torrent. There is indeed more overhead from HAVE messages and such, but that's just because BitTorrent is doing a lot more than HTTP to enable P2P. It's still not that bad as a percentage of the payload traffic.
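To put rough numbers on "not that bad," here's a back-of-the-envelope sketch using the wire sizes of the standard REQUEST, PIECE, and HAVE messages; the piece size and peer count are assumptions, not protocol constants:

```python
BLOCK = 16 * 1024          # bytes transferred per REQUEST/PIECE exchange
PIECE = 1 * 1024 * 1024    # assumed piece size (varies per torrent)
PEERS = 50                 # assumed number of open peer connections

REQUEST_MSG = 17           # <len=13><id=6><index><begin><length>
PIECE_HDR = 13             # <len=9+X><id=7><index><begin> before the block payload
HAVE_MSG = 9               # <len=5><id=4><piece index>

blocks_per_piece = PIECE // BLOCK
per_piece_overhead = (
    blocks_per_piece * (REQUEST_MSG + PIECE_HDR)  # request/response framing
    + PEERS * HAVE_MSG                            # announce the finished piece to every peer
)
print(f"{per_piece_overhead} bytes of protocol chatter per {PIECE}-byte piece "
      f"≈ {100 * per_piece_overhead / PIECE:.2f}%")
```

Under those assumptions the chatter works out to a fraction of a percent of the payload, which is the sense in which the P2P bookkeeping is cheap.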
Can we really? For a torrent to work you still need someone to host it in full at all times. Who will host a 40 GB file that will be out of date in a week? The website provider will, so we're back to square one.
No, you don't. You just need N people to each host roughly 1/N of the file some of the time, as long as every piece is held by at least one of them. The only person who needs to upload it in full is the original creator.
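A toy illustration of that arithmetic, with made-up numbers (the 40 GB file from above, an assumed 16 MiB piece size, 100 part-time seeders):

```python
FILE_SIZE = 40 * 1024**3    # the hypothetical 40 GB release
PIECE_SIZE = 16 * 1024**2   # assumed piece size
SEEDERS = 100               # assumed number of part-time seeders

pieces = FILE_SIZE // PIECE_SIZE        # 2560 pieces
per_seeder = -(-pieces // SEEDERS)      # ceiling division: ~26 pieces each
print(f"{pieces} pieces; each seeder stores ~{per_seeder * PIECE_SIZE / 1024**2:.0f} MiB")
# As long as every piece index is held by at least one seeder, the swarm can
# reassemble the full 40 GB without any single host keeping a complete copy.
```

Each seeder carries a few hundred MiB rather than the whole 40 GB, and downloaders redistribute pieces among themselves while they're in the swarm.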