So you're saying it's because each amplitude is a 7-bit number? I guess that's a valid issue, but it's still very "well, actually." So to counter with my own "well, actually": you don't need complex amplitudes, and the computation only matters with probability bounded away from 1/2. Surely you could do enough engineering tricks to make that happen and keep it at about a petabyte.
No, it's because each amplitude is a 128-bit number (2^7 = 128; really two 64-bit floats, one each for the real and imaginary parts).
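For concreteness, here's the back-of-the-envelope arithmetic (the qubit counts below are my own illustration, not figures from the post): a full state vector over n qubits has 2^n amplitudes at 16 bytes each.

```python
# Memory cost of a full state-vector simulation, assuming one complex
# amplitude per basis state stored as two 64-bit floats (16 bytes).
BYTES_PER_AMPLITUDE = 16  # complex128: 64-bit real part + 64-bit imaginary part

for n_qubits in (40, 46, 50):
    total_bytes = (2 ** n_qubits) * BYTES_PER_AMPLITUDE
    print(f"{n_qubits} qubits -> {total_bytes / 2**50:g} PiB")
# 40 qubits -> 0.015625 PiB, 46 qubits -> 1 PiB, 50 qubits -> 16 PiB
```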
Yes, you very much DO need complex amplitudes. The ability to phase-shift qubits is key to most basic quantum algorithms.
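A minimal numpy sketch of why the phases matter (my own example, not anything from the original exchange): two states with identical measurement statistics on their own diverge as soon as a Hadamard gate lets their amplitudes interfere.

```python
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate
S = np.array([[1, 0], [0, 1j]])                # phase-shift (S) gate

plus = np.array([1, 1]) / np.sqrt(2)           # (|0> + |1>)/sqrt(2)
shifted = S @ plus                             # (|0> + i|1>)/sqrt(2)

def probs(state):
    return np.abs(state) ** 2                  # Born rule: |amplitude|^2

print(probs(plus), probs(shifted))             # both [0.5 0.5]: the phase is invisible here
print(probs(H @ plus), probs(H @ shifted))     # [1. 0.] vs [0.5 0.5]: the phase matters
```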
I'm not sure what you mean by "the computation only matters with probability bounded away from 1/2".
Almost by definition, the most "interesting" quantum algorithms are those that are most difficult to simulate classically. For example, you can use "tricks" to greatly speed up simulation if your states are separable, but then you're not really harnessing the full power of the quantum model. The most powerful quantum algorithms entail maximal entanglement and worst-case simulation performance.
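To illustrate the separable case, a tiny sketch (mine; it uses the standard Schmidt-rank test): a two-qubit state is a product state exactly when its 2x2 reshaping has rank 1, and only then can it be stored as per-qubit vectors instead of the full 2^n amplitudes.

```python
import numpy as np

def schmidt_rank(two_qubit_state):
    # Number of nonzero Schmidt coefficients = rank of the 2x2 reshaping.
    return np.linalg.matrix_rank(two_qubit_state.reshape(2, 2))

ket0 = np.array([1.0, 0.0])
plus = np.array([1.0, 1.0]) / np.sqrt(2)

product = np.kron(ket0, plus)                        # |0> tensor |+> : separable
bell = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)   # (|00> + |11>)/sqrt(2)

print(schmidt_rank(product))  # 1 -> storable as two 2-vectors (2n numbers for n qubits)
print(schmidt_rank(bell))     # 2 -> no per-qubit factorization; need all 2^n amplitudes
```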
Now I think you don't understand quantum computing as well as you claim. The standard model of quantum computing uses complex numbers, but it is well known that you can do everything with just real numbers instead. See exercises 10.4 and 10.5 of Arora-Barak [1]. The key insight in quantum computing is not that complex numbers are required, but that the invariant preserved across the computation is the 2-norm of a vector (as opposed to the 1-norm, as in the classical stochastic model of randomized computation).
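To spell out the real-numbers point, a small numpy sketch in the spirit of that reduction (the helper names realify_state and realify_operator are mine, and this follows the standard encoding rather than the exercises verbatim): replace each complex amplitude a+bi with the real pair (a, b), and each complex matrix entry with the 2x2 block [[a, -b], [b, a]]. A complex unitary then becomes a real orthogonal matrix, and the 2-norm is exactly the quantity that is preserved.

```python
import numpy as np

def realify_state(psi):
    # (a0+i*b0, a1+i*b1, ...) -> (a0, b0, a1, b1, ...)
    return np.column_stack([psi.real, psi.imag]).ravel()

def realify_operator(U):
    # Replace each complex entry a+i*b with the 2x2 real block [[a, -b], [b, a]].
    return np.block([[np.array([[u.real, -u.imag], [u.imag, u.real]]) for u in row]
                     for row in U])

U = np.array([[1, 1], [1j, -1j]]) / np.sqrt(2)    # some single-qubit unitary
psi = np.array([0.6, 0.8j])                       # a normalized complex state

R, v = realify_operator(U), realify_state(psi)
print(np.allclose(R.T @ R, np.eye(4)))            # True: the real version is orthogonal
print(np.allclose(realify_state(U @ psi), R @ v)) # True: it computes the same thing
print(np.linalg.norm(psi), np.linalg.norm(v))     # both ~1.0: the 2-norm is the invariant
```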
Further, the standard quantum model of computation is probabilistic in the sense that you "compute" something if your program outputs the right answer with probability at least 2/3. But 2/3 is not special; you just need some probability bounded away from 1/2, in the sense that it can't get closer and closer to 1/2 as the input size grows. (Any probability bounded away from 1/2 can be boosted toward 1 by repeating the computation and taking a majority vote.)
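A quick sketch of that amplification using exact binomial sums (the particular probabilities and repetition counts are just my illustration):

```python
from math import comb

def majority_success(p, runs):
    # Probability that more than half of `runs` independent trials succeed,
    # when each trial succeeds with probability p.
    return sum(comb(runs, k) * p**k * (1 - p)**(runs - k)
               for k in range(runs // 2 + 1, runs + 1))

for p in (0.55, 2/3):
    print(p, [round(majority_success(p, r), 4) for r in (1, 11, 101)])
# For any fixed p > 1/2 the majority-vote success probability climbs toward 1
# as the number of repetitions grows; e.g. p = 2/3 over 11 runs already gives ~0.878.
```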
So it's certainly plausible that you could take advantage of this to reduce the precision enough to get to the size bound Lipton mentioned, especially if, as implied, you had the might of a hundred Google engineers working on it. And the guy is so freaking smart and experienced in theoretical computer science that chances are he thought of all this and considered it not interesting enough to spell out for the people who will say "well actually."
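For what it's worth, a rough sketch of how that precision trade-off scales (the 50-qubit figure and the byte counts are my own illustration, not anything from the post): every one of the 2^n amplitudes pays the per-amplitude cost, so each halving of precision halves the total.

```python
# State-vector size at a fixed (hypothetical) qubit count for a few amplitude precisions.
N_QUBITS = 50

for label, bytes_per_amplitude in [("two 64-bit floats", 16),
                                   ("two 32-bit floats", 8),
                                   ("two 16-bit fixed-point values", 4)]:
    pib = 2 ** N_QUBITS * bytes_per_amplitude / 2 ** 50
    print(f"{label}: {pib:g} PiB")
# 16, 8, and 4 PiB respectively at 50 qubits
```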