Unfortunately, addressing those issues would do little to address the underlying cause: we have many more ways to amuse ourselves than a generation ago, most of which require less "reach" for a dopamine hit (social media, Netflix, video games, etc.).
Just to add some more motivation: in a typical physics undergraduate curriculum, you will spend roughly as much time doing homework as attending lectures. If you skip the exercises, you are quite literally skipping half of the education.
Essentially all of the theory research (specifically, lattice QCD calculations) since the previous white paper in 2020 has been conducted blinded, and at any rate, the deadline to be included in the theory average has already passed. It would take an act of extraordinary brashness to fudge the numbers now.
Since the article doesn't mention it: Ian Tregillis (the first author of the paper and a staff scientist at Los Alamos) also moonlights as a sci-fi/fantasy author. Personally I found The Milkweed Triptych a more compelling read than A Song of Ice and Fire, and it also has the benefit of being finished! (Here's a one-sentence hook: WWII, except with English wizards fighting Nazi X-Men.)
Well, then don't read it! While that's the most campy distillation of the premise, the writing is anything but. George R.R. Martin himself declared Tregillis "a major new talent".
> If it's 11:55, you would usually mentally subtract and conclude: the meeting is in 5 minutes. But the most probable estimate given the available information is actually 4'30"!
Admittedly I'm being a bit pedantic, but this isn't quite true. The expectation value might be 4'30", but the actual remaining time is as likely to be 4'59" as 4'30"; assuming it's 4'30" simply minimizes your expected error.
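For concreteness, here's the back-of-the-envelope version, assuming the display truncates the true time so the unseen seconds are uniformly distributed:

```latex
% Remaining time until 12:00 when the clock reads 11:55, with s the unseen
% seconds past the displayed minute (assumed uniform on [0, 60)):
\[
  T = 300 - s \ \text{seconds}, \quad s \sim \mathrm{Uniform}[0, 60)
  \qquad\Longrightarrow\qquad
  \mathbb{E}[T] = \int_0^{60} \frac{300 - s}{60}\,\mathrm{d}s = 270\ \text{s} = 4'30''.
\]
```

Since the distribution is uniform, the mean and the median coincide, so 4'30" minimizes both the expected squared and the expected absolute error; it just isn't any more probable than 4'59".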
You might as well have written that mathematicians should stop doing mathematics. If every mathematician were to work full time on formalizing theorems in proof assistants, then no living mathematician would ever do original research again -- there is simply too much that would need to be translated to software. And to what end? It's not as if people suspect that the foundations of mathematics are on the verge of toppling.
> Code is easy to share, easy to collaborate on [...] collaboration is so easy that publishing your 1/2 done work will often prompt others to do some of the tedious stuff
Here's an experiment anyone can try at home: pick a random article from the mathematics arXiv [1]. Now rewrite the main theorem from that paper in Lean [2]. Did you find this task "easy"? Would you go out of your way to finish the "tedious" stuff?
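For a sense of scale (a toy example only, not drawn from any actual paper): even a statement every working mathematician treats as obvious takes deliberate effort to state formally, and the snippet below only gets away with being short because mathlib already proves the lemma for us.

```lean
-- Toy illustration: "there are infinitely many primes" in Lean 4, stated
-- and then discharged by citing mathlib's existing Nat.exists_infinite_primes.
-- A main theorem from a random arXiv paper has no such ready-made lemma.
import Mathlib

example : ∀ n : ℕ, ∃ p, n ≤ p ∧ Nat.Prime p :=
  fun n => Nat.exists_infinite_primes n
```

Formalizing the headline result of a research paper means first formalizing every definition and supporting lemma it quietly relies on, which is precisely the "tedious stuff" at issue.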
> even horrible code is far better at documenting what it does than what you are describing
The "documentation" is provided by talking to other researchers in the field. If you don't understand some portion of a proof, you talk to someone about it. That is a far more efficient use of time than writing code for things that are (relatively) obviously true. (No shade on anyone who wants to write proofs in Lean, though.)
> the big benefit is parallel computing at a massive scale
The problem with this line of reasoning is that, even though a quantum system might have many possible states, we only observe a single one of them at the time of measurement. If you could somehow prepare a quantum system so that it encoded all N candidate solutions to your classical problem with equal weight, you would still need to run that experiment (on average) N times before measuring the correct answer.
Broadly speaking, quantum computing exploits the fact that states are entangled (and therefore correlated). By tweaking the circuit, you can make it so that incorrect solutions interfere destructively while the correct solution interferes constructively, making it more likely that you will measure the correct answer. (Of course this is all probabilistic, hence the need for quantum error correction.) But developing quantum algorithms is easier said than done, and there's no reason to think a priori that all classical problems can be recast in this manner.
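If it helps to see the interference picture concretely, here is a minimal classical simulation of that idea (Grover-style amplitude amplification) in plain NumPy; the qubit count and the marked index are arbitrary choices for illustration:

```python
# Toy state-vector simulation of Grover-style amplitude amplification.
# No quantum SDK involved; this just simulates the amplitudes classically.
import numpy as np

n_qubits = 3
N = 2 ** n_qubits            # size of the search space
marked = 5                   # the one "correct answer" we want to amplify

# Uniform superposition: every one of the N basis states has probability 1/N.
psi = np.ones(N) / np.sqrt(N)

iterations = int(round(np.pi / 4 * np.sqrt(N)))   # ~sqrt(N) oracle calls
for _ in range(iterations):
    psi[marked] *= -1                 # oracle: phase-flip the marked state
    psi = 2 * psi.mean() - psi        # diffusion: reflect amplitudes about their mean

print("P(measure marked) =", psi[marked] ** 2)    # ~0.94 instead of 1/8
```

After roughly sqrt(N) iterations the marked state's probability is close to 1, versus 1/N per shot for blind sampling -- which is also why this particular trick buys only a quadratic speedup, not an exponential one.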
I think that the big challenge is to recast a given classical computation as a quantum computation with a superpolynomial speedup.
I think that all classical problems can be cast as quantum computations, because quantum computation is just computation - I believe that one can implement a Turing machine using quantum gates, so arbitrary computation is possible with them.
The superpolynomial speedups are the crux, though. I wonder if these will be limited to a class of computations that have no physical realization - just pure maths.
From the New York Times: "How Polarized Politics Led South Korea to a Plunge Into Martial Law" [1]
> From the start [...] Mr. Yoon faced two obstacles.
> The opposition Democratic Party held on to its majority in the National Assembly and then expanded it in parliamentary elections in April, making him the first South Korean leader in decades to never have a majority in Parliament. And then there were his own dismal approval ratings.
> Mr. Yoon’s toxic relationship with opposition lawmakers — and their vehement efforts to oppose him at every turn — paralyzed his pro-business agenda for two years, hindering his efforts to cut corporate taxes, overhaul the national pension system and address housing prices.
and also
> Opposition leaders warned that Mr. Yoon was taking South Korea onto the path of “dictatorship.” In turn, members of Mr. Yoon’s party called the opposition “criminals,” and voters on the right rallied against what they called “pro-North Korean communists.”
> (Mr. Yoon echoed that language on Tuesday in his declaration of martial law, saying he was issuing it “to protect a free South Korea from the North Korean communist forces, eliminate shameless pro-North Korean and anti-state forces.”)
So basically, Mr. Yoon was unable to pass his agenda (as his party never had control of the legislative branch), and rather than continue to negotiate, he decided to impose martial law, label the opposition communists, and then ban the National Assembly from gathering (they gathered anyway).