Are we properly weighing the potential downsides of safetyism, including the risk of unfriendly nation states making faster progress than we can? I view these sorts of requirements with a skeptical eye.
LLMs aren't FUD; they are literally the future we're betting on to keep Microsoft here doing the things that Microsoft does. Whatever those things are.