Guns aren't AI, and I've yet to find a good explanation as to why AI is dangerous outside of science fiction where it becomes self-aware and takes over the military.
This is the core irony of the AI enthusiasts - by massively overstating the capabilities of AI, they provide the perfect ammunition to AI doomsters. If the AI enthusiasts focused on what can reasonably be done with AI today, they would have a lot less to worry about from regulators. Instead they keep making claims so ambitious that, actually... yes, we probably should heavily regulate it if the claims are true.
My cynical take is that this is actually the PR strategy at OpenAI (and others that had to jump on the train): by talking up vague "singularity" scenarios, they build up the hype with investors.
Meanwhile, they're distracting the public from the real issues, which are the same as always: filling the web with more garbage, getting people to buy stuff, decreasing attention spans and critical thinking.
The threat of job loss is also real, though we should admit that many of the writing jobs eliminated by LLMs are part of the same attention economy.
It's like answering "What are your thoughts on gun control?"
with "Yes, I am pro 2nd Amendment."
Like yes, of course, but are you implying carte blanche RPGs, howitzers, and fighter jets for everyone?
Because if so, then I absolutely don't trust you to make sure AGI doesn't screw up the world.
And if not, then maybe you're not so "pro 2nd Amendment" after all.