If you find the idea of climate change leading to societal collapse far-fetched, you must surely find the concept of a rogue AI even more so - the science is much more solid and certain on the climate (and I don't think the science says the scenarios that lead to societal collapse are as unlikely as you think).
I also think it says more about our own human failings than about the true risks of general AI that we imagine it more likely to go rogue and kill us all than to be more adept, benevolent, and capable at managing the complexity of society than our own feeble attempts.
That said, AI (or other future technologies) really can be a path to extinction.