I feel like declining human intelligence is a result of advancing machine intelligence. Computers are a force multiplier, so the societal pressure to build our own intelligence is reduced.
So the AGI/ASI problem might solve itself: we slowly become incapable of iterating on the problem while existing AI is not nearly advanced enough to pick up the slack.
It’s quite beautiful. Once a civilization tries to build machine intelligence, it slowly degrades its own capacity in the process, eventually losing all hope of ever achieving its goal - assuming it still understands that goal at that point. Maybe it’s an algorithm in the Universe to keep us from being naughty.