I never understood AGI to mean "better than humans". A lot of people assumed it could easily be made so, simply by throwing more silicon at it until it was, but being smarter than humans isn't what makes it "AGI".
To put this another way, suppose we create a computer program that is only as smart as a bottom-10% human (I'm not saying anyone has). You can't reasonably say that program is smarter than humans generally. But are you comfortable saying that bottom-10% humans lack any general intelligence at all? General intelligence doesn't mean extreme intelligence, and so Artificial General Intelligence doesn't either. You might say the term is more than the sum of its parts, which is fair, but I still dispute that superhuman ability was ever part of the definition of AGI. It was just a (thus far) failed prediction about AGI.
By that token, you could find non-human animals that are smarter than some percentage of humanity at a few tasks. Are those animals AGI?
Likewise, you could find software that is smarter than some percentage of humanity at a few tasks. Is that software AGI? Is AlphaGo AGI? Is the Google DeepMind AI gamer AGI?
My definition, and the one I found on Wikipedia, is "AGI [...] matches or surpasses human cognitive capabilities across a wide range of cognitive tasks." Being better than the bottom 10% of humans on some tasks doesn't really qualify to me.