Philosophically, compression and intelligence are the same thing.
The decompression (which is the more important part) involves original data of a certain size, paired with an algorithm, that together can produce data of much bigger size and correct arrangement, so it can be fed into another system.
Much in the way that there will probably be some algorithm, along with a base set of training data, on which something like reinforcement learning is run (possibly including loops of simulating some systems and learning the outcomes of experiments), eventually producing something that resembles a human intelligence: the vocal/visual dataset arranged correctly enough that we humans believe we are looking at something intelligent.
The question is how much you can compress something, which is a measure of the intelligence of the algorithm. A hypothetical all-powerful AGI == an algorithm that decompresses some initial data into an accurate representation of reality in its sphere of influence, including all the microscopic chaotic effects, in perpetuity, faster than reality happens (meaning the decompressed output for a time slice contains more data than reality itself does in that slice).
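For a concrete (and very loose) version of "compression ratio as a proxy for how well the algorithm models the data", here is a minimal Python sketch comparing standard-library compressors on the same input. The input and numbers are purely illustrative; this is just the intuition behind benchmarks like the Hutter Prize, not a claim about AGI:

```python
import bz2
import lzma
import zlib

# Toy illustration: on the same input, the compressor with the better
# internal model of the data produces the smaller output, so the achieved
# ratio is a crude proxy for how much structure the algorithm "understands".
# The input here is deliberately repetitive; real text compresses far less.
text = ("The quick brown fox jumps over the lazy dog. " * 5_000).encode()

for name, compress in [("zlib", zlib.compress),
                       ("bz2", bz2.compress),
                       ("lzma", lzma.compress)]:
    out = compress(text)
    print(f"{name:4s}: {len(text):>7} -> {len(out):>6} bytes "
          f"({len(text) / len(out):,.0f}x)")
```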
LLMs may seem like a good amount of compression, but in reality they aren't that extraordinary. GPT-4 is probably to the tune of ~1TB in size. If you look at Wikipedia compressed without media, it's something like 33TB -> 24GB (roughly a 1,400:1 ratio). At about the same compression ratio, it's not far-fetched to see GPT-4 as pretty much human text compressed, with a VERY efficient search algorithm built in. And if you look at its architecture, you can see that it is just a fancy map lookup with some form of interpolation.
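To make the "fancy map lookup with some form of interpolation" reading concrete, here is a minimal NumPy sketch of a single attention step treated as a soft key/value lookup; the sizes and values are arbitrary, and this is an illustration of the general mechanism, not GPT-4's actual architecture:

```python
import numpy as np

# A query is compared against stored keys, and the output is a
# softmax-weighted interpolation of the stored values.
rng = np.random.default_rng(0)
d = 16                              # embedding width (arbitrary)
keys = rng.normal(size=(8, d))      # 8 stored "entries"
values = rng.normal(size=(8, d))
query = rng.normal(size=(d,))

scores = keys @ query / np.sqrt(d)               # how well the query matches each key
weights = np.exp(scores) / np.exp(scores).sum()  # soft "lookup" weights
output = weights @ values                        # interpolate between stored values

print(weights.round(3))  # not a hard lookup: mass is spread across entries
print(output.shape)      # (16,) -- a blend of the matched values
```

The softmax never selects a single entry; the result is always a weighted blend of stored values, which is the "interpolation" part of the claim.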
> accurate representation of reality in its sphere of influence, including all the microscopic chaotic effects, in perpetuity, faster than reality happens
This sounds like a Newtonian universe. Reality has been proven to be indeterminate before observation, and assuming there is more than one observer in the universe, your equating of data compression and full reality simulation with 'absolute intelligence' becomes untenable.