It's not black or white. Most of what we do is probably System 1; most of the time we are mostly meme machines, and so is a good part of the activity in some sectors.
But we are also able to make steps forward: intuition, hard step-by-step reasoning, finding connections between dots, etc. GPT can do some of that, and at some point along the road someone must decide whether we have reached somewhere new, even if completing the full journey may or may not be possible in the foreseeable future.
Indeed. What has caught researchers off guard is the way System 2 properties seem to appear as emergent phenomena in LLMs. This is also what has prompted people like Hinton and Sutskever to make the condensed point that statistical modelling and understanding are (hypothetically) simply points along a spectrum.