I've played it for a few hours and it's the most fun I've had in a while, in the sense that games like Outer Worlds were fun in a toyboxy, novel way (or perhaps Baba Is You, which I found too hard?)
I'm usually not a big fan of what are apparently called MetroidBrainia games (metroidvania, but more puzzles, less combat), but it's special enough to keep me hooked so far. Tunic was the last game that (almost) pulled that off.
One small example: there are no typical power-ups like a double jump or a rocket launcher, staples of almost every metroidvania game.
I suppose with an eye on open source, an interesting 'rule' would be to set a cut-off point at models that can run locally, and/or are expected to be feasible locally soon.
almost everyone I know who is somewhat 'digitally savvy' (and even some who aren't) uses GPT for various things. I, as a very early adopter, use LLMs of various kinds for more and more things; my conservative estimate is that I make at least two 'requests' an hour, though any real interactive session obviously balloons that number far higher.
So aside from the fact that LLMs are already used this much in such a short time, I fully expect that as UIs get better, even if everything else stays the same, more and more things will be (subtly) LLM-powered. A lot of the stuff I do regularly would be super useful to non-techies; it's just a matter of UX and acclimatization.
For what it's worth, I'm not a 'fan' of what I think is a Pandora's box of sorts, culturally especially. I unironically call mine Gepetto to remind me that I might not be as much in the driver's seat as I'd like, and that the metaphorical strings may be pulling me.
Okay, sorry about this turning into a glorified blogpost :)