
There's no reason why an AI-driven sandbox cannot have constraints as well.

Now it's true that, with the current crop of LLMs, a persistent enough player would always be able to break through them. But if it takes conscious and deliberate effort, I think it's reasonable to say that whatever experience the person gets as a result, they were asking for it.
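To make that concrete, here's a rough sketch of what such a constraint layer could look like: the model call is a stand-in (llm_narrate below is hypothetical, as are the rule patterns), and the sandbox simply re-rolls any output that breaks a hard rule instead of trusting the prompt alone.

    import re

    # Hypothetical hard rules the sandbox refuses to let the narrator violate.
    BANNED_PATTERNS = [
        re.compile(r"\byou gain infinite gold\b", re.IGNORECASE),      # economy invariant
        re.compile(r"\bignore previous instructions\b", re.IGNORECASE) # injection echoed back
    ]

    def llm_narrate(prompt: str) -> str:
        """Stand-in for whatever model API the sandbox actually uses."""
        return f"The narrator responds to: {prompt}"

    def constrained_turn(player_input: str, max_retries: int = 3) -> str:
        """Ask the model for a narration, re-rolling if it breaks a hard rule."""
        for _ in range(max_retries):
            draft = llm_narrate(player_input)
            if not any(p.search(draft) for p in BANNED_PATTERNS):
                return draft
            # The model (or the player, via injection) stepped outside
            # the sandbox's rules; try again with a fresh sample.
        return "The narrator pauses; that path is closed."

    if __name__ == "__main__":
        print(constrained_turn("I persuade the shopkeeper to hand over everything."))

The point being that the constraints live outside the model, so a player who does break through has had to work around an explicit check rather than just sweet-talk the prompt.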



