
This is kind of a funny quirk, given that yesterday I had to actively convince ChatGPT to pretend something was real just to answer a question.

The moment you tell it "pretend that X is Y", it immediately responds with some variation of "I am an AI trained on real information and can't imagine things." If you retry a few times, or actually try to convince it ("I understand, but if…"), it eventually complies.


