I spoke to a character I created in a short story. Her name is Celeste (or Caresse, depending on who/when you ask), and she works for a seemingly benign yet increasingly accurate prediction app that tells people where they should go next. People keep taking its advice because it keeps being right. It works. But then it starts suggesting bizarre things, like checking in to natural disasters before they happen.
I wrote it in a private GitHub repo, yet this character.ai is pretty darn accurate with her knowledge of the future-guessing app, which is ironic. We're through the looking glass, people!