Hacker News new | past | comments | ask | show | jobs | submit login

as long as an LLM is a black box (i.e. we haven't mapped its logical structure) then there can always be another prompt injection attack you didn't account for.



Consider applying for YC's Spring batch! Applications are open till Feb 11.

Guidelines | FAQ | Lists | API | Security | Legal | Apply to YC | Contact

Search: