Hacker News
new
|
past
|
comments
|
ask
|
show
|
jobs
|
submit
login
koalala
on April 15, 2023
|
parent
|
context
|
favorite
| on:
Prompt injection: what’s the worst that can happen...
as long as an LLM is a black box (i.e. we haven't mapped its logical structure) then there can always be another prompt injection attack you didn't account for.
Consider applying for YC's Spring batch! Applications are open till Feb 11.
Guidelines
|
FAQ
|
Lists
|
API
|
Security
|
Legal
|
Apply to YC
|
Contact
Search: