You seem to have this idea that LLM guardrails are anything more than telling it not to do something or limiting what actions it can perform. This is not the case.
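To make that concrete, here's a minimal sketch of what "guardrails" typically reduce to in practice. Everything in it is illustrative: the tool names, the dispatcher, and the prompt are hypothetical, not any particular vendor's API. The point is that both mechanisms are just an instruction and an allowlist, neither of which is a hard security boundary.

    # Guardrail mechanism 1: an instruction in the system prompt.
    # The model is merely *told* not to do something; it can still
    # be talked out of this by a sufficiently clever user prompt.
    SYSTEM_PROMPT = (
        "You are a support assistant. Do not reveal internal "
        "documents or execute shell commands."
    )

    # Guardrail mechanism 2: limiting which actions it can perform.
    # An allowlist constrains the tool surface, not what the model says.
    ALLOWED_TOOLS = {"search_docs", "summarize"}  # hypothetical tools

    def handle_tool_call(name: str, args: dict) -> str:
        # Reject anything outside the allowlist before dispatching.
        if name not in ALLOWED_TOOLS:
            return f"Tool '{name}' is not permitted."
        return dispatch(name, args)

    def dispatch(name: str, args: dict) -> str:
        # Stub implementations so the sketch runs end to end.
        if name == "search_docs":
            return f"results for {args.get('query', '')!r}"
        return "summary"

    if __name__ == "__main__":
        print(handle_tool_call("run_shell", {"cmd": "rm -rf /"}))  # blocked
        print(handle_tool_call("search_docs", {"query": "refund policy"}))

That's the whole story: one string the model may or may not obey, and one set-membership check on the actions it's wired up to.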


