Hacker News
og_kalu on Oct 2, 2023 | on: Bing ChatGPT image jailbreak
Aligning LLMs doesn't make any sense because aligning intelligence as we know it doesn't make any sense. And LLMs are nothing if not made in our image.