two_in_one on Oct 2, 2023 | on: Bing ChatGPT image jailbreak
You cannot have one kid who will please everybody. That's the problem. So they have to lobotomize their single model so that it at least does not offend anybody.