hutzlibu on June 7, 2023 | on: I'm afraid I can't do that: Prompt refusal in gene...
How would your behavior change if some AI black box might decide, based on unknown parameters, that it is time to kill you?
The idea of an idealistic killbot overlord sounds a little bit dangerous to me.
dingledork69 on June 7, 2023
I'd probably plot to destroy the killbot, which would probably result in it trying to kill me. Doesn't sound ideal.
cwkoss on June 8, 2023
Well, I assume in that situation my behavior would stop altogether.