
The difference is, a motor would not provide you with the means of committing a crime that you don't already have.

An LLM could educate you on how to commit crimes that you would otherwise have no idea how to carry out.

But crimes in general are a bit of an extreme example, in my opinion. A better example of the risks of unmoderated LLMs would be something that isn't illegal, like manipulating people.

A sufficiently advanced unmoderated AI could provide detailed, tailor-made instructions on how to gaslight, scam, and take advantage of vulnerable people.

And unlike outright crimes, the danger here is that there are no legal consequences, so the temptation extends to a much wider group of users (including, and especially, kids).




> The difference is, a motor would not provide you with the means of committing a crime that you don't already have.

I posit that only being able to run away from a bank robbery on foot would indeed prevent someone from successfully pulling it off.



