This is like saying, “I don’t understand how airplanes fly, so I’ll happily board an airplane designed by an LLM. Reality is determined by how much I know about it.”
No, the other way around. I’m saying it’s not a smart take to claim “a safe airplane cannot be built if LLMs were used in the process in any way, because reasons.” The safety of the airplane (or more generally, the outcome of any venture) can be measured in ways other than leaning on a blanket rule that an LLM must never be used for help at any stage because it is not always correct.