Hacker News

I think this view is incredibly dangerous to any kind of skills mastery. It has the potential to completely destroy the knowledge economy and eventually degrade AI due to a dearth of training data.



It reminds me of clean-room implementations, where engineers must write code without ever having seen the original. A human who read a bunch of code and then wrote something similar, without copying/pasting or consulting what they read, would be protected, so arguably an AI should be too.


Okay, that’s an argument from consequences, but is the view factually wrong?


I mean, those consequences are why patent law exists. New technology may require new regulatory frameworks, as it has since the railroads. The idea that we cannot amend the law, and that pedantically saying "well, this isn't illegal right now" excuses something unethical and harmful to the economy, is in my opinion deeply flawed.


Is it really harmful to the economy, or only to entrenched players? Coding AI should benefit many people, the way open source does. It opens up source code even further, which should be a dream come true for the community. It is also good for learning and lowers the barrier to entry.

At the same time, it does not replace human developers in any application; it may be a long time before we can go on vacation and let AI solve our Jira tickets. Remember that self-driving has been under intense research for more than a decade now, and it is still far from Level 5.

It's a trend that holds across fields: AI is a tool that stumbles without a human to wield it; it does not replace humans at all. But each new capability invites us to launch new products and create jobs. Human empowerment without human replacement is what we want, right?




