
If the thing you want it to do isn’t too unusual, LLMs can guide you into doing something vaguely sensible.

This is the one area of AI I’m actually pretty positive about. Computing largely passed people by, and computers have not lived up to their potential. I could see this approach making computers more useful for a large number of people.

Especially semi-technical people whose specialty isn’t programming.




> Especially semi-technical people whose specialty isn’t programming.

That would be the worst kind. Imagine fixing those people's bugs: LLM-hallucinated code that runs but has wrong logic all over the place.

I've reduced my use of LLMs for coding these days. I only ask them to generate templates, repetitive code, and sample data.


Cool how we're far enough into the adoption cycle that "I've reduced my use of LLMs" is a mainstream thing to hear among sophisticated techies.


Even with GPT-4, the most useful use cases are generating stuff that is too repetitive and daunting to type, refactoring code that interns wrote, and writing documentation.

What I love most is drafting project specs from users' requirements and generating user stories.

Or letting it write LeetCode-style problems that are too boring to do manually.

The actual coding is still best done by experienced engineers' biological brains.


Some of us haven’t even incorporated LLMs into our development workflows yet. :)


Some of us looked at it and decided the results were so bad it was a net loss to even try.


Yep. Even the few times I use it via the ChatGPT interface, I spend as much time fixing its bad assumptions as I save.


Some of us looked at it and decided the results were so bad it was a net loss to even try, but it was just so cool...


I do not envy a non-programmer stuck with buggy code generated by an LLM.



