Hacker News

I think that confuses doing versus delegating. Delegation is easy to do via a text-box, because you're just kicking the interactive complexity-can down the road to someone else, often in a way which can be problematic even with actual humans.

For example, a project-manager or executive could verbally delegate "make a new registration page for the site" and "needs more rounded corners", either to an AI or to an employee or offshore contractor.

However that's not the same as trying to program exclusively by typing (or dictating) prose to a text-box. ("Page down more. Go to method pee-reg underscore apply. Show me its caller methods. Go to caller method two. Type the following into line 7 position 43...")

There might be some parallels we can draw with the last few decades of "programming business logic will be replaced by drawing diagrams" predictions.




> I think that confuses doing versus delegating.

This is exactly it, thanks for putting it down clearly.

Which is also why reports of the death of programming as a profession are greatly exaggerated. You're not being paid to write code, you're being paid to make decisions. Code is easy, or at least much easier than natural language.


You're also paid to tease the REAL requirements out of PMs/management/users/etc.

Most of the time, what is being asked for, on its face, is not what is actually wanted, is not as simple as spelled out, has some A-B tradeoffs to decide, or may not be worth it given the side effects.

If a developer isn't asking multiple questions per feature, they deserve to be replaced by an LLM.


They won’t be replaced by an LLM but by another person using an LLM, most likely a dev, and possibly the same person who is asked to use an LLM to increase productivity. I see more and more companies/institutions adopting LLMs and training their workforce to use them. It will be interesting to see how all this plays out.


Seriously. I think posts/articles/etc. about AI replacing Software Engineering jobs are exaggerated and probably driven by jealousy or sadism. Just ignore those and move on.

(It was very depressing to believe that Software Engineers would lose their jobs)


Views. I'm inundated with AI content but most of it lacks any substance. It ranges from "wow GPT is really dumb and can't behave like this supergod AGI I just made up" to "wow GPT will take over all our jobs in 3 years, it's so powerful".


> really dumb [...] take over all our jobs

Perhaps worse than the vacillation between getting terrible answers and great answers: When you simply can't tell which kind of answer it is, not until you've sunk a bunch of effort validating or implementing it. (Perhaps finding that the system invented some core fake APIs, non-existent citations, or algebra errors.)

Almost an echo of P/NP categorizations: It's tough when the effort of fully verifying a proposed answer is too close to the effort of just solving it normally.
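One cheap first pass before sinking that verification effort (a hypothetical sketch, not something from the thread — the `check_imports` helper and the example snippet are mine) is to mechanically check that the names a generated Python snippet imports actually resolve. This catches entirely invented modules and functions, though it says nothing about logic errors, which still need the expensive kind of review:

```python
import ast
import importlib

def check_imports(source: str) -> list[str]:
    """Return imported names in `source` that fail to resolve.

    A shallow sanity check for LLM-generated code: it flags
    hallucinated modules/functions, but cannot judge correctness.
    """
    problems = []
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, ast.Import):
            for alias in node.names:
                try:
                    importlib.import_module(alias.name)
                except ImportError:
                    problems.append(alias.name)
        elif isinstance(node, ast.ImportFrom) and node.module:
            try:
                mod = importlib.import_module(node.module)
            except ImportError:
                problems.append(node.module)
                continue
            for alias in node.names:
                # Module exists; does the imported attribute?
                if not hasattr(mod, alias.name):
                    problems.append(f"{node.module}.{alias.name}")
    return problems

# One real import, one invented function, one invented module:
snippet = (
    "from json import dumps\n"
    "from json import make_schema\n"
    "import not_a_real_module\n"
)
print(check_imports(snippet))  # ['json.make_schema', 'not_a_real_module']
```

Passing this check is of course no guarantee the code works; it only rules out the most blatant fabrications.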


The common occurrence of hallucinations makes it hard for me to believe anyone will be using LLMs to produce code anywhere outside of shops that really don't care about errors. Until they fix that, code is a use case where even slight errors make the output useless.


I'd add architecture, design patterns, object orientation, security (!), and maintainability to that list.


"design patterns" and especially one specific pattern such as "object orientation" are just part of code, i.e., easy, according to GP.


I have been using DALL-E, and testing the DALL-E ChatGPT plugin. Even though both are supposedly natural-language interfaces, I find I approach the DALL-E prompt more like writing a formula than real language. Using the GPT plugin is like delegating to a designer to write the prompt for you. Personally, I don't like the results of that compared to what I would make myself.



