
The technology hadn't clicked for me either. Today I had to write a script that would have taken me maybe 30 minutes or so on my own. I asked ChatGPT (GPT-4) to write it for me, and it got it right on the first try. I just spent a few minutes checking over the code.

It truly is magical when the code just runs. Later I asked it to make several non-trivial changes to the code based on more requirements I thought of, and it aced those on the first go as well. Again, checking the code took me a negligible amount of time compared to how long it would have taken to write the code on my own.

I do think humans will slowly get worse at the lower layers of the computer stack. But I don't think there's anything inherently bad about that. Compilers are also doing the work for you, and they make you worse at writing assembly - but would you rather live in a world where everyone has to hand-write tedious assembly code?

Maybe, in the future, writing Python will be like what writing assembly is today. We might go down the layer cake once in a while to work with Python code. That does not mean we give up on the gains we get from whatever layers end up being put on top of Python.




The compiler is a deterministic tool (even undefined behaviour is documented as such). So you can spend some time understanding the abstractions provided by your compiler, and then you know exactly what it is going to do with your code.

What is the equivalent of this for LLMs? Is there any way generative models can guarantee that a given prompt will 100% translate to a given piece of assembly? As far as I understand, no. And given the way autoregressive models are built, I don't think this is possible.
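A minimal sketch of what I mean (my own illustration, assuming the pre-1.0 openai Python client and an API key in OPENAI_API_KEY): the best you can do is check determinism empirically, because even at temperature 0 nothing documents a guaranteed prompt-to-output mapping the way a compiler documents what it does with your source.

    # Illustration only: compare two runs of the same prompt.
    # Nothing specifies that these must be identical, even at temperature=0.
    import openai

    PROMPT = "Write a Python function that reverses the words in a sentence."

    def generate(prompt: str) -> str:
        resp = openai.ChatCompletion.create(
            model="gpt-4",
            messages=[{"role": "user", "content": prompt}],
            temperature=0,  # near-greedy decoding, still no formal guarantee
        )
        return resp["choices"][0]["message"]["content"]

    first = generate(PROMPT)
    second = generate(PROMPT)
    print("identical outputs:", first == second)  # may well print False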

I agree that they are useful for one-offs like you said, and their ability to tailor the solution to your problem (as opposed to reading multiple answers on Stack Overflow and piecing it together yourself) is quite compelling. But for anything that is even slightly consequential, you are going to have to read everything it generates. I just can't figure out how it integrates into my workflow.


This is nice, but if you actually like writing code, rather than instructing someone in natural language what you want to have written, then this is not an attractive prospect.

It’s like telling a novelist that they can produce novels much faster now because they only have to think of the rough outline and then do some minor editing on the result. For most, this is antithetical to why they became a novelist in the first place.


You're talking about the distinction between doing something because you love it and doing something as a means to an end.

It's a funny distinction! Knowing something can be automated can take some of the fun out of it, but there are plenty of people who still do stuff for fun when they could buy the end result more cheaply.

For employers, though, it's all a means to an end. Go write for the love of it on your own time.


Except that many people don’t get into their profession as a mere means to an end. They chose the profession because they like it, and they want to spend their lives doing stuff they enjoy. Being employed just as a means to an end is not worth the large amounts of time you spend doing it, if you can help it in any way. Let’s not normalize a dystopia here.


And from another thread a couple of days ago... https://news.ycombinator.com/item?id=35235534

> I was recently laid off, and I know a few other people laid off. I have years of doing projects and contributing to OSS and being a technically curious learner. I found a new job much faster than my peers who admittedly joined tech for the money and don’t care to learn or grow beyond their next pay raise.

There is a fairly consistent chorus of people getting into software development not because they enjoy the intellectual challenge it presents, but because of the earning potential.

As someone who does enjoy software development (I chose this path well before the dot-com boom), I believe we overestimate the number of people who enjoy it, as opposed to just grinding through writing some code; if something else paid as well, many would jump ship in a heartbeat.


The dystopia is already normal.

The firm can't really afford to care too much about why its workers entered their professions. The firm has to care about the cost of its inputs and margin lest it be devoured by a competitor or private equity.


This subthread started by welcoming the idea that you can be more efficient by spending less time writing code and more time prompting an AI and double-checking what it produces. My point is that that’s not an attractive outlook for many software developers, and as one of them I certainly don’t welcome it. From that perspective, the progress in AI may turn out not to be a benefit for those software developers, in terms of job satisfaction.

The fact that companies may see that differently is beside the point, and I don’t particularly expect them to care for my preferences. I will however certainly continue to choose employers that happen to accommodate my preferences.


The article compares this to Stack Overflow, but this comment makes it look more like a comparison to compilers, which is a much bigger deal than some website and actually worth paying attention to.

Anyway, people still write assembly kernels; they just only do it for the cases that really matter. And there are a lot more coders than there were back when every program was written in assembly. So it seems like great news.


Your reply might get me to pay OpenAI to use GPT-4 lol



