No need to get so upset. It's like saying that reading a book will help you learn about a domain faster than having to discover everything by yourself.
Yes, self-learning is better, but it takes much longer, whereas a tutor saves a lot of time (and companies often don't have such resources, which is where AI and books fill the gap).
I strongly believe that LLMs are a serious aid for programming, helping programmers onboard onto their projects faster.
Regarding more exotic tech: as a cousin of ChatGPT, Google Gemini used to be very bad, but with Gemini 1.5-Pro you can feed it very long documents, which is super helpful for specific implementations (e.g. exotic processors), and it's really not bad at programming, or at least at pushing you in the right direction.
Of course it's not autonomous (and it's unlikely to become so on complex projects in the short term), but it reduces onboarding time, which was the point raised in the conversation.
A dev paired with an LLM is much more productive.
I suppose you are concerned that it may cost people their jobs in the long term. I am as well, but we still have some time ahead of us.
I don't like this situation either, but I have to recognize that it is a very helpful pair-programming tool.
Reading the documentation is basically never the hard part. If it's where most of your work is going, you're not doing hard work. Gemini 1.5-Pro may be able to summarize documentation, but it's what isn't there that hurts you. It may make for a helpful reference, but that wasn't the initial claim. The claim was that "domain-specific knowledge can be solved with a $20 ChatGPT subscription", and being frank, that's just stupid. The difference between a smart person and a domain expert is orders of magnitude more than an LLM is able to paper over.
It's a struggle to replace even the most trivial paper pushing with an LLM. I'm sure we'll find something one day, but it will have to do a hell of a lot more than an LLM can.