There is a significant contingent of influential people who disagree. See "Why the future doesn't need us" (https://www.wired.com/2000/04/joy-2/), Ray Kurzweil, etc.
This is qualitatively different from what the Luddites faced; it concerns all of us and touches the essence of what makes us human. This isn't the kind of technology that has the potential to make our lives better in the long run — it will almost surely be used for more harm than good. Not only are these models trained on the collectively created output of humanity, their key application areas are to subjugate, control, and manipulate us. I agree with you that this will not happen immediately, because of the very real complexities of physical manufacturing, but if this part of the process isn't stopped in its tracks, the resulting progress is unlikely to be curtailed. At minimum, I think that using all of our data and output to train these models is unethical, especially if the resulting models are not freely shared and made available.
It seems we are running out of ways to reinvent ourselves as machines and automation replace us. At some point, perhaps soon, the stated goals of improving quality of life and reducing human suffering will ring false. What is a human being if we have nothing to do? Where are the vast majority of people supposed to find meaning?
I don't see why machines automatically producing art takes away the meaning of making art. There are already a million people, far better at art than you or I will ever be, producing it for free online. Now computers can do it too. Is that supposed to take away my desire to make art?
I've been lucky enough to build and make things and to work in jobs where I can see the product of my work — real, tangible, creative, and extremely satisfying. But I can only do this work as long as people want and need it done.