
You're not being cynical. You're being overly optimistic here about creative professionals.

The creative professional IS a target for replacement. What LLMs do best is specifically what the "creative professional" does. LLMs excel at creative English text even more than at coding or mathematics. This is because creative output is trivial when compared with technical skills.

I don't think the "creative professional" will get replaced, though, because the task is usually so trivial that positions like these largely exist only because of politics. You convince the people around you, and yourself, that your "creativity" has no peer.

It amounts to this: I want a photo-realistic animation of a cowboy with blue skin and a laser cannon riding a unicycle through the Grand Canyon with a dwarf chasing him. This sentence is trivial to come up with; ANYONE can come up with thousands of variations and tweak the sentence in various ways. No special talent needed here.

A creative director simply comes up with high level instructions (like the one above) THEN the ultra hard part is making that high level instruction a reality. That is HARD and as of now it looks like both skills are being replaced by AI.

But like I said, the director's position is in actuality largely political. Thus his position is safe, even though what he actually does is EASY and therefore a prime target for replacement.




> A creative director simply comes up with high level instructions (like the one above) THEN the ultra hard part is making that high level instruction a reality. That is HARD and as of now it looks like both skills are being replaced by AI.

This is the opposite of what I see in reality right now. The hard part is to make the artists do exactly what you want, down to the subtle but meaningful details. That happens because the high-level instructions don't have enough capacity to deliver the full meaning. This is the same with generative models: they're extremely hard to control with the "prompt engineering" gimmick; it's only fine when you are OK with random output. Besides, they are purely functional and lack the feedback mechanisms a creative director usually employs with artists.

That's why people are trying to make the techniques more complex - large animation houses experiment with training their own model architectures and software. This is the way 3D CGI was born - simple at first, and yes, it got plenty of doomposting early on (we are getting replaced by computers!), until it was clear that the field had become extremely technical and complex; it now even has dozens of specializations inside it.

All entertainment is ultimately based on novelty, as the human brain is really good at distilling meaning from the ocean of information. If you start with little meaning (your example - "a cowboy with blue skin and a laser cannon riding a unicycle through the Grand Canyon with a dwarf chasing him"), people get bored - no matter how much randomized artsy-looking stuff you add around it.

If you think that AI can generate all meaning that is relevant to people, I don't buy it, as it doesn't have the same training material. It's trained on the result and is forced to reverse engineer what moves people; reverse engineering is a fundamentally more costly and opaque task.


Train the AI exclusively on well-lauded texts. Then you can get a generative AI that moves people.

The problem with "reverse engineering people" is that you don't need to: the things they like in any era follow predictable and generic patterns. These patterns can be encoded into AI, provided we find enough curated training data.

> The hard part is to make the artists do exactly what you want, down to the subtle but meaningful details.

This is because the artist can't read your mind. It has nothing to do with your skill or the artist's skill level. That feedback loop you are alluding to is more a reflection of the lack of clarity in your instructions or your imagination.

You thought what you imagined looked good, but the artist, in following your directions, created something that showed you how flawed your imagination was.


> The problem with "reverse engineering people" is that you don't need to: the things they like in any era follow predictable and generic patterns. These patterns can be encoded into AI, provided we find enough curated training data.

That only works to a certain degree. For the infamous example, try making GPT output the correct number of asterisks; you will get mixed results. You don't have any problem with typing exactly 1589 asterisks, because you run the stateful counting algorithm in your head. GPT has no idea about the algorithm - it has to reverse engineer it from the text, and can only extract a vague correspondence between a number and a string of about this or that length. You don't give humans examples to reverse engineer; you teach them to count.

This is the simplest example; it might even learn to count eventually, as it's far more capable in certain aspects. But as the dimensionality of the task grows, the amount of resources and training data required to reverse engineer it grows much faster.
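
To make the counting point concrete, here's a minimal Python sketch; generate() is a hypothetical stand-in for whatever LLM call you'd use, and only the deterministic side actually runs:

    def exact_asterisks(n: int) -> str:
        # A stateful counting algorithm: trivially correct for any n.
        return "*" * n

    def count_is_correct(output: str, n: int) -> bool:
        # The only check that matters for this task: exactly n asterisks?
        return output.count("*") == n

    # generate() is a hypothetical LLM call; per the "mixed results" above,
    # the check below would often fail for a model's output.
    # print(count_is_correct(generate("Print exactly 1589 asterisks"), 1589))
    print(count_is_correct(exact_asterisks(1589), 1589))  # True, by construction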

Sure, it can spot some patterns and that can look good, but some things are just plain invisible in the result - you will have a hard time making it learn higher-level concepts, because they depend heavily on hardwired things like the dumber parts of neural circuitry and biochemistry in humans, which the model doesn't have.

It's like trying to take a photo in a dark room - no matter how much you improve the sensitivity of your camera, there might not be a single photon to capture.

> This is because the artist can't read your mind.

Yes, this is what I mean by the limited capacity of a simple textual description. It's a fundamental limitation - natural language is just poorly suited for detailed conceptualization. A sketch, or a conceptual diagram, or other higher-order control methods have far more capacity to explain your intent, and that's the direction those models are moving in. At which point their usage is nothing like "type something simple and receive the result".


The asterisks thing is another issue. LLMs don't need to do this to replace directors.

>Yes, this is what I mean by the limited capacity of a simple textual description. It's a fundamental limitation - natural language is just poorly suited for detailed conceptualization.

Except LLMs can accept sketches as input. The higher-order methods of communication are covered by encoders.


I think you're conflating imagination and creativity with taste. Your example might be cute and funny, but as you yourself alluded to, it's completely bland and tasteless. It's maybe good enough for a "meme dump" but that's all.

Or maybe I'm just not sure what point you're trying to make?

All fiction writers suck? I don't know how much fiction you read, but it's incredibly varied. There's a lot writers need to think about and control in the reader (while accounting for multiple reader profiles) beyond just an idea. But the ideas themselves need to be coherent.

Creative directors suck? It's incredibly difficult being a creative director, having to organise multiple people to perform coherently, and it's much harder still for whole teams.

I think it's natural for humans to reduce and simplify things we don't engage with every day, or our brains would be overloaded. So we just handwave it off. We do it to other people too, unfortunately... Remember how complex your life (internal and external) is; others' lives are equally complex and nuanced.


>Creative directors suck? It's incredibly difficult being a creative director, having to organise multiple people to perform coherently, and it's much harder still for whole teams.

Difficult in terms of effort. Not difficult in terms of skill. Make no mistake, the quality of a movie is more the sum of the quality of its parts than it is the creative director. Who wrote the script? Who did the digital effects? Who did the lighting? Who did the editing?

The director did the hard work of picking the people and issuing orders.

>I think it's natural for humans to reduce and simplify things we don't engage with every day, or our brains would be overloaded. So we just handwave it off. We do it to other people too, unfortunately... Remember how complex your life (internal and external) is; others' lives are equally complex and nuanced.

Except I am more or less a director. Not one for movies but for a company.

There is a difference between handwaving something off and being delusional about your own role in the world. Directing is hard work, and management is hard work. But none of these things are skilled work.

>All fiction writers suck? I don't know how much fiction you read, but it's incredibly varied. There's a lot writers need to think about and control in the reader (while accounting for multiple reader profiles) beyond just an idea. But the ideas themselves need to be coherent.

It is certainly easier to create an entire space opera in writing than it is to do it via a movie. Writing is skilled work in terms of one skill only: your ability to write. Every other aspect of it is hard work but, unfortunately, unskilled work.

I realize there are complex plots, paradoxical stories, and imaginative settings, and that the pacing of a story is important as well. But none of this really requires skill, just time and deep thought to come up with. Plenty of the most popular authors never had a writing background or talent.

I would say again that these director positions, while in principle easily replaceable, are not replaceable in practice due to politics. A director or CEO is where he is mainly due to politics. Politics is, unfortunately, a skill with aspects that would need to be imitated not only by AI but by robotics as well, and it's simply a gateway into the role, with no relation to the actual job requirements.

It's hard to imagine a computer issuing the exact same orders as a director, but with LLMs computers are really close to doing that in principle. The issue is, as I mentioned, the social and political aspect of directing, which cannot be replaced yet.



