You seem to be misunderstanding the legalities at work here: reaching out to her multiple times beforehand, along with tweets intended to underline the similarity to her work on Her, demonstrates intention. If they didn’t think they needed permission, why ask for permission multiple times and then yank it when she noticed?
Answer: because they knew they needed permission after working so hard to associate with Her, and they hoped, in traditional tech fashion, that if they moved fast and broke enough things, everyone would have to reshape around OA's wants rather than around the preexisting rights of the humans involved.
> If they didn’t think they needed permission, why ask for permission multiple times and then yank it when she noticed?
One very easy explanation is that they trained Sky using another voice (this is the claim, and there's no reason to doubt it's true), wanting to replicate the style of the voice in "Her", but would have preferred to use SJ's real voice for the PR impact that could have had.
Yanking it could also easily be a pre-emptive response to avoid further PR drama.
You will obviously decide you don't believe those explanations, but to many of us they're quite plausible; in fact, I'd even suggest likely.
(And none of this precludes Sam Altman and OpenAI being dodgy anyway)
I actually believe that's quite plausible. The trouble is, by requesting permission in the first place, they demonstrated intent, which is legally significant. I think a lot of your confusion comes from attempting to apply pure logic to a legal issue. They are not the same thing, and the latter relies heavily on existing precedent, of which you may, it seems, be unaware.
Because under the current justice system and legislative framework, bringing a case that requires discovery and a trial would probably cost hundreds of thousands to millions of dollars.
Maybe (maybe!) it’s worth it for someone like Johansson to take on the cost of that to vindicate her rights—but it’s certainly not the case for most people.
If your rights can only be defended from massive corporations by bringing lawsuits that cost hundreds of thousands to millions of dollars, then only the wealthy will have those rights.
So maybe she wants new legislative frameworks around these kind of issues to allow people to realistically enforce these rights that nominally exist.
For an example of updating a legislative framework to allow more easily vindicating existing rights, look up "anti-SLAPP legislation", which many states have passed to make it easier for the target of a meritless lawsuit seeking to chill speech to have that lawsuit dismissed. Anti-SLAPP legislation does almost nothing to change the actual rights a defendant has to speak, but it makes it much more practical for a defendant to actually exercise those rights.
So the assumption that a call for updated legislation implies no legal protection currently exists is simply a bad assumption, and it does not apply in this situation.
She has a personal net worth of >$100m. She’s also married to a successful actor in his own right.
Her voice alone didn’t get her there — she did. That’s why celebrities are so protective about how their likeness is used: their personal brand is their asset.
There's established legal precedent on exactly this: even if they didn't train on her likeness, if an unknowing observer could reasonably believe that she personally lent her voice to this, she has a strong case.
Even OpenAI knew this, or they would not have asked in the first place.
> If they didn’t think they needed permission, why ask for permission multiple times and then yank it when she noticed?
Many things that are legal are of questionable ethics. Asking permission could easily just be an effort for them to get better samples of her voice. Pulling the voice after debuting it is 100% a PR response. If there's a law that was broken, pulling the voice doesn't unbreak it.
They are trying to wriggle out of providing any insight into how that voice was derived (like Google with the 100% of damages check). It would really suck for OpenAI if, for example, Altman had at some point emailed his team to ensure the soundalike was "as indistinguishable from Scarlett's performance in Her as possible."
Public figures own their likeness and control its use. Not to mention that in this case OA is playing chicken with studios as well. Not a great time to do so, given their stated hopes of supplanting 99% of existing Hollywood creatives.