Bryan Cranston’s voice and likeness were generated in Sora 2 outputs without consent or compensation when OpenAI launched the AI video platform in an invite-only release two weeks ago, prompting the company to strengthen guardrails around replication of voice and likeness.
SAG-AFTRA, OpenAI, Cranston, United Talent Agency, Creative Artists Agency and the Association of Talent Agents jointly released a statement on Monday addressing the incident. Whilst OpenAI's policy required opt-in for use of voice and likeness from the start, the company expressed regret that the generations occurred unintentionally.
Cranston, who brought the issue to SAG-AFTRA’s attention, said he was deeply concerned not just for himself but for all performers whose work and identity can be misused. He added he is grateful to OpenAI for improving its guardrails and hopes all companies involved respect performers’ personal and professional right to manage replication of their voice and likeness.
OpenAI maintains an opt-in policy for uses of an individual's voice or likeness in Sora 2, with all artists, performers and individuals having the right to determine whether and how they can be simulated. The company has committed to responding expeditiously to complaints.
Opt-in protocols “the only way”
SAG-AFTRA president Sean Astin said Cranston is one of countless performers whose voice and likeness are in danger of massive misappropriation by replication technology. Astin said Cranston did the right thing by communicating with his union and professional representatives, adding that “opt-in protocols are the only way to do business and the NO FAKES Act will make us safer”.
The joint statement endorsed the NO FAKES Act, pending federal legislation designed to protect performers and the public from unauthorised digital replication. OpenAI CEO Sam Altman said the company is deeply committed to protecting performers from the misappropriation of their voice and likeness and was an early supporter of the NO FAKES Act when it was introduced last year.
The Cranston incident follows OpenAI’s decision earlier this month to pause the ability to generate images of Dr Martin Luther King Jr in Sora after users created disrespectful depictions of the civil rights leader. The Estate of Martin Luther King, Jr, Inc. requested the pause, and OpenAI stated that, whilst there are strong free speech interests in depicting historical figures, the company believes public figures and their families should ultimately have control over how their likeness is used.
The controversy also follows SAG-AFTRA's condemnation last month of AI-generated performer Tilly Norwood, unveiled at the Zurich Film Festival by AI talent studio Xicoia. The union stated Norwood was a character generated by a computer programme trained on the work of countless professional performers without permission or compensation, warning that the development used stolen performances to put actors out of work, jeopardising performer livelihoods and devaluing human artistry. Scottish actor Briony Monroe, 28, from East Renfrewshire, believes her image was used to create Norwood, though Particle6, which produced Xicoia, denied the claims.
Meanwhile, the UK performing arts union Equity plans to coordinate mass subject access requests from thousands of its 50,000 members to force tech companies into rights negotiations. General secretary Paul W Fleming said the union would use data protection law to compel firms resistant to transparency into collective agreements, adding that producers privately admit ethical AI use is impossible while the provenance of training data remains unclear.