Bryan Cranston and Major Talent Agencies Applaud OpenAI’s New Sora 2 Safeguards After ‘Breaking Bad’ Star’s Image Misuse


OpenAI’s updates to its Sora 2 platform have drawn praise from notable figures, including Bryan Cranston. The generative AI video platform came under scrutiny after users generated videos featuring Cranston’s voice and likeness without his consent. New safeguards aimed at protecting artists’ rights have since eased some of the concerns surrounding the issue.

Concerns Over Image Misuse

Initially, Cranston expressed deep concern about his image being used on Sora 2 without permission. He voiced his worries not just for himself but for all performers whose identities could be compromised in similar situations. “I was deeply concerned… for all performers whose work and identity can be misused,” he stated.

Support from SAG-AFTRA and Talent Agencies

The Screen Actors Guild-American Federation of Television and Radio Artists (SAG-AFTRA) acknowledged that Cranston’s likeness was generated during the platform’s invite-only launch. Cranston promptly brought this matter to their attention. In a collaborative effort, OpenAI and SAG-AFTRA released a statement highlighting improvements to the platform’s guardrails regarding voice and likeness replication.

  • OpenAI has strengthened control measures for likeness replication.
  • Major talent agencies including Creative Artists Agency (CAA) and United Talent Agency (UTA) support these changes.

The Role of the NO FAKES Act

In tandem with these developments, newly elected SAG-AFTRA President Sean Astin emphasized the ongoing need for safeguarding artists’ rights. He highlighted the importance of the NO FAKES Act, legislation currently in Congress seeking to require explicit consent from individuals before producing any AI-generated replicas of their likeness or voice.

“Opt-in protocols are the only way to do business,” Astin affirmed. The act aims to end the unauthorized production of AI replicas, a practice currently defended under “fair use,” which remains a gray area in AI copyright law.

OpenAI’s Commitment

OpenAI CEO Sam Altman reiterated the company’s dedication to protecting performers. He expressed support for the NO FAKES Act, emphasizing the importance of consent for voice and likeness use. “We will always stand behind the rights of performers,” Altman stated.

Broader Issues with AI-Generated Content

Cranston is not alone in raising alarms about AI misuse. Recently, the estate of Martin Luther King, Jr. worked with OpenAI to pause the generation of unauthorized depictions of Dr. King on Sora 2. The incident underscores growing concern about the misuse of the likenesses of both living and deceased individuals in AI-generated content.

As the conversation around AI technology continues to evolve, the need for clear regulations surrounding consent is increasingly vital.