
Actors as involuntary advertising figures: the dark side of digital doppelgangers

[10:53 Thu, 1 May 2025   by Thomas Richter]

Who wouldn't want to make money with minimal effort? That's probably what actors thought when they licensed their digital doubles to a provider of AI avatars for several thousand dollars.

However, many had a rude awakening when they saw how their avatars were actually used: for dubious advertising, scams, or even political propaganda – and the human originals have no way to stop it.

AI-powered services like Synthesia, valued at around 2 billion dollars, present themselves as a cheap alternative to productions with real actors for talking-head videos, for example for marketing, social media, or video messages.



The source material comes from half-day greenscreen video shoots with real actors who have to portray various emotions. From this footage, avatars are generated via AI, which can then be used in videos for clients. Although avatars that look like real people can also be generated purely by AI, those based on actor scans are (still) considered more realistic in terms of expressiveness.

Clients of the platform can then choose from many avatars and feed them their own texts in any language and tone, and the AI generates realistic-looking talking-head videos.

But the contracts the actors signed are tricky: some companies use sweeping, irrevocable licenses with worldwide validity that cannot be contested later. As a result, actors are powerless when their digital clones turn up online in questionable campaigns.



For example, New York actor Adam Coy earned 1,000 dollars for the use of his digital avatar, which later appeared as a time traveler warning of disasters. British actor Connor Yeates found himself serving as the spokesperson for the president of Burkina Faso, and South Korean actor Simon Lee discovered his virtual likeness on TikTok and Instagram promoting lemon balm tea for weight loss and ice baths for acne, posing alternately as a gynecologist and a surgeon.

Synthesia admitted in response to press inquiries that misleading and propagandistic clips were generated on its platform due to temporary content control gaps, but promised improvement. However, other AI avatar platforms have even looser rules. Experts are now calling for more transparency, legal regulations, and education – because behind every avatar is a real person with real consequences.

This conflict mirrors, on a smaller scale, the broader battle over the right to one's own avatar being fought in Hollywood, where actors successfully fought last year through a strike (www.slashcam.de/news/single/Streik-Erfolg--US-Schauspieler-erkaempfen-sich-Schu-18256.html) to gain control over the use of their avatars as well as compensation.

Actors who are not unionized or who sign contracts with avatar companies should be aware of the implications of their decisions – their likeness may be used without their knowledge for campaigns that could damage their reputation and negatively impact their acting careers.


Image for this news item: Synthesia-KI-Klon-1

More info at arstechnica.com

German version of this page: Schauspieler als unfreiwillige Werbefiguren - Die dunkle Seite digitaler KI-Klone

  



Last update: 16 May 2025 - 18:02 - slashCAM is a project by channelunit GmbH - mail: slashcam@--antispam:7465--slashcam.de - German version