How Meta and AI companies recruited striking actors to train AI

Extractive summaries and key takeaways from the articles curated from TOP TEN BUSINESS MAGAZINES to promote informed business decision-making | Since September 2017 | Week 319 | October 20-26, 2023

By Eileen Guo | MIT Technology Review | October 19, 2023

Extractive Summary of the Article

One evening in early September, T, a 28-year-old actor who asked to be identified by his first initial, took his seat in a rented Hollywood studio space in front of three cameras, a director, and a producer for a somewhat unusual gig.  The two-hour shoot produced footage that was not meant to be viewed by the public—at least, not a human public. Rather, T’s voice, face, movements, and expressions would be fed into an AI database “to better understand and express human emotions.” That database would then help train “virtual avatars” for Meta, as well as algorithms for a London-based emotion AI company called Realeyes.

Many actors across the industry, particularly background actors (also known as extras), worry that AI—much like the models described in the emotion study—could be used to replace them, whether or not their exact faces are copied. And in this case, by providing the facial expressions that will teach AI to appear more human, study participants may in fact have been the ones inadvertently training their own potential replacements. 

As the need for higher-quality data has grown, alongside concerns about whether data is collected ethically and with proper consent, tech companies have progressed from "scraping data from publicly available sources" to "building data sets with professionals," or, at the very least, "with people who have been recruited, compensated." But the need for human data, especially in the entertainment industry, runs up against a significant concern in Hollywood: publicity rights, or "the right to control your use of your name and likeness."

This was an issue long before AI, but AI has amplified the concern. Generative AI in particular makes it easy to create realistic replicas of anyone by training algorithms on existing data, like photos and videos of the person. The more data that is available, the easier it is to create a realistic image. This has a particularly large effect on performers. 

The AI landscape is different for noncelebrities. Background actors are increasingly being asked to undergo digital body scans on set, where they have little power to push back or even get clarity on how those scans will be used in the future. Studios say that scans are used primarily to augment crowd scenes, which they have been doing with other technology in postproduction for years—but according to SAG representatives, once the studios have captured actors’ likenesses, they reserve the rights to use them forever. (There have already been multiple reports from voice actors that their voices have appeared in video games other than the ones they were hired for.)

The decision to train AI may be an individual one, but the impact is not; it’s collective.

2 key takeaways from the article

  1. One evening in early September, T, a 28-year-old actor, took his seat in a rented Hollywood studio space in front of three cameras, a director, and a producer for a somewhat unusual gig.  The two-hour shoot produced footage that was not meant to be viewed by the public—at least, not a human public. Rather, T’s voice, face, movements, and expressions would be fed into an AI database “to better understand and express human emotions.” That database would then help train “virtual avatars” for Meta, as well as algorithms for a London-based emotion AI company called Realeyes.
  2. Many actors across the industry, particularly background actors, worry that AI could be used to replace them, whether or not their exact faces are copied. And in this case, by providing the facial expressions that will teach AI to appear more human, study participants may in fact have been the ones inadvertently training their own potential replacements.

Topics:  Technology, Artificial Intelligence, Entertainment
