An In-Depth Analysis of the Actors' Guild Strike: Hollywood Producers Are Pushing Background Actors to Sell Their Image for Pennies
The actors' guild is on strike over Hollywood producers' push to use body scans of background actors, part of a plan to introduce AI-based extras on movie sets. It would be the beginning of the end for an entire craft.
"You get paid a full day’s work, and the studio would own that scan of their likeness, and be able to use it for the rest of eternity in any project they want with no consent and no compensation."
The "groundbreaking" proposal was put forth by AMPTP, a group negotiating for major Hollywood studios and streaming services. The Hollywood producers want background actors to agree to sell them their body scans for less than $200. This is part of a plan to introduce AI-based background characters in Hollywood movies.
It would essentially wipe out an entire craft in favor of artificially intelligent tech.
A Script Ripped off From Cyberpunk Stories
The Screen Actors Guild–American Federation of Television and Radio Artists (SAG-AFTRA) announced yesterday that it will go on strike over the studios' plan. The studios call the measure "groundbreaking." On the other side of the fence, professional actors have found less flattering words for it: "greedy" and "abusive" have been floated around.
Aspiring actors have always found an entry point into acting careers as extras on various Hollywood sets, and many stay afloat on the money they earn. Studios are now looking to close that window of opportunity: producers want to buy digital duplicates of actors that they can use however they see fit, for a single day's pay.
Not only that, but actors would completely lose control over their own image, in perpetuity. Their alter egos could fill in the background of a scene, become famous, or play in R-rated movies; financially, it wouldn't make a difference to the owner of the face. Producers wouldn't need any approval to use their image going forward.
"It literally was all of our worst fears confirmed when we heard that," actress Jamie Miller said, via the Rolling Stone publication. "It’s kneecapping people from the start."
It is an eerie move, and one that was the subject of an ominous Black Mirror episode less than a month ago.
The Heat Was on at the SAG-AFTRA Press Conference
According to chief negotiators Fran Drescher and Duncan Crabtree-Ireland, SAG-AFTRA voted unanimously yesterday to go on strike. Apparently, the AMPTP has been dragging out the discussions with the guild in the hope of starving out the actors.
The AMPTP sent out a press release saying SAG-AFTRA "has chosen a path that will lead to financial hardship for hundreds of thousands of people." They claim they offered SAG-AFTRA an AI proposal that protects actors' digital likenesses, and that it preserves performers' right to consent to the use of their digital replicas.
SAG-AFTRA begs to differ. The guild says actors would in fact be selling their likenesses for one day's pay, and that there was no mention of consent or of any potential future payments in the deal the producers put forward.
The Moral Implications of Selling Your Image to Hollywood
A recent episode of the Netflix series Black Mirror, "Joan Is Awful," illustrates the issue very well.
In the episode, Salma Hayek plays herself as an actress whose digital double is used without her consent. Her likeness is made to interrupt a wedding ceremony: she pulls down her pants in the middle of a Catholic church and defecates in a prolonged, drawn-out, off-camera moment. The actress is powerless to stop the airing of the scene, as she has already signed away her rights. While the scene seems outlandish and the stuff of sci-fi, these are the exact implications of not owning your digital double.
Actors need to maintain some control.
Ironically enough, Netflix, the producer of the series, is one of the major streaming services that make up the AMPTP, and it defends the AI proposal. Another participant is Disney, a studio famous for lobbying and shaping industry legislation in the past.
Unionized Members Say the Compensation for Selling Away Your Image Is Humiliating
Let's say you eventually decide to give away your image to a studio, with all the risks that entails. The studio can use your image indefinitely, and it's very unlikely that the real you can build a career once a studio already owns your likeness. How much would you expect to be paid?
Background acting has historically been gig work. According to Rolling Stone, union members get paid less than $200 per day, even on the longest days, and an extra's day can run 16 to 18 hours. That is how much actors would be compensated for their scans under the AMPTP's proposal.
The Effects Reverberate Across Other Hollywood Crafts
With extras gone from Hollywood sets, demand would drop for supporting jobs: costumes, hairdressing, prosthetics, coordinators, and an entire slate of professionals who are there to create a lively, authentic movie background. It would suck a great deal of revenue out of the industry, and unions say the money saved would end up exclusively in the pockets of producers.
"We’re not curing cancer here! It’s a collective art form!" said Fran Drescher in her press conference yesterday, as her voice was breaking in anger.
Producers Are Picking on the Most Vulnerable
Background actors are the underdog ensemble. More often than not, they are in no financial position to negotiate; many are under pressure to compromise and could end up selling their image for whatever is offered. They are simply the weak spot in the industry: a crack that would allow producers to extend the practice by precedent. The industry could eventually condition many of its entertainers to sell the rights to their image, and that prospect is making all artists uncomfortable.
The Craft Itself Is Losing Its Authenticity
AIs and body scans hit the uncanny valley. To feel comfortable around a person, we read the emotion on their face; it creates trust, and as social creatures we are fairly good at figuring out what's off about another person. AIs, as they are today, hit that precise spot of uneasiness: our brain doesn't understand why they don't emote well. It makes sense that a lot of people would prefer human actors to tell the story. Hollywood is an industry, but it is also an art, and it shapes our societies and mindsets.
Studios Would Be in Charge of Protecting the Scanned Images
If we monetize people's digital doubles, we have to be aware they will become a honeypot for hackers. How much would a hacker earn from stealing 15,000 digital doubles? How about the digital double of Scarlett Johansson, to use "for all eternity," however they see fit? Is it right to put studios in charge of securing that?
How Would the Digital Double Work? An Example From the Gaming Industry
Digital doubles already exist, and plenty of big actors already have one. Keanu Reeves, Rami Malek, Elliot Page, Norman Reedus, Kevin Spacey, and Kit Harington are just some of the most famous names. They lent their image to the gaming industry long before the current wave of AI, delivering more or less uncanny performances in top-of-the-line games.
Studios can scan a person using photogrammetry, a technique that has been popular in gaming since 2014. The process involves taking overlapping photographs of the person in flat light, using camera rigs that capture every angle at the same instant. The procedure takes around 30 to 45 life-defining minutes, at the end of which your image is out there in the wild. You no longer own yourself.
The photographs are then stitched together into a 3D model; there is a broad choice of software studios can use to assemble the images. Once built, the model can be changed however you see fit: bodies or heads swapped around, dressed, stretched, recolored, given a new haircut. Those aspiring actors become 3D toys.
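The geometric idea at the heart of stitching photographs into 3D is triangulation: a feature seen from two known camera positions lies where the two viewing rays (nearly) meet. The sketch below is a deliberately tiny, pure-Python illustration of that one step, using the midpoint of the shortest segment between two rays; real photogrammetry software does this for millions of matched features across hundreds of photos, and all positions here are made up for the example.

```python
# Toy triangulation: recover one 3D point from two camera rays.
# A ray is (origin, direction); in real pipelines, origins come from
# calibrated camera positions and directions from matched pixels.

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def sub(a, b):
    return tuple(x - y for x, y in zip(a, b))

def along(p, t, d):
    return tuple(pi + t * di for pi, di in zip(p, d))

def triangulate(p1, d1, p2, d2):
    """Midpoint of the shortest segment between rays p1+t*d1 and p2+s*d2."""
    w = sub(p1, p2)
    a, b, c = dot(d1, d1), dot(d1, d2), dot(d2, d2)
    d, e = dot(d1, w), dot(d2, w)
    denom = a * c - b * b  # near zero means the rays are parallel
    t = (b * e - c * d) / denom
    s = (a * e - b * d) / denom
    q1, q2 = along(p1, t, d1), along(p2, s, d2)
    return tuple((x + y) / 2 for x, y in zip(q1, q2))

# Two cameras a little apart, both sighting the same point at (0, 0, 5):
cam1, ray1 = (-1.0, 0.0, 0.0), (1.0, 0.0, 5.0)
cam2, ray2 = (1.0, 0.0, 0.0), (-1.0, 0.0, 5.0)
point = triangulate(cam1, ray1, cam2, ray2)
print(point)  # -> (0.0, 0.0, 5.0)
```

Repeating this for every matched feature yields the dense point cloud that modeling software turns into a mesh.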
Models are cleaned and rigged: they get a skeleton that is more or less complex depending on the movements they will end up performing. Animating the models takes time and is resource-intensive. Actors do the motion capture themselves, wearing equipment that tracks the movement of their bodies and facial muscles. It's not a practical or cheap endeavor; preparing a single character to perform in a 3D game can take artists hundreds and hundreds of hours.
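A rig is essentially a chain of bones whose rotations drive the mesh. The core computation, forward kinematics, walks the chain and accumulates each bone's rotation on top of its parent's to find where every joint ends up. Here is a minimal 2D sketch of that idea in pure Python; real rigs have dozens of joints, constraints, and skin weights, and the two-bone "arm" below is invented for illustration.

```python
import math

def forward_kinematics(bones, origin=(0.0, 0.0)):
    """bones: list of (length, angle_relative_to_parent_in_radians).
    Returns the world-space position of each joint along the chain."""
    x, y = origin
    angle = 0.0
    positions = [(x, y)]
    for length, rel_angle in bones:
        angle += rel_angle            # child rotations compound on the parent's
        x += length * math.cos(angle)
        y += length * math.sin(angle)
        positions.append((x, y))
    return positions

# A two-bone arm: upper arm rotated 90 degrees up, forearm bent 90 degrees back,
# so the hand ends up roughly at (1, 2).
arm = [(2.0, math.pi / 2), (1.0, -math.pi / 2)]
print(forward_kinematics(arm))
```

An animator (or a motion-capture suit) supplies those per-bone angles frame by frame; the skeleton's pose then deforms the scanned mesh.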
This was a lengthy process before the advent of AI.
Here's How AI Is Changing the Game
AIs are now learning how to replicate human movement, in a field of research that spans motion planning and imitation learning. It involves developing algorithms and techniques that allow AI systems to observe and imitate our movements. Midjourney learned to generate images by looking at pictures; ChatGPT was fed a lot of writing and learned how to produce its own original material. Motion systems are being trained the same way, on human motion. They see countless examples of picking up a ball, running, kicking, walking, laughing, shaking hands, and eating, and they eventually work out their own movements. If a system knows how to pick up a ball, it will figure out on its own how to pick up an orange, and so on.
This is an ongoing process; the industry has unveiled little more than rough prototypes, but at the speed at which innovation happens, it won't be a long wait. Producers are trying to get a jump on the issue before actors figure out how valuable their image could become. Their problem is that Hollywood is a highly unionized industry, and guilds like SAG-AFTRA plan to give them a run for their money.
The objective is that, at the end of the day, background actors can simply be prompted to sit, talk, pretend to eat, or walk by. The industry would do precisely what it has always done: order some extras around, only it wouldn't be paying anyone for the job. For eternity.