AI is here and making films. Is Hollywood ready?

Scott Mann had a problem: too many F-bombs.

The writer-director had spent the production of “Fall,” his dizzying thriller about rock climbers stranded atop a remote TV tower, encouraging his two leads to play around with their dialogue. That improvisation earned the film a whopping 35 “f-cks” and placed it firmly in R-rated territory.

But when Lionsgate signed on to distribute “Fall,” the studio wanted a PG-13 cut. Sanitizing the film would mean scrubbing all but one of those obscenities.

“How do you solve this?” Mann recalled wondering. He spoke in October, two months after the film’s debut, from the glass-paneled conference room of his Santa Monica office; a prop vulture he’d commandeered from the set perched in the lobby outside.

After all, reshoots are expensive and time-consuming. Mann had filmed “Fall” on a mountaintop, he explained, contending not only with COVID but also with hurricanes and thunderstorms. At one point a colony of fire ants took up residence in the film’s main set, a hundred-foot-long metal pipe; when the crew disturbed them, the swarm enveloped the set “like a cloud.”

“‘Fall’ was probably the hardest film I’ve ever made,” Mann said. Could he avoid a redo?

The solution, he realized, could be a project he had been developing alongside the film: artificially intelligent software that could edit footage of actors’ faces long after filming wrapped, seamlessly altering their expressions and mouth movements to match newly recorded dialogue.

It’s a deceptively simple application for a technology that experts say will transform almost every dimension of Hollywood, from the dynamics of work and financial models to the way audiences think about what’s real and fake.

“Artificial intelligence will do to motion pictures what Photoshop did to still images,” Robert Wahl, an associate computer science professor at Concordia University Wisconsin who has written about the ethics of CGI, said in an email. “We can no longer fully trust what we see.”

A software solution for dubious dubs

It took a particularly frustrating experience with a Robert De Niro film to push Mann into the world of software.

De Niro starred in Mann’s 2015 crime drama “Heist,” and the two had poured a lot of time and thought into the acclaimed actor’s performance. But when it came time to adapt the film for foreign releases, Mann said he was dissatisfied.

When films are released abroad, dialogue is often re-recorded in other languages. This process, called dubbing, makes a film accessible internationally, but it can also produce the jarring sight of an actor’s mouth flapping out of sync with the words they’re supposedly saying. A typical fix is to rewrite the dialogue so it better matches the mouth movements already on screen, but those changes sacrifice the creative team’s original vision for the sake of synchronization.

“All the things that I had nuanced with Robert De Niro have now been changed,” Mann said of the dubs. “I was kind of devastated.”

A follow-up film he was working on, “Final Score,” deepened those frustrations. Mann tried scanning his performers’ heads so he could better synchronize their speech, but the process proved prohibitively expensive and the results looked odd.

It was only when he began researching more novel solutions that the visual effects enthusiast found a 2018 academic paper outlining a possible answer: neural networks, or computer programs that mimic the structure of the brain, which could capture one actor’s facial expressions and transfer them onto someone else’s face.

Intrigued, Mann reached out to the paper’s authors and began collaborating with some of them on a rudimentary “vubbing” tool — that is, visual, rather than audio, syncing. The later addition of Nick Lynes, a friend of a friend with a background in online gaming, also gave the team a foothold in the technology sector.

Together, emissaries from three very different worlds – cinema, science and the software industry – built Flawless, an AI film company with offices in Santa Monica and London.

Broadly speaking, the company’s technology can identify patterns in an actor’s phonemes (the sounds they make) and visemes (what their face looks like when they make those sounds), then update the on-screen visuals when presented with newly recorded audio. Last year, Time magazine deemed the company’s “fix for film dubbing” one of the best inventions of 2021.
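
As a rough illustration of the underlying idea, and not a description of Flawless’ proprietary system, a vubbing pipeline can be thought of as mapping each phoneme in the newly recorded audio to a target mouth shape, which a face-editing model then renders onto each frame. The mapping, labels and helper function below are hypothetical, a minimal sketch of just the phoneme-to-viseme step.

# Illustrative sketch only, not Flawless' actual pipeline: a toy
# phoneme-to-viseme lookup. Phoneme symbols follow the common ARPAbet
# convention; the viseme labels and helper function are hypothetical.

# A coarse many-to-one mapping: several phonemes share one mouth shape.
PHONEME_TO_VISEME = {
    "P": "lips_closed", "B": "lips_closed", "M": "lips_closed",
    "F": "lip_to_teeth", "V": "lip_to_teeth",
    "AA": "open_wide", "AE": "open_wide",
    "UW": "rounded", "OW": "rounded",
    "S": "teeth_together", "Z": "teeth_together",
}

def visemes_for_line(phonemes):
    """Turn a newly recorded line (as phonemes) into target mouth shapes,
    which a face-editing model would then render frame by frame."""
    return [PHONEME_TO_VISEME.get(p, "neutral") for p in phonemes]

# Example: the re-recorded, PG-13-friendly word "freaking".
new_audio = ["F", "R", "IY", "K", "IH", "NG"]
print(visemes_for_line(new_audio))
# ['lip_to_teeth', 'neutral', 'neutral', 'neutral', 'neutral', 'neutral']

In a real system, the second step, regenerating the mouth region of each frame to match those targets, is where the neural networks described above do the heavy lifting.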

However, the scramble to scrub dozens of F-bombs from “Fall” raised a question with potentially much broader implications: Instead of just changing the language the characters spoke, could Flawless change the substance of what they said?

“We took the actresses to a recording studio down in … Burbank and we said, ‘All right, here’s the new lines,’” said Mann, who lives in Los Angeles. They then plugged the new audio into the vubbing software, which adjusted the stars’ on-screen facial movements accordingly.

“We put the footage in, the MPAA rechecked it and gave it a PG-13, and that went out in theaters,” he said.

A few weeks after the film came out, Mann sat in his Santa Monica conference room, surrounded by posters for “Blade Runner” and “2001: A Space Odyssey,” and showed off the results with a scene in which one of the “Fall” protagonists laments her predicament.

“Now we’re stuck on this damn tower in the middle of damn nowhere!” Virginia Gardner cried to Grace Caroline Currey as the two huddled together on a precariously high platform.

Virginia Gardner and Grace Caroline Currey in “Fall.” (Lionsgate)

A moment later, Mann played the scene again. But this time, Gardner’s dialogue was noticeably harsher: “Now we’re stuck on this stupid f-cking tower in the middle of f-cking nowhere.”

The first version is the one that was released Aug. 12 in more than 1,500 American theaters. But the latter – the one with the sailor-worthy dialogue – was what Mann had actually filmed on that fire ant-infested mountaintop. If you didn’t know that a neural network had reconstructed the actors’ faces, you probably would have had no idea that the sanitized dialogue was a late addition.

“You can’t tell what’s real and what’s not,” Mann said, “that’s the whole thing.”

The ethics of synthetic media

When it comes to filmmaking, this realism has obvious advantages. Nobody wants to spend money on something that looks like it came out of MS Paint.

But the rise of software that can seamlessly alter what someone appears to have said has major implications for a media environment already awash in misinformation. Flawless’ core product, after all, is essentially a more legitimate version of “deepfakes,” or computer-generated mimicry of a person’s face and voice.

It’s not hard to imagine a troll who, instead of using these tools to edit profanity out of a movie, would make a viral video of Joe Biden declaring war on Russia. Porn created using a person’s digital likeness has also become a problem.

And Flawless isn’t the only company working in this space. Papercup, which generates synthetic human voices for use in dubbing and voice-over, aims to “make any video watchable in any language,” CEO Jesse Shemen told The Times.

And visual effects stalwart Digital Domain uses machine learning to render actors in cases where they can’t perform themselves, such as scenes that require a stunt double, Chief Technology Officer Hanno Basse said.

As these and other companies continue to automate the entertainment industry, many ethical questions arise.

Hollywood is already grappling with its newfound ability to digitally recreate dead actors, as with Anthony Bourdain’s voice in the documentary “Roadrunner” or the likenesses of Peter Cushing and Carrie Fisher in recent “Star Wars” films. Holographic revivals of deceased celebrities have also become possible.

Digitally altered dialogue “risks compromising the consent of those originally involved,” said Scott Stroud, director of the University of Texas at Austin’s media ethics program. “What actors thought they were agreeing to is not literally what is being created.”

And this technology could result in movies being altered long after they’re released, said Denver D’Rozario, a Howard University marketing professor who has studied the software-driven resurrection of dead actors.

“Let’s say … in a movie a guy drinks a can of Pepsi, and in 20 years you get a Coke sponsorship,” D’Rozario said. “Do you swap the can of Pepsi for a Coke? When can something be changed? When can something be bought?”

Mann said the benefits of his technology are many, from breaking down language barriers and encouraging cross-border empathy to relieving actors of the headache of re-shooting. In his view, scenarios like D’Rozario’s hypothetical Coke sponsorship represent new revenue streams.

Flawless has been proactive, Mann added, about building a product that supports, rather than replaces, authentic human performances.

“There is a way of using the technology in the same way the [visual effects] industry has already established, which means: do it safely, do it right, do it legally, with the consent of all parties involved,” he said.

And the company has already engaged with “all major unions” on how this technology could be responsibly built and used, the director continued.

SAG-AFTRA officials emphasized that AI filmmaking tools can either help or harm actors, depending on how they are deployed.

“Technology that does little more than digitally enhance the work our members do may only require the ability to provide informed consent and possibly additional compensation,” SAG-AFTRA general counsel Jeffrey Bennett said in an email. “At the other end of the spectrum are the technologies that replace traditional performances or take our members’ performances and create entirely new ones; for those, we maintain that they are a mandatory subject of bargaining.”

It’s a train that, for better or for worse, has already left the station.

“Fall” is currently streaming, and Mann said other films his company has been working on are coming out this Christmas — though he can’t publicly name them just yet.

If you’re watching a movie over the holidays, an AI may have helped create it.

Will you be able to tell? Would it matter?
