Are These the Hidden Deepfakes in the Anthony Bourdain Movie?

When Roadrunner, a documentary about late TV chef and traveler Anthony Bourdain, opened in theaters last month, its director, Morgan Neville, spiced up promotional interviews with an unconventional disclosure for a documentarian. Some words viewers hear Bourdain speak in the film were faked by artificial intelligence software used to mimic the star’s voice.

Accusations from Bourdain fans that Neville had acted unethically quickly came to dominate coverage of the film. Despite that attention, how much of the fake Bourdain’s voice is in the two-hour movie, and what it said, has been unclear—until now.

In an interview that made his film infamous, Neville told The New Yorker that he had generated three fake Bourdain clips with the permission of Bourdain’s estate, all from words the chef had written or said but that were not available as audio. He revealed only one, an email Bourdain “reads” in the film’s trailer, but boasted that the other two clips would be undetectable. “If you watch the film,” The New Yorker quoted the Oscar-winning Neville saying, “you probably don’t know what the other lines are that were spoken by the AI, and you’re not going to know.”

Audio experts at Pindrop, a startup that helps banks and others fight phone fraud, think they do know. If the company’s analysis is correct, the deepfake Bourdain controversy is rooted in less than 50 seconds of audio in the 118-minute film.

Pindrop’s analysis flagged the email quote disclosed by Neville and also a clip early in the film apparently drawn from an essay Bourdain wrote about Vietnam titled “The Hungry American,” collected in his 2008 book, The Nasty Bits. It also highlighted audio midway through the film in which the chef observes that many chefs and writers have a “relentless instinct to fuck up a good thing.” The same sentences appear in an interview Bourdain gave to the food site First We Feast on the occasion of his 60th birthday in 2016, two years to the month before he died by suicide.

All three clips sound recognizably like Bourdain. On close listening, though, they appear to bear signatures of synthetic speech, such as odd prosody and anomalies in fricatives like the “s” and “f” sounds. One Reddit user independently flagged the same three clips as Pindrop, writing that they were easy to hear on a second viewing of the film. The film’s distributor, Focus Features, did not respond to requests for comment; Neville’s production company declined to comment.
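Pindrop has not published the details of its detector, but the fricative tell that listeners describe can be illustrated with a rough sketch. The Python snippet below is a toy illustration under assumed conditions (a hypothetical mono WAV file of one clip), not Pindrop’s method or a real deepfake detector: it simply flags frames where energy above roughly 4 kHz dominates, the band where “s” and “f” sounds concentrate, so those frames can be inspected for unnaturally smooth or distorted spectra.

```python
# Toy illustration only: a crude look at fricative-band energy in a speech clip.
# This is NOT Pindrop's detector; real deepfake detection relies on far richer models.
import numpy as np
from scipy.io import wavfile
from scipy.signal import spectrogram

# Assumed input: a mono 16-bit WAV file of the clip in question (hypothetical path).
rate, samples = wavfile.read("bourdain_clip.wav")
samples = samples.astype(np.float32) / 32768.0

# Short-time spectrogram: 25 ms windows with a 10 ms hop.
freqs, times, spec = spectrogram(
    samples, fs=rate,
    nperseg=int(0.025 * rate),
    noverlap=int(0.015 * rate),
)

# Fricatives like "s" and "f" concentrate energy above ~4 kHz.
high_band = spec[freqs >= 4000].sum(axis=0)
total = spec.sum(axis=0) + 1e-12
ratio = high_band / total

# Frames dominated by high-frequency energy are likely fricatives; unusually
# uniform or distorted values there are the kind of artifact listeners noticed.
fricative_frames = ratio > 0.5
print(f"{fricative_frames.mean():.1%} of frames look fricative-like")
print(f"variance of fricative-band ratio: {ratio[fricative_frames].var():.4f}")
```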

The director of Roadrunner said this clip of the chef musing on happiness was synthesized using AI software.

Audio source: Pindrop

 

When Neville predicted that his use of AI-generated media, sometimes termed deepfakes, would be undetectable, he may have overestimated the sophistication of his own fakery. He likely did not anticipate the controversy, or the attention his use of the technique would draw from fans and audio experts. When the furor reached the ears of researchers at Pindrop, they saw the perfect test case for software they had built to detect audio deepfakes, and they set it to work when the movie debuted on streaming services earlier this month. “We’re always looking for ways to test our systems, especially in real-world conditions—this was a new way to validate our technology,” says Collin Davis, Pindrop’s chief technology officer.

Pindrop’s results may have resolved the mystery of Neville’s missing deepfakes, but the episode portends future controversies as deepfakes become more sophisticated and accessible for both creative and malicious projects.
