Samsung's latest deepfake technique requires just a single source image

Shawn Knight

TechSpot Staff

Deepfakes traditionally require a large amount of source imagery – training data – to work their magic. Samsung’s new approach, dubbed few- and one-shot learning, can train a model on a single still image. Accuracy and realism improve as the number of source images increases.

For example, a model trained on 32 photographs will be more convincing than a version trained with a single image – but still, the results are stunning.
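The basic recipe – pre-train ("meta-learn") a generator and an identity embedder on a large corpus of talking-head video, then fine-tune on whatever handful of frames is available for a new person – can be sketched in a few lines of PyTorch. To be clear, this is a minimal illustration under assumptions, not Samsung's actual architecture: the Embedder and Generator below are toy stand-ins, the starting weights are random rather than meta-trained, and a plain L1 reconstruction loss stands in for the adversarial and perceptual losses the researchers use.

```python
# Minimal sketch of few-shot talking-head fine-tuning.
# NOT Samsung's implementation: Embedder/Generator are toy stand-ins.
import torch
import torch.nn as nn

class Embedder(nn.Module):
    """Maps source frames of a person to an identity embedding (stand-in)."""
    def __init__(self, dim=128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 16, 4, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(16, dim))

    def forward(self, x):
        return self.net(x)

class Generator(nn.Module):
    """Renders a face from a landmark map, conditioned on an embedding (stand-in)."""
    def __init__(self, dim=128):
        super().__init__()
        self.cond = nn.Linear(dim, 3 * 64 * 64)
        self.net = nn.Conv2d(6, 3, 3, padding=1)

    def forward(self, landmarks, e):
        cond = self.cond(e).view(-1, 3, 64, 64)
        return torch.tanh(self.net(torch.cat([landmarks, cond], dim=1)))

# k source frames of the target person; k=1 is the one-shot case.
# Random tensors stand in for real 64x64 RGB crops and landmark maps in [-1, 1].
k = 8
source_frames = torch.rand(k, 3, 64, 64) * 2 - 1
source_landmarks = torch.rand(k, 3, 64, 64) * 2 - 1

# In the paper these networks arrive meta-trained on a large video corpus;
# here they are randomly initialized, so only the fine-tuning mechanics are shown.
embedder, generator = Embedder(), Generator()
e_hat = embedder(source_frames).mean(dim=0, keepdim=True).detach()  # average identity embedding

# Few-shot fine-tuning: nudge the generator toward the k available frames.
opt = torch.optim.Adam(generator.parameters(), lr=1e-4)
for step in range(100):
    out = generator(source_landmarks, e_hat.expand(k, -1))
    loss = nn.functional.l1_loss(out, source_frames)  # stand-in reconstruction loss
    opt.zero_grad()
    loss.backward()
    opt.step()

# After fine-tuning, unseen landmark poses animate the same identity.
new_pose = torch.rand(1, 3, 64, 64) * 2 - 1
fake_frame = generator(new_pose, e_hat)
```

With k = 1 the same loop covers the one-shot case, which is why a single photograph – or a single painting – is enough to produce motion; more source frames simply give the fine-tuning step more to match against, which is why 32 photographs look more convincing.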

As exhibited in a video accompanying the researchers’ paper, the technique can even be applied to paintings. Seeing the Mona Lisa, a work that exists solely as a single still image, come to life is quite fascinating and fun.

The downside, as Dartmouth researcher Hany Farid highlights, is that advances in these sorts of techniques are bound to increase the risk of misinformation, fraud and election tampering. “[T]hese results are another step in the evolution of techniques ... leading to the creation of multimedia content that will eventually be indistinguishable from the real thing,” Farid said.

That’s great if you’re watching a fictional movie but not so much when tuning in to the evening news.

Image credit: Face Recognition of a woman by Mihai Surdu


psycros

TS Evangelist
And the reason Samsung and other companies are building this tech? To sell it to those with the deepest pockets. Infowar is the new gun-running, and the end results will be quite similar.

Marc Petersen

TS Rookie
Who the **** cares? No one is going to be stealing your phone unless you are a celebrity, and if that's the case, learn not to take nudes and sex videos. Hide them in a vault only you know how to get into...

Capaill

TS Evangelist
Marc Petersen said:
Who the **** cares? No one is going to be stealing your phone unless you are a celebrity, and if that's the case, learn not to take nudes and sex videos. Hide them in a vault only you know how to get into...
What makes you think the only photos of you exist on your own phone? Besides, I doubt anyone will be making deepfakes of you. They will be of celebrities and officials.
When videos come out showing the Chinese president saying that all Americans should be shot, will that be a deepfake or the real deal? Considering how gullible the average internet user already is, these deepfakes are going to cause a lot of trouble.

syrious01

TS Rookie
Okay, bringing history to life is one of the first actual uses of this research that makes sense. Other than that, I can't think of a single good thing that could come out of this type of machine learning.

Markoni35

TS Addict
Marc Petersen said:
Who the **** cares? No one is going to be stealing your phone unless you are a celebrity, and if that's the case, learn not to take nudes and sex videos. Hide them in a vault only you know how to get into...
That's not true. Kids in your street may use your video to make a deepfake of you having sex with a goat, and it would be so convincing that your dad would think you've gone back to your roots.

kenc1101

TS Booster
Marc Petersen said:
Who the **** cares? No one is going to be stealing your phone unless you are a celebrity, and if that's the case, learn not to take nudes and sex videos. Hide them in a vault only you know how to get into...
That's a very shallow take on a technology that could be used to cause incredible harm. Creating something fake that is pretty much indistinguishable from the real thing has unlimited potential, for good or bad.