TechSpot is dedicated to computer enthusiasts and power users. Ask a question and give support. Join the community here, it only takes a minute.

Samsung's latest deepfake technique requires just a single source image

By Shawn Knight · 8 replies
May 23, 2019
  1. Modern deepfakes traditionally require a large amount of source imagery – training data – to work their magic. Samsung’s new approach, dubbed few- and one-shot learning, can train a model on a single still image. Accuracy and realism improve as the number of source images increases.

    For example, a model trained on 32 photographs will be more convincing than a version trained with a single image – but still, the results are stunning.
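    The scaling behavior described above – one photo works, 32 photos work better – can be sketched with a toy estimator. This is a hedged illustration of the few-shot idea only, not Samsung's actual architecture; the `TARGET`, `PRIOR` values and the `adapt()` helper are invented for this example:

```python
# Toy sketch only - NOT Samsung's model. It mimics the idea that a strong
# prior (meta-learned from many faces) is adapted to a new subject using
# K example images, and that reconstruction error shrinks as K grows.
import random

random.seed(0)

TARGET = 5.0   # stand-in for the new subject's true appearance
PRIOR = 0.0    # what the pretrained model assumes before seeing any shots

def adapt(k_shots: int) -> float:
    """Blend the prior with the mean of k noisy observations of the target.

    More shots -> the estimate trusts the observed data more, so the
    error versus the true target tends to drop.
    """
    samples = [TARGET + random.gauss(0, 1.0) for _ in range(k_shots)]
    weight = k_shots / (k_shots + 1)        # shrinkage toward the prior
    estimate = weight * (sum(samples) / k_shots) + (1 - weight) * PRIOR
    return abs(estimate - TARGET)           # "reconstruction error"

one_shot = adapt(1)    # like training on a single photo
many_shot = adapt(32)  # like training on 32 photos
print(f"1-shot error:  {one_shot:.2f}")
print(f"32-shot error: {many_shot:.2f}")
```

    Running the sketch shows the 32-shot estimate landing much closer to the target than the 1-shot one, mirroring the article's point that a single image suffices but more images are more convincing.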

    As exhibited in a video accompanying the researchers’ paper, the technique can even be applied to paintings. Seeing the Mona Lisa, a work that exists solely as a single still image, come to life is quite fascinating and fun.

    The downside, as Dartmouth researcher Hany Farid highlights, is that advances in these sorts of techniques are bound to increase the risk of misinformation, fraud and election tampering. “[T]hese results are another step in the evolution of techniques ... leading to the creation of multimedia content that will eventually be indistinguishable from the real thing,” Farid said.

    That’s great if you’re watching a fictional movie but not so much when tuning in to the evening news.

    Image credit: Face Recognition of a woman by Mihai Surdu


    Last edited: May 23, 2019
  2. psycros

    psycros TS Evangelist Posts: 2,662   +2,418

    And the reason Samsung and other companies are building this tech? To sell to those with the deepest pockets. Infowar is the new gun running and the end results will be quite similar.
  3. Marc Petersen

    Marc Petersen TS Rookie Posts: 17

    Who the **** cares? No one is going to be stealing your phone unless you're a celebrity, and if that's the case, learn not to take nudes and sex videos. Hide them in a vault only you know how to get into...
  4. stewi0001

    stewi0001 TS Evangelist Posts: 2,170   +1,589

    Welcome to Hogwarts!
  5. Capaill

    Capaill TS Evangelist Posts: 863   +474

    What makes you think the only photos of you exist on your own phone? Besides, I doubt anyone will be making deepfakes of you. They will be of celebrities and officials.
    When videos come out showing the Chinese president saying that all Americans should be shot, will that be a deepfake or the real deal? Considering how gullible the average internet user already is, these deepfakes are going to cause a lot of trouble.
    regiq and JaredTheDragon like this.
  6. syrious01

    syrious01 TS Rookie Posts: 24   +18

    Okay, bringing history to life is one of the first actual uses of this research that makes sense. Other than that, I can't think of a single good thing that could come out of this type of machine learning.
  7. Markoni35

    Markoni35 TS Booster Posts: 132   +67

    That's not true. Kids in your street may be using your video to make a deepfake of you having sex with a goat. And it would be so convincing that your dad would think you'd gone back to your roots.
  8. Jon Tseng

    Jon Tseng TS Booster Posts: 59   +40

    Exactly what I thought!
    stewi0001 likes this.
  9. kenc1101

    kenc1101 TS Booster Posts: 31   +26

    That's a very shallow take on a technology that could be used to cause incredible harm. Creating something fake that is pretty much indistinguishable from the real thing has unlimited potential, good or bad.
