WTF?! As artificial intelligence gets better at recreating digital versions of people and simulating their voices, aka deepfakes, there are concerns about the technology being used for nefarious purposes, like putting words into politicians' mouths. Right now, though, it's being used to create a fake Joe Rogan that sells penis pills.
As with generative AIs such as ChatGPT, deepfakes are one of those technologies with the potential for both great and awful things. We've already seen their worst sides, the most infamous being porn clips edited to appear to feature famous actresses. There was also the fake video of Ukrainian president Volodymyr Zelensky surrendering.
As the tech behind voice simulation improves, deepfakes are becoming more convincing. One such video that is fooling people is currently being spread on TikTok. It was highlighted by Coffeezilla, the investigator who has been exposing Logan Paul's crypto-based video game, CryptoZoo.
Deepfake scams are here, and we're not ready. pic.twitter.com/NtPKWGCULi
- Coffeezilla (@coffeebreak_YT) February 12, 2023
In the clip, the fake Rogan talks about a testosterone-boosting product called Alpha Grind. The recreation claims that the tablets rank high in Amazon's search results for 'libido booster for men,' and that's "because guys are figuring out that it literally is increasing size and making a difference down there."
The video has tricked a lot of people into believing it's real. Andrew D. Huberman, an actual guest on Rogan's show who appears in the clip, had to confirm that the conversation between himself and Rogan had been faked, and that they were talking about something very different.
They created a false conversation. We never had. We were talking about something very different.
- Andrew D. Huberman, Ph.D. (@hubermanlab) February 12, 2023
Those who have heard Rogan before might notice that his voice sounds a little different in the clip, and there are sections where the lip-syncing is off, but it's still convincing plenty of viewers.
Deepfake videos are only going to get more realistic as AI tools like Microsoft's Vall-E, which can mimic a human voice after hearing a three-second sample, are developed. That might be good news for movie fans, but it's bad news for voice actors and those easily taken in by scams.