In context: With deepfake and generative AI scams on the rise, the White House says that ways to cryptographically verify its official releases are "in the works." No details have been shared about what this process would ultimately look like, but it would likely involve 'signing' official releases in a manner that proves the White House was the true source.
The White House has confirmed that it is currently exploring ways to cryptographically verify the statements and videos that it puts out, in an effort to combat the rise of politically motivated deepfakes.
In January, we reported on an AI-generated robocall that faked President Biden's voice and told New Hampshire residents not to vote in the upcoming primary election. This was followed by the news this week that FCC Chairwoman Jessica Rosenworcel has put forth a proposal to ban AI-generated voices from robocalls.
But banning such techniques is unlikely to be enough to stop people from using them, so in an attempt to reassure the public of the authenticity of its releases, the White House is reportedly turning to cryptographic techniques that allow people to verify what's real and what's not.
One common method for doing this is a private and public key pairing. The source of a piece of information computes a hash value for a given video or document and encrypts that hash using their private key, producing a digital signature. The signature can only be decrypted with the matching public key, which is published and attributed to the original author. If the decrypted hash matches a freshly computed hash of the file, the owner of the private key is confirmed as the source.

Any altered copy of the file would produce a different hash that no longer matches the signed one, so it would fail verification as authentic.
While these efforts would certainly bring some benefits, there are some potential risks that need to be considered. Proper usage would undoubtedly help people verify real communications, but these powers would give the President and their staff a way of staking a claim on what is "the truth."
If the President made a mistake or gaffe during a White House video, they could simply not cryptographically sign the content and disavow it as fake. And it seems likely given the divisive state of the political landscape that such powers could and would be weaponized.
For now though, we do not have a timeline for this development. Speaking to Business Insider, Ben Buchanan, Biden's Special Advisor for Artificial Intelligence, simply confirmed that it's "in the works."