Google's AI-powered search can recommend malicious sites, including scams and malware

midian182

Facepalm: Another day, another example of Google's generative AI technology doing something it shouldn't. This time, it was the company's AI-powered search results misbehaving, recommending malicious websites, including ones pushing scams, malware, and fake giveaways.

Google started rolling out its Search Generative Experience (SGE) earlier this month. The company boasts that the feature lets users ask more complex and descriptive questions, with the generative AI providing text summaries of sites and recommendations related to a user's search.

That might sound helpful, but SEO consultant Lily Ray noticed something amiss with Google SGE. In a post on X, she showed SGE recommending potentially malicious sites as part of its answers: her query about pit bull puppies for sale on Craigslist returned recommendations for several dangerous, scammy sites that could be packed with malware.

BleepingComputer notes that the suspect results surfaced by SGE all use the same .online top-level domain, the same HTML templates, and the same sites to perform redirects, indicating they are part of a single SEO poisoning campaign that got them into Google's index.
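
Clusters like that are fairly easy to surface once you know what to look for. As a rough illustration only, and not how Google or BleepingComputer actually analyzed the campaign, the Python sketch below groups a handful of suspect URLs by top-level domain and by a hash of their stripped-down markup, so pages built from the same HTML template on the same cheap TLD fall into the same bucket. The URLs and helper names are hypothetical.

```python
import hashlib
import re
from collections import defaultdict
from urllib.parse import urlparse

import requests  # pip install requests

# Hypothetical suspect URLs pulled from AI-generated search answers.
SUSPECT_URLS = [
    "https://example-puppies.online/pitbull-for-sale",
    "https://cheap-pets-now.online/craigslist-puppies",
]

def template_fingerprint(html: str) -> str:
    """Hash the markup with visible text and digits removed, so pages that
    share an HTML template collapse to the same fingerprint."""
    skeleton = re.sub(r">[^<]+<", "><", html)   # drop text between tags
    skeleton = re.sub(r"\d+", "", skeleton)     # drop counters and timestamps
    return hashlib.sha256(skeleton.encode("utf-8", "ignore")).hexdigest()[:12]

def cluster_by_tld_and_template(urls):
    """Bucket URLs by (top-level domain, template fingerprint)."""
    clusters = defaultdict(list)
    for url in urls:
        tld = urlparse(url).hostname.rsplit(".", 1)[-1]  # e.g. "online"
        try:
            # Fetch without following redirects; only the landing page markup matters here.
            resp = requests.get(url, timeout=10, allow_redirects=False)
            key = (tld, template_fingerprint(resp.text))
        except requests.RequestException:
            key = (tld, "unreachable")
        clusters[key].append(url)
    return clusters

if __name__ == "__main__":
    for (tld, fingerprint), members in cluster_by_tld_and_template(SUSPECT_URLS).items():
        print(f".{tld} / template {fingerprint}: {len(members)} site(s)")
```

In practice, an analyst would fold in more signals such as WHOIS records, hosting IPs, and redirect destinations, but the basic grouping idea is the same.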

The publication found that clicking these links takes visitors through a series of redirects before landing on a scam site. Most of the time, people end up at fake captcha or YouTube-lookalike pages that try to get them to allow browser notifications, which then push a stream of scammy ads, such as fake tech support alerts and giveaway offers, to the desktop. One was a McAfee antivirus ad urging users to run a scan or renew their license, earning the sender an affiliate commission.
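
Tracing a chain like that takes only a few lines. The sketch below, which assumes a throwaway link and says nothing about the campaign's real infrastructure, uses Python's requests library to follow a URL and print each hop recorded in response.history. Note that it only sees HTTP-level (3xx) redirects; the JavaScript and meta-refresh hops these campaigns often rely on would need a headless browser instead.

```python
import requests  # pip install requests

# Hypothetical link copied from an AI-generated answer; run this in a VM or
# other disposable environment, since the destination is assumed to be hostile.
SUSPECT_LINK = "https://example-puppies.online/pitbull-for-sale"

def trace_redirects(url: str) -> str:
    """Follow HTTP redirects and print every hop along the way."""
    resp = requests.get(url, timeout=10, allow_redirects=True)
    hops = resp.history + [resp]      # intermediate responses, then the final one
    for i, hop in enumerate(hops):
        print(f"{i}: {hop.status_code} {hop.url}")
    return resp.url                   # where the chain finally lands

if __name__ == "__main__":
    final_destination = trace_redirects(SUSPECT_LINK)
    print("Chain ends at:", final_destination)
```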

Another X user tried a similar search for Craigslist puppies for sale. SGE surfaced a site on a German .de domain; his first click led to Amazon gift card spam, the second to porn.

Google says it constantly updates its systems and ranking algorithms to protect against spam. It appears that the company has removed these SGE results since they were highlighted.

Last month, Google's Gemini AI image generation feature was paused after the tool produced images showing people of color and women in historically inaccurate contexts, such as Nazi-era German soldiers and the Founding Fathers.


 
To nobody's surprise, an organic, pre-baked algorithm is prone to organic error. Training your AI on data from community forums isn't always the best of ideas, no matter how mainstream it is, Alphabet Inc.
Mainstream goes by what clicks, even if the price is truth, or malware!
 
I'm OK with that; I just want more sites about that famous black Mao who led China to glory and prosperity.
 
Looks like the DOJ needs to go after Google next, since they have lost sight of reality...
 