Facepalm: Another day, another example of Google's generative artificial intelligence technology doing something it shouldn't. This time, it's the company's AI-powered search results that were misbehaving, recommending malicious websites, including those pushing scams, malware, and fake giveaways.

Google started rolling out its Search Generative Experience (SGE) earlier this month. The company boasts that the feature allows users to ask more complex and descriptive questions, with generative AI providing text summaries of sites and recommendations related to a user's search.

That might sound helpful, but SEO consultant Lily Ray noticed something was amiss with Google SGE. In a post on X, she showed that SGE was recommending potentially malicious sites as part of its answers. Ray's query about pit bull puppies for sale on Craigslist returned recommendations for several dangerous, scammy sites that could be packed with malware.

BleepingComputer notes that the suspect results listed by SGE all use the same .online top-level domain, the same HTML templates, and the same sites to perform redirects, indicating that they are part of a single SEO poisoning campaign that got them into Google's index.

The publication found that clicking on these links led visitors through a series of redirects until they reached a scam site. Most of the time, people landed on fake captchas or fake YouTube pages that try to get visitors to subscribe to browser notifications, which then push a slew of scammy ads, such as tech support scams and fake giveaways, to the desktop. One was a McAfee antivirus ad urging users to run a scan or renew their license, earning the sender an affiliate commission.

Another X user tried a similar search for Craigslist puppies for sale. SGE surfaced a site on the German .de domain; the first click led him to Amazon gift card spam, while the second led to porn.

Google says it constantly updates its systems and ranking algorithms to protect against spam, and it appears the company has removed these SGE results since they were flagged.

Last month, Google's Gemini AI image generation feature was paused after the tool produced images showing people of color and women in historically inaccurate contexts, such as Nazi-era German soldiers and the Founding Fathers.