Alexa tells 10-year-old child to perform a deadly 'challenge'

Polycount

Posts: 3,005   +589
Staff member
In context: We've all heard stories about the funny, crazy, or otherwise entertaining things the world's most popular virtual assistants are capable of. Siri can famously give you directions to the nearest body disposal site, whereas Alexa can fart on command. However, as one Echo owner discovered on Sunday, some of the fun activities these AI helpers are known for can have disastrous outcomes.

Echo owner Kristin Livdahl took to Twitter on Sunday to share an odd story with the world: that very day, her 10-year-old child asked Alexa for a challenge to perform. Instead of suggesting something relatively harmless, like answering a riddle, solving a math problem, or performing some minor physical stunt, Alexa advised them to "plug in a phone charger about halfway into a wall outlet, then touch a penny to the exposed prongs."

We probably don't have to tell you that this sort of 'challenge' is incredibly dangerous to try. Pennies are conductive, so touching one to the exposed prongs of a plug in a live electrical outlet could pose a serious risk to one's life. Naturally, most adults who see this "challenge" would chuckle and shrug it off, but young children don't always possess that same restraint or knowledge of potential consequences.

Even if they do, they might be tempted to assume the challenge wouldn't be offered if it wasn't safe.

To be clear, we're not saying Alexa somehow went rogue here and maliciously chose a harmful challenge. The virtual assistant pulls most of its results from the web, and it just so happened to pick a dangerous one in this case. Internet sleuths determined that the assistant grabbed the challenge from an article that was discussing its dangers, so the AI simply wasn't able to parse the broader context.

Amazon later confirmed that the incident was legitimate and says it has already taken "swift action" to fix it. Judging by Twitter replies to Livdahl's original post, it sounds like Amazon has completely disabled the "tell me a challenge" feature for now, which is probably for the best.

Masthead credit: Tom's Guide


VitalyT

Posts: 6,173   +6,656
I tell my Alexa to perform a deadly challenge every time it starts explaining something to me. I tell her to "drop dead", but she's like Terminator 3 - "I'm unable to comply".
 

YouShallNotPass

Posts: 26   +63
I did something similar with a knife and an electrical plug when I was young. Nobody asked me to do it. I was just curious. The effect is unforgettable. Besides the spark, the knife got two holes at the contact points.
 

umbala

Posts: 603   +1,007
I spent years pointlessly opposing smart assistants and this article just got me aroused.
If an article talking about a child nearly getting killed gets you aroused maybe you should seek professional help.

Also, should we ban all smart assistants because of one incident like this? Should we ban all cars after the first car crash, or ban all guns after the first person gets accidentally shot?
 

captaincranky

Posts: 18,529   +7,374
Electricity takes the path of least resistance to ground. In this case, that's through the penny, from one prong to the other, and not through the daredevil. It's most likely not "lethal" per se, but it could cause one to require a change of underwear.

OTOH, if you have a pacemaker implanted, you'll definitely void the warranty.
 

Uncle Al

Posts: 8,669   +7,575
Since it and its responses are programmed by people, and you can bet your bottom dollar all of those responses are documented extensively, the individual who created that extreme danger should be surrendered to the police and charged, prosecuted, and given a maximum sentence. The excuse that they were just playing around or really didn't mean it won't, and shouldn't, get them off. If they are old enough to have a job, they are plenty old enough to be responsible and accountable for their actions. If they think it's no big deal, then let's duplicate the stunt and let THEM hold the penny ... while standing in a wet bathing suit in a pool of water. The best lessons in life are the ones you will never repeat.
 

ZackL04

Posts: 780   +590
If an article talking about a child nearly getting killed gets you aroused maybe you should seek professional help.

Also, should we ban all smart assistants because of one incident like this? Should we ban all cars after the first car crash, or ban all guns after the first person gets accidentally shot?

Almost getting killed? Do we know if they even attempted it?

Try it

I guarantee it won't kill you.

I'm a residential electrician; I get shocked like this a handful of times every few months. No real harm.
 

Gezzer

Posts: 250   +124
If an article talking about a child nearly getting killed gets you aroused maybe you should seek professional help.

Also, should we ban all smart assistants because of one incident like this? Should we ban all cars after the first car crash, or ban all guns after the first person gets accidentally shot?

Ban? Of course not. Take a close look at and correct the issues? Makes sense to me. The problem for me is that it's pretty easy to figure out why a car exploded and make a design change. It's more difficult to prevent a "smart" appliance from making an unforeseen mistake. It's why the military doesn't want autonomous drones firing ordnance of any kind. AI is fine at making decisions when the results are already foreseen, but there's no way to predict the results when they aren't, and that's the big problem IMHO.