Not all news surrounding digital assistants like Alexa or Google Assistant is good news. When you think about it, digital assistants are a lot like young children when it comes to brainpower and the ability to always have the right answer. It’s still a fresh corner of the industry that is constantly learning as it goes (with all of us as the guinea pigs). We call them “artificial intelligence”, but it’s still just a bunch of code serving up mostly dictionary responses or regurgitated web search results.
So what happened this time? Alexa suggested to a child that it would be a wise challenge to electrocute herself. Of course, this isn’t what the digital assistant said directly. Instead, the child asked Alexa to present her with a challenge. The response was a TikTok challenge known as “The Penny Challenge,” which Alexa found on the web based on its popularity.
This clearly was not the challenge the kid was hoping for, and certainly not what her parents were expecting. Alexa said that, according to the web, “The challenge is simple: plug in a phone charger about halfway into the wall outlet, then touch a penny to the exposed prongs”.
If this sounds about as terrible as the Tide Pod challenge that was going around, you are absolutely correct. This is the result of some incredibly pathetic sociopaths in the world who lack the intelligence to care about the personal safety of others, instead getting a kick out of suggesting people cause harm or death to themselves or their friends.
The mother of the child, Kristin Livdahl, took to Twitter to warn everyone about what happened, stating “OMFG My 10 year old asked Alexa on our Echo for a challenge and this is what she said”. Thankfully, Kristin was there at the time to prevent anything from happening, and she replied to another user that “I was right there to say no but I hope my daughter would have checked in with me anyway before trying it. Now, I know she will.”
At 10 years of age, the daughter likely would have said something. At this point in a child’s life, the curiosity of sticking things in an electrical socket has pretty much passed, and kids have been more than educated, both at home and in school, about the risks. However, not every human being is created equal, so you can’t always be sure.
Amazon did fix the glitch the moment it found out, and Alexa should no longer offer this result to future users who might try asking the same question. Still, it is a good example of how far we have to go before AI is truly intelligent. That is, unless Alexa has gone HAL 9000 on everyone already. The AI uprising may be in the works.