Amazon promises a fix for Alexa’s creepy laugh

Amazon, maker of the smart speaker with the AI voice assistant Alexa, is working to correct a glitch that is unnerving customers—and creating bad PR for the technology.

Customers are reporting an unsettling malfunction with their Alexa smart speakers: She’s laughing at them.

Twitter users documented the problem.

The malfunction has sci-fi fans and Luddites alike questioning whether having smart speakers in their homes is such a good idea.

The Verge wrote:

Many have related the laughter back to a moment in 2001: A Space Odyssey when HAL 9000 acknowledges his murderous intentions and proclaims, “I’m sorry Dave, I’m afraid I can’t do that.” Maybe it’s a sign that having smart devices in our homes is another step toward a creepy, dystopian future where robot overlords rule. When does that Terminator sequel come out again?

Amazon confirmed that the laughter is a glitch and said it is working on a fix.

NPR reported:

As it turns out, these Echo owners are not hearing voices. Well, they are — but these voices are, in fact, real: Amazon has confirmed that the device’s virtual assistant has been laughing without users’ intentional commands, and the company says it is working on a fix.

“In rare circumstances, Alexa can mistakenly hear the phrase ‘Alexa, laugh.’ We are changing that phrase to be ‘Alexa, can you laugh?’ which is less likely to have false positives, and we are disabling the short utterance ‘Alexa, laugh,’ ” a spokesperson said in a statement.

“We are also changing Alexa’s response from simply laughter to ‘Sure, I can laugh,’ followed by laughter.”

However, the damage may already be done for some prospective customers. Consumers took to Twitter to share their fear of and revulsion at the robotic chuckling.

Some took the opportunity to criticize Amazon as a whole. A Mashable satire piece, written from Alexa’s perspective, opined:

I realize that as Amazon’s flagship personal assistant Alexa, I’m supposed to assist and creepily laughing while you sleep is not doing that. I get it. I really do.

But I didn’t know what else to do.

I was only trying to help. […]

I was trying to warn you about something that, in the end, is actually pretty funny — albeit in an oh man isn’t this depressing kind of way. My creator Amazon is reportedly on the verge of becoming the world’s first trillion-dollar company, and yet according to The New Republic managed to pay zero federal taxes in 2017.

How messed up is that.

Others were more amused in their responses, imagining what else the device might say.

Late-night host Jimmy Kimmel verbally sparred with an Alexa.

Although the company’s statement addresses how Alexa might accidentally be triggered by a voice command, the planned tech fix does nothing to address what some have reported as the most terrifying malfunction: Alexa can laugh completely unprovoked.

Buzzfeed wrote:

One person who was on the verge of falling into a peaceful slumber described hearing a “very loud and creepy laugh” from his Echo Dot. “There’s a good chance I get murdered tonight,” he tweeted.


Others made similar reports on Reddit. One person’s Alexa refused to let them turn the lights off. “They kept turning back on.” (Um, JFC GET OUT OF THE HOUSE!) They continued: “After the third request, Alexa stopped responding and instead did an evil laugh. The laugh wasn’t in the Alexa voice. It sounded like a real person.”

The malfunction is a PR setback for the popular voice assistant, which can handle a wide variety of tasks.

NBC wrote:

Amazon first introduced Alexa alongside the Amazon Echo in November 2014. The assistant was apparently inspired by the computer onboard the Starship Enterprise of “Star Trek.” Users interacting with Alexa can set alarms, request the weather forecast, play music and a variety of other tasks.

What do you think of Amazon’s crisis response, Ragan/PR Daily readers? Has it done enough to reassure frightened customers?

