Amazon employs thousands of people to listen to Echo recordings


[Image: second-generation Echo Dot, in black and white]
Amazon helped pioneer the smart speaker market.
Photo: Amazon

Amazon employs a team of thousands of people to listen to recordings made by Echo devices, a new report claims.

These recordings are transcribed, annotated, and ultimately used to “help improve” the quality of Amazon’s smart speakers.

Listening to 1,000 audio clips per shift

Bloomberg notes that this team is a “mix of contractors and full-time Amazon employees” working around the world. Alexa auditors typically listen to as many as 1,000 audio clips per nine-hour shift.

The work they do is covered by a non-disclosure agreement, but the reporters behind the story managed to glean some details. These include some ethical conundrums, among them recordings that seemingly were captured without a purposeful “Hey Alexa” command.

“Occasionally the listeners pick up things Echo owners likely would rather stay private: a woman singing badly off key in the shower, say, or a child screaming for help. The teams use internal chat rooms to share files when they need help parsing a muddled word—or come across an amusing recording.”

Elsewhere, Bloomberg’s report notes that:

“Sometimes they hear recordings they find upsetting, or possibly criminal. Two of the workers said they picked up what they believe was a sexual assault …  [T]wo Romania-based employees said that, after requesting guidance for such cases, they were told it wasn’t Amazon’s job to interfere.”

Amazon says that it listens to only a tiny portion of the overall smart speaker requests it receives, and that employees don’t have direct access to identifiable information about users. Bloomberg claims, however, that Alexa auditors receive an account number, device serial number and the user’s first name.

How does Amazon compare?

Apple has stressed the privacy-first focus of Siri, the voice assistant on iOS devices, the Mac and the HomePod smart speaker. Bloomberg’s report suggests that Siri isn’t entirely different in its operation, however. It writes that:

“Apple’s Siri also has human helpers, who work to gauge whether the digital assistant’s interpretation of requests lines up with what the person said. The recordings they review lack personally identifiable information and are stored for six months tied to a random identifier, according to an Apple security white paper. After that, the data is stripped of its random identification information but may be stored for longer periods to improve Siri’s voice recognition.”

Google, for its part, also uses human reviewers to listen to audio snippets, much as Amazon does. However, these audio samples are not associated with any personally identifiable information, and the audio is distorted to safeguard the speaker’s identity.

Source: Bloomberg
