A Guardian report last week found that Apple has employed contractors to listen to the recordings, which sometimes included highly personal data such as medical information, drug deals or the sound of people having sex. While only a small proportion of Siri recordings are passed on to the contractors, Apple had not disclosed this in its public-facing privacy documentation.
“While we conduct a thorough review, we are suspending Siri grading globally,” an Apple spokeswoman said in a statement, adding that in a future software update, users will be able to opt out of the program.
The revelations could be damaging to the company, which has positioned itself as more privacy-orientated than rivals such as Google. In June it launched a new ‘Sign In With Apple’ feature that allows web users to more easily protect their personal information online.
Siri, Apple’s voice assistant, allows users to operate their iPhones hands-free, sending messages, making calls and opening applications through voice commands alone. In an effort to improve the voice assistant’s responses, contractors graded Siri’s answers to user queries, The Guardian reported. They also assessed whether the response was triggered accidentally, without a deliberate query from the user, the newspaper said.
Meanwhile, Google has also agreed to stop listening to voice commands heard through Google Assistant, which is used on its Home devices and comes as part of Android. It was recently revealed that the search giant had been employing people to listen to user recordings in order to improve its speech recognition.
Google’s decision applies only to recordings in the EU, and the cessation will last three months. It was taken after Germany’s data protection commissioner launched an investigation into the practice.
“The use of automatic speech assistants from providers such as Google, Apple and Amazon is proving to be highly risky for the privacy of those affected,” it said.
In April, Bloomberg revealed that thousands of workers were employed to listen to voice commands given to Amazon’s Alexa virtual assistant in order to improve its natural language processing abilities. These recordings reportedly captured embarrassing and sometimes disturbing moments, at times without a wake word having been spoken, such as singing in the shower or possible sexual assaults.