Conversations with Siri are not as private as you think

It started with Amazon, Google followed soon afterwards, and now Apple has joined them. It would appear that just about every company with a digital assistant such as Siri uses real humans to listen to and evaluate the conversations that owners have with their smartphones and speakers.

A whistleblower recently told the Guardian that this practice is also followed at Apple, and the company subsequently acknowledged that “a small portion of Siri requests” are studied by contracted workers. The company denied that these recordings are linked to any specific Apple ID.

According to the newspaper’s anonymous source, however, data such as contact details and location are included with these recordings.

Apple added that “Siri responses are analysed in secure facilities and all reviewers are under the obligation to adhere to Apple’s strict confidentiality requirements.” The company further said that less than 1% of Siri requests are analysed on any specific day.



The Guardian’s anonymous source revealed that many of these recordings come from owners accidentally activating the service. Conversations about drug deals, medical details and sexual encounters have all reportedly been recorded and studied.

As is the case with Google Assistant and Amazon Alexa, the goal of the whole review process is to boost accuracy. Apple says that its employees must grade the recorded clips, which are normally only a few seconds long, based on whether Siri responded appropriately or not.

Nevertheless, most people might not be comfortable knowing that real humans could be listening to their conversations with Siri, whether intentional or accidental. At the time of writing, Apple does not give device owners a way to opt out of this review process.

Apple has always prided itself on how much it values user privacy, so don’t be too surprised if the whole Siri review process is made more anonymous after this. Until then, be careful what you say when Siri is within hearing distance.

About the author

Chris
