Apple is widely known as a tech company that respects the privacy of its users. But a recent story concerning Siri may have dented Apple’s reputation as a vanguard of user privacy. Apple halted its Siri response grading program this week in order to review its processes and determine how to better protect user privacy. For context, Apple says it reviews less than 1 percent of total Siri requests.
To be fair, it isn’t an extraordinary scandal. But for the more privacy-conscious among us, it did feel like Apple may have let us down a bit.
Here’s what you should know — and how to make Siri a bit more private going forward.
According to several contractors who came forward as whistleblowers to The Guardian, Apple employs human contractors who manually review Siri interactions for accuracy.
These contractors are located around the world and listen to Siri voice data collected from customers to improve Siri so that the digital assistant can better understand incoming commands and questions in the future.
But, as with any manually reviewed voice data, there’s always the potential for sensitive information to leak. One contractor said he sometimes overhears discussions between doctors and patients, sexual encounters, and business deals.
And even if a particular voice file isn’t associated with a specific user (it isn’t in Apple’s case), there’s always the chance that a voice recording could contain a user’s first and last name or address.
Worryingly, the contractor also told The Guardian that accidental triggers are common. While Apple encourages contractors to report accidental triggers, there’s apparently no policy in place for dealing with sensitive content.
In a statement to The Guardian, Apple said that it manually reviews only a “small portion,” less than 1 percent, of Siri requests. The company added that contractors review these requests in secure facilities and are required to adhere to Apple’s stringent privacy requirements.
It’s worth noting that Apple has long disclosed this practice in its security white papers. The company has never tried to hide the fact that humans review voice data.
But according to the whistleblower, some of Apple’s contractors are concerned about the “lack of disclosure.” In other words, the fact that the average Apple user isn’t aware of this policy.
While Apple’s products are typically built with privacy in mind, it’s helpful to know some of the specifics of the company’s Siri-related privacy policies.
Here’s a quick run-through.
- Some data that’s used to improve Siri, like the music you listen to and your search queries, is sent to Apple’s servers.
- All data that’s sent to Apple is encrypted and anonymized and is not associated with your Apple ID.
- That data, however, is associated with a unique device identifier.
- If Location Services is enabled, your device’s exact location may be sent to Apple to improve Siri results.
- While it isn’t well-known, Apple has always said that a certain portion of Siri interactions will be manually reviewed.
It’s that last portion that has Apple in the spotlight, of course.
While Apple has never misled the public about Siri’s privacy, there’s an argument to be made that it could have been more transparent about the fact that human contractors do listen to Siri requests.
And because of the potential for a user to accidentally wake up or trigger Siri, there’s always a privacy risk when users take advantage of the digital assistant.
Additionally, Apple saves user voice data for a six-month period to improve Siri’s voice recognition. After that, Apple saves some recordings without any device or user identifier. The company says it keeps a small number of completely anonymized recordings indefinitely for ongoing Siri improvement.
How to tighten down Siri
Apple doesn’t monetize your data and it goes a bit further toward respecting your privacy than other technology companies.
On the other hand, extremely privacy- or security-conscious individuals may still want to shore up Siri’s privacy. Apple lets you do that. Here’s how.
Reset your device identifier
While Siri recordings aren’t associated with your name or Apple ID, they are linked to your device.
Apple says this random identifier rotates regularly, but you can also reset it manually at any time. Here’s how.
- Go to Settings.
- Tap on Siri & Search.
- Tap the toggles next to Listen for “Hey Siri” and Press (Button) for Siri so that they’re off.
- Now, go back to Settings and navigate to General —> Keyboard.
- Tap the toggle next to Enable Dictation so that it’s off.
Once you do both of these things, Apple says it deletes the device identifier and essentially “resets” your device’s relationship with Siri. It also deletes User Data and recent voice data input.
Note that this may impact how Siri works on your particular device. It may be less “intelligent” and not as great at predicting and answering actions or requests.
Restrict Siri’s access to services
In addition to the device identifier, a range of other data is sometimes sent to Apple’s servers.
That could include your device’s exact location, depending on how you use Siri.
Apple may also share some data about your third-party app usage if you’ve allowed those apps to access Siri. Lastly, your iCloud account will sync Siri personalizations across your various devices by sending them to a server.
You can, luckily, control all of this behavior.
- Restrict Location Services for Siri. Go to Settings —> Privacy —> Location Services. Tap on Siri & Dictation and select Never from the options.
- Restrict Siri for third-party apps. Go to Settings —> Siri & Search. Scroll down and disable Siri for some of the apps that appear in the list below the native options.
- Disable iCloud Siri syncing. Open the Settings app and tap on your Apple ID card at the top (it should show your name). Tap on iCloud. Scroll down and tap the toggle next to Siri to disable personalization syncing across your iCloud-connected devices.
Stop Apple from logging Siri requests on its servers
If you’d like to go a step further than the previous methods and stop Siri data from being sent to Apple’s servers entirely, you can do that.
It takes a bit more work, however: you’ll need to create a mobile configuration profile for your iPhone or iPad, and you’ll need to own or have access to a Mac.
- Download Apple’s Configurator app from the Mac App Store.
- Open Configurator.
- In the top menu bar, click on File and then New Profile.
- The app will now walk you through some general setup tips. Follow them.
- When you can, click on Restrictions in the sidebar.
- Find the checkbox next to Allow server-side logging of Siri commands and make sure it’s unchecked.
- You can now save the profile as a .mobileconfig file by clicking on File —> Save.
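For reference, a .mobileconfig file is just an XML property list containing a Restrictions payload. Here’s a rough sketch of what the saved profile looks like; the restriction key shown is a placeholder (we haven’t verified the exact key Configurator writes for the “Allow server-side logging of Siri commands” checkbox), and the identifiers and UUIDs are made up, so treat this as illustrative rather than something to install as-is.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN"
  "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <key>PayloadType</key>
    <string>Configuration</string>
    <!-- Hypothetical identifier and UUID; Configurator generates its own -->
    <key>PayloadIdentifier</key>
    <string>com.example.disable-siri-logging</string>
    <key>PayloadUUID</key>
    <string>00000000-0000-0000-0000-000000000001</string>
    <key>PayloadVersion</key>
    <integer>1</integer>
    <key>PayloadDisplayName</key>
    <string>Disable Siri Server-Side Logging</string>
    <key>PayloadContent</key>
    <array>
        <dict>
            <!-- Restrictions payload -->
            <key>PayloadType</key>
            <string>com.apple.applicationaccess</string>
            <key>PayloadIdentifier</key>
            <string>com.example.disable-siri-logging.restrictions</string>
            <key>PayloadUUID</key>
            <string>00000000-0000-0000-0000-000000000002</string>
            <key>PayloadVersion</key>
            <integer>1</integer>
            <!-- Placeholder key: in practice, use whatever key Configurator
                 writes when you uncheck "Allow server-side logging of
                 Siri commands" -->
            <key>allowAssistantServerLogging</key>
            <false/>
        </dict>
    </array>
</dict>
</plist>
```

The easiest way to get the real key names right is to let Configurator save the profile for you, as in the steps above, and then open the resulting file in a text editor to inspect it.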
To install the mobile configuration profile, just send it to yourself as an email attachment. Open the email on your iOS device and tap the attachment to install the profile.
If you don’t have a Mac or you don’t want to go through the effort, security researcher Jan Kaiser has created a mobileconfig file that does exactly the same thing. Normally, we don’t recommend installing random mobileconfigs, but Kaiser is a professional cybersecurity expert. In other words, you can probably trust it.
You can download Kaiser’s mobileconfig file at this GitHub link.
Mike is a freelance journalist from San Diego, California.
While he primarily covers Apple and consumer technology, he has past experience writing about public safety, local government, and education for a variety of publications.
He’s worn quite a few hats in the journalism field, including writer, editor, and news designer.