Apple Makes Changes to Siri Grading →

Earlier this month, Apple suspended its program that used audio samples collected from Siri interactions to improve the service. The program came to light when The Guardian broke the story back in July:

Apple contractors regularly hear confidential medical information, drug deals, and recordings of couples having sex, as part of their job providing quality control, or “grading”, the company’s Siri voice assistant, the Guardian has learned.

Although Apple does not explicitly disclose it in its consumer-facing privacy documentation, a small proportion of Siri recordings are passed on to contractors working for the company around the world. They are tasked with grading the responses on a variety of factors, including whether the activation of the voice assistant was deliberate or accidental, whether the query was something Siri could be expected to help with and whether Siri’s response was appropriate.

Apple says the data “is used to help Siri and dictation … understand you better and recognise what you say”.

Today, Apple is addressing the story directly, making several changes to this program:

First, by default, we will no longer retain audio recordings of Siri interactions. We will continue to use computer-generated transcripts to help Siri improve.

Second, users will be able to opt in to help Siri improve by learning from the audio samples of their requests. We hope that many people will choose to help Siri get better, knowing that Apple respects their data and has strong privacy controls in place. Those who choose to participate will be able to opt out at any time.

Third, when customers opt in, only Apple employees will be allowed to listen to audio samples of the Siri interactions. Our team will work to delete any recording which is determined to be an inadvertent trigger of Siri.

These are good changes, but this is how the program should have worked from the day it started. There’s no doubt that Apple failed to live up to its own standards here.

I think the company could go further in making Siri more private, though. As I read this, having your Siri requests turned into computer-generated transcripts for review is still the default. I understand Apple needs to make Siri better, but the human review of these transcripts should be opt-in as well.

(Keeping on some of the contractors caught in the middle of this would have been good, too.)

Don’t miss the support article about Siri Grading.