Apple Offers Rare Apology Over Siri Voice Recordings And Promises User Privacy Changes

Apple is a company that prides itself on protecting user privacy, and it has gone so far as to call out the competition with huge billboards to drive that point home. However, Apple is not exactly without fault when it comes to protecting user privacy, as we found out last month when a whistleblower brought attention to the fact that humans were listening in on the Siri recordings of Apple users.

Now, Apple has issued an apology over the fiasco that received global attention. "As a result of our review, we realize we haven’t been fully living up to our high ideals, and for that we apologize," wrote the company in a statement posted on its website today.

"At Apple, we believe privacy is a fundamental human right," the company continued. "We design our products to protect users’ personal data, and we are constantly working to strengthen those protections. We heard [our customers'] concerns, immediately suspended human grading of Siri requests and began a thorough review of our practices and policies."

Apple Billboard

The reason Apple had human contractors in place was to "grade" recordings to see whether Siri actually understood a user's request properly. However, at no time was it made clear to Apple customers that humans would listen to what they were saying to Siri, or to whatever was taking place in the background.

According to the whistleblower, Apple contractors working in Ireland were listening to as many as 1,000 Siri recordings per day. In the report, published by The Guardian in late July, the whistleblower said contractors would encounter “countless instances of recordings featuring private discussions between doctors and patients, business deals, seemingly criminal dealings, sexual encounters and so on.”

Apple has said that it is making changes to Siri to improve user privacy and will be more transparent about any data collection. The company will make these three primary changes going forward:

  • First, by default, we will no longer retain audio recordings of Siri interactions. We will continue to use computer-generated transcripts to help Siri improve. 
  • Second, users will be able to opt in to help Siri improve by learning from the audio samples of their requests. We hope that many people will choose to help Siri get better, knowing that Apple respects their data and has strong privacy controls in place. Those who choose to participate will be able to opt out at any time. 
  • Third, when customers opt in, only Apple employees will be allowed to listen to audio samples of the Siri interactions. Our team will work to delete any recording which is determined to be an inadvertent trigger of Siri.

Apple isn't the only tech company to get caught with its hands in the cookie jar when it comes to grading voice recordings without properly informing customers. Google, Facebook, and Microsoft have also acknowledged the practice and have suspended their programs until further notice.

(Apple billboard image courtesy CTV News Toronto)