Thomas Le Bonniec, a former French contractor who worked on a program to improve Apple’s voice assistant, Siri, has written to the European data protection authorities to denounce what he calls a “massive violation of the privacy of millions of citizens”. Le Bonniec told Il Fatto Quotidiano he is acting for ethical reasons, after listening to deeply personal and sometimes very disturbing conversations recorded by Siri.
You are in a car with a loved one, discussing strictly personal matters. You are in a bedroom or maybe in your living room playing with the children. You are a journalist meeting a source, a doctor examining a patient, or a lawyer meeting a client. If you have an iPhone, an Apple Watch or even an iPad near you, the risk that your most intimate or professionally sensitive conversations may be listened to is not science fiction. These electronic devices have a voice assistant: it’s called Siri. It can help you find the nearest restaurant, or do an internet search for a specific medical condition.
In theory, Siri should be activated only when you specifically request it by saying: “Hey Siri”. In practice, the risk of triggering it accidentally is very real. How many people have been listened to without their knowledge? So far, the precise data needed to answer that question has been lacking. But now things could change: a young former Apple contractor has just submitted an official request to the European data protection authorities asking them to investigate Siri. This could be a problem for the Silicon Valley company which, after Edward Snowden’s revelations, has cultivated the image of a privacy-conscious giant.
The Siri case first surfaced in July 2019, when the Guardian revealed the issue thanks to anonymous sources. Now, only two years after the General Data Protection Regulation (GDPR) entered into force, a former Apple contractor has taken the risk of stepping forward to ask the European data protection authorities for an investigation. His name is Thomas Le Bonniec, a French national with an MSc in sociology, and he told Il Fatto Quotidiano that he is acting for ethical reasons, after having heard extremely private and, in some cases, very disturbing conversations, such as a pedophile discussing his fantasies.
“Between May 13th, 2019 and July 16th, 2019, I was hired by Globe Technical Services, one of Apple’s subcontractors, in Cork, Ireland. In this context I was assigned to the Siri transcription project (called ‘Bulk Data’)”, writes Le Bonniec in his letter. “I listened to hundreds of conversations every day,” he writes, clarifying that “These recordings were often taken outside of any activation of Siri, i.e. without an actual intention from the user to activate it for a request. These processings were made without users being aware of it, and were gathered into datasets to correct the transcription of the recording made by the device”.
Le Bonniec writes that “The recordings were not limited to the users of Apple devices, but also involved relatives, children, friends, colleagues”, so he heard “people talking about their cancer, referring to dead relatives, religion, sexuality, pornography, politics, school, relationships or drugs with no intention of activating Siri whatsoever”.
The work of human operators listening to the instructions users give to Siri is necessary to improve the performance of the voice assistant, a process technically called “grading”. According to Thomas Le Bonniec, technologies like Siri would not be able to perform the required tasks without human intervention. This does not mean, of course, that every user request is heard by a human being standing by to carry it out; rather, it means that the assistant’s ability to understand requests is refined thanks to human work.
Il Fatto Quotidiano asked Apple how many conversations have been listened to in this grading program. The company replied: “Apple does not provide these numbers”. After the Guardian revealed the case last summer, the company introduced a series of changes. Today Apple claims that “By default, Apple does not retain audio recordings of Siri interactions. Users can opt-in to help Siri improve by learning from audio samples of their requests. These audio samples are limited to users who have ‘opted-in’ to share audio samples of their interactions to help improve Siri”.
In his letter to the European data protection authorities, Thomas Le Bonniec claims that “Nothing has been done to verify if Apple actually stopped the programme” and he offers his cooperation to the authorities “to provide any element to substantiate these facts”. Le Bonniec told Il Fatto Quotidiano that he has collected multiple screenshots to document his claims. “The risk I am taking will be worth it only if this letter is followed by a proper investigation and actions from your side”, he wrote to the European authorities.