The more prurient of the reports tell of iPhones listening in on couples having intercourse; whether this was advertent or inadvertent is not clear. What does appear to be true, however, is that contract workers at Apple Inc. have been given carte blanche to listen to us via Siri.
Conversations picked up by Siri include private exchanges between physicians and patients, (presumably) illegal activities, and more general chatter. We are told the eavesdropping helps Apple maintain “quality control” of its artificial intelligence software. And I suppose I understand the rationalization: it’s gotta learn somehow.
Siri – Did I Give You Permission?
The British newspaper The Guardian first broke this story and approached one of the contractors monitoring thousands of private conversations.
“A small portion of Siri requests are analyzed to improve Siri and dictation,” Apple responded. “User requests are not associated with the user’s Apple ID. Siri responses are analyzed in secure facilities and all reviewers are under the obligation to adhere to Apple’s strict confidentiality requirements.”
The contractor said that this quality control covered less than 1 percent of daily Siri usage. Back in 2015, when AI was in its relative infancy, Siri was estimated to receive about a billion requests per month. One percent of that works out to roughly 10 million recordings a month that contractors could go through. And since usage has obviously increased since then, the real number today is likely higher.
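The back-of-envelope arithmetic above can be sketched as follows. The figures (one billion requests per month, a 1 percent review rate) come from the estimates quoted in this article, not from any disclosure by Apple:

```python
# Back-of-envelope estimate of how many Siri recordings contractors
# could plausibly review each month, using the figures cited above.

MONTHLY_REQUESTS_2015 = 1_000_000_000  # ~1 billion Siri requests/month (2015 estimate)
REVIEW_FRACTION = 0.01                 # "less than 1 percent" of usage reviewed

reviewed_per_month = MONTHLY_REQUESTS_2015 * REVIEW_FRACTION
print(f"Recordings reviewed per month: about {reviewed_per_month:,.0f}")
# → Recordings reviewed per month: about 10,000,000
```

Even at well under 1 percent, the absolute number of private recordings passing through human reviewers is enormous.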
A whistleblower working for one of Apple’s contractors told The Guardian: “There have been countless instances of recordings featuring private discussions between doctors and patients, business deals, seemingly criminal dealings, sexual encounters and so on. These recordings are accompanied by user data showing location, contact details, and app data.”
The confidentiality Apple alluded to therefore appears to be illusory. The contractors know exactly who the parties are on both ends of the phone, and apparently no provisions were in place to protect highly confidential information. Since no one gave Apple permission to eavesdrop, this sets up quite an ethical quandary.
‘Worse’ than Alexa?
A few months ago, it was discovered that Amazon’s listening device was indeed capable of listening in on our conversations. The privacy we imagined — the device “turning off” immediately after we asked a specific question — did not hold up. The device was capable of listening in simply because “it wanted” to.
The difference here may well be in its specificity. “Alexa” is not aware of who is talking, only that someone is asking a question.
However, Apple contractors have a very good idea of who is asking a question. In addition, they know whom that party is talking to at any given time. The chances for abuse boggle the mind. Those fixated on the sexual confidentiality aspects might find such conversations titillating, but they are relatively benign compared to the medical, financial, marketing, market research, political, or other information that could potentially be bundled.
Given the sophistication of voice recognition software and its ability to lock onto keywords or phrases, any number of information collection tools could be used to build a profile of the user, up to and including employment desirability, medical status, legal status, and information that would be illegal to ask for in a number of situations.
Apple, of course, had a clear-cut choice: to ask us for permission to record us, or not. They chose not to tell us. This lack of oversight is highly troubling. While their need to collect this data may stem from noble intentions, I am curious: who monitors those who monitor? Had they asked, I believe most of us would have said, “No thank you!”
If Apple could rationalize recording our Siri data as harmless, why not tell us? Artificial intelligence, as it may turn out, is not the great culprit here, but rather those with human intelligence and no ethical compass.