Siri does listen to a 'small portion' of conversations, says Apple


It started with Alexa, was followed by Google Assistant, and now it continues with Siri: all of these voice assistants have been eavesdropping on our conversations. The Guardian, in a recent report, revealed that Apple listens to its customers' conversations. Contractors working for the company revealed that they listen to Siri recordings, and the reason given was quality control, or "grading." This eavesdropping is apparently done to test and grade Siri: whether its replies are helpful, whether its solutions to a query are correct, and whether it was activated accidentally or deliberately. The company claimed that less than 1 percent of Siri's daily activations are reviewed, that no recording is associated with an Apple ID, and that only a small portion of each conversation is used for review and grading.


“A small portion of Siri requests are analyzed to improve Siri and dictation. User requests are not associated with the user’s Apple ID. Siri responses are analyzed in secure facilities and all reviewers are under the obligation to adhere to Apple’s strict confidentiality requirements.”

An anonymous contractor, however, said the matter deserves more attention, given how often Siri is activated accidentally and ends up recording extremely personal conversations. Conversations between doctors and patients, business deals, and even seemingly criminal dealings have been recorded, which heightens concern about this eavesdropping.


“You can definitely hear a doctor and patient, talking about the medical history of the patient. Or you’d hear someone, maybe with car engine background noise – you can’t say definitely, but it’s a drug deal … you can definitely hear it happening. And you’d hear, like, people engaging in sexual acts that are accidentally recorded on the pod or the watch,” a contractor told the publication.


Apple's voice assistant is activated with the wake phrase "Hey Siri." People mostly associate Siri with iPhones and MacBooks, but it can also be activated through the Apple Watch and the HomePod. The contractor said that workers are encouraged to report accidental activations of Siri, but are given no procedure for handling the accidentally recorded conversations, even when those conversations are highly private. As the contractor put it: "We’re encouraged to hit targets and get through work as fast as possible."


The fear of this sensitive information being misused is what led the contractor to go public about the recordings. The contractor also said that it is not difficult to identify a person from a recording, as names and addresses are sometimes captured accidentally as well.


Apple's approach also differs from Amazon's and Google's: Apple offers no way to opt out of this human review short of disabling Siri entirely, whereas Alexa and Google Assistant let users opt out of some uses of their voice recordings while still using the assistant. As the contractor put it: “Apple is subcontracting out, there’s a high turnover. It’s not like people are being encouraged to have consideration for people’s privacy or even consider it. If there were someone with nefarious intentions, it wouldn’t be hard to identify [people on the recordings].”
