Did you buy an iPhone because you were concerned about your privacy? Think again: Siri may be quietly shattering your trust.
Apple has built a reputation as an untainted trademark of privacy, especially after the many reports of privacy breaches that keep hitting other platforms. Those reports have made users so conscious of their data that they are willing to pay any price to keep it safe and the prying eyes out. But all is not well with Apple’s assistant, Siri.
Hear us out: Siri is secretly listening in.
A report by The Guardian says that “Apple contractors regularly hear confidential medical information, drug deals, and recordings of couples having sex, as part of their job providing quality control, or ‘grading’, the company’s Siri voice assistant.”
So, some of your conversations with Siri do leak out, and those recordings are later sent to contracted workers for review. According to Apple, this process exists solely to improve the quality and accuracy of Siri’s responses. But doesn’t it also open the door to a breach?
What’s more, this practice is not mentioned anywhere in Apple’s consumer-facing privacy documentation.
Data privacy is proving to be a myth as we walk into the future.
In response to the accusations, Apple said, “A small portion of Siri requests are analysed to improve Siri and dictation. User requests are not associated with the user’s Apple ID. Siri responses are analysed in secure facilities and all reviewers are under the obligation to adhere to Apple’s strict confidentiality requirements.” The company also claims it listens to only 1% of Siri traffic. But that 1% could still amount to 5,000,000 devices! Makes you think, doesn’t it? What if Siri listened in on your last conversation? What was that about?
How secure these quality reviews really are, only time can tell. So, what are your views on the topic? Share them in the comments section below.