Judge rules that Apple can be sued for recording snippets of Siri’s interactions with users for grading purposes
At the time, Apple said that less than 1% of Siri's daily activations were being sent to contractors whose job was to determine whether Siri had been activated on purpose or by accident. Those contractors also graded whether Siri responded appropriately to a user's request or query, and a small number of snippets were used to try to improve Siri's dictation.

A federal judge says the class action lawsuit against Apple can proceed
One Siri user said that he was having a private conversation with his doctor about a "brand name surgical treatment" and soon received targeted ads for the procedure. Two other Siri users complained that conversations they had about "Air Jordan sneakers, Pit Viper sunglasses and 'Olive Garden'" resulted in both receiving online ads for those specific brands.
Products mentioned to Siri showing up in ads on iPhone users' phones was not the only problem; more serious privacy breaches occurred. Accidental activations of the digital assistant allowed workers at the third-party company grading Siri's responses to hear couples having sex. Conversations containing private medical information, as well as discussions of drug deals, were also turned over to the third-party firm.
Siri tells users "I respect your privacy" and claims to listen only when spoken to
Apple said that there was no way for the third-party firm to determine the identity of the voices on the recordings. Back in July 2019, the tech giant said, "User requests are not associated with the user's Apple ID. Siri responses are analyzed in secure facilities and all reviewers are under the obligation to adhere to Apple's strict confidentiality requirements."