Apple Agrees to Pay $95 Million Settlement in Siri Voice Assistant Privacy Class Action

Apple has agreed to pay a $95 million settlement to resolve 5-year-old claims that its voice-controlled digital assistant Siri eavesdropped on Apple product users’ conversations without their consent. 

According to a class action filed in 2019, Apple has consistently violated California’s “wiretapping” statute since the product debuted in October 2011 by “unintentionally” collecting unauthorized recordings through Siri, which mistook various sounds and words for the specific vocal triggers designed to activate it. 

If approved, the proposed settlement, filed Tuesday in Oakland federal court, will require Apple to confirm that it has scrubbed all Siri audio recordings aggregated before October 2019 within six months of the settlement date. 

Class members can expect a pro rata payout of up to $20 per Siri device, depending on the number of devices claimed, and plaintiffs’ counsel can petition the court for up to $29.6 million from the settlement fund to cover fees and costs. Apple posted quarterly revenue of $94.9 billion in its Q4 2024 earnings report. 


The original class action, filed on Aug. 7, 2019, in San Jose federal court, aired concerns about the potential misuse of voice biometric data and alleged that Apple had breached California’s Invasion of Privacy Act, California’s Unfair Competition Law, the California Consumers Legal Remedies Act and the Declaratory Judgment Act. One of the lead plaintiffs named in the complaint was a minor. 

“Plaintiffs and Class members reasonably expected, based on Apple’s representations, that Apple was not recording them unless they uttered one of the wake phrases,” it stated. 

Siri uses voice recognition software and artificial intelligence to answer questions, retrieve information and perform a variety of tasks, such as replying to texts, sending reminders or playing music on iPhones and other Apple devices. In 2014, Apple enabled Siri’s activation using the spoken phrase “Hey, Siri”; the feature also responds to the abbreviated “Siri.” Users can also manually switch Siri on by pressing and holding down the “home button” on Apple devices. 

According to court records, the claim was filed less than a month after the U.K.-based newspaper The Guardian reported, based on an anonymous source, that Apple employees routinely reviewed private user communications such as “confidential medical information, drug deals and recordings of couples having sex” as part of their “quality control” work on the Siri feature (which Apple calls “grading”). Siri recorded these details after unintentionally mistaking other noises (such as the sound of a zipper) for the “Hey, Siri” wake phrase, it said. 

In a response to The Guardian, Apple said that “less than 1% of daily Siri activations are used for grading” and “user requests are not associated with the user’s Apple ID.” 

In a statement subsequently published on its website, the company apologized for not “fully living up to our high ideals” and emphasized that it believes “privacy is a fundamental human right,” echoing sentiments espoused by CEO Tim Cook during the congressional grilling of Meta CEO Mark Zuckerberg about consumer privacy in the wake of the 2018 Cambridge Analytica scandal. It announced that it would temporarily halt the Siri “grading” process, allow users to opt in or out of providing audio recordings to Siri for training purposes and strive to “delete any recording which is determined to be an inadvertent trigger of Siri.” 

The plaintiff class, which comprises all U.S. residents whose private communications were recorded or shared with third parties without their permission after they inadvertently triggered the “Hey, Siri” function between Sept. 17, 2014, and Dec. 31, 2024, on their Apple devices, is represented by Lexington Law Group, Lowey Danneberg and Scott + Scott Attorneys at Law. The firms did not immediately respond to phone calls and emails seeking comment on Thursday. 

A similar class action against Google and its parent company, Alphabet, concerning the voice-activated Google Assistant feature is pending in the U.S. District Court for the Northern District of California, San Jose Division. Amazon’s virtual assistant Alexa has also attracted intense media scrutiny after 2019 reports that Amazon employees reviewed up to 1,000 Alexa audio clips a day. 

According to court documents, plaintiffs’ counsel has requested a Feb. 14, 2025, hearing before presiding U.S. District Judge Jeffrey White in the U.S. District Court for the Northern District of California in Oakland. 

Apple’s counsel at DLA Piper and Morrison & Foerster did not respond to requests for comment. 

In emailed correspondence with The Recorder, Apple maintained that it does not sell Siri data or use Siri recordings for targeted advertising and uses a “random identifier”—a string of letters and numbers—to keep track of data, rather than identifiers linked to an Apple account or phone number. Since 2019, Apple has not retained any Siri recordings and instead uses computer-generated transcripts to train Siri, it said. 

“Siri has been engineered to protect user privacy from the beginning,” said an Apple spokesperson. “Siri data has never been used to build marketing profiles and it has never been sold to anyone for any purpose. Apple settled this case to avoid additional litigation so we can move forward from concerns about third-party grading that we already addressed in 2019. We use Siri data to improve Siri, and we are constantly developing technologies to make Siri even more private.”

source: https://www.law.com/therecorder/2025/01/03/apple-agrees-to-pay-95-million-settlement-in-siri-voice-assistant-privacy-class-action-/
