The Secure Times

An online forum of the ABA Section of Antitrust Law's Privacy and Information Security Committee



Caution: Your Company’s Biggest Privacy Threat is…the FTC

Technology companies – from startups to megacorporations – should not overlook an old privacy foe: the Federal Trade Commission (FTC).  Since its inception in 2002, the FTC's data security program has steadily picked up steam.  In the last two years, the FTC has made headlines for its hefty privacy-related fines against Google and the photo-sharing social network Path.  In January 2014 alone, the agency settled with a whopping 15 companies for privacy violations.  What is more, many of these companies' practices were not purposefully deceptive or unfair; rather, the violations stemmed from a failure to invest the time and security resources needed to protect data.

Vested with comprehensive authority and unburdened by certain hurdles that class actions face, the FTC appears poised for more action.  The FTC's authority in the privacy context derives from the Federal Trade Commission Act (FTC Act) and is quite broad.  Simply put, the agency may investigate "unfair or deceptive acts or practices in or affecting commerce."  In addition to this general authority, the FTC may investigate privacy violations and breaches under numerous statutes, including the Children's Online Privacy Protection Act (COPPA), the Fair Credit Reporting Act (FCRA) and its Disposal Rule, the Gramm-Leach-Bliley Act (GLB), and the Telemarketing and Consumer Fraud and Abuse Prevention Act.  Nor is the FTC hampered by the requirements of private class action litigation.  For example, successful privacy class actions often must establish that consumers were harmed by a data breach (as in In re Barnes & Noble Pin Pad Litigation), that consumers actually relied on a company's promises to keep their information confidential (as in In re Apple iPhone Application Litigation), or that the litigation will not be bogged down in consumer-specific issues (such as whether the user impliedly consented to the disclosure, as in In re: Google Inc. Gmail Litigation).

The FTC has often focused on companies failing to adhere to their own stated policies, which the FTC considers a "deceptive" practice.  More recently, the FTC settled with the maker of one of the most popular Android apps, "Brightest Flashlight Free."  While the app informed users that it collected their data, it allegedly failed to disclose that the data would be shared with third parties.  And though the bottom of the license agreement offered consumers an opportunity to click "Accept" or "Refuse," the app had allegedly already been collecting and sending information (such as the user's location and the device's unique identifier) before receiving acceptance.  Just last week, the FTC settled with Fandango for failing to adequately secure data transmitted through its mobile app, in contravention of its promise to users.  The FTC alleged that Fandango disabled a critical security process, known as SSL certificate validation, which would have verified that its app's communications were secure.  As another example, the FTC recently settled with the maker of a camera device used in homes for a variety of purposes, including baby monitoring and security; the device allows its video feed to be accessed from any Internet connection.  The devices are alleged to have "had faulty software that left them open to online viewing, and in some instances listening, by anyone with the cameras' Internet address."
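
For readers unfamiliar with the term, the short sketch below illustrates what "SSL certificate validation" does and what switching it off means in practice.  It is a minimal, hypothetical example using Python's standard library; it is not drawn from Fandango's app or from the FTC's filings.

```python
# Illustration only: secure vs. insecure TLS connections in Python.
import ssl
import urllib.request

URL = "https://example.com/"  # placeholder endpoint

# Secure default: the server's certificate is checked against trusted
# certificate authorities and the hostname before any data is exchanged.
secure_ctx = ssl.create_default_context()
urllib.request.urlopen(URL, context=secure_ctx)

# Insecure variant: certificate validation is disabled.  The connection is
# still encrypted, but the client no longer verifies who it is talking to,
# so a man-in-the-middle could present any certificate and intercept the
# traffic.  This is the kind of lapse the FTC's complaint describes.
insecure_ctx = ssl.create_default_context()
insecure_ctx.check_hostname = False
insecure_ctx.verify_mode = ssl.CERT_NONE
urllib.request.urlopen(URL, context=insecure_ctx)
```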

Companies have also been targeted for even slight deviations from their stated policies.  For example, the FTC recently reached settlements with BitTorrent and the Denver Broncos.  The entities were faulted for falsely claiming that they held certifications under the U.S.-EU Safe Harbor Framework; in reality, they had received the certifications but failed to renew them.  The Safe Harbor is a streamlined process by which U.S. companies that receive or process personally identifiable information from Europe, directly or indirectly, can comply with European privacy law.  Self-certifying to the Framework also assures EU organizations that the certifying company provides "adequate" privacy protection.

Perhaps most surprising to companies is the FTC's assertion that it may require them to have reasonable data protection policies in place, even if the company never promised consumers it would safeguard their data.  Failure to secure data, according to the FTC, is an "unfair" practice under the FTC Act.  For example, the FTC recently settled with Accretive Health, a company that handles medical data and patient-financial information.  Among other things, Accretive was alleged to have transported laptops containing private information in an unsafe manner, leading to the theft of a laptop that had been left in a locked compartment of an employee's car.  The FTC is estimated to have brought more than 20 similar cases, all but one of which settled before any meaningful litigation.  The exception is a case against Wyndham Hotels, in which the FTC alleges that Wyndham failed to adequately protect consumer data collected by its member hotels.  According to the FTC, hackers repeatedly accessed the data because of the company's improperly configured software, weak passwords, and insecure servers.  Though Wyndham's privacy policy did not technically promise that the information would remain secure, the FTC faulted it for the lapse anyway.  Wyndham has challenged the FTC's position in federal court, and a decision is expected soon.

Being the target of an FTC action is no walk in the park.  In addition to paying attorneys' fees, the target often must undertake significant remedial measures demanded by the FTC.  For instance, the company may be required to (1) create privacy programs and protocols, (2) notify affected consumers, (3) delete private consumer data, (4) hire third-party auditors, and (5) subject itself to continual FTC oversight for 20 years.  What is more, if a company becomes a repeat offender and violates its agreement not to engage in future privacy violations, it faces significant fines.  Google, for example, was required to pay $22.5 million for violating a previous settlement with the FTC.

All told, technology companies should not feel emboldened by recent class action victories in the privacy context.  To avoid an FTC investigation, they should carefully review their data handling practices to ensure that those practices accord with their privacy policies.  They would also be wise to invest the resources needed to safeguard data and to regularly confirm that their security methods are state of the art.

 




Google Avoids Class Certification in Gmail Litigation

On March 18, 2014, Judge Koh in the Northern District of California denied Plaintiffs’ Motion for Class Certification in the In re: Google Inc. Gmail Litigation matter, Case No. 13-MD-02430-LHK. The case involved allegations of unlawful wiretapping in Google’s operation of its Gmail email service. Plaintiffs alleged that, without obtaining proper consent, Google unlawfully read the content of emails, extracted concepts from the emails, and used metadata from emails to create secret user profiles.

Among other things, obtaining class certification requires a plaintiff to demonstrate that class issues will predominate over individual issues. In this case, Judge Koh’s opinion focused almost exclusively on the issue of predominance. The Court noted that the predominance inquiry “tests whether proposed classes are sufficiently cohesive to warrant adjudication by representation.” Opinion (“Op.”) at 23 (citations omitted). The Court further emphasized that the predominance inquiry “is a holistic one, in which the Court considers whether overall, considering the issues to be litigated, common issues will predominate.” Op. at 24.

The Court in the Gmail litigation noted that the existence of consent is a common defense to all of Plaintiffs' claims. Consent can either be express, or it can be implied "based on whether the surrounding circumstances demonstrate that the party whose communications were intercepted knew of such interceptions." Op. at 26. The decision explained how common issues would not predominate with respect to determining whether any particular class member consented to Google's alleged conduct.

The Court briefly addressed whether the issue of express consent could be practically litigated on a class-wide basis, but the opinion focused largely on the issue of implied consent. The Court noted that implied consent "is an intensely factual question that requires consideration of the circumstances surrounding the interception to divine whether the party whose communication was intercepted was on notice that the communication would be intercepted." Op. at 30. Google contended that implied consent would require individual inquiries into what each person knew. Google pointed to a plethora of information surrounding the scanning of Gmail emails, including: (1) Google's Terms of Service; (2) Google's multiple Privacy Policies; (3) Google's product-specific Privacy Policies; (4) Google's Help pages; (5) Google's webpages on targeted advertising; (6) disclosures in the Gmail interface; (7) media reporting of Gmail's launch and how Google "scans" email messages; (8) media reports regarding Google's advertising system; and (9) media reports of litigation concerning Gmail email scanning. The Court thus agreed with Google that there was a "panoply of sources from which email users could have learned of Google's interceptions other than Google's TOS and Privacy Policies." Op. at 33. With all these different means by which a user could have learned of the scanning practices (and thereby impliedly consented to them), the issue of consent would require overwhelmingly individualized inquiries, thus precluding class certification.

This opinion demonstrates a key defense to class action claims where implied consent is at issue. Any class action defendant’s assessment of risk should include an early calculation of the likelihood of class certification, and that calculation should inform litigation strategy throughout the case. Google consistently litigated the matter to highlight class certification difficulties surrounding consent, and ultimately obtained a significant victory in defeating class certification.



What’s More Challenging? Establishing Privacy Class Action Standing, or Climbing Mount Kilimanjaro?

Two opinions recently issued by the Northern District of California have important implications for parties to privacy class actions. Both highlight the evolving jurisprudence on establishing standing in consumer privacy lawsuits.

In re Apple iPhone Application Litigation

On November 25, 2013, Judge Lucy Koh granted Apple's motion for summary judgment on all of plaintiffs' claims in In re Apple iPhone Application Litigation, 11-MD-02250-LHK (N.D. Cal. Nov. 25, 2013). Plaintiffs alleged that Apple violated its Privacy Policy by allowing third parties to access iPhone users' personal information. Based on those alleged misrepresentations, plaintiffs claimed they overpaid for their iPhones and that their iPhones' performance suffered. Plaintiffs also alleged that Apple violated its Software License Agreement ("SLA") when it falsely represented that customers could prevent Apple from collecting geolocation information by turning off the iPhone's Location Services setting. Plaintiffs alleged that, contrary to this representation, Apple continued to collect certain geolocation information from iPhone users even when Location Services was turned off. Based on the alleged SLA misrepresentations, plaintiffs claimed they overpaid for their iPhones and suffered reduced iPhone performance. Plaintiffs argued that Apple's alleged conduct violated California's unfair competition law ("UCL") and Consumer Legal Remedies Act ("CLRA").

Judge Koh disagreed, finding that plaintiffs had failed to create a genuine issue of material fact concerning their standing under Article III, the UCL, and the CLRA. Judge Koh held that plaintiffs presented sufficient evidence of injury: that they purportedly overpaid for their iPhones and suffered reduced iPhone performance. However, Judge Koh held that plaintiffs could not establish that such injury was causally linked to Apple's alleged misrepresentations. Actual reliance, she ruled, is essential for standing: plaintiffs must have (1) seen the misrepresentations and (2) acted on them. Judge Koh noted that none of the plaintiffs had seen the alleged misrepresentations prior to purchasing their iPhones, or at any time thereafter. Because they had not seen the misrepresentations, they could not have relied upon them, and without reliance, Judge Koh held, plaintiffs' claims could not survive.

In re Google, Inc. Privacy Policy Litigation

On December 3, 2013, Judge Paul Grewal granted Google's motion to dismiss in In re Google, Inc. Privacy Policy Litigation, Case No. C-12-01382-PSG (N.D. Cal. Dec. 3, 2013), but not for lack of standing. The claims stemmed from a change in Google's privacy policies. Before March 1, 2012, Google maintained separate privacy policies for each of its products, and those policies purportedly stated that Google would use a user's personally identifiable information ("PII") only for that particular product. Google then introduced a new privacy policy informing consumers that it would commingle data across products. Plaintiffs contended that the new privacy policy violated Google's prior privacy policies. Plaintiffs also alleged that Google shared PII with third parties to allow them to develop apps for Google Play.

In assessing standing, Judge Grewal noted that “injury-in-fact has proven to be a significant barrier to entry,” and that establishing standing in the Northern District of California is akin to climbing Mount Kilimanjaro. Notwithstanding the high burden, Judge Grewal found that plaintiffs adequately alleged standing.

Plaintiffs alleged standing based on (1) commingling of personally identifiable information; (2) direct economic injury; and (3) statutory violations. With respect to the commingling argument, plaintiffs contended that Google never compensated plaintiffs for the value associated with commingling PII amongst different Google products. Judge Grewal rejected this argument, noting that a plaintiff may not establish standing by pointing to a defendant’s profit; rather, plaintiff must actually suffer damages as a result of defendant’s conduct.

With respect to plaintiffs’ allegations of direct economic injury, Judge Grewal held that those allegations sufficed to confer standing. Plaintiffs argued they suffered direct economic injuries because of reduced performance of Android devices (plaintiffs had to pay for the battery power used by Google to send data to third parties). Plaintiffs also argued that they overpaid for their phones and had to buy different phones because of Google’s practices. These allegations sufficed to establish injury. Based on Judge Koh’s opinion in Apple, one key issue in the Google case will likely be whether any of the plaintiffs actually read and relied upon Google’s privacy policies.

Finally, Judge Grewal found that standing could be premised on the alleged violation of statutory rights. This ruling is consistent with the trend in other federal courts. Though Judge Grewal ultimately dismissed the complaint for failure to state a claim, the opinion’s discussion of standing will be informative to both the plaintiff and defense bars in privacy litigation.

The Apple and Google lawsuits represent a fraction of the many lawsuits seeking to recover damages and/or injunctive relief for the improper collection and/or use of consumer information. Establishing standing remains a difficult hurdle for plaintiffs in consumer privacy lawsuits, though courts are increasingly accepting standing arguments based on statutory violations and allegations of economic injuries. The Apple decision is on appeal, so we will see if the Ninth Circuit sheds further light on issues of standing in privacy lawsuits.



EPIC is Suing the FTC to Compel Enforcement of Google Buzz Consent Order

The Electronic Privacy Information Center (EPIC) is suing the Federal Trade Commission (FTC) in order to compel the federal agency to enforce the October 2011 Google Buzz consent order, In the Matter of Google, Inc., FTC File No. 102 3136, which was issued following a complaint filed by EPIC with the FTC in February 2010.

 

Pursuant to this consent order, Google may not misrepresent the extent to which it maintains and protects the privacy and confidentiality of the information it collects, including the purposes for which the information is collected and the extent to which consumers may exercise control over its collection, use, or disclosure. Google must also obtain users' express affirmative consent before sharing their information with third parties in any new or additional way; the third parties must be identified, and the purpose(s) for sharing the information must be disclosed to Google users. The consent order also requires Google to establish and implement a comprehensive privacy program.

 

Last January, Google announced changes to its privacy policy, effective March 1, 2012. Google will then begin collecting user data across all of its different sites, such as Gmail and YouTube, whenever the user is logged into her Google account. Ms. Alma Whitten, Google's Director of Privacy, Product and Engineering, stated that Google can thus provide "a simpler, more intuitive Google experience." A Google user will have a single Google profile, and no opt-out is available. The new privacy policy states that:

 

"We may use the name you provide for your Google Profile across all of the services we offer that require a Google Account. In addition, we may replace past names associated with your Google Account so that you are represented consistently across all our services. If other users already have your email, or other information that identifies you, we may show them your publicly visible Google Profile information, such as your name and photo."

 

According to EPIC's complaint, these changes are "in clear violation of [Google's] prior commitments to the Federal Trade Commission." EPIC argues that Google violated the Consent Order "by misrepresenting the extent to which it maintains and protects the privacy and confidentiality of [users'] information, by misrepresenting the extent to which it complies with the U.S.-EU Safe Harbor Framework… [and] by failing to obtain affirmative consent from users prior to sharing their information with third parties."

 

Indeed, the European Union (EU) is also concerned by these changes. The Article 29 Working Party sent a letter to Google on February 2 to inform the California company that it will "check the possible consequences for the protection of the personal data of [E.U. Member States'] citizens of these changes." Google responded to the Commission Nationale de l'Informatique et des Libertés (CNIL), France's data protection authority, which is coordinating the inquiry into Google's privacy changes, that the changes were made to ensure that Google's privacy policy is "simpler and more understandable" and also "to create a better user experience."

 

Meanwhile, EPIC argues that the FTC has a non-discretionary obligation to enforce a final order, yet has taken no action with respect to the upcoming changes to Google's privacy policy.



Google Agrees to Settle FTC Charges and Will Implement a “Comprehensive Privacy Program”

The Federal Trade Commission ("FTC") announced today that Google has agreed to settle FTC charges that it used deceptive tactics and violated its privacy promises when it launched Google Buzz in 2010. Google will have to implement a "comprehensive privacy program," as laid out in the proposed consent order. The agreement is subject to public comment through May 1, 2011, after which the FTC will decide whether to make the proposed consent order final.

The proposed consent order refers to both the FTC Act and to the US-EU Safe Harbor Framework, a reference that is likely to be well appreciated in the European Union.

Agreement containing consent order available here.

Complaint available here.

The 2010 complaint

In February 2010, Google launched Google Buzz ("Buzz"), a social network within Gmail. Gmail users were sometimes set up with followers automatically, without prior notice (Complaint at 7). These followers were the persons they emailed and chatted with most frequently in Gmail (Complaint at 8). Even if Gmail users chose to opt out of Buzz, they could nevertheless be followed by other Buzz users, and their public profile, if they had created one, would then appear on their followers' Google public profiles (Complaint at 8, 9).

The FTC complaint alleged that Google violated the FTC Act when it represented to consumers signing up for a Gmail account that Google would use their information only to provide the webmail service, when in fact Google also used that information to sign them up for Buzz automatically and without their consent. The complaint also alleged that Google represented that consumers could control whether or not their information would be made public.

The complaint also alleged that Google did not adhere to the Safe Harbor Framework privacy principles of Notice and Choice, because Google did not give users notice before using their personal information for a purpose different than the one for which the data was originally collected. Nor were Gmail users given a choice when Google used their information for a purpose incompatible with the purpose for which it was originally collected (Complaint at 25).

The complaint further alleged that Google did not communicate "adequately" that "certain previously private information would be shared publicly by default," and that the controls allowing users to change the defaults were "confusing and difficult to find" (Complaint at 9). Certain personal information was also shared without Gmail users' permission (Complaint at 10). For instance, individuals blocked by a Gmail user were not blocked in Buzz and could thus become followers on Buzz (Complaint at 10). Even more puzzling, it was not possible to block a follower who did not have a public Google profile, and the Gmail user could not even learn that follower's real identity (Complaint at 10). Buzz also offered an @reply function that sometimes exposed contacts' private email addresses to all of a user's followers, where they could then be found by search engines.

Google made some changes following widespread criticism and thousands of customer complaints. Users were given the ability to disable Buzz. Followers were no longer added automatically based on Gmail contacts, but merely suggested. Users could also block any follower, and Buzz users were given the option not to show their followers’ list on their public profile. The @reply function would no longer make private addresses public.

The FTC nevertheless issued a complaint in 2010, and Google has now agreed to settle.

A comprehensive privacy program

The Buzz settlement is particularly interesting because it is the first time an FTC settlement order has required a company to implement a comprehensive privacy program to protect the privacy of consumer data.

Indeed, the proposed consent order requires Google to implement a "comprehensive privacy program," documented in writing, which must "(1) address privacy risks related to the development and management of new and existing products and services for consumers, and (2) protect the privacy and confidentiality of covered information" (proposed consent order p. 4). The program must designate the employees responsible for it, identify the reasonably foreseeable risks, external or internal, of Google collecting, using, or disclosing personal information without authorization, and put safeguards in place to control those risks. It must also include the design and implementation of "reasonable privacy controls and procedures" and regular monitoring of their effectiveness, as well as the selection of service providers capable of protecting the privacy of personal data and contracts requiring them to do so. The program will be evaluated and adjusted, if necessary, in light of its results (proposed consent order p. 4-5).

Also, Google will have to obtain from a qualified third-party professional an initial assessment, and then biennial assessments and reports, setting forth the specific privacy controls implemented by Google, explaining why such controls are appropriate, and explaining how they have been implemented. The third-party professional will also certify that such controls are effective (proposed consent order p. 5-6).

It will be interesting to see whether U.S. companies will start to use the comprehensive privacy program framework as a reference for their own privacy programs, and whether EU data protection authorities will require U.S. organizations that have self-certified to the U.S.-EU Safe Harbor Framework to implement such a program in order to be deemed compliant.

 



Data Privacy Day: January 28, 2011

Mark your calendars for Data Privacy Day – January 28, 2011.  Countries around the world are hosting events in honor of Data Privacy Day (or Data Protection Day).  This year marks the thirtieth anniversary of January 28, 1981, the date on which the Convention for the Protection of Individuals with regard to Automatic Processing of Personal Data was opened for signature by the Council of Europe.  Some highlights include:

 

– Panel discussions around the world.  For example, the Council of Europe and the European Commission are hosting a joint high-level meeting in Brussels (registration due January 24), and Google is opening its Washington, DC offices for a breakfast and panel discussion called "The Technology of Privacy: When Geeks Meet Wonks."

– Local government initiatives.  For example, the California Office of Privacy Protection will be launching a social media site: www.privacy.ca.gov.

– Happy hours in many local areas on January 27, 2011, hosted by the International Association of Privacy Professionals (IAPP).

 

Check out dataprivacyday2011.org, http://www.europeanprivacyday.org/, or http://www.capapa.org/DPD.html for events in your area.



FTC Steps Down from Google Data Privacy Investigation, U.K. Back On Board

Oct. 27, 2010. The Federal Trade Commission today posted a letter to Google's counsel announcing that it is ending its inquiry into Google's collection of information sent over unsecured wireless networks. The inquiry began after Google revealed in May 2010 that its Street View cars had been collecting more than just WiFi network information, such as SSIDs and MAC addresses; they had also been capturing "payload" data sent over unsecured wireless networks. The May announcement came after the data protection authority in Hamburg, Germany, requested an audit of the Street View data.
 
Google’s May revelation generated a flurry of media attention (e.g. from the Wall Street Journal and New York Times), and regulatory investigations in the United States, Germany, Canada, Australia, the U.K., South Korea, and elsewhere. Several class-action lawsuits also resulted. 
 
Last week, on October 22, 2010, Google announced on its U.S. website that it had taken steps to improve its privacy practices, including appointing a new director of privacy to oversee both the engineering and product management groups, enhancing its privacy training, and implementing new internal privacy compliance practices.  This announcement, together with Google's promise to delete the payload data as soon as possible and its assurance that it will not use the data in any product or service, appears to have appeased the FTC. The FTC's letter made no determination about whether Google's actions breached any data privacy laws, nor did it require any remedial action or fines. Australia, in contrast, concluded in June that Google had violated Australia's privacy laws and required Google to publicly apologize, conduct a Privacy Impact Assessment, and regularly consult with the Australian privacy regulator about data collection.
 
Google also acknowledged that – contrary to its earlier postings – "in some instances entire emails and URLs were captured, as well as passwords." While Google's October 22 posting satisfied the FTC, this revelation caused the U.K. to announce that it is re-opening its investigation into Google's privacy practices.  The U.K. had closed its investigation in July after reviewing sample payload data and concluding that personal data, emails, and passwords had not been collected.