The Secure Times

An online forum of the ABA Section of Antitrust Law's Privacy and Information Security Committee



Caution: Your Company’s Biggest Privacy Threat is…the FTC

Technology companies – from startups to megacorporations – should not overlook an old privacy foe: the Federal Trade Commission (FTC).  Since its inception in 2002, the FTC’s data security program has significantly picked up steam.  In the last two years, the FTC has made headlines for its hefty privacy-related fines against Google and the photo-sharing social network Path.  In January 2014 alone, the agency settled with a whopping 15 companies for privacy violations.  What is more, many of these companies’ practices were not purposefully deceptive or unfair; rather, the violations stemmed from a mere failure to invest the time and security resources needed to protect data.

Vested with comprehensive authority and unburdened by certain hurdles that class actions face, the FTC appears poised for more action.  The FTC’s authority in the privacy context originates from the Federal Trade Commission Act (FTC Act) and is quite broad.  Simply put, it may investigate “unfair or deceptive acts or practices in or affecting commerce.”  In addition to this general authority, the FTC may investigate privacy violations and breaches under numerous sets of rules, including the Children’s Online Privacy Protection Act (COPPA), the Fair Credit Reporting Act (FCRA) and its Disposal Rule, the Gramm-Leach-Bliley Act (GLB), and the Telemarketing and Consumer Fraud and Abuse Prevention Act.  Nor is the FTC hampered by the requirements of private class action litigation.  For example, successful privacy class actions often must establish that consumers were harmed by a data breach (as in In re Barnes & Noble Pin Pad Litigation), that consumers actually relied on a company’s promises to keep their information confidential (as in In re Apple iPhone Application Litigation), or that the litigation will not be burdened with consumer-specific issues (such as whether the user impliedly consented to the disclosure, as in In re: Google Inc. Gmail Litigation).

The FTC has often focused on companies that fail to adhere to their own stated policies, which the FTC considers a “deceptive” practice.  More recently, the FTC settled with the maker of one of the most popular Android apps, “Brightest Flashlight Free.”  While the app informed users that it collected their data, it allegedly failed to disclose that the data would be shared with third parties.  And though the bottom of the license agreement offered consumers an opportunity to click “Accept” or “Refuse,” the app allegedly was already collecting and sending information (such as location and the unique device identifier) before receiving acceptance.  Just last week, the FTC settled with Fandango for failing to adequately secure data transmitted through its mobile app, in contravention of its promise to users.  The FTC alleged that Fandango disabled a critical security process, known as SSL certificate validation, that would have verified that its app’s communications were secure.  As another example, the FTC recently settled with the maker of a camera device used in homes for a variety of purposes, including baby monitoring and security, which allows the video to be accessed from any internet connection.  The devices are alleged to have “had faulty software that left them open to online viewing, and in some instances listening, by anyone with the cameras’ Internet address.”
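To see what “disabling SSL certificate validation” means in practice, consider this minimal sketch using Python’s standard `ssl` module (Fandango’s app was not written in Python; this is purely illustrative of the security setting at issue):

```python
import ssl

# Default client context: certificate validation and hostname checking are ON.
# This is the protection that, per the FTC, Fandango's app disabled.
secure_ctx = ssl.create_default_context()

# What disabling validation amounts to: the client accepts ANY certificate,
# so an attacker on the network path can impersonate the server and read
# traffic the user believes is encrypted end-to-end.
insecure_ctx = ssl.create_default_context()
insecure_ctx.check_hostname = False       # stop matching the cert to the host
insecure_ctx.verify_mode = ssl.CERT_NONE  # stop validating the cert chain

print(secure_ctx.verify_mode == ssl.CERT_REQUIRED)  # validation on by default
print(insecure_ctx.verify_mode == ssl.CERT_NONE)    # validation disabled
```

The point for compliance teams: a one-line configuration change like the second context above can contradict a privacy policy’s security promises and, on the FTC’s view, become a “deceptive” practice.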

Companies have also been targeted for even slight deviations from their stated policies.  For example, the FTC recently reached settlements with BitTorrent and the Denver Broncos, both of which were blamed for falsely claiming they held certifications under the U.S.-EU Safe Harbor framework.  In reality, the entities had received the certifications but failed to renew them.  The Safe Harbor is a streamlined process by which US companies that receive or process personally identifiable information from Europe, directly or indirectly, can comply with European privacy law.  Self-certifying to the U.S.-EU Safe Harbor Framework also assures EU organizations that the self-certifying organization provides “adequate” privacy protection.

Perhaps most surprising to companies is the FTC’s assertion that it may require them to have reasonable data protection policies in place, even if the company never promised consumers it would safeguard the data.  Failure to secure data, according to the FTC, is an “unfair” practice under the FTC Act.  For example, the FTC recently settled with Accretive Health, a company that handles medical data and patient financial information.  Among other things, Accretive was alleged to have transported laptops containing private information in an unsafe manner, leading to the theft of a laptop that had been placed in a locked compartment of an employee’s car.  The FTC is estimated to have brought over 20 similar cases, all but one of which settled before any meaningful litigation.  The exception is a case against Wyndham Hotels, in which the FTC alleges that Wyndham failed to adequately protect consumer data collected by its member hotels.  According to the FTC, hackers repeatedly accessed the data due to the company’s misconfigured software, weak passwords, and insecure servers.  Though Wyndham’s Privacy Policy did not technically promise that the information would remain secure, the FTC faulted it for the lapse anyway.  Wyndham has challenged the FTC’s position in federal court, and a decision is expected soon.

Being the target of an FTC action is no walk in the park.  In addition to attorneys’ fees, the FTC often demands significant remedial measures.  For instance, the company may be required to (1) create privacy programs and protocols, (2) notify affected consumers, (3) delete private consumer data, (4) hire third-party auditors, and (5) subject itself to continual FTC oversight for 20 years.  What is more, if a company becomes a repeat offender and violates its agreement not to engage in future privacy violations, it will face significant fines.  Google, for example, was required to pay $22.5 million for violating a previous settlement with the FTC.

All told, technology companies should not feel emboldened by recent class action victories in the privacy context.  To avoid FTC investigation, they should carefully review their data handling practices to ensure that they are in accord with their privacy policy.  Further, they would be wise to invest in the necessary resources required to safeguard data and regularly ensure that their methods are state of the art.

 



What’s More Challenging? Establishing Privacy Class Action Standing, or Climbing Mount Kilimanjaro?

Two opinions recently issued from the Northern District of California have important implications for parties litigating privacy class actions. Both opinions highlight the evolving jurisprudence on establishing standing in consumer privacy lawsuits.

In re Apple iPhone Application Litigation

On November 25, 2013, Judge Lucy Koh granted Apple’s motion for summary judgment on all of plaintiffs’ claims in In re Apple iPhone Application Litigation, 11-MD-02250-LHK (N.D. Cal. Nov. 25, 2013). Plaintiffs alleged that Apple violated its Privacy Policy by allowing third parties to access iPhone users’ personal information. Based on those misrepresentations, plaintiffs claimed they overpaid for their iPhones, and that their iPhones’ performance suffered. Plaintiffs also alleged that Apple violated its Software License Agreement (“SLA”) when it falsely represented that customers could prevent Apple from collecting geolocation information by turning off the iPhone’s Location Services setting. Plaintiffs alleged that, contrary to this representation, Apple continued to collect certain geolocation information from iPhone users even if those users had turned the Location Services setting off. Based on the SLA misrepresentations, plaintiffs alleged they overpaid for their iPhones and suffered reduced iPhone performance. Plaintiffs argued that Apple’s alleged conduct constituted a violation of California’s unfair competition law (“UCL”) and the Consumer Legal Remedies Act (“CLRA”).

Judge Koh disagreed, finding that plaintiffs failed to create a genuine issue of material fact concerning their standing under Article III, the UCL, and the CLRA. Judge Koh held that plaintiffs presented sufficient evidence of injury—that they purportedly overpaid for their iPhones and suffered reduced iPhone performance. However, she held that plaintiffs could not establish that this injury was causally linked to Apple’s alleged misrepresentations. Judge Koh ruled that actual reliance was essential for standing: plaintiffs must have (1) seen the misrepresentations and (2) acted on them.  Judge Koh noted that none of the plaintiffs had even seen the alleged misrepresentations prior to purchasing their iPhones, or at any time thereafter. Because none of the plaintiffs had seen the misrepresentations, they could not have relied upon them, and without reliance, Judge Koh held, their claims could not survive.

In re Google, Inc. Privacy Policy Litigation

On December 3, 2013, Judge Paul Grewal granted Google’s motion to dismiss in In re Google, Inc. Privacy Policy Litigation, Case No. C-12-01382-PSG (N.D. Cal. Dec. 3, 2013), but not based on lack of standing. The claims stemmed from Google’s change in its privacy policies. Before March 1, 2012, Google maintained separate privacy policies for each of its products, and those policies purportedly stated that Google would only use a user’s personally-identifying information for that particular product. Google then introduced a new privacy policy informing consumers that it would commingle data between products. Plaintiffs contend that the new privacy policy violated Google’s prior privacy policies. Plaintiffs also alleged that Google shared PII with third parties to allow third parties to develop apps for Google Play.

In assessing standing, Judge Grewal noted that “injury-in-fact has proven to be a significant barrier to entry,” and that establishing standing in the Northern District of California is akin to climbing Mount Kilimanjaro. Notwithstanding the high burden, Judge Grewal found that plaintiffs adequately alleged standing.

Plaintiffs alleged standing based on (1) commingling of personally identifiable information; (2) direct economic injury; and (3) statutory violations. With respect to the commingling argument, plaintiffs contended that Google never compensated plaintiffs for the value associated with commingling PII amongst different Google products. Judge Grewal rejected this argument, noting that a plaintiff may not establish standing by pointing to a defendant’s profit; rather, plaintiff must actually suffer damages as a result of defendant’s conduct.

With respect to plaintiffs’ allegations of direct economic injury, Judge Grewal held that those allegations sufficed to confer standing. Plaintiffs argued they suffered direct economic injuries because of reduced performance of Android devices (plaintiffs had to pay for the battery power used by Google to send data to third parties). Plaintiffs also argued that they overpaid for their phones and had to buy different phones because of Google’s practices. These allegations sufficed to establish injury. Based on Judge Koh’s opinion in Apple, one key issue in the Google case will likely be whether any of the plaintiffs actually read and relied upon Google’s privacy policies.

Finally, Judge Grewal found that standing could be premised on the alleged violation of statutory rights. This ruling is consistent with the trend in other federal courts. Though Judge Grewal ultimately dismissed the complaint for failure to state a claim, the opinion’s discussion of standing will be informative to both the plaintiff and defense bars in privacy litigation.

The Apple and Google lawsuits represent a fraction of the many lawsuits seeking to recover damages and/or injunctive relief for the improper collection and/or use of consumer information. Establishing standing remains a difficult hurdle for plaintiffs in consumer privacy lawsuits, though courts are increasingly accepting standing arguments based on statutory violations and allegations of economic injuries. The Apple decision is on appeal, so we will see if the Ninth Circuit sheds further light on issues of standing in privacy lawsuits.



FTC Examines Internet of Things Privacy and Security in Recent Workshop

On November 19, 2013, the Federal Trade Commission held a day-long workshop, “Internet of Things: Privacy and Security in a Connected World” on the privacy implications concerning devices such as cars, home appliances, fitness equipment, and other machines that are able to gather data and connect to the internet. For consumers, these devices can help track health, remotely monitor aging family members, reduce utility bills, and even send alerts to buy more milk.

Ubiquitous Internet of Things

Technological advances and new business models centered around the internet of things have taken off. It has been reported that crowd-sourcing start-up Quirky has teamed up with GE to develop connected-home products. Another start-up company is developing tracking technology through GPS-embedded tags. On November 20, 2013, Qualcomm announced that it has developed a line of chips for the internet of things space. It has been argued that companies should adjust their business models and use the internet of things to connect to customers. These developments present the FTC with the challenge of keeping up with technology to protect consumers and the competitive landscape.

In her remarks, Chairwoman Edith Ramirez emphasized how ubiquitous smart devices have become. Five years ago, she remarked, there were already more “things” than people connected to the Internet; by 2015, an estimated twenty-five billion things will be connected, and by 2020, an estimated fifty billion. Commissioner Maureen Ohlhausen, in her remarks later in the workshop, stated that the FTC will conduct policy research to understand the effects that technological advances and innovative business models concerning the internet of things have on consumers and the marketplace.

Privacy and Security Challenges

Chairwoman Ramirez noted privacy and security challenges presented by the internet of things. Privacy risks are present since devices connected to the internet can collect, compile, and transmit information about consumers in ways that may not have been expected. When aggregated, the data pieces collected by devices present “a deeply personal and startlingly complete picture of each of us.” Security risks are present since “any device connected to the Internet is potentially vulnerable to hijack.” Indeed, these risks have been reported and present real concerns.

Chairwoman Ramirez noted that the FTC will be vigilant in bringing enforcement actions against companies that fail to properly safeguard consumers from security breaches. She cited as an example the FTC’s first enforcement foray into the internet of things, against TRENDnet for failing to properly design and test the software of its internet-connected security cameras, leaving consumers vulnerable to a hacker who accessed the live feeds from 700 cameras and made them available on the Internet. Commissioner Ohlhausen stated that when it encounters consumer harm, the FTC will use its traditional enforcement tools to challenge any potential threats that arise, much as it has done in the data security, mobile, and big data spaces.

Chairwoman Ramirez said that companies that take part in the internet of things ecosystem are “stewards of the consumer data” and that “with big data comes big responsibility.” The FTC has published a number of best practices that Chairwoman Ramirez identified as useful for companies in the internet of things space: (1) privacy by design—privacy protections built in from the outset, (2) simplified consumer choice—allowing consumers to control their data, and (3) transparency—disclosure of what information the devices collect and how it is being used.

FTC Report Forthcoming

The FTC will produce a report on what it has learned from the November 19 workshop and provide further recommendations about best practices. The report can educate consumers and businesses on how to maximize consumer benefits and avoid or minimize any identified risks. Commissioner Ohlhausen stressed that the FTC should identify whether existing laws and regulatory structures, including self-regulation, are sufficient to address potential harms.

Vint Cerf of Google, who gave the keynote presentation, advised that rather than relying on regulations to protect privacy, social conventions should be developed. He stated that “while regulation might be helpful, an awful lot of the problems that we experience with privacy is a result of our own behavior.”

The same day as the workshop, the Future of Privacy Forum released a white paper arguing for an updated privacy paradigm for the internet of things that focuses not on how information is collected and communicated but on how organizations use personally identifiable information.

The FTC will continue to accept comments until January 10, 2014.



Mobile Location Analytics Companies Agree to Code of Conduct

U.S. Senator Charles Schumer, the Future of Privacy Forum (“FPF”), a Washington, D.C.-based think tank, and a group of location analytics companies, including Euclid, Mexia Interactive, Radius Networks, Brickstream, Turnstyle Solutions, and SOLOMO, released a Code of Conduct to promote customer privacy and transparency for mobile location analytics.

Mobile location analytics technology, which allows stores to analyze shoppers’ behavior based on information collected from the shoppers’ cell phones, has faced a string of negative press in the last several months.  The location analytics companies gather Wi-Fi and Bluetooth MAC address signals to monitor shoppers’ movements around the store, providing feedback such as how long shoppers wait in line at the check-out, how effectively a window display draws customers into the store, and how many people who browse actually make a purchase.  Retailers argue that the technology provides them with the same type of behavioral data that is already collected from shoppers when they browse retail sites online.  Customer advocates, on the other hand, raise concerns about the invasive nature of the tracking, particularly as most customers aren’t aware that it is taking place. Senator Schumer has been one of the most vocal critics of mobile location analytics services, calling it an “unfair or deceptive” trade practice to fail to notify shoppers that their movements are being tracked or to give them a chance to opt out of the practice.  In an open letter to the FTC in July 2013, Sen. Schumer described the technology thus:

“Retailers do not ever receive affirmative consent from the customer for [location analytics] tracking, and the only options for a customer to not be tracked are to turn off their phone’s Wi-Fi or to leave the phone at home. Geophysical location data about a person is obviously highly sensitive; however, retailers are collecting this information anonymously without consent.”

In response, a group of leading mobile location analytics companies agreed to a Code of Conduct developed in collaboration with Sen. Schumer and the Future of Privacy Forum to govern mobile location analytics services.   Under the Code:

  • A participating mobile location analytics firm will “take reasonable steps to require” participating retailers to provide customer notice through clear, in-store signage; to use a standard symbol or icon to indicate the collection of mobile location analytics data; and to direct customers to an industry education and opt-out website (for example, “To learn about use of customer location and your choices, visit www.smartstoreprivacy.com” would be acceptable language for in-store signage).
  • The mobile location analytics company will provide a detailed disclosure in its privacy policy about the use and collection of data it collects in-store, which should be separate from the disclosure of information collected through the company’s website.
  • Customers must be allowed the choice to opt-out of tracking.  The mobile location analytics company will post a link in its privacy policy to the industry site which provides a central opt-out.  A notice telling customers to turn off their mobile device or to deactivate the Wi-Fi signal is not considered sufficient “choice” under the Code.
  • The notice and choice requirements do not apply if the information collected is not unique to an individual device or user, or it is promptly aggregated so as not to be unique to a device or user, and individual level data is not retained. If a mobile location analytics firm records device-level information, even if it only shares aggregate information with retail clients, it must provide customer choice.
  •  A customer’s affirmative consent is required if: (1) personal information will be linked to a mobile device identifier, or (2) a customer will be contacted based on the analytic information.  
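The Code’s distinction between device-level records and “promptly aggregated” data can be made concrete with a small sketch. The salt value, function names, and zone labels below are hypothetical illustrations of one common de-identification approach (salted one-way hashing plus aggregation); actual vendors’ pipelines differ, and salted hashing alone is not a complete anonymization guarantee:

```python
import hashlib
from collections import Counter

# Hypothetical salt; a real deployment would keep this secret and rotate it.
SALT = b"per-store-rotating-salt"

def deidentify(mac: str) -> str:
    """One-way hash a raw MAC address so the raw identifier is not retained.

    Note: under the Code, hashed device-level records still count as unique
    to a device, so notice and choice obligations would still apply.
    """
    return hashlib.sha256(SALT + mac.encode("ascii")).hexdigest()[:16]

def aggregate(observations):
    """Collapse (mac, zone) sightings into zone-level counts only.

    The resulting counts are no longer unique to any device or user — the
    kind of prompt aggregation the Code's notice-and-choice exemption covers.
    """
    return Counter(zone for _mac, zone in observations)

observations = [
    ("AA:BB:CC:DD:EE:01", "entrance"),
    ("AA:BB:CC:DD:EE:02", "entrance"),
    ("AA:BB:CC:DD:EE:01", "checkout"),
]
zone_counts = aggregate(observations)
print(zone_counts["entrance"])  # 2 — per-device data does not survive here
```

The design point mirrors the Code’s text: if the firm keeps anything like the `observations` list (even with hashed MACs), it must provide customer choice; only the fully aggregated `zone_counts`-style output falls outside the notice requirements.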

The FTC has offered support to the self-regulatory process and provided feedback on the Code during the drafting negotiations.  “It’s great that industry has recognized customer concerns about invisible tracking in retail space and has taken a positive step forward in developing a self-regulatory code of conduct,” FTC Bureau of Consumer Protection Director Jessica Rich told Politico.

Some critics, however, feel that the Code does not go far enough.  The notice provision is weak, as it relies on the retailers to provide in-store signage for the customer.  Notably, retailers were not party to the negotiations developing the Code of Conduct and no retailer has publicly agreed to post signs in their stores.  Given the history – retailer Nordstrom was forced to drop its mobile location analytics pilot program in response to bad press from customers complaining after seeing posted signs – retailers are likely to want in-store signage to be as inconspicuous as possible. 

The next time you’re out shopping, keep your eyes peeled for in-store signage.  Are your retailers watching you? 




Direct Marketing Association Launches “Data Protection Alliance”


On October 29, 2013, the Direct Marketing Association (“DMA”) announced the launch of a new initiative, the Data Protection Alliance, which it describes as “a legislative coalition that will focus specifically on ensuring that effective regulation and legislation protects the value of the Data-Driven Marketing Economy far into the future.” In its announcement release, the DMA reports the results of a study it commissioned on the economic impact of what it calls “the responsible use of consumer data” on “data-driven innovation.” According to the DMA, the study indicated that regulation which “impeded responsible exchange of data across the Data-Driven Marketing Economy” would substantially damage U.S. economic growth and employment. Instead of such regulation, the DMA asks Congress to focus on its “Five Fundamentals for the Future”:

  1. Pass a national data security and breach notification law;

  2. Preempt state laws that endanger the value of data;

  3. Prohibit privacy class action suits and fund Federal Trade Commission enforcement;

  4. Reform the Electronic Communications Privacy Act (ECPA); and

  5. Preserve robust self-regulation for the Data-Driven Marketing Economy.

The DMA is explicitly concerned with its members’ interests, as any trade group would be, and this report and new Data Protection Alliance are far from the only views being expressed as to the need for legislation and regulation to alter the current balance between individual control and commercial use of personal information. Given the size and influence of the DMA and its members, though, this announcement provides useful information on the framing of the ongoing debate in the United States and elsewhere over privacy regulation.



Privacy and Data Protection Impacting on International Trade Talks

European Commission

The European Union and the United States are currently negotiating a broad compact on trade called the Transatlantic Trade and Investment Partnership (“TTIP”). While the negotiations themselves are non-public, among the issues that are reported to be potential obstacles to agreement are privacy and data protection. Not only does the European Union mandate a much stronger set of data protection and privacy laws for its member states than exist in the United States, but recent revelations of U.S. surveillance practices (including of European leaders) have highlighted the legal and cultural divide.

In an October 29, 2013 speech in Washington, D.C., Viviane Reding, Vice-President of the European Commission and EU Justice Commissioner, emphasized that Europe would not put its more stringent privacy rules at risk of weakening as part of the TTIP negotiations. She said in part,

Friends and partners do not spy on each other. Friends and partners talk and negotiate. For ambitious and complex negotiations to succeed there needs to be trust among the negotiating partners. That is why I am here in Washington: to help rebuild trust.

You are aware of the deep concerns that recent developments concerning intelligence issues have raised among European citizens. They have unfortunately shaken and damaged our relationship.

The close relationship between Europe and the USA is of utmost value. And like any partnership, it must be based on respect and trust. Spying certainly does not lead to trust. That is why it is urgent and essential that our partners take clear action to rebuild trust….

The relations between Europe and the US run very deep, both economically and politically. Our partnership has not fallen from the sky. It is the most successful commercial partnership the world has ever seen. The energy it injects into our economies is measured in millions, billions and trillions – of jobs, trade and investment flows. The Transatlantic Trade and Investment Partnership could improve the figures and take them to new highs.

But getting there will not be easy. There are challenges to get it done and there are issues that will easily derail it. One such issue is data and the protection of personal data.

This is an important issue in Europe because data protection is a fundamental right. The reason for this is rooted in our historical experience with dictatorships from the right and from the left of the political spectrum. They have led to a common understanding in Europe that privacy is an integral part of human dignity and personal freedom. Control of every movement, every word or every e-mail made for private purposes is not compatible with Europe’s fundamental values or our common understanding of a free society.

This is why I warn against bringing data protection to the trade talks. Data protection is not red tape or a tariff. It is a fundamental right and as such it is not negotiable….

Beyond the TTIP talks, the divergence between European and U.S. privacy practices is putting new pressure on an existing legal framework, the Safe Harbor that was adopted after the enactment of the EU Data Protection Directive. A number of EU committees and political groups are either criticizing or recommending revocation of the Safe Harbor, a development that could significantly change the risk management calculus for the numerous companies which move personal information between the United States and Europe.