The Secure Times

An online forum of the ABA Section of Antitrust Law's Privacy and Information Security Committee



Privacy and Data Protection Impacting International Trade Talks

European Commission

The European Union and the United States are currently negotiating a broad compact on trade called the Transatlantic Trade and Investment Partnership (“TTIP”). While the negotiations themselves are non-public, among the issues that are reported to be potential obstacles to agreement are privacy and data protection. Not only does the European Union mandate a much stronger set of data protection and privacy laws for its member states than exist in the United States, but recent revelations of U.S. surveillance practices (including of European leaders) have highlighted the legal and cultural divide.

In an October 29, 2013 speech in Washington, D.C., Viviane Reding, Vice-President of the European Commission and EU Justice Commissioner, emphasized that Europe would not put its more stringent privacy rules at risk of weakening as part of the TTIP negotiations. She said in part,

Friends and partners do not spy on each other. Friends and partners talk and negotiate. For ambitious and complex negotiations to succeed there needs to be trust among the negotiating partners. That is why I am here in Washington: to help rebuild trust.

You are aware of the deep concerns that recent developments concerning intelligence issues have raised among European citizens. They have unfortunately shaken and damaged our relationship.

The close relationship between Europe and the USA is of utmost value. And like any partnership, it must be based on respect and trust. Spying certainly does not lead to trust. That is why it is urgent and essential that our partners take clear action to rebuild trust….

The relations between Europe and the US run very deep, both economically and politically. Our partnership has not fallen from the sky. It is the most successful commercial partnership the world has ever seen. The energy it injects into our economies is measured in millions, billions and trillions – of jobs, trade and investment flows. The Transatlantic Trade and Investment Partnership could improve the figures and take them to new highs.

But getting there will not be easy. There are challenges to get it done and there are issues that will easily derail it. One such issue is data and the protection of personal data.

This is an important issue in Europe because data protection is a fundamental right. The reason for this is rooted in our historical experience with dictatorships from the right and from the left of the political spectrum. They have led to a common understanding in Europe that privacy is an integral part of human dignity and personal freedom. Control of every movement, every word or every e-mail made for private purposes is not compatible with Europe’s fundamental values or our common understanding of a free society.

This is why I warn against bringing data protection to the trade talks. Data protection is not red tape or a tariff. It is a fundamental right and as such it is not negotiable….

Beyond the TTIP talks, the divergence between European and U.S. privacy practices is putting new pressure on an existing legal framework, the Safe Harbor that was adopted after the enactment of the EU Data Protection Directive. A number of EU committees and political groups are either criticizing or recommending revocation of the Safe Harbor, a development that could significantly change the risk management calculus for the numerous companies which move personal information between the United States and Europe.



Recent FTC Actions and Statements Show Continuing Focus on Privacy

The Federal Trade Commission has long taken a lead role in issues of privacy and data protection, under its general consumer protection jurisdiction under Section 5 of the FTC Act (15 U.S.C. §45) as well as specific legislation such as the Children’s Online Privacy Protection Act of 1998 (“COPPA”) (which itself arose out of FTC reports). The FTC continues to bring legal actions against companies it believes have improperly collected, used or shared consumer personal information, including the recent settlement of a complaint filed against Aaron’s, Inc., a national rent-to-own retail chain based in Atlanta, GA. In its October 22, 2013 press release announcing the settlement, the FTC described Aaron’s alleged violations of Section 5:

Aaron’s, Inc., a national, Atlanta-based rent-to-own retailer, has agreed to settle FTC charges that it knowingly played a direct and vital role in its franchisees’ installation and use of software on rental computers that secretly monitored consumers including by taking webcam pictures of them in their homes.

According to the FTC’s complaint, Aaron’s franchisees used the software, which surreptitiously tracked consumers’ locations, captured images through the computers’ webcams – including those of adults engaged in intimate activities – and activated keyloggers that captured users’ login credentials for email accounts and financial and social media sites….

The complaint alleges that Aaron’s knew about the privacy-invasive features of the software, but nonetheless allowed its franchisees to access and use the software, known as PC Rental Agent. In addition, Aaron’s stored data collected by the software for its franchisees and also transmitted messages from the software to its franchisees. In addition, Aaron’s provided franchisees with instructions on how to install and use the software.

The software was the subject of related FTC actions earlier this year against the software manufacturer and several rent-to-own stores, including Aaron’s franchisees, that used it. It included a feature called Detective Mode, which, in addition to monitoring keystrokes, capturing screenshots, and activating the computer’s webcam, also presented deceptive “software registration” screens designed to get computer users to provide personal information.

The FTC’s Consent Order Agreement with Aaron’s prohibits the company from using keystroke- or screenshot-monitoring software or activating the consumer’s microphone or webcam, and requires it to obtain express consent before installing location-tracking technology and to provide notice when that technology is activated. Aaron’s may not use any data it received through improper activities in collections actions, must destroy illegally obtained information, and must encrypt any transmitted location or tracking data it properly collects.

The FTC is also continuing its efforts to educate and promote best practices about privacy for both consumers and businesses. On October 28, 2013, FTC Commissioner Julie Brill published an opinion piece in Advertising Age magazine entitled Data Industry Must Step Up to Protect Consumer Privacy. In the piece, Commissioner Brill criticizes data collection and marketing firms for failing to uphold basic privacy principles, and calls on them to join an initiative called “Reclaim Your Name” which Commissioner Brill announced earlier this year.

Brill writes in AdAge:

The concept is simple. Through creation of consumer-friendly online services, Reclaim Your Name would empower the consumer to find out how brokers are collecting and using data; give her access to information that data brokers have amassed about her; allow her to opt-out if a data broker is selling her information for marketing purposes; and provide her the opportunity to correct errors in information used for substantive decisions.

Improving the handling of sensitive data is another part of Reclaim Your Name. Data brokers that participate in Reclaim Your Name would agree to tailor their data handling and notice and choice tools to the sensitivity of the information at issue. As the data they handle or create becomes more sensitive — relating to health conditions, sexual orientation and financial condition, for example — the data brokers would provide greater transparency and more robust notice and choice to consumers.

For more information on the FTC’s privacy guidance and enforcement, see the privacy and security section of the FTC Web site.



Upcoming One-Hour Teleseminar on the Amended COPPA Rule – Tomorrow, Wednesday, October 30

Please register for a one-hour teleseminar entitled:

The Amended COPPA Rule:  Adapting to the Final Implementation

Wednesday, October 30, 2013
12:00 – 1:00 p.m. EST

Click HERE to register

In July 2013, the amended Children’s Online Privacy Protection Act (COPPA)
Rule went into effect. How is the industry dealing with the new and revised
COPPA provisions, such as the new definition of personal information, mixed-audience
sites, and liability for third-party plug-ins?  What are the expectations
of regulators?  Our expert panelists will offer different perspectives on key
provisions and implementation of the revised rule, including compliance,
enforcement, and education.  We will also cover traps for the unwary and best
practices regarding the collection and use of children’s personal
information.

Panelists:
Kristin Cohen, Federal Trade Commission, Washington, D.C.
Elizabeth Blumenfeld, Crowell & Moring LLP, Washington, D.C.
Phyllis Spaeth, Children’s Advertising Review Unit of the Council of Better Business Bureaus, New York, NY

Moderator:
Erika Brown Lee, Norton Rose Fulbright LLP, Washington, D.C.



Dismissal of $16 Million Class Action Based on Theft of Patient Information Where No Evidence that Data Was “Released” May Provide Ammunition for Defending Breach Class Actions

A California appellate court recently dismissed a putative class action alleging that UCLA violated the California Confidentiality of Medical Information Act (CMIA) when an employee lost an encrypted hard drive containing 16,000 patient records.  The court concluded that the plaintiff’s claim—which sought a whopping $16 million based on $1,000 in nominal damages for each record—failed to allege that any “release” to a third party actually occurred.  Although the decision ostensibly applies only to the CMIA, it has potential broader implications for entities defending class actions seeking damages for data breaches. 

Background

In November 2011, UCLA notified approximately 16,000 patients that an encrypted hard drive containing personally identifiable medical information had been stolen from a physician’s home two months earlier.  The burglars also stole an index card stored near the hard drive that contained the encryption key.  On October 30, 2012, the plaintiff filed a putative class action alleging that UCLA “failed to have reasonable systems and controls in place to prevent the removal of protected health information from the hospital premises and as a result it negligently lost possession of the hard drive and encryption passwords.”  Although the plaintiff did not claim actual damages, she sought $1,000 in nominal damages for each of the 16,000 class members.

UCLA moved to dismiss the complaint, arguing that the hard drive theft did not constitute a “release” of information, which UCLA claimed was a prerequisite to recover nominal damages under the CMIA.   Although the lower court agreed that the hard drive theft did not amount to a “release” of information (and struck that part of the complaint), the court also held that the CMIA provided remedies for negligent “maintenance, preservation, and storage of confidential data” (and thus permitted a claim to proceed on that basis). 

The Appellate Opinion

The appellate court reversed, holding that the CMIA’s “remedies” section (Section 56.36(b)) permits the recovery of nominal damages only for the negligent “release” of confidential information.  The court acknowledged that the CMIA also requires medical information to be maintained and stored confidentially (in Section 56.101), and that entities that fail to do so “shall be subject to the remedies” set forth in Section 56.36(b).  The court held, however, that a “release” of information was still a prerequisite to recover nominal damages under Section 56.36(b) because the term “remedy” refers to the cause of action (i.e., for the “release” of information), not just the type of recovery (i.e., the nominal damages).  In other words, the negligent failure to maintain and store information confidentially permits recovery of nominal damages only if that failure results in a “release” of the information.

In holding that nominal damages are available only for the “release” of information, the court distinguished between the statutory availability of nominal damages for the “release” of information (in Section 56.36(b)) and the separate provision permitting compensatory and punitive damages for the “disclosure” of information (in Section 56.35).  To determine whether a “release” had occurred that would permit a nominal damages claim, the court therefore examined the difference between a “disclosure” and a “release.”  Whereas a “disclosure” requires an affirmative act of communication, the word “release” is much broader, encompassing situations where information is “allowed to move away or spread from a source” or allowed “to escape confinement” such that it was accessed by a third party.  Thus, although a “disclosure” may also fall into the broader category of a “release,” the converse is not necessarily true because a release may not involve an affirmative communication.  The court concluded that the plain meaning of “release” encompassed a situation where a physician negligently maintains confidential information and thereby allows it to be accessed by a third party.

Turning back to the plaintiff’s complaint, the court noted that the plaintiff claimed only that UCLA had negligently “lost possession” of the hard drive.  Even under a broad definition of the word “release,” the court held that merely losing possession of information was not a violation of the CMIA that permitted recovery of nominal damages.  Absent an allegation that a third party actually accessed the information, thereby breaching the confidentiality of the information, the plaintiff could not establish any “release” in violation of the CMIA. 

Take-Away Points

UCLA is hardly the first HIPAA-covered entity that has encountered the loss of a laptop or other storage device containing patient information – and it certainly will not be the last.  So what helpful information can be gleaned from this opinion?

First, and most directly, this opinion suggests that the CMIA does not permit the recovery of nominal damages where information is lost, but there is no evidence that it was actually “released” – i.e., viewed by a third party.  If this opinion stands, plaintiffs will not be able to morph a simple “negligent maintenance of records” claim into a claim for nominal damages under the CMIA short of an actual “release” of information.

Second, although the decision did not discuss the legal significance of encryption, the case highlights the trend toward encrypting data.  Here, the theft of the encryption key along with the device undermined the security that encryption normally provides, but it is questionable whether notification would have occurred or the class action ever brought if an encrypted device, but not the key, had been stolen. 
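The mechanics behind that observation are simple enough to sketch. In the toy example below (a minimal illustration, not production cryptography: a repeating-XOR stream stands in for real disk encryption such as AES, and `xor_encrypt` and the sample record are invented for illustration), an encrypted record is unreadable without the key, but anyone who steals the key along with the ciphertext recovers the plaintext trivially – which is why key custody matters as much as the encryption itself.

```python
# Toy sketch: encrypted data is only as safe as the custody of its key.
# A repeating-XOR stream stands in here for real disk encryption (e.g., AES);
# never use XOR like this for actual security.
import secrets

def xor_encrypt(data: bytes, key: bytes) -> bytes:
    """XOR each byte with the repeating key; the same call decrypts."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

record = b"patient no. 12345; diagnosis: [confidential]"
key = secrets.token_bytes(32)            # the "encryption password"

ciphertext = xor_encrypt(record, key)    # what a thief finds on the drive
assert ciphertext != record              # unreadable without the key

# But if the index card holding the key is stolen too...
assert xor_encrypt(ciphertext, key) == record   # ...encryption protects nothing
```

In the UCLA incident, because the burglars also took the key, the encryption offered no practical protection; had only the drive been stolen, the data would likely have remained unreadable.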

Finally, the opinion may provide persuasive authority for entities defending class actions in cases where a device is lost, but there is no evidence of misuse or access. Here, the court held that a “breach” (of confidentiality under the CMIA) only occurs when information is actually viewed.  Although this holding should not be construed as a modification to breach notification standards under different state breach notification laws (which typically focus on unauthorized “access” to and/or “acquisition” of personal information) or the Health Insurance Portability and Accountability Act (HIPAA) (which focuses on unauthorized “acquisition, access, use or disclosure” of protected health information that compromises that information), the decision may still provide ammunition for entities defending private class actions.  It remains to be seen how this will play out given the different definitions of “breach” in these various laws and the different causes of action that plaintiffs have brought, but this decision certainly provides entities with food for thought in lost device situations.



California Veto of Electronic Communication Bill Makes Case for Federal Action

This past weekend, California Gov. Jerry Brown vetoed legislation (SB 467) which would have required California law enforcement officials to get a warrant to access online communications. The current federal statute governing the search and seizure of these records is the Electronic Communications Privacy Act, known as ECPA for short. Enacted in 1986, many commentators believe that portions of ECPA have outlived their usefulness and that the law must be changed; that was the goal of SB 467.

ECPA consists of three main parts: Title III which outlaws unauthorized wiretaps while establishing procedures for law enforcement; the Stored Communications Act which deals with government access to stored electronic communications; and procedures governing the installation and use of pen registers. It is the Stored Communications Act portion that has become the focus of reform attempts. Written at a time when only a fraction of the population was using computer networks to communicate, it permits law enforcement to obtain the contents of electronic communications without a warrant so long as they are at least 180 days old and stored on a third party computer. With the advent of remote servers, cloud computing, and other realities of the internet age, advocates have been hoping for a broad rewrite of this seemingly arcane standard.

Efforts to reform the Stored Communications Act had a fair bit of momentum in the Senate prior to the 2012 election but stalled before Congress adjourned. In March of this year, Judiciary Chairman Sen. Patrick Leahy (D-Vt.) and Sen. Mike Lee (R-Utah) again introduced ECPA reform legislation to create a search-warrant requirement for electronic communications stored on third-party computers. The bill also requires notice to the individual whose communications have been seized within ten days of the warrant’s execution. Similar legislation has been introduced in the House. Both chambers seemed poised to act, but like so many other issues in the current Congress, the efforts have stalled over budget and fiscal issues.
The proposed California law paralleled the proposed Senate legislation in many ways, but departed significantly in its notice requirement. SB 467 would have mandated that individuals receive notice of the warrant within three days, a time frame more compressed than the ten days outlined in Chairman Leahy’s bill. This requirement brought out opposition within California’s law enforcement community, with police and prosecutors expressing their doubts.
In his veto statement Gov. Brown gave voice to those concerns saying, “The bill, however, imposes new requirements that go beyond those required by federal law and could impede ongoing criminal investigations.”

With this veto, the focus will (once Congress resolves, or again punts, its fiscal fights) return to the efforts of Sens. Lee and Leahy to move ECPA reform out of the Senate. With strong bipartisan backing, the question is more one of when, not if, that happens.



Update on House Bipartisan Privacy Working Group

On August 1, the Commerce, Manufacturing, and Trade Subcommittee issued a press release announcing the creation of a bipartisan Privacy Working Group in the U.S. House of Representatives.  The group is co-chaired by Representatives Marsha Blackburn (R-Tenn.) and Peter Welch (D-Vt.) and also includes Representatives Joe Barton (R-Tex.), Pete Olson (R-Tex.), Mike Pompeo (R-Kan.), Jan Schakowsky (D-Ill.), Bobby Rush (D-Ill.), and Jerry McNerney (D-Cal.). The Subcommittee’s announcement stressed the group’s plans to take a bipartisan approach and the need to balance a consumer-oriented focus with fostering growth and innovation.

The group held its first meeting on September 26, inviting participants from the private sector for an informal, roundtable format discussion about how companies deal with online privacy issues and their thoughts regarding Congressional involvement. Representatives from Google, Walmart and data broker BlueKai attended, with discussions centering on the companies’ data collection practices.  Referring to the crash-course nature of the discussion, Blackburn called it “Privacy 101.”

In response to the closed-door nature of the September 26 discussion, consumer groups including the Electronic Privacy Information Center, Consumer Federation of America, Center for Digital Democracy, Consumer Watchdog, U.S. PIRG, and Consumer Action sent Representatives Blackburn and Welch a letter dated October 1, acknowledging the importance of the meetings but calling for the Working Group’s meetings to be conducted in a public format in accordance with the “Open meetings and hearings” provision of the Rules of the House of Representatives for the 113th Congress. Specifically, the letter signatories stressed the need for the group’s meetings to be ones in which a public record is created, reporters and news organizations are in attendance, and various viewpoints are heard.

While it remains to be seen what Blackburn’s and Welch’s response will be to the consumer privacy groups’ letter, nine additional sessions are planned for the next several months. The intention is to include members of industry, government and consumer groups and “create the conditions where there is some capacity for consensus,” according to Welch.



Microsoft’s Xbox One Kinect Consumer Privacy Concerns: “You are fully in control of your personal data.”


This November, consumers will be able to get their hands on the highly anticipated “next-generation” gaming consoles from both Microsoft and Sony. The PlayStation 4 will be Sony’s fourth gaming console under its PlayStation brand, while the Xbox One will be Microsoft’s third. In an effort both to differentiate the Xbox One from the PlayStation 4 and to encourage software-developer adoption of its motion-sensing peripheral – the Xbox One Kinect – Microsoft has opted to ship each Xbox One with the new Kinect at a price point of $500. Amid growing consumer privacy concerns stemming from allegations of Microsoft’s participation in the NSA’s PRISM program, Microsoft has made a number of attempts to assuage the apprehensions of those who see the Xbox One Kinect as nothing more than this year’s best-selling Trojan horse.

Most recently, Ad Age posted an article quoting Microsoft’s VP of marketing and strategy, Yusuf Mehdi, who spoke at the Association of National Advertisers Masters of Marketing Conference in Phoenix, Arizona on October 5. As originally reported – and as apparently interpreted by those in attendance – Mr. Mehdi’s description of Microsoft’s Xbox Live platform as a future outlet for advertisers, the “holy grail in terms of how you understand the consumer,” was misconstrued to mean that biometric data captured by the Xbox One Kinect would in some way be accessible to advertisers. Considering the Xbox One Kinect’s impressive biometric capabilities, one can reasonably understand how this particular audience could draw such a conclusion. Ad Age has since amended its article to reflect Microsoft’s clarification that Mr. Mehdi’s comments were misunderstood.

While the novelty of playing games with the original Kinect was arguably met with mixed reactions when it launched back in 2010, Microsoft hopes that many of the impressive, yet somewhat alarming, improvements made for its successor will prove to be welcome additions for consumers and software developers alike. The Xbox One Kinect boasts a number of new features, including the ability to do the following:

  • Recognize multiple moving subjects – even in the dark, thanks to its 1080p IR camera lens;
  • Distinguish between multiple voices within listening distance; and
  • Detect one’s heart rate.

Unlike its predecessor, this version of the Kinect is designed to always be on. That is, even when the Xbox One is turned off or not otherwise in use, the Kinect – in what some have likened to HAL 9000’s omnipresence in Stanley Kubrick’s 1968 film 2001: A Space Odyssey – is always there, listening and watching for an audible command.

Despite Microsoft’s best efforts to reassure consumers that they “are in control of [their own] personal data” when it comes to what the Xbox One’s Kinect shares, there is no denying the risk of misuse and inadvertent disclosure. According to Microsoft, there are several privacy settings that Kinect owners can use to control how their biometric data will be used, including pausing the Kinect or disabling its ability to start the console via voice commands. Moreover, Microsoft states that biometric data collected by the Kinect – videos, photos, and yes, even your heart rate – “will not leave your Xbox One without your explicit permission.”

The question is, however, what happens if system vulnerabilities are discovered and exploited? Moreover, beyond the hacking community’s various motivations for targeting the Kinect’s controls, to what extent are consumers willing to forgo the privacy of their living room for the utility of Microsoft’s next-gen console?

Microsoft’s Xbox One gaming console, including the new Kinect, is scheduled to be released on November 22.