The Secure Times

An online forum of the ABA Section of Antitrust Law's Privacy and Information Security Committee



Yesterday at the FTC, President Obama Announced Plans for New Data Privacy and Security Laws: a Comprehensive Data Breach Notification Law, the Consumer Privacy Bill of Rights, and the Student Digital Privacy Act

Yesterday afternoon, President Barack Obama gave a quip-filled speech at the Federal Trade Commission, where he praised the FTC’s efforts in protecting American consumers over the past 100 years and unveiled his plans for legislation to protect American consumers from identity theft and to protect schoolchildren’s personal information from being used by marketers.  These plans build upon past legislative efforts and the Administration’s focus on cybersecurity, Big Data, and consumer protection.  Specifically, on February 23, 2012, the White House released “Consumer Data Privacy in a Networked World: A Framework for Protecting Privacy and Promoting Innovation in the Global Digital Economy” (the “Privacy Blueprint”), and in January 2014, President Obama asked his Counselor, John Podesta, to lead a working group to examine Big Data’s impact on government, citizens, businesses, and consumers.  The working group produced its report, Big Data: Seizing Opportunities, Preserving Values, on May 1, 2014.

In his speech, the President highlighted the need for increased privacy and security protections as more people go online to conduct their personal business—shop, manage bank accounts, pay bills, handle medical records, manage their “smart” homes, etc.—stating that “we shouldn’t have to forfeit our basic privacy when we go online to do our business”.  The President referenced his “BuySecure” initiative, which would combat credit card fraud through a chip-and-PIN system for credit cards and credit card readers issued by the United States government.  In that system, a microchip embedded in the credit card would replace the magnetic stripe, since microchips are much harder than magnetic stripes for thieves to clone.  The consumer would also enter a PIN into the credit card reader, just as with an ATM or debit card.  The President praised those credit card issuers, banks, and lenders that allow consumers to view their credit scores for free.  He also lauded the FTC’s efforts to help identity theft victims by working with credit bureaus and by providing guidance to consumers on its website, identitytheft.gov.
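To make the contrast concrete, here is a minimal, purely illustrative sketch in Python of why copied magnetic-stripe data is enough to counterfeit a card while a chip-and-PIN card is not. The class names, fields, and the toy cryptogram scheme are assumptions for illustration only and do not reflect the actual EMV specification.

```python
# Toy illustration only; not the EMV specification. All names are assumptions.
import hashlib
import hmac
import os
from dataclasses import dataclass


@dataclass
class MagStripeCard:
    """A magnetic stripe is static data: anyone who reads it can clone it."""
    track_data: str

    def clone(self) -> "MagStripeCard":
        return MagStripeCard(self.track_data)  # a perfect, usable copy


@dataclass
class ChipCard:
    """A chip keeps its secret internal and only answers challenges."""
    _pin: str
    _secret_key: bytes  # never leaves the chip on a real card

    def verify_pin(self, entered_pin: str) -> bool:
        # The PIN is checked against the chip (or the issuer), not read off the card.
        return hmac.compare_digest(entered_pin, self._pin)

    def sign_transaction(self, amount_cents: int) -> str:
        # Each transaction gets a unique cryptogram, so recorded output cannot
        # simply be replayed the way copied stripe data can.
        nonce = os.urandom(8).hex()
        message = f"{amount_cents}:{nonce}".encode()
        mac = hmac.new(self._secret_key, message, hashlib.sha256).hexdigest()
        return f"{nonce}:{mac}"


def terminal_approve(card: ChipCard, entered_pin: str, amount_cents: int) -> bool:
    """In this toy model, the terminal approves only if the entered PIN checks out."""
    if not card.verify_pin(entered_pin):
        return False
    return bool(card.sign_transaction(amount_cents))


if __name__ == "__main__":
    stripe = MagStripeCard("4111111111111111|12/25|123")
    print(stripe.clone() == stripe)                  # True: the copy is indistinguishable
    chip = ChipCard(_pin="4932", _secret_key=os.urandom(32))
    print(terminal_approve(chip, "0000", 2599))      # False: wrong PIN, no approval
    print(terminal_approve(chip, "4932", 2599))      # True: correct PIN
```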

The first piece of legislation the President discussed briefly was a comprehensive breach notification law that would require companies to notify consumers of a breach within 30 days and that would allow identity thieves to be prosecuted even when the criminal activity occurs overseas.  Currently, there is no federal breach notification law; instead, many states have laws requiring companies to notify affected consumers and/or regulators, depending on the type of information compromised and the jurisdiction in which the organization operates.  The state laws also require that breach notification letters to consumers include certain information, such as the risks posed to the individual as a result of the breach along with steps to mitigate the harm.  This “patchwork of laws,” President Obama noted, is confusing to consumers and costly for companies to comply with.  The plan to introduce a comprehensive breach notification law adopts the policy recommendation from the Big Data Report that Congress pass legislation providing a single national data breach standard along the lines of the Administration’s May 2011 cybersecurity legislative proposal.  Such legislation should impose reasonable time periods for notification, minimize interference with law enforcement investigations, and potentially prioritize notification about large, damaging incidents over less significant incidents.

The President next discussed the second piece of legislation he would propose, the Consumer Privacy Bill of Rights.  This initiative is not new: Electronic Privacy Bills of Rights were introduced in 1998 and 1999, and in 2011, Senators John Kerry, John McCain, and Amy Klobuchar introduced S.799 – Commercial Privacy Bill of Rights Act of 2011.  The Administration’s Privacy Blueprint of February 23, 2012 set forth the Consumer Privacy Bill of Rights and, along with the Big Data Report, directed the Department of Commerce’s National Telecommunications and Information Administration (NTIA) to seek comments from stakeholders in order to develop legally enforceable codes of conduct that would apply the Consumer Privacy Bill of Rights to specific business contexts.

The Big Data Report of May 1, 2014 recommended that the Department of Commerce seek stakeholder and public comment on big data developments and how they impact the Consumer Privacy Bill of Rights draft, and that it consider legislative text for the President to submit to Congress.  On May 21, 2014, Senator Robert Menendez introduced S.2378 – Commercial Privacy Bill of Rights Act of 2014.  The Consumer Privacy Bill of Rights sets forth seven basic principles:

1) Individual control – Consumers have the right to exercise control over what information data companies collect about them and how it is used.

2) Transparency – Consumers have the right to easily understandable and accessible privacy and security practices.

3) Respect for context – Consumers expect that data companies will collect, use, and disclose the information they provide in ways consistent with the context in which it was provided.

4) Security – Consumers have the right to secure and responsible handling of personal data.

5) Access and accuracy – Consumers have the right to access and correct their personal data in usable formats in a manner that is appropriate to the data’s sensitivity and the risk of adverse consequences if the data is not accurate.

6) Focused Collection – Consumers have the right to reasonable limits on the personal data that companies collect and retain.

7) Accountability – Consumers have the right to have their personal data handled by companies with appropriate measures in place to assure compliance with the Consumer Privacy Bill of Rights.

The President next discussed the third piece of legislation he would propose, the Student Digital Privacy Act.  The President noted how new educational technologies, including tailored websites, apps, tablets, digital tutors, and textbooks, transform how children learn and help parents and teachers track students’ progress.  With these technologies, however, companies can mine student data for non-educational, commercial purposes such as targeted marketing.  The Student Digital Privacy Act adopts the Big Data Report’s policy recommendation of ensuring that students’ data, collected in an educational context, is used only for educational purposes and that students are protected against having their data shared or used inappropriately.  The President noted that the Student Digital Privacy Act would not “reinvent the wheel” but would mirror, at the federal level, state legislation, specifically the California law taking effect next year that bars education technology companies from selling student data or using that data to target students with ads.  The current federal law that protects students’ privacy is the Family Educational Rights and Privacy Act of 1974, which protects official educational records from being released by schools but does not protect against companies’ data mining that reveals students’ habits and profiles for targeted advertising.

The President also highlighted current self-regulation, the Student Privacy Pledge, signed by 75 education technology companies committing voluntarily not to sell student information or use education technologies to send students targeted ads.  It has been debated whether self-regulation will work and whether the proposed Act would go far enough.  The President remarked that parents want to make sure their children are being smart and safe online, and that while this is parents’ responsibility, structure is needed so that information is not gathered about students without the parents or the kids knowing about it.  This hinted at a notification requirement and opt-out for student data mining that is missing from the state legislation but is a requirement of the Children’s Online Privacy Protection Act of 1998 (COPPA).  Specifically, COPPA requires companies and commercial website operators that direct online services to children under 13, that collect personal information from children under 13, or that know they are collecting personal information from children under 13 to provide parents with notice about the site’s information-collection practices, obtain verifiable consent from parents before collecting personal information, give parents a choice as to whether the personal information will be disclosed to third parties, and give parents access to and the opportunity to delete their children’s personal information, among other things.

President Obama noted that his speech marked the first time in 80 years—since FDR—that a President had come to the FTC.  His speech at the FTC on Monday was the first stop of a three-part tour leading up to his State of the Union address.  Next, the President planned to speak at the Department of Homeland Security on how the government can collaborate with the private sector to ward off cyberattacks.  His final speech will take place in Iowa, where he will discuss how to bring faster, cheaper broadband access to more Americans.



FTC Examines Internet of Things, Privacy and Security, in Recent Workshop

On November 19, 2013, the Federal Trade Commission held a day-long workshop, “Internet of Things: Privacy and Security in a Connected World,” on the privacy and security implications of devices such as cars, home appliances, fitness equipment, and other machines that can gather data and connect to the internet.  For consumers, these devices can help track health, remotely monitor aging family members, reduce utility bills, and even send alerts to buy more milk.

Ubiquitous Internet of Things

Technological advances and new business models centered around the internet of things have taken off.  It has been reported that crowd-sourcing start-up Quirky has teamed up with GE to develop connected-home products.  Another start-up company is developing tracking technology through GPS-embedded tags.  On November 20, 2013, Qualcomm announced that it has developed a line of chips for the internet of things space.  It has been argued that companies should adjust their business models and use the internet of things to connect to customers.  These developments present the FTC with the challenge of keeping up with technology to protect consumers and the competitive landscape.

In her remarks, Chairwoman Edith Ramirez emphasized how ubiquitous smart devices have become.  Five years ago, she remarked, there were already more “things” than people connected to the Internet; by 2015, there will be an estimated twenty-five billion things connected to the Internet, and by 2020, an estimated fifty billion.  Commissioner Maureen Ohlhausen, in her remarks later in the workshop, stated that the FTC will conduct policy research to understand the effects that technological advances and innovative business models concerning the internet of things have on consumers and the marketplace.

Privacy and Security Challenges

Chairwoman Ramirez noted privacy and security challenges presented by the internet of things. Privacy risks are present since devices connected to the internet can collect, compile, and transmit information about consumers in ways that may not have been expected. When aggregated, the data pieces collected by devices present “a deeply personal and startlingly complete picture of each of us.” Security risks are present since “any device connected to the Internet is potentially vulnerable to hijack.” Indeed, these risks have been reported and present real concerns.

Chairwoman Ramirez noted that the FTC will be vigilant in bringing enforcement actions against companies that fail to properly safeguard consumers from security breaches.  She noted as an example the FTC’s first enforcement foray into the internet of things, against TRENDnet, for failing to properly design and test the software for its internet-connected security cameras, leaving consumers vulnerable to a hacker who accessed the live feeds from 700 cameras and made them available on the Internet.  Commissioner Ohlhausen stated that when it encounters consumer harm, the FTC will use its traditional enforcement tools to challenge any potential threats that arise, much as it has done in the data security, mobile, and big data spaces.

Chairwoman Ramirez said that companies that take part in the internet of things ecosystem are “stewards of the consumer data” and that “with big data comes big responsibility.” The FTC has published a number of best practices that Chairwoman Ramirez identified as useful for companies in the internet of things space: (1) privacy by design—privacy protections built in from the outset, (2) simplified consumer choice—allowing consumers to control their data, and (3) transparency—disclosure of what information the devices collect and how it is being used.

FTC Report Forthcoming

The FTC will produce a report on what it has learned from the November 19 workshop and provide further recommendations about best practices.  The FTC report can educate consumers and businesses on how to maximize consumer benefits and avoid or minimize any identified risks.  Commissioner Ohlhausen stressed that the FTC should identify whether existing laws and existing regulatory structures, including self-regulation, are sufficient to address potential harms.

Vint Cerf of Google, who gave the keynote presentation, advised that rather than relying on regulations to protect privacy, social conventions should be developed. He stated that “while regulation might be helpful, an awful lot of the problems that we experience with privacy is a result of our own behavior.”

The same day as the workshop, the Future of Privacy Forum released a white paper arguing for an updated privacy paradigm for the internet of things that focuses not on how information is collected and communicated but on how organizations use personally identifiable information.

The FTC will continue to accept comments until January 10, 2014.



State Attorneys General Enter into $17 Million Settlement with Google Over Safari Web Tracking

Thirty-six states and the District of Columbia recently entered into a $17 million settlement and injunction regarding Google’s use of third-party tracking cookies on the Safari web browser.  This settlement puts to bed the Attorneys General’s allegations that Google misled consumers and violated state consumer protection and privacy laws by circumventing Safari’s default privacy settings.

Third-party cookies are small files placed on a web browser, often by advertising networks, that track users as they visit websites and that gather data on users’ browsing activities.  Google operates a search engine and owns an ad network, DoubleClick.  Apple’s Safari web browser has a default privacy setting that blocks third-party cookies that track a consumer’s browsing history, including those used by DoubleClick.
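As a rough, hypothetical sketch of the mechanics (the port, paths, and the “uid” cookie below are invented for illustration): when a publisher’s page embeds an ad served from an ad network’s domain, the browser requests that resource directly from the ad server, which can answer with a Set-Cookie header; on every later visit to any site carrying that network’s ads, the browser sends the cookie back, letting the network link those visits together unless the browser blocks third-party cookies, as Safari does by default.

```python
# Minimal sketch of an ad server that sets a third-party tracking cookie.
# Hypothetical and simplified; the "uid" cookie and the setup are illustrative.
import uuid
from http.server import BaseHTTPRequestHandler, HTTPServer


class AdServerHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        cookie_header = self.headers.get("Cookie", "")
        self.send_response(200)
        if "uid=" not in cookie_header:
            # First time this browser fetches an ad: assign it a persistent ID.
            uid = uuid.uuid4().hex
            self.send_header("Set-Cookie", f"uid={uid}; Path=/; Max-Age=31536000")
        # On later requests, from any site that embeds this server's content,
        # the browser sends the uid cookie back, and the server can log the
        # (uid, Referer) pairs to build a cross-site browsing profile.
        self.send_header("Content-Type", "text/plain")
        self.end_headers()
        self.wfile.write(b"ad content")


if __name__ == "__main__":
    # A publisher page would embed, e.g., <img src="http://localhost:8000/ad">.
    HTTPServer(("localhost", 8000), AdServerHandler).serve_forever()
```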

Starting in June 2011, Google altered its DoubleClick coding to circumvent Safari’s default cookie-blocking privacy settings without users’ knowledge or consent.  This practice ended in February 2012, after Stanford researcher Jonathan Mayer discovered that Google and three other online-advertising companies had circumvented Safari’s cookie blocking.  In addition to tracking without consent, Google misled Safari users by suggesting that they need not install a Google opt-out plugin to block third-party advertising cookies because Safari already blocks all third-party cookies by default.

The settlement with the states follows Google’s $22.5 million settlement with the Federal Trade Commission, reached in August 2012, over the FTC’s allegation that Google violated an earlier FTC order by misrepresenting to Safari users that it would not place cookies on their browsers.



Amicus Briefs Filed Asking Court to Determine if Warrantless Searches of Cell Phone Data Are Permissible Under the Fourth Amendment

Two recent petitions for certiorari were filed regarding whether the Fourth Amendment permits police officers to search all or some of the digital contents of an arrestee’s cell phone incident to arrest.  Federal courts of appeals and state courts of last resort are divided on this issue.  On July 30, 2013, a petition for certiorari was filed asking the Supreme Court to review a California Court of Appeal, Fourth District case, Riley v. California.  On August 19, 2013, the U.S. Solicitor General submitted an amicus brief asking the Supreme Court to reverse the First Circuit Court of Appeals’ decision in U.S. v. Wurie.  These cases are noteworthy since they touch on arrestees’ rights to their cell phone data and since the Fourth Amendment is a bedrock of privacy law in the United States.

In U.S. v. Wurie, the police confiscated the arrestee’s Verizon LG flip-phone and retrieved the phone number of an incoming call labeled “my house.”  The police used that phone number to determine the arrestee’s residence and gather further evidence.  In Riley v. California, the police searched the arrestee’s smartphone, made an extensive search of its digital contents, and were able to gather evidence linking the arrestee to more serious crimes.  In both instances, the police conducted the searches without a warrant pursuant to the search-incident-to-arrest exception to the Fourth Amendment, which allows police officers to perform a class of searches deemed potentially necessary to preserve destructible evidence or protect police officers.

The question of whether the search of cell phone data could ever be justified under the search-incident-to-arrest exception has come up in federal and state courts in the past, with some courts finding that warrantless cell phone data searches are categorically lawful and others upholding only a limited search.  In Riley, the California Court of Appeal held that because the cell phone was immediately associated with the arrestee’s person at the time of his arrest, the warrantless search was valid.  The First Circuit joined at least two state courts of last resort in creating a bright-line rule rejecting all warrantless cell phone data searches, and it declined to create a rule based on particular instances.  In its amicus brief, the Solicitor General argued that even if cell phone data searches do not fall under the search-incident-to-arrest exception, the First Circuit erred in imposing a blanket prohibition.

Cell phone data searches struck the First Circuit as “a convenient way for the police to obtain information related to a defendant’s crime of arrest—or other, as yet undiscovered crimes—without having to secure a warrant.”  In rendering its opinion, the court found that data contained on cell phones, such as photographs, videos, written and audio messages, contacts, calendar appointments, web search and browsing history, purchases, and financial and medical records, is highly personal in nature, would previously have been stored in one’s home, and reflects private thoughts and activities.  Additionally, the court noted that certain applications, if installed on modern cell phones, provide direct access to the home by remotely connecting to a home computer’s webcam.  Given the highly personal nature of the data and the scope of the search, potentially a home search, the court found that cell phone data is categorically different from otherwise allowable categories of searches incident to arrest.

The First Circuit rejected the government’s argument that the cell phone data search was necessary to prevent evidence from being destroyed by remote wiping before a warrant issued.  The First Circuit noted that the police have evidence preservation methods, such as removing the phone’s battery, turning off the phone, placing the phone in a device that blocks external electromagnetic radiation, or by making a mirror copy of the phone’s entire contents.  Unlike other circuits, the First Circuit viewed the “slight and truly theoretical risk of evidence destruction,” a risk that was “‘remote’ indeed,” as insufficient when weighed against the “significant privacy implications inherent in cell phone data searches.”  In its amicus brief, the Solicitor General argued that cell phone searches are more critical to preserving extractable evidence than previously allowed searches since co-conspirators could remove data remotely. 

The First Circuit also rejected the government’s argument that searches of items carried on one’s person are justified because an arrestee has a reduced expectation of privacy as a result of the arrest.  This was the basis for the California Court of Appeal’s decision in Riley, and the Solicitor General tried to revive the argument in its amicus brief.  The First Circuit rejected it because, at the time of the precedent cited, the court “could not have envisioned a world in which the vast majority of arrestees would be carrying on their person an item containing not physical evidence but a vast storage of intangible data—data that is not immediately destructible and poses no threat to the arresting officers.”  Allowing police to search such data at the time of arrest would create, in the court’s view, “a serious and recurring threat to the privacy of countless individuals.”

In making its categorical ban on warrantless cell phone data searches under the search-incident-to-arrest exception, the First Circuit noted that the exigent circumstances exception to the Fourth Amendment warrant requirement might apply where the police have probable cause to believe that the phone contains evidence of a crime, as well as a compelling need to act quickly, that makes it impractical for them to obtain a warrant.