The Secure Times

An online forum of the ABA Section of Antitrust Law's Privacy and Information Security Committee


Leave a comment

Yesterday at the FTC, President Obama Announced Plans for New Data Privacy and Security Laws: a Comprehensive Breach Notification Law, the Consumer Privacy Bill of Rights, and the Student Digital Privacy Act

Yesterday afternoon, President Barack Obama gave a quip-filled speech at the Federal Trade Commission, where he praised the FTC’s efforts to protect American consumers over the past 100 years and unveiled his plans for legislation to protect American consumers from identity theft and to keep schoolchildren’s personal information from being used by marketers. These plans build upon past legislative efforts and the Administration’s focus on cybersecurity, Big Data, and consumer protection. Specifically, on February 23, 2012, the White House released “Consumer Data Privacy in a Networked World: A Framework for Protecting Privacy and Promoting Innovation in the Global Digital Economy” (the “Privacy Blueprint”), and in January 2014, President Obama asked his Counselor, John Podesta, to lead a working group to examine Big Data’s impact on government, citizens, businesses, and consumers. The working group produced its report, Big Data: Seizing Opportunities, Preserving Values, on May 1, 2014.

In his speech, the President highlighted the need for increased privacy and security protections as more people go online to conduct their personal business—shop, manage bank accounts, pay bills, handle medical records, manage their “smart” homes, etc.—stating that “we shouldn’t have to forfeit our basic privacy when we go online to do our business.” The President referenced his “BuySecure” initiative, which would combat credit card fraud through a “chip-and-PIN” system for credit cards and card readers issued by the United States government. In that system, a microchip embedded in the credit card would replace the magnetic stripe, since microchips are harder for thieves to clone. The consumer would also need to enter a PIN into the card reader, just as with an ATM or debit card. The President praised those credit card issuers, banks, and lenders that allow consumers to view their credit scores for free. He also lauded the FTC’s efforts to help identity theft victims by working with credit bureaus and by providing guidance to consumers on its website, identitytheft.gov.

The first piece of legislation the President discussed briefly was a comprehensive breach notification law that would require companies to notify consumers of a breach within 30 days and would allow identity thieves to be prosecuted even when the criminal activity occurs overseas. Currently, there is no general federal breach notification law; instead, many states have laws requiring companies to notify affected consumers and/or regulators, depending on the type of information compromised and the jurisdiction in which the organization operates. The state laws also require that breach notification letters to consumers include certain information, such as the risks posed to the individual as a result of the breach and steps to mitigate the harm. This “patchwork of laws,” President Obama noted, is confusing to consumers and costly for companies to comply with. The plan to introduce a comprehensive breach notification law adopts the Big Data Report’s recommendation that Congress pass legislation providing a single national data breach standard along the lines of the Administration’s May 2011 cybersecurity legislative proposal. Such legislation should impose reasonable time periods for notification, minimize interference with law enforcement investigations, and potentially prioritize notification about large, damaging incidents over less significant incidents.

The President next discussed the second piece of legislation he would propose, the Consumer Privacy Bill of Rights. This initiative is not new: Electronic Privacy Bill of Rights Acts were introduced in 1998 and 1999, and in 2011, Senators John Kerry, John McCain, and Amy Klobuchar introduced S.799 – Commercial Privacy Bill of Rights Act of 2011. The Administration’s Privacy Blueprint of February 23, 2012 set forth the Consumer Privacy Bill of Rights and, along with the Big Data Report, directed the Department of Commerce’s National Telecommunications and Information Administration (NTIA) to seek comments from stakeholders in order to develop legally enforceable codes of conduct that would apply the Consumer Privacy Bill of Rights to specific business contexts.

The Big Data Report of May 1, 2014 recommended that the Department of Commerce seek stakeholder and public comment on big data developments and their impact on the draft Consumer Privacy Bill of Rights, and that it consider legislative text for the President to submit to Congress. On May 21, 2014, Senator Robert Menendez introduced S.2378 – Commercial Privacy Bill of Rights Act of 2014. The Consumer Privacy Bill of Rights sets forth seven basic principles:

1) Individual control – Consumers have the right to exercise control over what information data companies collect about them and how it is used.

2) Transparency – Consumers have the right to easily understandable and accessible privacy and security practices.

3) Respect for context – Consumers expect that data companies will collect, use, and disclose the information they provide in ways consistent with the context in which it was provided.

4) Security – Consumers have the right to secure and responsible handling of personal data.

5) Access and accuracy – Consumers have the right to access and correct their personal data in usable formats in a manner that is appropriate to the data’s sensitivity and the risk of adverse consequences if the data is not accurate.

6) Focused Collection – Consumers have the right to reasonable limits on the personal data that companies collect and retain.

7) Accountability – Consumers have the right to have their data handled by companies that have appropriate measures in place to assure they comply with the Consumer Privacy Bill of Rights.

The President next discussed the third piece of legislation he would propose, the Student Digital Privacy Act. The President noted how new educational technologies, including tailored websites, apps, tablets, digital tutors, and textbooks, transform how children learn and help parents and teachers track students’ progress. With these technologies, however, companies can mine student data for non-educational, commercial purposes such as targeted marketing. The Student Digital Privacy Act adopts the Big Data Report’s policy recommendation of ensuring that students’ data, collected in an educational context, is used only for educational purposes and that students are protected against having their data shared or used inappropriately. The President noted that the Student Digital Privacy Act would not “reinvent the wheel” but would mirror state legislation at the federal level, specifically the California law taking effect next year that bars education technology companies from selling student data or using that data to target students with ads. The current federal law that protects students’ privacy is the Family Educational Rights and Privacy Act of 1974, which bars schools from releasing official educational records but does not protect against companies’ data mining that reveals students’ habits and profiles for targeted advertising.

The President also highlighted a current self-regulatory effort, the Student Privacy Pledge, signed by 75 education technology companies committing voluntarily not to sell student information or use education technologies to send students targeted ads. It has been debated whether self-regulation will work and whether the proposed Act would go far enough. The President remarked that parents want to make sure their children are being smart and safe online, and that it is their responsibility as parents to do so, but that structure is needed so parents can ensure information is not being gathered about students without the parents or the kids knowing about it. This hinted at a notification requirement and an opt-out for student data mining, which is missing from the state legislation but is a requirement of the Children’s Online Privacy Protection Act of 1998. Specifically, COPPA requires companies and commercial website operators that direct online services to children under 13, collect personal information from children under 13, or know they are collecting personal information from children under 13 to provide parents with notice about the site’s information-collection practices, obtain verifiable consent from parents before collecting personal information, give parents a choice as to whether the personal information will be disclosed to third parties, and give parents access to, and the opportunity to delete, their children’s personal information, among other things.

President Obama noted that his speech marked the first time in 80 years—since FDR—that a President has come to the FTC. His speech at the FTC on Monday was the first stop of a three-part tour leading up to his State of the Union address. Next, the President planned to speak at the Department of Homeland Security on how the government can collaborate with the private sector to ward off cybersecurity attacks. His final speech will take place in Iowa, where he will discuss how to bring faster, cheaper broadband access to more Americans.


Leave a comment

FTC Chairwoman Edith Ramirez Comments on Data Security for the Internet of Things

Happy New Year! For many, the holidays included exciting new gadgets. Whether it’s a new fitness tracker, a smart thermostat, or a smart glucose meter, these new connected devices have arrived, and new products are on the horizon. These products, termed the “Internet of Things” by privacy professionals, are broadly defined as products that can connect to a network.

On January 6, 2015, FTC Chairwoman Edith Ramirez delivered the opening remarks at the International Consumer Electronics Show, during which she spoke on security issues surrounding the Internet of Things (“IoT”). Chairwoman Ramirez discussed what she viewed as three key risks to consumer privacy, along with suggested industry solutions to mitigate those risks.

IoT Risks to Consumer Privacy
The first privacy and security risk Chairwoman Ramirez identified was that connected devices engage in “ubiquitous data collection.” Because these devices can potentially collect personal information, including our habits, location, and physical condition, the resulting data can be used to build rich profiles of consumer preferences and behavior.

The second risk Chairwoman Ramirez identified was the possible unexpected use of consumer data acquired through connected devices. As an example, she asked whether data from a smart TV’s tracking of consumer television habits could be combined with other data to enable businesses to engage in targeted advertising or even exacerbate socio-economic disparities.

The third risk she identified was that connected devices can be hijacked, leading to misuse of personal information.

Suggested Industry Solutions
To combat the risks identified above, Chairwoman Ramirez suggested three solutions for the IoT industry. First, IoT companies should engage in “Security by Design,” meaning that IoT products should be built from the outset with security as a priority and that IoT companies should implement technical and administrative measures to ensure reasonable security. Chairwoman Ramirez identified five aspects of Security by Design:

  • conduct a privacy or security risk assessment as part of the design process;
  • test security measures before products launch;
  • use smart defaults—such as requiring consumers to change default passwords in the set-up process;
  • consider encryption, particularly for the storage and transmission of sensitive information, such as health data; and
  • monitor products throughout their life cycle and, to the extent possible, patch known vulnerabilities.
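For readers on the technical side, the sketch below is one rough illustration of what two of these points—smart defaults and encryption of sensitive data—might look like in a device’s setup code. It is illustrative only: the function names, parameters, and password policy are invented for this post and are not drawn from the Chairwoman’s remarks or any FTC guidance, and it relies on the third-party “cryptography” package.

```python
# Illustrative sketch only (not FTC guidance): a hypothetical IoT device setup
# flow that refuses to run with the factory-default password and encrypts
# sensitive readings before they are stored or transmitted.
from cryptography.fernet import Fernet

FACTORY_DEFAULT_PASSWORD = "admin"  # shipped default; must not survive setup


def complete_setup(new_password: str) -> Fernet:
    """Reject the factory default, then provision a per-device encryption key."""
    if new_password == FACTORY_DEFAULT_PASSWORD or len(new_password) < 8:
        raise ValueError("Setup requires a new password of at least 8 characters.")
    # Smart default: the device only becomes usable once the credential changes.
    device_key = Fernet.generate_key()
    return Fernet(device_key)


def store_reading(cipher: Fernet, reading: str) -> bytes:
    """Encrypt a sensitive reading (e.g., a glucose value) before persisting it."""
    return cipher.encrypt(reading.encode("utf-8"))


# Example: a hypothetical health device refuses the default password,
# then encrypts a reading at rest.
cipher = complete_setup("correct-horse-battery")
encrypted = store_reading(cipher, "glucose=102 mg/dL")
```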

Second, Chairwoman Ramirez suggested that companies that collect personal information should engage in data minimization, viz. that they should collect only the data needed for a specific purpose and then safely destroy that data afterwards. Chairwoman Ramirez also urged companies to de-identify consumer data where possible.
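As a very rough illustration of how data minimization and de-identification might be applied in practice, consider the sketch below. It is a hypothetical example, not anything proposed by the FTC: the field names, retention period, and purpose are invented, and salted hashing of an identifier is only a weak form of pseudonymization rather than full de-identification.

```python
# Illustrative sketch only: one way a company might minimize, pseudonymize,
# and age out a raw event from a hypothetical fitness device.
import hashlib
from datetime import datetime, timedelta

# The stated purpose here is computing activity totals, so only these fields are kept.
FIELDS_NEEDED_FOR_PURPOSE = {"step_count", "timestamp"}
RETENTION = timedelta(days=90)  # hypothetical retention period for that purpose


def minimize(event: dict) -> dict:
    """Keep only the fields needed for the stated purpose."""
    return {k: v for k, v in event.items() if k in FIELDS_NEEDED_FOR_PURPOSE}


def de_identify(event: dict, user_id: str, salt: bytes) -> dict:
    """Replace the direct identifier with a salted, one-way pseudonym.

    Note: this is pseudonymization, not true de-identification, which is harder.
    """
    pseudonym = hashlib.sha256(salt + user_id.encode()).hexdigest()[:16]
    return {"user": pseudonym, **event}


def expired(stored_at: datetime, now: datetime) -> bool:
    """Flag records whose retention period has lapsed so they can be destroyed."""
    return now - stored_at > RETENTION


raw = {"step_count": 8432, "timestamp": "2015-01-06T09:00:00Z",
       "gps_trace": [], "heart_rate": 61}
record = de_identify(minimize(raw), user_id="user-123", salt=b"per-dataset-salt")
```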

Finally, Chairwoman Ramirez suggested that IoT companies provide notice and choice to consumers for unexpected collection or uses of their data. As an example, Chairwoman Ramirez stated that if IoT companies are sharing data from a smart thermostat or fitness band with data brokers or marketing firms, those companies should provide consumers with a “simple notice of the proposed uses of their data and a way to consent.”

Although not official FTC statements, these remarks by Chairwoman Ramirez provide valuable insight into how the Federal Trade Commission may regulate connected devices in the future. Companies in the IoT space should monitor further developments closely and review their data collection, security, and sharing practices accordingly.


2 Comments

Before Liftoff, Drones Must Maneuver Through Privacy Laws

Unmanned aerial vehicles, better known as drones, are expected to revolutionize the way companies deliver packages to their customers. Some also imagine these small aircraft delivering pizzas to a customer’s home or nachos to a fan at a ballgame. Researchers are even investigating the possibility of using drones to assist farmers with monitoring their crops. Before drone technology takes flight, however, it will have to maneuver through privacy laws.

The Federal Aviation Administration (FAA) is the agency charged with developing rules, including privacy rules, for private individuals and companies operating drones in national airspace. While the precise breadth of the FAA rules is not entirely clear, a framework is beginning to develop. When the FAA recently announced test sites for drones, it also noted that test site operators must: (1) comply with existing federal and state privacy laws, (2) have publicly available privacy policies and a written plan for data use and retention, and (3) conduct a review of privacy practices that allows for public comment. When soliciting public comment on these test-site privacy rules, the FAA received a wide spectrum of feedback, ranging from suggestions that the agency must articulate precise elements of what constitutes a privacy violation to assertions that the agency was not equipped (and therefore should not attempt) to regulate privacy at all. It appears that the FAA settled on a middle ground of requiring drones to comply with existing privacy law, which is largely a matter of state regulation.

Accordingly, state privacy laws are likely to be the critical privacy hurdle to commercial drone use. It appears that only four states have thus far expressly addressed the use of private drones (as distinguished from drones used by public agencies, such as law enforcement). Idaho and Texas generally prohibit civilians from using a drone to take photographs of private property; they also restrict drone photography of any individual, even one in public view. Oregon prohibits drones from flying less than 400 feet above the property of a person who has requested that they not do so. The fourth state, Illinois, restricts the use of drones that interfere with hunting and fishing activities.

As for the other states, they may simply be getting up to speed on the technology. On the other hand, many of these states have considered or enacted laws restricting the use of drones by the police. Because those laws are silent on the use of private drones, one could argue that these states intentionally chose not to regulate private drones (and that, accordingly, existing laws regarding the use of aircraft or other public cameras govern the use of private drones).

Even where a state has passed a drone-related privacy law, it may very well be challenged on constitutional or other grounds. For instance – to the extent they prohibit photography of public areas or of objects and people in plain view – the Idaho and Texas laws may raise First Amendment questions. As described in Hurley v. Irish-American, a photographer generally receives First Amendment protection when taking public photos if he or she “possessed a message to be communicated” and “an audience to receive that message, regardless of the medium in which the message is to be expressed.” Under this test, in Porat v. Lincoln Towers Community Association, a photo hobbyist taking pictures for aesthetic and recreational purposes was denied First Amendment protection. In contrast, in Pomykacz v. Borough of West Wildwood, a “citizen activist” – whose pictures were taken out of concern about an affair between a town’s mayor and a police officer – was found to have First Amendment protection. To be sure, however, the Supreme Court has acknowledged that “even in a public forum the government may impose reasonable restrictions on the time, place, or manner of protected speech, provided the restrictions are justified without reference to the content of the regulated speech, that they are narrowly tailored to serve a significant governmental interest, and that they leave open ample alternative channels for communication of the information.” For example, under this premise, some courts have upheld restrictions on public access to crime and accident scenes. All told, we may see drone users assert First Amendment protection for photographs taken of public areas.

Another future legal challenge may involve the question of who owns the airspace above private property. In United States v. Causby, the Supreme Court appeared to reject the idea of private ownership of airspace. More specifically, it held that government aircraft flying over private land do not amount to a government “taking,” or seizure of private property, unless the flights are so low and frequent that they constitute an immediate interference with the enjoyment of the land. In other words, under Causby, the landowner owns the airspace necessary to use and enjoy the land, but the Court declined to draw a specific line. At the moment, it is unclear whether Oregon’s law – restricting drones within 400 feet of a home – is consistent with this principle.

Lastly, we may see a legal challenge asserting that certain state privacy laws (such as the Idaho or Texas laws, or others that disallow drone use altogether) are preempted, or trumped, by federal law. Congress’s intent to impliedly preempt state law may be inferred (1) from a pervasive scheme of federal regulation that leaves no room for the states to supplement, or (2) where Congress’s actions touch a field in which the federal interest is so dominant that the federal system will be assumed to preclude enforcement of state laws on that subject. Applied here, one could argue that Congress has entrusted the FAA with sole authority for creating a scheme to regulate the narrow field of national airspace, and drones in particular. Additionally, the argument goes, the federal government has a dominant interest in regulating national airspace, as demonstrated by the creation of the FAA and numerous other aircraft regulations. Under this line of reasoning, state privacy laws may be better focused on regulating the data gathered by a drone rather than the space where the drone may fly or the actions it may take while in that space (e.g., taking pictures).

All told, before official drone liftoff, companies employing drones will have to wait for final FAA rules on privacy. Whether those final rules will track the test site rules discussed above is not certain; they will likely depend on the public comments received by the drone test sites. Assuming the final rules track the test site rules, companies using commercial drones should focus on compliance with the various state privacy laws. But, as noted above, we may see constitutional challenges to these laws along the way. Stay tuned.


Leave a comment

Direct Marketing Association Launches “Data Protection Alliance”

On October 29, 2013, the Direct Marketing Association (“DMA”) announced the launch of a new initiative, the Data Protection Alliance, which it describes as “a legislative coalition that will focus specifically on ensuring that effective regulation and legislation protects the value of the Data-Driven Marketing Economy far into the future.” In its announcement release, the DMA reports the results of a study it commissioned on the economic impact of what it calls “the responsible use of consumer data” on “data-driven innovation.” According to the DMA, the study indicated that regulation which “impeded responsible exchange of data across the Data-Driven Marketing Economy” would cause substantial damage to U.S. economic growth and employment. Instead of such regulation, the DMA asks Congress to focus on its “Five Fundamentals for the Future”:

  1. Pass a national data security and breach notification law;

  2. Preempt state laws that endanger the value of data;

  3. Prohibit privacy class action suits and fund Federal Trade Commission enforcement;

  4. Reform the Electronic Communications Privacy Act (ECPA); and

  5. Preserve robust self-regulation for the Data-Driven Marketing Economy.

The DMA is explicitly concerned with its members’ interests, as any trade group would be, and this report and the new Data Protection Alliance are far from the only voices in the debate over whether legislation and regulation should alter the current balance between individual control and commercial use of personal information. Given the size and influence of the DMA and its members, though, this announcement provides useful information on the framing of the ongoing debate in the United States and elsewhere over privacy regulation.


1 Comment

Privacy and Data Protection Impacting on International Trade Talks

The European Union and the United States are currently negotiating a broad compact on trade called the Transatlantic Trade and Investment Partnership (“TTIP”). While the negotiations themselves are non-public, among the issues that are reported to be potential obstacles to agreement are privacy and data protection. Not only does the European Union mandate a much stronger set of data protection and privacy laws for its member states than exist in the United States, but recent revelations of U.S. surveillance practices (including of European leaders) have highlighted the legal and cultural divide.

In an October 29, 2013 speech in Washington, D.C., Viviane Reding, Vice-President of the European Commission and EU Justice Commissioner, emphasized that Europe would not put its more stringent privacy rules at risk of weakening as part of the TTIP negotiations. She said in part,

Friends and partners do not spy on each other. Friends and partners talk and negotiate. For ambitious and complex negotiations to succeed there needs to be trust among the negotiating partners. That is why I am here in Washington: to help rebuild trust.

You are aware of the deep concerns that recent developments concerning intelligence issues have raised among European citizens. They have unfortunately shaken and damaged our relationship.

The close relationship between Europe and the USA is of utmost value. And like any partnership, it must be based on respect and trust. Spying certainly does not lead to trust. That is why it is urgent and essential that our partners take clear action to rebuild trust….

The relations between Europe and the US run very deep, both economically and politically. Our partnership has not fallen from the sky. It is the most successful commercial partnership the world has ever seen. The energy it injects into our economies is measured in millions, billions and trillions – of jobs, trade and investment flows. The Transatlantic Trade and Investment Partnership could improve the figures and take them to new highs.

But getting there will not be easy. There are challenges to get it done and there are issues that will easily derail it. One such issue is data and the protection of personal data.

This is an important issue in Europe because data protection is a fundamental right. The reason for this is rooted in our historical experience with dictatorships from the right and from the left of the political spectrum. They have led to a common understanding in Europe that privacy is an integral part of human dignity and personal freedom. Control of every movement, every word or every e-mail made for private purposes is not compatible with Europe’s fundamental values or our common understanding of a free society.

This is why I warn against bringing data protection to the trade talks. Data protection is not red tape or a tariff. It is a fundamental right and as such it is not negotiable….

Beyond the TTIP talks, the divergence between European and U.S. privacy practices is putting new pressure on an existing legal framework, the Safe Harbor adopted after the enactment of the EU Data Protection Directive. A number of EU committees and political groups are either criticizing or recommending revocation of the Safe Harbor, a development that could significantly change the risk management calculus for the numerous companies that move personal information between the United States and Europe.


Leave a comment

Recent FTC Actions and Statements Show Continuing Focus on Privacy

The Federal Trade Commission has long taken a lead role in issues of privacy and data protection, pursuant to its general consumer protection authority under Section 5 of the FTC Act (15 U.S.C. § 45) as well as specific legislation such as the Children’s Online Privacy Protection Act of 1998 (“COPPA”) (which itself arose out of FTC reports). The FTC continues to bring legal actions against companies it believes have improperly collected, used, or shared consumer personal information, including the recent settlement of a complaint filed against Aaron’s, Inc., a national rent-to-own retail chain based in Atlanta, GA. In its October 22, 2013 press release announcing the settlement, the FTC described Aaron’s alleged violations of Section 5:

Aaron’s, Inc., a national, Atlanta-based rent-to-own retailer, has agreed to settle FTC charges that it knowingly played a direct and vital role in its franchisees’ installation and use of software on rental computers that secretly monitored consumers including by taking webcam pictures of them in their homes.

According to the FTC’s complaint, Aaron’s franchisees used the software, which surreptitiously tracked consumers’ locations, captured images through the computers’ webcams – including those of adults engaged in intimate activities – and activated keyloggers that captured users’ login credentials for email accounts and financial and social media sites….

The complaint alleges that Aaron’s knew about the privacy-invasive features of the software, but nonetheless allowed its franchisees to access and use the software, known as PC Rental Agent. In addition, Aaron’s stored data collected by the software for its franchisees and also transmitted messages from the software to its franchisees. In addition, Aaron’s provided franchisees with instructions on how to install and use the software.

The software was the subject of related FTC actions earlier this year against the software manufacturer and several rent-to-own stores, including Aaron’s franchisees, that used it. It included a feature called Detective Mode, which, in addition to monitoring keystrokes, capturing screenshots, and activating the computer’s webcam, also presented deceptive “software registration” screens designed to get computer users to provide personal information.

The FTC’s Consent Order Agreement with Aaron’s prohibits the company from using keystroke- or screenshot-monitoring software or activating a consumer’s microphone or webcam, and requires it to obtain express consent before installing location-tracking technology and to provide notice when that technology is activated. Aaron’s may not use any data it received through improper activities in collections actions, must destroy illegally obtained information, and must encrypt any transmitted location or tracking data it properly collects.

The FTC is also continuing its efforts to educate and promote best practices about privacy for both consumers and businesses. On October 28, 2013, FTC Commissioner Julie Brill published an opinion piece in Advertising Age magazine entitled Data Industry Must Step Up to Protect Consumer Privacy. In the piece, Commissioner Brill criticizes data collection and marketing firms for failing to uphold basic privacy principles, and calls on them to join an initiative called “Reclaim Your Name” which Commissioner Brill announced earlier this year.

Brill writes in AdAge:

The concept is simple. Through creation of consumer-friendly online services, Reclaim Your Name would empower the consumer to find out how brokers are collecting and using data; give her access to information that data brokers have amassed about her; allow her to opt-out if a data broker is selling her information for marketing purposes; and provide her the opportunity to correct errors in information used for substantive decisions.

Improving the handling of sensitive data is another part of Reclaim Your Name. Data brokers that participate in Reclaim Your Name would agree to tailor their data handling and notice and choice tools to the sensitivity of the information at issue. As the data they handle or create becomes more sensitive — relating to health conditions, sexual orientation and financial condition, for example — the data brokers would provide greater transparency and more robust notice and choice to consumers.

For more information on the FTC’s privacy guidance and enforcement, see the privacy and security section of the FTC Web site.