The Secure Times

An online forum of the ABA Section of Antitrust Law's Privacy and Information Security Committee


Leave a comment

Yesterday at the FTC, President Obama Announced Plans for New Data Privacy and Security Laws: a Comprehensive Data Breach Notification Law, a Consumer Privacy Bill of Rights, and a Student Digital Privacy Act

Yesterday afternoon, President Barack Obama gave a quip-filled speech at the Federal Trade Commission where he praised the FTC's efforts in protecting American consumers over the past 100 years and unveiled his plans to propose legislation to protect American consumers from identity theft and to protect schoolchildren's personal information from being used by marketers.  These plans build upon past legislative efforts and the Administration's focus on cybersecurity, Big Data, and consumer protection.  Specifically, on February 23, 2012, the White House released "Consumer Data Privacy in a Networked World: A Framework for Protecting Privacy and Promoting Innovation in the Global Digital Economy" (the "Privacy Blueprint"), and in January 2014, President Obama asked his Counselor, John Podesta, to lead a working group to examine Big Data's impact on government, citizens, businesses, and consumers.  The working group produced Big Data: Seizing Opportunities, Preserving Values on May 1, 2014.

In his speech, the President highlighted the need for increased privacy and security protections as more people go online to conduct their personal business—shop, manage bank accounts, pay bills, handle medical records, manage their "smart" homes, etc.—stating that "we shouldn't have to forfeit our basic privacy when we go online to do our business."  The President referenced his "Buy Secure" initiative, which would combat credit card fraud through a "chip-and-PIN" system for credit cards and credit-card readers issued by the United States government.  In that system, a microchip embedded in the credit card would replace the magnetic strip, since microchips are harder for thieves to clone.  The consumer would also need to enter a PIN into the credit card reader, just as with an ATM or debit card.  The President praised those credit card issuers, banks, and lenders that allow consumers to view their credit scores for free.  He also lauded the FTC's efforts to help identity theft victims by working with credit bureaus and by providing guidance to consumers on its website, identitytheft.gov.

The first piece of legislation the President discussed briefly was a comprehensive breach notification law that would require companies to notify consumers of a breach within 30 days and that would allow identity thieves to be prosecuted even when the criminal activity occurs overseas.  Currently, there is no federal breach notification law, but many states have laws requiring companies to notify affected consumers and/or regulators, depending on the type of information compromised and the jurisdiction in which the organization operates.  The state laws also require that breach notification letters to consumers include certain information, such as the risks posed to the individual as a result of the breach along with steps to mitigate the harm.  This "patchwork of laws," President Obama noted, is confusing to consumers and costly for companies to comply with.  The plan to introduce a comprehensive breach notification law adopts the Big Data Report's policy recommendation that Congress pass legislation providing a single national data breach standard along the lines of the Administration's May 2011 cybersecurity legislative proposal.  Such legislation should impose reasonable time periods for notification, minimize interference with law enforcement investigations, and potentially prioritize notification about large, damaging incidents over less significant incidents.

The President next discussed the second piece of legislation he would propose, the Consumer Privacy Bill of Rights.  This initiative is not new.  Electronic privacy bills of rights were introduced in 1998 and 1999, and in 2011, Senators John Kerry, John McCain, and Amy Klobuchar introduced S. 799, the Commercial Privacy Bill of Rights Act of 2011.  The Administration's Privacy Blueprint of February 23, 2012 set forth the Consumer Privacy Bill of Rights and, along with the Big Data Report, directed the Department of Commerce's National Telecommunications and Information Administration (NTIA) to seek comments from stakeholders in order to develop legally enforceable codes of conduct that would apply the Consumer Privacy Bill of Rights to specific business contexts.

The Big Data Report of May 1, 2014 recommended that the Department of Commerce seek stakeholder and public comment on big data developments and how they impact the draft Consumer Privacy Bill of Rights, and then consider legislative text for the President to submit to Congress.  On May 21, 2014, Senator Robert Menendez introduced S. 2378, the Commercial Privacy Bill of Rights Act of 2014.  The Consumer Privacy Bill of Rights sets forth seven basic principles:

1) Individual control – Consumers have the right to exercise control over what information data companies collect about them and how it is used.

2) Transparency – Consumers have the right to easily understandable and accessible privacy and security practices.

3) Respect for context – Consumers expect that data companies will collect, use, and disclose the information they provide in ways consistent with the context in which it was provided.

4) Security – Consumers have the right to secure and responsible handling of personal data.

5) Access and accuracy – Consumers have the right to access and correct their personal data in usable formats in a manner that is appropriate to the data’s sensitivity and the risk of adverse consequences if the data is not accurate.

6) Focused Collection – Consumers have the right to reasonable limits on the personal data that companies collect and retain.

7) Accountability – Consumers have the right to have their personal data handled by companies with appropriate measures in place to assure that they comply with the Consumer Privacy Bill of Rights.

The President next discussed the third piece of legislation he would propose, the Student Digital Privacy Act.  The President noted how new educational technologies, including tailored websites, apps, tablets, digital tutors, and textbooks, transform how children learn and help parents and teachers track students' progress.  With these technologies, however, companies can mine student data for non-educational, commercial purposes such as targeted marketing.  The Student Digital Privacy Act adopts the Big Data Report's policy recommendation of ensuring that students' data, collected and gathered in an educational context, are used only for educational purposes and that students are protected against having their data shared or used inappropriately.  The President noted that the Student Digital Privacy Act would not "reinvent the wheel" but would mirror, at the federal level, state legislation, specifically the California law taking effect next year that bars education technology companies from selling student data or using that data to target students with ads.  The current federal law that protects students' privacy is the Family Educational Rights and Privacy Act of 1974, which protects official educational records from being released by schools but does not protect against companies' data mining that reveals students' habits and profiles for targeted advertising.  The President highlighted current self-regulation, the Student Privacy Pledge, signed by 75 education technology companies committing voluntarily not to sell student information or use education technologies to send students targeted ads.  It has been debated whether self-regulation will work and whether the proposed Act would go far enough.

The President remarked that parents want to make sure their children are being smart and safe online, and that while this is parents' responsibility, structure is needed so that parents can ensure information is not being gathered about students without the parents or the kids knowing about it.  This hinted at a notification requirement and opt-out for student data mining that is missing from the state legislation but is a requirement of the Children's Online Privacy Protection Act of 1998 (COPPA).  Specifically, COPPA requires companies and commercial website operators that direct online services to children under 13, that collect personal information from children under 13, or that know they are collecting personal information from children under 13 to provide parents with notice about the site's information-collection practices, obtain verifiable consent from parents before collecting personal information, give parents a choice as to whether the personal information will be disclosed to third parties, and give parents access to and the opportunity to delete the children's personal information, among other things.
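
For readers who build or evaluate products that fall under these rules, the following is a minimal, purely illustrative sketch of a COPPA-style consent gate. The names (User, collect_profile_data, notify_and_request_consent) are hypothetical, and the sketch is not a statement of what the statute or the FTC's Rule requires in any particular case.

    # Hypothetical sketch of a COPPA-style consent gate; names are illustrative.
    from dataclasses import dataclass

    COPPA_AGE_THRESHOLD = 13  # COPPA applies to children under 13

    @dataclass
    class User:
        user_id: str
        age: int
        parental_consent_verified: bool = False

    def notify_and_request_consent(user: User) -> None:
        """Placeholder: send the parent notice of the information-collection
        practices and start a verifiable parental consent process."""
        print(f"Parental notice sent for {user.user_id}; awaiting verifiable consent.")

    def collect_profile_data(user: User, data: dict) -> bool:
        """Collect personal information only when the consent requirement is met."""
        if user.age < COPPA_AGE_THRESHOLD and not user.parental_consent_verified:
            notify_and_request_consent(user)  # no collection until consent is verified
            return False
        print(f"Storing data for {user.user_id}: {data}")
        return True

    child = User(user_id="u1", age=11)
    collect_profile_data(child, {"favorite_subject": "math"})   # blocked; notice sent
    child.parental_consent_verified = True
    collect_profile_data(child, {"favorite_subject": "math"})   # now permitted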

President Obama noted that his speech marked the first time in 80 years—since FDR—that a President had come to the FTC.  His speech at the FTC on Monday was the first of a three-part tour leading up to his State of the Union address.  Next, the President planned to speak at the Department of Homeland Security about how the government can collaborate with the private sector to ward off cybersecurity attacks.  His final speech will take place in Iowa, where he will discuss how to bring faster, cheaper broadband access to more Americans.


Leave a comment

Canada’s Anti-Spam Law (CASL) in force July 1

Canada’s Anti-Spam Law (CASL) enters into force on Canada Day, July 1.  It was passed in 2010 as a “made-in-Canada” solution to “drive spammers out of Canada.”

Are you outside Canada?  It’s important to know that this law reaches beyond Canada’s borders.  CASL is already affecting businesses in the United States, Europe and elsewhere as they change their communications practices to send emails and other “commercial electronic messages” into Canada. 

As we have described in our presentation Comparing CASL to CAN-SPAM, the new law applies to messages that are accessed by a computer system in Canada.  That means that messages sent by a person, business or organization outside of Canada, to a person in Canada, are subject to the law.
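
As a rough illustration of the kind of send-time check non-Canadian senders have been adding, the sketch below assumes a hypothetical mailing list and reflects only the jurisdictional point above plus CASL's general consent requirement; real compliance involves much more (consent records, sender identification, unsubscribe mechanisms).

    # Hypothetical send-time gate for commercial electronic messages ("CEMs").
    from dataclasses import dataclass

    @dataclass
    class Recipient:
        email: str
        country: str        # where the message will be accessed, as best known
        has_consent: bool   # express or valid implied consent under CASL

    def can_send_cem(r: Recipient) -> bool:
        """CASL reaches messages accessed by a computer system in Canada, so
        Canadian recipients are in scope regardless of where the sender is."""
        if r.country == "CA" and not r.has_consent:
            return False
        return True

    recipients = [
        Recipient("a@example.com", "CA", has_consent=True),
        Recipient("b@example.com", "CA", has_consent=False),
        Recipient("c@example.com", "US", has_consent=False),
    ]
    print([r.email for r in recipients if can_send_cem(r)])  # b@example.com is suppressed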

CASL expressly provides for sharing information among the Government of Canada, the Canadian CASL enforcement agencies, and “the government of a foreign state” or international organization, for the purposes of administering CASL’s anti-spam (and other) provisions.  The MOU among the Canadian CASL enforcement agencies similarly references processes to share and disseminate information received from and provided to their foreign counterpart agencies. 

In a speech on June 26, the Chair of the Canadian Radio-television and Telecommunications Commission, Jean-Pierre Blais, emphasized the CRTC’s cooperation with its international counterparts to combat unlawful telemarketers, hackers and spammers that “often operate outside our borders.”  The Chairman specifically named “the Federal Trade Commission in the U.S., the Office of Communications (Ofcom) in the U.K., the Authority for Consumers and Markets in the Netherlands, the Australian Communications and Media Authority and others”, and noted that the CRTC has led or participated in many international networks on unlawful telecommunications.

Companies should also take note that a violation of CASL might also result in the CRTC exercising its so-called “name and shame” power, by posting the name of the offender and the violation on its online compliance and enforcement list.  The CRTC has for years published notices of violation with respect to its “Do Not Call List”, and is expected to take a similar approach for CASL notices of violation as well. 

The CRTC recently published a Compliance and Enforcement Bulletin on its Unsolicited Telecommunications Rules and on CASL, available here.  The CRTC recommends implementing a corporate compliance program as part of a due diligence defence: 

Commission staff may take into consideration the existence and implementation of an effective corporate compliance program if the business presents the program as part of a due diligence defence in response to an alleged violation of the Rules or CASL. Although the pre-existence of a corporate compliance program may not be sufficient as a complete defence to allegations of violations under the Rules or CASL, a credible and effective documented program may enable a business to demonstrate that it took reasonable steps to avoid contravening the law.


Leave a comment

FTC Examines Internet of Things, Privacy and Security, in Recent Workshop

On November 19, 2013, the Federal Trade Commission held a day-long workshop, “Internet of Things: Privacy and Security in a Connected World,” on the privacy and security implications of devices such as cars, home appliances, fitness equipment, and other machines that are able to gather data and connect to the internet. For consumers, these devices can help track health, remotely monitor aging family members, reduce utility bills, and even send alerts to buy more milk.

Ubiquitous Internet of Things

Technological advances and new business models centered around the internet of things have taken off.  It has been reported that crowd-sourcing start-up Quirky has teamed up with GE to develop connected-home products. Another start-up company is developing tracking technology through GPS-embedded tags. On November 20, 2013, Qualcomm announced that it has developed a line of chips for the internet of things space. It has been argued that companies should adjust their business models and use the internet of things to connect to customers. These developments present the FTC with the challenge of keeping up with technology to protect consumers and the competitive landscape.

In her remarks, Chairwoman Edith Ramirez emphasized how ubiquitous smart devices have become. Five years ago, she remarked, there were already more “things” than people connected to the Internet; by 2015, there will be an estimated twenty-five billion things connected to the Internet, and by 2020, an estimated fifty billion. Commissioner Maureen Ohlhausen, in her remarks later in the workshop, stated that the FTC will conduct policy research to understand the effects that technological advances and innovative business models concerning the internet of things have on consumers and the marketplace.

Privacy and Security Challenges

Chairwoman Ramirez noted privacy and security challenges presented by the internet of things. Privacy risks are present since devices connected to the internet can collect, compile, and transmit information about consumers in ways that may not have been expected. When aggregated, the data pieces collected by devices present “a deeply personal and startlingly complete picture of each of us.” Security risks are present since “any device connected to the Internet is potentially vulnerable to hijack.” Indeed, these risks have been reported and present real concerns.

Chairwoman Ramirez noted that the FTC will be vigilant in bringing enforcement actions against companies that fail to properly safeguard consumers from security breaches. She noted as an example the FTC’s first enforcement foray into the internet of things, against TRENDnet for failing to properly design and test the software for its internet-connected security cameras, leaving consumers vulnerable to a hacker who accessed the live feeds from 700 cameras and made them available on the Internet. When it encounters consumer harm, Commissioner Ohlhausen stated, the FTC will use its traditional enforcement tools to challenge any potential threats that arise, much as it has done in the data security, mobile, and big data spaces.

Chairwoman Ramirez said that companies that take part in the internet of things ecosystem are “stewards of the consumer data” and that “with big data comes big responsibility.” The FTC has published a number of best practices that Chairwoman Ramirez identified as useful for companies in the internet of things space: (1) privacy by design—privacy protections built in from the outset, (2) simplified consumer choice—allowing consumers to control their data, and (3) transparency—disclosure of what information the devices collect and how it is being used.

FTC Report Forthcoming

The FTC will produce a report on what it has learned from the November 19 workshop and provide further recommendations about best practices. The report can educate consumers and businesses on how to maximize consumer benefits and avoid or minimize any identified risks. Commissioner Ohlhausen stressed that the FTC should identify whether existing laws and regulatory structures, including self-regulation, are sufficient to address potential harms.

Vint Cerf of Google, who gave the keynote presentation, advised that rather than relying on regulations to protect privacy, social conventions should be developed. He stated that “while regulation might be helpful, an awful lot of the problems that we experience with privacy is a result of our own behavior.”

The same day as the workshop, the Future of Privacy Forum released a white paper arguing for an updated privacy paradigm for the internet of things that focuses not on how information is collected and communicated but on how organizations use personally identifiable information.

The FTC will continue to accept comments until January 10, 2014.


1 Comment

FTC v. Wyndham Update

Edit (Feb. 5, 2014): For a more recent update on this case, please see this post.

On November 1, Maureen Ohlhausen, a Commissioner at the Federal Trade Commission (FTC), held an “ask me (almost) anything” (AMAA) session on Reddit. There were no real surprises in the questions Commissioner Ohlhausen answered, and the AMAA format is not well-suited to lengthy responses. One interesting topic that did arise, however, was the FTC’s complaint against Wyndham Worldwide Corporation, and Wyndham’s subsequent filing of a motion to dismiss the FTC action against them. Commissioner Ohlhausen declined to discuss the ongoing litigation, but asserted generally that the FTC has the authority to bring such actions under Section 5 of the FTC Act, 15 U.S.C. § 45. While there were no unexpected revelations in the Commissioner’s response, I thought it presented an excellent opportunity to bring everyone up to speed on the Wyndham litigation.

On June 26, 2012, the Federal Trade Commission (FTC) filed a complaint in Arizona Federal District Court against Wyndham Worldwide Corporation, alleging that Wyndham “fail[ed] to maintain reasonable security” on their computer networks, which led to a data breach resulting in the theft of payment card data for hundreds of thousands of Wyndham customers, and more than $10.6 million in fraudulent charges on customers’ accounts.  Specifically, the complaint alleged that Wyndham engaged in deceptive business practices in violation of Section 5 of the FTC Act by misrepresenting the security measures it undertook to protect customers’ personal information. The complaint also alleged that Wyndham’s failure to provide reasonable data security is an unfair trade practice, also in violation of Section 5.

On August 27, 2012, Wyndham responded by filing a motion to dismiss the FTC’s complaint, asserting, inter alia, that the FTC lacked the statutory authority to “establish data-security standards for the private sector and enforce those standards in federal court,” thus challenging the FTC’s authority to bring the unfairness count under the FTC Act. In its October 1, 2012 response, the FTC asked the court to reject Wyndham’s arguments, stating that the FTC’s complaint alleged a number of specific security failures on the part of Wyndham, which resulted in two violations of the FTC Act. The case was transferred to the District of New Jersey on March 25, 2013, and Wyndham’s motions to dismiss were denied. On April 26, Wyndham once again filed motions to dismiss the FTC’s complaint, again asserting that the FTC lacked the legal authority to set data security standards for private businesses under Section 5 of the FTC Act.

At stake in this litigation is the FTC’s ability to bring enforcement claims against companies that suffer a data breach due to a lack of “reasonable security.” What is unique in this case is Wyndham’s decision to fight the FTC action in court rather than make efforts to settle, as other companies have done when faced with similar allegations by the FTC. For example, in 2006, the FTC hit ChoicePoint Inc. with a $10 million penalty over a data breach in which over 180,000 payment card numbers were stolen. The FTC has also gone after such high-profile companies as Twitter, HTC, and Google based on similar facts and law. Those actions resulted in out-of-court settlements.

If Wyndham’s pending motions to dismiss are denied, and the FTC ultimately prevails in this case, it is likely that the FTC will continue to bring these actions, and businesses will likely see an increased level of scrutiny applied to their network security. If, however, Wyndham succeeds and the FTC case against them is dismissed, public policy questions regarding data security will likely fall back to Congress to resolve.

Oral argument on the pending motions to dismiss is scheduled for November 7. No doubt many parties will be following these proceedings with great interest.


1 Comment

Amendments to CalOPPA Allow Minors to “Erase” Information from the Internet and Also Restrict Advertising Practices to Minors

On September 23, 2013, California Governor Jerry Brown signed SB 568 into law, adding new provisions to the California Online Privacy Protection Act (CalOPPA). Officially called “Privacy Rights for California Minors in the Digital World,” the bill has already garnered the nickname of the “Internet Eraser Law” because it affords California minors the ability to remove content or information they previously posted on a Web site. The bill also imposes restrictions on advertising to California minors.

California Minors’ Right to Remove Online Content

Effective January 1, 2015, the bill requires online operators to provide a means by which a California minor may remove online information that the minor posted. Online operators can elect to allow a minor to remove such information directly or can alternatively remove the information at the minor’s request. The bill further requires that online operators notify California minors of their right to remove previously posted information.

Online operators do not need to allow removal of information in certain circumstances, including where (1) the content or information was posted by a third party; (2) state or federal law requires the operator or third party to retain such content or information; or (3) the operator anonymizes the content or information. The bill further clarifies that online operators need only remove the information from public view; the bill does not require wholesale deletion of the information from the online operator’s servers.
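
To make the “public view” point concrete, here is a minimal, purely hypothetical sketch of one way an operator might implement removal without wholesale deletion; the names (Post, hide_post, public_feed) are illustrative and not drawn from any real service or from the statute itself.

    # Hypothetical "remove from public view" model: hidden posts disappear from
    # public endpoints, but the underlying records may be retained.
    from dataclasses import dataclass
    from datetime import datetime
    from typing import Optional

    @dataclass
    class Post:
        post_id: str
        author_id: str
        body: str
        hidden_at: Optional[datetime] = None  # set when the minor exercises the removal right

    def hide_post(post: Post, requester_id: str) -> bool:
        """Honor a removal request from the minor who authored the post."""
        if requester_id != post.author_id:
            return False  # the right covers content the minor posted, not third-party posts
        post.hidden_at = datetime.utcnow()
        return True

    def public_feed(posts: list[Post]) -> list[Post]:
        """Public view excludes hidden posts; the records themselves may be retained."""
        return [p for p in posts if p.hidden_at is None]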

New Restrictions on Advertising to California Minors

Also effective January 1, 2015, the bill places new restrictions on advertising to California minors. The bill prohibits online services directed to minors from advertising certain products, including alcohol, firearms, tobacco, and tanning services. It further prohibits online operators from allowing third parties (e.g. advertising networks or plug-ins) to advertise certain products to minors. And where an advertising service is notified that a particular site is directed to minors, the bill restricts the types of products that can be advertised by that advertising service to minors.

Implications

Given the sheer number of California minors, these amendments to CalOPPA will likely have vast implications for online service providers. First, the bill extends not just to Web sites, but also to mobile apps, which is consistent with a general trend of governmental scrutiny of mobile apps. Online service providers should expect regulation of mobile apps to increase, as both California and the Federal Trade Commission have issued publications indicating concerns over mobile app privacy. Second, the bill also reflects an increased focus on privacy of children and minors. Developers should consider these privacy issues when designing Web sites and mobile apps, and design such products with the flexibility needed to adapt to changing legislation. Thus, any business involved in the online space should carefully review these amendments and ensure compliance before the January 1, 2015 deadline.



1 Comment

Less Than Satisfied with Self-Regulation? FTC Chair Renews Push for Do Not Track

FTC Chair Edith Ramirez created some waves in her first speech to the advertising industry this week. Ramirez renewed the call for a universal Do Not Track mechanism—and impliedly ignored the progress of AdChoices, the Digital Advertising Alliance’s opt-out program.  The FTC’s critical stance, along with a renewed initiative in the Senate, signals that the government is unsatisfied with the industry’s progress toward enhanced consumer controls over privacy and may seek a public, rather than private, solution.

“Consumers await a functioning Do Not Track system, which is long overdue,” Ramirez said. “We advocated for a persistent Do Not Track mechanism that allows consumers to stop the collection of data across all sites, and not just for targeted ads.”

The comments, spoken before the American Advertising Federation at its annual advertising day on Capitol Hill, illustrated a rift between advertisers and regulators over the progress of self-regulatory programs and consumers’ perceptions of online behavioral advertising. Two years ago, the FTC called on advertisers to develop a program that would give consumers a choice to opt out of behaviorally targeted ads. Speaking to AdWeek, Stu Ingis, a partner at Venable who acts as the DAA’s attorney, said of Ramirez’s remarks: “We have solved it. The DAA’s program covers 100 percent of the advertising ecosystem. We made our agreements.”

The DAA also recently released the results of a poll it commissioned, in which nearly 70 percent of responding consumers said that they would like at least some ads tailored directly to their interests, and 75 percent said that they preferred an ad-supported internet model. (The poll comes with some caveats, described in an AdWeek piece today.)

However, in her speech Ramirez spoke of consumers’ “unease” with online tracking: “An online advertising system that breeds consumer discomfort is not a foundation for sustained growth. More likely, it is an invitation to Congress and other policymakers in the U.S. and abroad to intervene with legislation or regulation and for technical measures by browsers or others to limit tracking,” she said.

Ramirez also urged the advertising community to keep working within the multiparty process led by the W3C  (World Wide Web Consortium) to develop a browser-based Do Not Track program. However, there has been little concrete progress in the talks so far.
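
For context on what a browser-based mechanism involves: participating browsers send a “DNT: 1” request header, and a site that chooses to honor it would check that header before collecting tracking data. The sketch below, assuming Flask and hypothetical helper names (user_opted_out_of_tracking, the ad_id cookie), is only an illustration of that check, not a description of any standard the W3C process has finalized.

    # Minimal sketch of a server honoring the "DNT: 1" signal (assumes Flask).
    from flask import Flask, make_response, request

    app = Flask(__name__)

    def user_opted_out_of_tracking() -> bool:
        # Browsers participating in Do Not Track send the header "DNT: 1".
        return request.headers.get("DNT") == "1"

    @app.route("/")
    def index():
        resp = make_response("Hello, reader.")
        if user_opted_out_of_tracking():
            # Skip behavioral-advertising collection entirely, not just ad targeting,
            # which is the broader reading Chairwoman Ramirez urged.
            return resp
        # Otherwise the site might set a tracking cookie for interest-based ads.
        resp.set_cookie("ad_id", "example-identifier")
        return resp

    if __name__ == "__main__":
        app.run()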

The online advertising industry may be running out of time. Senator Jay Rockefeller (D-W.Va.), chair of the Senate Commerce Committee, announced that he would hold a hearing next week to discuss legislation that would mandate a Do Not Track standard.  The chairman, along with Sen. Richard Blumenthal (D-Conn.), introduced the Do Not Track Online Act in February.  The bill would direct the FTC to write regulations governing when internet firms must honor a consumer’s request that their information not be collected, and would deputize the FTC and state attorneys general to enforce the rules.

“Industry made a public commitment to honor Do-Not-Track requests from consumers but has not yet followed through,” Rockefeller said of the hearing. “I plan to use this hearing to find out what is holding up the development of voluntary Do-Not-Track standards that should have been adopted at the end of last year.”

If Congress and the FTC agree that the advertising industry hasn’t honored its commitments, the chances for self-regulation without a government mandate may dwindle further.

Sources:

AdWeek:  FTC Chair Stuns Advertisers

The Hill: Sen. Rockefeller to Push for Do Not Track at Hearing


Leave a comment

The FTC Publishes a Staff Report on Mobile Apps for Children and Privacy

The Federal Trade Commission (FTC) just released a Staff Report (the Report) titled “Mobile Apps for Kids: Current Privacy Disclosures Are Disappointing.”

Mobile applications (apps) are increasingly popular among children and teenagers, even the very young. Indeed, the Report found that 11% of the apps sold by Apple have toddlers as their intended audience (Report p. 6). Apps geared to children are often either free or inexpensive, which makes them easy to purchase, even on a pocket-money budget (Report pp. 7-8).

As such, according to the Report, these apps seem to be intended for children’s use, and some may even be “directed to children” within the meaning of the Children’s Online Privacy Protection Act (COPPA) and the FTC’s implementing Rule (the Rule). The Rule defines what constitutes a “[w]ebsite or online service directed to children” at 16 C.F.R. § 312.2. Under COPPA and the Rule, operators of online services directed to children under 13 years of age must provide notice and obtain parental consent before collecting children’s personal information. This includes apps. Yet the FTC staff was unable, in most instances, to find out whether an app collected any data or, if it did, the type of data collected, the purpose for collecting it, and who collected or obtained access to such data (Report p. 10).

“The mobile app market place is growing at a tremendous speed, and many consumer protections, including privacy and privacy disclosures, have not kept pace with this development.” (Report p. 3)

Downloading an app on a smart phone may have an impact on children’s privacy, as apps are able to gather personal information such as the user’s geolocation, phone number, or list of contacts, and to do so without her parents’ knowledge. Indeed, while app stores and operating systems provide rating systems and controls that allow parents to restrict access to mobile content and features, and even to limit data collection, they do not provide information about which data is collected and whether it is shared (Report p. 15).

The Report concludes by recommending that app stores, app developers, and third parties providing services within apps increase their efforts to provide parents with “clear, concise and timely information” about apps downloaded by children. Parents would then be able to know, before downloading an app, what data will be collected, how it will be used, and who will obtain access to this data (Report p. 17). This should be done by using “simple and short disclosures or icons that are easy to find and understand on the small screen of a mobile device.” (Report p. 3)

As a reminder, United States of America v. W3 Innovations, LLC, brought in August 2011, was the first FTC case involving mobile applications.