The Secure Times

An online forum of the ABA Section of Antitrust Law's Privacy and Information Security Committee



Amendments to CalOPPA Allow Minors to “Erase” Information from the Internet and Also Restrict Advertising Practices to Minors

On September 23, 2013, California Governor Jerry Brown signed SB568 into law, which adds new provisions to the California Online Privacy Protection Act. Officially called “Privacy Rights for California Minors in the Digital World,” the bill has already garnered the nickname of the “Internet Eraser Law,” because it affords California minors the ability to remove content or information previously posted on a Web site. The bill also imposes restrictions on advertising to California minors.

California Minors’ Right to Remove Online Content

Effective January 1, 2015, the bill requires online operators to provide a means by which California minors may remove online information posted by that minor. Online operators can elect to allow a minor to directly remove such information or can alternatively remove such information at a minor’s request. The bill further requires that online operators notify California minors of the right to remove previously-posted information.

Online operators do not need to allow removal of information in certain circumstances, including where (1) the content or information was posted by a third party; (2) state or federal law requires the operator or third party to retain such content or information; or (3) the operator anonymizes the content or information. The bill further clarifies that online operators need only remove the information from public view; the bill does not require wholesale deletion of the information from the online operator’s servers.
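This public-view distinction maps naturally onto a visibility flag in an operator’s data model rather than a hard delete. The sketch below is purely illustrative; the record shape and field names are assumptions, not anything the bill prescribes:

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import List, Optional

@dataclass
class MinorPost:
    """Illustrative record for content posted by a California minor."""
    post_id: int
    author_id: int
    body: str
    publicly_visible: bool = True
    removal_requested_at: Optional[datetime] = None

def remove_from_public_view(post: MinorPost) -> None:
    # SB 568 requires removal from public view, not wholesale
    # deletion, so the record itself may be retained on the server.
    post.publicly_visible = False
    post.removal_requested_at = datetime.now(timezone.utc)

def visible_posts(posts: List[MinorPost]) -> List[MinorPost]:
    # Only publicly visible posts are served to other users.
    return [p for p in posts if p.publicly_visible]
```

Under this design, a minor’s removal request flips the flag and the content disappears from public pages, while the underlying record survives for any retention the operator needs.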

New Restrictions on Advertising to California Minors

Also effective January 1, 2015, the bill places new restrictions on advertising to California minors. The bill prohibits online services directed to minors from advertising certain products, including alcohol, firearms, tobacco, and tanning services. It further prohibits online operators from allowing third parties (e.g., advertising networks or plug-ins) to advertise certain products to minors. And where an advertising service is notified that a particular site is directed to minors, the bill restricts the types of products that can be advertised by that advertising service to minors.

Implications

Given the sheer number of California minors, these amendments to CalOPPA will likely have vast implications for online service providers. First, the bill extends not just to Web sites, but also to mobile apps, which is consistent with a general trend of governmental scrutiny of mobile apps. Online service providers should expect regulation of mobile apps to increase, as both California and the Federal Trade Commission have issued publications indicating concerns over mobile app privacy. Second, the bill also reflects an increased focus on privacy of children and minors. Developers should consider these privacy issues when designing Web sites and mobile apps, and design such products with the flexibility needed to adapt to changing legislation. Thus, any business involved in the online space should carefully review these amendments and ensure compliance before the January 1, 2015 deadline.





New York Senator Asks FTC to Allow Consumers to Opt Out of Store Tracking Programs

Senator Charles Schumer (D-NY) held a press conference last Sunday in Manhattan and called on the Federal Trade Commission (FTC) to allow consumers to opt out of being tracked while visiting retail stores.

Senator Schumer suggested that the FTC should require retailers to inform consumers about their opt-out option by sending an electronic notice to their smartphones before starting to track them.

Senator Schumer also sent a letter to Edith Ramirez, the FTC chairwoman, asking the FTC to investigate this practice, which he called unfair and deceptive.

Indeed, The New York Times reported this month that some brick-and-mortar stores track shoppers during store visits. The article explained how Nordstrom had tested a new technology which allowed the retailer to use Wi-Fi signals to track customers’ shopping habits.

Nordstrom stopped the experiment following customers’ complaints, but the department store is not the only retailer interested in these new tracking technologies. American Apparel and Benetton are among retailers tracking their customers inside their stores.

CBS reported that Nordstrom used a company named Euclid for its tracking experiment. The Euclid web site explains how retailers may track consumers. Its system senses consumers’ smartphones when they come into a store and records the “ping” sent to the store’s Wi-Fi systems. The system scrambles the MAC address of each phone by using one-way hashing algorithms, and then data is processed, analyzed, and stored in the cloud, although it is unclear for how long. Euclid calls this data “anonymous foot-traffic” and states on its privacy page that “[n]o personally identifiable data is ever collected or used.”
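Euclid’s published description of its scrambling step, a one-way hash applied to the MAC address before storage, can be sketched as follows. The salt and choice of SHA-256 here are assumptions for illustration; Euclid has not published its exact algorithm:

```python
import hashlib

def scramble_mac(mac_address: str, salt: bytes = b"per-deployment-secret") -> str:
    """One-way hash of a Wi-Fi MAC address.

    Only the digest is stored, never the raw address, so the
    original identifier cannot be read back out of the stored
    value (though the small MAC address space makes brute-force
    reversal conceivable, which is the privacy advocates' point).
    """
    normalized = mac_address.lower().replace("-", ":").encode("ascii")
    return hashlib.sha256(salt + normalized).hexdigest()
```

Because the hash is deterministic, the same phone yields the same digest on every visit, which is what enables repeat-visit analytics while discarding the address itself. That also means the output is pseudonymous rather than truly anonymous, the concern raised in the next paragraph.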

But privacy advocates know that rendering data anonymous may not be a fool-proof way to safeguard the privacy of data subjects. It is therefore welcome that Euclid is one of the companies that will participate in a Future of Privacy Forum group to develop best practices for companies in the business of retail location analytics.

Jules Polonetsky, Director of the Future of Privacy Forum, is quoted saying that “[c]ompanies need to ensure they have data protection standards in place to de-identify data, to provide consumers with effective choices to not be tracked and to explain to consumers the purposes for which data is being used.”

It remains to be seen if the issue will be tackled by a set of best practices, regulation, or both.



Less Than Satisfied with Self-Regulation? FTC Chair Renews Push for Do Not Track

FTC Chair Edith Ramirez created some waves in her first speech to the advertising industry this week. Ramirez renewed the call for a universal Do Not Track mechanism, implicitly dismissing the progress of AdChoices, the Digital Advertising Alliance’s opt-out program. The FTC’s critical stance, along with a renewed initiative in the Senate, signals that the government is unsatisfied with the industry’s progress toward enhanced consumer controls over privacy and may seek a public, rather than private, solution.

“Consumers await a functioning Do Not Track system, which is long overdue,” Ramirez said. “We advocated for a persistent Do Not Track mechanism that allows consumers to stop collection of data across all sites, and not just for targeted ads.”

The comments, delivered before the American Advertising Federation at its annual Advertising Day on Capitol Hill, illustrated a rift between advertisers and regulators over the progress of self-regulatory programs and consumers’ perceptions of online behavioral advertising. Two years ago, the FTC called on advertisers to develop a program that would give consumers a choice to opt out of behaviorally targeted ads. Speaking to AdWeek, Stu Ingis, a partner at Venable who acts as the DAA’s attorney, said of Ramirez’s remarks: “We have solved it. The DAA’s program covers 100 percent of the advertising ecosystem. We made our agreements.”

The DAA also recently released the results of a poll it commissioned, with nearly 70 percent of responding consumers saying they would like at least some ads tailored directly to their interests, and 75 percent saying they preferred an ad-supported internet model. (The poll comes with some caveats, described by an AdWeek piece today.)

However, in her speech Ramirez spoke of consumers’ “unease” with online tracking: “An online advertising system that breeds consumer discomfort is not a foundation for sustained growth. More likely, it is an invitation to Congress and other policymakers in the U.S. and abroad to intervene with legislation or regulation and for technical measures by browsers or others to limit tracking,” she said.

Ramirez also urged the advertising community to keep working within the multiparty process led by the W3C (World Wide Web Consortium) to develop a browser-based Do Not Track program. However, there has been little concrete progress in the talks so far.
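At the protocol level, the browser-based mechanism under discussion at the W3C is simple: a browser sends a `DNT: 1` request header, and a cooperating server or ad network refrains from tracking. A minimal server-side check might look like the sketch below; the function name and surrounding logic are hypothetical:

```python
def should_track(request_headers: dict) -> bool:
    """Honor the W3C Do Not Track request header.

    Browsers with DNT enabled send "DNT: 1" on every request;
    "0" signals explicit consent to tracking, and an absent
    header expresses no preference.
    """
    dnt = request_headers.get("DNT")
    return dnt != "1"
```

The hard part was never the header itself but defining what a compliant server must stop doing when it sees `DNT: 1`, which is precisely the question stalled in the W3C talks.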

The online advertising industry may be running out of time. Senator Jay Rockefeller (D-W.Va.), chair of the Senate Commerce Committee, announced that he would hold a hearing next week to discuss legislation that would mandate a Do Not Track standard. The chairman, along with Sen. Richard Blumenthal (D-CT), introduced the Do Not Track Online Act in February. The bill would direct the FTC to write regulations governing when internet firms must honor a consumer’s request that their information not be collected, and deputize the FTC and state attorneys general to enforce the rules.

“Industry made a public commitment to honor Do-Not-Track requests from consumers but has not yet followed through,” Rockefeller said of the hearing. “I plan to use this hearing to find out what is holding up the development of voluntary Do-Not-Track standards that should have been adopted at the end of last year.”

If Congress and the FTC agree that the advertising industry hasn’t honored its commitments, the chances for self-regulation without a government mandate may dwindle further.

Sources:

AdWeek:  FTC Chair Stuns Advertisers

The Hill: Sen. Rockefeller to Push for Do Not Track at Hearing



The FTC Publishes a Staff Report on Mobile Apps for Children and Privacy

The Federal Trade Commission (FTC) just released a Staff Report (the Report) titled “Mobile Apps for Kids: Current Privacy Disclosures Are Disappointing.”

 

Mobile applications (apps) are increasingly popular among children and teenagers, even the very young. Indeed, the Report found that 11% of the apps sold by Apple have toddlers as their intended audience (Report p. 6). Apps geared to children are often either free or inexpensive, which makes them easy to purchase, even on a pocket-money budget (Report pp. 7-8).

As such, according to the Report, these apps seem to be intended for children’s use, and some may even be “directed to children” within the meaning of the Children’s Online Privacy Protection Act (COPPA) and the FTC’s implementing Rule (the Rule), which defines “[w]ebsite or online service directed to children” at 16 C.F.R. § 312.2. Under COPPA and the Rule, operators of online services directed to children under 13 years of age, including apps, must provide notice and obtain parental consent before collecting children’s personal information. Yet the FTC staff was unable, in most instances, to find out whether an app collected any data or, if it did, the type of data collected, the purpose for collecting it, and who collected or obtained access to such data (Report p. 10).

 

“The mobile app market place is growing at a tremendous speed, and many consumer protections, including privacy and privacy disclosures, have not kept pace with this development” (Report p. 3).

 

Downloading an app on a smartphone may have an impact on children’s privacy, as apps are able to gather personal information such as the user’s geolocation, phone number, or contact list, all without her parents’ knowledge. While app stores and operating systems provide rating systems and controls that allow parents to restrict access to mobile content and features, and even to limit data collection, they do not provide information about which data is collected and whether it is shared (Report p. 15).

 

The Report concludes by recommending that app stores, app developers, and third parties providing services within apps increase their efforts to provide parents with “clear, concise and timely information” about apps downloaded by children. Parents would then be able to know, before downloading an app, what data will be collected, how it will be used, and who will obtain access to this data (Report p. 17). This should be done by using “simple and short disclosures or icons that are easy to find and understand on the small screen of a mobile device.” (Report p. 3)

 

Recall that United States of America v. W3 Innovations, LLC, filed in August 2011, was the first FTC case involving mobile applications.

 



Federal Trade Commission is Seeking the Public’s Comments on COPPA Rule

The Federal Trade Commission (FTC) is seeking comments from the general public on proposed amendments to the Children’s Online Privacy Protection Rule (COPPA Rule or the Rule).

The Children’s Online Privacy Protection Act (COPPA) was passed in 1998. It required the FTC to issue regulations regarding the collection of children’s personal information by operators of websites or online services directed to children under 13, and to enforce these regulations. The COPPA Rule was issued in November 1999, and became effective on April 21, 2000.

The COPPA Rule required the FTC, no later than April 21, 2005, to conduct a review of the Rule and report the results of this review to Congress. The FTC sought public comments in 2005 on the Rule, and also sought additional comments on the COPPA Rule’s sliding scale approach to obtaining parental consent, which takes into account how children’s collected information will be used. The FTC announced in April 2006 its decision to retain the COPPA Rule without changes.

In March 2010, the FTC asked the public to comment on whether changes to technology warrant changes to the COPPA Rule. The FTC also held a public roundtable during the comment period to discuss COPPA’s definitions of “Internet,” “website,” and “online service” as they apply to new devices and technologies.

After reviewing these public comments, the FTC is now proposing to amend the COPPA Rule. It proposes to modify some of the Rule’s definitions, and to update the requirements for parental consent, confidentiality and security, and safe harbor provisions. The FTC also proposes to add a new provision addressing data retention and deletion.

Parental Consent (16 CFR 312.5):

(p. 59 and following)

The FTC proposes to eliminate the “email plus” method for parental consent. This method allows operators to obtain verifiable parental consent through an email from the parent, coupled with an additional step, such as obtaining a postal address or telephone number from the parent and confirming the parent’s consent by letter or telephone call.

The FTC found that electronic scans and video conferencing technologies are functionally equivalent to the written and oral methods of parental consent originally recognized by the FTC in 1999. Therefore, the FTC proposes to recognize these two methods as a way to obtain verifiable parental consent.  The FTC also proposes to allow operators to collect a form of government-issued identification (driver’s license, truncated social security number) from the parent, as a way to verify the parent’s identity, provided that the parent’s identification is deleted “promptly” once the verification is done (p. 63).

Confidentiality, Security, and Integrity of Personal Information Collected From Children (16 CFR 312.8):

(p. 76 and following)

The Commission proposes to amend § 312.8 to strengthen the provision for maintaining the confidentiality, security, and integrity of personal information. The FTC thus proposes adding a requirement that “operators take reasonable measures to ensure that any service provider or third party to whom they release children’s personal information has in place reasonable procedures to protect the confidentiality, security, and integrity of such personal information.” Indeed, COPPA requires operators to establish and maintain reasonable procedures to protect the confidentiality, security, and integrity of personal information collected from children, but does not specify the data security obligations of third parties.

The Commission proposes to amend § 312.8 to add:

 

“The operator must establish and maintain reasonable procedures to protect the confidentiality, security, and integrity of personal information collected from children. The operator must take reasonable measures to ensure that any service provider or any third party to whom it releases children’s personal information has in place reasonable procedures to protect the confidentiality, security, and integrity of such personal information.”

 

Safe Harbors (current 16 CFR 312.10, proposed 16 CFR 312.11):

(p. 80 and following)

COPPA established a “safe harbor” for participants in FTC-approved COPPA self-regulatory programs: compliance with such a program serves as a “safe harbor” against an FTC enforcement action. Examples of such programs include the Children’s Advertising Review Unit of the Council of Better Business Bureaus and TRUSTe.

The FTC proposes to amend paragraph (b)(2) of the safe harbor provisions of the Rule to read:

“An effective, mandatory mechanism for the independent assessment of subject operators’ compliance with the self-regulatory program guidelines. At a minimum, this mechanism must include a comprehensive review by the safe harbor program, to be conducted not less than annually, of each subject operator’s information policies, practices, and representations. The assessment mechanism required under this paragraph can be provided by an independent enforcement program, such as a seal program.”

Data Retention and Deletion Requirements (proposed 16 CFR 312.10):

(p. 78 and following)

The FTC proposes to add new data retention and deletion provisions. Operators would be permitted to retain children’s personal information for only as long as reasonably necessary to fulfill the purpose for which the information was collected. Operators would also have to delete this information using reasonable measures to protect against unauthorized access to, or use of, the information in connection with its deletion.

The new data retention and deletion provision (§ 312.10) would read:

“An operator of a website or online service shall retain personal information collected online from a child for only as long as is reasonably necessary to fulfill the purpose for which the information was collected. The operator must delete such information using reasonable measures to protect against unauthorized access to, or use of, the information in connection with its deletion.”
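Operationally, the proposed retention obligation amounts to purging records once their collection purpose is exhausted. A schematic sketch follows; the retention periods, purpose labels, and record layout are invented for illustration and are not drawn from the proposed Rule:

```python
from datetime import datetime, timedelta, timezone

# Hypothetical retention periods, one per collection purpose.
RETENTION = {
    "account_registration": timedelta(days=365),
    "contest_entry": timedelta(days=30),
}

def expired_records(records, now=None):
    """Yield child records held longer than reasonably necessary
    for the purpose for which they were collected; a deletion job
    would then purge them using appropriate safeguards."""
    now = now or datetime.now(timezone.utc)
    for rec in records:
        if now - rec["collected_at"] > RETENTION[rec["purpose"]]:
            yield rec
```

A periodic job built on such a filter would then delete the expired records, with the “reasonable measures” language of proposed § 312.10 governing how the deletion itself is secured against unauthorized access.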

Written comments must be received on or before November 28, 2011.



FTC’s Data Privacy Staff Report – Comments Due Jan. 31

Last week, the Federal Trade Commission released its long-awaited privacy report.  Called “Protecting Consumer Privacy in an Era of Rapid Change”, the 79-page preliminary staff report outlines a framework for consumer privacy based on three principles: (1) Privacy By Design; (2) Simplified Choice; and (3) Transparency. 
 
Some of its key proposals include: a “Do Not Track” browser add-on and other changes to consumer privacy choices; broadening the scope “to all commercial entities that collect consumer data in both offline and online contexts, regardless of whether such entities interact directly with consumers;” and looking at whether COPPA-style consent requirements should apply to teenagers. The FTC is requesting comments on the report by January 31, 2011, and plans to issue a final report later in 2011. Annexed to the report are six pages of questions to which the FTC seeks comments.
 
The first half of the report discusses the principles of “notice and choice” and “harm” that have formed the basis for the FTC’s privacy-related policy work, educational efforts, and enforcement actions. It also summarizes the FTC’s activities and provides an overview of key issues raised during several years of roundtable discussions involving consumer advocacy groups, businesses, academicians and others. The second half of the report expands on the new principles, which appear to simply consolidate and expand upon the earlier principles – “notice” becomes “transparency”, “choice” becomes “simplified choice”, and “harm” becomes “privacy by design”:
  • Privacy by Design – Companies are urged to “incorporate substantive privacy and security protections into their everyday business practices and consider privacy issues systemically, at all stages of the design and development of their products and services.” Companies are urged to collect information only for a specific purpose, limit the amount of time that data is stored, use reasonable safeguards, and develop comprehensive, company-wide privacy programs. However, the FTC staff also recognizes that these measures need to be tailored to each company’s data practices – companies that collect limited amounts of non-sensitive data need not implement the same types of programs required by a company that sells large amounts of sensitive personal data.
  • Simplified Choice – Companies should “describe consumer choices clearly and concisely, and offer easy-to-use choice mechanisms . . . at a time and in a context in which the consumer is making a decision about his or her data.” The FTC is proposing a new “laundry list” approach to determine whether or not companies need to provide choice to consumers. For example, defined “commonly accepted practices” generally will not require choice, whereas other practices may require either (1) some type of choice mechanism; (2) an enhanced choice mechanism; or (3) even more restrictions than enhanced consent. As this is designed for both online and offline behaviors, categorizing each company’s practices as “commonly accepted” or not could be a daunting task. A chart below outlines the basics of simplified choice.
    • Do-Not-Track: The day after the report issued, the Commerce Department’s NTIA testified to Congress that it would convene industry and consumer groups to discuss achieving voluntary agreements on Do-Not-Track. The FTC would then “ensure compliance with these voluntary agreements, as appropriate.”
    • ABA Antitrust Section Members note: Companies in markets with limited competition may be subject to “Enhanced Privacy protections” and/or “Additional Enhanced Privacy Protections.” 
  • Greater Transparency – Companies should “make their data practices more transparent to consumers”. The FTC suggests developing a standardized policy like the notice templates currently developed for financial companies complying with Gramm-Leach-Bliley. The FTC is also considering whether to increase the transparency of data broker activities and proposes allowing consumers to access (but not necessarily change) profiles compiled about them from many sources.
Two Commissioners issued concurring statements to the proposed framework. Commissioner Kovacic called some of the recommendations “premature” – including the Do-Not-Track proposal. He also pointed out that the report lacked consideration of the existing federal and state oversight of privacy concerns. Commissioner Rosch issued a concurring statement that applauds the report as a useful “hortatory exercise”, but criticizes the new approach. He states that it could be overstepping the FTC’s bounds to consider “reputational harm” and “other intangible privacy interests” if no deception is involved.
 
Stay tuned – there are many privacy developments on the horizon. In remarks delivered with the report, Chairman Leibowitz declared that “despite some good actors, self-regulation of privacy has not worked adequately and is not working adequately for American consumers.” He signaled that the FTC will be bringing more cases in the coming months – and that cases involving children are of particular interest. In addition, the Commerce Department’s “green paper” on Commercial Data Privacy is expected soon.
 
Table – Simplified Choice

Choice Not Required – “Commonly Accepted Practices”
1. Laundry list of practices the report suggests: first-party marketing (FTC seeks comment on scope); internal operations; legal compliance; fraud prevention.

Choice Not Required – No Choice, but Additional Transparency (Notice)
1. Technically difficult/not feasible to provide a choice mechanism: e.g., data brokers? (comment sought)
2. “Enhancement” – compiling data from several sources to profile consumers (comment sought on whether choice should be provided about these practices).

Choice Mechanism Required (unspecified – presumably company discretion; also Do Not Track)
1. Practices that are not “commonly accepted” and not “technically difficult”: e.g., data brokers (comment sought).

Enhanced Consent (Affirmative Express Consent)
1. Sensitive information used for online behavioral advertising: information about children, financial and medical information, precise geolocation data.
2. Sensitive users: children; teenagers (staff seeks comment); users who lack meaningful choice due to lack of competition in the market (staff seeks comment).
3. Changing a specific purpose: use of data in a materially different manner than claimed when the data was posted, collected, or otherwise obtained.

“Even More Heightened Restrictions” than Enhanced Consent
1. Lack of alternative consumer choices due to industry factors (competition): broadband ISP deep packet inspection.
2. Others?

Do Not Track
1. Online behavioral advertisers.
2. Others?



New FTC Privacy Report – Telephone Press Conference Today – Dec. 1, 2010

Dec. 1, 2010. Federal Trade Commission Chairman Leibowitz, Deputy Director of the Bureau of Consumer Protection Jessica Rich, and Chief Technologist Edward Felten will be holding a telephone conference this afternoon at 1pm to answer reporters’ questions about the new FTC privacy report released today.

Call-in lines (press only):

United States – (800) 398-9367 /  International – (612) 332-0820

Confirmation # – 182971

For more information, contact the FTC Office of Public Affairs – 202-326-2180