The Secure Times

An online forum of the ABA Section of Antitrust Law's Privacy and Information Security Committee



EPIC is Suing the FTC to Compel Enforcement of Google Buzz Consent Order

The Electronic Privacy Information Center (EPIC) is suing the Federal Trade Commission (FTC) to compel the agency to enforce the October 2011 Google Buzz consent order, In the Matter of Google, Inc., FTC File No. 102 3136, which was issued following a complaint EPIC filed with the FTC in February 2010.


Pursuant to this consent order, Google may not misrepresent the extent to which it maintains and protects the privacy and confidentiality of the information it collects, including the purposes for which the information is collected and the extent to which consumers may exercise control over its collection, use, or disclosure. Google must also obtain users’ express affirmative consent before any new or additional sharing of their information with third parties; the third parties must be identified, and the purpose(s) for sharing the information must be disclosed to users. The consent order also requires Google to establish and implement a comprehensive privacy program.


In January, Google announced changes to its privacy policy that will take effect on March 1, 2012. Google will then start collecting user data across all the different Google sites, such as Gmail or YouTube, provided that the user is logged into her Google account. Ms. Alma Whitten, Google’s Director of Privacy, Product and Engineering, stated that Google can thus provide “a simpler, more intuitive Google experience.” A Google user will have one single Google profile. There is, however, no opt-out available. The new privacy policy states:


“We may use the name you provide for your Google Profile across all of the services we offer that require a Google Account. In addition, we may replace past names associated with your Google Account so that you are represented consistently across all our services. If other users already have your email, or other information that identifies you, we may show them your publicly visible Google Profile information, such as your name and photo.”


According to EPIC’s complaint, these changes are “in clear violation of [Google’s] prior commitments to the Federal Trade Commission.” EPIC argues that Google violated the consent order “by misrepresenting the extent to which it maintains and protects the privacy and confidentiality of [users’] information, by misrepresenting the extent to which it complies with the U.S.-EU Safe Harbor Framework… [and] by failing to obtain affirmative consent from users prior to sharing their information with third parties.”


The European Union (EU) is also concerned about these changes. The Article 29 Working Party sent a letter to Google on February 2 to inform the California company that it will “check the possible consequences for the protection of the personal data of [E.U. Member States’] citizens of these changes.” Google answered the Commission Nationale de l’Informatique et des Libertés (CNIL), France’s Data Protection Authority, which is in charge of coordinating the inquiry into Google’s privacy changes, that the changes were made to ensure that Google’s privacy policy is “simpler and more understandable” and also “to create a better user experience.”


Meanwhile, EPIC argues that the FTC has a non-discretionary obligation to enforce a final order, yet the agency has not taken any action with respect to the upcoming changes to Google’s privacy policy.



Rep. Stearns Introduces New Privacy Bill

Rep. Cliff Stearns (R-FL) yesterday introduced a new privacy bill, H.R. 1528, “To protect and enhance consumer privacy, and for other purposes.” Rep. Stearns had worked on a draft privacy bill with Rep. Rick Boucher (D-VA) during the last Congress; Rep. Boucher was defeated in the last election.

Rep. Stearns said: “Using my privacy legislation from the 109th Congress as a base, I took the comments submitted to Chairman Boucher and worked with stakeholders on developing this bill.  The introduction of this bill is not the end of the process.  I will continue to work to improve the language to ensure that regulatory distinctions are not being made on like services and that privacy is administered by a single agency, across the entire Internet economy.”

Violation of any provision of the Act would be an unfair or deceptive act or practice unlawful under section 5(a)(1) of the Federal Trade Commission Act. The Act would not provide any private right of action, and would preempt state laws.

The bill would apply to an entity, its agents, or affiliates that “collects, sells, discloses for consideration, or uses personally identifiable information of more than 5,000 consumers during any consecutive 12-month period.” This definition includes non-profit organizations, but does not include governmental agencies, providers of professional services, or data processing outsourcing entities, Section 3(4).

Regulating the “cloud”

Data processing outsourcing entities would have to be “contractually obligated to comply with security controls specified by [covered entities] and [would have] no right to use the covered entity’s personally identifiable information other than for performing data processing outsourcing services for the covered entity or as required by contract or law,” Section 3(5).

Notice to consumers before using personally identifiable information for a purpose unrelated to the transaction

Covered entities would have to notify consumers before using any personally identifiable information they collected for a purpose unrelated to a transaction, Section 4(a)(1).

Notice to consumers of any material change in their privacy policy

Covered entities would have to provide notice to consumers after making a material change to their privacy policies, Section 4(a)(2).

Establishing a written and clear privacy policy, and a security policy

Covered entities would have to establish a privacy policy with respect to the collection, sale, disclosure, dissemination, use, and security of the personally identifiable information of consumers, Section 5(a), written in “brief, concise, clear, and conspicuous (…) plain language,” Section 5(b)(1). The privacy policy would inform consumers about the “types of information that may be collected or used, how the information may be used, and whether the consumer is required to provide the information in order to do business with the covered entity,” Section 5(b)(3).

The policy would also inform consumers about the extent to which their information is “subject to sale or disclosure for consideration to a covered entity that is not an information sharing affiliate of the covered entity,” Section 5(b)(3)(E), and whether the information security practices of the covered entity meet “security requirements necessary to prevent unauthorized disclosure or release of personally identifiable information,” Section 5(b)(3)(F).

Covered entities would also have to implement an “information security policy applicable to the information security practices and treatment of personally identifiable information maintained by the covered entity, that is designed to prevent the unauthorized disclosure or release of such information,” Section 8.

Providing consumers the opportunity to preclude the sale or disclosure of their information to any organization that is not an information-sharing affiliate

Covered entities would have to provide consumers, at no charge, the “opportunity to preclude any sale or disclosure for consideration of the consumer’s personally identifiable information, provided in a particular data collection, that may be used for a purpose other than a transaction with consumer, to any covered entity that is not an information-sharing affiliate of the covered entity providing such opportunity,” Section 6(a)(1). This preclusion would remain in effect for five years, or until the consumer indicates otherwise, whichever occurs sooner, Section 6(a)(2). Covered entities could provide the consumer an opportunity to allow the sale or disclosure “in exchange for a benefit to the consumer,” Section 6(b).

Self-regulatory programs approved by the FTC

The Federal Trade Commission (“FTC”) would presume that a covered entity complies with the provisions of the Act if it participates in a self-regulatory program, Section 9(a), which would have to be approved by the FTC, Section 9(b). Denial of approval of a self-regulatory program would be subject to judicial review, Section 9(b)(5).

Self-regulatory consumer dispute resolution process

If a consumer has a dispute with a participant in a self-regulatory program, and if this dispute pertains to the entity’s privacy policy or practices required for participation in the self-regulatory program, the consumer would have to initially seek resolution through a dispute resolution process, Section 9(d).



Representatives Markey and Barton Send a Letter to Facebook over Announced Feature

Representative Edward Markey and Representative Joe Barton, Co-Chairmen of the Congressional Privacy Caucus, sent a letter on February 2nd to Facebook’s CEO, Mark Zuckerberg, requesting information about Facebook’s announcement on January 14 that it plans to make its users’ addresses and mobile phone numbers available to third-party web sites. The feature would make a user’s address and mobile phone number accessible to external web site application developers, but not the addresses or mobile phone numbers of a user’s friends.

Facebook then announced on January 17 that it decided to delay this new feature, after having received “some useful feedback that [Facebook] could make people more clearly aware of when they are granting access to this data.” Facebook is currently making changes to the feature “to help ensure [users] only share this information when [they] intend to do so.”

The letter asks several questions about the feature:

The First question asks Facebook to describe whether any user information in addition to address and mobile phone number would be shared with third-party application developers, and also to describe whether such information was shared prior to the January 17 announcement of the suspension of the feature.

The Second question asks Facebook to describe what user information will be shared with third-party application developers once the feature is again implemented.

The Fourth question asks Facebook to describe the process which led to the suspension of the program.

The Sixth question asks Facebook to describe Facebook’s internal policies and procedures for ensuring that this feature complies with Facebook’s privacy policy.

The Eighth question asks whether users who had opted in to sharing their addresses and mobile phone numbers will be able to have this information deleted by third party applications or web sites.

The Ninth question asks Facebook whether Facebook’s privacy policy would have been violated had this feature been implemented.

The Tenth question asks Facebook if, given the sensitivity of personal addresses and mobile phone numbers, Facebook believes that opt-in should be clearer and more prominent.

Representative Markey said that “Facebook needs to protect the personal information of its users to ensure that Facebook doesn’t become Phonebook." Representative Barton said that “The computer – especially with sites like Facebook – is now a virtual front door to your house allowing people access to your personal information. You deserve to look through the peep hole and decide who you are letting in.”



FTC’s Data Privacy Staff Report – Comments Due Jan. 31

Last week, the Federal Trade Commission released its long-awaited privacy report.  Called “Protecting Consumer Privacy in an Era of Rapid Change”, the 79-page preliminary staff report outlines a framework for consumer privacy based on three principles: (1) Privacy By Design; (2) Simplified Choice; and (3) Transparency. 
 
Some of its key proposals include: a “Do Not Track” browser add-on and other changes to consumer privacy choices; broadening the scope “to all commercial entities that collect consumer data in both offline and online contexts, regardless of whether such entities interact directly with consumers;” and looking at whether COPPA-style consent requirements should apply to teenagers. The FTC is requesting comments on the report by January 31, 2011, and plans to issue a final report later in 2011. Annexed to the report are six pages of questions to which the FTC seeks comments.
 
The first half of the report discusses the principles of “notice and choice” and “harm” that have formed the basis for the FTC’s privacy-related policy work, educational efforts, and enforcement actions. It also summarizes the FTC’s activities and provides an overview of key issues raised during several years of roundtable discussions involving consumer advocacy groups, businesses, academicians and others. The second half of the report expands on the new principles, which appear to simply consolidate and expand upon the earlier principles – “notice” becomes “transparency”, “choice” becomes “simplified choice”, and “harm” becomes “privacy by design”:
  • Privacy by Design – Companies are urged to “incorporate substantive privacy and security protections into their everyday business practices and consider privacy issues systemically, at all stages of the design and development of their products and services.” Companies are urged to collect information only for a specific purpose, limit the amount of time that data is stored, use reasonable safeguards, and develop comprehensive, company-wide privacy programs. However, the FTC staff also recognizes that these measures need to be tailored to each company’s data practices – companies that collect limited amounts of non-sensitive data need not implement the same types of programs required by a company that sells large amounts of sensitive personal data.
  • Simplified Choice – Companies should “describe consumer choices clearly and concisely, and offer easy-to-use choice mechanisms . . . at a time and in a context in which the consumer is making a decision about his or her data.”  The FTC is proposing a new “laundry list” approach to determine whether or not companies need to provide choice to consumers. For example, defined “commonly accepted practices” generally will not require choice, whereas other practices may require either (1) some type of choice mechanism; (2) enhanced choice mechanism; or (3) even more restrictions than enhanced consent. As this is designed for both online and offline behaviors, categorizing each company’s practices as “commonly accepted” or not could be a daunting task.  A chart below outlines the basics of simplified choice.
    • Do-Not-Track: The day after the report was issued, the Commerce Department’s NTIA testified to Congress that it would be convening industry and consumer groups to discuss achieving “voluntary agreements” on Do-Not-Track.  The FTC would then “ensure compliance with these voluntary agreements, as appropriate.”
    • ABA Antitrust Section Members note: Companies in markets with limited competition may be subject to “Enhanced Privacy protections” and/or “Additional Enhanced Privacy Protections.” 
  • Greater Transparency – Companies should “make their data practices more transparent to consumers”. The FTC suggests developing a standardized policy like the notice templates currently developed for financial companies complying with Gramm-Leach-Bliley. The FTC is also considering whether to increase the transparency of data broker activities and proposes allowing consumers to access (but not necessarily change) profiles compiled about them from many sources.
Two Commissioners issued concurring statements to the proposed framework. Commissioner Kovacic called some of the recommendations “premature” – including the Do-Not-Track proposal. He also pointed out that the report lacked consideration of the existing federal and state oversight of privacy concerns. Commissioner Rosch issued a concurring statement that applauds the report as a useful “hortatory exercise”, but criticizes the new approach. He states that it could be overstepping the FTC’s bounds to consider “reputational harm” and “other intangible privacy interests” if no deception is involved.
 
Stay tuned – there are many privacy developments on the horizon. In remarks delivered with the report, Chairman Leibowitz declared that “despite some good actors, self-regulation of privacy has not worked adequately and is not working adequately for American consumers.” He signaled that the FTC will be bringing more cases in the coming months – and that cases involving children are of particular interest.  In addition, the Commerce Department’s “green paper” on Commercial Data Privacy is expected soon.
 
Table – Simplified Choice

Choice Not Required

  • “Commonly Accepted Practices”: laundry list of practices the report suggests – first party marketing (FTC seeks comment on scope); internal operations; legal compliance; fraud prevention.
  • No choice, but additional transparency (notice): (1) technically difficult/not feasible to provide a choice mechanism, e.g., data brokers? (comment sought); (2) “Enhancement?” – compiling data from several sources to profile consumers (comment sought on whether choice should be provided about these practices).

Choice Mechanism Required

  • Choice mechanism (unspecified – presumably company discretion; also Do Not Track): not “Commonly Accepted Practices” and not “technically difficult,” e.g., data brokers (comment sought).
  • Enhanced Consent (affirmative express consent): (1) sensitive information for online behavioral advertising – information about children, financial and medical information, precise geolocation data; (2) sensitive users – children; teenagers (staff seeks comment); users who lack meaningful choice because of a lack of competition in the market (staff seeks comment); (3) changing the specified purpose – use of data in a materially different manner than claimed when the data was posted, collected, or otherwise obtained.
  • “Even more heightened restrictions” than Enhanced Consent: (1) lack of alternative consumer choices due to industry factors (competition) – broadband ISP deep packet inspection; (2) others?
  • Do Not Track: (1) online behavioral advertisers; (2) others?