The Secure Times

An online forum of the ABA Section of Antitrust Law's Privacy and Information Security Committee


Leave a comment

Yesterday at the FTC, President Obama Announced Plans for New Data Privacy and Security Laws: Comprehensive Data Breach Notification Law, Consumer Privacy Bill of Rights, and Student Digital Privacy Act

Yesterday afternoon, President Barack Obama gave a quip-filled speech at the Federal Trade Commission in which he praised the FTC's efforts in protecting American consumers over the past 100 years and unveiled his plans for legislation to protect American consumers from identity theft and to protect schoolchildren's personal information from being used by marketers. These plans build upon past legislative efforts and the Administration's focus on cybersecurity, Big Data, and consumer protection. Specifically, on February 23, 2012, the White House released "Consumer Data Privacy in a Networked World: A Framework for Protecting Privacy and Promoting Innovation in the Global Digital Economy" (the "Privacy Blueprint"), and in January 2014, President Obama asked his Counselor, John Podesta, to lead a working group to examine Big Data's impact on government, citizens, businesses, and consumers. The working group produced Big Data: Seizing Opportunities, Preserving Values (the "Big Data Report") on May 1, 2014.

In his speech, the President highlighted the need for increased privacy and security protections as more people go online to conduct their personal business—shopping, managing bank accounts, paying bills, handling medical records, managing their "smart" homes—stating that "we shouldn't have to forfeit our basic privacy when we go online to do our business." The President referenced his "BuySecure" initiative, which would combat credit card fraud through a "chip-and-PIN" system for credit cards and credit-card readers issued by the United States government. In that system, a microchip embedded in the credit card would replace the magnetic strip, since microchips are harder for thieves to clone. The consumer would also need to enter a PIN into the credit card reader, just as with an ATM or debit card. The President praised those credit card issuers, banks, and lenders that allow consumers to view their credit scores for free. He also lauded the FTC's efforts to help identity theft victims by working with credit bureaus and by providing guidance to consumers on its website, identitytheft.gov.

The first piece of legislation the President discussed was a comprehensive breach notification law that would require companies to notify consumers of a breach within 30 days and would allow identity thieves to be prosecuted even when the criminal activity occurred overseas. Currently, there is no federal breach notification law; instead, many states have laws requiring companies to notify affected consumers and/or regulators, depending on the type of information compromised and the jurisdiction in which the organization operates. These state laws also require that breach notification letters to consumers include certain information, such as the risks posed to the individual as a result of the breach and steps to mitigate the harm. This "patchwork of laws," President Obama noted, is confusing to customers and costly for companies to comply with. The plan to introduce a comprehensive breach notification law adopts the Big Data Report's recommendation that Congress pass legislation providing a single national data breach standard along the lines of the Administration's May 2011 cybersecurity legislative proposal. Such legislation should impose reasonable time periods for notification, minimize interference with law enforcement investigations, and potentially prioritize notification about large, damaging incidents over less significant incidents.

The President next discussed the second piece of legislation he would propose, the Consumer Privacy Bill of Rights. This initiative is not new: Electronic Privacy Bill of Rights bills were introduced in 1998 and 1999, and in 2011, Senators John Kerry, John McCain, and Amy Klobuchar introduced S.799 – Commercial Privacy Bill of Rights Act of 2011. The Administration's Privacy Blueprint of February 23, 2012 set forth the Consumer Privacy Bill of Rights and, along with the Big Data Report, directed the Department of Commerce's National Telecommunications and Information Administration (NTIA) to seek comments from stakeholders in order to develop legally enforceable codes of conduct that would apply the Consumer Privacy Bill of Rights to specific business contexts.

The Big Data Report of May 1, 2014 recommended that the Department of Commerce seek stakeholder and public comment on big data developments and how they impact the Consumer Privacy Bill of Rights draft, and that it consider legislative text for the President to submit to Congress. On May 21, 2014, Senator Robert Menendez introduced S.2378 – Commercial Privacy Bill of Rights Act of 2014. The Consumer Privacy Bill of Rights sets forth seven basic principles:

1) Individual control – Consumers have the right to exercise control over what information data companies collect about them and how it is used.

2) Transparency – Consumers have the right to easily understandable and accessible privacy and security practices.

3) Respect for context – Consumers expect that data companies will collect, use, and disclose the information they provide in ways consistent with the context in which it was provided.

4) Security – Consumers have the right to secure and responsible handling of personal data.

5) Access and accuracy – Consumers have the right to access and correct their personal data in usable formats in a manner that is appropriate to the data’s sensitivity and the risk of adverse consequences if the data is not accurate.

6) Focused Collection – Consumers have the right to reasonable limits on the personal data that companies collect and retain.

7) Accountability – Consumers have the right to have companies that collect and use their data maintain appropriate measures in place to assure that they comply with the Consumer Privacy Bill of Rights.

The President next discussed the third piece of legislation he would propose, the Student Digital Privacy Act. The President noted how new educational technologies, including tailored websites, apps, tablets, digital tutors, and textbooks, transform how children learn and help parents and teachers track students' progress. With these technologies, however, companies can mine student data for non-educational, commercial purposes such as targeted marketing. The Student Digital Privacy Act adopts the Big Data Report's policy recommendation of ensuring that students' data, collected and gathered in an educational context, is used for educational purposes and that students are protected against having their data shared or used inappropriately. The President noted that the Student Digital Privacy Act would not "reinvent the wheel" but would mirror state legislation on a federal level, specifically the California law taking effect next year that bars education technology companies from selling student data or using that data to target students with ads. The current federal law protecting students' privacy is the Family Educational Rights and Privacy Act of 1974, which protects official educational records from being released by schools but does not protect against companies' data mining that reveals students' habits and profiles for targeted advertising. The President also highlighted current self-regulation, the Student Privacy Pledge, signed by 75 education technology companies committing voluntarily not to sell student information or use education technologies to send students targeted ads. Commentators have debated whether self-regulation will work and whether the proposed Act would go far enough.
The President remarked that while parents want to make sure their children are being smart and safe online, and it is their responsibility as parents to do so, structure is needed so that parents can ensure that information is not being gathered about students without the parents or the kids knowing about it. This hinted at a notification requirement and opt-out for student data mining that is missing from state legislation but is a requirement of the Children's Online Privacy Protection Act of 1998 (COPPA). Specifically, COPPA requires companies and commercial website operators that direct online services to children under 13, or that know they are collecting personal information from children under 13, to provide parents with notice about the site's information-collection practices, obtain verifiable parental consent before collecting personal information, give parents a choice as to whether the personal information will be disclosed to third parties, and give parents access to and the opportunity to delete their children's personal information, among other things.

President Obama noted that his speech marked the first time in 80 years—since FDR—that a President had come to the FTC. His speech at the FTC on Monday was the first stop of a three-part tour leading up to his State of the Union address. Next, the President planned to speak at the Department of Homeland Security on how the government can collaborate with the private sector to ward off cybersecurity attacks. His final speech will take place in Iowa, where he will discuss how to bring faster, cheaper broadband access to more Americans.


Leave a comment

Canada’s Anti-Spam Law (CASL) in force July 1

Canada's Anti-Spam Law (CASL) enters into force on Canada Day, July 1. It was passed in 2010 as a "made-in-Canada" solution to "drive spammers out of Canada."

Are you outside Canada?  It’s important to know that this law reaches beyond Canada’s borders.  CASL is already affecting businesses in the United States, Europe and elsewhere as they change their communications practices to send emails and other “commercial electronic messages” into Canada. 

As we have described in our presentation Comparing CASL to CAN-SPAM, the new law applies to messages that are accessed by a computer system in Canada.  That means that messages sent by a person, business or organization outside of Canada, to a person in Canada, are subject to the law.

CASL expressly provides for sharing information among the Government of Canada, the Canadian CASL enforcement agencies, and “the government of a foreign state” or international organization, for the purposes of administering CASL’s anti-spam (and other) provisions.  The MOU among the Canadian CASL enforcement agencies similarly references processes to share and disseminate information received from and provided to their foreign counterpart agencies. 

In a speech on June 26, the Chair of the Canadian Radio-television and Telecommunications Commission, Jean-Pierre Blais, emphasized the CRTC's cooperation with its international counterparts to combat unlawful telemarketers, hackers and spammers that "often operate outside our borders". The Chairman specifically named "the Federal Trade Commission in the U.S., the Office of Communications (Ofcom) in the U.K., the Authority for Consumers and Markets in the Netherlands, the Australian Communications and Media Authority and others", and noted that the CRTC has led or participated in many international networks on unlawful telecommunications.

Companies should also take note that a violation of CASL might also result in the CRTC exercising its so-called “name and shame” power, by posting the name of the offender and the violation on its online compliance and enforcement list.  The CRTC has for years published notices of violation with respect to its “Do Not Call List”, and is expected to take a similar approach for CASL notices of violation as well. 

The CRTC recently published a Compliance and Enforcement Bulletin on its Unsolicited Telecommunications Rules and on CASL, available here.  The CRTC recommends implementing a corporate compliance program as part of a due diligence defence: 

Commission staff may take into consideration the existence and implementation of an effective corporate compliance program if the business presents the program as part of a due diligence defence in response to an alleged violation of the Rules or CASL. Although the pre-existence of a corporate compliance program may not be sufficient as a complete defence to allegations of violations under the Rules or CASL, a credible and effective documented program may enable a business to demonstrate that it took reasonable steps to avoid contravening the law.


Leave a comment

FTC Examines Internet of Things, Privacy and Security, in Recent Workshop

On November 19, 2013, the Federal Trade Commission held a day-long workshop, “Internet of Things: Privacy and Security in a Connected World” on the privacy implications concerning devices such as cars, home appliances, fitness equipment, and other machines that are able to gather data and connect to the internet. For consumers, these devices can help track health, remotely monitor aging family members, reduce utility bills, and even send alerts to buy more milk.

Ubiquitous Internet of Things

Technological advances and new business models centered around the internet of things have taken off. It has been reported that crowd-sourcing start-up Quirky has teamed up with GE to develop connected-home products. Another start-up company is developing tracking technology through GPS-embedded tags. On November 20, 2013, Qualcomm announced that it has developed a line of chips for the internet of things space. It has been argued that companies should adjust their business models and use the internet of things to connect with customers. These developments present the FTC with the challenge of keeping up with technology to protect consumers and the competitive landscape.

In her remarks, Chairwoman Edith Ramirez emphasized how ubiquitous smart devices have become. Five years ago, she remarked, there were already more "things" than people connected to the Internet; by 2015, there will be an estimated twenty-five billion things connected to the Internet, and by 2020, an estimated fifty billion. Commissioner Maureen Ohlhausen, in her remarks later in the workshop, stated that the FTC will conduct policy research to understand the effects that technological advances and innovative business models concerning the internet of things have on consumers and the marketplace.

Privacy and Security Challenges

Chairwoman Ramirez noted privacy and security challenges presented by the internet of things. Privacy risks are present since devices connected to the internet can collect, compile, and transmit information about consumers in ways that may not have been expected. When aggregated, the data pieces collected by devices present “a deeply personal and startlingly complete picture of each of us.” Security risks are present since “any device connected to the Internet is potentially vulnerable to hijack.” Indeed, these risks have been reported and present real concerns.

Chairwoman Ramirez noted that the FTC will be vigilant in bringing enforcement actions against companies that fail to properly safeguard consumers from security breaches. She cited as an example the FTC's first enforcement foray into the internet of things, against TRENDnet for failing to properly design and test the software for its internet-connected security cameras, leaving consumers vulnerable to a hacker who accessed the live feeds from 700 cameras and made them available on the Internet. Commissioner Ohlhausen stated that when it encounters consumer harm, the FTC will use its traditional enforcement tools to challenge any potential threats that arise, much as it has done in the data security, mobile, and big data spaces.

Chairwoman Ramirez said that companies that take part in the internet of things ecosystem are “stewards of the consumer data” and that “with big data comes big responsibility.” The FTC has published a number of best practices that Chairwoman Ramirez identified as useful for companies in the internet of things space: (1) privacy by design—privacy protections built in from the outset, (2) simplified consumer choice—allowing consumers to control their data, and (3) transparency—disclosure of what information the devices collect and how it is being used.

FTC Report Forthcoming

The FTC will produce a report on what it has learned from the November 19 workshop and provide further recommendations about best practices. The FTC report can educate consumers and businesses on how to maximize consumer benefits and avoid or minimize any identified risks. Commissioner Ohlhausen stressed that the FTC should identify whether existing laws and existing regulatory structures, including self-regulation, are sufficient to address potential harms.

Vint Cerf of Google, who gave the keynote presentation, advised that rather than relying on regulations to protect privacy, social conventions should be developed. He stated that “while regulation might be helpful, an awful lot of the problems that we experience with privacy is a result of our own behavior.”

The same day as the workshop, the Future of Privacy Forum released a white paper arguing for an updated privacy paradigm for the internet of things that focuses not on how information is collected and communicated but on how organizations use personally identifiable information.

The FTC will continue to accept comments until January 10, 2014.


1 Comment

FTC v. Wyndham Update

Edit (Feb. 5, 2014): For a more recent update on this case, please see this post.

On November 1, Maureen Ohlhausen, a Commissioner at the Federal Trade Commission (FTC), held an “ask me (almost) anything” (AMAA) session on Reddit. There were no real surprises in the questions Commissioner Ohlhausen answered, and the AMAA format is not well-suited to lengthy responses. One interesting topic that did arise, however, was the FTC’s complaint against Wyndham Worldwide Corporation, and Wyndham’s subsequent filing of a motion to dismiss the FTC action against them. Commissioner Ohlhausen declined to discuss the ongoing litigation, but asserted generally that the FTC has the authority to bring such actions under Section 5 of the FTC Act, 15 U.S.C. § 45. While there were no unexpected revelations in the Commissioner’s response, I thought it presented an excellent opportunity to bring everyone up to speed on the Wyndham litigation.

On June 26, 2012, the Federal Trade Commission (FTC) filed a complaint in Arizona federal district court against Wyndham Worldwide Corporation, alleging that Wyndham "fail[ed] to maintain reasonable security" on its computer networks, which led to a data breach resulting in the theft of payment card data for hundreds of thousands of Wyndham customers and more than $10.6 million in fraudulent charges on customers' accounts. Specifically, the complaint alleged that Wyndham engaged in deceptive business practices in violation of Section 5 of the FTC Act by misrepresenting the security measures it undertook to protect customers' personal information. The complaint also alleged that Wyndham's failure to provide reasonable data security is an unfair trade practice, also in violation of Section 5.

On August 27, 2012, Wyndham responded by filing a motion to dismiss the FTC's complaint, asserting, inter alia, that the FTC lacked the statutory authority to "establish data-security standards for the private sector and enforce those standards in federal court," thus challenging the FTC's authority to bring the unfairness count under the FTC Act. In its October 1, 2012 response, the FTC asked the court to reject Wyndham's arguments, stating that the FTC's complaint alleged a number of specific security failures on the part of Wyndham, which resulted in two violations of the FTC Act. The case was transferred to the District of New Jersey on March 25, 2013, and Wyndham's motions to dismiss were denied. On April 26, Wyndham once again filed motions to dismiss the FTC's complaint, again asserting that the FTC lacked the legal authority to set data security standards for private businesses under Section 5 of the FTC Act.

At stake in this litigation is the FTC's ability to bring enforcement claims against companies that suffer a data breach due to a lack of "reasonable security." What is unique in this case is Wyndham's decision to fight the FTC action in court rather than make efforts to settle, as other companies have done when faced with similar allegations by the FTC. For example, in 2006, the FTC hit ChoicePoint Inc. with a $10 million penalty over a data breach in which over 180,000 payment card numbers were stolen. The FTC has also gone after such high-profile companies as Twitter, HTC, and Google based on similar facts and law. Those actions resulted in out-of-court settlements.

If Wyndham’s pending motions to dismiss are denied, and the FTC ultimately prevails in this case, it is likely that the FTC will continue to bring these actions, and businesses will likely see an increased level of scrutiny applied to their network security. If, however, Wyndham succeeds and the FTC case against them is dismissed, public policy questions regarding data security will likely fall back to Congress to resolve.

Oral argument on the pending motions to dismiss is scheduled for November 7. No doubt many parties will be following these proceedings with great interest.


1 Comment

Amendments to CalOPPA Allow Minors to "Erase" Information from the Internet and Restrict Advertising Practices to Minors

On September 23, 2013, California Governor Jerry Brown signed SB568 into law, which adds new provisions to the California Online Privacy Protection Act. Officially called “Privacy Rights for California Minors in the Digital World,” the bill has already garnered the nickname of the “Internet Eraser Law,” because it affords California minors the ability to remove content or information previously posted on a Web site. The bill also imposes restrictions on advertising to California minors.

California Minors’ Right to Remove Online Content

Effective January 1, 2015, the bill requires online operators to provide a means by which California minors may remove online information posted by that minor. Online operators can elect to allow a minor to directly remove such information or can alternatively remove such information at a minor’s request. The bill further requires that online operators notify California minors of the right to remove previously-posted information.

Online operators do not need to allow removal of information in certain circumstances, including where (1) the content or information was posted by a third party; (2) state or federal law requires the operator or third party to retain such content or information; or (3) the operator anonymizes the content or information. The bill further clarifies that online operators need only remove the information from public view; the bill does not require wholesale deletion of the information from the online operator’s servers.

New Restrictions on Advertising to California Minors

Also effective January 1, 2015, the bill places new restrictions on advertising to California minors. The bill prohibits online services directed to minors from advertising certain products, including alcohol, firearms, tobacco, and tanning services. It further prohibits online operators from allowing third parties (e.g. advertising networks or plug-ins) to advertise certain products to minors. And where an advertising service is notified that a particular site is directed to minors, the bill restricts the types of products that can be advertised by that advertising service to minors.

Implications

Given the sheer number of California minors, these amendments to CalOPPA will likely have vast implications for online service providers. First, the bill extends not just to Web sites, but also to mobile apps, which is consistent with a general trend of governmental scrutiny of mobile apps. Online service providers should expect regulation of mobile apps to increase, as both California and the Federal Trade Commission have issued publications indicating concerns over mobile app privacy. Second, the bill also reflects an increased focus on privacy of children and minors. Developers should consider these privacy issues when designing Web sites and mobile apps, and design such products with the flexibility needed to adapt to changing legislation. Thus, any business involved in the online space should carefully review these amendments and ensure compliance before the January 1, 2015 deadline.



1 Comment

Less Than Satisfied with Self-Regulation? FTC Chair Renews Push for Do Not Track

FTC Chair Edith Ramirez created some waves in her first speech to the advertising industry this week. Ramirez renewed the call for a universal Do Not Track mechanism—and impliedly ignored the progress of AdChoices, the Digital Advertising Alliance's opt-out program. The FTC's critical stance, along with a renewed initiative in the Senate, signal that the government is unsatisfied with the industry's progress toward enhanced consumer controls over privacy and may seek a public, rather than private, solution.

"Consumers await a functioning Do Not Track system, which is long overdue," Ramirez said. "We advocated for a persistent Do Not Track mechanism that allows consumers to stop collection of data across all sites, and not just for targeted ads."

The comments, delivered before the American Advertising Federation at its annual advertising day on Capitol Hill, illustrated a rift between advertisers and regulators over the progress of self-regulatory programs and consumers' perceptions of online behavioral advertising. Two years ago, the FTC called on advertisers to develop a program that would give consumers a choice to opt out of behaviorally targeted ads. Speaking to AdWeek, Stu Ingis, a partner at Venable who acts as the DAA's attorney, said of Ramirez's remarks: "We have solved it. The DAA's program covers 100 percent of the advertising ecosystem. We made our agreements."

The DAA also recently released the results of a poll it commissioned, with nearly 70 percent of responding consumers saying they would like at least some ads tailored directly to their interests, and 75 percent saying they preferred an ad-supported internet model. (The poll comes with some caveats, described in an AdWeek piece today.)

However, in her speech Ramirez spoke of consumers’ “unease” with online tracking: “An online advertising system that breeds consumer discomfort is not a foundation for sustained growth. More likely, it is an invitation to Congress and other policymakers in the U.S. and abroad to intervene with legislation or regulation and for technical measures by browsers or others to limit tracking,” she said.

Ramirez also urged the advertising community to keep working within the multiparty process led by the W3C (World Wide Web Consortium) to develop a browser-based Do Not Track program. However, there has been little concrete progress in the talks so far.

The online advertising industry may be running out of time. Senator Jay Rockefeller (D-W.Va.), chair of the Senate Commerce Committee, announced that he would hold a hearing next week to discuss legislation that would mandate a Do Not Track standard. The chairman, along with Sen. Richard Blumenthal (D-Conn.), introduced the Do Not Track Online Act in February. The bill would direct the FTC to write regulations governing when internet firms must honor a consumer's request that their information not be collected, and would deputize the FTC and state attorneys general to enforce the rules.

“Industry made a public commitment to honor Do-Not-Track requests from consumers but has not yet followed through,” Rockefeller said of the hearing. “I plan to use this hearing to find out what is holding up the development of voluntary Do-Not-Track standards that should have been adopted at the end of last year.”

If Congress and the FTC agree that the advertising industry hasn’t honored its commitments, the chances for self-regulation without a government mandate may dwindle further.

Sources:

AdWeek:  FTC Chair Stuns Advertisers

The Hill: Sen. Rockefeller to Push for Do Not Track at Hearing


Leave a comment

The FTC Publishes a Staff Report on Mobile Apps for Children and Privacy

The Federal Trade Commission (FTC) just released a Staff Report (the Report) titled "Mobile Apps for Kids: Current Privacy Disclosures Are Disappointing."


Mobile applications (apps) are increasingly popular among children and teenagers, even very young ones. Indeed, the Report found that 11% of the apps sold by Apple have toddlers as their intended audience (Report p. 6). Apps geared to children are often either free or inexpensive, which makes them easy to purchase, even on a pocket-money budget (Report pp. 7-8).

As such, according to the Report, these apps seem to be intended for children's use, and some may even be "directed to children" within the meaning of the Children's Online Privacy Protection Act (COPPA) and the FTC's implementing Rule (the Rule). The Rule defines a "[w]ebsite or online service directed to children" at 16 C.F.R. § 312.2. Under COPPA and the Rule, operators of online services directed to children under 13 years of age must provide notice and obtain parental consent before collecting children's personal information. This includes apps. Yet the FTC staff was unable, in most instances, to find out whether an app collected any data or, if it did, the type of data collected, the purpose for collecting it, and who collected or obtained access to such data (Report p. 10).


"The mobile app marketplace is growing at a tremendous speed, and many consumer protections, including privacy and privacy disclosures, have not kept pace with this development" (Report p. 3).


Downloading an app on a smartphone may have an impact on children's privacy, as apps are able to gather personal information such as the user's geolocation, phone number, or list of contacts, all without her parents' knowledge. Indeed, while app stores and operating systems provide rating systems and controls that allow parents to restrict access to mobile content and features, and even to limit data collection, they do not provide information about which data is collected and whether it is shared (Report p. 15).


The Report concludes by recommending that app stores, app developers, and third parties providing services within apps increase their efforts to provide parents with "clear, concise and timely information" about apps downloaded by children. Parents would then be able to know, before downloading an app, what data will be collected, how it will be used, and who will obtain access to this data (Report p. 17). This should be done by using "simple and short disclosures or icons that are easy to find and understand on the small screen of a mobile device" (Report p. 3).


Notably, United States of America v. W3 Innovations, LLC, filed in August 2011, was the first FTC case involving mobile applications.



Leave a comment

EPIC is Suing the FTC to Compel Enforcement of Google Buzz Consent Order

The Electronic Privacy Information Center (EPIC) is suing the Federal Trade Commission (FTC) in order to compel the federal agency to enforce the October 2011 Google Buzz consent order, In the Matter of Google, Inc., FTC File No. 102 3136, which was issued following a complaint filed by EPIC with the FTC in February 2010.


Pursuant to this consent order, Google may not misrepresent the extent to which it maintains and protects the privacy and confidentiality of the information it collects, including the purposes for which the information is collected, and the extent to which consumers may exercise control over the collection, use, or disclosure of this information. Also, Google must obtain the express affirmative consent of Google users before making any new or additional sharing of information to third parties, which must be identified, and the purpose(s) for sharing the information must be disclosed to Google users. The consent order also requires Google to establish and implement a comprehensive privacy program.

 

Last January, Google announced changes to its privacy policy, effective March 1, 2012. Google will then start combining user data across its different services, such as Gmail and YouTube, provided that the user is logged into her Google account. Ms. Alma Whitten, Google’s Director of Privacy, Product and Engineering, stated that Google can thus provide “a simpler, more intuitive Google experience.” A Google user will have one single Google profile. There is, however, no opt-out available. The new privacy policy states that:

 

“We may use the name you provide for your Google Profile across all of the services we offer that require a Google Account. In addition, we may replace past names associated with your Google Account so that you are represented consistently across all our services. If other users already have your email, or other information that identifies you, we may show them your publicly visible Google Profile information, such as your name and photo.”

 

According to EPIC’s complaint, these changes are “in clear violation of [Google’s] prior commitments to the Federal Trade Commission.” EPIC is arguing that Google violated the Consent Order “by misrepresenting the extent to which it maintains and protects the privacy and confidentiality of [users’] information, by misrepresenting the extent to which it complies with the U.S.-EU Safe Harbor Framework… [and] by failing to obtain affirmative consent from users prior to sharing their information with third parties.”

 

Indeed, the European Union (EU) is also concerned about these changes. The Article 29 Working Party sent a letter to Google on February 2 to inform the California company that it will “check the possible consequences for the protection of the personal data of [E.U. Member States’] citizens” of these changes. Google responded to the Commission Nationale de l’Informatique et des Libertés (CNIL), France’s Data Protection Authority, which is coordinating the enquiry into Google’s privacy changes, that the changes were made to ensure that Google’s privacy policy is “simpler and more understandable” and also “to create a better user experience.”

 

Meanwhile, EPIC argues that the FTC has a non-discretionary obligation to enforce a final order, yet the agency has taken no action with respect to the upcoming changes to Google’s privacy policy.



Borders’ Sale of Personal Information Approved by Bankruptcy Court

The Wall Street Journal reported this week that Judge Martin Glenn of the U.S. Bankruptcy Court in Manhattan approved, on September 26, the $13.9 million sale of Borders’ intellectual property to Barnes & Noble. The intellectual property assets include personal information (PI) that Borders collected from 48 million customers: customers’ email addresses as well as records of the books and videos they purchased.

The privacy rights of Borders’ customers were debated during the process. At a September 22 hearing, Judge Glenn had hesitated to approve the sale over concerns about customers’ privacy. The two sides, working with the Consumer Privacy Ombudsman (CPO) appointed by the court overseeing the Borders bankruptcy, agreed to email Borders’ customers within a day of the sale’s closing to ask whether they wished to opt out of Barnes & Noble’s email list. Records of specific titles purchased at Borders in the past will not be included in the sale.

The CPO had contacted the Federal Trade Commission (FTC), requesting that it provide a written description of its concerns regarding the possible sale, during the bankruptcy proceeding, of the PI collected by Borders.

Bureau of Consumer Protection Director David Vladeck answered in a letter to the CPO on September 14, which was submitted to the court.

Borders and Its Privacy Policies

Selling PI during bankruptcy is regulated by section 363(b) of the Bankruptcy Code, 11 U.S.C. § 363(b), which provides (our emphasis) that:

(b) (1) The trustee, after notice and a hearing, may use, sell, or lease, other than in the ordinary course of business, property of the estate, except that if the debtor in connection with offering a product or a service discloses to an individual a policy prohibiting the transfer of personally identifiable information about individuals to persons that are not affiliated with the debtor and if such policy is in effect on the date of the commencement of the case, then the trustee may not sell or lease personally identifiable information to any person unless —

(A) such sale or such lease is consistent with such policy; or

(B) after appointment of a consumer privacy ombudsman in accordance with section 332, and after notice and a hearing, the court approves such sale or such lease —

(i) giving due consideration to the facts, circumstances, and conditions of such sale or such lease; and

(ii) finding that no showing was made that such sale or such lease would violate applicable nonbankruptcy law.

Borders’ 2006 and 2007 privacy policies had promised customers that the retailer would only disclose a customer’s email address or other PI to third parties if the customer “expressly consents to such disclosure.” The 2008 privacy policy, however, stated that:

“Circumstances may arise where for strategic or other business reasons, Borders decides to sell, buy, merge or otherwise reorganize its own or other businesses. Such a transaction may involve the disclosure of personal or other information to prospective or actual purchasers, or receiving it from sellers. It is Borders’ practice to seek appropriate protection for information in these types of transactions. In the event that Borders or all of its assets are acquired in such a transaction, customer information would be one of the transferred assets.”

However, Mr. Vladeck wrote that the FTC “views this provision as applying to business transactions that would allow Borders to continue operating as a going concern and not to the dissolution of the company and piecemeal sale of assets in bankruptcy,” and that “[e]ven if the provision were to apply in the event of a sale or divestiture of assets through bankruptcy, Borders represented that it would ‘seek appropriate protection’ for such information.”

Privacy Policies and Unfair Practice

Mr. Vladeck wrote that the FTC was concerned that any sale or transfer of the PI of Borders’ customers “would contravene Borders’ express promise not to disclose such information and could constitute a deceptive or unfair practice.”

Mr. Vladeck’s letter noted that the FTC has brought cases in the past alleging that failure to adhere to a privacy policy is a deceptive practice under the FTC Act. In one of these cases, FTC v. Toysmart, an online retailer had filed for bankruptcy and then tried to sell its customers’ PI. The FTC alleged that sharing PI in connection with an offer for sale violated section 5 of the FTC Act, as the retailer had represented in its privacy policy that such information would never be shared with third parties.

Mr. Vladeck wrote that the “Toysmart settlement is an appropriate model to apply” in the Borders case. The FTC entered into a settlement with Toysmart allowing the transfer of customer information under certain limited circumstances:

1) the buyer had to agree not to sell customer information as a standalone asset, but instead to sell it as part of a larger group of assets, including trademarks and online content;

2) the buyer had to be an entity that concentrated its business in the family commerce market, involving the areas of education, toys, learning, home and/or instruction;

3) the buyer had to agree to treat the personal information in accordance with the terms of Toysmart’s privacy policy; and

4) the buyer had to agree to seek affirmative consent before making any changes to the policy that affected information gathered under the Toysmart policy.

Mr. Vladeck concluded his letter by offering these guidelines:

1) Borders agrees not to sell the customer information as a standalone asset;

2) the buyer is engaged in substantially the same lines of business as Borders;

3) the buyer expressly agrees to be bound by and adhere to the terms of Borders’ privacy policy; and

4) the buyer agrees to obtain affirmative consent from consumers for any material changes to the policy that affect information collected under Borders’ policy.

It seems that Mr. Vladeck’s letter had a significant impact on the ruling. Curiously, only a small percentage of customers understand the value their PI may have for a company, even though PI may be sold as an asset.



Senators Kerry and McCain Introduce the Commercial Privacy Bill of Rights Act of 2011

Sen. John Kerry (D-Mass.) and Sen. John McCain (R-Ariz.) today introduced a bipartisan privacy bill, the Commercial Privacy Bill of Rights Act of 2011, a bill “[t]o establish a regulatory framework for the comprehensive protection of personal data for individuals under the aegis of the Federal Trade Commission, and for other purposes.”

The Act would apply to covered entities, that is, “any person who collects, uses, transfers, or stores covered information concerning more than 5,000 individuals during any consecutive 12-month period” who is also a person over which the FTC has authority pursuant to section 5(a)(2) of the Federal Trade Commission Act, a common carrier, or a non-profit organization (pp. 29-30).

The Act would be enforced by the FTC and by State Attorneys General (pp. 30-31). However, simultaneous enforcement by a State Attorney General and the FTC would not be allowed. The Act would also preclude private rights of action (p. 37).

The FTC would establish rules to be followed by nongovernmental organizations administering safe harbor programs. Participation in these programs would be voluntary, not mandatory (pp. 37-41).

The right of security and accountability

Not later than 180 days after enactment of the Act, the FTC would initiate a rulemaking proceeding requiring covered entities to implement security measures to protect the covered information they collect and maintain. These security measures would be proportional to the size, type, and nature of the collected information (p. 15). Covered entities would have a duty “to respond to non-frivolous inquiries from individuals regarding the collection, use, transfer, or storage of [their] covered information.”

Covered entities would, “in a manner proportional to the size, type, and nature of the covered information that it collects, implement a comprehensive information privacy program.” Such a program would be a “privacy by design program,” which would “incorporate necessary development processes and practices throughout the product life cycle that are designed to safeguard (…) [individuals’] personally identifiable information based on both individuals’ reasonable privacy expectations in their data and relevant threats against data privacy” (p. 17).

The right to notice, consent, access, and correct information

Not later than 60 days after enactment of the Act, the FTC would initiate a rulemaking proceeding requiring covered entities “to provide clear, concise, and timely notice to individuals” regarding the collection, use, transfer, and storage of covered information, and the specific purposes of those practices. Covered entities would also have “to provide clear, concise, and timely notice to individuals before implementing a material change in such practice” (p. 18).

The FTC rulemaking proceeding would also require covered entities:

“to offer individuals a clear and conspicuous mechanism for opt-out consent for any use of their covered information (…); to offer individuals a robust, clear, and conspicuous mechanism for opt-out consent for the use by third parties of the individuals’ covered information for behavioral advertising or marketing; to offer individuals a clear and conspicuous mechanism for opt-in consent for (…) the collection, use, or transfer of sensitive personally identifiable information other than (i) to process or enforce a transaction or deliver a service requested by that individual; (ii) for fraud prevention and detection; or (iii) to provide for a secure physical or virtual environment (…)”

In case of bankruptcy, or if an individual requests termination of a service, the individual would have the right to request that all of his personally identifiable information maintained by the covered entity, except for certain publicly shared information or information the individual authorized for sharing, “be rendered not personally identifiable” or, if this is not possible, the covered entity would have to cease using such information or transferring it to a third party (pp. 18-20).

The right to data minimization, constraints on distribution, and data integrity

Covered entities would be authorized to collect “only as much covered information relating to an individual as is reasonably necessary” to carry out a transaction or deliver a service requested by the individual. However, covered entities could keep such data “for research and development conducted for the improvement of carrying out a transaction or delivering a service” and for “internal operations” such as customer satisfaction surveys (pp. 24-25).

Covered entities would have to require by contract that any third party to which they transfer covered information use the information only for purposes consistent with the provisions of the Act and as specified in the contract (p. 25), unless the covered entity obtains individuals’ consent to such transfers. Data transfers to unreliable third parties would be prohibited (p. 27).

Covered entities would achieve data integrity by establishing and maintaining “reasonable procedures to ensure that (…) covered information (…) is accurate in those instances where the covered information could be used to deny consumers benefits or cause significant harm” (p. 28).

More information here.