The Secure Times

An online forum of the ABA Section of Antitrust Law's Privacy and Information Security Committee



White House releases Big Data Report

The White House released its report on big data, “Big Data: Seizing Opportunities, Preserving Values,” on Thursday, May 1, 2014. The Report examines the ways that businesses and the government can perform analytics on massive data sets culled from a wide variety of sources to develop new observations and measurements about individual consumers. Its findings and recommendations are based on a 90-day review of big data and privacy, conducted at the request of President Obama, led by White House counselor John Podesta and an executive branch working group that included the Secretaries of Commerce and Energy, the President’s Science and Economic Advisors, and other administration officials. The working group sought input from academic researchers, privacy advocates, advertisers, civil rights groups and the public during its review in an effort to evaluate the opportunities and challenges presented by big data.

The Report recognizes the inherent value big data has added to society, citing as examples the ability of big data analysis to enhance and improve medical treatment of premature infants, increase efficiencies across transportation networks and utility providers, and identify fraud and abuse in Medicare and Medicaid reimbursements. However, the Report also acknowledges serious privacy concerns, noting that big data may reveal intimate personal details of an individual user, and that big data tools may lead to discriminatory outcomes, particularly with regard to housing, employment and credit.

The Report offered several policy recommendations:

  • Move forward with the Consumer Bill of Rights.  In 2012, the President announced the concept of a Consumer Bill of Rights, which establishes certain baseline consumer privacy principles such as offering transparency about data privacy and security practices, providing consumers control over data practices, respecting the context in which the data was collected, increasing the accuracy of data files, and providing the opportunity for consumers to access collected data. This Report reiterates the importance of passing legislation to enforce the Bill of Rights principles, but also questions whether the principles are well-suited to the world of big data.  Perhaps, the Report suggests, there should be a greater emphasis placed on how the data is used and reused rather than an emphasis on establishing notice and consent for the initial data collection.
  • Pass National Data Breach Legislation.  The Report notes that the amalgamation of so much information about consumers results in much greater harm to the consumer in the event of a data breach, and finds an even greater need for Congress to pass national data breach legislation to preempt the 47 different state laws currently in effect.
  • Extend privacy protections to non-US persons.  The Report urges government departments and agencies to apply the Privacy Act of 1974 and other privacy protections to all individuals, regardless of nationality.
  • Ensure data collected on students in schools is used for educational purposes.  Acknowledging the growing and valuable use of educational technologies in schools, the Report calls for protections to ensure that student data is not used inappropriately when it is collected in an educational setting.  The Report suggests modernizing COPPA and FERPA to protect student data in the digital age, while still encouraging innovation in the educational technology industry.
  • Expand technical expertise to stop discrimination.  Business decisions affecting consumers’ access to healthcare, education, employment, credit, and goods and services are increasingly made on the basis of big data algorithms. The Report calls on the DOJ, the FTC, the CFPB and the EEOC to develop their technical expertise to be able to detect whether these automated decision-making processes have discriminatory effects on protected classes of people, and to develop tools to redress such discrimination.
  • Amend the Electronic Communications Privacy Act (ECPA).  The Stored Communications Act, which is part of the ECPA, articulates the rules for obtaining the content of stored communications, including email and data held on cloud servers, but it was written well before personal computing, email, texting, cloud storage, and smart phones became primary means of communication.  The Report calls on Congress to amend the ECPA to ensure that the standards of protection for digital online content are consistent with the protections afforded in the physical world.

While the Report provides a useful overview of the big data phenomenon, its benefits and its challenges, it remains to be seen what impact the Report will have on the industry.  By and large, the Recommendations do not contain wholly new ideas.  The ECPA is widely considered to be antiquated, and there have been repeated calls for reform.  There have been many attempts to offer national data breach notification legislation, but no bill has made it through Congress to date.  The White House first offered its support for a Consumer Bill of Rights in 2012, but has spent the last two years involved in multi-stakeholder meetings without producing draft legislation. The recent Recommendation shows little evidence of advancing the ball significantly on that front, as it calls for additional “stakeholder and public comment” before crafting a legislative proposal.  However, the call for greater protections for student data is well-timed, as one of the largest school technology providers, inBloom, was forced to shut down over privacy concerns just a few weeks prior to the Report’s release.




New COPPA Compliance Mechanisms Now Available

The FTC recently approved a new COPPA safe harbor program and a new method for obtaining parental consent, providing flexibility to companies striving to comply with COPPA obligations.

The revisions to the COPPA Rule that took effect in July 2013 expanded COPPA’s provisions in several ways, including by broadening the definition of “personal information” and clarifying that third party operators are also subject to COPPA compliance obligations.  The revised Rule also imposed stricter requirements on companies wishing to provide COPPA safe harbor certification and created a mechanism through which companies could submit new methods of obtaining parental consent for FTC approval.

Safe Harbor.   Websites that participate in an FTC-approved COPPA safe harbor program will generally be subject to review and disciplinary actions under the program guidelines rather than be subject to a formal FTC investigation and enforcement action.  In the amended Rule, the FTC imposed stricter requirements for companies wishing to provide safe harbor certification programs. A potential safe harbor program provider must now provide extensive documentation about the program’s requirements and the organization’s capability to oversee the program during the approval process and, after approval, the program must submit annual reports to the FTC.

On February 12, the FTC announced its approval of the kidSAFE Seal Safe Harbor program, which is designed for child-friendly websites and applications, including kid-targeted games, educational sites, virtual worlds, social networks, mobile apps, tablet devices and other similar interactive services and technologies.

The FTC approved the kidSAFE seal safe harbor program after determining that it had (1) a requirement that participants in the safe harbor program implement substantially similar requirements that provide the same or greater protection for children as those contained in the COPPA Rule; (2) an effective, mandatory mechanism for independent assessment of the safe harbor program participants’ compliance with the guidelines; and (3) disciplinary actions for noncompliance by safe harbor participants.

The kidSAFE Seal program is the first safe harbor program approved under the amended version of the Rule.  The program joins five other safe harbor certifications previously approved by the FTC: the Children’s Advertising Review Unit of the BBB, the Entertainment Software Rating Board, TRUSTe, Privo Inc. and Aristotle International, Inc.

Parental Verification Methods.   The FTC recently approved a new authentication method proposed by Imperium, LLC for verifying the identity of parents who consent to the collection of their children’s data.  Imperium proposed a “knowledge-based authentication system” for its identity verification service ChildGuardOnline, which verifies a user’s identity by asking a series of out-of-wallet challenge questions (i.e., questions whose answers cannot be determined merely by looking in a person’s wallet).  Knowledge-based authentication systems are already used by entities that handle sensitive information, such as financial institutions and credit bureaus. The FTC found this to be a reliable method of verification because the questions were sufficiently difficult that a child age 12 or under in the parent’s household could not reasonably ascertain the answers, and it noted that knowledge-based authentication has already proven reliable in the marketplace in other contexts.

Previously, the FTC had rejected an application by AssertID Inc. for its ConsentID product, which proposed to verify parental identity by asking “friends” on the parent’s social media sites to vouch for the parent-child relationship. The FTC found that this method was not “reasonably calculated in light of available technology” to ensure that the person providing consent was the child’s parent, and that the process could easily be circumvented by children who create fake social media accounts.  To date, the Imperium methodology of parental consent verification is the only method approved by the FTC that was not in the text of the Rule itself.  The other methods for verifying parental consent, as provided in the text of the Rule, are (a) requesting that consent be provided by written form returned by mail, fax or scanned email; (b) requesting a credit or debit card in connection with a monetary transaction; (c) requesting that the parent call a toll-free phone number; (d) connecting with the parent via video-conference; or (e) checking a form of ID against a government database.

The FTC recently closed its public comment period for another proposed verification system submitted by iVeriFly. The iVeriFly methodology combines a knowledge-based authentication system similar to the method proposed by Imperium, wherein the program scans non-FCRA consumer databases to generate out-of-wallet questions for the parent to answer, with a follow-up step: if the parent answers the questions correctly, the iVeriFly system then places a call to the parent requesting that consent be provided through a series of telephone key presses.



Mobile Location Analytics Companies Agree to Code of Conduct

U.S. Senator Charles Schumer, the Future of Privacy Forum (“FPF”), a Washington, D.C.-based think tank, and a group of location analytics companies, including Euclid, Mexia Interactive, Radius Networks, Brickstream, Turnstyle Solutions and SOLOMO, released a Code of Conduct to promote customer privacy and transparency in mobile location analytics.

Mobile location analytics technology, which allows stores to analyze shoppers’ behavior based on information collected from the shoppers’ cell phones, has faced a string of negative press in the last several months.  The location analytics companies gather Wi-Fi and Bluetooth MAC address signals to monitor shoppers’ movements around the store, providing feedback such as how long shoppers wait in line at the check-out, how effectively a window display draws customers into the store, and how many people who browse actually make a purchase.  Retailers argue that the technology provides them with the same type of behavioral data that is already being collected from shoppers when they browse retail sites online.  Consumer advocates, on the other hand, raise concerns about the invasive nature of the tracking, particularly as most customers aren’t aware that it is taking place. Senator Schumer has been one of the most vocal critics of mobile location analytics services, calling it an “unfair or deceptive” trade practice to fail to notify shoppers that their movements are being tracked or to give them a chance to opt out of the practice.   In an open letter to the FTC in July 2013, Sen. Schumer described the technology thus:

“Retailers do not ever receive affirmative consent from the customer for [location analytics] tracking, and the only options for a customer to not be tracked are to turn off their phone’s Wi-Fi or to leave the phone at home. Geophysical location data about a person is obviously highly sensitive; however, retailers are collecting this information anonymously without consent.”

In response, a group of leading mobile location analytics companies agreed to a Code of Conduct developed in collaboration with Sen. Schumer and the Future of Privacy Forum to govern mobile location analytics services.   Under the Code:

  • A participating mobile location analytics firm will “take reasonable steps to require” participating retailers to provide customers notice through clear, in-store signage, using a standard symbol or icon to indicate the collection of mobile location analytics data, and directing customers to an industry education and opt-out website.  (For example, “To learn about use of customer location and your choices, visit www.smartstoreprivacy.com” would be acceptable language for in-store signage.)
  • The mobile location analytics company will provide a detailed disclosure in its privacy policy about the use and collection of data it collects in-store, which should be separate from the disclosure of information collected through the company’s website.
  • Customers must be allowed the choice to opt-out of tracking.  The mobile location analytics company will post a link in its privacy policy to the industry site which provides a central opt-out.  A notice telling customers to turn off their mobile device or to deactivate the Wi-Fi signal is not considered sufficient “choice” under the Code.
  • The notice and choice requirements do not apply if the information collected is not unique to an individual device or user, or it is promptly aggregated so as not to be unique to a device or user, and individual level data is not retained. If a mobile location analytics firm records device-level information, even if it only shares aggregate information with retail clients, it must provide customer choice.
  • A customer’s affirmative consent is required if: (1) personal information will be linked to a mobile device identifier, or (2) a customer will be contacted based on the analytic information.

The FTC has offered support for the self-regulatory process and provided feedback on the Code during the drafting negotiations.  “It’s great that industry has recognized customer concerns about invisible tracking in retail space and has taken a positive step forward in developing a self-regulatory code of conduct,” FTC Bureau of Consumer Protection Director Jessica Rich told Politico.

Some critics, however, feel that the Code does not go far enough.  The notice provision is weak, as it relies on the retailers to provide in-store signage for the customer.  Notably, retailers were not party to the negotiations developing the Code of Conduct, and no retailer has publicly agreed to post signs in its stores.  Given the history (retailer Nordstrom was forced to drop its mobile location analytics pilot program after customers saw the posted signs and complained, generating bad press), retailers are likely to want in-store signage to be as inconspicuous as possible.

The next time you’re out shopping, keep your eyes peeled for in-store signage.  Are your retailers watching you?