The Secure Times

An online forum of the ABA Section of Antitrust Law's Privacy and Information Security Committee

Article 29 Working Party Publishes Letter Criticizing the Proposed Online Behavioral Advertising Self-Regulatory Framework

Earlier this week, the Article 29 Working Party published a letter it sent to the Interactive Advertising Bureau Europe (IAB Europe) and the European Advertising Standards Alliance (EASA) regarding their proposed self-regulatory framework for online behavioral advertising (OBA), which is intended to satisfy the EU’s ePrivacy Directive. The letter referred to a meeting between the Working Party and the OBA industry scheduled for sometime in September and was sent in advance of that meeting to inform the industry of the Working Party’s main concerns with the proposed framework.

Class Action Filed Against comScore Over Alleged Privacy Violations

A putative class action was filed yesterday (8/23/11) against comScore, Inc., an internet research and analytics company. The plaintiffs allege that comScore violated federal law and the Illinois mini-FTC Act by collecting personal information from consumers’ computers without the consumers’ knowledge or consent. The complaint was filed in the federal district court for the Northern District of Illinois, Dunstan et al. v. comScore, Inc. (No. 1:11-cv-05807).

The complaint alleges that comScore induced consumers to download its surveillance software by bundling it with free third-party software products such as screensavers, games, and CD-burning software, but failed to clearly disclose the extent to which the software would monitor a consumer’s internet activity and the access it would have to change privacy and security settings. The complaint also alleges that comScore intentionally made the surveillance software difficult to disable or uninstall by not removing it when the freeware with which it was bundled was deleted.

The claims asserted include violations of the federal Stored Communications Act (18 U.S.C. § 2701 et seq.), the Electronic Communications Privacy Act (18 U.S.C. § 2510 et seq.), the Computer Fraud and Abuse Act (18 U.S.C. § 1030 et seq.), and the Illinois Consumer Fraud and Deceptive Business Practices Act (815 ILCS 505/1 et seq.), as well as common law unjust enrichment. The plaintiffs are seeking actual, statutory, and punitive damages, an injunction to stop comScore’s allegedly illegal practices, disgorgement of profits, and attorneys’ fees.

Ninth Circuit: DPPA Does Not Forbid Buying Driver’s Data in Bulk

The Ninth Circuit found, in Howard v. Criminal Information Services, Inc., that the Driver’s Privacy Protection Act (DPPA), 18 U.S.C. §§ 2721–2725, does not prohibit buying state driver databases in bulk for future use of the information they contain.

Two groups of plaintiffs had filed putative class actions in Oregon and Washington, seeking damages on the ground that the defendants, among them a newspaper company and a company performing background checks, had obtained their personal information in violation of the DPPA.

The DPPA provides that personal information from state driver license databases may be obtained, disclosed, or used only for certain specified purposes, such as verifying the accuracy of personal information submitted by the individual or for use in connection with matters of motor vehicle or driver safety and theft.

The plaintiffs did not complain, however, that the ultimate use of their information was for purposes not permitted by the DPPA, but rather that the DPPA forbids bulk purchasing of drivers’ personal information for future use, since future use is not itself a permitted purpose under the statute. Indeed, the defendants had not requested drivers’ records individually, but instead bought the entire database from the state for the purpose of “stockpiling” it. Their ultimate use of the information, however, was for purposes permitted under the DPPA.

The Ninth Circuit concluded that the plaintiffs had failed to state a claim, because stockpiling information for a permitted use is not a violation of the DPPA; the statute is concerned with the use to which the information is put, not the way it is acquired:

“The DPPA does not contain a temporal requirement for when the information obtained must be used for the permitted purpose. Nor is there a requirement that once the information is obtained for a permitted purpose that it actually be used at all. The DPPA only requires that Defendants obtained the information for a permitted purpose.”


U.K. Equality and Human Rights Commission Publishes “Protecting Information Privacy” Report

The United Kingdom Equality and Human Rights Commission (EHRC) published this week a report, “Protecting information privacy,” written by Charles Raab and Benjamin Goold, of the University of Edinburgh and the University of British Columbia, respectively. The report represents the views of the two authors and does not necessarily represent the views of the Commission.

The report claims that current U.K. privacy laws and regulations do not adequately protect human rights and that fundamental reform is needed, especially as data security breaches happen regularly (see pp. 9-10 for examples). Such breaches are bound to happen more frequently as demand for personal information increases and new technology facilitates its collection. Indeed, “personal information privacy is under particular threat in today’s ‘information economy’ and ‘information-age government’” (p. 10).

The public sector has increased its use of personal information, and the state plays an expanded role. The U.K. legal framework takes “a weak, fractured and piecemeal approach to [privacy] regulation” (p. 12), and it is increasingly difficult for individuals to understand how their personal information is used and what they should do when it is misused.

The 1984 Data Protection Act (DPA) was the U.K.’s first statutory information privacy protection law. In addition, Article 8 of the European Convention on Human Rights (ECHR) protects an individual’s ‘right to respect for his private and family life, his home and his correspondence.’ The ECHR is incorporated into U.K. law by the Human Rights Act 1998 (HRA) (for an overview of current laws, see p. 25 and following).

According to the report, U.K. legislation has not kept pace with technological change, and the state has failed to adequately protect the right to privacy. The report states that “[n]ew strategies must continually be developed to cope with the increasingly novel ways in which privacy, including information privacy, is at risk” (p. 75).

The report makes four main recommendations:

(1) The government should develop a clear set of ‘privacy principles’ to be used as a basis for future legislation and as a guide to regulators and government agencies concerned with information privacy and data collection.

(2) Existing privacy legislation should be reformed to be consistent with the ‘privacy principles’ in order to enhance existing provisions of the HRA.

(3) There should be greater regulatory coherence; that is, the U.K. needs to rationalize and consolidate its current approach to the regulation of surveillance and data collection.

(4) Technological, organizational, and other ways to protect privacy should be improved, and the government should encourage the development and use of technological and non-legal solutions to the problem of information privacy protection.

Settlement in FTC’s First Case Involving Mobile Applications

The Federal Trade Commission announced today that W3 Innovations, LLC, a developer of mobile applications, will pay $50,000 to settle charges that it violated the Children’s Online Privacy Protection Act (COPPA) and the FTC’s COPPA Rule (the Rule). The case, United States of America v. W3 Innovations, LLC, is the first FTC case involving mobile applications.

The Rule applies to any operator of a commercial website or online service directed to children that collects, uses and/or discloses children’s personal information. A website operator must obtain “verifiable parental consent prior to collecting, using, and/or disclosing personal information from children.”

The Complaint alleged that the defendant had offered some forty apps for download from Apple’s App Store, which allowed users to play games and share information online. These apps, which the defendant listed in the Games-Kids section of the App Store and which resembled games played by elementary school children, were targeted to children.

The Complaint also alleged that the defendant had collected over 30,000 email addresses and had collected, maintained, and/or disclosed personal information from about 600 users, but had failed to provide direct notice to parents about these practices and had failed to maintain or link to an online notice describing how it collects data. The defendant had not obtained verifiable consent from parents prior to collecting, using, or disclosing children’s personal information.

The Consent Order (the Order) requires the defendant, within five days of the date of entry of the Order, to delete all personal information collected and maintained in violation of the Rule and to pay a $50,000 penalty.


Spain Enforces “Right to Be Forgotten”

Spain’s Data Protection Agency has ordered Google to delete personal information regarding approximately 90 individuals from Google’s search engine indexes. These individuals filed formal complaints with the Data Protection Agency alleging that certain personal information, such as decades-old arrest records and the current address of a domestic violence victim, should not be accessible through the Internet. In ordering Google to delete the information, the Data Protection Agency indicated that every individual has the “right to be forgotten” and to have certain information deleted from the Internet.

The Agency and Google are now engaged in a lawsuit over whether Google can be required to remove certain information from its search indexes. Privacy experts have expressed concern that requiring search engines to delete certain personal information could restrict access to public information. Regardless of the outcome, however, the European Union is expected to draft legislation later this year that could include a “right to be forgotten” provision allowing individuals to have certain information deleted from search engine indexes or websites.

Connecticut Enacts Law Restricting Access to Credit Reports

In late July 2011, Connecticut passed a law restricting employers’ access to employees’ or prospective employees’ credit reports. Public Act No. 11-223 prohibits employers from requiring an employee or prospective employee to consent to a credit report request as a condition of employment, unless one of the following conditions is met:

  • The employer is a financial institution;
  • A credit report is required by law;
  • The employer reasonably believes that the employee has engaged in specific activity that constitutes a violation of the law related to the employee’s employment; or
  • A credit report is substantially related to an employee’s current or potential job or the employer has a bona fide purpose for requesting or using information in the credit report that is substantially job-related.

The new statute defines “substantially related to an employee’s current or potential job” to include a number of situations in which an employee or potential employee would have managerial or fiduciary responsibilities, or would have access to personal information, confidential business information, or other sensitive data. Connecticut’s statute becomes effective October 1, 2011. This law is similar to employer credit report restrictions recently enacted in other states, such as Illinois and Oregon.