The Secure Times

An online forum of the ABA Section of Antitrust Law's Privacy and Information Security Committee



UK Investigates Privacy Implications of Email-Hacking by Journalists

The UK communications regulator Ofcom announced this week that it is investigating alleged email-hacking by journalists at Sky News, a satellite news channel controlled by Rupert Murdoch.  The investigation raises interesting questions about the role of online privacy in news reporting, in particular whether privacy should trump the public interest in media-led investigations into criminal activity.

Ofcom announced its investigation after a Sky News representative admitted to hacking the email accounts of John Darwin, who faked his own death in order to claim life insurance, and later reappeared living abroad.  In addition to email-hacking, Sky News also admitted to posting a hacked voicemail message from Mr. Darwin’s wife on its website.  These hacking admissions were made at the wide-ranging Leveson Inquiry into press ethics and culture, which was prompted by last year’s UK press phone-hacking scandal.

Under UK law, email hacking is a violation of the Computer Misuse Act 1990 and may trigger criminal sanctions.  In addition, Rule 8.1 of Ofcom's Broadcasting Code provides that "[a]ny infringement of privacy in programmes, or in connection with obtaining material included in programmes, must be warranted."  The potential sanctions for breach of Ofcom's code range from a warning, to a fine, to revocation of a broadcasting license in the most serious circumstances.

For its part, Sky News argues that its actions were "editorially justified" since there are rare instances when it is defensible for a journalist to commit an offense in the public interest, in this case the detection of insurance fraud.

An Ofcom spokesperson stated that the agency "is investigating the fairness and privacy issues raised by Sky News’ statement that it had accessed without prior authorization private email accounts during the course of its news investigations." 

The regulator also announced that it will "make the outcome [of its investigation] known in due course." 




The Federal Trade Commission Publishes its Final Privacy Report (Part II)

This is the second part of a post about the recently published FTC Privacy Report.

Simplified Consumer Choice (Consent)

Some practices do not require choice

Under the Final Framework, companies would not have to provide consumers with a choice if they collect and use data for 'commonly accepted practices' (p. 36). Instead of rigidly defining what counts as a commonly accepted practice, the FTC focuses on the interaction between a business and the consumer (p. 38): is the practice "consistent with the context of the transaction or the consumer's existing relationship with the business, or is [it] required or specifically authorized by law?" (p. 39).

One may remember that the Telephone Consumer Protection Act has a similar "established business relationship" exception to consent.

However, the six categories of practices originally identified in the preliminary staff report as ones that companies may engage in without offering consumer choice (fulfillment, fraud prevention, internal operations, legal compliance, public purpose, and most first-party marketing) remain useful guidance as to whether a particular practice would indeed be considered commonly accepted.

First-party marketing occurs when a company collects customer data and uses it for its own marketing purposes, as opposed to third-party marketing, where the collected data is sold to third parties for their own marketing purposes.  An entity having a first-party relationship with a consumer would not be exempt from providing the consumer with choices if it also collects consumer data in ways not consistent with that first-party relationship, such as tracking the consumer across sites (pp. 40-41).

The FTC's final principle on choice is that companies do not need to provide choice before collecting consumer data for practices that are consistent with the context of their relationship with the consumer or that are required or specifically authorized by law (p. 48).

Companies should offer a choice when a practice is inconsistent with the context of the interaction with the consumer

Such choice should be given “at a time and in a context in which the consumer is making a decision about his or her data” (p. 48).

The FTC still advocates the implementation of a universal, one-stop mechanism for online behavioral tracking (Do Not Track) (p. 52).

A Do Not Track system should include five key principles (p. 53), listed below; an illustrative sketch of how a website might honor such a signal follows the list:

1. It should cover all parties tracking consumers

2. It should be easy to find, understand and use

3. The choices offered should be persistent and should not be overridden

4. It should be comprehensive, effective and enforceable

5. It should allow the consumer to opt out of receiving targeted advertisements, and also allow consumers to opt out of collection of behavioral data for all purposes other than those consistent with the context of the interaction
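As a concrete illustration (not the FTC's specification, which is technology-neutral), one form a universal Do Not Track signal has taken in practice is the "DNT: 1" HTTP request header sent by the browser. The Python sketch below shows how a site might gate behavioral tracking on that header; the tracking and ad-serving functions are hypothetical placeholders.

    # Illustrative sketch only; the functions are hypothetical placeholders.
    def serve_contextual_ads(user_id: str) -> None:
        print(f"contextual ads only for {user_id}")

    def record_behavioral_profile(user_id: str) -> None:
        print(f"recording cross-site behavioral data for {user_id}")

    def serve_targeted_ads(user_id: str) -> None:
        print(f"targeted ads for {user_id}")

    def handle_request(headers: dict, user_id: str) -> None:
        """Honor the visitor's opt-out: skip behavioral tracking when DNT is set."""
        if headers.get("DNT", "").strip() == "1":
            # Opted out: no cross-site profiling; context-consistent uses may proceed.
            serve_contextual_ads(user_id)
        else:
            record_behavioral_profile(user_id)
            serve_targeted_ads(user_id)

    handle_request({"DNT": "1"}, "visitor-42")  # contextual ads only
    handle_request({}, "visitor-43")            # profiling plus targeted ads

The design point that tracks the FTC's fifth principle is that opting out stops the collection of behavioral data itself, not merely the display of targeted advertisements.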

Express consent would, however, be required, at the time and in the context in which the consumer is making his or her decision, if the company uses data in a materially different manner than the one stated when the data was collected, or if it collects sensitive data, such as Social Security numbers, information about children, or financial and health data.

Large platform providers (ISPs, operating systems, browsers…)

Such entities have access to a very large spectrum of unencrypted consumer data, which would allow them to build very detailed consumer profiles. An ISP, for example, has access to all of its customers' online activity over the connection it provides, raising privacy concerns. The FTC will host a workshop in the second half of 2012 to discuss the privacy issues raised by data collection by large platforms.

Transparency

There are several ways companies could increase the transparency of their data practices.

            Privacy Notice

Privacy notices should be:

- Clearer

- Shorter

- More standardized

However, prescribing a rigid privacy statement format to be used in all sectors is “not appropriate” according to the FTC. Some elements should be standardized, such as format and terminology, in order for consumers to be able to easily compare privacy practices (p. 62).

            Access

Companies should provide reasonable access to the consumer data they maintain, and this access should be proportionate to the sensitivity of the data and the nature of its use (p. 64).

For entities maintaining data solely for marketing purposes, the FTC agrees that the costs of providing consumers a right to access and correct the data would likely outweigh the benefits. Such entities should, however, provide consumers with the list of categories of data they keep, and inform them of their right to state that they do not want their data to be used for marketing purposes (p. 65). These companies should also provide more individualized access to data where feasible; the FTC cites as an example Yahoo's Ad Interest Manager, which allows users to opt out of certain advertising categories.

The FTC also noted that companies that compile consumer data and then sell it to other companies, which use the data to make decisions about a particular person's eligibility for a job, an insurance rate, or credit, are subject to the FCRA. Consumers then have a right to access and correct their information under the FCRA, 15 U.S.C. §§ 1681g-1681h, even if the company compiling the data is not sure how the data will be used but "has reason to believe" it will be used for making such decisions (p. 67).

Entities maintaining data for other, non-marketing purposes that fall outside the scope of the FCRA, such as fraud risk management companies or social networking sites, should use a sliding-scale approach: a consumer's access to his or her data would depend on the use being made of it and on how sensitive it is (p. 67).

The FTC supports legislation, such as the Data Accountability and Trust Act, that would give consumers a right to access their data held by data brokers. It also supports the data broker industry's creation of a centralized website where data brokers would inform consumers about their data collection practices and disclose the companies buying this data (p. 69).

The FTC also supports the idea of an "eraser button," which would allow people to delete content they have posted online, a right somewhat similar to the right to be forgotten set out in the EU Commission's recent proposal for a new privacy framework (p. 70).

            Consumer Education

Consumers should be better educated about commercial data privacy practices, and this should be done by all stakeholders.



The European Commission Launches a Public Consultation on the 'Internet of Things'

The European Commission has launched a public consultation on the ‘Internet of Things’ (IoT) and is inviting comments until July 12, 2012. Members of the public are invited to respond to an online questionnaire.

Privacy

The public is invited to submit comments on the privacy implications of the IoT, as smart objects collect data that may reveal information about an individual, such as his habits, location, or interests, whether his identity is known or unknown; an unknown identity might also be revealed indirectly by combining data from different sources. One of the questions is:

"Traditional data protection principles include fair and lawful data processing; data collection for specified, explicit, and legitimate purposes; accurate and kept up-to-date data; data retention for no longer than necessary. Do you believe that additional principles and requirements are necessary for IoT applications?"

Safety and Security

The questionnaire also asks about the safety and security issues that the IoT may raise. Because IoT objects are able to act on behalf of people, they need protection against false requests for information and against unauthenticated commands, for instance through authentication mechanisms that ensure the authenticity of both the device and the data.
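As a rough illustration of the kind of command authentication the consultation alludes to (a minimal sketch, not anything prescribed in the questionnaire; the key and command names are hypothetical), a device can require a message authentication code computed with a key shared with its legitimate controller and discard any command whose tag does not verify:

    # Minimal sketch: reject unauthenticated commands using HMAC-SHA256.
    import hmac
    import hashlib

    SHARED_KEY = b"device-provisioning-secret"  # hypothetical pre-provisioned key

    def sign_command(command: bytes) -> bytes:
        """Controller side: append an HMAC-SHA256 tag to the command."""
        tag = hmac.new(SHARED_KEY, command, hashlib.sha256).hexdigest().encode()
        return command + b"|" + tag

    def verify_command(message: bytes) -> bytes | None:
        """Device side: return the command only if its tag verifies."""
        command, _, tag = message.rpartition(b"|")
        expected = hmac.new(SHARED_KEY, command, hashlib.sha256).hexdigest().encode()
        if hmac.compare_digest(expected, tag):
            return command
        return None  # unauthenticated command: ignore it

    signed = sign_command(b"unlock-door")
    assert verify_command(signed) == b"unlock-door"            # authentic command accepted
    assert verify_command(b"unlock-door|forged-tag") is None   # forged command rejected

A real deployment would also need key management and replay protection, which is precisely the sort of security-by-design guidance the consultation asks whether policymakers should provide.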

The public is invited to state whether they agree that “[d]ata life cycle management in the IoT infrastructure includes data creation, processing, sharing, storing, archiving, and deletion of data… [and that] [g]uidelines should be developed to ensure secure and trusted data life cycle management.”

Security of Critical IoT Supported Infrastructures

Comments may also address the security of critical IoT supported infrastructures, as there is a risk of abuse and attacks of such systems. The public may answer whether they agree that “[p]olicy makers should provide guidance on security-by-design and applicable security technologies.”

Ethical Issues

The questionnaire also addresses ethical questions. One of the questions is whether "IoT applications could change our sense and definition of personal identity." Another question asks whether "IoT applications could interfere with individuals' autonomy when decisions are taken by autonomous systems."

Open Object Identifiers and Interoperability

The IoT is able to identify each connected object by its identifier, and the questionnaire notes that, while there are currently some 5 billion mobile phone subscribers, there may be 50 billion connected non-phone devices in 10 years, a rather stunning figure.

Should openly accessible identifier solutions allowing for the interoperability of smart devices be authorized?  The public is invited to state whether IoT identifier policy should promote business models for open interoperable platforms.

Other topics of the questionnaire include governance issues and standards for meeting policy objectives.



The Federal Trade Commission Publishes its Final Privacy Report (Part I)

The Federal Trade Commission (FTC) issued its much-awaited final privacy report, "Protecting Consumer Privacy in an Era of Rapid Change: Recommendations for Businesses and Policymakers" (the Report).

The Report provides companies with self-regulation guidelines, and it calls for businesses collecting consumer data to implement best practices to protect this data. According to the Report, "the framework is meant to encourage best practices and is not intended to conflict with requirements of existing laws and regulations" (p. 16).

The FTC believes that self-regulation has not yet gone far enough, with the exception of Do Not Track (p. 11). Accordingly, the Report also recommends that Congress pass baseline, technologically neutral privacy legislation, as well as data security legislation. Privacy legislation would give businesses clear guidance and would also serve as a deterrent by providing remedies to aggrieved parties.  The FTC further recommends the passage of legislation targeted at data brokers, which would allow consumers to access their personal data held by data brokers.

Scope of the Privacy Framework

The framework would apply to all commercial entities collecting or using consumer data that can be reasonably linked to a specific consumer, computer, or other device.

It would not apply, however, to entities that collect only non-sensitive data from fewer than 5,000 consumers a year, provided they do not share that data with third parties; this condition prevents an entity outside the scope of the framework from selling the data it collects to a data broker.

As noted by the Report, H.R. 5777, the Best Practices Act, contained a similar exclusion for entities collecting information about fewer than 10,000 individuals during any 12-month period, provided the data is not sensitive.

The framework would, however, apply to both online and offline data, so that data collected by data brokers is included in its scope. Also, as noted by the FTC, consumer data collection is 'ubiquitous,' whether it occurs online or offline, and the privacy concerns these practices raise are similar (p. 17).

The framework would apply to data that is reasonably linkable to a specific consumer, computer, or device (p. 18).

Under the final framework, data would not be considered "reasonably linkable to a particular consumer or device" if a company implements three significant protections for that data (p. 21), listed below; a brief illustrative sketch of one de-identification approach follows the list:

- Taking reasonable measures to ensure that the data is de-identified

- Publicly committing to maintain and use the data in a de-identified fashion

- If making the de-identified data available to third parties, contractually prohibiting those third parties from attempting to re-identify the data
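As a minimal illustration of the first protection (this is not an FTC-prescribed method; the field names and key are hypothetical), a company might drop direct identifiers and replace the remaining account identifier with a keyed hash, so the shared records are not reasonably linkable to a person without the separately held key:

    # Illustrative de-identification sketch: drop direct identifiers, pseudonymize the rest.
    import hashlib

    PSEUDONYM_KEY = b"secret-key-stored-separately"  # hypothetical key, kept out of the dataset

    def de_identify(record: dict) -> dict:
        """Return a copy of the record without direct identifiers."""
        pseudonym = hashlib.blake2b(
            record["account_id"].encode(), key=PSEUDONYM_KEY, digest_size=16
        ).hexdigest()
        return {
            "pseudonym": pseudonym,            # keyed hash instead of the account id
            "zip3": record["zip_code"][:3],    # generalized location
            "interests": record["interests"],  # non-identifying attributes retained
        }

    raw = {"account_id": "u-1234", "name": "Jane Doe",
           "zip_code": "20580", "interests": ["hiking"]}
    print(de_identify(raw))  # no name, no account id, coarse location only

Note that a technical step like this covers only the first protection; under the framework it is the public commitment and the contractual ban on re-identification that complete the "not reasonably linkable" test.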

Interestingly, the issue of what constitutes personal data is also being debated right now in the European Union (EU). Recital 24 of the recent EU Commission data protection proposal hints that IP addresses or cookies do not necessarily need to be considered personal data, as they need to be combined with unique identifiers and other information to allow identification. In a recently published opinion on the proposal, the Article 29 Working Party stated that personal data should be defined more broadly, as all data relating to an identifiable individual, and that IP addresses should thus be considered as relating to identifiable individuals, especially when IP addresses or cookies are processed in order to identify the users of a computer.

Privacy by Design

The baseline is that “[c]ompanies should promote consumer privacy throughout their organizations and at every stage of the development of their products and services” (p. 22).

Such privacy protections include four substantive principles:

- Data security

- Reasonable collection limits

- Sound retention practices

- Data accuracy

Data Security

The Report notes that the FTC has been enforcing data security obligations under Section 5 of the FTC Act, the FCRA, and the GLBA (p. 24), and it also notes that several companies have already implemented data security measures, such as securing payment card data, offering browser privacy features, and using SSL encryption (p. 25).

Reasonable Collection Limits

The FTC believes that companies should limit data collection “to that which is consistent with the context of a particular transaction or the consumer’s relationship with the business, or as required or specifically authorized by law” (p. 27).

Sound Data Retention

Companies should not retain data that is no longer necessary for the legitimate purpose for which it was collected. The FTC does not, however, set a data retention timetable. Instead, it states that the data retention period can be flexible and may vary according to the type of data collected and its intended use (p. 29).

Data Accuracy

What companies would have to do in order to ensure the accuracy of the data collected depends on the data’s intended use and whether it is sensitive data or not.

Part II to be posted later this week.



FCC Fines Google in Street View Case for Lack of Cooperation in Inquiry, But No Enforcement Action Sought

One remembers the stir that Google's Street View project created during 2010 in the United States, Canada, and the European Union, when it was discovered that the California company had collected WiFi network data while its Street View cars roamed the streets of the world taking pictures of the environment in order to create a comprehensive map. It turned out that this data included "payload" data, that is, the content of emails, text messages, and even passwords.

Google at first denied collecting payload data, then admitted it but stated that such data was fragmented, and finally acknowledged in October 2010 that entire emails had sometimes been captured. Following that statement, the Federal Communications Commission (FCC) opened an inquiry to determine whether such conduct violated Section 705(a) of the Communications Act of 1934, which prohibits the interception of interstate radio communications unless authorized by the Wiretap Act.

Google had argued that, under the Wiretap Act, which prohibits the intentional interception of electronic communications, it is not unlawful to intercept electronic communications made through a system readily accessible to the general public, and that this definition encompassed unencrypted WiFi networks.

On April 13, 2012, the FCC filed a Notice of Apparent Liability for Forfeiture (NAL) finding that Google “apparently willfully and repeatedly violated [the FCC] orders to produce… information and documents” that the FCC had requested. Such conduct would carry a $25,000 penalty. However, the FCC decided not to take any enforcement action under Section 705(a), as “[t]here is not clear precedent for applying [it] to… Wi-Fi communications.”



Illegal Now For Maryland Employers to Ask for Employees' Electronic Account Passwords

The State of Maryland could become the first state to pass a law prohibiting employers from requesting that employees disclose the passwords to their personal electronic accounts. That would cover email accounts as well as social networking accounts.  The bill now awaits the Governor's signature.

Employers will be prohibited from taking, or threatening to take, disciplinary action if an employee refuses to disclose her passwords. Employers will also be prohibited from refusing to hire an applicant because of his refusal to disclose his passwords.

In 2010, Robert Collins, a Maryland corrections officer, was asked to provide the Maryland Division of Corrections (DOC) with his Facebook login information during a recertification interview. The American Civil Liberties Union of Maryland sent a letter in January 2011 to DOC Secretary Gary Maynard. Secretary Maynard responded in February 2011 and ordered the practice suspended for 45 days to allow further study of the issue. The DOC then revised its policy: candidates would have to sign a form stating that they understand that providing their passwords is voluntary.

A similar bill is still being discussed in Illinois. Is a federal law around the corner? Senator Richard Blumenthal (D-Conn.) plans to introduce a bill that would prevent employers from asking applicants to provide their social media passwords as part of the hiring process.