The Secure Times

An online forum of the ABA Section of Antitrust Law's Privacy and Information Security Committee



HHS Announcements This Week Show HHS is Serious About Privacy

Earlier this week, HHS announced that it had imposed a $4.3 million civil monetary penalty on Maryland-based Cignet Health of Prince George’s County for violations of the HIPAA Privacy Rule.

In October, HHS released its Notice of Proposed Determination against Cignet.  HHS found that Cignet had violated the rights of 41 patients when it denied them access to their medical records despite the HIPAA requirement that covered entities provide patients with copies of their medical records no later than 60 days from receipt of a request.  The civil monetary penalty for this violation was $1.3 million.

HHS imposed an additional civil monetary penalty of $3 million on Cignet, finding that Cignet failed "to cooperate with OCR’s investigations on a continuing daily basis from March 17, 2009, to April 7, 2010, and that the failure to cooperate was due to Cignet’s willful neglect to comply with the Privacy Rule."  The Notice of Final Determination stated that Cignet failed to request a hearing to dispute or settle the fine amount.  This fine is the first imposed against a healthcare provider under the Privacy Rule.

HHS also announced this week that it had reached a settlement with Massachusetts General Hospital for $1 million to settle potential Privacy Rule violations.  Some privacy observers have taken these two actions as a warning that HHS is prepared to actively enforce HIPAA privacy protections.




FTC Requests Court Shut Down Text Message Spammer

Yesterday, the FTC filed a complaint in the U.S. District Court for the Central District of California requesting a permanent injunction against Philip Flora, alleging violations of Section 5 of the FTC Act and the CAN-SPAM Act.

According to the complaint, the defendant sent millions of text messages, selling loan modification assistance, debt relief, and other services.  In a single 40-day period, the defendant sent more than 5.5 million spam text messages.  The text messages instructed consumers to reply to the message or to visit one of the defendant’s websites.  The defendant collected information from consumers who responded, then sold their contact information to marketers as debt settlement leads.  The FTC alleged that consumers were harmed as a result of the defendant’s spam text messages because many must pay fees to their mobile carriers to receive the unwanted messages.

The Commission charged that the defendant violated the FTC Act by sending unsolicited text messages to consumers and misrepresenting that he was affiliated with a government agency.  The Commission also charged the defendant with violating the CAN-SPAM Act by sending emails advertising his text-message blast service that failed to include an opt-out mechanism and the sender's physical mailing address.

In its press release, the FTC also acknowledged the "invaluable assistance" it received from Verizon Wireless, AT&T, and CTIA – The Wireless Association.



ZIP Codes are Personal Information Under California’s Song-Beverly Credit Card Act

In a unanimous decision released February 10, 2011 in Pineda v. Williams-Sonoma (S178241),  the California Supreme Court held that a ZIP code is “personal information” under California’s Song-Beverly Credit Card Act of 1971 (the “Song-Beverly Credit Card Act”).  Therefore, businesses may not request or require that a cardholder provide a ZIP code during a credit card transaction and then record the ZIP code, unless the transaction falls under an exception in the Act.
 
The plaintiff in Pineda sued Williams-Sonoma alleging that when she made a purchase at one of its stores using her credit card, the cashier requested her ZIP code and recorded it. The complaint also alleged that Williams-Sonoma subsequently used the plaintiff’s name and ZIP code to find her full address and add her to its marketing database. 
 
The Pineda case involved interpreting the section of the Song-Beverly Credit Card Act which restricts collection of information from consumers during a credit card transaction and the section defining personal identification information. The relevant portion of Sections 1747.08(a)(2) and 1747.08(b) provide as follows:
 
(a)(2) “[N]o person, firm, partnership, association, or corporation that accepts credit cards for the transaction of business shall . . . (2) Request, or require as a condition to accepting the credit card as payment in full or in part for goods or services, the cardholder to provide personal identification information, which the person, firm, partnership, association, or corporation accepting the credit card writes, causes to be written, or otherwise records upon the credit card transaction form or otherwise." [emphasis added]
 
(b) "Personal identification information" is defined as "information concerning the cardholder, other than information set forth on the credit card, and including, but not limited to, the cardholder’s address and telephone number.”

The California Supreme Court agreed to hear Pineda after the lower courts had ruled that a ZIP code alone was not personal identification information and, therefore, that its collection was not covered by this prohibition in the Song-Beverly Credit Card Act. The lower courts based their decisions on reasoning similar to that in Party City Corp. v. Superior Court, 169 Cal.App.4th 497 (2008), which also held that a ZIP code, without more, does not constitute personal identification information, in large part because a ZIP code pertains to a group of individuals, in contrast to a complete address or a phone number, which are specific to an individual. 
 
The California Supreme Court rejected the lower courts’ reasoning and held that personal identification information includes a cardholder’s ZIP code “in light of the statutory language, as well as the legislative history and evident purpose of the statute.” In its statutory analysis, the Court concluded that “address” should be broadly construed “as encompassing not only a complete address, but also its components.”   The Court noted that this broad interpretation is consistent with the “expansive language” used in the Act and with the rule that remedial statutes should be liberally construed in favor of their protective purpose. After its review of the legislative history of the Song-Beverly Credit Card Act, the Court concluded that the Act is “intended to provide robust consumer protections by prohibiting retailers from soliciting and recording information about the cardholder that is unnecessary to the credit card transaction.”
 
Questions Raised By the Decision
The Song-Beverly Credit Card Act has been significant over the last several years because it provides a private right of action and statutory damages of up to $250 for the first violation and $1,000 for each subsequent violation. Since the decision in Florez v. Linens 'N Things, 108 Cal.App.4th 447 (2003), numerous class actions have been filed against retailers under the Song-Beverly Credit Card Act, many resulting in settlements.
The Court’s decision in Pineda, particularly the interpretation of the purpose of the Song-Beverly Credit Card Act, raises several important questions and considerations, including the following:
· The ability of businesses to solicit and record information during credit card transactions that is not necessary to the transaction. For example, consideration should be given to what other types of data beyond the ZIP code will be treated as personal identification information. 
· How to interpret the exception in section 1747.08(c)(4), which allows collection of personal identification information for a special purpose incidental but related to the credit card transaction, such as shipping, delivery, servicing, installation, or special orders. The questions after Pineda will be what other “special purposes” qualify and, if the collection is not for a special purpose, whether it is “necessary” for the credit card transaction. 
· Whether the Act applies to return transactions, as has been considered in several state and federal court rulings in California. See, e.g., Romeo v. Home Depot USA, Inc., No. 06-CV-1505, 2007 WL 3047105, 2007 U.S. Dist. LEXIS 77144 (S.D. Cal. Oct. 16, 2007); Korn v. Polo Ralph Lauren Corp., 644 F. Supp. 2d 1212 (E.D. Cal. 2008); TJX Cos., Inc. v. Super. Ct., 163 Cal. App. 4th 80 (2008); Absher v. AutoZone, Inc., 164 Cal. App. 4th 332 (2008).  These prior rulings construed section 1747.08(a)(3) and determined that the Song-Beverly Credit Card Act did not apply to return transactions.  The Court in Pineda did not consider this issue or that subsection, so those decisions should stand, and retailers can argue that the additional information is necessary for fraud protection purposes.  Retailers should, however, consider carefully before information collected for special purposes or returns is also used for marketing purposes.
· Whether the Act applies to online transactions, as was considered in Saulic v. Symantec Corp., 596 F. Supp. 2d 1323 (C.D. Cal. 2009).  The court in Saulic held that the Act does not apply to online transactions based upon a narrow reading of section 1747.08, and its reasoning may be called into question after Pineda.



The European Data Protection Supervisor on data breaches, data portability, and the right to be forgotten

 

The European Data Protection Supervisor (EDPS) last month published an opinion on the European Commission’s Communication reviewing the EU legal framework for data protection. It discusses, among other topics, the introduction of personal data breach notification into EU law.  The EDPS also declares itself in favor of introducing the right to data portability and the right to be forgotten into the EU legal framework.

 

The new legal framework must support an obligation to report security breaches

 

The EDPS supports extending the security breach reporting obligation currently included in the revised ePrivacy Directive, as proposed in the Commission’s Communication.

 

As of now, the revised ePrivacy Directive only requires providers of electronic communication services to report security breaches. No other data controllers are covered by the obligation. The EDPS notes that “[t]he reasons that justify the obligation fully apply to data controllers other than providers of electronic communication services.” (§75)

 

Indeed, “[s]ecurity breach notification serves different purposes and aims. The most obvious one, highlighted by the Communication, is to serve as an information tool to make individuals aware of the risks they face when their personal data are compromised. This may help them to take the necessary measures to mitigate such risks,” such as changing passwords or canceling their accounts. (§76) Also, these notifications “contribute (…) to the effective application of other principles and obligations in the Directive. For example, security breach notification requirements incentivize data controllers to implement stronger security measures to prevent breaches,” and thus enhance data controllers’ accountability. Such notifications also serve as a tool for enforcement by Data Protection Authorities (DPAs), as a notification may lead a DPA to investigate the overall practices of a data controller. (§76)

 

The new legal framework must support data portability and the right to be forgotten

 

The Communication stated that the Commission would examine ways of complementing the rights of data subjects “by ensuring ’data portability’, i.e., providing the explicit right for an individual to withdraw his/her own data (e.g., his/her photos or a list of friends) from an application or service so that the withdrawn data can be transferred into another application or service, as far as technically feasible, without hindrance from the data controllers.” (Communication, p.8)

According to the EDPS, “Data portability and the right to be forgotten are two connected concepts put forward by the Communication to strengthen data subjects’ rights.” (§83)  As “more and more data are automatically stored and kept for indefinite periods of time,” the data subject has very limited control over his personal data. The Internet has a “gigantic memory.” (§84) Also, “from an economic perspective, it is more costly for a data controller to delete data than to keep them stored,” and thus “[t]he exercise of the rights of the individual therefore goes against the natural economic trend.” (§84)

 

“Both data portability and the right to be forgotten could contribute to shift the balance in favour of the data subject” by giving him more control over his information. The right to be forgotten “would ensure that the information automatically disappears after a certain period of time, even if the data subject does not take action or is not even aware that the data was ever stored.” (§85) This "right to be forgotten" would ensure that personal data are deleted, and at the same time it would be prohibited to “further use them, without a necessary action of the data subject, but at the condition that this data has been already stored for a certain amount of time. The data would in other words be attributed some sort of expiration date.” (§88)

 

This new "right to be forgotten" should be connected to data portability. (§89) Data portability is “the users’ ability to change preference about the processing of their data, in connection in particular with new technology services.”(§86)  “Individuals must easily and freely be able to change the provider and transfer their personal data to another service provider.”(§87)

 

The EDPS considers that existing rights “could be reinforced by including a portability right in particular in the context of information society services, to assist individuals in ensuring that providers and other relevant controllers give them access to their personal information while at the same time ensuring that the old providers or other controllers delete that information even if they would like to keep it for their own legitimate purposes.” (§87)

 

Whether the right to be forgotten online will become part of the EU data protection framework remains to be seen. However, several EU countries recognize, or plan to recognize soon, such a right. Google argued last month in a Spanish court that deleting search results, in order to respect the country’s right to be forgotten, "would be a form of censorship." France is considering recognizing such a right as the French Parliament is in the process of implementing the revised ePrivacy Directive. As the deadline for implementing the directive, May 25, 2011, approaches, it will be interesting to see how many Member States actually add the right to be forgotten to their legal systems.



Representatives Markey and Barton Send a Letter to Facebook over Announced Feature

Representative Edward Markey and Representative Joe Barton, Co-Chairmen of the Congressional Privacy Caucus, sent a letter on February 2nd to Facebook’s CEO, Mark Zuckerberg, requesting information about Facebook’s announcement on January 14 that it plans to make its users’ addresses and mobile phone numbers available to third-party web sites. The feature would make a user’s address and mobile phone number accessible to external web site application developers, but not the addresses or mobile phone numbers of a user’s friends.

Facebook then announced on January 17 that it decided to delay this new feature, after having received “some useful feedback that [Facebook] could make people more clearly aware of when they are granting access to this data.” Facebook is currently making changes to the feature “to help ensure [users] only share this information when [they] intend to do so.”

The letter asks several questions about the feature:

The first question asks Facebook to describe whether any user information in addition to addresses and mobile phone numbers would be shared with third-party application developers, and also whether such information was shared prior to the January 17 announcement of the feature’s suspension.

The second question asks Facebook to describe what user information will be shared with third-party application developers once the feature is again implemented.

The fourth question asks Facebook to describe the process that led to the suspension of the program.

The sixth question asks Facebook to describe its internal policies and procedures for ensuring that this feature complies with Facebook’s privacy policy.

The eighth question asks whether users who had opted in to sharing their addresses and mobile phone numbers will be able to have this information deleted by third-party applications or web sites.

The ninth question asks whether Facebook’s privacy policy would have been violated had this feature been implemented.

The tenth question asks whether, given the sensitivity of personal addresses and mobile phone numbers, Facebook believes that the opt-in should be clearer and more prominent.

Representative Markey said that “Facebook needs to protect the personal information of its users to ensure that Facebook doesn’t become Phonebook." Representative Barton said that “The computer – especially with sites like Facebook – is now a virtual front door to your house allowing people access to your personal information. You deserve to look through the peep hole and decide who you are letting in.”



Data mashing, crime-mapping, and smart grids: the art of cooking data into tasty reports

The United Kingdom’s Information Commissioner’s Office (ICO) has recently published advice on crime-mapping and privacy, following the launch by the UK police of a “local crime and policing website for England and Wales” where users can enter their “postcode, town, village or street into [a] search box…, and get instant access to street-level crime maps and data, as well as details of [their] local policing team and beat meetings.”

Crime-mapping

ICO describes crime-mapping as “the process of producing a geographical representation of crime levels, crime types or the locations of particular incidents.” This process “can have an impact on individuals’ privacy where a link can be established between a particular location and a particular individual, allowing identification to take place.”

Data aggregation

ICO warns about the danger of making public information about where a particular crime happened, even if the name of the victim is not released. This information, combined with “other sources of publicly available information,” could allow for the identification of an individual. ICO cites online street maps, newspaper reports, and postings on social networks and other sites as sources of publicly available information.

One can also add smart grid data to that list: smart meters, while allowing individuals to save energy, also gather data that can be used to track an individual’s activity throughout the house. Why do you take a very long shower and run three laundry loads at 3:00 a.m.? And, by the way, where is your wife? Why do you stay up late most nights and use the bathroom frequently? Are you sick? Smart grid data can also be combined with other data, such as demographic data or credit history, to provide interested parties, whether the police or a private entity, with a rather precise picture of your profile, whether it be that of a criminal or of a good business prospect.

Under prolonged private surveillance

Beyond the proverbial concerns over government surveillance, there may be a greater volume of constant private surveillance by “little brothers”: a neighbor, a casual acquaintance, an employer, or a business organization interested in adding us as clients. Indeed, some companies may be interested in using crime-maps for business purposes; ICO cites real estate agents and insurance companies as examples.

In United States v. Maynard, the Court of Appeals for the D.C. Circuit noted last year, with respect to data gathered during “prolonged (government) surveillance,” that:

“These types of information can each reveal more about a person than does any individual trip viewed in isolation. Repeated visits to a church, a gym, a bar, or a bookie tell a story not told by any single visit, as does one’s not visiting any of these places over the course of a month. The sequence of a person’s movements can reveal still more; a single trip to a gynecologist’s office tells little about a woman, but that trip followed a few weeks later by a visit to a baby supply store tells a different story. A person who knows all of another’s travels can deduce whether he is a weekly church goer, a heavy drinker, a regular at the gym, an unfaithful husband, an outpatient receiving medical treatment, an associate of particular individuals or political groups — and not just one such fact about a person, but all such facts.”

The same analysis can apply to a prolonged private surveillance…

Data mashing

ICO also notes the importance of recognizing the “increasingly sophisticated ‘data mashing’ techniques [which make it easier] for the general public to combine information resources to produce a richer, and possibly more privacy-intrusive, picture of crime in their area.” Data, just like potatoes, can be mashed up. One starts by gathering data from various sources and then mashing them into a single representation, a report, or a web site. Business organizations may purchase tools to mash up data to produce reports.
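
For readers curious what this mashing looks like in practice, here is a minimal sketch in Python. All of the data is hypothetical, invented purely for illustration; the point is only that joining a "victim name withheld" crime map with another public source on a shared location field can re-identify an individual.

```python
# A toy "data mash": join two public datasets on a shared location key.
# All records below are fictional, for illustration only.

# Dataset 1: a street-level crime map (victim name withheld)
crime_map = [
    {"street": "10 Elm St", "incident": "burglary", "date": "2011-01-05"},
]

# Dataset 2: another public source, e.g. an online directory
directory = [
    {"name": "A. Resident", "street": "10 Elm St"},
    {"name": "B. Neighbor", "street": "12 Elm St"},
]

def mash(crimes, people):
    """Link each crime record to any person sharing its street field."""
    matches = []
    for crime in crimes:
        for person in people:
            if person["street"] == crime["street"]:
                # The 'anonymous' incident is now tied to a named person.
                matches.append({**person, **crime})
    return matches

for row in mash(crime_map, directory):
    print(row)
```

Commercial mash-up tools do essentially this at scale, across many more sources and fuzzier keys, which is why releasing even coarse location data can be privacy-intrusive.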

What can be done to prevent personal data from being mashed, cooked, and served to business organizations?

Any solutions?

Solutions may be sector-specific, ensuring that the privacy of personal data is protected domain by domain. For instance, the privacy of smart grid data can be specifically protected. A report released today by the Information and Privacy Commissioner of Ontario, Canada, recommends that “Smart Grid systems should feature privacy principles in their overall project governance framework and proactively embed privacy requirements into their designs, in order to prevent privacy-invasive events from occurring.” On another front, social media users should be informed of the importance of keeping their privacy settings tight.

May any current U.S. law be of use? The Fair Credit Reporting Act (FCRA) defines a consumer report as “any written, oral, or other communication of any information by a consumer reporting agency bearing on a consumer’s credit worthiness, credit standing, credit capacity, character, general reputation, personal characteristics, or mode of living which is used or expected to be used or collected in whole or in part for the purpose of serving as a factor in establishing the consumer’s eligibility for…  credit or insurance to be used primarily for personal, family, or household purposes.” A “consumer reporting agency” is defined as “any person which, for monetary fees, dues, or on a cooperative nonprofit basis, regularly engages in whole or in part in the practice of assembling or evaluating consumer credit information or other information on consumers for the purpose of furnishing consumer reports to third parties, and which uses any means or facility of interstate commerce for the purpose of preparing or furnishing consumer reports.”

If business organizations, such as insurance companies, no longer use consumer reporting agencies but instead build their own proprietary data mash-up systems, the FCRA may no longer reach them. Is it time to update the law?