Data Insecurity: What Remedy Should Consumers Have When Companies Do Not Keep Their Data Safe?
By ANITA RAMASASTRY
Monday, Mar. 06, 2006
On December 31, 2005, an employee of Providence Healthcare Systems stored backup computer tapes overnight in his van, which was parked at home in his driveway. The tapes were stolen - and so were data for 365,000 patients in Oregon and Washington.
The data included patients' Social Security numbers, birth dates, addresses, and medical information. Yet the affected patients were not notified of the security breach until January 25, 2006 -- almost a month later.
Several consumers have come forward and reported possible identity theft to the Oregon Attorney General. The rest are left worrying about whether they too may be victims of identity theft - or of disclosure of their personal medical information.
Unsurprisingly, this February, a class action lawsuit ensued. I will argue that the suit asks for reasonable remedies - but also that legislatures may need to step in to create clearer statutory duties and remedies for security breaches to ensure suits like this will succeed.
The Suit: Persuasive Allegations of Providence's Negligence
The complaint in the lawsuit alleges that Providence acted negligently in failing to properly protect customer data. That should be easy to prove.
For one thing, Providence was negligent in not ensuring that its records were kept in a secure business location. Indeed, according to news reports, the company's home services division made it a routine practice to designate certain employees to take unencrypted "backup" tapes home -- supposedly as an emergency backup! (Providence now uses encryption to protect backup files and sends them to a secure off-site facility - which is, of course, what it ought to have been doing all along.)
For another thing, Providence seems to have compounded its initial negligence by unreasonably delaying for nearly a month in telling its patients their data was stolen. And according to the complaint, it compounded that negligence again when it "failed to take any actions to protect patients from misuse of [the stolen] information."
The Remedies the Lawsuit Seeks: Entirely Reasonable
The class action complaint seeks injunctive relief - that is, a court order forcing Providence to act. In particular, it asks that the court order Providence "to pay for enhanced credit report monitoring for all class members, pay for the fraud alerts, pay for reporting to the Social Security Administration, and pay for any credit repair process that is required if people are damaged." As amended, it also now seeks monetary damages.
It is eminently reasonable for the patients whose data was stolen to request these remedies. Those patients who are not yet identity theft victims will need to constantly check or monitor their credit history - and monitoring is not free: It takes time and, after the first credit report, it takes money.
For patients who are already identity-theft victims, even more time and money will have to be expended. In addition, identity theft can result in their temporary loss of access to credit until the issue is cleared up.
In 2002, the US GAO published a useful study in which it quantified many of the monetary and non-monetary costs to victims. And the FTC-brokered, late January settlement with the data broker ChoicePoint underlines the point that costs to victims can be substantial: The consumer redress element of the settlement alone came to $5 million.
(As I discussed in greater detail in a prior column, the FTC alleged that ChoicePoint lacked reasonable procedures that would have allowed it to screen prospective subscribers. Because of lax controls, ChoicePoint turned over sensitive consumer data to subscribers whose applications raised obvious "red flags." The result: Identity thieves purchased 163,000 customers' data. At least 800 cases of identity theft were subsequently reported. The FTC charged that ChoicePoint's security and record-handling procedures had violated consumers' privacy rights and federal laws. In addition to consumer redress, ChoicePoint must also pay $10 million in civil penalties.)
Why Legislation May Be Necessary: The "Economic Loss" Rule
In short, Providence should pay, and pay now - for both monitoring and repair of patients' credit. In the wake of the suit, the company seems to realize this: In mid-February, it offered its customers one free year of credit monitoring and repair services, provided by the security firm Kroll and Associates. Whether this offer, if accepted, will settle or moot the suit has yet to be determined.
If the suit did continue, would it succeed? Surely, from the standpoint of justice, it ought to. But from the standpoint of the law, it might not.
The main barrier is the "economic loss" rule. Following the common law (the longstanding set of background rules judicially developed over the centuries), courts often require that negligence plaintiffs must have incurred physical harm - not just economic loss. They do so in order to draw a line between contract suits, which focus on economic damages, and tort suits, like the one against Providence. They also seek to discourage plaintiffs from seeking speculative damages like loss of profits or lost opportunities.
But these two concerns don't really apply here. First, conduct like Providence's ought to be the basis for a lawsuit whether or not Providence specifically made formal promises, through contracts, to plaintiffs about the security of their data. Consumers should not have to bargain with companies over their right to have their data kept safe. This ought to be considered a necessary part of doing business when a company seeks confidential customer data as part of its business model.
Second, these damages aren't speculative: The costs of monitoring and of credit repair are real, concrete, and easily quantifiable.
Security breaches can give rise to physical harm. For example, the New Hampshire Supreme Court's 2003 decision in Remsburg v. Docusearch, Inc. held that an information broker is potentially liable for the harms caused by negligently selling personal information. There, a company called Docusearch Inc. sold, for $150, Amy Boyer's work address and other information. The buyer - a former high school classmate who'd chronicled his obsession with Boyer and his plot to kill her on a website - did just that in October 1999, shooting her to death as she was leaving work. (The company settled the case, brought by Boyer's mother, for $85,000.)
But it shouldn't have to come to that before the law intervenes. A security breach should not have to lead to physical harm before it can be the basis for a tort lawsuit. Stronger laws, for instance, might have shut down the information-selling Docusearch before its practices led to Boyer's death.
What Security Breach Statutes Should Look Like
Because of these potential gaps in the common law, states (and/or the federal government) should pass statutes to protect consumers in the event of a security breach. These statutes should have three key features:
First, they should require companies to immediately notify consumers when breaches occur, so they can protect themselves and their credit. Oregon law did not require such notification, which may be one reason Providence waited nearly a month.
Second, they should require credit issuers to offer free security "freezes," by which consumers may prohibit lenders or retailers from granting credit to anyone claiming to be them, as long as their file is "frozen."
Third, they should require companies whose negligence results in a breach to offer consumers credit-monitoring services and if necessary, credit-repair services.
Without such statutes, consumers run the risk that even if they sue, they will not receive the reasonable redress they deserve for the time and money they lose due to negligence - in this case, negligence in securing the safety of one of the most personal, private kinds of information there is: medical information.