I am so not an economist.
But what I have heard from the other panelists is very interesting and very heartening. The questions surrounding privacy regulation are finally closer to being well answered, and I think these economic studies are a part of that.
It’s getting harder and harder to be a skunk at the garden party at CFP. I remember fondly the squirming people did in their seats at a previous CFP in San Francisco when I derided regulatory approaches to protecting privacy. The things I have heard this week are more nuanced and more thoughtful than what I have seen before.
As you heard on this panel, privacy is a word used to describe a variety of different issues, such as fairness, anti-spam, anti-marketing, security, and so on. Treating each of these separately is a huge advance.
So I thought I would really mess things up by speaking – at the coolest bleeding-edge tech conference – about old things like John Locke and the common law. But Bruce Schneier went and spoke about John Locke and the Enlightenment thinkers during lunch. He totally blew away my schtick. So, instead, what I think I will do to really stink things up is make no reference whatsoever to "Panopticon." [Panopticon, the theme of this CFP, had been beaten to death as a unifying theme in earlier panels, yet this timely joke received a thoroughly tepid response.]
I am the Editor of Privacilla.org, which is a Web-based think-tank devoted to privacy as a public policy issue. There, I deal with privacy from top to bottom, including fundamental questions like what we are talking about when we say the word “privacy,” privacy from government, and privacy in the private sector, including financial, medical, and online privacy.
I am also the Director of Information Policy Studies at The Cato Institute, a Washington, D.C. think-tank devoted to free markets, limited government, and peace. I joined Cato in September last year.
When I was invited to participate in this panel, I thought it would be a good opportunity to air some thinking I have been doing about privacy basics. A lot of the privacy debate has been going forward without figuring out some very basic issues. One of these is the status of personal information. What is personal information anyway? How do we explain what is happening when personal information is being collected and shared?
The best explanation I have for personal information is a theory of property. That is, personal information can be conceived of as property in the Lockean sense. John Locke was a political philosopher writing a few hundred years ago who was very influential in developing the thinking that underlies our Constitution, our freedoms, and our social and political systems. I am no political scholar myself, but I think personal information can be squared pretty well with the theory of property that Locke laid out.
Like other thinkers, Locke sought to explain how we got to where we are from a theoretical state of nature. In the theoretical state of nature, humans were all just here without all the formal institutions we now enjoy and recognize. How did we get from there to here?
Locke’s theory of property is essentially a two-step theory. All the property on the earth is held in common – it’s un-owned – until someone mixes their labor with it. That is, they pick it up and do something with it: taking land and farming or ranching it, for example. This is roughly what happened with the settlement of the American West, although that occurred within a statutory framework. If you can do something with it, it’s yours.
The other part is to fence it off. Give notice to others that it’s yours. Otherwise, property doesn’t work. You have to let people know it’s yours.
Now I’d like to illustrate how this theory works in some non-traditional areas. Would everyone please breathe in for me? Thank you. I believe that the air in your lungs is your property right now. You have made the effort to bring it into your body, and you are holding it there, which gives notice to other people that they can’t have it.
OK, if you actually did inhale when I asked you to, you can exhale now. What have you just done? You’ve returned that air to the store of common property. You have abandoned it, for someone else to pick up if they want.
Now let’s apply this theory to personal information. In everything you do throughout your day, you are creating what I call cognitive and volitional product. That is, you are creating facts about you by moving, by thinking, by transacting with others, and so on. This is the mixing of your labor with the elements of life. If you do something to fence those facts off from others, they are your property. If you don’t, they are common property, to be used by anyone else the way they see fit.
Let me illustrate all this. When I wave my right arm to one side, I am creating a new fact about myself. It is the product of my labor, but I have done nothing to fence it off from others. Thus, it is common property and you are free to do with this fact whatever you will.
Now, all of us put on clothes this morning, shielding certain facts about ourselves from others. The color of my underwear, for example, is a fact that I own. And I will tell you what it is for, like, a nickel.
So personal information is property. But very often it is common property rather than private property.
A variety of conclusions flow from this Lockean conception of personal information as property, of course. Some of them you’ll regard as good. Others you might think are bad.
You all might be frustrated to realize that you have to take some action to protect property if you are to make it yours. It is common for people to say, “What is that company doing with my information?” But the question that springs to my mind is “What did you do to make it yours?” Often people have done nothing to protect personal information. Their claim to own it is either just invective or a demand for a new entitlement from government.
This challenge is not as bad as it might seem, though. I believe that health care services came with an implied promise of confidentiality before third-party payers and regulations like HIPAA destroyed the relationship between doctors and patients. Financial and other fiduciary services also came with implied confidentiality promises before the government stepped in and started using the financial services industry as a surveillance tool. In short, it used to be that personal information was rendered private pretty much automatically by contract, consistent with Lockean property theory.
The good from this theory includes this: when governments require information from you, demanding it when you file taxes and conditioning the issuance of licenses and permits on the release of personal information, they are taking something from you. Most government programs are instituted without considering the privacy consequences. Recognizing that personal information is property would help to make the costs of government programs more clear.
This theory also helps explain the good and the bad of data aggregators like Acxiom and ChoicePoint. To the extent they get their information from public records, they are trafficking in stolen information, loosely speaking. That’s bad. But to the extent they amass ordinary commercial information, they are assembling worthless, found information into new and valuable property. They enrich the economy and profit justly when they sell it for use in credit reporting, tenant screening, employment screening, and so on. This is good stuff.
So, if you needed it, I think this shows that privacy is more susceptible to economic analysis than you may have thought. Personal information is correctly conceived of, I think, as property.
Now, the ChoicePoints of the world don’t do a perfect job, obviously, and there are many problems with what they do, including the current issue with security breaches at various data aggregators.
So let me talk about another old notion that addresses the breach problem in the best possible way. That is common law liability.
The common law is judge-made law, passed down from our nation’s forebears in England. Judges do a better job balancing the equities in case-by-case adjudication, I think, than legislators and regulators do with statutes and rules. This is because judges are faced with concrete factual situations and a full airing of the facts and legal arguments, laid out by interested parties.
Common law cases produce general rules of broad applicability rather than narrow, specific rules that hinge on shifting interpretations. Statutes and regulations tend to fight the last battle, as we’ve seen with everything coming forward since ChoicePoint brought security regulation back into vogue.
A questioner last night hit the nail right on the head as far as the ChoicePoint problem goes. He noted that ChoicePoint creates security risks to members of the public by aggregating sensitive personal information. This, in econ-speak, is an externality.
How do you get ChoicePoint to internalize that security risk? You do it with a common law rule that holders of sensitive personal information owe a protective duty to the subjects of that information. When ChoicePoint compiles information, it also compiles a duty to protect the people whose information it holds.
So in the recent security breaches, ChoicePoint should have a responsibility to the parties it injured with what appears to be its negligent behavior. It should certainly have to compensate the people who are victimized in identity frauds and it may well have to compensate the merchants, credit card systems, and banks who bear substantial monetary losses due to these frauds.
As I said, I think a common law rule like this is a more appropriate response to the ChoicePoint fiasco. One of the few advantages statutes have is the bully pulpit. That is, when legislators step to a podium, they advertise what a new law or regulation will do. They’re not always right. In fact, they’re often wrong.
In this case, they’re also a little late to the game. Courts in Michigan and New Hampshire have already adopted a rule saying that holders of sensitive data have a duty to the data subjects. And a lawsuit with claims based on negligence has already been filed in Los Angeles. The common law courts are where the action is, and where the real solutions are to problems like the ChoicePoint data security breach.
©2000-2005 Privacilla.org. All content subject to the Privacilla Public License.