It’s a great pleasure to be here as a panelist on this topic. I am impressed and humbled by the caliber of my co-panelists and our moderator. I wouldn’t have believed a few years ago that I might sit down and join a panel with such important personages as Judge Williams, FCC Commissioner Kathleen Abernathy, and Stewart Baker.
I guess I have really made it in Washington. This, in my own view, proves that I am now officially part of the problem.
As you are aware, I am an Adjunct Fellow at The Progress & Freedom Foundation and the Editor of Privacilla.org, which is a Web-based think-tank devoted to privacy. In addition, I have a lobbying and consulting firm called PolicyCounsel.Com. None of my clients has specific privacy issues, but privacy touches nearly every public policy issue in one way or another. None of the material on Privacilla and none of what I say to you represents the views of any client, but be aware of my potential for bias, as you would be with any privacy advocate.
An early title for this panel was “How Emerging Technologies Force New Considerations in Privacy” and I believe the title that appears in your programs is “Does Emerging Technology Force New Privacy Considerations?” Now, I worked in the sausage factory known as Congress, and I’m very skeptical of efforts to glean congressional intent from amendments made during the legislative process. But it appears from the amendment to the title of the panel that the good folks at the Federalist Society recognized that new technology may not force new considerations. It speaks very well of them that they did. The change in titles is a move in the direction I think we should go not just with privacy, but with all tech policy.
Throughout my work on policy issues in technology, telecommunications, and e-commerce, the most important general message I have for people is not to be bowled over by new technology. Rather, they should focus on the old rules and fit the new technologies into them. Rules found in the Constitution, rules found in the administrative laws, rules about human nature and economic behavior. Those things weren’t challenged and they didn’t change when the Internet came along. I wrote a law review article with that very theme a couple of years ago. It’s called “Government Regulation of Electronic Commerce: What’s New is What’s Old” and it’s in the 1999 Administrative Law Review.
While I’m pitching old considerations and reading material from 1999, let me commend to you the decision of the 10th Circuit in U.S. West v. FCC. I’m sure a lot of you are aware of that case, but to me it is an example of a court not blinking when faced with an argument for “new considerations.”
The 10th Circuit wasn’t bowled over by the FCC’s assertion of threats to privacy in the heaving, churning world of telecommunications. The court said, “Whaddya mean by privacy?” The FCC said that Customer Proprietary Network Information includes information that is “extremely personal” to customers. The court inferred that CPNI info could be embarrassing, which is close to the classic idea of what happens when there is an invasion of privacy.
And I think the court took the FCC’s assertion with a grain of salt — for good reason. I have previously announced my own calling patterns, revealing this highly sensitive information, in an e-mail to Privacilla subscribers. And my calling patterns are: check voice mail, check voice mail, check voice mail, check voice mail, check voice mail, check voice mail . . . .
I overstate the case of course, but generalizations about the privacy of certain kinds of information tend to be wrong as much as they are right. Some people’s religious beliefs are private. Others proselytize. For some people, political beliefs are private. Others go door to door handing out literature.
In tech privacy debates, it’s extremely important to look at old considerations like what privacy is, anyway. At Privacilla, we try to sort out the issues that, in public debate, go under the name of “privacy” — separating out security, identity fraud, and spam, for example. We have put forward a definition of privacy that I think will appeal to a group like this. We hope it will be used by policymakers to better address privacy directly and determine what interests they are pursuing with various types of proposals.
Our definition of privacy is this: Privacy is a subjective condition that individuals enjoy when two factors are in place — legal ability to control information about oneself, and exercise of that control consistent with one's interests and values.
Happily, most of you are lawyers and I don’t have to parse this definition too much for you to get it. Most importantly, privacy is a personal, subjective condition. That means that my sense of privacy is my own, and yours is yours. Legislators and regulators can’t pass laws to tell us we have privacy when we think we don’t. Those laws can only represent guesses about what privacy might look like.
The first factor I mentioned is the legal power to control information. This essentially asks whether people have been deprived of power to control information in some way. There are thousands of laws and regulations that deprive people of power over information about themselves. Let there be no mistake about the good intentions of these laws. They undermine privacy all the same. Indeed, there is a nice correlation between how much laws are designed to help people and how much they undermine privacy. The helping hand of government strips away privacy before it goes to work.
Of course, many laws also protect privacy — or, more accurately, support the privacy-protecting choices we make. These are laws like contract, tort, trespass, and property.
The second factor I mentioned is exercise of control consistent with our interests and values. Ultimately, the only thing that can deliver privacy on the terms consumers want it is consumer awareness and education. If you don’t know how information moves in the Information Economy, you can’t reject practices that you disapprove of. It is the actions of educated and aware consumers in the marketplace that determine whether the uses made of information are acceptable.
This definition divides the privacy problem into two major parts that have entirely different dynamics. The first part — legal power to control information — mostly implicates government. Does government leave power with people to control information about themselves or does it strip this power away?
The second part has to do with what consumers do when they have that power. This implicates markets. What do consumers know and what do they really care about? How do companies learn about these interests and deliver on them?
The division meshes perfectly with topics Commissioner Abernathy discussed. The Communications Assistance for Law Enforcement Act is about the government’s ability to strip away privacy that we as consumers protect in various ways. On such things, I encourage folks at the FCC to be as aggressive as they can about protecting privacy. Carefully circumscribe surveillance powers if the statutes will let you.
The Customer Proprietary Network Information issue, on the other hand, falls into the second part of that definition. It is an effort to determine what companies may do with information about how their facilities are used by customers. Consumers are more and more able to change communications media based on privacy considerations. They are certainly also free not to care. Phone companies are free to compete against one another on the privacy of customer usage information, just like ISPs do.
In areas like this, I encourage folks at the FCC to stay as far away as you can. You are out of your league. Government bodies have tried time and time again — in the Children’s Online Privacy Protection Act, the Gramm-Leach-Bliley law, in HIPAA, in the Minnesota ISP law, and many other examples — to figure out what consumers want in terms of privacy. These laws have failed in every respect except driving up costs and reducing the availability of services. The only way to get to privacy on the terms consumers want it is through private ordering. The privacy protections accorded to CPNI info should be determined in the market.
So my point is to suggest that there are two different sets of privacy issues. In each, the laws are different, the actors are different, and their incentives are different. True privacy is delivered, first, by governments leaving power with people and, second, by consumers pursuing their interests in the marketplace.
I may have made it in Washington, but I’ve only scratched the surface of the privacy issue. I’m eager to hear the comments of other panelists and discuss these issues with the group.