For several years now, Internet users have ranked privacy as their greatest concern. This paper traces the development of privacy rights on the web, with emphasis on the USA and the EU, and discusses the advantages and disadvantages of the different approaches to regulation, especially industry self-regulation. Both conceptually and in practice, there are problems with all the modes of regulation. In the USA, there is pressure to move away from self-regulation towards government regulation, due in part to high-profile breaches of privacy that went unpunished under the self-regulatory regime. The paper concludes with a basic set of privacy guidelines for websites.
There is no doubt: privacy concerns are a significant issue that is making users and potential users think twice about using the Internet for transactions. Study after study has borne this out. (For a fairly up-to-date list of surveys, see the “Privacy” sub-directory of NUA Internet Surveys at http://www.nua.ie/surveys/ and TRUSTe’s “How Does Online Privacy Impact Your Bottom Line?” at http://www.truste.org/webpublishers/pub_bottom.html.) A 1998 poll of those who were not likely to access the Net found that privacy concerns outranked reduced cost, ease of use, security of financial transactions and control over unwanted messages (Alan Westin and Danielle Maurici, 1998, E-Commerce and Privacy: What Net Users Want, Privacy and American Business and Price Waterhouse, June, cited in Swire and Litan, p. 80). The fear is understandable. That same year, the US Federal Trade Commission found that while “almost all Web sites (92 percent of the comprehensive random sample) were collecting great amounts of personal information from consumers, few (14 percent) disclosed anything at all about their information practices” (US FTC, 1998).
Two years later, in 2000, a study by the UCLA Center for Communication Policy found that 64 percent of Internet users “strongly agreed” or “agreed” that logging onto the Internet puts their privacy at risk. Of all the Internet issues explored by the study, titled “Surveying the Digital Future”, its coordinator said, “Privacy raises the greatest concern.”
The concern is universal. A 2000 poll in South Korea found that more than 95 percent of Net users were concerned about “possible leakage of their personal information” (Lee, 2001).
In short, privacy concerns do have some impact on the use of the Net. There is therefore an incentive for Internet sites to adopt privacy standards that would reassure the visitor.
But if this is the case, why has it been so difficult for standards to emerge?
And therein lies a tale fraught with tensions: tensions not just between consumer and business, but also among those who cannot agree on the mode, or even on the very definition, of privacy regulation.
This paper presents the issues and argues the case for where privacy rules are headed. Given the state of present legal uncertainty but also the need to act, the paper concludes with some suggestions on what needs to be done by site owners.
Broadly, an individual’s information privacy is the right to determine when, how, and to what extent information about the person is communicated to others (Westin, 1967). Like virtually all rights, it is not absolute. A person with total privacy would be so isolated he or she would have to cease to exist socially. Any contact with the world, from a magazine subscription to a telephone line to a bank account, exposes one to possible privacy violation. The gain from the loss of some privacy is convenience. On the ‘Net, webpages could be customized to the user’s preferences. In the offline world, special marketing promotions could be targeted to the user. For official business, there would be less need to fill in forms or verify documents.
On the other hand, a person without any privacy protection would be wholly vulnerable to harassment from the world. He or she would have to wade through all kinds of commercial mail before reaching truly useful mail, and would be deluged with phone calls and emails. It would be exhausting to live without some form of privacy.
Given this spectrum of what could reasonably be expected from privacy, the next step is to determine at which point the privacy right should be set. It is at this crucial point that it is difficult to obtain a workable definition. Alan Westin, a noted scholar of privacy, is reported to have said that it is impossible to define privacy because privacy issues are fundamentally matters of values, interests and power (Gellman, 1997).
The two major regions of the world that have initiated privacy legislation are Europe and North America. The most comprehensive privacy rules have been instituted in the European Union. During World War II, Europeans saw for themselves how governments can use data against their citizens: it was partly the abuse of such data that enabled the Nazis to send six million Jews to the death camps.
In contrast, the privacy rules in the U.S. are not comprehensive in scope. The ensuing discussion delves into the differences in approach, differences that affect the rest of the world.
There are several major differences between the American and the European approaches. The first is that in the EU, the protection of personal data and privacy is clearly stated under a convention as a fundamental human right.
Article 8 of the European Convention on Human Rights (1950) states:
(1) Everyone has the right to respect for his private and family life, his home and his correspondence.
(2) There shall be no interference by a public authority with the exercise of this right except such as is in accordance with the law and is necessary in a democratic society in the interests of national security, public safety or the economic well-being of the country, for the prevention of disorder or crime, for the protection of health or morals, or for the protection of the rights and freedoms of others.
To be sure, Article 12 of the Universal Declaration of Human Rights (1948), which precedes the European Convention, has similar words:
No one shall be subjected to arbitrary interference with his privacy, family, home or correspondence, nor to attacks upon his honor and reputation. Everyone has the right to the protection of the law against such interference or attacks.
In fact, the similarity in the words of the two clauses suggests that the European Convention copied from the Universal Declaration. However, the legal force of the European Convention is much stronger. The Universal Declaration might be seen as an “idealistic ideal”: there is no enforcement mechanism beyond diplomatic and international pressure. The European Convention, by contrast, sets out a “realistic ideal” by backing its articles with the enforcement mechanism of the European Court of Human Rights.
Under the European Convention, classification as a fundamental human right entails obligations over and above those of normal statutory law. A law that violates the Convention can be struck down, and the government forced to enact a new law or compelled to pay damages to the injured victim.
The US does value privacy, but the courts have read the right to privacy as implied: there is no mention of the word “privacy” or the phrase “protection of privacy” in the US Constitution. US Supreme Court Justice Louis Brandeis, in a dissenting judgment, characterized privacy as “the most comprehensive of rights, and the right most valued by civilized men” (Olmstead v. US 1928:478). Brandeis was a dissenting voice in that case, but he was proven right when Olmstead was overturned in Katz v. US (389 US 347 (1967)). And in 1965, Griswold v. Connecticut (381 U.S. 479 (1965)) recognized a limited right to privacy as a constitutional guarantee. The Court found the right to privacy implied in the Constitution, in Amendments including the Fourth, Ninth and Tenth.
(The Fourth Amendment states that “(t)he right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures, shall not be violated, and no warrants shall issue, but upon probable cause, supported by oath or affirmation, and particularly describing the place to be searched, and the persons or things to be seized.” The Ninth Amendment states that “(t)he enumeration in the Constitution, of certain rights, shall not be construed to deny or disparage others retained by the people.” Without this amendment, a rule of legal interpretation could have read the listing of rights to imply that the list was exhaustive. The Tenth Amendment states that “(p)owers not delegated to the United States by the Constitution, nor prohibited by it to the states, are reserved to the states respectively or to the people.”)
So while both approaches do enshrine privacy as a fundamental human right, the US approach of treating privacy as an implied right creates ambiguity, which contributes to the problem of regulating privacy. Technically speaking therefore, the right to privacy in the US is weaker than that in Europe although in practice, more recent data suggests that that may not be the case. This is discussed later.
The European approach creates stronger privacy protection for another reason: to guarantee a fundamental human right requires the force of law. In the US, there are privacy laws but these are “fragmented, incomplete and discontinuous” (Gellman, 1993).
Among the acts that affect privacy are the following: the Bank Secrecy Act, Cable TV Privacy Act of 1984, Electronic Communications Privacy Act, Fair Credit Reporting Act, Children’s Online Privacy Protection Act, Consumer Credit Reporting Reform Act, Driver’s Privacy Protection Act, Electronic Funds Transfer Act, Electronic Signatures in Global Commerce Act, Family Educational Right to Privacy Act, Financial Services Modernization Act, Freedom of Information Act, Privacy Act of 1974, Right to Financial Privacy Act, Social Security Number Confidentiality Act, Telemarketing and Consumer Fraud Act, Telecommunications Act, and Video Privacy Protection Act of 1988. The list demonstrates just how fragmented the laws are in another way: the website Privacy Exchange (www.privacyexchange.org), which is generally touted as an authoritative site for materials about privacy, did not list all of the acts above.
The laws that have been passed in the USA are sectoral, generally aimed at the public sector, or else targeted at preventing specific abuses, and usually after they have occurred. The Driver’s Privacy Protection Act, for example, was passed after an actress was stalked and killed through disclosure of her driver’s license details. Apart from a few sectoral acts, the private sector is governed by voluntary self-regulatory codes.
Such industry self-regulation, by definition, cannot be comprehensive. The regulations, after all, affect only the industry. Such a sector-specific approach, when applied to an all-embracing concept such as privacy, inevitably creates gaps in the law; that is, there will be areas to which privacy rules do not apply. This is a major shortcoming of self-regulation.
It should be noted that the presence of law does not exclude self-regulation. And indeed, European industry has many self-regulatory privacy codes (Mogg, 1998). But to be meaningful, these codes must cover some area not covered by law or else set a higher standard of protection.
So which mode of regulation is better? There is a whole ocean between the two parties.
The argument for direct government legislation is that it increases consumer confidence and therefore increases commerce. But those supporting less regulation argue that regulations interfere with the working of the free market. Both arguments in the abstract are valid. But experience informs us that there are rules and there are rules. Some rules do increase commerce without interfering with the workings of the free market. So which rules do enhance business? The general conclusion is that “clear definition and assignment of rights, efficient rules for making and enforcing contracts, and good laws that reduce the need for consumers to protect themselves will all help commerce. Privacy laws are seen as falling into the third category of reducing the need for consumers’ self-protection” (Swire and Litan, 1998:86).
What is the right mix of approaches in protecting privacy? The candidates are the market mechanism, technology, industry self-regulation and government regulation (Swire, 1997). Each approach has shortcomings.
The market mechanism approach is premised on the notion of privacy as a negotiable item, such as property, instead of an inalienable right. A willing consumer trades some aspect of privacy as part of the transaction for goods or service. Not everyone agrees with this and for good reason: if privacy is a right, it should not be the consumer but those who collect and use the data who should bear the responsibility for maintaining privacy.
Nevertheless, a system that has been set up around this notion is the Platform for Privacy Preferences, or P3P. It is backed by the Online Privacy Alliance, a consortium of almost 50 American organizations including the US White House, Microsoft, America Online and the Center for Democracy and Technology.
P3P uses a protocol developed by the World Wide Web Consortium (W3C) called the Platform for Internet Content Selection (PICS) (Garfinkel, 2000). As conceived, PICS was designed to carry labels describing a site’s content to users. A pornography site, for example, would declare that it carried pornography, and browsers set to block such content would then exclude it.
Under the PICS protocol, the website owner states the privacy level of the site, and the user sets a privacy preference in the web browser. If the privacy level of the site matches the preference set in the browser, the user accesses the site transparently. But if the site’s privacy is set at a lower level than the user prefers, a window pops up and asks whether the user is prepared to sacrifice some privacy for the service.
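The matching step can be sketched in a few lines. This is an illustrative simplification: real P3P policies are structured documents matched by the browser, and the numeric privacy levels below are this author's invention, not part of any specification.

```python
# Illustrative sketch of P3P-style preference matching.
# The numeric "privacy levels" are a simplification for illustration
# only; higher numbers mean stronger privacy protection.

def check_site(site_level: int, user_preference: int) -> str:
    """Compare a site's declared privacy level with the user's preference."""
    if site_level >= user_preference:
        # The site meets or exceeds the preference: access transparently.
        return "allow"
    # The site offers less privacy than the user wants: prompt the user
    # to decide whether to trade some privacy for the service.
    return "prompt"

print(check_site(site_level=3, user_preference=2))  # site is stricter
print(check_site(site_level=1, user_preference=2))  # site falls short
```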
In theory, this is an appealing concept, as it bypasses one of the thorny problems with privacy: the question of definition, which varies between individuals. P3P would therefore allow buyer and seller to negotiate an agreeable level of privacy.
There has been much criticism of P3P as a means of securing privacy protection for the consumer (Clarke, 1998; Coyle, 1999; EPIC, 2000). A serious shortcoming is that there are no sanctions. It is, in fact, a self-management system, not a self-regulatory system; that is, the system assumes that the company will abide by the rules it has negotiated with its customer. It is a basic tenet in developing a code of practice that it should not pretend to deliver more than it does (Ang, 1999), because when the truth emerges, the consumer is left in a spiteful mood (Falk et al., 2000). It is here that P3P falls gravely short. Although it does not aim to mislead, its very nature seems to suggest to the user that privacy will be protected on a site that uses the technology, when in fact the protection is very skimpy. Indeed this is the major criticism of P3P by the European Union: it could mislead even its users, i.e. website owners, into believing that they have discharged their legal obligation of privacy protection (Hustinx, 1998).
However, a supporter of P3P, the Center for Democracy and Technology has clarified in a report that “P3P cannot protect the privacy of users in jurisdictions with insufficient data privacy laws” and “cannot ensure that companies follow privacy policies” (Mulligan, 2000). What P3P then hopes to achieve is accountability through transparency. The technology makes it easier to locate and compare privacy policies. The ease of comparison would then drive the industry towards standardization.
The argument would parallel that of car reviews: as reviewers raise the same issues of acceleration, safety and space, car manufacturers would feel compelled to improve on those areas in order to get a good review for their products. Unlike cars, however, it is difficult to tell if the site owner has complied with the self-made rules. Just as violators are not visibly sanctioned, so the rule-abiders are not visibly rewarded either. As of December 15, 2000, there were fewer than 50 sites that were using P3P (W3C, 2000).
Even if there were a critical mass of sites using P3P, consumers would still be uncertain of the quality of privacy protection in the sites. A study by Akerlof (1970) on quality uncertainty suggests that there could emerge “a market for lemons” or poor quality goods that would eventually drive out high-quality producers and, in the extreme, destroy the market.
Here is how Akerlof’s analysis would apply to the web. A website’s privacy practices are marked by asymmetric information: consumers cannot judge their quality. Because consumers cannot observe the quality of the service, no consumer will accept paying a price higher than “normal”. Producers with a high-quality service, such as a good standard of privacy protection, would therefore not be willing to sell at that price.
Akerlof concludes that asymmetric information reduces (1) the volume of transactions and (2) the average quality of goods and services exchanged. Such a situation where high quality goods and services are penalized is known as “adverse selection”.
As a result of asymmetric information, information providers are not able to signal quality differences to the buyers. Sellers of low quality goods and services are not punished by the market because the market does not know that these are low-quality producers. The market price will reflect only the average quality level and therefore attract average quality sellers. This leads to a perception by the consumer of a reduction in the average quality, which leads to a further reduction of market price and another round of lowering of average quality. This process is cumulative and in the extreme, destroys the market altogether.
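The cumulative unraveling described above can be made concrete with a small simulation. The assumptions are stylized and this author's, not Akerlof's exact model: seller quality is uniform on [0, 1], each seller will sell only at a price at or above its quality, and buyers offer the average quality of the sellers still willing to trade.

```python
# A stylized illustration of Akerlof's "market for lemons" unraveling.
# Assumptions (mine, not Akerlof's exact model): seller quality is
# uniform on [0, 1], a seller's reservation price equals its quality,
# and buyers offer the average quality of the sellers still in the market.

def unravel(rounds: int = 10) -> list:
    """Return the price offered in each round as the market unravels."""
    price = 1.0          # buyers start by offering the top price
    history = []
    for _ in range(rounds):
        # Only sellers with quality at or below the price remain, so the
        # average quality of the remaining pool is price / 2, and the
        # rational buyer's next offer drops to match it.
        price = price / 2
        history.append(price)
    return history

prices = unravel()
print(prices)  # the offered price collapses toward zero
```

Each round of lowered offers drives out the best remaining sellers, which lowers average quality again: exactly the cumulative process that, in the extreme, destroys the market.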
In sum, for the market mechanism, indeed any regulatory mechanism, to be credible, there must be some visible sanction of wrongdoers.
As long as self-regulation is accepted as a possible means of protecting privacy, there will be continuing efforts to use technology to address the issue, especially in the online world. Some are quite ingenious. Two companies are attempting to minimize the privacy issue through anonymity and disguise. Privada (www.privada.net) provides a system in which users can conduct all their Internet transactions, including email, Web browsing, online chats and e-commerce, anonymously. Zero Knowledge Systems (www.freedom.net) uses sophisticated encryption software called “Freedom” to disguise the identity of an Internet user behind up to five pseudonyms. Both methods assume that the companies will be ethically rigorous in protecting their users’ privacy.
More recently, a Dutch-led consortium announced that they had been awarded a three-year 3.2 million euro contract from the EC and Netherlands’ Ministry of Economic Affairs to create a Privacy Incorporated Software Agent (PISA) that would meet the requirements of the EU data protection directives. PISA plans to develop a privacy enhancing technology (PET) architecture by 2002 and to develop a test version of the program by 2004 (TNO-FEL, 2001). Past experience with such technologies suggests that PISA would more likely play a role supplementary to the law (Burket, 1997:136).
The use of technology to aid privacy, however, is a ding-dong battle. The most recent development is the “web bug”: a small graphic file with the ability to send information about a website visitor to a separate server (Smith, n.d.). Undoubtedly, new rules and new technology will emerge to disclose or defeat these web bugs.
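Conceptually, a web bug is just an invisible one-pixel image whose URL carries tracking parameters to a third-party server. A hedged sketch follows; the domain and parameter names are invented for illustration.

```python
# Hypothetical sketch of how a "web bug" works: a 1x1 transparent image
# hosted on a third-party server, with tracking data encoded in its URL.
# The domain and parameter names are invented for illustration.
from urllib.parse import urlencode

def web_bug_tag(page: str, visitor_id: str) -> str:
    """Build the HTML for an invisible tracking image."""
    params = urlencode({"page": page, "uid": visitor_id})
    # When the browser fetches this image, the request itself tells
    # tracker.example which visitor viewed which page, even though the
    # visitor sees nothing on screen.
    return (f'<img src="https://tracker.example/pixel.gif?{params}" '
            f'width="1" height="1">')

print(web_bug_tag("/products", "abc123"))
```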
In the meantime, however, one is left with the two main modes of privacy protection. The result so far is that self-regulation is not working well.
There are advantages to self-regulation. For a fast-moving industry such as the Internet, the less formal processes of self-regulation make it more flexible and therefore less likely to stifle innovation or excessively limit consumer choice.
Further, for such a technical business, it is industry that is best able to control quality and recognize low standards. It knows the best way to guarantee quality and the efficacy of potential courses of action, and it has access, at the lowest cost, to the information needed for action.
Third, because industry bears the cost of regulation, it has incentives to keep enforcement and compliance costs down.
But there are major disadvantages to self-regulation. Self-regulation is perceived as never going against the interests of the (self-)regulator. Having been involved for several years in a self-regulatory regime in advertising, this author can state that this is not quite the case: the author has voted for business, just as business has voted against its own interests and for the consumer. Nevertheless, the perception is difficult to shake.
The most significant disadvantage is the lack of incentives to control and enforce standards. Self-regulation works in the medical and legal professions because members enter a high-income, homogeneous group through compulsory membership (van Den Bergh, 1999).
Self-regulation seems to work well as long as the group of agents exerting the power is relatively small and cohesive. Various studies for different reasons come to the same conclusion: self-regulation is more difficult and less effective when it involves a large and heterogeneous group of agents (Scarpa, 1999:254). This heterogeneity may be the critical reason that makes self-regulation of Internet privacy difficult if not impossible. The easiest solution to heterogeneity is to set the lowest standard. But that in itself is problematic as the EU has set a higher standard.
Australia and the United Kingdom, which have relied heavily on self-regulation, are heading more towards direct legislation. In part, this reflects the zeitgeist of greater competition, and self-regulation tends to conflict with competition.
In practice, the compliance record has not been sterling. In May 1999, a survey found that while almost 66 percent of the busiest sites (a random sample of 300 out of the 7,500 busiest, presumably American, sites as at January 1999) had a privacy policy, only 9.5 percent had standards that met those the FTC had called for a year earlier (Culnan, 1999). In 1998, the US FTC had identified four widely accepted fair information practices as an essential part of any governmental or self-regulatory privacy regime: 1) notice, displaying a clear and conspicuous privacy policy; 2) choice, allowing consumers to control the dissemination of information they provide to a site; 3) access, opening up the consumer’s personal information file for inspection; and 4) security, protecting the information collected from consumers (US FTC, 1998). Based on the Georgetown report, the FTC in 1999 issued a statement allowing more time for self-regulation of privacy on the Internet (US FTC, 1999).
In 2000, the FTC conducted several studies that found shortcomings in postings and standards. It did acknowledge improvements in the posting of privacy policies (88 percent, up from 66 percent) and in compliance with the four fair information practices (20 percent, up from less than 10 percent), but it still found the latter percentage low. Another FTC survey found that only 8 percent of sites in a random sample, and 45 percent of the busiest sites, displayed a privacy seal. Evidently displaying impatience with industry for dragging its feet on the issue (the second paragraph of the 2000 report begins: “The Federal Trade Commission has been studying online privacy issues since 1995.”), the Commission urged Congress to pass legislation “to ensure adequate protection of consumer privacy online” (US FTC, 2000). It should be noted that there were dissenting views in the report.
The three best-known privacy seals on the web are TRUSTe, BBBOnline and WebTrust, all based in the US. As of November 2000, they had attracted only small memberships: 1,900, 680 and 2 respectively. Only a quarter of the top 100 e-commerce sites subscribe to the seals; notable holdouts include Amazon.com and BarnesandNoble.com (Sanders, 2000).
Apart from procedural difficulties in filing a complaint against a website, TRUSTe has not once in more than three years removed a seal for a privacy violation (Hunter, 2000), although there have been orders to remove seals from those who have not renewed payment of fees. An email from the author asking how many members it had, and how many had had their seals revoked, was not answered.
TRUSTe’s difficulties are highlighted by the cases of RealNetworks and AOL. In AOL’s case, a complaint was filed about the company passing information to third parties. AOL’s answer was that the TRUSTe seal applied only to the www.aol.com site, not the members.aol.com site (Hunter, 2000).
In the case of RealNetworks, the company’s software, RealJukebox, surreptitiously monitored and collected data about the listening habits and some other activities of its users (Robinson, 1999). The company apologized but was never punished by TRUSTe, because this activity was not covered by the terms of the TRUSTe seal.
A self-regulatory regime without enforcement is dangerous. The assumption that a privacy notice is better than no notice is fallacious, because consumers who are misled become disgruntled and angry. A recent study using a large number of game-theory experiments concluded that there are two major motivating forces that drive consumers to seek sanctions. The first arises where the consumer feels that the fairness principle has been violated (and just what is fair, and whether the fairness principle even applies, depends on the circumstance). The second is, surprisingly enough, spite. The consumer was not driven by “strategic sanctions that are imposed to create future material benefits” (Falk et al., 2000). This finding corroborates a survey on privacy by The Pew Research Center, which found that users were in a “punishing mood” (2000).
Given the publicity surrounding the failures of the seals and the US FTC’s announcement, it would appear that the US is ready to legislate new privacy laws. But much will depend on the new Bush administration. The US privacy rules are, in that sense, in a transitory state.
Also in transition are the EU data protection provisions. In March 2000, the EU and the US agreed on “safe harbor” provisions under which companies that subscribe to the safe harbor principles would be presumed to have adequate protection for the purposes of the EU data protection directive (Clausing, 2000). The provisions impose strict guidelines on the collection of data from EU (not American or other non-EU) customers. At the time of writing, 16 organizations had subscribed to the principles (US Department of Commerce, n.d.). Among them is TRUSTe, which presumably aims to bring its members into the harbor as well.
A further concern is the cost of complying with government regulation. No study has yet compared the costs of self-regulation with those of government legislation. The US Children’s Online Privacy Protection Act, which came into effect in April 2000, has forced some sites to close the sections that cater to children because of compliance costs (Net privacy law costs children’s sites, 2000).
This author is of the view that those costs will be noted by the EU and in fact may be the very reason for the slow pace of application of the data protection directive as well as the creation of the safe harbor principles. The aim of privacy is to protect the end-user and thereby encourage greater activity on the Net. If the costs of compliance are so onerous that websites are forced to close part or all of the site, then that aim is defeated. The EU is therefore unlikely to compel EU-only data protection rules as this may harm its own Internet industry relative to that of the rest of the world.
As if the situation were not complex enough, just before this article was sent, Consumers International released a report that criticized both US and EU websites for falling “woefully short of international standards on data protection.” It added, “Despite tight European Union (EU) legislation, sites within the EU are no better at providing decent information to their customers than sites based in the US. Indeed some of the best privacy policies are to be found on US sites” (Consumers International, 2001, p. 5).
Given this state of flux, what should website owners do? Some regulation is necessary, and even helpful. It is in the interest of neither business nor the consumer to rely on cloaking software to bypass privacy concerns: the business loses information about the consumer, the costs of acquiring and serving the customer increase, and those costs are eventually passed back to the consumer. It is therefore clearly in the interests of both parties to have privacy standards both can agree on, but ones that, importantly, also punish recalcitrant violators.
In the meantime, websites in the US and around the world have little choice but to present themselves to the world at least with the appearance of being responsible. First, sites should collect the minimum data necessary.
Going by the US FTC’s fair information practice principles as well as sensible business practice, where any data need to be collected, there must be a privacy policy notice posted in a prominent location. It is beyond the scope of this paper to discuss the particulars of the policy; there are samples on other sites and templates to work these out.
Next, users must have the choice to opt in rather than opt out (Strover and Straubhaar, 2000). eBay’s attempt to get people to opt out (Delio, 2001) misreads consumers’ preferences. Most consumers do want to receive the material a website may send, and so they end up selecting the “Yes” button anyway. But they also want the freedom to say so; they do not want the default set at “Yes”, even though most of them will later click it.
The FTC’s fair information practice principle also requires that sites allow users to see and correct their personal data. This may entail some software cost for small sites, some of which in this author’s experience merely collect email addresses into a mailing list.
Finally, sites need to have sufficient security. When the US Social Security Administration put online a system to check on the benefits due to an individual, it did not have sufficiently strong protection against unauthorized access. As a result, it had to close down the facility (Pear, 1997).
In the end, however, much depends on both a genuine desire to protect the consumer and plain common sense. This author had the unpleasant experience of a wave of unsubscriptions from an e-newsletter when an untrained person used the TO header instead of BCC to send out that month’s mailing. Those who cancelled were upset that their email addresses had been inadvertently released to everyone else on the list.
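The mistake is worth illustrating because it is so easy to avoid: the subscriber list belongs in the mail server's envelope, not in the visible TO header. A minimal sketch using Python's standard email library, with hypothetical addresses and server:

```python
# Sketch of sending a newsletter without exposing the recipient list.
# The addresses and SMTP host are hypothetical. The key point: the
# subscriber list is passed to the mail server as envelope recipients,
# never placed in the To header that every recipient can read.
from email.message import EmailMessage

subscribers = ["alice@example.org", "bob@example.org"]

msg = EmailMessage()
msg["From"] = "newsletter@example.com"
msg["To"] = "newsletter@example.com"   # a neutral address, NOT the list
msg["Subject"] = "Monthly newsletter"
msg.set_content("This month's news ...")

# With smtplib, the real recipients are then supplied separately:
#   import smtplib
#   with smtplib.SMTP("mail.example.com") as server:
#       server.send_message(msg, to_addrs=subscribers)
# Each subscriber receives the mail, but the headers never reveal
# the other addresses.
print(msg["To"])  # the subscriber list is absent from the headers
```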
This is an unsettling time for website owners in the area of privacy. It is clear that consumers want more assurance than an unenforceable seal can provide. The self-regulatory movement in the US has not come through with flying colors. To be sure, there has been some progress in awareness of the need to protect consumer privacy, but that awareness remains at a rather low level. To boost the confidence of consumers, much more needs to be done.
This article has attempted to review the state of web privacy up to the present moment. In sum, there is great pressure toward government regulation. There will be resistance to this, but the surest way to assure the consumer is regulation. Uncomfortable as that may seem, regulation that is sufficiently light-handed, yet carries genuine sanctions against delinquent sites, could be the next step up for the web.
Akerlof, G., (1970), The Market for “Lemons”: Quality Uncertainty and the Market Mechanism, Quarterly Journal of Economics, 84:488-500.
Ang, P. H., (1999), ISP Self-Regulation: Why and Why Not? Paper presented at the International Child Pornography Conference, September, Austria.
Burkert, H., (1997), “Privacy-Enhancing Technologies: Typology, Critique, Vision”, in Technology and Privacy: The New Landscape, P. Agre and M. Rotenberg, eds., Cambridge, MA: MIT Press, 125-142.
Clarke, R., (1998, August), Platform for Privacy Preferences: A Critique. Privacy Law & Policy Reporter, 5(3), 46-48, (accessed on 1/31/2001).
Clausing, Jeri, (2000), Europe and U.S. Reach Data Privacy Pact, New York Times, March 15.
Consumers International, (2001, January), Privacy@Net: An International Comparative Study of Consumer Privacy On The Internet, <http://www.consumersinternational.org/news/pressreleases/fprivreport.pdf> (accessed on 1/31/2001).
Coyle, K., (1999, June), P3P: Pretty Poor Privacy? A Social Analysis of the Platform for Privacy Preferences, (accessed on 1/31/2001).
Culnan, Mary, (1999), Georgetown Internet Privacy Policy Survey, (accessed on 1/31/2001).
Delio, Michelle, (2001), EBay E-mail Makes Users “Bidder”, Wired News, (accessed on 1/29/2001).
Electronic Privacy Information Center, (2000, June), Pretty Poor Privacy: An Assessment of P3P and Internet Privacy, (accessed on 1/31/2001).
Falk, A., Fehr, E. and Fischbacher, U., (2000), Informal Sanctions, Institute for Empirical Research in Economics, University of Zurich, Working Paper No. 59, September, <http://papers.ssrn.com/sol3/papers.cfm?abstract_id=245568> (accessed on 1/31/2001).
Garfinkel, S., (2000, July 11), Can a Labeling System Protect Your Privacy?, (accessed on 1/31/2001).
Gellman, R., (1993), Fragmented, Incomplete, and Discontinuous: The Failure of Federal Privacy Regulatory Proposals and Institutions. Software Law Journal, IV, 199.
_____, (1997), Does Privacy Law Work?, in Technology and Privacy: The New Landscape, P. Agre and M. Rotenberg, eds., Cambridge, MA: MIT Press, 193-218.
Hunter, Christopher, (2000, February), Recoding the Architecture of Cyberspace Privacy: Why Self-Regulation and Technology Are Not Enough, (accessed on 1/31/2001).
Hustinx, P.J., (1998, June 16), Platform for Privacy Preferences (P3P) and the Open Profiling Standard (OPS): Draft Opinion of the Working Party, (accessed on 1/31/2001).
Lee, J. H., (2001), 95.7 percent of Netizens Fear Personal Information Leakage, January 22, (accessed on 1/31/2001).
Mulligan, D., (2000), P3P and Privacy: An Update for the Privacy Community, March 28,< http://www.cdt.org/privacy/pet/p3pprivacy.shtml> (accessed on 1/31/2001).
Mogg, J., (1998), Data Protection and Privacy, Internal Market and Financial Services, The European Commission, Speech made to the European-American Business Council, March 18, <http://www.eurunion.org/news/speeches/1998/980318jm.htm> (accessed on 1/31/2001).
Net Privacy Law Costs Children’s Sites, (2000), USA Today, September 14, (accessed on 1/31/2001).
Olmstead v. United States, 277 U.S. 438, 478 (1928) (Brandeis, J., dissenting).
Pew Research Center, (2000), Trust and Privacy Online: Why Americans Want to Rewrite the Rules, (accessed on 1/31/2001).
Pear, R., (1997), Social Security Closes On-Line Site, Citing Risks to Privacy, New York Times, April 10, A15.
Robinson, Sara, (1999), CD Software Said to Gather Data on Users, New York Times, November 1.
Sanders, Edmund, (2000), Privacy Certification Earning Seal Of Disapproval, Los Angeles Times, November 16, <http://chicagotribune.com/tech/economy/article/0,2669,2-48015,FF.html> (accessed on 1/31/2001).
Scarpa, Carlo, (1999), “The Theory of Quality Regulation and Self-Regulation”, in Bernardo Bortolotti and Gianluca Fiorentini, eds., Organized Interests and Self-Regulation: An Economic Approach, Oxford: Oxford University Press, 236-260.
Smith, Richard, (n.d.), FAQ: Web Bugs, (accessed on 1/31/2001).
Strover, S. and Straubhaar, J., (2000), E-Government Services and Computer and Internet Use in Texas, Telecommunications and Information Policy Institute, University of Texas at Austin. Commissioned by the Electronic Government Task Force, June, (accessed on 1/31/2001).
Swire, P., (1997), Privacy and Self-Regulation in the Information Age. US Department of Commerce, (accessed on 1/31/2001).
Swire, P. and Robert Litan, (1998), None of Your Business: World Data Flows, Electronic Commerce, and the European Privacy Directive, Washington DC: The Brookings Institution.
TNO-FEL, (2001, January 17), Fast and Safe Internet Work with PISA, (accessed on 1/31/2001).
US Department of Commerce. (n.d.), Safe Harbor List, (accessed on 1/31/2001).
US Federal Trade Commission, (1998), Privacy Online: A Report to Congress, (accessed on 1/31/2001).
_____, (1999), Self-Regulation and Privacy, July, (accessed on 1/31/2001).
_____, (2000), Privacy Online: Fair Information Practices In The Electronic Marketplace, A Report To Congress, May, (accessed on 1/31/2001).
van den Bergh, Roger, (1999), “Self-Regulation of the Medical and Legal Professions”, in Bernardo Bortolotti and Gianluca Fiorentini, eds., Organized Interests and Self-Regulation: An Economic Approach, Oxford: Oxford University Press, 89-130.
W3C, (2000), P3P Enabled Web Sites, (accessed on 1/31/2001).
Westin, Alan, (1967), Privacy and Freedom, New York : Atheneum.
Peng Hwa Ang is Vice-Dean, School of Communication Studies, Nanyang Technological University, Singapore. His research interests are in Internet law and policy. The author wishes to thank the Programme in Comparative Media Law and Policy, Centre for Socio-Legal Studies, University of Oxford, for the fellowship in January and February 2001 that made the research and writing possible.