American Bar Association
Forum on Communications Law
Internet Privacy and Free Expression: New Media for the New Millennium
by KURT WIMMER
Protecting personal privacy is fast becoming the most pressing communications policy issue of the new decade. Massive amounts of personal data now can be collected, manipulated, and sent over the Internet with a keystroke. As digital media become more comprehensive, pervasive, and complex, Americans are left wondering how to protect their private lives from public exposure.1 A new vocabulary for the collection of personal information has been created, including such terms as data mining and online profiling.
The public policy debate has seldom, if ever, focused on the other side of the equation: Legal actions taken to restrict access and dissemination of information to protect personal privacy may also restrict free expression. Freedom in the online world can be a zero-sum game: increased protection for privacy can mean diminished protection for free expression. Threats to free speech in the digital age will inevitably require more subtle analysis than in the past, and vigilance will be required both to keep media companies in compliance with new laws and to monitor potential First Amendment incursions.
The current Supreme Court appears to be moving in the direction of increased reliance on balancing in First Amendment cases. Instead of using the traditional "categorical" form of analysis, the Court in recent years has begun to use a weighing of relevant considerations, particularly in cases involving the newer electronic media (cable TV and the Internet) or multiple speech interests. It is too early to tell whether that trend will be reversed or expanded to encompass other types of First Amendment cases. Indeed, there are several Justices who are seeking to move First Amendment jurisprudence in the opposite direction. But the trend now seems sufficiently clear that everyone concerned with First Amendment issues should at least think about the implications of a balancing approach as opposed to a categorical one. Moreover, what is perhaps most interesting about current trends is that the Justices generally considered most "liberal" seem to be the least inclined to maintain adherence to traditional categories of analysis.
These new issues come from a variety of sources. This article focuses briefly on each of these areas and discusses the increasingly difficult intersection between the right to speak freely and the right to maintain personal privacy:
- New laws and regulations designed to protect children's privacy on the Internet require new policing mechanisms by Web sites and may force some sites to close bulletin boards and chat rooms.
- Federal Trade Commission (FTC) actions enforcing privately drafted Web site "privacy policies" under deceptive trade practice laws will have a broad impact on media and e-commerce sites, as will private lawsuits against Web site operators alleging privacy violations.
- New privacy litigation challenging the collection of information from Web site visitors, a practice known as "online profiling," will become widespread.
- Decisions by member states of the European Union (EU) to restrict data flow to countries, including the United States, that do not have "adequate safeguards" for personal privacy could interrupt the flow of information or require companies to adopt significant new safeguards for consumer privacy.
- First Amendment challenges to government regulations stopping the publication of encryption software, which could facilitate private Internet data transmission, may result in increased flexibility to encrypt private Internet conversations.
Internet Privacy and Children
No single group is more vulnerable to privacy violations than children. Recent studies note that some two-thirds of all children have used the Internet, and that some 92 percent of the more than sixty-four million Internet users in the United States were "concerned" about protection of their privacy on the Internet.2 Industry has responded to the increasing presence of children on the Internet by marketing software designed to "filter" inappropriate Internet sites and prevent children from disclosing personal information. In Washington policy circles, however, these efforts have been seen as inadequate.
In 1998, Congress passed the Children's Online Privacy Protection Act (COPPA).3 COPPA has two broad requirements: notice of privacy policies toward children and parental consent. It requires that Web sites or online services (such as America Online) that collect personal information from children provide effective notice on the site about the information that is collected and the uses to which the operator puts that information.4 It also provides that Web site operators must obtain "verifiable parental consent" for the collection of personally identifiable information from a child.5 "Personal information" is defined broadly to include any information that "permits the physical or online contacting of the child," including first or last name, e-mail address, telephone number, or physical address, and other information that may permit a child to be located.6
The statute, which takes effect in April 2000, is not self-effectuating. Rather, the FTC must adopt rules that govern COPPA's implementation. In October 1999, the FTC adopted rules that require any operator of a Web site or online service directed at children, or any operator that has actual knowledge that it is collecting or maintaining personal information from a child, to comply with certain notice and consent requirements.
The FTC's rules7 (available online at www.ftc.gov) establish that COPPA will apply not only to children's Web sites such as Nickelodeon or Disney, but also to general-purpose Web sites, if the FTC finds that any portion of the Web site is directed at children. This assessment is to be based on a subjective interpretation of the "overall character" of the Web site, including the site's visual or audio content and whether it uses animated characters or contains child-oriented activities.
Moreover, any other site that has gained actual knowledge that a child has used its site will be deemed to be on notice that it may have collected a child's personal information. The FTC noted that it would examine closely those Web sites that do not ask users to reveal age but do ask for related information that could give operators knowledge that they are dealing with a child. If a general-purpose Web site gains actual knowledge that it has collected personal information from even one child, it will be required to redesign its entire information-gathering procedures to comply with COPPA.
Perhaps most importantly for media Web sites, the FTC rules also make it clear that COPPA will apply not only to Web sites that actually seek information from children, but also to those sites where children themselves can supply personally identifiable information. For example, if a child in a child-oriented chat room posts identifying information in a real-time chat, the Web site operator may be deemed to have "collected" that personal information from the child. The net effect of such a ruling, many groups believe, will be to require any child-oriented chat rooms and bulletin boards to either close down or adopt twenty-four-hour monitoring in order to immediately remove any personal information posted by children. The American Library Association, the American Civil Liberties Union, and the Center for Democracy and Technology have urged a more workable reading of this rule. Under one proposed interpretation, an operator of a Web site that does not seek to collect personal information but merely provides a bulletin board, personal home page, or other communications services available to children would not be deemed to be "collecting" information within the meaning of COPPA. Industry groups have been discussing these issues with the FTC staff, but no resolution of this issue appears on the horizon as the law's effective date nears.
If COPPA is found to apply to a Web site or online service, the service is first required to post a "clear, prominent, and understandable" privacy notice. The FTC detailed in its rules the information that must be included in such a notice, including what data are collected, how they will be used, and to whom they may be disclosed. Second, the service must obtain "verifiable parental consent" for the collection, use, or disclosure of personal information from children. COPPA defines "verifiable parental consent" as "any reasonable effort (taking into consideration available technology)" to ensure parental authorization. The FTC listed several ways that operators could accomplish this consent, including requiring a parent to mail a consent form to the operator or to call a toll-free telephone number. The FTC will permit consent by e-mail until April 2002, provided that this method is accompanied by "additional steps" to provide assurance that the parent is providing the consent. Third, the service must provide opportunities for parents to review and delete their child's personal information, must limit collection of personal information to that necessary for the activity, and must protect the integrity of the data collected.
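Viewed schematically, a covered operator's obligations under the rules reduce to a short checklist. The sketch below is an illustrative summary only; the function and field names are invented for this article and are not drawn from the FTC rules:

```python
# Hypothetical checklist of the COPPA obligations described above.
# All field names are illustrative, not terms from the FTC rules.
def coppa_gaps(site: dict) -> list:
    """Return the COPPA obligations a covered site has not yet met."""
    gaps = []
    if not site.get("posts_privacy_notice"):
        gaps.append("post a clear, prominent, and understandable privacy notice")
    if not site.get("has_verifiable_parental_consent"):
        gaps.append("obtain verifiable parental consent before collecting data")
    if not site.get("lets_parents_review_and_delete"):
        gaps.append("let parents review and delete their child's information")
    if not site.get("limits_collection_to_activity"):
        gaps.append("collect only the information necessary for the activity")
    if not site.get("protects_data_integrity"):
        gaps.append("protect the integrity of the data collected")
    return gaps

# A site with only a posted notice still has several obligations outstanding.
site = {"posts_privacy_notice": True}
print(coppa_gaps(site))
```

Whether a given site is "covered" in the first place turns on the directed-at-children and actual-knowledge tests discussed above, which no checklist can mechanize.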
These provisions undoubtedly will prevent many children from unwittingly providing personally identifiable information that could put them in danger. It is difficult to take issue with this legitimate governmental objective, or with the good faith of both Congress and the FTC in passing and implementing the statute. Yet, certain interpretations of COPPA, such as the FTC determination that a Web site has "collected" personal information from a child when the child has posted that information in a chat room or bulletin board, threaten to force some Web sites that undoubtedly provide important outlets for free expression to adopt extensive (and expensive) monitoring programs or to close up shop. Such a result bears significant First Amendment implications.
Web Site Privacy Policies
There currently is no federal or state law that requires Web site operators to post "privacy policies" governing personal data collected from adults. These policy statements provide Web site users with notice of the type of information collected by the site, who will have access to that information, how information will be used and can be corrected, and whether third parties unrelated to the Web site operator will have access to that information. However, consumer privacy concerns and congressional promises to take actions if the Internet industry does not adequately regulate itself have led to an industry practice of posting privacy policies on the majority of popular Web sites.
The FTC takes the position that a Web site's failure to honor its posted privacy policy is a deceptive trade practice, and it settled its first Internet privacy case on that theory in 1998.8 The FTC continued to apply that principle in 1999, finding that the Young Investors Web site maintained by Liberty Financial of Boston falsely promised Internet users that information collected from them would be held anonymously.9 In fact, the site asked children to answer survey questions about allowance amounts, gifts of stocks and bonds, and other financial matters "anonymously," but maintained the information in a way that linked it with each child who responded. The FTC and Liberty Financial entered into a consent decree based on the terms of COPPA. Robert Pitofsky, FTC chairman, cited the decree as another example of the FTC's efforts to protect online privacy and said that the FTC is "committed to pursuing law enforcement actions in appropriate cases."10
Online Profiling
The practice of "online profiling" involves the recording of online behavior for the production of tailored advertising or other content. For example, many (if not most) Web sites leave "cookies" on users' computers. These cookies are small text files from which a site can garner information. For example, the next time a visitor arrives at a site, the site will know that he or she is a repeat visitor and may have some information about how the site was used on previous visits. New technology, however, has permitted sites to form "memberships" so that they can read one another's cookies. Under these new technologies, for example, a Web site that sells music would be able to tell from a first-time visitor's cookies that he or she has visited other sites dedicated to classical music. Rather than providing this visitor with a screen promoting jazz, it will provide him or her with a screen dedicated to classical music.
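The repeat-visitor mechanism described above can be sketched with Python's standard library. The cookie names and values here are invented for illustration:

```python
from http.cookies import SimpleCookie

# First visit: the server sets cookies identifying the visitor and
# recording a preference inferred from browsing behavior.
response_cookie = SimpleCookie()
response_cookie["visitor_id"] = "abc123"      # illustrative identifier
response_cookie["last_genre"] = "classical"   # illustrative preference

# The browser stores the cookies and sends them back on the next
# request as a single header line.
request_header = response_cookie.output(header="", sep="; ").strip()

# Next visit: the server parses the returned header, recognizes the
# repeat visitor, and tailors its content to the stored preference.
returned = SimpleCookie()
returned.load(request_header)

is_repeat_visitor = "visitor_id" in returned
promoted_genre = (
    returned["last_genre"].value if "last_genre" in returned else "jazz"
)
print(is_repeat_visitor, promoted_genre)
```

The cross-site "membership" profiling the article describes works the same way, except that a third-party advertising network sets the cookie, so the same identifier comes back from every site that carries the network's content.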
In late 1999, calls for legislation on "online profiling" led the FTC and the Department of Commerce to hold a joint workshop on the legal issues raised by the practice. Those advocating legal restrictions pointed out that Internet users often use the medium for its supposed anonymity, and that online profiling can reveal such personal data as sexual preference, HIV status, and religious affiliation. Public interest groups pointed out that profiling could lead to "electronic redlining," the reservation of higher-quality online assets for those more likely to be high-dollar customers.
Likewise, a lawsuit was filed in late 1999 against Real Networks, Inc., a company that markets software for playing digital audio and video files downloaded from the Internet. The suit alleges that Real Networks configured its software to collect information on users' music preferences in such a manner that Real Networks could later retrieve these data. It seems clear that litigation alleging privacy violations by Web sites collecting personal information will be a growing area of concern.
The European Approach to Privacy
Europe has long dealt with privacy in a way that is fundamentally different from the United States. Congress and state legislatures typically respond to privacy concerns on an issue-by-issue basis (e.g., Internet privacy for children and federal legislation to protect the privacy of cable television records and video rentals). In contrast, privacy in Europe is recognized as a fundamental human right. Most European countries have long had data protection agencies that regulate data privacy much as the FCC regulates communications.
In 1992, the EU acted to make protection of data privacy uniform across Europe. The EU Data Protection Directive requires each EU member state to adopt a privacy scheme to protect personally identifiable information held on European nationals.13 A directive is not binding law that applies in each member state; rather, it requires the member states to adopt implementing legislation within a set period. Several European states have implemented the directive while many others have preexisting data privacy laws that have not yet been brought into conformity with the directive.
EU Directive and U.S. Companies
Of most immediate concern to American companies is the fact that the directive requires member states to prohibit the transfer of personally identifiable data on European nationals to any non-EU country that does not have sufficient protections for ensuring the privacy and integrity of that data. The United States, of course, does not have any comprehensive law that would be recognized as sufficient by the EU. American industry, particularly multinational companies, feared that these provisions would stop data flow between European countries and the United States.
The potential for the EU directive to have a crushing impact on international journalism was also a concern. The directive's definition of "personal data" is quite broad: "any information relating to an identified or identifiable natural person." Journalists understandably feared that they would be unable to compile records on subjects, and online newsgathering operations further feared that the publishing of Internet reports, which by the nature of Internet publication make information available across borders, would be prohibited. Although there have been some isolated actions taken by data protection authorities to prevent Internet publication of personal data, the directive contains the potential for an exception for journalistic activity. That exemption provides that "Member States shall provide for exemptions for the processing of personal data carried out solely for journalistic purposes or for the purpose of artistic or literary creation only if they are necessary to reconcile the right of privacy with the rules governing freedom of expression" (Directive, art. 9). It remains to be seen whether each country will adopt such an exemption and whether EU member states will weigh freedom of expression more heavily than the right to privacy. Indeed, the more restrictive "only if" clause was inserted after the U.K. Data Protection Registrar expressed concern that the balance would favor journalists. She commented that "getting the balance right does not mean simply exempting the media from all data protection controls," preferring language that would permit exemptions that "prove necessary."14
The legitimate fear that European data flow to the United States would be halted by the directive led to more than a year of concerted negotiations between the United States and the EU. On February 23, 2000, the EU and the United States announced a broad "safe harbor" agreement under which U.S. companies could continue to import personal data from Europe even though the United States does not have a comprehensive privacy law. Under the "safe harbor," U.S. companies will be required to register with the FTC to commit themselves to certain principles under the EU directive. These principles include notifying consumers of data being collected, permitting them the opportunity to "opt out," and permitting them to search for and modify personal data held by the company (unless unduly burdensome). If the company then failed to live up to its commitments, it could be subjected to FTC claims that such a failure is a deceptive trade practice.15
First Amendment Challenges to Encryption Restrictions
"Encryption" is the ability to encode information to be sent over computer networks so that it can be read only by the intended receiver. Encryption was originally the exclusive province of military and intelligence services, and its dissemination was tightly controlled for national security reasons. Today, however, encryption is widely used to protect privacy and proprietary information stored or transmitted electronically, and a wide variety of products, from cell phones to Web browsers, have built-in encryption features. Certain forms of "strong" encryption that are truly effective in ensuring the privacy of electronic communications cannot legally be exported, or even published on the Internet, because of concerns that they may be used to shield criminal and terrorist activities. This prohibition against publishing is, of course, a classic prior restraint. As more communication occurs electronically, privacy advocates and the information technology industry have become convinced that effective encryption is necessary both to ensure that sensitive communications stay private and to facilitate the growth of electronic commerce.
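The underlying idea, encoding a message so that only a holder of the shared key can recover it, can be illustrated with a toy keystream cipher built from Python's standard library. This sketch is for illustration only; it is not the kind of "strong" encryption at issue in the export debate and should never be used to protect real data:

```python
import hashlib

def keystream(key: bytes, length: int) -> bytes:
    """Derive a pseudo-random byte stream from the key (toy construction)."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def toy_encrypt(key: bytes, message: bytes) -> bytes:
    """XOR the message with the keystream; decryption is the same operation."""
    ks = keystream(key, len(message))
    return bytes(m ^ k for m, k in zip(message, ks))

key = b"shared secret"
message = b"meet at noon"
ciphertext = toy_encrypt(key, message)

# Only a holder of the same key recovers the original message.
assert toy_encrypt(key, ciphertext) == message
assert ciphertext != message
```

The policy fight described below concerns the source code of ciphers like this one (though far stronger): whether publishing such code is functional conduct subject to export licensing, or protected speech.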
Nonetheless, the federal government has been slow to retreat from its traditional policies restricting the export of encryption technologies. Law enforcement and national security agencies are concerned that criminals and terrorists themselves will use encryption to evade detection or to hide evidence. The government has been under strong pressure from information technology vendors and users, however, and has suffered setbacks in litigation with academic cryptographers. In addition, bills have been introduced in Congress that would substantially liberalize U.S. export controls on encryption. They have garnered significant support in both houses.
The long-standing restrictions on encryption source code have been under attack in the courts on constitutional grounds. In three separate lawsuits,16 academic cryptographers have argued that the requirement to obtain a government license before publishing encryption source code on the Internet or by other electronic means is nothing more than a prior restraint on free speech that is contrary to the First Amendment. These scientists maintain that source code is a medium of communication, the preferred medium among computer scientists, and is thus protected speech. The government has defended the regulations by maintaining that source code is functional, not speech, and that it is regulated for what it does, and not for the ideas that it may communicate.
The results so far have been mixed. The district court in the Bernstein case held that source code was protected speech and struck down the export regulations as an unconstitutional prior restraint.17 A Ninth Circuit panel upheld the district court,18 but that opinion has been withdrawn and the case set for rehearing en banc in March 2000. The district courts in the Karn and Junger cases rejected the challenges and upheld the federal regulations. The Karn case was appealed to the D.C. Circuit and remanded to the district court (and a new judge) for further proceedings. In Junger, however, the Sixth Circuit reversed the district court, finding that "the First Amendment protects computer source code."21
The promised new encryption regulations may have a substantial impact on the claims at issue in these cases. Initial drafts of the regulations include a narrow exception that would allow encryption source code to be exported without licensing to most destinations, provided that (1) it is or will be released publicly, (2) it is not subject to any "proprietary commercial agreement or restriction," and (3) the exporter (publisher) of the source code notifies the Department of Commerce prior to export or publication, and provides the department with a copy of the code or identifies the uniform resource locator at which it can be found. This exception might seem to moot the claims of the plaintiffs, who wish to post source code publicly on the Internet and would not impose any proprietary restriction on it. However, the exception would not permit export to terrorist-supporting countries or their nationals, as defined by the Department of Commerce, which would seem as a practical matter to preclude Internet publication. In addition, the requirements for notice and disclosure to the government prior to publication are unusual, to say the least, and might not pass constitutional muster.
Privacy versus the First Amendment
In an unprecedented case, the Tenth Circuit in 1999 overturned a Federal Communications Commission (FCC) order protecting the privacy of telephone consumers' information on grounds that the restriction impinged upon the First Amendment rights of telephone companies. In U.S. West, Inc. v. FCC,22 the court considered the Commission's requirement that any telephone company wishing to use its subscribers' proprietary information, including information on the numbers dialed and the length of calls, obtain the subscribers' advance consent. The court held that this "opt out" requirement was not a narrowly tailored limitation on the telephone companies' First Amendment right to use this information and overturned the order. A petition for a writ of certiorari is pending before the U.S. Supreme Court.
The U.S. West case is notable for its application of the First Amendment to a regulation adopted to ensure consumer privacy. The court found that a telephone company's "speech" to its customers for the purpose of "soliciting those customers to purchase more or different telecommunications services" is commercial speech.23 Applying the traditional Central Hudson analysis,24 the court found that the "speech" in question was true and not misleading and that the asserted benefit of protecting consumer privacy did not outweigh First Amendment protection of the speech. The court found that protecting privacy was a "legitimate and substantial interest" of government.25 It found, however, that the FCC had not sufficiently articulated the privacy interests at issue in the case:
Although we agree that privacy may rise to the level of a substantial state interest, the government cannot satisfy the second prong of the Central Hudson test by merely asserting a broad interest in privacy. It must specify the particular notion of privacy and interest served. Moreover, privacy is not an absolute good because it imposes real costs on society. Therefore, the specific privacy interest must be substantial, demonstrating that the state has considered the proper balancing of the benefits and harms of privacy.
The U.S. West case is significant for two reasons. First, it recognizes the potential First Amendment conflict between privacy and free expression. Second, it establishes the beginnings of an analytical framework for assessing whether the privacy interests outweigh the potential expression interests at stake. Privacy advocates have rightly expressed concern that the decision may lead to constitutional principles that take precedence over privacy interests. Regardless of the outcome, it does appear clear that regulatory agencies must articulate specific grounds for rules intended to protect privacy if those rules could be construed to have an adverse impact on expressive activity.
How can we weigh the right to privacy against the right to free expression, particularly in an age in which digital information can be processed and published to millions with a keystroke? There is little, if any, consistency across the various laws that govern privacy in the United States and the various agencies that regulate it. There have been few attempts, in the United States and in Europe, to reconcile the needs of publishers with the desires of subjects to maintain the confidentiality of certain information. The U.S. West case suggests that First Amendment interests should be considered and that privacy interests should be clearly articulated. As with any common law issue, the balance will be drawn in disparate cases scattered across the country. The challenge will be to remain vigilant to the expressive issues raised by privacy laws and to advocate the media's interests.
Endnotes
1. See generally FRED H. CATE, PRIVACY IN THE INFORMATION AGE 19-22 (1997); Joseph I. Rosenbaum, Privacy on the Internet: Whose Information Is It Anyway?, 38 JURIMETRICS J. 565, 566-67 (1998).
2. See Hertzel, Don't Talk to Strangers: An Analysis of Government and Industry Efforts to Protect a Child's Privacy Online, 52 FED. COMM. L.J. 429, 430, 432 (2000).
3. 15 U.S.C. § 6501 (1998).
4. Id. § 6502(b)(1)(A)(i) (1998).
5. Id. § 6502(b)(1)(A)(ii) (1998).
6. Id. § 6501.
7. The FTC rules are published at 16 C.F.R. §§ 312.1-312.12 (1999).
8. See FTC Press Release, Internet Site Agrees to Settle FTC Charges of Deceptively Collecting Personal Information in Agency's First Internet Privacy Case (visited Mar. 28, 2000).
9. See FTC Press Release, Young Investor Web Site Settles FTC Charges (visited Mar. 28, 2000).
10. Id.
11. See Judnick v. DoubleClick, Inc., No. CV-000421 (Cal. Super. Ct. Jan. 27, 2000); see also Freedom Forum, Internet Ad Firm Sued for Invasion of Privacy (visited Mar. 28, 2000).
12. See Freedom Forum, E-Privacy Complaint Filed with FTC (visited Mar. 28, 2000).
13. On the Protection of Individuals with Regard to the Processing of Personal Data and on the Free Movement of Such Data, COM(92)422 final.
14. See Maxeiner, Freedom of Information and the EU Data Protection Directive, 48 FED. COMM. L.J. 93, 102 (1995).
15. See U.S. and Europe Agree in Principle on Terms of "Safe Harbor," COMM. DAILY, Feb. 24, 2000, at 10.
16. See Bernstein v. Dep't of State, 922 F. Supp. 1426 (N.D. Cal. 1996); 945 F. Supp. 1279 (N.D. Cal. 1996); 974 F. Supp. 1288 (N.D. Cal. 1997), aff'd, 176 F.3d 1132 (9th Cir. 1999) (op. withdrawn pending en banc review); Junger v. Daley, 8 F. Supp. 2d 709 (N.D. Ohio 1998); Karn v. Dep't of State, 925 F. Supp. 1 (D.D.C. 1996).
17. Bernstein, 974 F. Supp. at 1288.
18. 176 F.3d 1132 (9th Cir. 1999).
19. Karn, 925 F. Supp. 1 (D.D.C. 1996).
20. Junger, 8 F. Supp. 2d 709 (N.D. Ohio 1998).
21. Junger v. Daley, 2000 Fed. App. 01178, slip op. at 2 (6th Cir. Apr. 4, 2000).
22. 182 F.3d 1224 (10th Cir. 1999), petition for cert. filed (U.S. Feb. 28, 2000).
23. U.S. West, 182 F.3d at 1232.
24. See Central Hudson Gas & Elec. Corp. v. Public Serv. Comm'n of N.Y., 447 U.S. 557, 564-65 (1980).
25. See U.S. West, 182 F.3d at 1234.
26. Id. at 1235.

Kurt Wimmer is a partner with the firm of Covington & Burling in Washington, D.C. The views expressed are those of the author and do not necessarily represent those of his firm or clients.