Donald F. Theall, 1997
This is a preliminary draft of the article.
Copyright is retained by the author.
Any reader comments would be welcomed at
This copy is a mirror of the original (http://www.catalaw.com/logic/docs/dt-censor.html), hosted by Michael T. Babcock as a reflection on his Internet Privacy and Security site.
Writing about Canada, censorship, and the Internet must be a forward-looking, futuristic activity, since as of mid-1997 there has been relatively little discussion of problems of content on the Internet. If Keith Spicer, the former chair of the CRTC and currently the policy director of the Canadian Library Association, had been correct in February 1996 when he declared at a conference in Toronto that "there is no question of censoring the Internet in Canada," this essay would never have needed to be written. At the time, Spicer was, of course, echoing the former MIT professor and information highway guru Nicholas Negroponte, who in an address at the same conference had said that it is impossible to censor the vast flow of data on the Net.(1) Partly, as will become clear later, it is a question of what is regarded as censorship -- a question whose context is being changed by the emergence of the Internet. This essay begins with an overview of the origins of the Internet and an analysis of the first major incident in Canada to raise the question of controlling the flow of content on the Internet -- the trials of Paul Bernardo and Karla Homolka, which led to the first attempts by universities and Internet Service Providers to block certain kinds of content. It then examines some implications of the establishment of the Information Highway Advisory Council (IHAC), including a discussion of its name, and of the first major Canadian Federal Court decision concerning the Internet.
Canadian issues cannot be examined apart from a thorough understanding of the structure and organization of the Internet, and of how its global, transnational nature means that legislative, legal, and diplomatic activity throughout the world affecting the Internet -- particularly in the United States -- will have a major impact in Canada. This understanding is required in order to appreciate the tremendous difficulties presented by the Internet's control and regulation. Accordingly, it is important to explain the multi-technological complexity of the Internet and the potential effects it will have on our understanding of freedom of expression and communication. Since a recent U.S. Supreme Court case exemplified the complexities of the Internet and their consequences for the control and regulation of information on it, and since Justice John Sopinka of the Supreme Court of Canada has emphasized the importance in this area of our understanding "the American experience," the problem of understanding the Internet and its possible legal and diplomatic implications will be developed in relation to the debate concerning the U.S. Communications Decency Act (CDA) and the response of the U.S. Supreme Court to a challenge of its constitutionality.(2)
Returning to specifically Canadian issues, the problem posed for the legal category of "community standards" by a transglobal flow of information, as raised by the U.S. Supreme Court, will be examined, particularly in the light of the delineation of community standards in the recent Supreme Court of Canada decision concerning obscenity, Regina v. Butler, and the earlier decision in Towne Cinema v. The Queen, which Regina v. Butler cites in its discussion of community standards. The possible implications of these decisions will be placed against the "American experience," IHAC's discussions on the problems of content on the Internet, and the Canadian role in various international initiatives concerning the Internet. A subset of this problem -- language regulations and the Internet -- will be briefly examined in relation to Canada.
Since, subsequent to the U.S. Supreme Court decision in Reno v. ACLU, the development and use of blocking/filtering software has become a preferred solution in both government and industry -- one which will shift responsibility for control of content to individuals and to the private sector -- there will be a thorough examination of the nature, implications and potential legal and legislative problems presented by this technology. The dangers implicit in control of content by the private sector and/or individual self-censorship will be explored, as well as the practicality of such an approach for achieving the goal of protecting children from offensive material while not limiting the freedom of adult expression. The ultimate end of such a program -- to achieve a thorough control of indecent, offensive and illegal material on the Net -- will be assessed in the light of remarks suggesting that such control of cyberspace is simply not technically feasible. In conclusion, the challenges which this unique new medley of technologies poses for preserving freedom and maximizing the value of the Internet will be assessed.
These issues raise a complex series of questions resulting from a massive change in the international availability and flow of information. Any such extensive change raises fundamental questions about how it will transform our way of life, our basic institutions and our values: how far should a society go to protect collectively the presumed innocence of its children; what pressing new issues does such a technology raise about the freedom of adult expression and communication; how open and accessible should research, and the institutions which sustain and promote research, be; and a wide variety of other questions as well. To begin to understand the scope of the problem, it is important to understand how this phenomenon of the Internet -- frequently and misleadingly referred to as the Information Highway -- came to be, how its creators conceived it and intended it to operate, and how it subsequently developed in its early stages.
The "Information Superhighway" emerged as a subject of general public interest only in the 1990s, though its history -- first as the U.S. military-sponsored ARPANET and later as a loosely linked association of university researchers, government agencies, and computer hardware, software and service providers -- reaches back to the 1970s and 1980s. The Internet, as it came to be called, was of limited interest until the massive entry of commercial players in the 1990s, a result of its new capacity to carry multimedia messages emulating the advertising capabilities of TV and the flexibility of VCRs.(3) The very newness of regarding the Internet as a matter of massive social concern (Canada's Information Highway Advisory Council only reported to the government in 1996) means that an analysis of the possibility or advisability of regulating or controlling the Net must refer to what is occurring in other jurisdictions, and must also project what might happen in the future. In particular, it must take into account the deliberate decision made by the network's designers to create a communication system that operates so randomly that it would not be possible to shut it down or prevent the transmission of messages even if some of its nodes (i.e., the various computers linked by it) were to be blocked or damaged by sabotage, attack, or covert operations.
This means that at its very inception the Internet was deliberately designed to ensure that messages transmitted through it would reach their destination. In order to achieve this, the Internet was given a somewhat "chaotic" structure. Messages flowing across it were broken up into smaller packets of information (portions of the original message), and the packets took different routes to reach their destination. This was the design of a rhizomic network, not that of a unidirectional highway. If a packet could not be successfully routed through one set of nodes, it was re-routed through another so that it would ultimately reach its destination. Essentially, ARPA had tried to create a message system in which it was not possible to stop a message from arriving at its destination. The deliberate plan was to frustrate any attempt to stop the flow of information. For the purposes of this examination of the problem of censorship and control, it is neither necessary nor feasible to go into further technical detail.(4)
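The routing principle described above -- that a packet blocked at one set of nodes is simply re-routed through another -- can be made concrete with a toy sketch. The following Python fragment is only an illustration of the principle of redundant routing, not the actual ARPANET routing algorithm; the network, the node names, and the search strategy (a simple breadth-first search) are all invented for the example.

```python
from collections import deque

def find_route(links, source, dest, failed=frozenset()):
    """Breadth-first search for any path from source to dest
    that avoids failed (blocked or damaged) nodes."""
    if source in failed or dest in failed:
        return None
    queue = deque([[source]])
    seen = {source}
    while queue:
        path = queue.popleft()
        node = path[-1]
        if node == dest:
            return path
        for nxt in links.get(node, ()):
            if nxt not in seen and nxt not in failed:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None  # every route is cut: the message cannot get through

# A small hypothetical mesh: each node is linked to several neighbours,
# so there is more than one route between most pairs of endpoints.
links = {
    "A": ["B", "C"],
    "B": ["A", "D"],
    "C": ["A", "D"],
    "D": ["B", "C", "E"],
    "E": ["D"],
}

print(find_route(links, "A", "E"))                # a route via B
print(find_route(links, "A", "E", failed={"B"}))  # re-routed via C
print(find_route(links, "A", "E", failed={"D"}))  # no route survives
```

Blocking node B merely diverts the packet through C; only by cutting every path (here, node D) can delivery be prevented -- which is precisely the resilience the designers intended.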
Some of the multitude of possibilities offered to users of the Internet for communicating or transferring information include: person-to-person e-mail; moderated or unmoderated discussion and newsgroups whose members communicate to the entire group by e-mail; online interactive chat groups similar sometimes to a conversation, at others to a public debate; facilities for posting and transferring files (FTP or File Transfer Protocol), and reading or viewing (by Gopher) files of printed or kinoaudiovisual material; and the World Wide Web, which permits the display and online manipulation of all varieties of material, often interactively, thus providing for online galleries of images, online radio and TV broadcasts, online library catalogs, and online libraries.
It is already apparent that the complex nature of this new technology will require radically different approaches to the regulation of its content. The international nature of the Internet; the original intentions of ARPA in designing a highly decentralized, sabotage-proof communication system; the free-wheeling way the Net developed for over a decade before its current commercial, industrial, and wide public use; its offering of an alternative mode of transmission for personal correspondence; and the possibilities that it has offered and continues to offer as an open forum for interactive discussion on any subject -- all of these, taken together, present substantial difficulties to any form of governmental control. Some believe that this phenomenon will even require a reconsideration of existing laws governing the regulation of content -- laws on matters such as hate propaganda and obscenity.
A complex series of tensions has developed between, on the one hand, those alarmed at possibly losing a unique international opportunity for freedom of expression and communication, for a free exchange of creative discovery in the arts and other areas of cultural production, and for freedom in research and the dissemination of ideas, and, on the other, those who have been persuaded that the Internet is a major means for hawking pornography, terrorism, drug culture and racial hatred, and thus endangers children who use the Net. While it is generally conceded that the Internet is not primarily or even largely concerned with such negative activities, and that even where it is, succeeding at those activities requires a series of specific decisions accompanied by a knowledge of how to access such materials, there continue to be expressions of panic about the harmful nature of the Net. The birth and development of the Internet in the research and artistic communities, for over a decade with no intervention, has spawned among many of its users a deep concern that such moral panic might easily and radically transform and restrict what have heretofore been some of the Internet's greatest potential values.
Only in 1993 or 1994 did the control of content on the global computer network become a matter of major concern to a broad Canadian public and to the international diplomatic community. The first widely publicized conflict concerning content on the Internet in Canada occurred during the trials of Karla Homolka and Paul Bernardo, when in 1993 the judge placed a publication ban on the proceedings. For the first time in Canadian judicial history, a massive transnational challenge was presented to the extent of the authority of the Canadian judiciary over the control of information regarding a judicial action. Web sites, online discussions and newsgroups in the U.S. and elsewhere (as well as other U.S. and international media) published details of the trial, which then easily found their way back into Canada through the Internet, thus undermining a court order limiting what the press could make public about the case. This was particularly dramatized by the police seizure of an issue of the U.S. computer magazine Wired, which discussed the case. In spite of the court order, Canadians who wanted banned information could easily access through the Internet a number of web sites in the United States which provided information (and misinformation) about the trial. They could also participate in discussions through those Usenet newsgroups specifically dedicated to the discussion of the trials of Bernardo and Homolka.
Apart from posing the problem of whether the judiciary could uphold bans against publication given the new potentialities of "cyberspace," there was further fall-out from the Homolka case when McGill University(5) reacted immediately, followed by a number of other universities, asserting the responsibility of the university under law and setting forth the principle that the university must control the reception of content within its own constituency because of its liability as an Internet service provider. This led other universities, such as Waterloo, not only to restrict the Homolka material from Internet users in the university (i.e., the professors, students and other members of the university community) but also to extend these restrictions by removing access to various Usenet newsgroups (online discussions in which participants communicate to the entire group by e-mail) which the administrators felt might be in violation of Canadian laws regarding obscenity and hate literature. While initially this may have seemed appropriate, it did lead Canadian Supreme Court Justice Sopinka, a relatively conservative jurist concerning questions of obscenity and hate propaganda, to observe in a public address at the University of Waterloo that the banning of electronic bulletin boards to university staff and students had created a "tremendous amount of controversy in the academic world," for "the hallmark of learning is free, uninhibited and robust debate. Censorship is the antithesis of this process."(6) Exploring the question as to whether such administrative decisions might be "subject to scrutiny under the 'Charter'," he noted that the precedent in McKinney v. University of Guelph left open the possibility that "If governing bodies engage in acts of censorship, they run the risk of [those decisions] being classed as government action and subject to the control of the Charter.
In this regard it is pertinent to observe that in the United States the First Amendment has been applied to State universities."(7)
The Homolka case and the creation of IHAC broadly focused public, and particularly media, attention on the problem of controlling information on the Internet. Intensified by media coverage in Canada and the United States, there immediately followed a growing public concern with the purported dangers of the Internet: that it permits children to access information about explosives, terrorism, drugs, all modes of sexual activity, and indecent and offensive speech, including "hate speech"; that it affords various predators, especially pedophiles, opportunities to stalk children; and that it allows pornographers and rapists to threaten or harass women. Subsequently, there has been substantial debate regarding the extent of these purported dangers and the extent, if any, to which the specific introduction of the Internet has increased them. Nevertheless, a global panic about the Internet had been sparked off. It has been widely accepted in Canada that the Internet should be subject to those laws governing hate literature, child pornography and obscenity, and that Internet Service Providers may well have some liability with respect to the materials they transmit. As well as being prosecuted for crimes, ISPs might be sued for defamation (such suits have succeeded in other jurisdictions, Australia, for example) or for assisting in violations of copyright.
When the Homolka ruling occurred, the Government of Canada was far from issuing a policy statement concerning the potentialities and problems presented by the newly developing global network, particularly those involving the regulation and control of content. Shortly after, however, IHAC was established with the then recently retired Principal of McGill University, David Johnston, as its chair. Primarily charged to examine the commercial and social development of the potentialities of the Information Highway (a term presumably coined around 1977 by the now vice-president of the United States, Al Gore), the Council, as will become apparent later, nevertheless had to consider the question of control of content.(8) But there were problems in readily following the lead of the U.S. Government, as the Government of Canada did when it adopted this buzzword, popular with the industry and the media, which implies a narrow analogy of the Internet to a highway. By accepting this analogy relatively uncritically, and by invoking an additional analogy to broadcasting and cable-casting, IHAC has fostered a potentially distorted understanding of the technologies involved -- a point which will be discussed in greater detail later. For example, if ISPs are viewed as cable distributors, it is very different from considering them as operating like a common carrier, such as the telegraph or the telephone.
The power of this analogy was graphically illustrated in 1996 by one of the earliest judicial definitions and precedents concerning the Internet. The primary Canadian judicial decision up to this time (the summer of 1997) which broadly addressed the question of the nature of the Internet arose from an action against the Federal government by the Vancouver Freenet concerning a dispute over whether or not, under the Income Tax Act, the Freenet would be granted status as a registered charity. This case specifically involved questions of control of content, since the Ministry of Revenue was denying charitable tax status to the Freenet on the basis that "it did not exercise sufficient control over how the facility was used so as to ensure its use was consistent with a charitable purpose."(9) Justice James Hugessen, in delivering his judgement in Vancouver Regional FreeNet Assn. v. Canada (Minister of National Revenue -- M.N.R.), specifically used the analogy between the Internet and a highway. He begins by noting that since the Income Tax Act does not define "charitable" or "charity" it is necessary to go back to "an obscure and not always consistent corner of the law of England, the starting point of which is the Charitable Uses Act, 1601."
Since the preamble to that act speaks of "the repair of bridges, ports, havens, causeways, churches, seabanks and highways," Justice Hugessen, with apparent realization of the ironies involved, adopts the phrase "Information Highway" to suggest that the Freenet as an ISP is part of a major instrument of communication in the 1990s. He justifies his opinion by suggesting that the Internet is just like the highways, causeways and bridges of 1600 and that, like those public ways, the traffic which flows along it should be equally free of restriction -- that is, there should be no control of content except where that content violates a specific law:
"A real highway or bridge in the time of the first Elizabeth ... might be used by persons going to market as well as to church or school. It might also be used by highwaymen or by absconding debtors. The nature of the traffic, however, did not serve to dilute or diminish the great public good provided by the facility itself."(10)
While this decision is directed specifically towards the problem of tax status, it does strongly imply that there should not be any censorship of, or prior restraint on, the flow of information across the Internet. While the analogy implicit in the "information highway" is useful here in emphasizing the latitude which must be permitted to services that provide connectivity to the Internet, and while Justice Hugessen's decision quite rightly implies a strong burden of justification against prohibiting or controlling any free flow of information across the Internet, it does not fully take into account the complex role that the Internet and freely available access services such as the Vancouver Freenet play. Later, in decisions by two U.S. courts, it will become apparent that a more complex analysis of the Internet can reveal the immense complexity of the legal problems involved in discussing its regulation and control, particularly with respect to "indecent" and "offensive" content.
To write about the problems which the Internet (variously described as the Infobahn, the Information Highway, the Net, or the Matrix) poses for any form of regulation, it is necessary to cope with its chameleon-like nature. Identifying the nature of the Net is similar to the task Malcolm Lowry, the author of Under the Volcano, faced in describing the nature of a work of modernist art. He noted in a now famous letter to his editor that the modernist novel is a medley which:
". . . can be regarded as a kind of symphony, or in another way a kind of opera -- or even a horse opera. It is hot music, a poem, a song, a tragedy, a comedy, a farce and so forth. It is superficial, profound, entertaining and boring, according to taste. It is a prophecy, a political warning, a cryptogram, a preposterous movie, and a writing on the wall. It can even be regarded as a sort of machine: it works too, believe me." (11)
From the point of view of the media analyst, the Net is just as complex and enigmatic a medley of modes, for it can be regarded as a kind of postal service, or as a kind of town hall meeting -- or even an electronic Hyde Park. It is a newspaper, a bookstore, a library, a museum, a cable service and so forth. It is superficial, profound, entertaining, boring, and an encyclopedic learning machine. It is a disc player, a TV, a movie theater and a videotext (a writing on the screen). It is powerful, all-pervasive, a tremendous knowledge source, and it is threatening. It is a communicating machine that subsumes and potentially annihilates all electronic media!
One problematic aspect of the coming-of-age of the Internet as a global communication network -- and very likely the global communication network of the future -- arises from the way that various electronic technologies have moved beyond media to become what can only be described as a pan-medium, super-medium or hyper-medium.(12) Another is how these technologies have impinged upon a privileged notion of speech by creating the possibility for integrated, interactive communication utilizing still and moving images, sound, gesture, rhythm, speech and a wide range of print and calligraphy to produce kinoaudiovisual messages. Such theoretical constructs as the French philosopher Jacques Derrida's concept of grammatology, Walter Ong's positing of an extended or secondary orality, or Canada's Marshall McLuhan speaking about intrasensory tactility can be regarded as conceptual correlates anticipating this contemporary mode of kinoaudiovisual communication. This feature of the Net challenges those historical approaches to modes of communication that have granted a unique, privileged status to speech and print in contradistinction to other modes -- whether visual, gestural or kinesthetic, or the extensions of these through electromechanical and electrochemical means. Therefore, the first problem that legislatures and courts will have to face in their approach to the Internet is to understand the genuine complexity of this new medley of technologies and to anticipate how decisions and policies concerning its potential benefits and dangers could affect its future value to society.
As early as 1982, in his future-oriented magnum opus Technologies of Freedom, Ithiel de Sola Pool, an MIT political scientist specializing in communication policy, explained how the evolving computer and telecommunication technologies would create a crisis in communication policy. He noted that "Each new advance in the technology of communications disturbs a status quo. It meets resistance from those whose dominance it threatens, but if useful, it begins to be adopted." While such a technology is new and still in a state of development, "Technical laymen, such as judges, perceive ... [it] ... in that early, clumsy form, which then becomes their image of its nature, possibilities, and use. This perception is an incubus on later understanding."(13) Governments, legislators and regulatory agencies confronting the problems raised by a new technology tend to follow the historic pattern by creating analogies to earlier technologies. Early attempts at understanding the telegraph proceeded by analogy with the railroads, just as the telephone was viewed as an extension of the telegraph. But such attempts as the early treatment of cable television as if it were broadcasting have illustrated how such analogies eventually become misleading and create confused and often conflicting policies. The result is the development of policies which are inappropriate for the more complex and convergent modes of communication that emerge as the technologies mature. De Sola Pool, who concluded his analysis by confronting the convergence of media, prophetically forecast: "Historically, the various media that are now converging have been differently organized and differently treated under the law. The outcome to be feared is that communications in the future may be unnecessarily regulated under the unfree tradition of law that has been applied so far to the electronic media."(14)
The current controversy surrounding the Internet is specifically directed at the degree of freedom or regulation which should be applied to it. The quest for analogies has been a predominant feature of the ensuing debate; a quest that has been frustrated by the multiplicity of genres and types of activity which the Net seems to embrace, ranging from person-to-person e-mail to online chat groups and the World Wide Web, and from chats about sports and fashion to advanced scientific and technological research material, avant-garde art, and complex intellectual and political controversy. The recognition of the nature of these technologies, and the critique and correction of the problem that convergence presents with respect to the use of analogies, needs to be supplemented by another perspective. As communication moves beyond media and thus beyond the word, the question arises as to whether the privileging of speech (and of writing) in discussions of censorship or laws controlling content is tenable in a situation where we have what some have described as a "secondary orality," but which might be more precisely described as an all-encompassing mode of expression embracing the verbal, vocal, visual, gestural and kinesthetic into an integrated whole.(15)
Recent discussions have suggested that interdiscursive dialogue and innovative cultural productions have always played a role in the exploratory transformation of language and of other modes of communication. This implies that an openness of communication has an ecological role to play in the development of modes of expression which will permit interdiscursive exchange between individuals and between groups whose interests, backgrounds and sociocultural formations may differ. If this is valid, as I have demonstrated elsewhere, it must have profound implications for the suggestion that images -- still, moving or multimedia -- ought to be controlled more rigorously than words.(16) Our received wisdom strongly suggests the privileging of speech and writing over other modes and media of communication, just as the American Constitution does. But given such privileging, what then could have been the implications when the drafters of the Canadian Charter of Rights and Freedoms chose to speak of "freedom of expression, including freedom of the press and other media of communication," in contradistinction to the United States Bill of Rights which, for historical reasons, only guarantees "freedom of speech" and "freedom of the press"? Isn't it possible that a document drafted in the eighties by Trudeau's government in the wake of the sixties, the moment when McLuhan -- apparently admired by Trudeau -- produced his major works, was intended to broaden our legal comprehension of communication and hence to broaden the guarantee beyond what was specifically guaranteed in the U.S.? While current jurisprudence has been quite conservative in its treatment of Section 2 of the Charter of Rights and Freedoms, invoking the reasonableness clause in Section 1, it is certainly possible in the context of the Trudeau government in the 1980s that a greater breadth was intended.
Regardless of how we might argue with respect to the intention of Section 2 of the Charter, the very existence of claims concerning the exploratory and developmental nature of new technological modes of the production, reproduction and dissemination of communication and expression will become increasingly complicated by the integrated nature of the Internet.
The problem has not yet (i.e., as of the summer of 1997) been as dramatically confronted in Canada as it has had to be recently in the United States. While the Canadian approach will naturally differ from that of the United States, there is still much to be learned from the "American experience." The U.S. Supreme Court, in its review of ACLU v. Reno (decided by the Federal District Court in June 1996) -- a judicial review of the Communications Decency Act -- recently confronted the issues raised by the control of content on the Internet. This ground-breaking case, which has generated major judicial precedents in the U.S. concerning censorship on the Internet, will be of continuing interest to those making policy, legislation, and law with respect to the regulation of content in Canada and elsewhere. The crisis resulting in this decision arose while the Clinton-Gore administration was pushing to pass the Telecommunications Act of 1996, a piece of legislation strongly supported by the entire U.S. telecommunications industry (phone companies, cable companies, satellite companies, etc., as well as the computer industry). In the late stages of debate on this Bill, taking advantage of the White House's urgency to have the legislation approved, a group of conservative Senators -- encouraged by fundamentalist religious groups, coalitions supporting family values, conservative feminists, anti-abortionists and other right-of-center groups -- sponsored the "Communications Decency Act," an amendment directed at banning "indecent" and "patently offensive" material from the Internet by criminalizing it. This amendment, along with the Telecommunications Act, was passed by Congress and signed into law by President Clinton.
Led by the American Civil Liberties Union (ACLU) and the American Library Association (ALA), a group of forty-five corporate and individual plaintiffs challenged the constitutionality of various provisions within the Act by applying for an injunction on the grounds that they violated the First and Fifth Amendments of the U.S. Constitution (i.e., the amendments guaranteeing freedom of speech and the press, and due process under law, respectively).(17)
The two statutory provisions of the CDA which were germane to the ALA-ACLU challenge involved a provision in Section 223(a)(1)(B) that:
"any person in interstate or foreign communications who, "by means of a telecommunications device," ... "knowingly ... makes, creates, or solicits" and "initiates the transmission" of "any comment, request, suggestion, proposal, image or other communication which is obscene or indecent, knowing that the recipient of the communication is under 18 years of age," "shall be criminally fined or imprisoned."
and a provision in Section 223(d)(1) ("the patently offensive provision"), that makes it a crime to use an "interactive computer service" ... to "send" or "display in a manner available" to a person under age 18, "any comment, request, suggestion, proposal, image, or other communication that, in context, depicts or describes, in terms patently offensive as measured by contemporary community standards, sexual or excretory activities or organs, regardless of whether the user of such service placed the call or initiated the communication."
Conviction for a violation of either of these provisions would have resulted in the imposition of a fine, up to two years' imprisonment, or both for each offense. The first of these provisions added the category "indecent" to current statutes, thus supplementing existing obscenity laws; the second introduced an entirely new category, "patently offensive." The U.S. courts would find both terms overly broad and vague.
What is important for Canada and other jurisdictions about the two levels of adjudication in ACLU v. Reno (apart from the transnational nature of the Internet) is that the case brought together such a large, diverse group of plaintiffs who were knowledgeable concerning the Internet, its design, its use and its implications for human development. Therefore the findings of fact in the initial judgment by the Federal District Court in Philadelphia represent a copious outline of the current state of the art in computerized telecommunications, which assists in identifying the dangers and difficulties of controlling content on the Internet. "It is no exaggeration," the "Findings of Fact" declare, "to conclude that the content of the Internet is as diverse as human thought."(18) The very nature of the medium, the Findings of Fact explain, permits a situation where a "content provider" is not a traditional speaker, but may actually be a medley of speakers. Since the listeners and speakers who constitute this "medley" can easily interchange in the interactivity of the Internet, "content providers" have little or no editorial control, for "In the argot of the medium the receiver can and does become the content provider and vice-versa." (#80) The justices conclude in their findings that "The Internet is therefore a unique and wholly new medium of worldwide communication." (#81)
All parties to the action in the Federal District Court agreed that the Internet contains sexually explicit material, but it was found that, contrary to the implications of some news media and activist groups, "There is no evidence that sexually-oriented material is the primary type of content on this new medium." (#83) Much of this sexually explicit material is of considerable value and benefit to society -- sometimes artistic, sometimes medical and sometimes social advice (e.g., information concerning the prevention of AIDS, advice to teens about the dangers of drugs or about their concerns over sexuality). It is also noted in the Findings that "Even the government witness, Agent Howard Schmidt, Director of the Air Force Office of Special Investigations, testified: 'the odds are slim' that a user would come across a sexually explicit site by accident."
The complexities of the case resulted in all three justices writing separate opinions, even though their judgment that the Act was unconstitutional was unanimous. In those opinions a number of crucial points were raised:
(a) "Those responsible for minors [should] undertake the primary obligation to prevent their exposure to such materials."
(b) four characteristics of the Internet that are of 'transcendent importance': First, the Internet presents very low barriers to entry. Second, these barriers to entry are identical for both speakers and listeners. Third, as a result of these low barriers, astoundingly diverse content is available on the Internet. Fourth, the Internet provides significant access to all who wish to speak in the medium, and even creates a relative parity among speakers.
(c) "Since much of the communication on the Internet is participatory, i.e., is a form of dialogue, a decrease in the number of speakers, speech, fora, and permissible topics will diminish the worldwide dialogue that is the strength and signal achievement of the medium."
(d) the benefit of Internet communication is that so much speech occurs and that speech is easily available to the participants.
(e) "The Internet is a far more speech-enhancing medium than print, the village green, or the mails.... Some of the dialogue on the Internet surely tests the limits of conventional discourse. Speech on the Internet can be unfiltered, unpolished, and unconventional, even emotionally charged, sexually explicit, and vulgar -- in a word, "indecent" in many communities. But we should expect such speech to occur in a medium in which citizens from all walks of life have a voice. We should also protect the autonomy that such a medium confers to ordinary people as well as media magnates."
(f) The Internet raises in an entirely new way the problem of which community standards it is to be judged by.
Although the history of the U.S. First Amendment and the Canadian Charter approaches to "freedom of expression" are different, the issues raised by the nature of the Internet ultimately should affect the approach in both countries. Leaving aside for the moment points (a) to (e), let's first explore point (f), the problem of community standards, a concept in Canadian as well as U.S. law, though differently interpreted. Both the U.S. District Court and the Supreme Court have posed the question, by implication, as to what, if any, cost in ignorance is reasonable in a free and democratic society to protect children from "dangerous" information. Such an inquiry may have to extend beyond the purview of the CDA hearings, eventually requiring more adequate definitions of obscenity and child pornography and reopening the problem of the criminalization of possession of child pornography. At the minimum, as all the justices in the U.S. District Court and the U.S. Supreme Court have noted, the Internet will require new ways of legally delineating community standards since what might well be denominated "a global metropolis" now exists on the Internet.(19)
While the Supreme Court in the United States has defined a community standard to be a local community standard, the Supreme Court of Canada recently in Regina v. Butler (1992) reiterated the community standards test of Towne Cinema Theatres Ltd. v. The Queen (1985): that, on the contrary, there is a national community standard in Canada, which is not the standard existing in specific communities such as a university or a cosmopolitan city. In 1992 Justice Sopinka observed:
"The community standards test has been the subject of extensive judicial analysis. It is the standards of the community as a whole which must be considered and not the standards of a small segment of that community such as the university community where a film was shown ... or a city where a picture was exposed.... The standard to be applied is a national one.... With respect to expert evidence, it is not necessary and is not a fact which the Crown is obliged to prove as part of its case.... In R. v. Dominion News & Gifts, Freedman J.A. (1962) (dissenting) emphasized that the community standards test must necessarily respond to changing mores."(20)
In elaborating the nature of the community standards test Justice Sopinka went on to observe later that the court in its earlier decision "reviewed the case law and found: The cases all emphasize that it is a standard of tolerance, not taste, that is relevant. What matters is not what Canadians think is right for themselves to see. What matters is what Canadians would not abide other Canadians seeing because it would be beyond the contemporary Canadian standard of tolerance to allow them to see it."(21)
Chief Justice Dickson's position in Towne Cinema v. The Queen suggests that the rule in questions of content is not taste, but tolerance, amplified by the principle that, with respect to content, the law turns not on what Canadians think it is all right for themselves to read or see, but on what they would abide others reading and seeing -- the paternalistic "do what I say, not what I do." Since 1985, when Dickson C.J. reiterated the tolerance test, the buzz term 'zero tolerance' has become all-pervasive with respect to disapproved behavior -- ranging from smoking to child abuse, and including pornography and other illegal or offensive content -- so it could be argued that the tolerance principle has become an extremely restrictive one. This aspect of the community standards test may well have to confront the challenge the Internet presents for the definition and delimitation of a community, a factor which is concealed behind the rhetoric that the Internet as a "global village" is a global community, while it is really a global megalopolis, an amalgamation of multitudes of cultural difference.
There are further complications, for when it is said that community standards must be national, it does not mean "the standards of a small segment of that community such as the university community where a film was shown (R. v. Goldberg, 3 O.R. 323 (C.A.)) or a city where a picture was exposed (R. v. Kiverago (1973), 11 C.C.C. (2d) 463 (Ont. C.A.))." But if this is really the case, then, first, the complex multiculturalism which has developed in Canada in the last few decades presents a formidable problem to discovering and articulating a community standard which is not a lowest common denominator of a multiplicity of very different groups of individuals. Second, if one accepts the fact that different provinces and different regions of the country are distinct, then it must be recognized that there are historic differences in Canada which the concept of a single nationwide community standard obviates. Third, specialist communities such as universities, research institutes, research libraries and hospitals have always in practice had a latitude concerning obscene, indecent or patently offensive materials which to an extent has been sustained by the various "redeeming value" clauses of the obscenity and child pornography sections of the Criminal Code. With the emergence of the problems presented by the Internet -- which is, among other things, a research library, the common room of a series of global research institutes, the conference room for professional consultations, and a multitude of global university classrooms -- and with the Butler-defined community standard, the existence of the "redeeming value" clauses does not necessarily avoid a strong likelihood of creating a chilling effect resulting in a de facto prior restraint, especially if the boards and senior administrators of these institutions were to take steps to protect themselves and their institutions from possible liability.(22)
The problem is even more severe. A distinction is sometimes made between first- and second-class speech: first-class speech includes recognized literary and artistic work, professional writings and the like; second-class speech, newly emerging artistic activity, everyday cultural production, and the banter -- sometimes indecent and patently offensive -- of the interchange in heated discussions.(23) Since it is difficult to anticipate an individual jurisdiction's or court's approach to what might be obscene, or even to what might constitute child pornography, the expression and communication of many individuals is silenced through fear of the costs of possible prosecution. The natural result is to sharply reduce the freedom of speech and expression. On the Internet the restraints are imposed not only by the fears of those participating, but by the fears of ISPs, BBSs (owners of computer bulletin boards or newsgroups) and the institutions, such as libraries and universities, which may consider themselves to be responsible. This type of prior restraint is especially serious with respect to ongoing creativity within the arts and cultural production, which, as noted above, may well have a specific ecological role in the exploratory transformation of language and communication, or with areas of academic research that probe into problematic areas of human behavior.
In the opinions rendered by the U.S. Supreme Court in the previously mentioned government appeal of ACLU v. Reno, the tensions which arise from the instantaneity the Internet generates and the shrinking of space it produces are clearly demonstrated. Both U.S. courts found the "community standards" test to be problematic as applied to the Internet. These problems arise from the conflict between local standards and national standards noted in the U.S. Supreme Court majority decision, in which Justice Stevens observed:
. . . the "community standards" criterion as applied to the Internet means that any communication available to a nation-wide audience will be judged by the standards of the community most likely to be offended by the message....The regulated subject matter includes any of the seven "dirty words" used in the Pacifica monologue, the use of which the Government's expert acknowledged could constitute a felony. See Olsen Test., Tr. Vol. V, 53:16;54:10. It may also extend to discussions about prison rape or safe sexual practices, artistic images that include nude subjects, and arguably the card catalogue of the Carnegie Library.
If attempting to regulate the content of the Internet within the United States could produce such potential conflict about community standards, what would be the result of attempting the genuine international agreement the Canadian government appears to desire? Or, returning to Regina v. Butler, what constitutes the whole community on the Internet?
Approaching the problem of control and regulation of content, IHAC commissioned a report from Industry Canada on "Illegal and Offensive Content on the Information Highway."(24) After receiving this copious and thorough report, IHAC made only a few fairly general recommendations concerning the control of content. For its part, the Council advocated that the Government of Canada adopt a strategy involving the law, information providers, the public, and technology itself.
Primarily, the Council's recommendations would apply existing laws to the Internet, although they clearly recognize the problem of differing international community standards and the need to clarify the differences between the treatment of private and public communication on the Internet. In summary, their final recommendations indicate that the government should: fine-tune existing laws controlling content; encourage the holding of inter-jurisdictional international meetings to discuss the control of content; clarify ambiguous legal definitions for owners of Internet services; urge Internet Service Providers to create voluntary codes of ethics and adopt modes of dispute resolution; and provide R & D funding to develop filtering and other control mechanisms for individual use in homes to control content and to aid the police in improving enforcement. IHAC, possibly disregarding the complexity of the new technology, also recommended assigning regulation of cultural content on the Internet to the CRTC, arguing that it constituted an extension of broadcasting and cable communication. In coping with these issues, the major problem presented by IHAC's report results from the Council's being misled by the implications of the metaphor of the Information Highway, and from its failing, in its discussion of issues of content, to draw sufficient distinctions about the differing features of the Internet. Consequently, the Council does not confront the deliberately 'anarchic' or 'chaotic' features which have characterized the Internet from its inception by ARPA.
A broader understanding of the problem appears in the U.S. District Court's decision, which cited and endorsed the testimony of an expert witness for the plaintiffs: "What achieved success was the very chaos that the Internet is. The strength of the Internet is that chaos." Judge Dalzell reinforced it with his comment: "Just as the strength of the Internet is chaos, so the strength of our liberty depends upon the chaos and cacophony of the unfettered speech the First Amendment protects." By the mid-nineties the purposefully 'chaotic,' random structure of the Net was well recognized, as was the unique nature of the Internet mentioned above. The U.S. Supreme Court not only concurred with, but strengthened, the District Court's stress on this uniqueness, for Justice Stevens, endorsing Judge Dalzell's comment in the lower court's decision, takes note of the:
"participatory, dialogic and overwhelmingly inclusive nature of the discourse which takes place on the Internet. While dialogue on the internet frequently tests the limits of conventional interchange, the Internet is a far more speech-enhancing medium than print, the village green, or the mails ...."
Recognizing the autonomy and empowerment that the Internet affords a wide range of people, Justice Stevens's decision also took Judge Dalzell's observation that "Any content based regulation of the Internet, no matter how benign the purpose, could burn the global village to roast the pig" and strengthened it by noting:
In Sable, 492 U. S., at 127, we remarked that the speech restriction at issue there amounted to "burn[ing] the house to roast the pig." The CDA, casting a far darker shadow over free speech, threatens to torch a large segment of the Internet community.
This very feature of the Internet has led a concerned group of Canadian lawyers to organize LOGIC (Legal Group for the Internet in Canada), whose co-founder and Chair has emphasized that "Due to the nature of the Internet, including its history, culture, amorphousness and universality, it is quite impossible to effectively regulate .... [for] The very essence of the Internet is anarchy, a diametrical opposite of authority." The chair of LOGIC further alleges that IHAC's paper fails because it shows a confused understanding of the nature of the Internet.(26)
In 1996, apparently in response to one of IHAC's recommendations, the Government through Industry Canada commissioned an Internet Content-Related Liability Study. Four lawyers were appointed and directed to produce a report on the potential legal liabilities of ISPs in providing access to the Internet. In developing this study they were specifically prohibited "from elaborating policy options or formulating recommendations for legislative amendment" and from examining "whether Internet activities should be regulated under telecommunications or broadcasting legislation."(28) Their commission permitted wide consultation with those involved in the Internet, particularly the commercial and industrial players. In their completed report the authors noted that their study "appears to be the first of its kind in the world on the specific legal issues of liability for content circulating on the Internet."(29) The authors review the possible problems of liability arising between the provision of Internet access and: (1) the Criminal Code (obscenity, child pornography and hate propaganda); (2) trade-mark infringement; (3) civil liability; and (4) copyright infringement.
Fundamentally their study explores, and provides detailed analyses of, how existing laws might be applied in relation to the Internet. In the general conclusion to the Report they observe that "The Internet revolution poses various challenges in applying, enforcing and abiding by existing laws," but if amendments to those laws are necessary, these lawyers recommend that they should be made de minimis and in as technologically neutral a way as possible, always keeping in mind the balancing of the "interests of users, publishers and disseminators on the one hand and those of authors on the other, while preserving freedom of expression and only imposing limits on such freedom as is necessary in a free and democratic society."(30) While carrying out its analysis of possible applications of specific laws to the provision of access to the Internet, this Industry Canada report again does not examine the nature of the Internet and its accompanying technologies, or how radically the Net differs from preceding technologies. Consequently this exercise has created as many problems as it solves, for it pressures those providing service on the Internet without reviewing the adequacy or inadequacy of the laws involved, and without discussing the issue of laws which control content in the context of a genuine study of the nature of the Internet. It is easy to assume that this study was an exercise sponsored by Industry Canada to make it seem as if it has reviewed the issue, while simultaneously contributing to the rush to equate the provider of Internet service with a publisher or distributor, the sum effect being to impose a massive prior restraint on the Internet before a judicial review by the Canadian courts.
Although any consideration of the CRTC's role in relation to the Internet was excluded from the study of the Internet Content-Liability group, it seems clear that the CRTC could regulate Canadian-based activity on the Internet under both the Broadcasting Act and the Telecommunications Act.(31) It is an important issue that must be faced, particularly with reference to questions of the wisdom and the practicality of such regulation when applied to content on the Internet in contradistinction to content in other modes of telecommunication or broadcasting. The attendant question, which has not yet been confronted, since the Canadian government is still trying to pursue the problem of content on the Internet primarily under existing legal and legislative structures, is whether the unique nature of the Internet, given its complex assemblage of multiple modes of communication, permits it to be treated adequately and successfully simply as a broadcast or a telecommunication transmission.
Subsequent to the IHAC report and initiatives of the Clinton-Gore administration in the U.S., the Government of Canada has exhibited considerable interest in promoting international discussions concerning the control of content on the Internet. Most recently (summer 1997), Lloyd Axworthy, the Minister of Foreign Affairs -- citing the usual concerns about terrorism, drugs, obscenity and child pornography -- has stressed the need for a global policy on content. The real question which arises, as broached earlier, is what problems would arise in the attempt to articulate such a policy. If an electronic edition of Salman Rushdie's Satanic Verses(32) were made available on the Internet, for example, Canada would certainly have to consider the Iranian condemnations of such a work as counseling a criminal activity and possibly as representing an incident of hate speech; but because of the legal force of Section 2 of the Charter, it would ultimately have to reject the Iranian position. And while in upholding Rushdie's freedom of expression through such a rejection Canada might enjoy the support of the United States, the United Kingdom, and much of the Commonwealth and Europe, would Iraq, or China, or even India agree? A multitude of such problems could be cited, quite predictable considering the differing international interpretations of and attitudes toward Article 19 of the United Nations' Universal Declaration of Human Rights, which guarantees freedom of opinion and expression. For example, would Canada or the United States be willing to ban content that was objectionable or offensive in pietistically Catholic or Muslim nations? Yet, as unlikely as it seems, even if through diplomatic give and take such an international agreement could be achieved, would it not be premature to conclude it at a stage in which there is little common understanding of the nature of the Internet and of the balancing of the gains and losses which premature control might produce?
To put these questions in a more immediate context, could Canada even work out a common compromise with the United States on issues concerning control of content? At the moment, it is apparent that there is a standard of tolerance in the United States concerning hate propaganda which does not necessarily apply in Canada. U.S. web sites have been willing and legally able to mirror neo-Nazi sites banned in Germany and Canada. The difficulty of a nation reacting to such material was exemplified when Germany tried to suppress expression contrary to its laws against neo-Nazi activity by blocking Internet users within Germany from accessing a bulletin board in Denmark which carried such material. Following the official German blocking, this site was immediately mirrored by three major U.S. universities -- MIT, Stanford, and the University of California -- all of which were of vital importance to international research networks. Consequently, to ban the offensive material, Germany had to confront whether or not to prevent its research institutes and universities from contacting these sister institutions in the United States. And even if a ban were to have been effected, would it have been possible to control totally all possibilities for the dissemination of the offensive information? With vastly differing standards of what ought or ought not to constitute protected speech and expression, is a sensible, workable international agreement feasible, and even if it were, would it be desirable?
One further international question with respect to content on the Internet, which some may regard as a mode of censorship and others as a protection of culture, is what languages are to be used on the Internet. France has already attempted to enforce laws concerning language applying to the Internet, and the OLF in Québec has already asserted its right to extend existing language laws to the Internet. Are Québec language laws going to apply to the Internet? A commercial web site in Québec is advertising, so will it be required to use French? Government of Canada web sites provide bilingual services, but how far should bilingual services be available on the Web? Would it be acceptable for a country to block material that was not in an official language, which is apparently what France wished to do in bringing an American university's French branch to court for using English on the Internet? The complexities of the problems presented by this procedure, and of attaining a broad international agreement concerning them, will certainly prove daunting. All of the issues examined so far reinforce Negroponte's views concerning the impossibility of controlling content on the Internet. The only possibility remaining for control of the Internet was outlined in a BBC interview of a Singapore official: "after making very eloquently the point that the model of 'gate keeping' is failing every day, he explained that the function of censorship is to provide a SYMBOL. In other words to dress up society in the straight-jacket of self-denial, self-limitation and self-censorship."(33)
Confronted with the failure of 'gate-keeping' by governments and the courts, or through direct legislation such as the Communications Decency Act in the U.S., politicians have now turned to a second method of control or regulation of Internet content: the use in the private and/or public sector of filtering/blocking software produced by commercial software or shareware designers, which permits parents, employers, Internet Service Providers or public institutions to block the flow of specifically designated content. The unequivocal decision by the U.S. Supreme Court concerning the CDA also recognized the future possibilities for filtering/blocking software as a less limiting alternative which would permit parents to control access to the Internet in order to protect children from being exposed to speech they consider indecent or patently offensive, while still allowing adults their right to access and view such material. Immediately following that Court decision in June 1997, President Clinton pledged his government to see that successful filterware is developed and made available to parents wishing to protect their children against illegal and offensive material on the Internet.
In Canada IHAC had obviously already supported the need for a technology to filter or block offensive material, since in their report they had called particular attention to a future possibility for using some Internet-adapted version of the V-Chip, then being developed in the engineering faculty at Simon Fraser University to permit the blocking of violence on TV. Probably, if the Council had been fully aware of the difficulties and complexities of using such a chip on the Internet, they would have opted instead for the development of specially designed blocking/filtering software.(34) The theory is that control of the Internet through such filtering/blocking software -- used either to block sites decided upon by a software provider or, alternatively, to block sites on the basis of self-classification by the individual who maintains each site -- could be enforced through legislation, a course of action which has already been suggested by groups in Canada and the United States, and which is already being enforced elsewhere.
At first the use of filtering/blocking software may seem like a simple, reasonable solution, free from any threat to the freedom of expression of adults. Once one examines this proposal more thoroughly, however, serious problems arise when filtering/blocking software providers select the sites and program the categories to be blocked, since this constitutes permitting a private company to make censorial decisions for the public. The frequently proposed alternative solution is to have the filtering/blocking software block sites on the basis of self-classification -- what the industry has dubbed self-labelling -- by each individual who operates a web site, bulletin board or newsgroup. The difficulties inherent in this scheme are succinctly summarized by the Singapore official quoted above, since it "dress[es] up society in the straight-jacket of self-denial, self-limitation and self-censorship."
Turning first to filtering/blocking software currently in use, in which the software companies have pre-selected the sites to be blocked, problems arise once one starts asking questions concerning what might or might not be specifically blocked, by whom, for whom, and in whose interests. The use of such software has not been free from controversy. First of all, while these software packages are essentially designed to permit parents to protect children from accessing sites which they would regard as endangering the development of the child, there have been frequent demands that such software be installed on the computers of public institutions such as libraries.(35) It is also being used by some employers, including some branches of government. Second, while such software allows the parent, institution or employer to select what categories of material are or are not blocked, the makers do not usually provide any detailed information concerning precisely which specific individual sites they have selected for blocking. For example, a category chosen for blocking by the manufacturer might be described as sexual explicitness. But such a category, in addition to blocking sites that might be considered by most users as sexually explicit, might also block sites providing specific information about sexually transmitted disease, birth control, abortion, or AIDS, yet the users would not be aware that these socially useful sites involving discussions about aspects of sexuality are being blocked. Third, since the manufacturers do not reveal the specific sites that are being blocked -- pleading that a list of specific sites is a commercial secret and that releasing it would provide a guide to indecent and patently offensive material which impressionable children or teenagers might then access -- it is possible for the designers of the software to implement hidden agendas by blocking sites with whose views they do not agree.
Fourth, filtering/blocking software can also be used by nations with repressive political administrations to block speech and other expression that they deem to be unsuitable (e.g., China, Iran, Singapore).
One such package, Cybersitter, provided by a software company called Solid Oak, was a particular target of criticism and controversy in 1996 and 1997. It has been argued that Solid Oak blocks sites containing criticism from those who disagree with Cybersitter's modus operandi. One such site, denominated "www.peacefire.com," is operated by Bennett Haselton, a youthful anti-censorship activist.(36) Haselton had charged that Cybersitter was blocking various other sites for ideological reasons. He and others have also argued that groups opposed to control of content on the Internet have been blocked by Solid Oak as well as by other "censorware." Whether or not these allegations are correct (at least the ones Haselton has made appear to be true), the secrecy insisted upon by the "censorware" companies leaves open a genuine possibility that they could select sites to block for their own particular reasons. The list of what Solid Oak actually blocked in the spring of 1997 was made publicly available on the Internet by a law professor at Case Western Reserve University. It showed that Cybersitter blocked some Canadian universities, including McGill, as well as various U.S. universities, including the University of Chicago. At one point, allegedly because of ideological disagreements with some user comments on Solid Oak's policies, M.I.T. was also blocked. While Solid Oak is alleged to have been more indiscriminate, agenda-oriented, and retributive in its selection of sites than other manufacturers of filtering/blocking software, the same potentialities are present in any other censorware, such as Cyberpatrol or SurfWatch, whose makers insist on keeping their lists secret.
Further, since the sites to be blocked by such software are largely selected by the blocking of words or phrases in the site's web address, addresses using terms such as urinate, sex, nudity, gay community and the like are blocked even if they actually offer valuable medical advice rather than indecent or patently offensive material. Still further, since these businesses make the ultimate decision of what to block or not to block without any public accountability, this, taken together with the preceding points, constitutes a type of public censorship by private sector businesses which should be in conflict with Section 2 of the Charter of Rights.
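The over-blocking just described can be sketched in a few lines of code. The word list and addresses below are hypothetical illustrations, not any vendor's actual data or algorithm; the point is simply how substring matching on addresses inevitably sweeps in legitimate material:

```python
# A minimal sketch of naive keyword-based blocking, as described above.
# The blocked-word list and the URLs are invented for illustration only.
BLOCKED_WORDS = {"sex", "nudity", "urinate"}

def is_blocked(address: str) -> bool:
    """Block any address that contains a listed word as a substring."""
    lowered = address.lower()
    return any(word in lowered for word in BLOCKED_WORDS)

# The intended target is caught ...
print(is_blocked("http://example.com/explicit-sex-pictures"))      # True
# ... but so are a public-health page and even an English place name:
print(is_blocked("http://example.org/clinic/safe-sex-education"))  # True
print(is_blocked("http://example.org/visit-sussex-england"))       # True
# while an unlisted page passes:
print(is_blocked("http://example.org/museum/impressionism"))       # False
```

The filter has no way to distinguish medical advice from "patently offensive" material, which is precisely the problem raised above.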
An alternative, favoured by many, including the White House, parents' groups, and industry, is to develop a system in which the owners of various sites on the Internet (web pages, newsgroups, chat rooms, etc.), or some neutral third party, rate and label the potential degree or level of indecency or patent offensiveness which appears at the site.(37) By 1997 filtering/blocking software capable of using such labeling was available to or was being developed by software companies such as Microsoft and Netscape that provide browsers for the Internet. These software packages can be programmed to permit the user, presumably a parent or authority in charge, to block a variety of different sites on the basis of how they have been labeled. While at first glance this would seem a reasonable alternative to the problems posed by the software provider's deciding what sites to block, there are still problems inherent within it. In 1997 SafeSurf, one of the two PICS-compatible ratings systems, lobbied the U.S. government, the industry, and parents' groups to support the design of appropriate filtering/blocking software.
Since it would be both highly problematic and prohibitively expensive to develop a third-party monitoring organization which could visit and evaluate the content of all sites, obviously the only practical solution -- as SafeSurf recommended -- is to require individual owners to self-label their sites. To support the use of such software through self-labeling, SafeSurf also argued that legislation must be put in place similar to its proposed draft of an "Online Cooperative Publishing Act," designed to provide "a Safe Internet Without Censorship." For this system to work, SafeSurf points out, such legislation would have to provide civil and/or criminal recourse against those who 'mislabel' their sites, with penalties for non-compliance ranging from initial fines of up to $5000 (US) to larger fines and incarceration for repeat offenses.(38) SafeSurf provides a document explaining its rating system entitled "The SafeSurf Internet Rating Standard." It sets out categories or types of different sites for which it would provide filtering and which, therefore, should be labeled for blocking. These categories, SafeSurf notes, were "Designed with input from thousands of parents and Net Citizens":
Sex, Violence and Profanity;
Intolerance of another person's racial, religious or gender background;
glorifying drug use;
other adult themes;
gambling.
The categories or themes are then subdivided into nine caution levels by age range, running from suitability for all ages through four stages of childhood (children; older children; teens; older teens) to levels such as:
adult supervision recommended;
limited to adults;
explicitly for adults.
The application of these caution levels to the respective categories can be illustrated by the category Nudity:
1) Subtle Innuendo
Subtly Implied through the use of composition, lighting, shaping, revealing clothing, etc.
2) Explicit Innuendo
Explicitly implied (not shown) through the use of composition, lighting, shaping or revealing clothing
3) Technical Reference
Dictionary, encyclopedic [sic], news, technical references
4) Classic works of art presented in public museums for family viewing
5) Artistically presented without full frontal nudity
6) Artistically presented with frontal nudity
7) Detailed Graphic
Erotic frontal nudity
8) Explicit Vulgarity
Pornographic presentation, designed to appeal to prurient interests.
9) Explicit and Crude
Explicit pornographic presentation(39)
It should be noted that in this complex classification system SafeSurf goes beyond indecent and patently offensive material, including categories such as intolerance of another person's racial, religious or gender background; glorifying drug use; other adult themes; and gambling. It also provides for a category denominated "violence," as well as one denominated "sex, violence and profanity." It recognizes four stages of childhood: children; older children; teens; and older teens. But it does not distinguish between a teen who is attending a college or university and one who is still in high school or in the work force. The application of the "caution levels" to the category "nudity" clearly indicates the highly subjective, emotive and somewhat ambivalent language involved.
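How a browser might act on such self-labels can be sketched as follows. The site labels, category names and parental thresholds here are illustrative assumptions only, not the actual PICS or SafeSurf wire format:

```python
# Illustrative sketch of filtering on SafeSurf-style self-labels: each site
# declares a caution level (1-9) per category, and a parent configures the
# maximum level permitted. All labels and URLs are invented for illustration.
SITE_LABELS = {
    "http://example.org/museum-collection": {"nudity": 4},  # classic art
    "http://example.org/health-reference":  {"nudity": 3},  # technical reference
    "http://example.org/adult-only":        {"nudity": 9},  # explicit and crude
}

def allowed(url: str, limits: dict) -> bool:
    """Permit a site only if every labelled category is within the limits;
    a category with no configured limit defaults to the strictest setting."""
    labels = SITE_LABELS.get(url, {})
    return all(level <= limits.get(category, 0)
               for category, level in labels.items())

parent_limits = {"nudity": 5}  # up to artistic presentation
print(allowed("http://example.org/museum-collection", parent_limits))  # True
print(allowed("http://example.org/adult-only", parent_limits))         # False
```

Even this toy version exposes the essay's point: the blocking decision turns entirely on the number the site owner chose to assign, which is exactly where the subjectivity of the caution levels, and the legal liability for 'mislabeling,' come to rest.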
Since public non-profit organizations such as SurfWatch have already recommended that such filtering software also be used by libraries and other institutions to which children have access, there is no doubt that such filterware as SafeSurf proposes would be directed to a market which includes schools, libraries, businesses (e.g., cybercafes) and institutions (e.g., teen clubs) to which children have access. Perhaps with the category of "older teens," there would also be some pressure for it to be used in colleges and universities. Part of this is confirmed by the actions of cities such as Boston, where in 1997 the city government required all public institutions, including the public library, to use filtering software on their computers; Austin, Texas, where the library was using censorware; and the state of Texas, which was introducing a labeling law. By the summer of 1997 it seemed apparent that governments would attempt to legislate the use of a labeling system and that many constituencies would require filters to be used in colleges, universities, libraries and possibly other public institutions.
With an awareness of the growing enthusiasm for blocking/filtering software and self-labeling, the American Library Association through its Intellectual Freedom Committee articulated in July of 1997 a "Statement on Library Use of Filtering Software" to supplement its "Library Bill of Rights."(40) The Statement asserts:
The use in libraries of software filters which block Constitutionally protected speech is inconsistent with the United States Constitution and federal law and may lead to legal exposure for the library and its governing authorities. The American Library Association affirms that the use of filtering software abridges the Library Bill of Rights.(41)
Citing examples of the inevitable blocking of legal and useful materials -- similar to ones noted earlier in our discussion -- the ALA reasserts that "Libraries are places of inclusion rather than exclusion," and that as publicly supported governmental institutions they are subject to the First Amendment, just as in Canada libraries in particular must certainly be subject to Section 2 of the Charter. This would assure public access to information about breast cancer, AIDS, women's rights, or animal rights -- all of which have been blocked by commercial blocking/filtering products.
The implementation of any program for monitoring the adequacy of self-labeling must confront the fact that the magnitude of the problem of policing such self-labeling of sites would be vast. Even with such policing there would be no possible recourse against postings from other national jurisdictions. Self-labeling would not operate successfully for newsgroups or chat rooms, where it may be impossible to predict what information might arise and be disseminated during an interchange. Unmoderated groups, which present excellent opportunities for creating a free public square, would either have to be eliminated, or responsibility and liability for what occurs would have to be assigned to some individual or individuals who would provide the self-labeling and then "police" the conversations and discussions within the group to assure compliance with it. There is a further, more serious aspect to the proposal of self-labeling, since individuals would be faced with civil actions and criminal charges resulting in costs, awards, fines and/or imprisonment for improperly rating one's site. This dramatically raises once again the question of the fairness, precision or adequacy of the "community standards" test when applied to the Internet. It has been said that what is one woman's art is another woman's pornography; what one parent considers suitable for a fourteen-year-old child, another parent finds dangerous and objectionable; what may be offensive to a Roman Catholic parent may not be offensive to a Jewish parent; and tolerance for certain forms of humour varies widely among cultural groups. There is no certainty that one's view of the potential "danger" of certain material for certain age groups will correspond to what someone else regards as the obvious "community" standard.
Self-labeling must necessarily be a form of "self-denial, self-limitation and self-censorship." Under current conceptions of freedom of expression, linking it with specific criminal or civil sanctions represents a chilling effect which imposes a prior restraint on speech, expression and communication. The reasonableness of such a prior restraint should obviously have to be connected with a demonstrable clear and present danger. Courts in Canada and the United States both admit that there is no clear, demonstrable evidence to sustain allegations that there are inherent dangers in indecent, patently offensive, pornographic (even including child pornography) or obscene material. Yet since 'popular wisdom' acknowledges such danger to be clear and present, should the 'gut feeling' of 'popular wisdom,' which the Supreme Court of Canada seems to have endorsed in Butler, constitute a reasonable enough standard under section 1 of the Charter to outweigh the potential damage of restraining speech, expression, and communication on a medium as unprecedented as the Internet, "a far more speech-enhancing medium than print, the village green, or the mails"? This raises critical issues about what is and has been meant by successive assertions assuring the freedom of speech and expression, and about what should constitute "reasonable limits prescribed by law as can be demonstrably justified in a free and democratic society."(42) Should 'popular wisdom' be a demonstrable justification, in light of highly inconclusive and conflicting scientific studies?
It might well be argued in reply that this is not essentially different from legal approaches in the past, through which individuals in Canada and other jurisdictions have been faced with the difficulty of deciding whether community standards would judge certain material to be obscene or hateful, since under the criminal code governing obscenity, child pornography or hate literature the onus has been on the accused to establish some acceptable "redeeming value": social, artistic, or intellectual. Two major problems make this argument problematic. First, according to the Supreme Court of Canada, "redeeming value" is also to be judged by "community standards," a process in which 'experts' have no standing; yet many of the sites, discussions and other materials on the Internet are there as part of an 'expert' discourse essential to the work of the expert, but made open and available to others wishing to gain a deeper understanding of that discourse. Second, once again the uniqueness of the new medley of technologies confronts us with the dilemma of whether more is to be lost by control than is to be gained by a free flow of information.
As citizens of a nation and inhabitants of a global megalopolis, we are forced by the control of content on the Internet to confront the question of how far we are willing to have freedom of expression and communication in an adult world governed by a standard designed to protect against the possible, but apparently undemonstrable, harm that might be done to children or other adults. Do we really wish to bar intelligent teenagers, much less adults who have access only to public, filtered terminals, from participating in small discussions among groups of interested adult individuals? Do we condone restricting in any way access to the online equivalent of libraries, museums, universities, and agencies disseminating news? Do we wish to subject a multitude of other useful services becoming available on the Internet to such a standard? The Canadian-based Nizkor web page, which posts all material pro and con concerning the Holocaust and which has been universally praised, has illustrated the proper way of handling adult debate. Parents and teachers must take the time, interest, and responsibility, which are part of parenting and pedagogy, to supervise, monitor and discuss their children's use of the Internet, assisted if they wish by software through which they themselves can monitor and make the decisions for which they are responsible concerning their children's participation. Such an approach seems to be demanded by the complexity and the potential value of this unique new medley of technologies, if the global discourse among adults is not to be reduced to what is acceptable to young children or to the narrowest community standards.
The concern demonstrated since 1994 by the Canadian, and particularly the Ontario, arts communities about the censorship exercised through the effects of problematic statutes and judicial opinions (such as R. v. Butler) was sparked by, but by no means limited to, the Eli Langer case.(43) Their critiques dramatically illustrate the dilemma of artists, writers, critics, publishers, gallery owners or curators confronted with an imposed necessity to self-censor, since artists often cannot afford or risk the possibility of their work leading to a criminal charge or even a civil action. Since the Internet is being used internationally by artists and writers to exchange ideas and provide their works (if video art) or replicas and/or photographs of their works, a system of self-censorship through ratings would seriously inhibit their creativity and impose limitations on Canadian artists that would place them under suppressions not experienced by their peers in other parts of the world. Similar problems would impact on scholars and digital libraries. For example, rating a site reproducing an electronic version of Catcher in the Rye, Ulysses, or The Stone Angel would run into the problem of how to rate its appropriateness for a specific age group. If 18 and over were the decision, would a 17-year-old first-year university student be banned from viewing these works with the same freedom of access as an 18-year-old in the same class? Or would the 18-year-old have to forego the experience of the work being part of the curriculum, since it would not be suitable for her classmates under 18? If schools were forced to use filtering software to block on the basis of self-labeling, would all parents with children over fourteen wish to prohibit access? Would a scholar posting an electronic edition of D.H. Lawrence's Lady Chatterley's Lover be guilty of an offence if in labeling it he chose to say it was not restricted to adults only?
While it might be suggested that first-class speech such as that of either D.H. Lawrence or Margaret Laurence could hardly be in danger of censorship, it is important to remember not only that there have been attempts to censor both in public libraries (just as the film The Tin Drum was banned in Oklahoma in August 1997), but also that we are not discussing an issue of banning but a law which would criminally penalize mis-classification. It would certainly provide a strong disincentive for the owner of a web page to decide that someone in their teens could read Catcher in the Rye, only to discover that a court had agreed that such a classification was misleading. The obvious solution, therefore, is to classify the book as available only "adult supervised" or "adults only," and this might in a rigid interpretation mean that a nineteen-year-old second- or third-year university student could not access the book online unless a twenty-one-year-old friend from third or fourth year were present. In any case, would it be promoting maturity to deny a high school student access to such a book without permission from her or his rather overly fearful, rigid parents? The complexities of the rights of maturing young women and men in the society are seriously exposed when the problem is laid out in this manner.
In relation to pictorial material, how is a museum to classify a painting of St. Theresa in ecstasy? Such a painting may have been viewed in the past by children because it was a religious icon. But the same painting is not an icon to someone who is not of the same belief, or who is opposed to religious iconography. Does it then become classified as a recognized painting -- but then how does one distinguish it and other icons from works bringing together sex and violence? An obvious answer might be to exempt museum sites, as the major players in the media world have suggested exempting news sites. But then the problem is shifted to who chooses the proper distributors of news or the proper managers of museums, since, it should be remembered, it is museums and galleries that put on shows of works such as Langer's or Mapplethorpe's, and the National Museum of Canada has exhibited extremely controversial videos. In a society in which one community awards The Tin Drum an Oscar while another condemns it as child pornography (and it should be remembered that The Tin Drum was originally censored in Ontario), a self-classification system is going to have a chilling effect -- and an even more chilling one when we move from first-class speech to second-class speech. If the U.S. Supreme Court spoke of the CDA torching free speech on the Internet, surely rating and classification would absolutely demolish it.
Why does the emergence of the Internet as the maturing and merging of a series of technologies raise such great concern about media censorship? It should be quite predictable, in the light of the history of communication technologies, since successive waves of censorship, often through severe repression of speech and expression, have accompanied the beginnings of new technologies. To select just a few examples: the early days of printing saw the rise of the Roman Catholic Index of Prohibited Books and the Inquisition; the beginnings of mass printing saw campaigns of censorship in the late seventeenth and eighteenth centuries, and the appearance of laws such as those governing criminal libel, which extended beyond books and newspapers to the theater. The twentieth century has been vitiated by waves of censorship sparked by advances in publishing, radio, film and television successively. While there is no simplistic relation between cause and effect, technological change has frequently played a significant role in generating attempts to control speech and expression in all media. For example, the shift from morality to harm in legislating and adjudicating forbidden speech and expression, which has been given vast impetus in recent years by activists, began in the 1960s as a concern about the supposed effects of TV violence -- a question still being heatedly debated. It is not surprising, then, that the Internet as a new phenomenon, barely understood, has given new energy and urgency to the proponents of the view that media are harmful -- a view expanded to encompass the similar harm attributed to pornography -- accompanied by even greater fears and stronger demands for suppression. But as Ithiel de Sola Pool, with whom we began, points out, the history of communication technologies, the history of censorship, and the history of understanding media should rather lead to caution in curtailing the technologies of freedom.
The Internet presents a fundamental challenge to society, for the issue now has become the realization of the tremendous potentials for enriching human learning and interchange possible through the complexity of this new medley of technologies. Since it is ultimately not possible to effectively control the Net, short of virtually dismantling it, or at the very least rendering it substantially less effectual, it has dramatically demonstrated the dangers in the very idea of censorship or any control of content on the Net -- except that promoting criminal acts. If all Canadians are not to lose, and to contribute to other jurisdictions losing, a superb opportunity for knowledge and understanding, the Government and the Supreme Court of Canada must come to terms with reassessing the Charter, so that limiting knowledge and understanding by inhibiting the "freedom of communication and expression" offered by the Internet -- in order to protect the interests of specific groups, even children -- is not and cannot be one of those "reasonable limits prescribed by law as can be demonstrably justified in a free and democratic society."
The research of this paper has profited greatly from the online conversations on the Internet discussion groups maintained by Electronic Frontier Canada and by Fight-Censorship in the U.S. On Fight-Censorship the contributions of Declan McCullagh, Jonathan Wallace, Jim Tyre, and Professors Peter Junger of Case Western Reserve University and Seth Feldman of MIT have been particularly helpful, but all participants in their lively discussion have assisted my thinking. Jonathan Wallace's book Sex, Laws, and Cyberspace was extremely useful in understanding current issues in the United States. Dov Wisebrod of LOGIC (Canada) was encouraging, and his papers on the LOGIC website were particularly useful. Talks over the years with my colleague, John Fekete, have complemented my interest in censorship.
1. Bertrand Marotte, "Censorship Hot Topic at Conference" Southam News Background in Depth, February 13, 1996. Available at http://www.southam.com/mmc/waves/depth/tech/censor0213.html
2. Hon. John Sopinka, "Freedom of Speech and Privacy in the Information Age," Address to Symposium on Free Speech and Privacy in the Information Age, University of Waterloo, November 26, 1994
3. For a short history of the Internet see Bruce Sterling, "Science Column #5: Internet," The Magazine of Fantasy and Science Fiction, Feb. 1993. Available as "A Short History of the Internet" at: http://www.magnet.gr/internet/guides/bruce.html
4. For an interesting Canadian study on these issues, see Dov Wisebrod, "Controlling the Uncontrollable: Regulating the Internet" (1995) 4 M.C.L.R. 331 (updated), at http://www.catalaw.com/dov/docs/dw-inet.htm
5. It should be of some interest that David Johnston, who later became chair of the Information Highway Advisory Council appointed by the Canadian government, was Principal of McGill at the time of the ban in the Homolka case.
6. See above note 2
7. [1990] 3 S.C.R. 229; (1990), 76 D.L.R. (4th) 545
8. At least, according to President Clinton, for in remarks he made in the East Room on July 2, 1997 about "The Framework for Global Electronic Commerce," he observed that Al Gore had coined the term "Information Highway" seventeen years earlier.
9. Vancouver Freenet v. Canada, p. 1
10. Vancouver Freenet v. Canada, p. 9
11. Malcolm Lowry to Jonathan Cape [2 Jan. 1946], Selected Letters, ed. Harvey Breit and Marjorie Bonner Lowry (New York: J.B. Lippincott, 1964) 66
12. For a discussion of the phenomenon of moving beyond media, see Donald Theall, Beyond the Word: Reconstructing Sense in the Joyce Era of Technology, Culture and Communication (Toronto: University of Toronto Press, 1995) 91-109
13. Ithiel de Sola Pool, Technologies of Freedom (Cambridge: Harvard University Press, 1983) 7
14. Ibid., 7-8
15. Theall, Beyond the Word, and "Beyond the Orality/Literacy Dichotomy: James Joyce and the Pre-History of Cyberspace," Postmodern Culture 2.3 (May 1992) [an electronic journal], available at http://muse.jhu.edu/journals/postmodern_culture/v002/23theall.html
16. Ibid., see especially the discussion on censorship in the conclusion of Beyond the Word,
17. It should be noted that the decisions of the Federal District Court and the Supreme Court did not make any decision respecting the fifth amendment.
18. Findings of Fact, #74
19. In Marshall McLuhan's Letters, ed. Matie Molinaro, Corinne McLuhan and William Toye (Toronto: Oxford University Press, 1987) 78, we discover that he favoured this conception over "global village," which he used to describe the world he felt to be emerging.
20. R. v. Dominion News & Gifts (1962) Ltd., 2 C.C.C. 103 (Man. C.A.) at pp. 116-7
21. Dickson C.J. in Towne Cinema Theatres Ltd. v. The Queen, [1985] 1 S.C.R. 494 at pp. 508-9
22. Justice Sopinka's remarks at the University of Waterloo clearly indicate the tendency of university administrations, their boards and their Senates to act precipitously in the regulation of speech.
23. J. Shallit, "The Real Meaning of Free Speech in Cyberspace." An invited talk for the conference "The Internet: Beyond the Year 2000," University of Toronto, May 1, 1996. Available at http://insight.mcmaster.ca/org/efc/pages/doc/b2000.html
24. Gareth Samson, "Illegal and Offensive Content on the Information Highway: A Background Paper," produced by Industry Canada (June 19, 1995). Available at http://insight.mcmaster.ca/org/efc/pages/doc/offensive.html
25. Report of the Information Highway Advisory Committee, chapter 4, section on "Illegal and Offensive Content"
26. Dov Wisebrod, "Controlling the Uncontrollable: Regulating the Internet," sections 1b and 4 respectively. Available at http://www.catalaw.com/dov/docs/dw-inet.htm
28. Internet Content-Related Liability Study, Summary - Introduction - 1. Available at http://strategis.ic.gc.ca/ssg/it03117e.html
30. Ibid., 23
31. For a discussion of this problem see Michael S. Koch, "Square Pegs and Round Holes: CRTC Regulation of the Internet," 1996 posted at the Smith & Lyons web site.
32. It should be noted that for a brief period after Satanic Verses was published Canada Customs banned this book.
33. Example provided by George Koulikis [email protected]
34. Blocking/filtering software is succinctly described by the American Library Association:
Blocking/filtering software is a mechanism used to: restrict access to Internet content based on an internal database of the product, or;
restrict access to Internet content through a database maintained external to the product itself, or;
restrict access to Internet content to certain ratings assigned to those sites by a third party, or;
restrict access to Internet content by scanning content, based on a keyword, phrase or text string, or;
restrict access to Internet content based on the source of the information.
35. For example, the Vancouver Public Library is using blocking/filtering software in its children's department. In Boston, Mass. the mandatory use of filtering/blocking software was imposed in 1997 by the City through the Mayor. Since then, however, the librarians, using the ALA arguments, have refused to comply, except for very young children.
36. For information about Haselton, who started crusading against net censorship in his mid-teens, see his own web site at www.peacefire.com. By 1997 when Cybersitter threatened to launch a law suit against him, Haselton was attending university.
37. The following descriptive notes from the appendix of the ACLU's paper on rating and blocking, "Fahrenheit 451.2: Is Cyberspace Burning?" provide elementary descriptions of the components involved in a self-rating or a third party rating scheme.
38. at http://www.safesurf.com/online.htm
39. Ray Soular and Wendy Simpson, "The SafeSurf Internet Rating Standard" PICS Version 3.0, December 1995, available at http://www.safesurf.com/ssplan.htm
40. Available at http://www.ala.org/alaorg/oif/filt_stm.html
41. Available at http://www.ala.org/alaorg/oif/filt_stm.html. The formal resolution of the Council of the ALA adopted on July 2, 1997 is:
Resolution on the Use of Filtering Software in Libraries: WHEREAS, On June 26, 1997, the United States Supreme Court issued a sweeping re-affirmation of core First Amendment principles and held that communications over the Internet deserve the highest level of Constitutional protection; and
WHEREAS, The Court's most fundamental holding is that communications on the Internet deserve the same level of Constitutional protection as books, magazines, newspapers, and speakers on a street corner soapbox. The Court found that the Internet "constitutes a vast platform from which to address and hear from a world-wide audience of millions of readers, viewers, researchers, and buyers," and that "any person with a phone line can become a town crier with a voice that resonates farther than it could from any soapbox"; and
WHEREAS, For libraries, the most critical holding of the Supreme Court is that libraries that make content available on the Internet can continue to do so with the same Constitutional protections that apply to the books on libraries' shelves; and
WHEREAS, The Court's conclusion that "the vast democratic fora of the Internet" merit full constitutional protection will also serve to protect libraries that provide their patrons with access to the Internet; and
WHEREAS, The Court recognized the importance of enabling individuals to receive speech from the entire world and to speak to the entire world. Libraries provide those opportunities to many who would not otherwise have them; and
WHEREAS, The Supreme Court's decision will protect that access; and
WHEREAS, The use in libraries of software filters which block Constitutionally protected speech is inconsistent with the United States Constitution and federal law and may lead to legal exposure for the library and its governing authorities; now, therefore, be it
RESOLVED, That the American Library Association affirms that the use of filtering software by libraries to block access to constitutionally protected speech violates the Library Bill of Rights.
Adopted by the ALA Council, July 2, 1997
42. Canadian Charter of Rights, Sec. 1
43. For a series of representative articles and further bibliography see Lorraine Johnson, ed., Suggestive Poses: Artists and Critics Respond to Censorship (Toronto: Toronto Photographers Workshop and the Riverbank Press, 1997)