A few months ago, in the Web lounge of a Chapters store in Toronto, I watched an angry young woman hop off her stool and frantically signal an employee for help. She had been kicked off a Web site, she explained, and she wanted to know why.
The employee typed in the URL and, after a moment, explained that she had been blocked by the store’s content filter. “But it’s a site about research on violence against women!” the woman complained. “How can that possibly be considered offensive?”
The employee suggested that some of the language contained in the site’s text (i.e., “rape,” “vagina,” etc.) might have confused the filter, but the woman was not appeased. The incident shows just how irritating filtered Web browsing environments can be, and how even those in pursuit of legal, important information can find the online door slammed in their faces.
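The mechanics behind such a false positive are easy enough to sketch. What follows is a minimal, illustrative example of a naive keyword filter of the kind many early products resembled; the blocklist, function name and sample text are assumptions made for illustration, not the store’s actual software.

```python
# Illustrative sketch of a naive keyword-based content filter.
# The blocklist and sample page are hypothetical; commercial filters
# are more sophisticated, but many work on a similar principle.

BLOCKLIST = {"rape", "vagina"}  # assumed terms, for illustration only

def is_blocked(page_text: str) -> bool:
    """Block a page if any blocklisted word appears anywhere in its text."""
    words = {w.strip(".,;:!?\"'()").lower() for w in page_text.split()}
    return not BLOCKLIST.isdisjoint(words)

# A research page about violence against women trips the filter
# even though its purpose is educational -- a false positive.
research_page = "Statistics on rape and sexual assault in Canada..."
print(is_blocked(research_page))  # True: blocked despite legitimate content
```

Because the filter sees only words, not context or intent, a site documenting violence against women is indistinguishable, to it, from the material it was built to block.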
On Monday a landmark case began in a United States District Court, where a three-judge panel was scheduled to hear arguments regarding the Children’s Internet Protection Act. This legislation, which its detractors refer to as the Censorship Law, forces libraries and schools to filter out pornography and other offensive material in order to get government funding for PCs and Internet access. Plaintiffs like the American Civil Liberties Union will claim that the law violates the U.S. First Amendment, a position that would have carried more weight in the carefree days prior to Sept. 11. Even now, there is enough controversy over the issues involved that the judges may have a difficult time upholding it.
Whatever the outcome, the case may prove a benchmark of sorts for Canadian regulators, who are notoriously shy about making broad decisions that govern digital content. Around this time last year, for example, the Royal Canadian Mounted Police announced a five-point plan to promote “safe, wise and responsible Internet use.” This included public awareness campaigns and establishing closer links with Internet service providers (ISPs), but it also mentioned the government’s plans to amend the Criminal Code to better deal with those who use the Internet to lure children. If legislators have made any progress on that front, they’ve been remarkably quiet about it.
If nothing else, Canadians have done a decent job of at least trying to assess the danger before imposing a legislative remedy. Two years ago, for example, the government published “Young Canadians in a Wired World: The Student’s View.” Its statistics don’t do much to bolster the case for filtering. More than one in 10 students between nine and 17 said they have hacked through or disabled a filter on a home or school computer. Most say their parents do not use filters at home (65 per cent) or sit with them while they surf (68 per cent). Only two children in 10 said their parents even ask them to use a filtered search engine. With such a lax attitude at home, it’s hard to see many parents getting upset about students accessing porn in the classroom or the library.
Legal requirements for filtering raise questions of responsibility: is the school to blame if a child accidentally lands on a pornographic Web site (as more than half of them told the government they do)? Or is the technology, always prone to error and vulnerability, at fault? A Dalhousie University study from 2000 examined everything from automated text analysis tools to site labels and concluded that none of them really does the trick. “Regulation is not considered to be an effective way to address the challenge of offensive Internet content from a public policy point of view,” its authors wrote.
The U.S. government has made an admirable attempt to combat evil on the Web with its law, but the court case it has triggered is precedent-setting. The Internet cannot be run like a local library; every attempt to regulate the distribution of content has global implications. Canadian data indicates the technology isn’t there, nor is the demand. Until that situation changes, the judges need to realize that children are not the only ones who need protection.