Does The Internet Need Moral Police Or A Red-Light District?
posted 9th March 2000

Cyber Patrol is doing well - for its makers, not necessarily for its users. Cyber Patrol is a filtering product used to protect children from the Web's seamy side. Good intent, but the road to hell is paved with those. The need to protect minors is a given - for most people, anyway - but filtering software is the wrong approach, one subject to technical and ethical limitations. Like the command "Don't look in that cupboard", far from stopping an undesirable event, it ensures the event happens in a secretive, shady environment. The latest uproar over Cyber Patrol comes with hot-button issues attached.

Free speech. Reverse engineering. Copyright violation. The other day two hackers agreed to stop distributing their own software that lets users bypass Cyber Patrol's Web content filters. Good thing? I don't think it has much meaning, as the suspicion must be that the hacker deal was `sweetened'. There is also the point that blocking sites is open to huge abuse, as the so-called ORBS system set up by Alan Brown in Palmerston North demonstrates. ORBS ostensibly publishes and distributes lists of ISPs that allow their public email systems to be used by spammer relay programmes, hiding the spammers' true identities. These sites can then be blocked from getting or sending email. A worthy aim, but in furtherance of that aim the ORBS system secretly tests emailers with its own untouchable testware and decides unilaterally whether the emailer is a goody or a baddy. If it decides bad, the ISP's clients get unsolicited emails sent to them (which is spam by any definition) with a tone that says outright that the innocent user's email is under threat due to the bad ISP they have chosen.
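For context, relay blocklists of this kind are typically distributed over DNS: a receiving mail server reverses the octets of a connecting IP address, appends the blocklist's zone name, and does an ordinary DNS lookup - an answer means "listed", no answer means "clean". A minimal sketch of that query (the zone name below is illustrative only, not ORBS's actual zone):

```python
import socket

def dnsbl_query_name(ip: str, zone: str) -> str:
    """Build the DNS name used to query a DNS-based blocklist:
    the IP's octets are reversed and the zone is appended."""
    octets = ip.split(".")
    return ".".join(reversed(octets)) + "." + zone

def is_listed(ip: str, zone: str) -> bool:
    """An IP counts as listed if the blocklist zone returns an
    address record for the reversed-octet name; a lookup failure
    (NXDOMAIN) means the IP is not on the list."""
    try:
        socket.gethostbyname(dnsbl_query_name(ip, zone))
        return True
    except socket.gaierror:
        return False

# Name construction for a hypothetical zone:
# dnsbl_query_name("192.0.2.15", "relays.example.org")
#   -> "15.2.0.192.relays.example.org"
```

The point of the sketch is how little the querying mail server sees: it gets a yes/no answer from the list operator and nothing about why the address was listed - which is exactly where the natural-justice problem comes in.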

This is totally contrary to any form of natural justice and has led to some very nasty skirmishes in which everybody continues to lose, including the reputation of the Internet. Cyber Patrol is the latest in the millennia-old tendency for moral righteousness to evolve rapidly into forcing people into unchosen behaviour patterns, i.e. violating the few freedoms people have. Information longs to be free, but that's not a bad or good thing, it just is. Cyber Patrol's encryption of its blocked-websites list means it can hide from the user the fact that it might be blocking sites for other than moral reasons, like anti-competitive behaviour, or suppression of critics - both proven to have been indulged in by Cyber Patrol. Blockers should publish the criteria they use for blocking Web content. Cyber Patrol argued its lists of blocked sites are trade secrets and that cracking code to gain access to them is copyright violation. Oh dear.

Filtering software is not an answer on the Internet: 1) it's never good enough or fast enough to keep up with the Internet - there's always a cracker who can find a workaround; 2) it's too open to self-interested abuse - filtering firms have been accused of blocking sites they simply don't like, including competitors' sites. I support Jesse Berst of ZDNet. He notes that in most communities, zoning laws prohibit adult entertainment close to schools. The Internet could use a similar approach. Create a domain for adult material -- .adu, for example (as in www.smutshop.adu). Any Web content remotely adult in nature -- adult themes, full or partial nudity, obscene language -- must reside in the .adu domain. We can let governments define adult content and penalties for those who don't abide and who serve from the given state's jurisdiction - all coordinated internationally by inter-government agreement.

Parents, teachers or librarians simply set up a filter to block .adu sites. There is then a mechanism for allowing freedom - but freedom with responsibility, placed in the hands of the producer of the material and the user. If a person goes to a .adu site, they must accept the consequences. Parents, or anyone in loco parentis, are then empowered to take responsibility for those under their care.
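The filter this scheme requires is trivially simple compared with today's encrypted secret lists - a single, published, domain-suffix check. A minimal sketch, assuming the hypothetical .adu domain:

```python
from urllib.parse import urlparse

ADULT_SUFFIX = ".adu"  # the hypothetical adult top-level domain proposed above

def is_blocked(url: str) -> bool:
    """Block any URL whose hostname ends in the adult domain.
    No keyword guessing, no secret blocklist: the whole rule is
    one suffix comparison that anyone can inspect and verify."""
    host = urlparse(url).hostname or ""
    return host.endswith(ADULT_SUFFIX)

# is_blocked("http://www.smutshop.adu/page")  -> True
# is_blocked("http://www.example.com/")       -> False
```

Because the rule is this transparent, a parent or librarian can see exactly what is blocked and why - the opposite of the hidden-criteria problem described above.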