/ 29 September 2000

Censorware stops more than porn

The idea that software could ever be sophisticated enough to identify pornographic images is sheer fantasy, writes Avedon Carol

Moralists and Net-nanny software hucksters have put a lot of effort into convincing the public that our kids are unsafe on the Internet and that we need to put strict controls on what people can see. Let’s be clear: this is about censorship. These people generally don’t like sexual material even for adults, and they exploit our fears for children to make us think that the rights and even needs of adults must be set aside in order to protect the kiddies. They certainly don’t want us to give any serious thought to whether the material in question is really harmful to young people, and they hope they can make us feel it isn’t any good for adults, either. They want to find some way to block the things they don’t like, if not on the Net as a whole, then at least in public libraries and on school online systems – even at universities.

And Net-nanny software is a truly fine example of how censorship really works. To begin with, it provides a Trojan horse for censoring other materials that we didn’t have in mind. You might think you are protecting your kids from hardcore pornography, but most of the censorware out there is designed by people with an extensive political agenda, and they will be stopping far more than you imagined. Bennett Haselton, founder of the teen anti-censorship group Peacefire, has found quite a few strange inclusions in the lists of things the software designers don’t want kids or academics to look at. They include the National Organisation for Women, the American Civil Liberties Union, Feminists against Censorship and, unsurprisingly, the Peacefire site itself. This is particularly impressive when you consider that civil liberties and censorship issues – and even pornography – have become major academic subjects across disciplines, studied even by fairly young teens.

Feminists against Censorship routinely gets letters from students asking for information on all of these subjects, for civics classes, media studies, sociology and psychology courses, among others. The censors, of course, want us not to be able to talk about sex. They’d prefer it if teenagers couldn’t look things up or ask people about their fears and feelings about sex. They pretend that, without the Internet, kids won’t absorb all sorts of misinformation anyway. Indeed, they would be happier if adults went on in ignorance. They even invent phony science to convince us that letting people look at sexual images will turn us all into rapists. They have yet to acknowledge that rape existed before the camera and the printing press were invented.

Almost any blocking system is going to stop things you didn’t have in mind. Some systems block any words perceived as inappropriately sexual, so you find that you can’t sign on if you come from Scunthorpe, you can’t talk about breast cancer, and recently a woman discovered she couldn’t register with a system because her name was Sherril Babcock. After they told her she had to change her name, she signed on successfully as Sherril Babpenis. On the other hand, a systems administrator of my acquaintance is still trying to convince his employers that they really don’t need software that stops him from accessing the websites of some of their suppliers, such as Hewlett-Packard (HP). We have no idea why HP is blocked, since the only thing the software is supposed to stop is pornography.

Which of course leads us to the next question: does this stuff at least do what it’s supposed to do? Well, no, of course not. Most pornographic images aren’t labelled “pornography”, so you can’t stop pictures merely by blocking words. How many pin-ups have all the parts labelled? To block all porn, you would have to look at every single page, and every single image, and put it in the database. There are millions of images and articles on the Net, and no one could possibly look at all of them.
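The failure mode described above – crude substring matching against a blocklist – can be sketched in a few lines of code. This is a hypothetical illustration, not the actual logic of any named product (real censorware blocklists are proprietary), but it shows exactly why “Scunthorpe” and “breast cancer” get caught:

```python
# A minimal sketch of a naive word-blocking filter of the kind
# described above. The blocklist here is illustrative only.
BLOCKED_WORDS = ["sex", "porn", "cunt", "breast"]

def is_blocked(text: str) -> bool:
    """Reject any text containing a blocked word as a substring."""
    lowered = text.lower()
    return any(word in lowered for word in BLOCKED_WORDS)

# False positives: innocent text is rejected because a blocked word
# happens to occur inside a longer, harmless word or phrase.
print(is_blocked("I live in Scunthorpe"))          # blocked: contains "cunt"
print(is_blocked("breast cancer support group"))   # blocked: contains "breast"
print(is_blocked("gardening tips"))                # allowed
```

And the converse failure follows just as directly: a pornographic image file with an innocuous name contains none of the blocked strings, so a word filter passes it straight through.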

The idea that software could ever be sophisticated enough to be able to identify pornographic images is sheer fantasy, since there are no parameters it could meet that wouldn’t also block most other human images, at the very least.

All of which might be terrifying if it wasn’t for one simple, under-reported fact: no one has ever been able to demonstrate that anyone, of any age, is harmed by seeing pornography. What is pretty clearly demonstrated by sex crime data and clinical evidence is that sexual repression can do far more harm than good. If we are constantly telling kids that sexual images are bad, we are giving them some pretty terrible messages about sex.

Should we be worrying about what our kids see on the Net? I think the answer is: “Not very much.” Keeping lines of communication open between you and your children is a lot more useful than trying to close their lines of communication with the world. And don’t let them get their hands on your credit card number.

Avedon Carol is a founding member of Feminists against Censorship and author of Nudes, Prudes and Attitudes: Pornography and Censorship. See www.fiawol.demon.co.uk/FAC