discussions to be dominated by the people who had the most time to write long
postings. Worse, an early decision, demanded by one council member, to operate
by consensus meant that even the smallest decision took endless amounts of time.
"Shared values, patience, goodwill, mutual accountability," Bruckman told an April
1996 conference on Virtue and Virtuality, "all these things are what makes
consensus process work. All of these things were one-hundred percent lacking in
this experiment." After about ten months the experiment was dismantled.
There is a significant difference between these communities and real life, and it's
not the lack of bodies, but the verifiable existence of an ultimate, all-powerful god:
the person who owns the machine can pull the plug. This is true whether you're
talking about a bulletin board system in someone's bedroom, a commercial service
answerable to its stockholders, the moderator of an online forum on a larger
service, or, as is the case with these MOOs, an experiment set up by a researcher.
This is why Usenet is in many ways the Net's bedrock: no one can pull the plug,
and, at least on the alt hierarchy, there's no way to kill a newsgroup.
If there were ever to be a government of the Net, by the Net, and for the Net, it
would have to be implemented in technology that worked universally, and it would
have to be by consensus (as even one system administrator's refusal to adopt
whatever was agreed upon would undermine any decision that was made). If
Bruckman's MediaMOO, with a relatively homogeneous community of only a couple
of hundred users, found consensus unworkable, the far more diverse, infinitely
larger Net would have little chance.
There are few issues about which there is enough consensus to build on. Child
pornography and junk email are probably the two subjects that attract the most
widespread agreement. Even in those cases, some object to having junk email
regulated or deleted for them, while others believe that removing the newsgroups to
which child pornography is occasionally posted is a bad idea. The material, the
argument goes, will simply find its way into some other, less obviously named area,
where accidental contact with it, perhaps by a child, is far more likely; it will also go
underground, raising the same kinds of policing problems that restricting
cryptography is supposed to prevent.
This is one reason everyone has grumpily jumped on the ratings and filtering
bandwagon. Using blocking software follows the Net's existing structure and its
roots in decentralized user choice, while ratings similarly distribute responsibility to
the Net's millions of users. These systems will work, if by "work" you mean "give
politicians something to point to that appears to show they've Done Something
about the Net." They will not work, if what you expect is a cyberworld in which no
one will ever see something they find offensive or distressing or a Net which will
never be used for anything illegal. But neither will any other system, not because
Netizens are uncontrollable but because the nature of a diverse world is to offend
some of the people all of the time and all of the people some of the time. We see it
every day in the real world.
Ratings have other problems, which few are talking about yet--Net users because
they tend to figure ratings and filtering are the nearest they're going to get to a way
out of all these regulatory threats, and politicians because they're not about to say
there's nothing they can do. Ratings can be used to advertise, as well as block,
salacious material--just look at all those "XXX" rated movies (there is no XXX
rating; it just sounds more impressive) advertised in adult video
catalogues. (The "Operation Clambake" home page displays yellow "Attacked by
COS!" banners next to some of its links--a similar badge of pride.) In
addition, if you can tell a software package to block all ratings above a certain level,
you can probably tell the same software (or the inverse software that will doubtless
be written) to go out and find only ratings above a certain level. Software is like
that: it can be changed and emulated. If you're a teenager and someone tells you
there are sex-10 rated sites out there that your school software won't show you, do
you say, oh, yes, the grown-ups know best, or do you try to figure out a way to get
a look at what you're not allowed to see?
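The inversion described above takes almost no effort in code. A minimal sketch, using invented site names and a hypothetical 0-10 rating scale, shows that a "blocking" filter and a "seeking" filter differ by a single flipped comparison:

```python
# Hypothetical ratings on a 0-10 scale; site names are invented for illustration.
sites = {"homework-help.example": 0, "chat.example": 3, "adult.example": 9}

def block(sites, threshold):
    """Filtering mode: keep only sites rated at or below the threshold."""
    return {name for name, rating in sites.items() if rating <= threshold}

def seek(sites, threshold):
    """The inverse: return only sites rated above the threshold."""
    return {name for name, rating in sites.items() if rating > threshold}
```

The same rating metadata that lets a parent's software hide a site lets the inverse program find it; the labels themselves are neutral, and only the comparison decides which way they cut.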
For would-be regulators, focusing on the bogeymen--terrorists, drug dealers, spies,
Copyright © 1997-99 NYU Press. All rights reserved.
Reproduction in whole or in part in any form or medium without written permission of New York University Press is prohibited.