The first clue came from the bookstore at the mall. In the computer section, books about the Internet are taking over shelf space like a virus. There is even a package from Sams Publishing: "Teach Yourself the Internet" and "Your Internet Consultant" are bundled together. Shrink-wrapped to their spine is a tube, about the same diameter as one from a roll of toilet paper. Inside this tube is a full-color "Netsurfer" T-shirt.
And then there's Al Gore, who's telling us that the National Information Infrastructure will "educate, promote democracy, and save lives." Alan Kessler, head of 3Com Corporation, claims that the infohighway will "erase cultural boundaries and move continents and people closer together." Librarians write essays in journals about "wallet-sized libraries" and "libraries without walls." And an ad from MCI Communications in "Business Week" notes soberly that "the space-time continuum is being challenged. The notion of communication is changed forever. All the information in the universe will soon be accessible to everyone at every moment."
And on the eighth day He said, "Let there be an Information Superhighway." All His creatures were digitally empowered, and He saw that it was Good. The Word is out: in the beginning was the One -- and then the Zero. Use a modem if you want your prayers answered.
Entertainment is a top American export, but these folks aren't joking. They're busy hyping information technology, a field in which America leads the world. By using our latest technology to deliver our latest entertainment, we get the superhighway -- potentially the fattest cash cow since the last hype, which was the arms race and Cold War.
It's going to be a one-way "entertainment superhighway" leading straight out of Hollywood, with a return dirt path from the home just large enough to select 500 channels, buy 5000 products, register your opinions on plot turns in "Days of our Lives," and pay your bill. But it's hyped as the "information superhighway" in order to slip it by the unsuspecting. Get your T-shirt today!
The superhighway is a projection of potential future connectivity, endorsed by government and big business alike. It envisions a time when every home and office is interconnected with high-capacity conduits such as fiber optic cable. This will be expensive, so in the meantime we have to squeak by with telephone lines, and with the coaxial cable that's used by cable TV companies. These are already wired to many homes.
The ideal highway may never exist, but in the meantime the Internet (or just the "Net") is embraced as a model of things to come. This Net is a unique global network that connects other regional and local networks, and can therefore boast of 20 million users. It's easier to envision the Net if you've had some exposure to a local area network (LAN) in an office building. On a LAN, one or more "server" computers can be accessed by "clients" (personal computers or PCs) in the building, through common access to software and files on the server's hard disk. Each PC behaves as if the server's disk, with all of its resources, were an extra hard disk inside the PC itself.
In many offices, LANs function merely as a method of sharing a laser printer with all of the PCs in the department, which isn't very impressive. A better use for a LAN would be to zap e-mail around the office, or allow password access to a centralized database. In this case the LAN manager determines the level of access for each user -- some will be able to update the data, while most can only search the data.
Now imagine that 25,000 LAN managers around the world, including universities, governments, and corporations, decided to interconnect all of their servers using dedicated high-speed data lines leased from telephone companies. That essentially describes the Net. The connections are made possible by a standard protocol called TCP/IP that defines how packets of data are sent and received. Now e-mail has become a global affair.
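The client-server exchange described above can be sketched in a few lines of Python. This is only an illustration: everything runs on one machine over the loopback address, the port is chosen by the operating system, and the message is invented, whereas real Net traffic is broken into packets and routed across many intermediate machines by TCP/IP.

```python
import socket
import threading

# A toy client/server pair, the model behind both office LANs and the Net.
def run_server(sock):
    conn, _ = sock.accept()          # wait for one client
    with conn:
        data = conn.recv(1024)       # TCP/IP delivers the bytes in order
        conn.sendall(b"echo: " + data)

server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("127.0.0.1", 0))        # port 0: let the OS pick a free port
server.listen(1)
port = server.getsockname()[1]

t = threading.Thread(target=run_server, args=(server,))
t.start()

client = socket.create_connection(("127.0.0.1", port))
client.sendall(b"hello, net")
reply = client.recv(1024)
client.close()
t.join()
server.close()

print(reply.decode())   # echo: hello, net
```

The same socket interface, scaled up across leased lines and 25,000 interconnected networks, is what makes global e-mail possible.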
Mitch Kapor once described the Net as "a library where all of the books are dumped on the floor in no particular order." Everyone agrees that the Net is difficult to use. Over the years, many specialized servers have evolved on the Net, particularly in university settings where the Net has been popular since the late 1980s. Some of these servers offer search software that interfaces between users and Net resources, so that files of interest can be identified and downloaded more easily. Other servers make databases available -- many university library card catalogs, as well as the catalog at the Library of Congress, can be accessed on the Net. Still others maintain lists of users who belong to "newsgroups," and allow instant posting of e-mail to everyone else in the same special-interest group. Far-flung users may also engage in real-time discussions.
Once you have access to an Internet server, all the remaining connections across the Net are handled free of charge to the individual user. This is because the Net was developed first by the government and later by the universities, and the "backbone" -- the major high-speed data links -- is subsidized by the National Science Foundation (NSF). Universities can connect on the basis of a flat fee rather than metered usage, and they are also required to provide access to all of their students and faculty.
The Net has its roots in the Cold War. In 1964 Rand Corporation proposed a post-Doomsday network, with switching that would enable packets of data to be easily rerouted. In 1969, the Defense Department's Advanced Research Projects Agency created ARPAnet. TCP/IP started in 1980, and in 1983 ARPAnet split into civilian and military components. NSF launched their "backbone" in 1986 and by 1988 had a "backbone" with a bandwidth of 1.5 million bps (bits per second). The National Research Council recommended opening the network to the nonresearch community in 1987. In 1992 NSFnet reached a capacity of 45 million bps.
As a result of its history, Internet lacks the capability of time-tracking its users. For those who can get free local access it has basically been a free ride. NSF's "appropriate use policy" kept the commercial interests off of their "backbone." But these corporate sharks know a market when they see one. After swimming around the university Net surfers, looking for weak spots, they began building their own "backbone." NSF is now in the process of bowing out in favor of private industry, and "appropriate use" is increasingly a dead letter. The free ride will probably become a series of metered toll gates -- which is how commercial services usually do their billing.
Al Gore and company announced the National Information Infrastructure (NII) in 1993, one aspect of which is the National Research and Education Network (NREN), a billion bps "backbone" that should be completed in 1996. But Gore made it clear that the private sector will develop this infrastructure, and it's also becoming evident that public concerns such as universal access are not high priorities. Despite Gore's rhetoric, it seems clear that the Clinton administration does not have the stomach to defend the public interest against big businesses hungry for new markets. The most we can hope for is a system of tax or other incentives that help reserve a slice of the private sector's bandwidth for public use.
Big business is attracted by Internet's growth curve, even if a good portion of this growth can be attributed to its own hype. In 1993 Internet doubled in size, and three-quarters of that growth came from newly registered commercial networks. In 1991, before NSF began thinking about abandoning their subsidy, the Commercial Internet Exchange (CIX) announced an interconnectivity project that went around the NSFnet with its "appropriate use" restrictions. This began the de-Balkanization of the Internet for business connectivity. Of all the networks registered worldwide, 63 percent now belong to businesses or their research labs. An Internet e-mail address is now a status symbol on yuppie business cards.
The NSF subsidy, which will disappear entirely next year, was becoming less significant anyway: in 1993 it was only $11 million, or about ten percent of the total. NSF is now completing plans to privatize their "backbone." Pacific Bell, Sprint, Ameritech, and Metropolitan Fiber Systems will replace NSFnet by next spring. They will have a five-year franchise to compete with each other to carry Internet traffic. These four new NAPs (network access points) might decide to begin metering the traffic on the Net, rather than charge a flat connect fee as NSF did.
Commercial online services are also rushing to increase Internet access for their subscribers. The government was persuaded to allow commercial e-mail onto the Net in 1989, but now the big three -- CompuServe, America Online, and Prodigy -- are establishing gateways that allow more than just the exchange of e-mail. Each of these three has doubled its subscriber base since 1992, and today they have a combined total of 5 million subscribers. Prodigy, jointly owned by Sears and IBM, has failed to turn a profit in six years. Ads in garish colors clutter most screens. AOL offers better e-mail service than Prodigy, and provides access to 31 magazines and two newspapers. Researchers find that CompuServe is the most useful, primarily because last year it became the only way to access Knowledge Index. An after-hours-and-weekend version of Dialog, KI features about 120 of its more popular databases. At a rate of $24 an hour, two dozen U.S. newspapers can be searched full-text, as well as article abstracts from many magazines.
Along with the Net and the commercial online services, there are also an estimated 60,000 BBSes (bulletin board systems) in the U.S. This number has been doubling every two years. Most of these are hobby affairs, with a system operator who has one telephone line and uses "shareware" BBS software. But others are impressive. EXEC-PC in Wisconsin features 250 telephone lines and handles 4,500 calls a day. Its 50,000 users from 45 countries download 750,000 files monthly, and owners Bob and Tracey Mahoney ring up an annual gross of $2 million. No wonder big business is salivating. If you can channel all of these loose bits flying around into one superhighway, and then insert toll gates, you will rule the world.
But for today's corporation, the Net is not good for much more than business e-mail. Advertising has to be routed around noncommercial users. Many of them cut their teeth on the old standards and still consider ads to be a violation of "Netiquette." When lawyers Laurence Canter and Martha Siegel wanted to advertise their services, they posted an ad by using a program that sent it to thousands of newsgroups, each of which in turn automatically posted it to its individual group users. It was the equivalent of junk fax multiplied by a million. Canter and Siegel received 40,000 replies, many of them "flames," or angry outbursts from offended Net surfers.
A more serious problem for business is the lack of security on the Net. Except for a few services that offer a product and ask for a credit card number on the Net, the lack of encryption and buyer-seller authentication means that the Net cannot presently be used for serious commercial transactions. This situation is expected to change eventually. In fact, a solution is already available. The "RSA public-key algorithm" is ideal for both encryption and authentication, but the patent rights were licensed by the inventors to RSA Data Security, a private firm that is jealously guarding its prerogatives.
The details of how the RSA algorithm works have been widely published. One working version called "Pretty Good Privacy" is used around the world, and can be downloaded from Net servers in Europe. But any major implementations of the RSA algorithm would invite a patent infringement suit. The U.S. government, meanwhile, wants to prosecute the person who slipped Pretty Good Privacy on the Net (and therefore exported it illegally), and has been investigating PGP author Philip Zimmermann. As became clear during the great "Clipper chip" controversy, the feds want to have their cake and eat it too: security for the Net is great, as long as spooks and bureaucrats can violate it at will.
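Since the RSA algorithm itself has been widely published, its mechanics are easy to demonstrate. The sketch below uses the standard textbook-sized primes; it is a toy, not a secure implementation -- real systems such as PGP use primes hundreds of digits long, which is exactly what makes the patent and export fights matter.

```python
# A toy, insecure illustration of the RSA public-key idea.
p, q = 61, 53              # two small primes (real ones are enormous)
n = p * q                  # 3233: the public modulus
phi = (p - 1) * (q - 1)    # 3120
e = 17                     # public exponent, coprime to phi
d = pow(e, -1, phi)        # private exponent: 2753 (Python 3.8+ modular inverse)

message = 65                        # any number smaller than n
ciphertext = pow(message, e, n)     # anyone can encrypt with the public key
plaintext = pow(ciphertext, d, n)   # only the private key holder can decrypt

print(ciphertext, plaintext)   # 2790 65
```

The same key pair works in reverse for authentication: encrypting with the private key produces a signature that anyone can verify with the public key.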
Even Internet e-mail is not secure. Many companies consider e-mail generated on their own systems to be company business, and it is legal for them to monitor the content of such messages. The privacy of most other e-mail has been protected by law since 1986, but in the case of Internet, where e-mail is passed through many transit points, security can hardly be guaranteed in practice. It's easy to sign someone else's name to e-mail, or cut and paste any electronic text and then pass it along. If he's willing to risk getting caught, a "cracker" (illegal hacker) with access to an Internet relay node could write software that detects certain keywords and copies such e-mail into his box. The potential for competitor business intelligence from such a "fly on the wall" is obvious.
Companies that connect their local networks to the Net now install a "firewall" between them and the Net. This is a dedicated computer that examines all data packets entering and leaving a domain, in order to restrict the types of connections that may be made. The idea is to protect the company from intrusion. Perhaps the Pentagon should consider the same thing: this year they reported that intruders have been tapping into their unclassified system through the Internet for seven months, and have stolen, altered, or erased some records. In 1988, a virus on the Net paralyzed thousands of computers and made national headlines. Cracker Robert T. Morris, a grad student at Cornell, said he made an error in his program that caused it to replicate hundreds of times faster than he intended. The judge gave him three years probation, a ten-thousand-dollar fine, and required him to perform 400 hours of community service.
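The firewall's job -- examining each packet against an ordered list of rules before it may cross the company boundary -- can be sketched as follows. The rules, addresses, and port numbers here are invented for illustration; real firewalls inspect much more of each packet.

```python
# A minimal sketch of firewall-style packet filtering.
# First matching rule wins; anything unmatched is denied by default.
RULES = [
    ("allow", "192.168.1.", 25),   # internal hosts may reach the mail port
    ("deny",  "",           23),   # block telnet from anywhere ("" matches all)
    ("allow", "",           80),   # web traffic is permitted
]

def permitted(src_addr, dst_port):
    for action, prefix, port in RULES:
        if src_addr.startswith(prefix) and dst_port == port:
            return action == "allow"
    return False                   # default: deny

print(permitted("192.168.1.7", 25))   # True  (internal mail)
print(permitted("10.0.0.9", 23))      # False (outside telnet attempt)
```

A default-deny policy like this is what the Pentagon's unclassified system evidently lacked.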
At present, the big dogs of the private sector are lurking at the fringes of the Net -- eager to cash in, but temporarily blocked by technical and legal obstacles. On the Net, meanwhile, a 60s-style, good-vibes, libertarian hacker ethic claims that "information wants to be free," and political activism is unnecessary because the Force is with them. Then we also have the public-interest activists in Washington, who worry that the rush down the superhighway by the private sector will inevitably leave the little guy in the dust.
Stretched between these positions is Mitch Kapor, who caters to all three groups. He founded the Electronic Frontier Foundation, a pro-superhighway lobbying group. When he consults with Al Gore's people, he promotes his "Jeffersonian" vision of universal access to a switched system. But he also says that private enterprise should be given a chance to see what it can do. Lately Kapor seems to be worried, but he still insists that the problem is not one of public vs. private:
The problem is that the free-market guys fall apart at exactly the same level as the regulators -- that is, when things get large. My conclusion is that the real division in society is not between public and private, it's between big and little. Because when things get big, they get broken.... I'm becoming less and less optimistic that the private sector will, left to itself, build the kind of infrastructure that's best for the citizens of the country. ("Seven Thinkers in Search of an Information Highway," Technology Review, August/September 1994, pp. 48, 51.)
In any event, the bandwidth will have to be expanded. The connections coming into most homes consist of a pair of unshielded copper wires (the telephone line). Since cable TV began in the early 1970s, an increasing number of homes are wired with coaxial cable (a copper wire surrounded by shielding). Bandwidth is a measure of how much data a given conduit can carry in a given amount of time. Coaxial cable can transmit information at speeds up to 600 times greater than telephone lines, but fiber-optic cable can handle speeds at least 150,000 times greater than copper wire. As speeds increase on copper wires, the energy radiates off the wire, and it behaves like some sort of unintentional antenna. Telephone lines are extremely limited because they lack shielding.
Although modems are getting faster, mainly due to techniques such as on-the-fly compression and error-detection, soon we will approach the upper limit with telephone lines. Essentially modems are suitable for textual material, and for limited graphics. Video (except perhaps for one low-resolution frame a second) and digital music are impractical because they take too long to download at these slow speeds.
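The arithmetic behind that limit is simple. Assuming a fast modem of the day (14,400 bps) and some illustrative file sizes -- roughly 2 KB for a page of text, and perhaps 150 KB for a single compressed low-resolution video frame -- the download times work out as follows:

```python
# Back-of-the-envelope download times at mid-1990s modem speeds.
# File sizes are illustrative guesses, not measurements.
def download_seconds(size_bytes, bps):
    return size_bytes * 8 / bps    # 8 bits per byte

modem = 14_400                # a fast modem of the day, in bits per second
page_of_text = 2_000          # ~2 KB of plain text
one_video_frame = 150_000     # one compressed low-res frame, roughly

print(round(download_seconds(page_of_text, modem), 1))    # 1.1 seconds
print(round(download_seconds(one_video_frame, modem)))    # 83 seconds
```

Over a minute per frame makes even one frame a second wildly out of reach, which is why video and digital music must wait for fatter conduits.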
The cable TV companies with their coaxial cable are being eyed hungrily by big business. But while coaxial cable has much more bandwidth, this network is suitable only for broadcasting. The amplifiers are one-way, and there is no installed switching capability. This will not change without a massive investment in equipment. Certainly any company interested in the superhighway will first go the route of "500 channels" with a limited return channel before they invest in outfitting the cable TV system with switching equipment. And if the day ever arrives when you're willing to spend that kind of money, why not just forget about copper and go all the way with fiber optics, which is so much better?
In other words, there is much infrastructure that needs to be built before the reality of any "information superhighway" begins to approach the hype. Public-sector advocates worry that any new investment will increase the gap between rich and poor. The private sector has been stringing "coax" for more than twenty years now, yet even today the poor sections of many urban areas are still not wired. Now the regional Bell companies are filing applications with the FCC for "video dialtone" networks. "Our analysis reveals a clear pattern," concluded Jeffrey Chester of the Center for Media Education in Washington, DC. "Low income and minority neighborhoods are being systematically underrepresented in these plans."
Once the infrastructure is in place, there will still be a massive content problem. In 1971, the Sloan Commission on Cable Communications noted enthusiastically that "cable technology, in concert with other allied technologies, seems to promise a communications revolution." Twenty years later the content of community-access channels on cable was best depicted by the movie "Wayne's World." Increased bandwidth will most likely lead to a "vaster wasteland," where the signal-to-noise ratios are distressingly low.
The "libraries without walls" scenario, according to which anyone can access any book or periodical electronically, is all hype. Two insoluble problems prevent the digitization of printed material: copyright laws and cost. John Barlow maintains that copyright laws have no place in cyberspace -- they're a violation of hacker ethics -- and that they will soon give way to the cosmic forces that impel his vision. What Barlow ignores is that a huge chunk of our gross national product is directly dependent on copyright protection, and these laws won't be tossed out anytime soon.
At Public Information Research, we sometimes hear from folks who don't understand why they can't hit a key and read full text on their computer. Their apparent misapprehension is that we can throw a book into a chute, and have it come out on a tiny disk, automatically indexed. If we would only follow up on their brilliant suggestion, we could make it much easier for them to access our information. Why can't we get our act together?
Every book we index in NameBase is read cover to cover, and we compile our own name index during this reading. We cannot use the index in the back because that index is protected by copyright (and also because we input country and time-frame information for the names). Even if the index in the back weren't copyrighted, we still couldn't use it. After a thousand names or so, you start getting namesakes, which have to be resolved by referring to the context. Moreover, book indexes don't separate names that occur in interesting or important contexts from casual mentions of celebrities, which are often neither.
Let's assume that the index in the back wasn't copyrighted, and that we could scan it with OCR (optical character recognition) software. Omnifont OCR scanning, when done to an error-free standard, requires human proofing. But let's assume, just for the sake of argument, that scanning could be perfect. After a few thousand names, any cumulative indexing done this way would become hopelessly muddled. Namesakes would be combined, while different forms of the same name would be separated. The index would soon be so full of errors that it would become useless. This situation is not going to change; software cannot compete with skilled wetware. Sometimes scanning is faster, but it's also dumber. "Artificial intelligence," regrettably, is just another hype.
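A tiny sketch shows why mechanical merging goes wrong: identical strings may be different people, while different strings may be the same person. The names and contexts below are invented for illustration.

```python
# Naive cumulative indexing: merge entries purely by the name string.
entries = [
    ("Smith, John", "Acme Corp. board, 1985"),
    ("Smith, John", "station chief, Lagos"),     # a DIFFERENT John Smith
    ("Smith, J.",   "Acme Corp. board, 1985"),   # the SAME man as entry one
]

merged = {}
for name, context in entries:
    merged.setdefault(name, []).append(context)

# String matching yields two "people" -- and both are wrong: the two
# distinct "Smith, John" entries are lumped together (namesakes combined),
# while "Smith, J." is split off (one person separated into two).
print(len(merged))   # 2
```

Resolving such cases requires reading the surrounding context, which is precisely the work that skilled wetware does and software does not.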
At PIR we do our own indexing, which avoids copyright problems and has kept the data better than 99 percent accurate over ten years and 78,000 names. To get the full text, we have a photocopy and fax service. Since we are nonprofit, and since the requests we get for photocopies or faxes are scattered because they follow our indexing (users typically want two pages from one book and one page from another), this falls under the "fair use" provisions of the Copyright Act of 1976. There is no other way that our service can be provided legally. It is so labor-intensive, on top of the nonprofit status which helps make our scattered photocopying legal, that we still have zero competition -- even from other nonprofits.
Once you have full text available online, it can be searched for keyword combinations. Some librarians point out that keyword searching is frequently inferior to the searching that is made possible by other methods -- the Library of Congress cataloging system, for example, which places similar books together on the shelves and facilitates browsing in the library. Leaving this objection aside, generally it is only cost-effective to offer full text online when the book or periodical was created on digital equipment in the first place. That includes almost everything published these days, but what about everything published before the early 1980s?
Printed newspapers and magazines have a short shelf life, and their owners don't mind reselling the digital version, as long as it's already in their computer, to make more money. But publishers of books, fearful of losing sales, are not ready to go online. It's just too easy to zap text around on the Internet in violation of copyright laws and not get caught, or even criticized.
As for material that exists only in print form, the costs of error-free OCR have remained constant: as the software gets cheaper, the labor costs associated with proofing have increased. Another problem with OCR scanning is that there are no formatting standards for presenting the text. Because they don't want to adopt the wrong standard and do the work twice, librarians have shown more interest in "raster" scanning, which creates a bit-mapped image of the page.
Raster scanning requires no human proofing. Like a video camera, the computer stores the image digitally without recognizing anything except black or white on each tiny section of the page. It can be letters or pictures; the software can't detect the difference. The downside of raster scanning is twofold: first, the bit-image file is typically at least 100 times larger than the equivalent text file for that page; second, you can't computer-search the words on the page unless you compile a separate index, or process the bit-image file through OCR software. Obviously, such manual indexing is labor-intensive, while the alternative of OCR processing is much too slow for searching purposes, even if inaccuracies can be tolerated.
Budget considerations have placed digital scanning out of reach for almost all libraries. The cost per page for either method of scanning is roughly three dollars. Microforming, by contrast, is about six cents per page. The Library of Congress has just announced a program to scan perhaps a million images a year into high-resolution, bit-mapped files, and then feed them to the superhighway. Copyright considerations have yet to be resolved, and the initial funding for the project will come from private foundations. Since their collection totals 104 million items, don't fire up your modem just yet. If each item required an average of only 150 images, it would still take more than 15,000 years.
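Both raster-scanning figures are easy to check with round numbers. Assuming an 8.5-by-11-inch page scanned at 300 dots per inch, one bit per pixel, against roughly 2 KB for the same page as plain text, and taking the Library of Congress numbers at face value:

```python
# The arithmetic behind the scanning estimates, using assumed round numbers.
bitmap_bits = 8.5 * 300 * 11 * 300        # one page at 300 dpi, 1 bit/pixel
bitmap_bytes = bitmap_bits / 8            # ~1 MB uncompressed
text_bytes = 2_000                        # the same page as plain text
print(round(bitmap_bytes / text_bytes))   # 526 -- "at least 100 times larger"

items = 104_000_000                       # Library of Congress holdings
images_per_item = 150
rate_per_year = 1_000_000                 # the announced scanning rate
print(items * images_per_item / rate_per_year)   # 15600.0 years
```

Compression shrinks the bitmaps considerably, but even an optimistic factor of ten leaves the storage and the timeline daunting.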
The potential for meaningful content is limited not only by cost and copyright law, but also by the commercial demographics of our spectacular American culture. Today most online content qualifies as entertainment rather than information. The most popular activity on bulletin board systems is downloading adult GIF files (sexy pictures in Graphics Interchange Format). The second most popular is playing video games.
The biggest Internet forum is alt.sex.stories, with half a million users per month. "What more could you ask for?" coos the author of an article about another popular sex forum, alt.sex.bondage ("Wired," June 1994, p. 42). "It's one of the few places on the Net where you can find yourself genuinely surprised and illuminated about human desire each time you log on."
Corporate America need not worry about underestimating the demand for serious content on the entertainment superhighway. A BBS owner in Oregon grossed $3.2 million in 1992 from adult GIFs, and was sued by Playboy for copyright infringement. He settled out of court by paying them $500,000 in cash. Playboy quickly began thinking about setting up its own BBS.
Even e-mail is increasingly recognized as something of a junk medium, and some corporations have restricted its use in the interests of getting more work done. It's too easy to broadcast notices of softball practice to everyone in the building with the touch of a key, or to "copy" your boss on every piece of trivia you generate just to cover your ass. Microsoft founder Bill Gates gets so much junk e-mail from outside the company that he now puts it all through a "bozo filter" -- another program that's designed to banish most of it to an archive where it sits unread.
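A "bozo filter" of the kind described is conceptually simple: mail matching junk patterns is shunted to an archive instead of the inbox. The keywords below are invented examples; Gates's actual filter is surely more elaborate.

```python
# A minimal "bozo filter" sketch: match junk patterns in the subject line.
JUNK_WORDS = {"softball", "make money fast", "fw: fw:"}

def route(subject):
    lowered = subject.lower()
    if any(word in lowered for word in JUNK_WORDS):
        return "archive"       # banished, to sit unread
    return "inbox"

print(route("Softball practice moved to Tuesday"))   # archive
print(route("Q3 budget figures"))                    # inbox
```

The filter doesn't make the junk disappear; it just relocates the problem, which is rather the point.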
Many surfers insist that the content is not as important as the virtual community they experience on the Net. They may have a point. Ham radio operators can now bounce signals off their own satellites, but they're still asking the same old question: "So, how's the weather over there in Kansas?" Something else must be going on here; perhaps it's the need for a supportive community. (This author, KM6E, has been a ham off and on for over 30 years and it's still a mystery to him. Each of the four times I put a new station on the air it took just two contacts before I began wondering why I went to all the trouble. Then I'd pack up my equipment to make room for another bookshelf.) This sense of community will be lost if big business doesn't buy into the switched-network model. If you mention the notion of "community" to them, they'll ask you how much they can charge for it.
Another hint that the lack of content is not easily blamed on the mass media comes from a perusal of the print media offered by Net enthusiasts. Anyone can spend several hours with "Wired" magazine (one of the recent success stories in magazine publishing) or with "Mondo 2000," read them from cover to cover, and then put them aside to write one page about what they learned. Nothing comes out. These magazines dislike content, but they love graphics.
This culture is losing its respect for language. Arthur Naiman, one of the authors of "The Macintosh Bible" (1991), complained that subsequent generations of Mac keyboards kept adding more keys. He doesn't care for function keys, escape keys, or control keys ("That's what the MOUSE is for!"), and prefers the tiny keyboard that came with the original Mac. Naiman would be content to point with his mouse at icons on cavemen's walls. (Before he starts on another book, perhaps he can be persuaded to reduce his keyboard further -- by plucking off a couple of vowels!)
One very guilty party is "Wired" editor Kevin Kelly, who has just written a book titled "Out of Control: The New Biology of Machines." Anson Rabinbach, a reviewer for "The Times Literary Supplement" (September 9, 1994), criticizes Kelly's "biotechnotopianism" for its incipient religiosity. Kelly also edited "Signal: A Whole Earth Catalog" (1988), another in a series of their "tools for access" catalogs ("Mondo 2000" says, "Why be pretentious? It's a shopping mall.") A plug for Apple's HyperCard software -- a hypertext database program more appropriate for entertainment than for accessing information -- went like this:
HyperCard is uniquely suited for activist causes. It goes without saying that its great ease of use and flexibility favors the underdog. Activist groups have often relied on people power and maneuverability to counteract the brute economic and political force of various Powers-That-Be; HyperCard can enhance both of these advantages. ("Signal," p. 164)
Not surprisingly, Leary's notions of a coming technological millennium sound a little blissed-out; still, given all the acid that went down the hatch in the old days, Leary's at least got an excuse. But what's Al Gore's excuse? What's Business Week's excuse? What's yours?
The spectacle is the existing order's uninterrupted discourse about itself, its laudatory monologue. It is the self-portrait of power in the epoch of its management of the conditions of existence. 
The modern spectacle expresses what society can do, but in this expression the permitted is absolutely opposed to the possible. The spectacle is the preservation of unconsciousness. It is its own product, and it has made its own rules: it is a pseudo-sacred entity. 
The technology is based on isolation, and the technical process isolates in turn. From the automobile to television, all the goods selected by the spectacular system are also its weapons for a constant reinforcement of the conditions of isolation of "lonely crowds." The spectacle constantly rediscovers its own assumptions more concretely. 
Waves of enthusiasm for a given product, supported and spread by all the media of communication, are thus propagated with lightning speed. 
Capitalist production has unified space, which is no longer bounded by external societies. This unification is at the same time an extensive and intensive process of banalization. The accumulation of commodities produced in mass for the abstract space of the market, which had to break down all regional and legal barriers and all the corporative restrictions of the Middle Ages that preserved the quality of craft production, also had to destroy the autonomy and quality of places. This power of homogenization is the heavy artillery which brought down all Chinese walls. 
In the spectacle, which is the image of the ruling economy, the goal is nothing, development everything. The spectacle aims at nothing other than itself. 
-- A selection of quotations from Guy Debord, "Society of the Spectacle," first published in Paris in 1967. Debord was editor of the journal "Internationale Situationniste" from 1958 to 1969. This translation is from an edition published in 1983 by Black & Red in Detroit.