The Virtual Book Revisited
An addendum to predictions that appeared in The Age of Intelligent Machines, written for "The Futurecast," a monthly column in the Library Journal.
Originally published February 1993. Published on KurzweilAI.net August 6, 2001.
One of the advantages of being in the futurism business is that by the time your readers are able to find fault with your forecasts, it is too late for them to ask for their money back. It is like the sorcerer who predicted he would live forever: he was never proven wrong, at least not during his lifetime.
Nonetheless, I like to monitor the progress of my predictions. I take satisfaction when projections that seemed so startling when first proposed become progressively less so as the world accommodates ever-accelerating change.
In my 1990 book, The Age of Intelligent Machines (LJ 3/1/90), I predicted that the next several years would bring convincing evidence that computers were capable of passing "narrow" versions of the Turing Test, and that by the end of the decade it would be well established that computers could achieve human or higher levels of performance in a wide variety of intelligent tasks and would be relied upon to diagnose disease and make financial judgments. The Turing Test, first proposed by Alan Turing in 1950, has survived as our quintessential measure of intelligence in a machine. It involves the ability of a computer to imitate human performance, particularly in written dialog. In the test, human judges communicate with the computer and with one or more human "foils" over terminal lines. The judges cannot see whom (or what) they are communicating with, so as not to prejudice them against entrants without human bodies. Both the computer and the human foils try to convince the judges that they are human. If the judges are unable to make the determination reliably, the computer is deemed to have passed.
It has become accepted in the worlds of computers and philosophy that a machine or entity passing the Turing Test would be regarded as intelligent. It should be noted, however, that the converse of this statement does not necessarily hold. Some observers ascribe a high level of intelligence to certain species of animals such as dolphins and whales, but these animals are obviously in no position to pass the Turing Test (they have no fingers, for one thing).
A "narrow" Turing Test limits the domain of discussion to a specific topic. It is considerably easier for a computer to pass a narrow Turing Test than the unrestricted version envisioned by Turing. In late 1991, the Computer Museum in Boston organized its first "Turing Test" contest, and a program called "PC Therapist 3," developed by Thinking Software, was able to successfully fool the judges in the domain of "whimsical conversation."
I have been asked to be on the prize committee in this year's contest, which will be held in November. I have asked the committee to eliminate the domain of "whimsy," as I feel that it is too easy to simulate. It reminds me of a computer program written a number of years ago by Kenneth Colby, a psychiatrist-programmer. Called Parry, it successfully fooled human psychiatrists into thinking that Parry was a real human paranoid psychotic.
Colby's demonstration of the Turing Test was criticized by Joseph Weizenbaum, a noted artificial intelligence expert and skeptic, who pointed out that his own electric typewriter was able to simulate another form of human psychosis, that of infantile autism. He types questions, and it just sits there and hums. Transcripts of his conversations with his electric typewriter cannot be distinguished by human experts from transcripts of conversations with real autistic patients.
A computer excelling at whimsical conversation is demonstrating not so much its own intelligence as that of the judges, who are able to make mental connections between their questions and almost any whimsical reply.

Computers catching up

On other fronts, however, the progress toward machine intelligence has been less confounded. Computers are beginning to be used to evaluate diagnostic medical tests. Most electrocardiograms today are diagnosed by a computer built into the EKG machine. These diagnoses are reviewed by the physician, and, while not perfect, they are correct most of the time. Large financial funds routinely use computers to determine buy and sell decisions based on expert system models of financial markets. While these systems are not perfect either, the shortcomings of human analysts in predicting financial markets have been well established.
I also predicted that the world chess champion would be a computer by the end of the decade. At that time (1990), the leading computer player, called HiTech, was rated at 2400, while the human world champion was rated at 2800. Since that time, a more advanced chess machine from Carnegie Mellon University has been rated at 2600. It continues to appear that the human chess champion should be regarded as a decidedly endangered species.
Let's take a look at some predictions I've made in the pages of LJ. Only three months ago (LJ, November 15, 1992, p. 53-54), I noted that the Ricoh RN-100 could simulate 128 million neural connections per second, which was 150 million times slower than the human brain. I predicted that computer science would bridge these eight orders of magnitude and provide machines that could match the processing power of the human brain by the year 2019 (I should have said 2020 so as not to imply such precision in these matters). Since that time, Toshiba has introduced a new neural computer that can process two billion neural connections per second, a roughly sixteenfold improvement. We have, therefore, progressed through one of the eight orders of magnitude needed in short order. The other seven will take somewhat longer, but I am considering moving up my estimate.

Paper cutback

Of particular interest to LJ readers were predictions on the future of books. I projected that the software category of virtual books would push the paper-based technology we now use into obsolescence by the end of this decade. In the life cycle of a technology, obsolescence does not mean immediate displacement; most of the technologies that we use are already obsolete. The stage between the onset of obsolescence and antiquity generally makes up five to ten percent of a technology's overall life cycle. Books with pages (although not of paper) go back over 2000 years, and printed books go back over 500 years, so the technology of printed books that we all make use of will enjoy a protracted period of decline.
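To put a rough number on that "protracted period of decline" (a back-of-the-envelope estimate of my own, taking the five-to-ten-percent rule and the 500-year history of printed books at face value):

$$0.05 \times 500 \text{ years} = 25 \text{ years} \qquad\text{to}\qquad 0.10 \times 500 \text{ years} = 50 \text{ years}$$

In other words, even once printed books are obsolete, that rule of thumb suggests a decline phase lasting several decades before they pass into antiquity.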
It has now been one year since my column "The End of Books" appeared (LJ, February 15, 1992, p. 140-141). While many Futurecast readers told me they found this prediction surprising at the time, it is probably less startling today. The events of a year can make a big difference in our perspectives.
Over the past year, it has become accepted that computerized books are better than the paper variety in certain categories. For reference works, we are interested in finding information quickly; the ability of a computer to conduct rapid in-context word searches is unmatched by paper technology. With the success of Grolier's CD-ROM-based encyclopedia and other online and CD-ROM-based reference works, the benefits of computerized books in library-based research have become clear. Whenever there is a choice, I opt for conducting factual research electronically. As reported last month, the difference in speed can be 50:1.

Exploring new technologies

Another category in which the electronic book has already demonstrated its superiority is an area we might call books of exploration, such as National Geographic might publish. Indeed, one of the year's best examples is the book From Alice to Ocean, produced by Rick Smolan, a National Geographic photographer.
This CD-ROM-based book (best if played on a Macintosh computer) recounts the story of a woman, her dog and four camels as they trek 1700 miles across Australia's outback over a six-month period. This is not a story we necessarily want to experience in chronological order, and the organization of the "book" allows for a more natural exploration.
The reader can view maps of the journey and then investigate corresponding portions by simply clicking on map locations. Videos describing parts of the journey are available by clicking at appropriate places. Companion pieces on local culture, wildlife, and other relevant topics are available at the click of the mouse. By exploring one of the many cross references available at the push of a button, you can even obtain videos of how Smolan took the pictures in the book. The color photographs reproduce well on a standard Macintosh screen, although they still are not of coffee-table book quality.
The computer as book still falls short when it comes to your standard sequential book, and thus this technology remains a false pretender. Books are still not obsolete, but this victory will be short-lived.

The eyes have it

Let's review some of the reasons cited a year ago for why computers were not yet ready to push paper books into obsolescence. One of the immediate complaints of users of electronic books is flicker. As noted in the January 1992 Futurecast (p. 80, 82), a flickering screen is perceived by our visual system as motion and thus is tiring and slows down reading speeds. In the past year, the flicker-free Apple PowerBook has become the biggest-selling notebook computer. This seems to have started a trend, and many new notebook PCs are now flicker-free. Another issue is contrast. Notebook computers have offered only half of the 120:1 contrast ratio of a good paper book. However, recent notebook computers are getting very close, with contrast ratios of over 100:1.
The next problem is color. I predicted the first wave of color notebook computers would hit during the ensuing year, and I am pleased to say that, unlike a year ago, high-quality color notebooks are now quite normal.
Resolution is another key issue. A good-quality PC screen can match a poor-quality paperback book but does not compare with most books. Screens now available on computer workstations, with 250 dpi (dots per inch), can rival paper and ink. A workstation is a more expensive category of desktop computer reserved for engineers and other professionals, but the lag time for workstation features to migrate to the world of PCs is usually only about one to two years, with another year or two of lag to reach the notebook computer. Thus, we can look to workstations as a way of predicting the next generation of more portable computers.

Curling up with a good PC

Finally, we come to size, weight, and battery life. Curling up with your notebook computer is a little like taking a large, single-volume encyclopedia into bed with you. At 12" and 6 lbs., it is not a comfortable alternative to the bedside novel. It is also annoying when the two-to-three-hour battery charge runs down. But the past year has introduced a new category of computer: the subnotebook. With a size of 8" x 10", a weight of under 2 lbs., and a much longer battery life, the right form factor has now been introduced. These subnotebooks are about two years behind notebooks in terms of their technical capabilities. If it will take two to three years for high-resolution color screens to reach the notebook, then we can expect this capability in the friendlier subnotebook format well before the decade is out.
This past year also saw the introduction of computers with built-in cellular communication: by the time we want to read the latest best seller on our subnotebook or palmtop computer, it will be possible to have it transmitted from the library without leaving the bedroom. The next decade will greatly expand the number of things that we can do while still sitting in bed.
While the 500-year-old technology of the printed book was not sent into obsolescence in the past year, there was nonetheless very significant progress in each of the underlying capabilities necessary to achieve this objective. With Moore's Law (which doubles the capabilities of computer technology in every dimension every 18 months for the same unit cost) still driving computer technology, the day of fully viable virtual books is not far off.
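For a sense of what that doubling rate means in round numbers (my own arithmetic, using the 18-month figure as stated): a decade contains about 6.7 doubling periods, so

$$2^{10/1.5} \approx 2^{6.7} \approx 100,$$

or roughly a hundredfold improvement per decade at constant cost. That is the kind of headroom the remaining issues of resolution, weight, and battery life can draw on.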
Reprinted with permission from Library Journal, February, 1993. Copyright © 1993, Reed Elsevier, USA