
    The Central Metaphor of Everything?
by   Jaron Lanier

Jaron Lanier's Edge article takes a skeptical look at Moore's Law and its application to trends outside of computer hardware. Will computers become smarter than us in twenty years? Is the computational metaphor actually impeding progress?


Originally published December 4, 2001 at Edge. Published on KurzweilAI.net December 4, 2001.

One of the striking things about being a computer scientist in this age is that all sorts of other people are happy to tell us that what we do is the central metaphor of everything, which is very ego gratifying. We hear from various quarters that our work can serve as the best understanding - if not in the present, then any minute now because of Moore's Law - of everything from biology to the economy to aesthetics, child-rearing, sex, you name it. I have found myself being critical of what I view as this overuse of the computational metaphor. My initial motivation was that I thought there was naive and poorly constructed philosophy at work. It's as if these people had never read philosophy at all and had no sense of epistemological or other problems.

Then I became concerned for a different reason, one that was pragmatic and immediate: I became convinced that the overuse of the computational metaphor was actually harming the quality of present-day computer system design. One example is the belief that people and computers are similar - the artificial intelligence mindset - which has a tendency to create systems that are naively and overly automated. Think of the Microsoft word processor that attempts to retype what you've just typed: the notion is that the agenda of making computers into people is so important that jumping the gun has to be for the greater good, even if it makes the current software stupid.
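A hypothetical sketch (not Microsoft's actual code; every name below is invented for illustration) makes the design point concrete: the very same correction dictionary can be wired up as an aggressive auto-rewrite or as a suggestion the user is free to decline.

# Hypothetical illustration, not Microsoft's actual code: two ways a word
# processor might treat a word it thinks is wrong. The first silently rewrites
# the user's text (the over-automated design criticized above); the second
# keeps the user's text and merely offers a suggestion.

CORRECTIONS = {"teh": "the", "recieve": "receive"}   # toy dictionary

def autocorrect_aggressively(word):
    # Silently replace the word with the "fix", whether the user wants it or not.
    return CORRECTIONS.get(word.lower(), word)

def suggest_correction(word):
    # Leave the word alone and return an optional suggestion for the user to accept.
    return word, CORRECTIONS.get(word.lower())

print(autocorrect_aggressively("teh"))   # -> 'the'  (the user's typing is overridden)
print(suggest_correction("teh"))         # -> ('teh', 'the')  (the user decides)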

There's a third reason to be suspicious of the overuse of computer metaphors: it leads us, by reflection, to have an overly simplistic view of computers. The particular simplification I'm concerned with is imagining that Moore's Law applies to software as well as hardware - or, more precisely, that it applies to things that have complicated interfaces with their surroundings as well as to things that have simple interfaces, which I think is the better distinction to draw.

Moore's Law is truly an overwhelming phenomenon; it represents the greatest triumph of technology ever - the fact that we have kept on the track that was predicted all these years, and that we now have machines a million times better than they were at the dawn of our work, just half a century ago. And yet during that same period our software has really not kept pace. In fact, not only could you argue that software has not improved at the same rate as hardware; you could even argue that it has often been in retrograde. It seems to me that our software architectures have not even been able to maintain their initial functionality as they've scaled with hardware, so that in effect we've had worse and worse software. Most people who use personal computers can experience that effect directly, and it's true in most situations.
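A quick back-of-the-envelope check makes the scale concrete. Assuming hardware capability doubles every 18 to 24 months (the periods are assumptions, not figures from the text), half a century of Moore's Law comfortably exceeds a factor of a million:

import math

# Doubling every 18 or 24 months (assumed periods) compounded over 50 years:
for months_per_doubling in (18, 24):
    doublings = 50 * 12 / months_per_doubling
    print(f"every {months_per_doubling} months: about 2^{doublings:.0f} = {2 ** doublings:.1e}x")

# A factor of a million is roughly 2^20, i.e. about twenty doublings.
print(f"1,000,000x = 2^{math.log2(1_000_000):.1f}")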

But I want to emphasize that the real distinction I see is between systems with simple interfaces to their surroundings and systems with complex interfaces. If you want a fancy user interface and you scale the system up, it just gets awful. Windows doesn't scale.

One question to ask is: why does software suck so badly? There are a number of answers to that. The first thing I would say is that I have absolutely no doubt that David Gelernter's framework of streams is fundamentally and overwhelmingly superior to the basis on which our current software is designed. The next question is: is that enough to cause it to come about? It really becomes a competition between good taste and good judgment on the one hand, and legacy and corruption on the other - which are effectively two words for the same thing. What happens with software systems is that the legacy effects end up being the overwhelming determinants of what can happen next as the systems scale.

For instance, there is the idea of the computer file, which was debated up until the early 80s. There was an active contingent that thought the file wasn't a good idea and that we should instead have a massive distributed database with a micro-structure of some sort. The first (unreleased) version of the Macintosh did not have files. But Unix jumped the fence from the academic to the business world and it had files, and the Macintosh ultimately came out with files, and the Microsoft world had files, and basically everything has files. At this point, when we teach undergraduates computer science, we do not talk about the file as an invention; we speak of it as if it were a photon, because in effect it is more likely to still be around in 50 years than the photon is.

I can imagine physicists coming up with reasons not to believe in photons any more, but I cannot imagine any way we could tell you not to believe in files. We are stuck with the damn things. That legacy effect is truly astonishing - the non-linearity of the cost of undoing decisions once they have been made. The degree to which the arrow of time is amplified, in all its brutality, in software development is extraordinary, and perhaps one of the things that really distinguishes software from other phenomena.
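For contrast with the entrenched file, here is a minimal sketch of the stream idea in the spirit of Gelernter's proposal mentioned above - not his actual design, and every name in it is illustrative: one time-ordered, append-only collection, queried by filters rather than navigated as a folder hierarchy.

# A minimal, hypothetical sketch of the stream idea (not Gelernter's actual
# code): every document goes into one time-ordered collection, and
# "substreams" are just filters over it -- no folders, no file names required.

import time
from dataclasses import dataclass, field

@dataclass
class Item:
    content: str
    tags: set = field(default_factory=set)
    timestamp: float = field(default_factory=time.time)

class Stream:
    def __init__(self):
        self._items = []                      # append-only, time-ordered

    def add(self, content, *tags):
        self._items.append(Item(content, set(tags)))

    def substream(self, predicate):
        # A "substream" is a query, not a place: filter rather than navigate.
        return [item for item in self._items if predicate(item)]

life = Stream()
life.add("Budget draft", "work", "finance")
life.add("Photo from the beach", "personal")
life.add("Meeting notes", "work")

work = life.substream(lambda item: "work" in item.tags)
print([item.content for item in work])        # ['Budget draft', 'Meeting notes']

The point of the contrast is that the stream is a query space rather than a place: nothing about it requires the user to decide up front where a document "lives".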

Back to the physics for a second. One of the most remarkable and startling insights in 20th-century thought was Claude Shannon's connection of information and thermodynamics. For all of these years working with computers I've been looking at these things and wondering, "Are these bits the same bits Shannon was talking about, or is there something different?" I still don't know the answer, but I'd like to share my recent thoughts because I think this all ties together. If you wish to treat the world as computational - if you wish to say that the pair of sunglasses I am wearing is a computer that has sunglass input and output - then you would have to say that not all of the bits that are potentially measurable are in practice having an effect. Most of them are lost in statistical effects, and the situation has to be rather special for a particular bit to matter.

And in fact some bits really do matter. If somebody says "I do" in the right context, that means a lot, whereas a similar number of bits coming in another context might mean much less. Various measurable bits in the universe have vastly different potentials to have a causal impact. If you could somehow delineate all the bits, you would probably see some dramatic power law, where a small number of bits had tremendously greater potential for having an effect and a vast number had very little. It's the bits with the potential for great effect that computer scientists are probably concerned with, and Shannon's framework, as far as it goes, probably doesn't differentiate between them.
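A toy numerical illustration of that power-law speculation (the distribution and its exponent are arbitrary assumptions): if the causal potential of bits is heavy-tailed, a tiny fraction of bits carries most of the total potential to have an effect.

# Toy illustration only: sample heavy-tailed "causal potentials" for 100,000
# bits and measure how much of the total the top 1% of bits accounts for.

import random

random.seed(0)
potentials = sorted((random.paretovariate(1.2) for _ in range(100_000)), reverse=True)

total = sum(potentials)
top_one_percent = sum(potentials[: len(potentials) // 100])
print(f"top 1% of bits hold {100 * top_one_percent / total:.0f}% of the total potential")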

Then the question is: how do we distinguish between the bits? What differentiates one from the other; how can we talk about them? One speculation is that legacy effects have something to do with it. If you have a system with a vast configuration space, as our world is, and you have some process - perhaps an evolutionary process - searching through possible configurations, then rather than a meandering random walk, perhaps what we see in nature is a series of stair steps, where legacies are created that prohibit large numbers of configurations from ever being searched again, and there's a series of refinements.

Once DNA has won out, variants of DNA are very unlikely to appear. Once Windows has appeared, it sticks around, and so forth. Perhaps what happens is a legacy effect, which arises from the non-linearity - the tremendous expense - of reversing certain kinds of systems. The legacies that are created are like lenses that amplify certain bits so that they become more important. This suggests that legacies are similar to semantics on some fundamental level, and that the legacy effect might have something to do with the syntax/semantics distinction, to the degree that distinction is meaningful. It's the first glimmer of a definition of semantics I've ever had, because I've always thought the word didn't mean a damn thing except "what we don't understand". But I'm beginning to think that what it might mean is the legacies we're stuck with.
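A toy model of that stair-step picture, purely illustrative and not drawn from the text: a random search over bit strings that freezes a few bits each time an improvement wins out, so ever-larger regions of the configuration space can never be visited again.

# Toy model of the "stair step" picture (purely illustrative): a random search
# over bit strings that locks in some bits whenever it finds an improvement,
# so later search is confined to an ever-smaller part of the space.

import random

random.seed(1)
N = 32
target = [random.randint(0, 1) for _ in range(N)]     # an arbitrary "fit" configuration
state  = [random.randint(0, 1) for _ in range(N)]
frozen = set()                                        # indices locked in by legacy

def fitness(s):
    return sum(a == b for a, b in zip(s, target))

best = fitness(state)
for step in range(2000):
    mutable = [i for i in range(N) if i not in frozen]
    if not mutable:
        break
    i = random.choice(mutable)
    trial = state[:]
    trial[i] ^= 1                                      # flip one still-mutable bit
    if fitness(trial) > best:                          # an improvement "wins out" ...
        state, best = trial, fitness(trial)
        frozen.update(random.sample(mutable, k=min(2, len(mutable))))  # ... and creates legacy

print(f"fitness {best}/{N}, {len(frozen)} of {N} bits frozen by legacy")

Because some bits get frozen in the "wrong" position, the search typically ends short of the best configuration: the legacy, not the fitness landscape, determines which bits still matter.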

To tie the circle back to the "Rebooting Civilization" question, what I'm hoping might happen is that as we start to gain a better understanding of how enormously difficult, slow, expensive, tedious and rare an event it is to program a very large computer well - as soon as we have a sense and appreciation of that - we can overcome the sort of intoxication that overcomes us when we think about Moore's Law, and start to apply computational metaphors more soberly, both to natural science and to metaphorical purposes for society and so forth. A well-appreciated computer, one whose picture includes the difficulty of making large software well, could serve as a far more beneficial metaphor than the cartoon computer based only on Moore's Law - the idea that all you have to do is make it fast and everything will suddenly work, the computers-will-become-smarter-than-us-if-you-just-wait-20-years sort of metaphor that has been prevalent lately.

Continued at Edge.

Copyright © 2001 by Edge Foundation, Inc.



www.edge.org

Mind·X Discussion About This Article:

The Central Metaphor of Everything?
posted on 09/03/2003 12:11 AM by REDquist


I think this is just right, i.e. computer speed doesn't translate into cognition when current approaches to software are the basic language. I wish Jaron would attempt to point out a direction here. If our current software practices don't work, where should we go? We describe the brain as a neural net, so that is where we could start. Something massively parallel that is given some input/senses (hear, see, make sounds, touch, etc.) and then programs itself, probably based on some reasonably simple rules that provide for communication/signaling and that then lead to organization.
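One minimal sketch of what "reasonably simple rules that lead to organization" could look like is Hebbian learning ("cells that fire together wire together") - offered here only as an illustration, not as a proposal from the post: connections between co-active inputs strengthen on their own, and cluster structure emerges without being explicitly programmed.

# A minimal, illustrative Hebbian sketch: co-active inputs strengthen their
# mutual connections, so the network organizes into clusters by itself.

import random

random.seed(0)
n_inputs = 8
weights = [[0.0] * n_inputs for _ in range(n_inputs)]

def present(pattern, rate=0.1):
    # Strengthen the connection between every pair of co-active inputs.
    for i in range(n_inputs):
        for j in range(n_inputs):
            if i != j:
                weights[i][j] += rate * pattern[i] * pattern[j]

# Two "sensory" patterns that tend to occur as wholes: inputs 0-3 vs. inputs 4-7.
for _ in range(100):
    group = [1, 1, 1, 1, 0, 0, 0, 0] if random.random() < 0.5 else [0, 0, 0, 0, 1, 1, 1, 1]
    present(group)

# Connections have organized into two clusters, with no explicit rule saying so.
print(f"within-cluster weight  0-1: {weights[0][1]:.1f}")
print(f"between-cluster weight 0-5: {weights[0][5]:.1f}")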

Maybe DARPA will take us to the next level
posted on 09/03/2003 11:07 AM by grantcc


Genoa II: Man and Machine Thinking as One

By DAN VERTON
SEPTEMBER 01, 2003

Some of the technology shown in last year's blockbuster movie Minority Report may soon be a reality and a centerpiece of the intelligence community's war on terrorism. In the futuristic thriller, Tom Cruise played the head of a police unit that uses psychic technology to arrest and convict murderers before they commit their crimes.
Research into new intelligence technology is taking place as part of a $54 million program known as Genoa II, a follow-on to the Genoa I program, which focused on intelligence analysis.

In Genoa II, the Defense Advanced Research Projects Agency (DARPA) is studying potential IT that may not only enable new levels of collaboration among teams of intelligence analysts, policy-makers and covert operators, but could also make it possible for humans and computers to "think together" in real time to "anticipate and preempt terrorist threats," according to official program documents.


Re: The Central Metaphor of Everything?
posted on 09/03/2003 12:47 PM by /:setAI


for many of us working in AI and Cog Sci, linear digital transistor-based computing has been dead for a number of years- there are so many new and much more powerful approaches- especially in analog computing [the greatest promise is in DNA computing, I think- and beyond that there are many approaches using exotic polymers which naturally form network filamentary structures]- that is the way you have to go-

our work has shown us beyond a shadow of doubt that all complex systems are continuous and saturated with profoundly nested feedback signal-paths- in order to create truly complex systems like minds and bodies, we have to go analog or hybrid-

the dominion of von Neumann computing is at its end- algorithms can provide very powerful specialized modular functions- but not [easily or practically] whole systems

Re: The Central Metaphor of Everything?
posted on 03/12/2007 3:20 AM by raylpc


Jaron Lanier has made a very good point that a technology solution is a package of hardware and software. Many people have optimistically predicted a future based solely on hardware advancement, with the wishful thinking that the same pace of advancement would happen on the software side as well. Moore's Law has fueled much of the speculation from Kurzweil's camp, although the law itself is merely an observation that the industry has been trying very hard to sustain. I won't comment on Kurzweil's Law of Accelerating Returns, which is an even bolder claim. What I want to bring up is the fact that untapped processing power is useless. In this regard, I'm in complete agreement with Lanier that the "stale" software advancement, compared with the leaps and bounds of its hardware counterpart, is the bottleneck of technology advancement as a whole.

Looking back at the history of software, we can see paradigm shifts: from assembly to high-level languages, from procedural to object-oriented, and so on. These were not really earth-shaking changes. More or less, they are still just ways of telling the computer what to do, including the logic-based ones like PROLOG. Intelligence is more than just doing what you're told to do. In order to achieve human intelligence, we need a dramatic paradigm shift that can cast a "creativity spell" on the machines. One common characteristic of such programs is complexity – don't tell me that they will be simple. Even if we don't talk about intelligence, but just about utilizing the newfound processing power, software gets larger and more complicated – think Windows Vista. So there is an inevitable theme – complexity – if we're going to utilize the abundant processing power brought by hardware advancement.

Software is hard, and large software gets exponentially harder and more expensive with scale. Lehman and Belady pioneered studies of large software in the 70s. From a number of statistical studies they identified attributes of large software that make its development hard and expensive (a toy sketch of the second law follows the list):
1. Law of Continuing Change - a system that is being used undergoes continuous change until it is judged more cost effective to restructure the system or replace it by a completely new system.
2. Law of Increasing Complexity - a program that is changed becomes less and less structured (entropy increases) and thus becomes more complex. One has to invest extra effort in order to avoid increasing complexity.
3. Law of Program Evolution (Self Regulation) - the growth rate of global system attributes may seem locally stochastic, but is in fact self-regulating with statistically determinable trends.
4. Law of Invariant Work Rate - the global progress in software development projects is statistically invariant.
5. Law of Incremental Growth Limit - a system develops a characteristic growth increment. When this increment is exceeded, problems concerning quality and usage will result.
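A toy simulation of the second law (all numbers here are arbitrary assumptions, not Lehman and Belady's data): every change nudges structural complexity upward, and only deliberate refactoring effort pulls it back down.

# Toy simulation of the Law of Increasing Complexity (all numbers arbitrary):
# each change adds a little structural entropy; only extra refactoring effort
# invested against that entropy keeps complexity from drifting upward.

import random

random.seed(42)

def simulate(releases=50, refactor_effort=0.0):
    complexity = 1.0
    for _ in range(releases):
        complexity += random.uniform(0.05, 0.25)       # every change degrades structure a bit
        complexity -= refactor_effort                  # effort spent restructuring
        complexity = max(complexity, 1.0)
    return complexity

print(f"no refactoring:     final complexity = {simulate():.1f}")
print(f"steady refactoring: final complexity = {simulate(refactor_effort=0.15):.1f}")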

The last three laws are simply inherent properties of working on large projects involving large numbers of people, and they will apply to future software development regardless of the paradigm we are in. This largely limits the pace of software advancement. One might argue that hardware development suffers from the same limitations. However, this is only partially true, since these limitations mostly affect the design phase rather than the production phase. Once a hardware breakthrough is achieved, the new products can be manufactured in a factory, which is a mechanical process; on the other hand, even if a great new paradigm arrives, software products are still "manufactured" by programmers, which is an "artistic" process. Therefore the production phase in software (coding) is kept in check by the last three laws. This largely explains the "stale" software advancement compared with the hardware side. As a result, the technological future speculated by Kurzweil's camp is unfortunately limited by software advancement even if we could get the kind of computing power he forecasts.

Janier’s point on legacy software is echoed by the first two laws as well. Legacy software limits the extent of revolution in the software industry much more than in the hardware industry as the software’s much longer service life time calls for more stringent backward compatibilities. This re-affirms my take that Kurzweil’s camp is overly optimistic because they assumed an unrealistic expectation in the software advancement.

Re: The Central Metaphor of Everything?
posted on 03/13/2007 2:23 PM by bnbwhitezombie


I also believe that software lags far behind hardware.

It would be nice to have a 'help' file that was really helpful.

Computers are great at throwing up little boxes that tell you there's an error. But just try finding a little box that really tells you, in a manner that you can understand, how to fix that error.

It would really be great if Microsoft could come out with a Windows operating system in which, every time an error message box popped up, another little box popped up giving you simple instructions on how to fix it.

Re: The Central Metaphor of Everything?
posted on 03/13/2007 2:58 PM by NanoStuff


Way to dig up a dinosaur.

Software that throws errors and then identifies the originating problem of the error, describing how to fix it? Think about that for a moment: the software would have to be self-aware, as well as aware of its intended function, to do this. You're asking for software that's smarter than the programmer who wrote it. If that were the case, it would be able to write itself, being aware of potential problems and thus never presenting them in the first place.

Re: The Central Metaphor of Everything?
posted on 03/17/2007 2:52 AM by bnbwhitezombie


Isn't that what Ray was talking about in his book? Creating computers that are smarter than us? Then have them write the software that will go into the next generation of computers?
Man, I hope I'm still alive then. I better quit smoking!

Re: The Central Metaphor of Everything?
posted on 03/17/2007 4:13 AM by bnbwhitezombie


Way to dig up a dinosaur.

Software that throws errors and then identifies the originating problem of the error, describing how to fix it? Think about that for a moment: the software would have to be self-aware, as well as aware of its intended function, to do this. You're asking for software that's smarter than the programmer who wrote it. If that were the case, it would be able to write itself, being aware of potential problems and thus never presenting them in the first place.


Actually, the reason I brought this up was because I downloaded some updated video and audio drivers for my laptop. I installed the drivers, and now every time I boot up my laptop I get error messages telling me that some files are missing. Why would installing updated drivers cause files to disappear? I don't get it. However, it would be nice if Windows would not only tell me that the files are missing but also tell me how to go about replacing them - or just replace the files itself. This wasn't something the software or the computer did to cause the problem; it was something I did.
It would be nice, though, to never see one of those "this computer has performed an illegal function and will shut down" messages again. LOL!