The Age of Intelligent Machines: The Significance of Fifth-Generation Computer Systems
by Kazuhiro Fuchi

Since 1982 Kazuhiro Fuchi has been director of the Research Center of the Institute for New Generation Computer Technology (ICOT, also known as the Fifth-Generation Computer System Project). He is editor of the journal New Generation Computing and author of numerous articles on cutting-edge computer technology in Japan.

From Ray Kurzweil's revolutionary book The Age of Intelligent Machines, published in 1990.


The Fifth-Generation Computer System (FGCS) Project was inaugurated in 1983. I would like to report on the significance of the project today. The word "significance" is quite hard to define. It will probably be given different definitions by different people. To put it most simply, if I say the project will be useful or profitable, some people may think it is very significant. Since, however, profit is a very tough subject for me to deal with, I would like to see the significance of the project from a different perspective.

Computer History Will Change

The greatest significance of the FGCS Project, as I see it, is that computer history will be changed by the successful completion of this project. Will this change be for the good? I feel that the computer must evolve to the next generation if it is really to take root in society. To make such evolution possible is the aim of the project. I think that if the aim is achieved, that will be the significance of the project. More simply, the project, as I see it, is aimed primarily at changing the basic design principle that has given us existing computers. I shall expand on this later. Bringing computer technology to a new stage by changing its basic design philosophy, then, is the aim of the project. Has a new computer-design principle been established yet? The answer is no. I think that changing the basic design means establishing new concepts and translating them into practical basic technologies. Establishing basic technologies is the goal of this ten-year project.

As we work on the project, we have visitors from various fields, such as journalists and researchers. One visitor wanted to have a look at a fifth-generation computer. When I told him that we did not have one as yet, he joked, "You've hidden it somewhere, haven't you?" Another serious visitor said this, to my embarrassment: "We have a plan at our company to introduce a new computer system three years or so in the future. Your fifth-generation computer seems very good, and we'd like to install it."

Ten Years Are Needed to Establish Basic Technologies

The ten-year time frame of the project is rather long, but I feel it takes ten years or so to establish basic technologies. If the project succeeds, it will still take several more years to build commercial products on the basis of it. So its realization will take a dozen years from inception, perhaps too distant a future for some people. But I believe it is still very worthwhile to pursue the project.

When I am asked if there is any blueprint for the fifth-generation computer system kept in some vault, I say no, at least not for now. I should rather say that drawing up such a basic blueprint is the goal of the project. The most difficult part of the project is to make the idea of the project understood. Once a product is physically available, we can readily make it understood. But we don't have that yet. There is now no example anywhere in the world that we can cite to show what our projected computer system will be like. This is a characteristic of the project. If what we aim at could be explained in terms of increasing the speed of some known process by ten times or reducing its cost by a certain percentage, it could be understood very easily. As it is, there is no such process.

But that does not mean we have stepped into an entirely nebulous field. And I think that this too is a characteristic of the project. Planning for the project was preceded by three years of research and study, and, to tell the truth, it incorporated discussions held in various places even further back. For instance, at the Electrotechnical Laboratory, where I was before I came to ICOT, we had had discussions for five or six years on what the next age would be like. One of the motives behind the project is to integrate such discussions from various places. In the planning process we also discussed where we should go on the basis of the diverse leading-edge research under way throughout the world. I myself made efforts in that direction, and so did other researchers. So the project is built on research conducted in the past and not on just a collection of casual ideas. However, when it comes to shaping a coherent image out of such varied research, you cannot do it by merely gathering and processing the data statistically. I may say the project represents a refinement of the very intensive discussions we had with various people and of the insights gained from past trends.

Establishing new technology is the primary aim of the project. Japan is not very experienced in developing new technology where there is an idea or goal but no example to go by. First a new methodology for implementation of the project must be developed. If we could develop new technology by just following traditional methods, that would be best. But that is not possible, I think. As is often pointed out, the Japanese traditionally prefer stability to innovation. But just developing old themes will not create new technology. That is why we need a new method for carrying out the project. To sum up, I think that making an effort to demonstrate where Japan is going with new technology is the nature and significance of the project.

The Basis Is a Logic Machine

As I see it, there is emerging a situation that calls for Japan to strive to make great contributions at what Professor Moto-oka aptly called a precompetitive stage. The FGCS Project responds to just that situation.

Though the primary goal of the project was just now explained in terms of changing the design philosophy of computers, it can also be described as developing an easy-to-use computer or a computer with intelligence. When I talked about this with one gentleman, he suggested that the goal might be to seek a new paradigm. The term "paradigm" is normally used to indicate an example, tangible or intangible, that provides the basis for the evolution of a cultural or scientific theory. Let me explain this in my own parlance. The computer, as I see it, is a logic machine.

From that standpoint, the basis of the FGCS Project can be traced to logic. But what logic or system of logic is the present computer based on? At the beginning there was Turing-machine theory. It is not entirely wrong to say that the computer has evolved on the basis of that theory. While the present computer is not a Turing machine per se, its basis can be traced back to the Turing theory. So it is still operating in the Turing paradigm. Is there, then, a different paradigm from the Turing paradigm? There is. The Turing theory was published in 1936, which happens to be the year in which I was born. According to the texts, those days were the golden age of logic. Various logic systems were devised to pursue computability, etc., which resulted in the establishment of the concept of computability. From this came the Turing theory.

From the standpoint of logic, however, the Turing theory was a very special one. The mainstream of logic is a system of logic called predicate logic. There were a number of great men involved in establishing predicate logic. This began with Frege in the nineteenth century, who was followed by Gödel in the 1930s. Von Neumann also had his name recorded in the history of predicate logic. With a long history dating way back to Aristotle, predicate logic is, in a sense, an ordinary, more natural logic. So when it comes to logic machines, there might have been predicate-logic machines rather than machines modeled after the Turing machine. But history did not turn out that way.

The computer has followed the route that it has. Some people say that this may have been a gigantic historical detour. I partly agree with them. I think the time is approaching to return to predicate logic as a paradigm. In the past ten years or so there have been moves to restore predicate logic in the field of programming too. Called logic programming, this movement is aimed at developing programming languages based on systems of logic like predicate logic and using them for programming.
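
To make the flavor of logic programming a little more concrete, here is a minimal sketch of my own, written in Python because the article contains no code of its own. The parent facts and the ancestor rule are hypothetical, and the naive fixed-point evaluation is only a stand-in for the resolution mechanism a real logic-programming language such as Prolog would use; the point is simply that the program is stated as logical relations rather than as a procedure.

    # Hypothetical facts: parent(X, Y) means X is a parent of Y.
    parent = {("taro", "hanako"), ("hanako", "jiro")}

    def ancestors(parent_facts):
        """Compute the ancestor relation defined by two clauses:
             ancestor(X, Z) :- parent(X, Z).
             ancestor(X, Z) :- parent(X, Y), ancestor(Y, Z).
        The recursive clause is applied repeatedly until no new facts appear."""
        ancestor = set(parent_facts)                # base clause
        changed = True
        while changed:                              # fixed-point iteration
            changed = False
            for x, y in parent_facts:
                for y2, z in list(ancestor):
                    if y == y2 and (x, z) not in ancestor:
                        ancestor.add((x, z))
                        changed = True
        return ancestor

    print(sorted(ancestors(parent)))
    # [('hanako', 'jiro'), ('taro', 'hanako'), ('taro', 'jiro')]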

Looking back on the planning for the FGCS Project, I may say the project is based on the concept of logic programming. This could be interpreted as redesigning software and applications within that concept, or it could be viewed as building machines with a new type of architecture to support the concept of logic programming. Though I have presented logic as our starting point, I would say that logic was not adopted as a premise from the outset. Rather, it came as a conclusion from the analysis of various research projects, as I mentioned a short while ago.

Logic programming is closely related to various fields. Take the field of data bases, for example. In the world of data bases, relational data bases are now accepted. The basic concept of relational data bases is the relation, and the relation is a concept based on predicate logic. Data bases are coming to account for a very large proportion of computer systems, but the relational-data-base model is not consistent with the programming world at present.

Programs are based not on relations but on procedures, and relations and procedures are fundamentally different from each other. Data-base languages are theoretically more advanced, while the programming languages in current use are based on old concepts. I feel that in the present situation we have no alternative but to connect the two in unnatural ways. Logic programming may be regarded as intended to bring programming up to the level of relational data bases, rather than the other way around. I thus see the possibility that the use of logic in programming will allow it to connect very beautifully with the world of relational data bases.
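
As an illustration of the gap described above, here is a small sketch, with invented data of my own rather than anything from the project: the same question answered once as a procedure, with explicit iteration and assignment, and once as a relation, as the set of tuples satisfying a predicate. The relational form is the one that lines up naturally with predicate logic and with relational data bases.

    # Hypothetical relation employee(name, dept), stored as a set of tuples.
    employees = {
        ("sato", "design"),
        ("suzuki", "design"),
        ("tanaka", "sales"),
    }

    # Procedural style: explicit iteration and assignment.
    def designers_procedural(rows):
        result = []
        for name, dept in rows:
            if dept == "design":
                result.append(name)
        return result

    # Relational style: "the set of names X such that employee(X, 'design') holds".
    designers_relational = {name for (name, dept) in employees if dept == "design"}

    print(sorted(designers_procedural(employees)))   # ['sato', 'suzuki']
    print(sorted(designers_relational))              # ['sato', 'suzuki']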

In the field of software engineering, research was conducted on a variety of subjects, such as new styles of programs, program verification, and program synthesis. From these also came logic programming. When considering program verification or synthesis, or a very efficient debugging system, which I think will be a future challenge, we have great difficulty in theoretically dealing with the basic computer model of today, which is based on the GOTO statement and assignment. The concepts of assignment and the GOTO statement are basic and easy to understand, but they are not suitable for proving properties, because proofs are more naturally oriented toward functions and relations than toward step-by-step changes of state. By contrast, machines based on predicate logic are suited for such purposes, since proving is built into them. This is not just a theoretical argument; it is somewhat in line with the trend some ten years ago toward avoiding the use of GOTO statements wherever possible so as to write neat programs. That fact suggests that programming based on predicate logic is better suited to our purposes than the ordinary languages we are currently using.
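
The following sketch, again my own and only illustrative, hints at why this is so. The imperative version relies on assignment, so proving it correct requires inventing a loop invariant relating the changing variable to the specification; the declarative version is written as the inductive definition itself, so the proof of correctness is a direct induction on the input.

    # Imperative version: correctness needs a loop invariant such as
    # "after i steps, acc == sum of the first i elements".
    def total_imperative(xs):
        acc = 0
        for x in xs:
            acc = acc + x          # assignment: the meaning of acc changes over time
        return acc

    # Declarative version: the definition is the inductive specification,
    #   total([])       = 0
    #   total([x|rest]) = x + total(rest)
    # so correctness follows directly by induction on the length of xs.
    def total_declarative(xs):
        if not xs:
            return 0
        return xs[0] + total_declarative(xs[1:])

    assert total_imperative([1, 2, 3]) == total_declarative([1, 2, 3]) == 6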

Close Relations with Artificial Intelligence

The FGCS Project is also closely related to the field of artificial intelligence. Knowledge-engineering systems, or expert systems, which have been drawing considerable attention recently, are based on the idea of making a consultation system by representing knowledge in the form of rules and processing the rules. Roughly, organizing knowledge into rules is a return to predicate logic. If we go into detail, we shall find diverse arguments on this subject. I might add that AI experts tend to be concerned with even minuscule differences. For knowledge representation there are numerous proposals that are substantially the same with minor differences.
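
For readers unfamiliar with such systems, here is a minimal sketch of the rule-based idea, using invented rules and written in Python rather than in any of the knowledge-representation languages actually used in expert systems: knowledge is expressed as if-then rules over simple propositions, and a small forward-chaining loop applies the rules until no new facts appear.

    # Invented consultation rules: if all premises hold, the conclusion holds.
    rules = [
        ({"fever", "cough"}, "flu_suspected"),
        ({"flu_suspected"}, "recommend_rest"),
    ]

    def consult(facts, rules):
        facts = set(facts)
        changed = True
        while changed:                       # fire rules until a fixed point
            changed = False
            for premises, conclusion in rules:
                if premises <= facts and conclusion not in facts:
                    facts.add(conclusion)
                    changed = True
        return facts

    print(sorted(consult({"fever", "cough"}, rules)))
    # ['cough', 'fever', 'flu_suspected', 'recommend_rest']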

In connection with artificial intelligence, there is the question of natural-language processing. Grammar, for instance, is very closely related to predicate logic. So is semantics. This may be only natural, because logic itself came from a desire to formulate part of the mechanism of natural language. Historically, logic programming originated from proposals by researchers of natural language.

Historical background aside, if we go with the predicate-logic paradigm instead of the Turing paradigm, we can expect that all the problems we now face will be put in order as far as software is concerned.

Linkage with Software Concepts

The resurgence of predicate logic is not limited to the area of software. Diverse research in computer architecture has yielded interesting new architectures. They are interesting not just in their hardware configuration but in that they are intended to connect with software concepts. They include the frequently cited data-flow machines and their variations, reduction machines, and more recently proposed advanced parallel machines. These represent a new trend, because they are aimed at linking architecture to software-engineering concepts.

Conversely, previous bottom-up concepts alone, such as parallelism and associative memory, are not sufficient. They must be connected with high-level software. Recent notable moves take the standpoint that bringing in such software concepts may make it possible to organize parallelism well.

In this context, functional programming is often discussed. Predicate-logic programming, again in macro terms, is an expanded concept that encompasses functional programming. From the viewpoint of architecture too, logic programming may provide a better base for parallel processing, because it contains more parallelism than functional formulation. This thinking underlies the FGCS Project.
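
A small sketch may help show the sense in which a relational formulation is broader than a functional one; the example and its bounds are my own, not the project's. A function computes z from x and y in one fixed direction, while the corresponding relation add(x, y, z) can be queried with any argument left unknown, and its independent solutions are exactly the kind of thing that could, in principle, be explored in parallel.

    # Functional formulation: one fixed direction, from inputs to output.
    def add_function(x, y):
        return x + y

    # Relational formulation: add(x, y, z) holds when x + y == z. Any argument
    # may be left unknown (None); solutions are enumerated over small naturals.
    def add_relation(x=None, y=None, z=None, limit=10):
        for xi in (range(limit) if x is None else [x]):
            for yi in (range(limit) if y is None else [y]):
                zi = xi + yi
                if z is None or z == zi:
                    yield (xi, yi, zi)

    print(add_function(2, 3))              # 5
    print(list(add_relation(z=3)))         # [(0, 3, 3), (1, 2, 3), (2, 1, 3), (3, 0, 3)]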

Our plan to consider such parallel machines is supported by the progress of VLSI technology. Let us look again at the history. What was the primary reason for going with the Turing paradigm in the first place? In the 1940s memory elements were very expensive. So it was necessary to use as simple a hardware configuration and as few vacuum tubes as possible. A great idea born in those circumstances is what is now called the von Neumann computer concept. But computer history has reached a point where the conventional concept of making software do everything on simple hardware presents various problems. The so-called software crisis is taking place in some fields. For this reason I feel we may see computer history moving out of a phase characterized by strict adherence to a basic philosophy born in the days of costly hardware and into a new phase.

FGCS as a Return to Predicate Logic

For these reasons I see the basic thrust of FGCS as a return to predicate logic. This is not an innovation. Rather, it is a return to something old. To people averse to the word innovation, I explain it as restoration. To people fond of innovation, on the other hand, I explain it as a venture into a new field. I am not lying to either group: we are in a transition phase of the cycle of history. What I mean is that history has not yet reached a turning point but will in the 1990s. It will then be a little more than half a century since the computer as we have known it was born. The time span may also justify this scenario.

So far I have explained the FGCS Project from my own point of view. Behind this view are, of course, many discussions. I have boiled down the results of the discussions I have had with various experts over the years and recapitulated them as I understand them. The ICOT Research Center started its activities with researchers sent on loan from numerous organizations. Though they were like a scratch team, these people were very quick to blend harmoniously into the center. Very quickly a sense of togetherness prevailed in the center, and all the people have done better than expected in their respective research fields. This is attributable partly to the enthusiasm of the researchers and those supporting them and partly to the goal of the project, which, if somewhat vague, is ambitious enough to stimulate the young researchers.

Overseas reactions to the project are now very different from what they were when we started discussions on the project. At first there was some criticism that our project had little chance of success. But there have since been increasing numbers of people overseas who support the project's objective of ushering in a new age of computers. I think that this is evidenced by the start of new programs in a number of countries. Behind these startups is the support of researchers in the respective countries.

Allow me to sum up. The FGCS Project is considerably different in nature from a number of other projects under way. It is aimed at something entirely new. For this reason I think we need to make greater efforts than in the past. ICOT has to play a central role in these efforts. But that alone will not be enough. We must encourage research activities throughout Japan and all over the world. The bud is present, and now we need to make it grow. By doing so, we can usher in a really new computer age. But global rather than merely local efforts are needed to make this happen, and after all, the new age will benefit all mankind. As we exert efforts toward that end, we look for cooperation and support from all those concerned everywhere.

We have made a fairly good start on the project. This project is very ambitious in that it is aimed at ushering in a new computer age rather than at developing products in the near future. What we have to do to make that happen, and what we set out to do, is to employ the framework of logic programming: to build a new hardware architecture and a new kind of software and applications within that framework, and thereby to establish a basic body of computer technology for a new age.

Some people say that our commitment to logic programming is simple-minded and may place us under restraint. But evidence indicates that things may go in the direction of logic programming. Moreover, we have no intention of excluding anything else, though we think that the various good ideas suggested so far will fit naturally within that framework. We selected logic programming as the basic idea in the expectation that it would increase freedom in hardware and software design rather than limit it.

The project requires more than group activities on a limited scale. The wisdom of Japan must be combined, and in a broader perspective, the wisdom of computer scientists all over the world must be marshalled to usher in a new age for mankind. So far we have received much more support and cooperation than is usual with other projects. But we have a long, long way to go, and there will be numerous difficulties on the way.


Kazuhiro Fuchi (courtesy of the Institute for New Generation Computer Technology, Tokyo)


 
 

Mind·X Discussion About This Article:


Re: The Age of Intelligent Machines: The Significance of Fifth-Generation Computer Systems
posted on 06/28/2002 2:31 AM by azb0@earthlink.net


Wow, blast from the past. I hadn't heard a word about the fifth-gen effort in years (and a quick net search turned up nothing later than about 1994...)

From a formal viewpoint, I cannot see where predicate-logic-based computers buy you anything qualitatively different from a Turing machine. In particular, you should be able to simulate any conjectured predicate-logic machine on a Turing machine. (Contrastingly, I would find it rather difficult to implement a numerical integrator in Prolog.)

However, as we approach the issue of "growing computers from the ground up", perhaps there is value in thinking about the most elementary components in terms of performing "atomic" predicate logic operations.

I have long tried to "view" the manner in which the living cell cleaves DNA and copies out sections to be read by ribosomes in protein production as if it were representative of "computation". I ask myself "what maps to the instructions, the CPU(s), the registers, etc." It occurs to me now to consider the base operations in a different light.

From my recollection of chemistry, reactions proceed spontaneously as the Gibbs free energy decreases. Can we make an analogy between this behavior and the "reduction" direction of predicate-logic operators?

____tony____

Re: The Age of Intelligent Machines: The Significance of Fifth-Generation Computer Systems
posted on 06/28/2002 10:13 AM by grantc4@hotmail.com


Can anyone give me an example of the kinds of statements used in predicate logic and how they are used to solve a simple problem?

Re: The Age of Intelligent Machines: The Significance of Fifth-Generation Computer Systems
posted on 06/28/2002 11:48 AM by tomaz@techemail.com


(1) No mammal lived as long ago as 1 billion B.C.

(2) All mammals have a mammal ancestor.

(3) Every ancestor lived at least one second before its offspring.


etc. etc ... just like Cyc does.


Now you can formalize those statements and calculate all theorems out of them.

One will be that some mammal lived earlier than 1 billion B.C.

So ... you have a paradox, and one of the axioms must be eliminated. I suggest (2).


It's a little blurrier when a human thinks ... but it can be perfectly simulated by this kind of predicate reasoning.

IMHO.

- Thomas

Re: The Age of Intelligent Machines: The Significance of Fifth-Generation Computer Systems
posted on 07/01/2005 2:19 AM by ScottyDM


Greetings:

Back in the early 90s a group of people got together in Silicon Valley as a sort of answer to the ICOT project. They called themselves The Fifth Generation Computer Club (or maybe it was Group). I was involved in that group and gave a talk at one of the meetings.

In some ways we were a bit like a "Fifth Generation Home Brew Computer Club" in that the ultimate goal of the leadership of the group was to build an "AI machine", and we were made up of individuals and not corporations. I remember one fellow (whose name escapes me) had something he called "The Totality of Knowledge".

If anyone knows what happened to any of these people, or recognizes themselves in this description, I'd sure love to contact any of them again. I'd like to know what happened to the group's members. I'm currently half a continent away from Silicon Valley.


Thanks.

Scotty