
    The Age of Intelligent Machines: A NOR B--The Basis of Intelligence?
by Raymond Kurzweil

From Ray Kurzweil's revolutionary book The Age of Intelligent Machines, published in 1990.


An entire computer can be constructed from the repetition of a very simple device called a logic gate. We can make an even stronger statement: any computer can be built with just one type of logic gate. Any computer, whether a serial computer that does one computation at a time or a massively parallel computer that does many things at once, can in theory be built from a suitable number of copies of this very simple device.

The Church-Turing thesis states that all problems solvable by a "sentient being" (a person) are reducible to a set of algorithms. Another way of expressing this thesis is that machine intelligence and human intelligence are essentially equivalent. According to this (controversial) thesis, human intelligence may at present be (quantitatively) more complex than machine intelligence (that is, it may use more parallelism and more sophisticated algorithms), but there is no fundamental (qualitative) difference between them. As machines gain in capacity, do more operations in parallel, and use more capable methods (better algorithms), they will be able to emulate human intellectual ability more and more closely. This article will demonstrate that the logic gate, particularly the one we call NOR, can be considered the basis of all machine intelligence. To the extent that one accepts the Church-Turing thesis, the simple NOR gate can be considered (at least in theory) the basis of human intelligence as well. Our ability to model the complex and diverse examples of intelligent behavior from building blocks of NOR gates is another example of a simple unifying formula underlying complicated phenomena. As the following derivation points out, however, this insight alone is not sufficient to build an intelligent machine.

To understand how we can build a computer from logic gates, it is first necessary to understand the nature of such devices.


A logic gate.
A logic gate is a device with two inputs and one output. We can imagine each input and the output to be wires. Each wire can have two states: true and false. When implemented with electronics, this often corresponds to the presence or absence of electric voltage. For example, in TTL (transistor-transistor logic) circuits, a very common type, +5 volts and 0 volts represent the two states.

With two inputs and two possibilities for each input, there are four possible input combinations to a logic gate. There are several types of logic gates, and each type associates these four input combinations with outputs in its own way. For example, the NOR ("not or") logic gate transforms each of the four possible input combinations as shown in the table.

The NOR gate

    Input A   Input B   Output
    False     False     True
    False     True      False
    True      False     False
    True      True      False

The output of NOR is true if and only if neither A nor B is true. Another example of a logic gate is the AND gate (see table).
The AND gate

    Input A   Input B   Output
    False     False     False
    False     True      False
    True      False     False
    True      True      True

The output of AND is true if and only if both input A and input B are true. There are 16 possible different logic gates, corresponding to all possible mappings of the four input combinations onto outputs. Some of these logic functions have commonly understood names and others do not (see table).
Outputs of the 16 possible dual-input logic functions

                          Inputs (A-B)
    Function name         F-F   F-T   T-F   T-T
    ALWAYS FALSE          F     F     F     F
    AND                   F     F     F     T
    A AND NOT B           F     F     T     F
    A                     F     F     T     T
    B AND NOT A           F     T     F     F
    B                     F     T     F     T
    XOR (exclusive or)    F     T     T     F
    OR                    F     T     T     T
    NOR                   T     F     F     F
    EQUIVALENCE           T     F     F     T
    NOT B                 T     F     T     F
    A OR NOT B            T     F     T     T
    NOT A                 T     T     F     F
    B OR NOT A            T     T     F     T
    NAND (not and)        T     T     T     F
    ALWAYS TRUE           T     T     T     T

Not all of the logic functions are equally useful. For example, the ALWAYS TRUE function, whose output is true regardless of its input, is relatively useless, since its output is always the same. The same can be said of the ALWAYS FALSE function.
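
To make the gate-as-truth-table idea concrete, here is a minimal Python sketch (an illustrative addition, not part of the original article) that represents a two-input gate by its column of four outputs. It confirms that there are exactly 2^4 = 16 such functions and picks out AND and NOR as two familiar rows of the table above.

    from itertools import product

    # The four input combinations, in the order used by the tables above:
    # (A, B) = F-F, F-T, T-F, T-T.
    INPUTS = list(product((False, True), repeat=2))

    # A two-input gate is completely determined by its column of four outputs,
    # so there are exactly 2**4 = 16 possible gates.
    all_columns = list(product((False, True), repeat=4))
    print(len(all_columns))  # 16

    def gate_from_column(column):
        """Turn an output column into a callable two-input gate."""
        table = dict(zip(INPUTS, column))
        return lambda a, b: table[(a, b)]

    # Two familiar rows: AND has the column F, F, F, T and NOR has T, F, F, F.
    and_gate = gate_from_column((False, False, False, True))
    nor_gate = gate_from_column((True, False, False, False))
    print(and_gate(True, True), nor_gate(False, False))  # True True
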
An important observation is that we can create all 16 of the functions from the NOR function. For example, we can create NOT A as shown in the figure. We can call NOT A by a simpler name: NOT. It simply transforms a single input into its opposite. We can also create OR and AND from NOR (see figures).
If we express these equivalences in the language of logic, we have NOT A = A NOR A; A OR B = NOT (A NOR B); A AND B = (NOT A) NOR (NOT B).
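
These three equivalences can be checked mechanically. The following Python sketch (an illustrative addition; the function names are mine) defines NOT, OR, and AND using nothing but a NOR primitive and verifies them against Python's built-in boolean operators for every input combination.

    def nor(a, b):
        """The NOR gate: true if and only if neither input is true."""
        return not (a or b)

    def not_(a):
        return nor(a, a)                 # NOT A = A NOR A

    def or_(a, b):
        return not_(nor(a, b))           # A OR B = NOT (A NOR B)

    def and_(a, b):
        return nor(not_(a), not_(b))     # A AND B = (NOT A) NOR (NOT B)

    # Check all four input combinations against the built-in operators.
    for a in (False, True):
        for b in (False, True):
            assert not_(a) == (not a)
            assert or_(a, b) == (a or b)
            assert and_(a, b) == (a and b)
    print("NOT, OR, and AND all behave correctly when built from NOR")
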
With NOT, AND, and OR (each of which, as we have shown, can be constructed from the NOR function), we can in turn build up all 16 logic functions (see table). Since all 16 logic functions can be built from NOT, AND, and OR, and NOT, AND, and OR can in turn be built from the NOR function, it follows that we can build all logic functions from NOR.
Building logic functions from NOT, AND, and OR

                       Inputs (A-B)
    Function name      F-F  F-T  T-F  T-T   Description using NOT, AND, and OR
    ALWAYS FALSE       F    F    F    F     A AND NOT A
    AND                F    F    F    T     A AND B
    A AND NOT B        F    F    T    F     A AND NOT B
    A                  F    F    T    T     A
    B AND NOT A        F    T    F    F     B AND NOT A
    B                  F    T    F    T     B
    XOR                F    T    T    F     (A OR B) AND NOT (A AND B)
    OR                 F    T    T    T     A OR B
    NOR                T    F    F    F     NOT (A OR B)
    EQUIVALENCE        T    F    F    T     (NOT A AND NOT B) OR (A AND B)
    NOT B              T    F    T    F     NOT B
    A OR NOT B         T    F    T    T     NOT (B AND NOT A)
    NOT A              T    T    F    F     NOT A
    B OR NOT A         T    T    F    T     NOT (A AND NOT B)
    NAND               T    T    T    F     NOT (A AND B)
    ALWAYS TRUE        T    T    T    T     A OR NOT A
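
As a further check, the sketch below (again an illustrative addition) encodes each row of the table as a Python expression in NOT, AND, and OR and confirms that it reproduces the listed output column.

    INPUTS = [(False, False), (False, True), (True, False), (True, True)]

    # Each entry: function name -> (output column from the table, expression
    # built only from NOT, AND, and OR).
    TABLE = {
        "ALWAYS FALSE": ("FFFF", lambda a, b: a and not a),
        "AND":          ("FFFT", lambda a, b: a and b),
        "A AND NOT B":  ("FFTF", lambda a, b: a and not b),
        "A":            ("FFTT", lambda a, b: a),
        "B AND NOT A":  ("FTFF", lambda a, b: b and not a),
        "B":            ("FTFT", lambda a, b: b),
        "XOR":          ("FTTF", lambda a, b: (a or b) and not (a and b)),
        "OR":           ("FTTT", lambda a, b: a or b),
        "NOR":          ("TFFF", lambda a, b: not (a or b)),
        "EQUIVALENCE":  ("TFFT", lambda a, b: (not a and not b) or (a and b)),
        "NOT B":        ("TFTF", lambda a, b: not b),
        "A OR NOT B":   ("TFTT", lambda a, b: not (b and not a)),
        "NOT A":        ("TTFF", lambda a, b: not a),
        "B OR NOT A":   ("TTFT", lambda a, b: not (a and not b)),
        "NAND":         ("TTTF", lambda a, b: not (a and b)),
        "ALWAYS TRUE":  ("TTTT", lambda a, b: a or not a),
    }

    for name, (column, expression) in TABLE.items():
        computed = "".join("T" if expression(a, b) else "F" for a, b in INPUTS)
        assert computed == column, name
    print("all 16 rows of the table check out")
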
It turns out that there is one other logic function that can also be used to build all the logic functions, the NAND gate. The reason for this is as follows. A critically important logic function is negation or the NOT function. Only four of the 16 logic functions allow us to negate an input: NOR, NAND, NOT A, and NOT B. Of these four, two, NOT A and NOT B, are unable to give us AND or OR because they each ignore one of their inputs. Thus, only NOR and NAND can give us all three vital building blocks of NOT, AND, and OR.
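
The same construction works for NAND. Here is a brief Python sketch (illustrative, not from the book; the identities A NAND A = NOT A and the De Morgan forms used here are standard) that builds NOT, AND, and OR from a NAND primitive and verifies them.

    def nand(a, b):
        """The NAND gate: false only when both inputs are true."""
        return not (a and b)

    def not_(a):
        return nand(a, a)                  # NOT A = A NAND A

    def and_(a, b):
        return not_(nand(a, b))            # A AND B = NOT (A NAND B)

    def or_(a, b):
        return nand(not_(a), not_(b))      # A OR B = (NOT A) NAND (NOT B)

    for a in (False, True):
        for b in (False, True):
            assert not_(a) == (not a)
            assert and_(a, b) == (a and b)
            assert or_(a, b) == (a or b)
    print("NOT, AND, and OR all behave correctly when built from NAND")
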

Memory

Thus, all of logic can be built up from one simple logic gate, NOR (or alternatively, NAND). There is, however, one other important type of mechanism essential for either machine or human intelligence, the ability to remember. In the same way that we can build up very complex logic functions from very simple ones, we can also build very complex and extensive memory structures from simple one-bit (on or off) memory. The question, then, is, Can we build a one-bit memory from logic gates? As it turns out, we can.

A memory cell from NOR.
If the "set" input (which might be called MAKE TRUE) is made true, even momentarily, the output will become true and will remain true even when the set input returns to false. This memory cell, constructed of only two NOR gates, thus remembers forever even a temporary state of true on the set input. Similarly, a momentary state of true on the "reset" (MAKE FALSE) input will cause the output to go permanently to false. Interestingly, we can replace the two NOR gates with NAND gates in exactly the same configuration and the memory cell will work in a similar way (we need only reverse the positions of the "memory" and "not memory" output lines and negate the set and reset input lines).
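
To see the feedback behavior in action, here is an illustrative Python simulation (not from the original text; the class name and update loop are my own) of the two cross-coupled NOR gates. Repeatedly evaluating the two gates lets the outputs settle, which stands in for the continuous feedback of a real circuit.

    def nor(a, b):
        return not (a or b)

    class NorLatch:
        """Two cross-coupled NOR gates forming a set-reset memory cell."""

        def __init__(self):
            self.q = False        # the "memory" output
            self.q_bar = True     # the "not memory" output

        def step(self, set_input, reset_input):
            # Evaluate both gates a few times so the feedback loop settles.
            for _ in range(4):
                self.q_bar = nor(set_input, self.q)
                self.q = nor(reset_input, self.q_bar)
            return self.q

    latch = NorLatch()
    latch.step(set_input=True, reset_input=False)          # momentary "set"
    print(latch.step(set_input=False, reset_input=False))  # True: it remembers
    latch.step(set_input=False, reset_input=True)          # momentary "reset"
    print(latch.step(set_input=False, reset_input=False))  # False: reset remembered
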

We thus have all logic functions as well as memory, the two requirements for computation, built up from a very simple device, the NOR gate (or NAND gate). With enough NOR gates we can build any computer or implement any algorithm. All machine intelligence (and, if we accept the Church-Turing thesis, all natural intelligence) can be constructed from NOR.

To solve a problem, it is not enough, however, to simply have a sufficient number of NOR gates. If we dumped a million NOR gates on a table, they would not perform any function, useful or otherwise. Obviously, they need to be connected together and to the outside world in appropriate ways. The outside world can be modeled as a series of switches that present problems or situations. Even human sensory inputs, such as sight or hearing, can be modeled in this way. The input switches could be thousands or even millions of light sensors arranged in an array or auditory sensors that respond to particular frequencies. Even continuous-valued input can be modeled by using multiple input switches to represent different levels of input.
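
As a small illustration of the last point (an addition of mine; the 0-255 brightness range is an assumed example), a continuous input level can be presented to such a machine as a handful of true/false switches, here simply its binary digits.

    def level_to_switches(level, n_switches=8):
        """Represent an integer level (0-255 for 8 switches) as true/false inputs."""
        return [bool(level >> i & 1) for i in range(n_switches)]

    print(level_to_switches(200))
    # [False, False, False, True, False, False, True, True]  (least significant bit first)
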

Let us define a "node" to be a particular input switch, a final output display, or an input or output of a particular logic gate. We thus have the following types of nodes:

  • Inputs presenting a problem or situation
  • Logic-gate inputs
  • Logic-gate outputs
  • Final outputs

If we assign each of these nodes a unique number, we can then describe any machine by a series of numbers that describes the connections of the nodes. For example, if our machine is described as 10, 126, 4034, 28, 1, 12, ..., this would mean that node 1 is connected to node 10, node 2 is connected to node 126, and so on. Thus, any machine can be described as a series of numbers describing its logical connections.
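
A minimal Python rendering of this numbering scheme (illustrative only; the dictionary representation is my own) takes the example list from the paragraph above and turns it into an explicit node-to-node wiring map.

    # The example connection list from the text: the k-th entry (counting from 1)
    # names the node that node k is connected to.
    connection_list = [10, 126, 4034, 28, 1, 12]

    connections = {source: target
                   for source, target in enumerate(connection_list, start=1)}

    print(connections[1])  # 10  -> node 1 is connected to node 10
    print(connections[2])  # 126 -> node 2 is connected to node 126
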

There is one other type of information required before our machine can solve a problem, and that is the initial state of the memory cells. Even though the memory cells are constructed from logic gates, they have two different states: they hold either the value true or the value false at any one time. We need to set an initial value for each such cell. If we number each memory cell, we can describe these initial values as a sequence of one-bit numbers (such as 0, 1, 1, 0, etc.). If we group the memory cells into "words" of more than one cell each, then our list of initial values can use numbers other than just 0 and 1.
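
The initial-memory list can be rendered the same way. In the sketch below (illustrative; the 8-bit word size and the sample bit values are assumptions, and the bits are taken least significant first), the one-bit values are grouped into words so that the list can use numbers other than 0 and 1.

    # One bit per memory cell (sample values), grouped into 8-bit words.
    initial_bits = [0, 1, 1, 0, 1, 0, 0, 1,
                    1, 1, 1, 1, 0, 0, 0, 0]

    word_size = 8
    words = [
        sum(bit << i for i, bit in enumerate(initial_bits[start:start + word_size]))
        for start in range(0, len(initial_bits), word_size)
    ]
    print(words)  # [150, 15]
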

Hardware and Software

The first list, which describes the interconnections of our inputs, logic gates, and outputs, might be called the hardware description. The second list, which describes the initial states of the memory cells, can be called the software description. In designing such systems, real engineers would substitute more descriptive symbols for most of the numbers; the final result would nonetheless be a string of numbers. At this point we notice that the hardware description of our machine looks very similar to the software description: both are simply lists of numbers. We might substitute the word "information" or even "knowledge" for the expression "list of numbers." Seen in this light, the creation of hardware designs and the creation of software designs are very similar: both involve the creation of software-like information. We might even regard them both as forms of software.

This model of intelligence as a sea of NOR gates provides us with a simple theoretical model for understanding machines that can implement an algorithm and for thinking about machine complexity. If we regard the two lists of numbers described above as descriptions of software, then we can say that we have a simple formula (i.e., NOR) for the hardware of intelligence; we will have to look elsewhere for a unifying model of the software.


Hooking up the NORs. Node connections described by the list 10, 126, 4034, 28, 1, 12.


Deriving NOT from NOR.




Deriving OR from NOR.




Deriving AND from NOR.


 
 
