Author: James A. Anderson
ISBN: 9780199357802
Publisher: Oxford University Press
Publication: March 3, 2017
Imprint: Oxford University Press
Language: English
Current computer technology doubles in power roughly every two years, an increase known as "Moore's Law." This steady growth is predicted to come to an end soon, and digital technology will change. Although digital computers dominate today's world, there are alternative ways to "compute" that might be better and more efficient than digital computation. After Digital looks at where the field of computation began and where it might be headed, and offers predictions about a collaborative future relationship between human cognition and mechanical computation.

James A. Anderson, a pioneer of biologically inspired neural nets, presents two different kinds of computation, digital and analog, and gives examples of their history, function, and limitations. A third, the brain, falls somewhere between these two forms and is suggested as a computer architecture better suited to certain important cognitive tasks (perception, reasoning, and intuition, for example) than a digital computer, even though the digital computer is built from far faster and more reliable basic elements. Anderson discusses the essentials of brain hardware, in particular the cerebral cortex, and how cortical structure can influence the form taken by the computational operations underlying cognition. Topics include association, understanding complex systems through analogy, the formation of abstractions, the biology of number and its use in arithmetic and mathematics, and computing across scales of organization. These applications, of great human interest, also form the goals of genuine artificial intelligence.

After Digital will appeal to a broad cognitive science community, including computer scientists, philosophers, psychologists, and neuroscientists, as well as the curious lay science reader, and will help readers understand and shape future developments in computation.