Neural Networks and Analog Computation: Beyond the Turing Limit 1999 Edition
Contributor(s): Siegelmann, Hava T. (Author)
ISBN: 0817639497     ISBN-13: 9780817639495
Publisher: Birkhauser
OUR PRICE:   $161.49  
Product Type: Hardcover - Other Formats
Published: December 1998
Annotation: The theoretical foundations of Neural Networks and Analog Computation conceptualize neural networks as a particular type of computer consisting of multiple assemblies of basic processors interconnected in an intricate structure. Examining these networks under various resource constraints reveals a continuum of computational devices, several of which coincide with well-known classical models. What emerges is a Church-Turing-like thesis, applied to the field of analog computation, which features the neural network model in place of the digital Turing machine. This new concept can serve as a point of departure for the development of alternative, supra-Turing, computational theories. On a mathematical level, the treatment of neural computations not only enriches the theory of computation but also explicates the computational complexity associated with biological networks, adaptive engineering tools, and related models from the fields of control theory and nonlinear dynamics.

The topics covered in this work will appeal to a wide readership from a variety of disciplines. Special care has been taken to explain the theory clearly and concisely. The first chapter reviews the fundamental terms of modern computational theory from the point of view of neural networks and serves as a reference for the remainder of the book. Each of the subsequent chapters opens with introductory material and proceeds to explain the chapter's connection to the development of the theory. Thereafter, the concept is defined in mathematical terms.

Although the notion of a neural network essentially arises from biology, many engineering applications have been found through highly idealized and simplified models of neuron behavior. Particular areas of application have been as diverse as explosives detection in airport security, signature verification, financial and medical time series prediction, vision, speech processing, robotics, nonlinear control, and signal processing. The focus in all of these models is entirely on the behavior of networks as computers.

The material in this book will be of interest to researchers in a variety of engineering and applied sciences disciplines. In addition, the work may provide the base of a graduate-level seminar in neural networks for computer science students.

Additional Information
BISAC Categories:
- Computers | Machine Theory
- Mathematics | Applied
- Computers | Neural Networks
Dewey: 004.015
LCCN: 98029446
Series: Progress in Theoretical Computer Science
Physical Information: 0.6" H x 6.36" W x 9.7" (1.00 lbs) 181 pages
 
Descriptions, Reviews, Etc.
Publisher Description:
Humanity's most basic intellectual quest to decipher nature and master it has led to numerous efforts to build machines that simulate the world or communicate with it [Bus70, Tur36, MP43, Sha48, vN56, Sha41, Rub89, NK91, Nyc92]. The computational power and dynamic behavior of such machines is a central question for mathematicians, computer scientists, and occasionally, physicists. Our interest is in computers called artificial neural networks. In their most general framework, neural networks consist of assemblies of simple processors, or "neurons," each of which computes a scalar activation function of its input. This activation function is nonlinear, and is typically a monotonic function with bounded range, much like neural responses to input stimuli. The scalar value produced by a neuron affects other neurons, which then calculate a new scalar value of their own. This describes the dynamical behavior of parallel updates. Some of the signals originate from outside the network and act as inputs to the system, while other signals are communicated back to the environment and are thus used to encode the end result of the computation.
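The parallel-update dynamics described above can be sketched in a few lines of code. This is a minimal illustration only, not the book's formal construction: the two-neuron network, its weights, and the saturated-linear activation are all assumptions chosen for the example.

```python
# Sketch of synchronous ("parallel update") neural network dynamics.
# All weights and the activation function are illustrative assumptions.

def sigma(x):
    """A nonlinear, monotonic activation with bounded range [0, 1]
    (saturated-linear), standing in for a generic neural response."""
    return min(1.0, max(0.0, x))

def step(state, weights, input_weights, inputs):
    """One parallel update: every neuron recomputes its scalar activation
    simultaneously from the current states and the external input lines."""
    n = len(state)
    return [
        sigma(
            sum(weights[i][j] * state[j] for j in range(n))
            + sum(input_weights[i][k] * inputs[k] for k in range(len(inputs)))
        )
        for i in range(n)
    ]

# A tiny two-neuron network driven by a single external input line held at 1.0.
weights = [[0.5, -0.3],
           [0.2,  0.4]]
input_weights = [[1.0],
                 [0.0]]
state = [0.0, 0.0]
for _ in range(3):
    state = step(state, weights, input_weights, inputs=[1.0])
```

In this framework the network's output would simply be read off the states of a designated subset of neurons after the updates, mirroring how signals are "communicated back to the environment" in the description above.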