Item description for Emergent Neural Computational Architectures Based on Neuroscience: Towards Neuroscience-Inspired Computing (Lecture Notes in Computer Science) by Stefan Wermter...
It is generally understood that present approaches to computing lack the performance, flexibility, and reliability of biological information processing systems. Although there is a comprehensive body of knowledge about how information processing occurs in the brain and central nervous system, this has so far had little impact on mainstream computing. This book presents a broad spectrum of current research into biologically inspired computational systems and thus contributes towards developing new computational approaches based on neuroscience. The 39 revised full papers by leading researchers were carefully selected and reviewed for inclusion in this anthology. Besides an introductory overview by the volume editors, the book offers topical parts on modular organization and robustness, timing and synchronization, and learning and memory storage.
Release Date Aug 24, 2001
ISBN 354042363X ISBN13 9783540423638
Reviews - What do customers think about Emergent Neural Computational Architectures Based on Neuroscience: Towards Neuroscience-Inspired Computing (Lecture Notes in Computer Science)?
Good overview of current research May 15, 2005
Considering the intense efforts in bio-inspired computing, taking the form of genetic algorithms, swarm intelligence, and artificial life, it is not surprising that the field would also draw inspiration from the workings of the brain, whether that brain belongs to a human or some other mammal. This is both an exciting development and a difficult one, not only because a complete working model of the mammalian brain is not yet available, but also because of the sheer computational power needed. With faster machines the second problem will of course be alleviated, and the first is undergoing rapid development thanks to research efforts in neuroscience. This book is a collection of articles that review some of the research in neuroscience-inspired computing. Since they are review articles, readers will not find in-depth discussion, but references are given that will assist curious readers who need more information on a particular topic.
The editors introduce the subject of neuroscience-inspired computing by pointing out some of the main sources for this inspiration. These sources serve to divide the book into four sections. The first of these concerns the modular organization of the brain. Even though the view of the human brain as composed of specialized modules is still the subject of intense debate, there is experimental evidence from brain imaging studies that certain regions of the brain are correlated with specific cognitive functions. Neural networks are of course used extensively in business and industry, but they are highly task-specialized, and it would therefore be interesting, and useful, to combine such networks so that the resulting system can solve more general tasks.
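The idea of combining task-specialized networks can be illustrated with a gated mixture of modules, in the spirit of mixture-of-experts architectures. Everything below (the function names, the softmax gate, the toy linear "experts" standing in for trained networks) is an illustrative sketch, not a method taken from the book:

```python
import numpy as np

def expert(w):
    """A task-specialized module: a fixed linear map standing in for a trained network."""
    return lambda x: w @ x

def gated_combine(experts, gate_weights, x):
    """Blend the specialized modules' outputs with a softmax gate over the input."""
    scores = np.array([g @ x for g in gate_weights])
    gates = np.exp(scores - scores.max())   # subtract max for numerical stability
    gates /= gates.sum()                    # gates are non-negative and sum to 1
    outputs = np.stack([e(x) for e in experts])
    return gates @ outputs                  # convex combination of expert outputs

# Two toy "specialists" and equal gating produce an averaged response.
e1 = expert(np.eye(2))
e2 = expert(2.0 * np.eye(2))
y = gated_combine([e1, e2], [np.array([0.5, 0.0]), np.array([0.0, 0.5])], np.array([1.0, 1.0]))
```

Because the gate is a softmax, the combined output always stays within the range spanned by the individual modules, so adding a new specialist cannot destabilize the others.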
Another inspiration comes from the remarkable robustness of the human brain. Through various recovery mechanisms, a damaged brain is still able to function to a large degree, and it would therefore be useful to emulate these mechanisms in non-biological machines. The editors briefly discuss various approaches that have been taken in modeling recovery both through regeneration and through functional reallocation. Several papers in the book illustrate the construction of these models. One of them stands out for its emphasis on creating neural systems that are dynamic and adaptive. Everyone who has designed neural networks for practical use is aware of the fine-tuning needed to arrive at a successful architecture. The authors of this particular article use a neuron development simulator along with evolutionary algorithms to evolve various neuron morphologies and architectures.
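The general flavor of evolving a network rather than hand-tuning it can be sketched with a minimal (1+1) evolution strategy. This toy hill-climber over the two parameters of a single tanh unit is only a sketch of the evolutionary principle, not the article's neuron development simulator:

```python
import numpy as np

rng = np.random.default_rng(1)

def fitness(params, xs, ys):
    """Negative mean squared error of a one-neuron tanh unit on a toy task."""
    preds = np.tanh(params[0] * xs + params[1])
    return -np.mean((preds - ys) ** 2)

def evolve(xs, ys, generations=200, sigma=0.1):
    """(1+1) evolution strategy: mutate the parent, keep the child if it is fitter."""
    parent = rng.normal(size=2)
    best = fitness(parent, xs, ys)
    for _ in range(generations):
        child = parent + rng.normal(scale=sigma, size=2)  # Gaussian mutation
        f = fitness(child, xs, ys)
        if f >= best:
            parent, best = child, f
    return parent, best

# Evolve the unit to imitate a target response tanh(2x + 0.5).
xs = np.linspace(-1.0, 1.0, 50)
ys = np.tanh(2.0 * xs + 0.5)
params, best = evolve(xs, ys)
```

Real neuroevolution systems mutate morphologies and connectivity, not just two scalars, but the select-the-fitter loop is the same.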
The third source of neuroscience-inspired computing is the neurophysiology of the brain. The performance of the brain depends on the temporal correlation between collections of neurons and brain regions. In the typical construction of a neural network, temporal dependencies are usually not taken into account, which prevents the network from dealing with data that is temporal in nature. In one of the articles in the book, the author describes the neuroscience behind time-dependent learning and proposes an associative learning rule that respects the potentiation or depression of the neuronal synapse at long time scales. The dynamical-synapse model he proposes involves computing the excitatory postsynaptic potential at the synapse and the backpropagating action potential; a learning rule is then constructed that depends on the cross-correlation between these two signals. This model, along with others discussed elsewhere in the book, illustrates the role of timing and synchronization in neuronal processes. One of the more exotic models discussed in this regard is based on chaotic dynamics. Although modeling the brain as a collection of chaotic neural objects is difficult to validate, because of the long time scales and large amount of data required, the model is discussed in sufficient detail to make it worth reading.
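Learning rules of this timing-dependent kind resemble spike-timing-dependent plasticity (STDP). A minimal pair-based sketch follows; the exponential time window and the parameter values are standard textbook choices, not the article's actual cross-correlation rule:

```python
import numpy as np

def stdp_update(pre_spike_times, post_spike_times,
                a_plus=0.1, a_minus=0.12, tau=20.0):
    """Pair-based STDP: sum a weight change over all pre/post spike pairs.

    dt = t_post - t_pre (in ms). Pre-before-post potentiates (LTP),
    post-before-pre depresses (LTD), each with an exponential window tau.
    """
    dw = 0.0
    for t_pre in pre_spike_times:
        for t_post in post_spike_times:
            dt = t_post - t_pre
            if dt > 0:
                dw += a_plus * np.exp(-dt / tau)   # causal pair -> potentiation
            elif dt < 0:
                dw -= a_minus * np.exp(dt / tau)   # anti-causal pair -> depression
    return dw

# A presynaptic spike 10 ms before the postsynaptic one strengthens the synapse.
dw = stdp_update([0.0], [10.0])
```

The sign of the weight change thus encodes which signal led the other, which is the essence of using cross-correlation between pre- and postsynaptic activity as a learning signal.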
The last source of inspiration concerns the memory storage capabilities of the human brain. One of the articles in the book concerns the construction of artificial neural networks that are context-sensitive, hierarchical, and extensible. This article is more ambitious than the others in the book, as it discusses many difficult issues in neuronal modeling. The goal of the modeling is to account for the ability of the human brain to engage in abstract reasoning. The local associationist algorithms that are usually used cannot emulate symbol-like behavior, and so the authors attempt to use recurrent nets and `schemata' to deal with this issue.
The articles definitely motivate the reader to investigate further the status of research in neuroscience-inspired computing. Further progress in neuroscience will be needed before machines can be constructed that emulate the brain in more detail. But even without a full understanding of the human brain, machines could be built that exploit some of its approximate features, and such machines may have capabilities exceeding those of machines whose functioning is not based on brain processes or their modular structure. The practical use of these machines will then motivate the construction of even better ones as more knowledge of brain processes becomes available.