Theoretical Neuroscience Computational and Mathematical Modeling of Neural Systems

by Peter Dayan; L. F. Abbott
Format: Paperback
Pub. Date: 2005-08-12
Publisher(s): The MIT Press

    This Item Qualifies for Free Shipping!*

    *Excludes marketplace orders.

List Price: $89.60

Buy New

Arriving Soon. Will ship when available.
$85.33

Buy Used

In stock
$64.00

Rent Textbook

Select for Price

eTextbook

We're Sorry
Not Available

How Marketplace Works:

  • This item is offered by an independent seller and is not shipped from our warehouse.
  • Item details like edition and cover design may differ from our description; see seller's comments before ordering.
  • Sellers must confirm and ship within two business days; otherwise, the order will be cancelled and refunded.
  • Marketplace purchases cannot be returned to eCampus.com. Contact the seller directly for inquiries; if no response within two days, contact customer service.
  • Additional shipping costs apply to Marketplace purchases. Review shipping costs at checkout.

Summary

Theoretical neuroscience provides a quantitative basis for describing what nervous systems do, determining how they function, and uncovering the general principles by which they operate. This text introduces the basic mathematical and computational methods of theoretical neuroscience and presents applications in a variety of areas including vision, sensory-motor integration, development, learning, and memory. The book is divided into three parts. Part I discusses the relationship between sensory stimuli and neural responses, focusing on the representation of information by the spiking activity of neurons. Part II discusses the modeling of neurons and neural circuits on the basis of cellular and synaptic biophysics. Part III analyzes the role of plasticity in development and learning. An appendix covers the mathematical methods used, and exercises are available on the book's Web site.
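To give a flavor of the material in Part II, the sketch below simulates a leaky integrate-and-fire neuron, the simplest spiking model treated in the chapter "Model Neurons I: Neuroelectronics." All parameter values and function names here are illustrative choices, not taken from the book.

```python
# Minimal leaky integrate-and-fire neuron (Euler integration).
# Illustrative sketch only; parameters are typical textbook-style values.

def simulate_lif(i_ext, t_max=0.1, dt=1e-4,
                 tau_m=0.02, v_rest=-0.070, v_thresh=-0.054,
                 v_reset=-0.080, r_m=1e7):
    """Integrate tau_m * dV/dt = -(V - v_rest) + R_m * I_ext
    and return the list of spike times (seconds)."""
    v = v_rest
    spikes = []
    for step in range(int(t_max / dt)):
        # Euler step of the membrane equation
        v += (dt / tau_m) * (-(v - v_rest) + r_m * i_ext)
        if v >= v_thresh:          # threshold crossing: emit a spike
            spikes.append(step * dt)
            v = v_reset            # and reset the membrane potential
    return spikes

spikes = simulate_lif(i_ext=2e-9)  # constant 2 nA drive
print(f"{len(spikes)} spikes in 100 ms")
```

With a constant suprathreshold current the model fires regularly; with zero input the membrane potential stays at rest and no spikes occur.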

Author Biography

Peter Dayan is Professor of Computational Neuroscience at the Gatsby Computational Neuroscience Unit, University College London.

Table of Contents

Preface xiii
I Neural Encoding and Decoding 1
  Neural Encoding I: Firing Rates and Spike Statistics 3
    Introduction 3
    Spike Trains and Firing Rates 8
    What Makes a Neuron Fire? 17
    Spike-Train Statistics 24
    The Neural Code 34
    Chapter Summary 39
    Appendices 40
    Annotated Bibliography 43
  Neural Encoding II: Reverse Correlation and Visual Receptive Fields 45
    Introduction 45
    Estimating Firing Rates 45
    Introduction to the Early Visual System 51
    Reverse-Correlation Methods: Simple Cells 60
    Static Nonlinearities: Complex Cells 74
    Receptive Fields in the Retina and LGN 77
    Constructing V1 Receptive Fields 79
    Chapter Summary 81
    Appendices 81
    Annotated Bibliography 84
  Neural Decoding 87
    Encoding and Decoding 87
    Discrimination 89
    Population Decoding 97
    Spike-Train Decoding 113
    Chapter Summary 118
    Appendices 119
    Annotated Bibliography 122
  Information Theory 123
    Entropy and Mutual Information 123
    Information and Entropy Maximization 130
    Entropy and Information for Spike Trains 145
    Chapter Summary 149
    Appendix 150
    Annotated Bibliography 150
II Neurons and Neural Circuits 151
  Model Neurons I: Neuroelectronics 153
    Introduction 153
    Electrical Properties of Neurons 153
    Single-Compartment Models 161
    Integrate-and-Fire Models 162
    Voltage-Dependent Conductances 166
    The Hodgkin-Huxley Model 173
    Modeling Channels 175
    Synaptic Conductances 178
    Synapses on Integrate-and-Fire Neurons 188
    Chapter Summary 191
    Appendices 191
    Annotated Bibliography 193
  Model Neurons II: Conductances and Morphology 195
    Levels of Neuron Modeling 195
    Conductance-Based Models 195
    The Cable Equation 203
    Multi-compartment Models 217
    Chapter Summary 224
    Appendices 224
    Annotated Bibliography 228
  Network Models 229
    Introduction 229
    Firing-Rate Models 231
    Feedforward Networks 241
    Recurrent Networks 244
    Excitatory-Inhibitory Networks 265
    Stochastic Networks 273
    Chapter Summary 276
    Appendix 276
    Annotated Bibliography 277
III Adaptation and Learning 279
  Plasticity and Learning 281
    Introduction 281
    Synaptic Plasticity Rules 284
    Unsupervised Learning 293
    Supervised Learning 313
    Chapter Summary 326
    Appendix 327
    Annotated Bibliography 328
  Classical Conditioning and Reinforcement Learning 331
    Introduction 331
    Classical Conditioning 332
    Static Action Choice 340
    Sequential Action Choice 346
    Chapter Summary 354
    Appendix 355
    Annotated Bibliography 357
  Representational Learning 359
    Introduction 359
    Density Estimation 368
    Causal Models for Density Estimation 373
    Discussion 389
    Chapter Summary 394
    Appendix 395
    Annotated Bibliography 396
Mathematical Appendix 399
  Linear Algebra 399
  Finding Extrema and Lagrange Multipliers 408
  Differential Equations 410
  Electrical Circuits 413
  Probability Theory 415
  Annotated Bibliography 418
References 419
Index 439

An electronic version of this book is available through VitalSource.

This book is viewable on PC, Mac, iPhone, iPad, iPod Touch, and most smartphones.

By purchasing, you will be able to view this book online, as well as download it, for the chosen number of days.

Digital License

You are licensing a digital product for a set duration. Durations are set forth in the product description, with "Lifetime" typically meaning five (5) years of online access and permanent download to a supported device. All licenses are non-transferable.


A downloadable version of this book is available through the eCampus Reader or compatible Adobe readers.

Applications are available on iOS, Android, PC, Mac, and Windows Mobile platforms.

Please view the compatibility matrix prior to purchase.