AM 41: Mathematical Methods in the Brain Sciences
Brown University
Fall 2005



Class goals:

1. Become more comfortable with mathematical notation and reasoning
2. Learn basic elements of mathematical programming on a computer
3. Learn introductory concepts from
    - differential equations
    - probability and statistics
    - information theory
4. See examples of these concepts in the Brain Sciences
5. Learn how to use MATLAB

Grades:

The grades in this class are based on 3 exams and 9 homeworks.  Each exam corresponds to one of the three main topics covered in the course: differential equations, probability and statistics, and information theory.  The exams are not cumulative.  There is no cumulative final exam, but exam #3 may be scheduled during the final exam period.

Each exam counts for about 30% of your grade, and all of the homeworks together count for about 10%.  I say "about" because if someone has very good homeworks (indicating that they put a lot of effort into the class), I may use that to boost their grade a little.  This can make a difference if you are on the borderline between two letter grades at the end of the semester.  The final grade cutoffs are above 90% for an A, above 80% for a B, and above 70% for a C and for a pass.  I may lower these cutoffs depending on the distribution of grades, but I will not raise them.  (So if you have a 90 or above at the end of the semester, you will get an A regardless of the curve, if any.)

Homeworks are not accepted for grades after the solutions have been posted.  However, if you turn in a late homework, it can still help boost a borderline grade as mentioned above.  Limited collaboration is allowed on homeworks, but everyone must turn in their own version with distinct solutions and distinct MATLAB code.  No copying is allowed.  Obviously, no collaboration is allowed on exams.  The use of any materials from previous years (homeworks, solutions, MATLAB code, exams, etc.) is strictly forbidden.

Class Outline: (last updated November 19, 2004)

0. Introduction

    0.1 Three Main Topics
          0.1.1 Differential Equations
          0.1.2 Probability and Statistics
          0.1.3 Information Theory

    0.2 MATLAB
          0.2.1 Plotting sinusoids
          0.2.2 Computing Fibonacci numbers
          0.2.3 A simple differential equation
          0.2.4 Numerical integration
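
To give a concrete flavor of the warm-up exercises in 0.2, here is a minimal MATLAB sketch of the first two items; the frequency, sampling step, and number of Fibonacci terms are arbitrary choices for illustration, not taken from the lectures:

    % Plot one second of a 2 Hz sinusoid sampled every millisecond
    t = 0:0.001:1;                 % time axis in seconds
    y = sin(2*pi*2*t);             % 2 Hz sine wave
    figure; plot(t, y);
    xlabel('time (s)'); ylabel('amplitude');

    % Compute the first 10 Fibonacci numbers with a simple loop
    F = zeros(1, 10);
    F(1) = 1;  F(2) = 1;
    for n = 3:10
        F(n) = F(n-1) + F(n-2);
    end
    disp(F)                        % 1 1 2 3 5 8 13 21 34 55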

1. Differential Equations

    1.1 One-dimensional Differential Equations
          1.1.1 Differential equations and their numerical integration
          1.1.2 Linear equations
          1.1.3 Some non-linear equations
          1.1.4 Qualitative analysis
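
As a preview of 1.1.1 (numerical integration of a one-dimensional equation), here is a minimal forward-Euler sketch; the particular equation dx/dt = -x + 1, step size, and initial condition are illustrative assumptions, not the course's examples:

    dt = 0.01;                     % time step (assumed small enough for Euler to be accurate)
    T  = 5;                        % total integration time
    t  = 0:dt:T;
    x  = zeros(size(t));
    x(1) = 0;                      % initial condition x(0) = 0
    for k = 1:length(t)-1
        dxdt   = -x(k) + 1;        % right-hand side of dx/dt = -x + 1
        x(k+1) = x(k) + dt*dxdt;   % forward-Euler update
    end
    plot(t, x, t, 1 - exp(-t), '--');   % numerical solution vs. exact solution 1 - exp(-t)
    xlabel('t'); ylabel('x(t)');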

    1.2 Two-dimensional Differential Equations
          1.2.1 Examples
          1.2.2 Qualitative analysis
                   1.2.2.1 The Phase Plane
          1.2.3 Pairs of linear equations
                   1.2.3.1 Stability analysis
                   1.2.3.2 Nullclines
          1.2.4 Pairs of nonlinear equations
                   1.2.4.1 Stability and local analysis
                   1.2.4.2 Example: two neurons
                   1.2.4.3 Linear approximations near equilibria
                   1.2.4.4 Limit cycles and the Poincaré-Bendixson Theorem
                   1.2.4.5 Example: a two-neuron oscillator
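
For the phase-plane material in 1.2, a minimal sketch along the same lines: forward-Euler integration of a damped linear oscillator x' = y, y' = -x - 0.2y, plotted as a trajectory in the (x, y) plane. The system and parameters are placeholders chosen for illustration, not the two-neuron examples from the lectures:

    dt = 0.01;  T = 40;  t = 0:dt:T;
    x = zeros(size(t));  y = zeros(size(t));
    x(1) = 1;  y(1) = 0;                  % initial condition
    for k = 1:length(t)-1
        dx = y(k);                        % x' = y
        dy = -x(k) - 0.2*y(k);            % y' = -x - 0.2*y
        x(k+1) = x(k) + dt*dx;
        y(k+1) = y(k) + dt*dy;
    end
    plot(x, y);                           % trajectory spirals into the stable equilibrium at (0,0)
    xlabel('x'); ylabel('y'); axis equal;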

2. Probability and Statistics

    2.1 Probabilities and Random Variables
          2.1.1 Probability spaces
                   2.1.1.1 Sample spaces
                   2.1.1.2 Events
                   2.1.1.3 Probabilities
                   2.1.1.4 Conditional probabilities
          2.1.2 Random variables
                   2.1.2.1 Definition
                   2.1.2.2 Probability distributions
                   2.1.2.3 Densities
                   2.1.2.4 Mean, variance and standard deviation
                   2.1.2.5 The normal density function
                   2.1.2.6 The binomial probability distribution
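
A minimal sketch of the quantities in 2.1.2: sample mean, variance, and standard deviation of pseudorandom draws compared to their theoretical values, plus the normal density formula. The sample size and parameters are arbitrary illustrations:

    N  = 10000;
    mu = 2;  sigma = 0.5;
    x  = mu + sigma*randn(1, N);          % N draws from a normal(mu, sigma^2) distribution
    fprintf('sample mean %.3f (true %.3f)\n', mean(x), mu);
    fprintf('sample var  %.3f (true %.3f)\n', var(x),  sigma^2);
    fprintf('sample std  %.3f (true %.3f)\n', std(x),  sigma);

    % Plot the normal density f(z) = exp(-(z-mu)^2 / (2*sigma^2)) / (sigma*sqrt(2*pi))
    z = linspace(mu - 4*sigma, mu + 4*sigma, 200);
    f = exp(-(z - mu).^2 / (2*sigma^2)) / (sigma*sqrt(2*pi));
    plot(z, f);  xlabel('z');  ylabel('density');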

    2.2 Limit Laws of Probability
          2.2.1 The Law of Large Numbers (LLN)
                   2.2.1.1 Independent random variables
                   2.2.1.2 Identically distributed random variables
                   2.2.1.3 Law of Large Numbers
          2.2.2 The Central Limit Theorem (CLT)
                   2.2.2.1 Examples (with MATLAB)
                   2.2.2.2 Standardized sums
                   2.2.2.3 Central Limit Theorem
          2.2.3 Example: limit cycles in populations of neurons
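
A minimal illustration of the limit laws in 2.2: standardized sums of independent uniform(0,1) random variables look approximately normal, as the Central Limit Theorem predicts. The number of summands and trials are arbitrary choices:

    n = 30;                                   % summands per trial
    M = 5000;                                 % number of trials
    U = rand(n, M);                           % i.i.d. uniform(0,1) random variables
    mu = 0.5;  sigma2 = 1/12;                 % mean and variance of a uniform(0,1) variable
    S = sum(U, 1);                            % one sum per trial
    Z = (S - n*mu) ./ sqrt(n*sigma2);         % standardized sums
    hist(Z, 30);                              % histogram is approximately standard normal
    xlabel('standardized sum');  ylabel('count');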

    2.3 Hypothesis Testing
          2.3.1 Basic elements of a hypothesis test
                   2.3.1.1 Main idea
                   2.3.1.2 Definitions
                   2.3.1.3 Critical region and significance level
          2.3.2 Examples
                   2.3.2.1 Voter preference
                   2.3.2.2 Deployment of defibrillators
                   2.3.2.3 Approximate tests using CLT
                   2.3.2.4 Extreme fishing
                   2.3.2.5 Poisson spiking
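
A minimal sketch of an approximate (CLT-based) hypothesis test for a proportion, in the spirit of the voter-preference example in 2.3.2; the counts below are made up for illustration, not the course's data:

    n = 400;   k = 225;                       % hypothetical poll: 225 of 400 voters prefer candidate A
    p0 = 0.5;                                 % null hypothesis: true preference is 50%
    phat = k/n;                               % observed proportion
    z = (phat - p0) / sqrt(p0*(1-p0)/n);      % approximately standard normal under H0, by the CLT
    p_two_sided = erfc(abs(z)/sqrt(2));       % two-sided p-value via the complementary error function
    fprintf('z = %.2f, two-sided p = %.4f\n', z, p_two_sided);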

3. Information Theory

    3.1 Measures of Information / Uncertainty
          3.1.1 "Twenty questions"
                   3.1.1.1 Bits
                   3.1.1.2 Fractional bits
          3.1.2 Entropy
                   3.1.2.1 Example: uniform distribution
                   3.1.2.2 Definition: discrete distribution
          3.1.3 Lossless coding
                   3.1.3.1 Codes
                   3.1.3.2 Prefix codes
                   3.1.3.3 Optimal prefix codes
          3.1.4 Review
                   3.1.4.1 Codes and trees
                   3.1.4.2 Entropy
                   3.1.4.3 Entropy and coding
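
A minimal sketch of the entropy definition in 3.1.2, H = -sum_i p_i log2 p_i, for a discrete distribution; the distribution below is an arbitrary example:

    p = [0.5 0.25 0.125 0.125];          % a discrete probability distribution (entries sum to 1)
    H = -sum(p .* log2(p));              % entropy in bits
    fprintf('H = %.3f bits\n', H);       % 1.750 bits for this distribution
    % For comparison, a uniform distribution over 4 outcomes has log2(4) = 2 bits of entropy.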

    3.2 Measures of Mutual Information
          3.2.1 Pairs of random variables
                   3.2.1.1 Joint probability distributions
                   3.2.1.2 Joint entropies
                   3.2.1.3 Conditional probability distributions
                   3.2.1.4 Conditional entropies
          3.2.2 Mutual information
          3.2.3 Review
          3.2.4 Example: neuroscience
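
A minimal sketch of mutual information for a pair of discrete random variables, computed directly from a joint probability table as in 3.2. The joint table is a made-up illustration (e.g., a stimulus/response pair), not the neuroscience example from the lectures:

    Pxy = [0.30 0.10;                    % joint distribution P(X = x, Y = y); entries sum to 1
           0.10 0.50];
    Px = sum(Pxy, 2);                    % marginal distribution of X (column vector)
    Py = sum(Pxy, 1);                    % marginal distribution of Y (row vector)
    I = 0;
    for i = 1:size(Pxy, 1)
        for j = 1:size(Pxy, 2)
            if Pxy(i, j) > 0
                I = I + Pxy(i, j) * log2(Pxy(i, j) / (Px(i)*Py(j)));
            end
        end
    end
    fprintf('I(X;Y) = %.3f bits\n', I);  % equivalently H(X) + H(Y) - H(X,Y)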