Here is a partial listing of courses at Brown taught by members of the group.
18. Modeling the World with Mathematics: An Introduction for Non-Mathematicians
Mathematics is the foundation of our technological society, and most of its powerful ideas are quite accessible. This course explains some of them using historical texts and Excel. Topics include the predictive power of 'differential equations' from the planets to epidemics, oscillations and music, chaotic systems, and randomness and the atomic bomb. Prerequisite: some knowledge of calculus.
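As a small taste of the first topic, a 'differential equation' model of an epidemic can be stepped forward in time in a few lines; the sketch below (in Python rather than the Excel used in the course, with made-up rates) applies Euler's method to a basic SIR model.

    # Illustrative sketch only: a basic SIR epidemic model stepped with Euler's method.
    # The rates and initial conditions are made-up demonstration values.
    def sir(beta=0.3, gamma=0.1, s0=0.99, i0=0.01, days=160, dt=1.0):
        s, i, r = s0, i0, 0.0
        history = []
        for _ in range(int(days / dt)):
            new_infections = beta * s * i * dt   # flow from S to I
            new_recoveries = gamma * i * dt      # flow from I to R
            s -= new_infections
            i += new_infections - new_recoveries
            r += new_recoveries
            history.append((s, i, r))
        return history

    peak_infected = max(i for _, i, _ in sir())
    print(f"peak infected fraction: {peak_infected:.3f}")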
41. Mathematical Methods in the Brain Sciences
Basic mathematical methods commonly used in the cognitive and neural sciences. Topics include: introduction to differential equations, emphasizing qualitative behavior; introduction to probability and statistics, emphasizing hypothesis testing and modern nonparametric methods; and some elementary information theory. Examples from biology, psychology, and linguistics. Prerequisite: a course in
integral and differential calculus.
65. Essential Statistics
A first course in statistics emphasizing statistical reasoning and basic concepts. Comprehensive treatment of most commonly used statistical methods through linear regression. Elementary probability and the role of randomness. Data analysis and statistical computing using Excel. Examples and applications from the popular press and the life, social and physical sciences. No mathematical prerequisites beyond high school algebra.
107. Quantitative Models of Biological Systems
An intermediate course between Biomed 110 and Biomed 212 (Applied
Mathematics 222). Quantitative modeling techniques useful in
molecular biology, physiology and ecology. Topics range from the
subcellular level through the cellular level and the organ systems
level to the level of the whole organism and to population dynamics
and ecosystem analyses.
120. Operational Analysis: Probabilistic Models
Methods of problem formulation and solution. Introduction to the
theory of Markov chains, the probabilistic analog of a difference or
differential equation. Birth-death processes and their
applications. Queueing theory: probabilistic models of service and
waiting lines.
Sequential decision theory via the methods of dynamic programming.
Prerequisite: AM 165, or Mathematics 161, or equivalent.
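For a concrete flavor of the material, the short Python sketch below (with arbitrary arrival and service rates) simulates a birth-death chain for a single-server queue and compares the empirical queue-length frequencies with the geometric stationary distribution predicted by theory.

    import random

    # Illustrative sketch only: a discrete-time birth-death chain approximating
    # an M/M/1 queue.  The rates lam and mu are arbitrary demonstration values.
    lam, mu, dt = 0.3, 0.5, 0.1            # arrival rate, service rate, time step
    p_birth, p_death = lam * dt, mu * dt

    random.seed(0)
    n, counts = 0, {}
    for _ in range(200_000):
        u = random.random()
        if u < p_birth:
            n += 1                         # an arrival joins the queue
        elif u < p_birth + p_death and n > 0:
            n -= 1                         # a service completion
        counts[n] = counts.get(n, 0) + 1

    total = sum(counts.values())
    rho = lam / mu                         # traffic intensity
    for k in range(5):                     # empirical vs. (1 - rho) * rho**k
        print(k, counts.get(k, 0) / total, (1 - rho) * rho**k)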
121. Operational Analysis: Deterministic Methods
An introduction to the basic mathematical ideas and computational
methods of optimization. Linear programming, the theory of optimal
decision making under linear constraints on resources. Applications
include decision theory in economics, transportation theory, optimal
assignments, production and operations scheduling. Network modeling
and flows. Integer programming. Prerequisites: an introduction to
matrix calculations, such as AM 34 or Mathematics 52.
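As a small illustration of the linear-programming topic, the Python sketch below solves a made-up two-product production problem with scipy.optimize.linprog; the profits and resource limits are invented for demonstration.

    from scipy.optimize import linprog

    # Illustrative sketch only: maximize profit 3x + 5y under resource limits.
    # linprog minimizes, so the objective is negated.
    c = [-3, -5]
    A_ub = [[1, 0],    # hours of plant 1 used per unit of product x
            [0, 2],    # hours of plant 2 used per unit of product y
            [3, 2]]    # hours of plant 3 used per unit of each product
    b_ub = [4, 12, 18] # available hours in each plant

    result = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
    print(result.x, -result.fun)   # optimal production plan and maximal profit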
165. Statistical Inference I
AM 165/166 constitute an integrated first course in mathematical
statistics. The first third of AM 165 is probability theory, and its
last two thirds are statistics. Specific topics include probability
spaces, discrete and continuous random variables, methods for
parameter estimation, large and small sample techniques for confidence
intervals and hypothesis testing. Prerequisite: Mathematics 10 or
equivalent.
166. Statistical Inference II
Simple and multiple regression. Analysis of variance and covariance.
The general linear model. Introduction to categorical and
nonparametric data analysis. Prerequisites: AM 165 and some linear
algebra.
167. Statistical Analysis of Time Series
Time series analysis is an important branch of mathematical statistics
with many applications to signal processing, econometrics, geology,
etc. The course emphasizes methods for analysis in the frequency
domain, in particular estimation of the spectrum of a time series,
but time-domain methods are also covered. Prerequisite: elementary
probability and statistics at the level of AM 165-166. Offered in
alternate years.
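For illustration, the periodogram, the basic frequency-domain estimate of the spectrum, can be computed directly from the discrete Fourier transform; the Python sketch below applies it to a synthetic sinusoid-plus-noise series.

    import numpy as np

    # Illustrative sketch only: periodogram of a synthetic series
    # (a sinusoid at 0.1 cycles per sample buried in white noise).
    rng = np.random.default_rng(0)
    n = 512
    t = np.arange(n)
    x = np.sin(2 * np.pi * 0.1 * t) + rng.normal(size=n)

    freqs = np.fft.rfftfreq(n, d=1.0)             # frequencies in cycles per sample
    periodogram = np.abs(np.fft.rfft(x))**2 / n   # raw spectrum estimate

    print("spectral peak near frequency:", freqs[np.argmax(periodogram[1:]) + 1])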
168. Nonparametric Statistics
A systematic treatment of the distribution-free alternatives to
classical statistical tests. These nonparametric tests make minimal
assumptions about the distributions governing the observations, yet
have nearly the same power as their classical counterparts.
Prerequisite: AM 165 or equivalent. Offered in
alternate years.
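A flavor of the distribution-free approach: the Python sketch below runs a two-sample permutation test on simulated data, comparing the observed difference in means with its permutation distribution and assuming nothing about the underlying distributions.

    import numpy as np

    # Illustrative sketch only: a two-sample permutation test on simulated data.
    rng = np.random.default_rng(1)
    x = rng.normal(0.0, 1.0, size=20)
    y = rng.normal(0.5, 1.0, size=20)

    observed = y.mean() - x.mean()
    pooled = np.concatenate([x, y])

    n_perm, count = 10_000, 0
    for _ in range(n_perm):
        rng.shuffle(pooled)                # random relabeling of the pooled data
        diff = pooled[20:].mean() - pooled[:20].mean()
        if abs(diff) >= abs(observed):
            count += 1

    print("two-sided permutation p-value:", count / n_perm)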
169. Computational Probability and Statistics
Examination of probability theory and statistical inference from the
point of view of modern computing. Random number generation, Monte
Carlo methods, simulation, and other special topics. Prerequisites:
calculus, linear algebra, and AM 165, or equivalent. Some experience
with programming is desirable.
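For example, a probability with no convenient closed form can be estimated by simulation; the Python sketch below (an arbitrary illustration) uses Monte Carlo with a standard-error estimate for the answer.

    import numpy as np

    # Illustrative sketch only: Monte Carlo estimate of P(X + X**2 > 3)
    # for X standard normal, with an estimated standard error.
    rng = np.random.default_rng(42)
    n = 1_000_000
    x = rng.standard_normal(n)
    hits = (x + x**2 > 3).astype(float)

    estimate = hits.mean()
    std_error = hits.std(ddof=1) / np.sqrt(n)
    print(f"estimate: {estimate:.4f} +/- {2 * std_error:.4f}")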
171. Information Theory
Information theory is the study of the fundamental limits of
information transmission and storage. This course, intended
primarily for beginning graduate students and advanced undergraduates,
offers a broad introduction to information theory and its applications:
Entropy and information; lossless data compression; communication
in the presence of noise; capacity and channel coding; source-channel
separation; Gaussian channels; lossy data compression.
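As a concrete instance of the first topic, the entropy of a discrete source is the minimum average number of bits per symbol needed to describe it; the short Python sketch below computes it for a made-up four-symbol source.

    import math

    # Illustrative sketch only: Shannon entropy (in bits) of a made-up
    # four-symbol source.  A uniform source on four symbols would need
    # exactly 2 bits per symbol; this skewed one needs 1.75 on average.
    probs = [0.5, 0.25, 0.125, 0.125]
    entropy = -sum(p * math.log2(p) for p in probs if p > 0)
    print(f"entropy: {entropy} bits per symbol")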
211. Real Analysis
This course provides the basis of real analysis which is fundamental
to many of the other courses in the program: metric spaces, measure
theory, and the theory of integration and differentiation.
212. Hilbert Spaces and Their Applications
A continuation of AM 211: The theory of Lp spaces, the geometric
theory of Hilbert spaces, the spectral theory of bounded and unbounded
operators on Hilbert spaces, and applications to integral and partial
differential equations.
263, 264. Theory of Probability (Mathematics 263, 264)
A two-semester course in probability theory. Semester I includes an
introduction to probability spaces and random variables, the theory of
countable state Markov chains and renewal processes, laws of large
numbers and central limit theorems. Measure theory is first used
near the end of the first semester (AM 211 may be taken concurrently).
Semester II provides a rigorous mathematical foundation to probability
theory and covers conditional probabilities and expectations, limit
theorems for sums of random variables, martingales, ergodic theory,
Brownian motion and an introduction to stochastic process theory.
266. Stochastic Processes
Topics in the theory of continuous-parameter stochastic processes.
The precise content varies from year to year, but generally includes
selections from the following topics: second order stationary
processes; ergodic processes and their applications; Markov processes,
including jump processes and diffusions; applications to noise and
communication theory.
267. Mathematical Statistics I
Topics in classical statistical inference: unbiased, maximum
likelihood, minimax, and equivariant estimators; the Cramér-Rao
inequality; confidence sets; hypothesis testing; likelihood ratio
tests; large sample theory; consistency and asymptotic normality;
Bayes estimators; efficiency and super-efficiency.
268. Mathematical Statistics II
Introduction to decision and game theories; admissibility; complete
class theorems; the Bayesian approach to statistics; subjective and
prior information; posterior distribution; Bayesian methods for point
estimation, hypothesis testing, and multiple decision problems;
Bayesian sequential analysis; sequential likelihood ratio tests;
applications to classification and learning problems. Prerequisite:
AM 267.
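A minimal instance of the Bayesian machinery covered here: with a Beta prior on a success probability and binomial data, the posterior is again Beta, and its mean is a standard point estimate; the numbers in the Python sketch below are made up.

    # Illustrative sketch only: Beta-Binomial conjugate updating with made-up data.
    # Prior Beta(a, b); after s successes in n trials the posterior is
    # Beta(a + s, b + n - s), and its mean is a common Bayesian point estimate.
    a, b = 2.0, 2.0          # prior pseudo-counts
    s, n = 7, 10             # observed successes and trials

    post_a, post_b = a + s, b + n - s
    posterior_mean = post_a / (post_a + post_b)
    print(f"posterior: Beta({post_a}, {post_b}), mean = {posterior_mean:.3f}")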
269, 270. Topics in Statistics and its Applications
Advanced topics varying from year to year, including: nonparametric
methods for density estimation, regression, and prediction in time
series; cross-validation and adaptive smoothing techniques; the
bootstrap; recursive partitioning, projection pursuit, and the ACE
algorithm; nonparametric classification and clustering; stochastic
Metropolis-type simulation and global optimization algorithms; Markov
random fields and statistical mechanics; applications to image
processing, speech recognition and neural networks.
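For illustration, the bootstrap mentioned above takes only a few lines; the Python sketch below resamples a small simulated data set with replacement to form a percentile confidence interval for its median.

    import numpy as np

    # Illustrative sketch only: a percentile bootstrap confidence interval
    # for the median of a small simulated sample.
    rng = np.random.default_rng(0)
    data = rng.exponential(scale=2.0, size=50)

    medians = np.array([
        np.median(rng.choice(data, size=data.size, replace=True))
        for _ in range(5_000)
    ])
    low, high = np.percentile(medians, [2.5, 97.5])
    print(f"median: {np.median(data):.2f}, 95% bootstrap CI: ({low:.2f}, {high:.2f})")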
272. Information Theory II
Information theory and its relationship with probability,
statistics, and data compression. Entropy.
The Shannon-McMillan-Breiman theorem. Shannon's source
coding theorems. Statistical inference; hypothesis testing;
model selection criteria; the minimum description length
principle. Information-theoretic proofs of limit theorems
in probability: the law of large numbers, the central limit theorem,
large deviations, Markov chain convergence, Poisson approximation,
and the Hewitt-Savage 0-1 law.