Learning in Games
Graduate Topics Course
Games have long been used as benchmarks in artificial intelligence, and research in game playing has closely tracked major developments in computing; famous examples include IBM's Deep Blue and Google DeepMind's AlphaGo. Driven by advances in machine learning, recent years have seen rapid progress in game-playing artificial intelligence. This reading course will review the major achievements in learning in games, survey the different classes of games, and examine the algorithms used to select good strategies. We will introduce the relevant game theory, discuss classical methods for complete-information games and combinatorial games, and cover modern learning methods such as the counterfactual regret minimization algorithms used in incomplete-information games. We will conclude by identifying the frontiers of artificial intelligence in games. Students are expected to enter with rudimentary coding experience.
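The flavor of the regret-based methods mentioned above can be previewed with a minimal sketch of regret matching, the per-state update at the heart of counterfactual regret minimization. The rock-paper-scissors payoffs and the fixed opponent strategy below are illustrative assumptions, not course material.

```python
import random

def regret_matching(regrets):
    # Play actions with positive cumulative regret proportionally;
    # fall back to a uniform strategy if no regret is positive.
    positive = [max(r, 0.0) for r in regrets]
    total = sum(positive)
    n = len(regrets)
    return [p / total for p in positive] if total > 0 else [1.0 / n] * n

# Rock-paper-scissors payoff for the row player (0=rock, 1=paper, 2=scissors).
PAYOFF = [[0, -1, 1],
          [1, 0, -1],
          [-1, 1, 0]]

random.seed(0)
regrets = [0.0, 0.0, 0.0]
strategy_sum = [0.0, 0.0, 0.0]
opponent = [0.4, 0.3, 0.3]  # a fixed, slightly exploitable opponent (assumed)

for _ in range(20000):
    strategy = regret_matching(regrets)
    for a in range(3):
        strategy_sum[a] += strategy[a]
    my_action = random.choices(range(3), weights=strategy)[0]
    opp_action = random.choices(range(3), weights=opponent)[0]
    # Regret: how much better each alternative action would have done
    # against the opponent's realized action.
    utilities = [PAYOFF[a][opp_action] for a in range(3)]
    for a in range(3):
        regrets[a] += utilities[a] - utilities[my_action]

total = sum(strategy_sum)
average_strategy = [s / total for s in strategy_sum]
# The average strategy concentrates on paper, the best response to this opponent.
```

Against this rock-heavy opponent, the learner's average strategy drifts toward paper; in full counterfactual regret minimization the same update is applied at every information set of an extensive-form game.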
Introduction to Computing
Computation is an essential topic across the physical and social sciences, in statistics, data science, and machine learning, and numerical linear algebra is its common language. Through a series of hands-on applications, students will implement and evaluate the core algorithms used to solve linear systems and least squares problems, perform regression, orthogonalize bases, decompose signals via the FFT and related transforms, and compute matrix factorizations. We will focus on the computational complexity and stability of each algorithm, as well as its practical uses. Example applications include iterative solvers for the large systems arising in engineering, spectral embedding methods for dimension reduction (PCA, MDS, and diffusion maps), and linear methods for classification and clustering. Examples will be presented as interactive coding notebooks available through a web browser. Prior coding experience is strongly encouraged, though students looking for an introduction to Jupyter notebooks and Python are welcome to enroll.
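As a taste of the least squares problems mentioned above, here is a hand-rolled line fit via the 2x2 normal equations; the data points are made up for illustration, and course notebooks would of course lean on library routines rather than plain Python.

```python
# Fit y ≈ a*x + b by least squares using the normal equations,
# solved here by hand for the 2x2 case (illustrative sketch only).
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [1.1, 2.9, 5.2, 7.1, 8.8]  # roughly y = 2x + 1 with noise (assumed data)

n = len(xs)
sx = sum(xs)
sy = sum(ys)
sxx = sum(x * x for x in xs)
sxy = sum(x * y for x, y in zip(xs, ys))

# Normal equations: [sxx sx; sx n] [a; b] = [sxy; sy], solved by Cramer's rule.
det = sxx * n - sx * sx
a = (sxy * n - sx * sy) / det
b = (sxx * sy - sx * sxy) / det
print(f"slope={a:.3f}, intercept={b:.3f}")
```

Forming the normal equations explicitly is the textbook starting point; the course's stability discussion explains why QR-based solvers are preferred in practice.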
Numerical Linear Algebra
This course is devoted to the basic theory of linear algebra and its significant applications in scientific computing. The objective is to introduce students to the tools needed to state, analyze, and solve multivariate problems. Students should leave the course ready to use linear algebra in future courses in algorithms, scientific computing, mathematical modeling, signal processing, multivariate statistics, and data analysis, as well as in the physical and social sciences. Topics include Gaussian elimination, vector spaces, linear transformations and associated fundamental subspaces, orthogonality and projections, eigenvectors and eigenvalues, diagonalization of real symmetric and complex Hermitian matrices, the spectral theorem, and matrix decompositions (QR and singular value decompositions). Systematic methods applicable in high dimensions and techniques commonly used in scientific computing are emphasized. Some programming exercises will appear as optional work.
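The orthogonalization and QR topics above can be sketched in a few lines of classical Gram-Schmidt; the small 3x2 matrix is an arbitrary example, and this naive version ignores the numerical-stability refinements a real implementation would need.

```python
# Classical Gram-Schmidt on the columns of a small matrix: a minimal
# sketch of the QR factorization A = QR, with Q orthonormal and R
# upper triangular (illustrative only; not numerically robust).
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def gram_schmidt(columns):
    q = []                                            # orthonormal columns
    r = [[0.0] * len(columns) for _ in columns]       # upper-triangular entries
    for j, a in enumerate(columns):
        v = list(a)
        for i, qi in enumerate(q):
            r[i][j] = dot(qi, a)                      # projection coefficient
            v = [vk - r[i][j] * qk for vk, qk in zip(v, qi)]
        r[j][j] = dot(v, v) ** 0.5                    # norm of the residual
        q.append([vk / r[j][j] for vk in v])
    return q, r

A_cols = [[1.0, 1.0, 0.0], [1.0, 0.0, 1.0]]  # columns of an example matrix A
Q, R = gram_schmidt(A_cols)
```

Checking that the columns of Q are orthonormal and that Q times R reproduces A is a good first exercise; the course's scientific-computing emphasis covers why modified Gram-Schmidt or Householder reflections are used instead in floating-point arithmetic.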
This course is concerned with the analysis of nonlinear dynamical systems arising in the context of mathematical modeling. The focus is on qualitative analysis of solutions as trajectories in phase space, including the role of invariant manifolds as organizers of behavior. Local and global bifurcations, which occur as system parameters change, will be highlighted, along with dimension reduction methods that apply when there is a natural time-scale separation. Concepts of bistability, spontaneous oscillations, and chaotic dynamics will be explored through investigation of conceptual mathematical models arising in the physical and biological sciences.
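The qualitative changes in behavior described above can be previewed with the logistic map, a standard conceptual model (chosen here as an illustrative example; the course's models may differ): as the parameter r varies, the long-term dynamics shift from a stable fixed point to oscillation to chaos.

```python
# Iterate the logistic map x_{n+1} = r*x*(1-x) and inspect the attractor
# for several parameter values, a minimal sketch of a bifurcation study.
def logistic_orbit(r, x0=0.2, transient=500, keep=8):
    x = x0
    for _ in range(transient):   # discard transient behavior
        x = r * x * (1.0 - x)
    orbit = []
    for _ in range(keep):        # record points on the attractor
        x = r * x * (1.0 - x)
        orbit.append(round(x, 6))
    return orbit

# r = 2.8: one stable fixed point; r = 3.2: a period-2 oscillation;
# r = 3.9: apparently chaotic, with no repeating pattern.
for r in (2.8, 3.2, 3.9):
    print(r, sorted(set(logistic_orbit(r))))
```

Counting the distinct values on each attractor makes the period-doubling route to chaos visible without any plotting.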
This is the third in a sequence of mathematics courses for physical sciences majors. It covers differential equations: first- and second-order ODEs, systems of ODEs, damped oscillators and resonance, Fourier series and Fourier transforms, Laplace transforms, and solutions of the heat and wave equations. This course focuses on solving differential equations via the superposition of solutions, and on the representation of general signals via a change of function basis (Fourier and Laplace transforms).
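The change-of-basis idea above can be made concrete with the Fourier series of a square wave, a standard worked example (not drawn from the course materials): the wave f(x) = 1 on (0, pi) and -1 on (-pi, 0) is a sum of odd sine harmonics with coefficients 4/(pi*n).

```python
import math

# Partial Fourier sums of a square wave, illustrating the representation
# of a signal in a basis of sines: f(x) ≈ sum over odd n of (4/(pi*n)) sin(n*x).
def square_wave_partial_sum(x, n_terms):
    total = 0.0
    for k in range(n_terms):
        n = 2 * k + 1  # only odd harmonics appear for this odd square wave
        total += (4.0 / (math.pi * n)) * math.sin(n * x)
    return total

x = math.pi / 2  # the square wave equals 1 at this point
for n_terms in (1, 5, 50):
    print(n_terms, square_wave_partial_sum(x, n_terms))
```

Watching the partial sums approach 1 as terms are added shows superposition at work; the overshoot near the jump (the Gibbs phenomenon) is a natural follow-up question.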