ChatGPT, DALL-E, and Stable Diffusion are all examples of generative AI models.
Generative models are a class of machine learning techniques that learn the distribution underlying a training dataset (of images, text, audio, etc.) and can then synthesize realistic new samples from it.
In this course, we will cover the mathematical foundations and practical implementation of generative models. Topics include probability distributions, maximum likelihood estimation, Bayesian inference, variational inference, Monte Carlo methods, and Markov chain Monte Carlo techniques. We will explore and implement generative models such as restricted Boltzmann machines (RBM), variational autoencoders (VAE), energy-based models (EBM), Transformers (GPT), and denoising diffusion probabilistic models (DDPM).
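To give a flavor of the topics above, here is a minimal illustrative sketch (not course material) of maximum likelihood estimation for the simplest generative model: a single Gaussian. For a Gaussian, the MLE of the mean is the sample mean and the MLE of the variance is the (biased) sample variance; once fitted, the model can generate new samples.

```python
import numpy as np

# Synthetic "training data" drawn from a Gaussian with known parameters.
rng = np.random.default_rng(0)
data = rng.normal(loc=2.0, scale=1.5, size=10_000)

# Maximum likelihood estimates for a Gaussian (closed form):
mu_hat = data.mean()                         # MLE of the mean
sigma2_hat = ((data - mu_hat) ** 2).mean()   # MLE of the variance (biased)

# Having fit the model, we can generate new samples from it:
new_samples = rng.normal(mu_hat, np.sqrt(sigma2_hat), size=5)

print(f"estimated mean = {mu_hat:.3f}, estimated variance = {sigma2_hat:.3f}")
```

The estimates converge to the true parameters (mean 2.0, variance 2.25) as the dataset grows; the deep generative models listed above apply the same principle, fitting the parameters of far richer distributions.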
The course will consist of nine lectures, four homework assignments, a midterm exam, and a final exam. Each week, a discussion topic will be posted on the discussion board; extra points can be earned through thoughtful contributions to the discussions.
Prerequisites: Linear algebra, calculus, probability theory (such as FINM 34000 - Probability and Stochastic Processes), and basic programming skills in Python.
This course counts towards the Financial Data Science concentration.