The study of mixing times for Markov chains is an active area of research in modern probability, lying at the interface of mathematics, statistical physics and theoretical computer science. The mixing time of a Markov chain is defined to be the time it takes the chain to come close to its equilibrium distribution. A variety of techniques are used to estimate mixing times, coming from probability, representation theory and spectral theory. In this mini course I will focus on probabilistic techniques and, in particular, I will present some recent results (see references below) on connections between mixing times and hitting times of large sets.
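To make the phrase "close to equilibrium" concrete, one standard formalization measures the worst-case total variation distance between the chain's distribution at time t and its stationary distribution, and defines the mixing time as the first t at which this falls below a threshold (commonly 1/4). The sketch below, not part of the course material, illustrates this for the lazy random walk on a cycle; the chain, the function names, and the threshold are illustrative choices, not anything prescribed by the course.

```python
import numpy as np

def lazy_cycle(n):
    """Transition matrix of the lazy simple random walk on the n-cycle
    (hypothetical example chain: stay put w.p. 1/2, step left/right w.p. 1/4 each)."""
    P = np.zeros((n, n))
    for i in range(n):
        P[i, i] = 0.5
        P[i, (i + 1) % n] = 0.25
        P[i, (i - 1) % n] = 0.25
    return P

def mixing_time(P, eps=0.25):
    """Smallest t with max_x || P^t(x, .) - pi ||_TV <= eps.
    Here pi is uniform, since this chain is doubly stochastic."""
    n = P.shape[0]
    pi = np.full(n, 1.0 / n)
    Pt = np.eye(n)  # P^0
    t = 0
    while True:
        # total variation distance = half the L1 distance, maximized over start states
        d = 0.5 * np.abs(Pt - pi).sum(axis=1).max()
        if d <= eps:
            return t
        Pt = Pt @ P
        t += 1

P = lazy_cycle(8)
t = mixing_time(P)  # mixing time at threshold 1/4 for the 8-cycle
```

Since the lazy walk is aperiodic and irreducible, the total variation distance tends to zero, so the loop terminates; a smaller threshold eps can only increase the reported mixing time.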
Prerequisites: It would be helpful to be familiar with Chapters 4 (mixing definitions) and 12 (spectral methods) from the book Mixing Times for Markov Chains by D. Levin, Y. Peres and E. Wilmer.
Further reading: