Meaningful use of advanced Bayesian methods requires a good understanding of the fundamentals. This engaging book explains the ideas that underpin the construction and analysis of Bayesian models, with particular focus on computational methods and schemes. The unique features of the text are its extensive discussion of available software packages and a brief but complete, mathematically rigorous introduction to Bayesian inference. The text introduces Monte Carlo methods, Markov chain Monte Carlo methods, and Bayesian software, with additional material on model validation and comparison, transdimensional MCMC, and conditionally Gaussian models. The inclusion of problems makes the book suitable as a textbook for a first graduate-level course in Bayesian computation with a focus on Monte Carlo methods. The extensive discussion of Bayesian software (R/R-INLA, OpenBUGS, JAGS, STAN, and BayesX) also makes it useful for researchers and graduate students beyond statistics.
Applications of queueing network models have multiplied in the last generation, including scheduling of large manufacturing systems, control of patient flow in health systems, load balancing in cloud computing, and matching in ride sharing. These problems are too large and complex for exact solution, but their scale allows approximation. This book is the first comprehensive treatment of fluid scaling, diffusion scaling, and many-server scaling in a single text presented at a level suitable for graduate students. Fluid scaling is used to verify stability, in particular treating max weight policies, and to study optimal control of transient queueing networks. Diffusion scaling is used to control systems in balanced heavy traffic, by solving for optimal scheduling, admission control, and routing in Brownian networks. Many-server scaling is studied in the quality- and efficiency-driven Halfin–Whitt regime and applied to load balancing in the supermarket model and to bipartite matching in ride sharing.
This introduction to some of the principal models in the theory of disordered systems leads the reader through the basics, to the very edge of contemporary research, with the minimum of technical fuss. Topics covered include random walk, percolation, self-avoiding walk, interacting particle systems, uniform spanning tree, random graphs, as well as the Ising, Potts, and random-cluster models for ferromagnetism, and the Lorentz model for motion in a random medium. This new edition features accounts of major recent progress, including the exact value of the connective constant of the hexagonal lattice, and the critical point of the random-cluster model on the square lattice. The choice of topics is strongly motivated by modern applications, and focuses on areas that merit further research. Accessible to a wide audience of mathematicians and physicists, this book can be used as a graduate course text. Each chapter ends with a range of exercises.
This book is a readable, digestible introduction to exponential families, encompassing statistical models based on the most useful distributions in statistical theory, including the normal, gamma, binomial, Poisson, and negative binomial. Strongly motivated by applications, it presents the essential theory and then demonstrates the theory's practical potential by connecting it with developments in areas like item response analysis, social network models, conditional independence and latent variable structures, and point process models. Extensions to incomplete data models and generalized linear models are also included. In addition, the author gives a concise account of the philosophy of Per Martin-Löf in order to connect statistical modelling with ideas in statistical physics, including Boltzmann's law. Written for graduate students and researchers with a background in basic statistical inference, the book includes a vast set of examples demonstrating models for applications.
During the past half-century, exponential families have attained a position at the center of parametric statistical inference. Theoretical advances have been matched, and more than matched, in the world of applications, where logistic regression by itself has become the go-to methodology in medical statistics, computer-based prediction algorithms, and the social sciences. This book is based on a one-semester graduate course for first-year Ph.D. and advanced master's students. After presenting the basic structure of univariate and multivariate exponential families, their application to generalized linear models including logistic and Poisson regression is described in detail, emphasizing geometrical ideas, computational practice, and the analogy with ordinary linear regression. Connections are made with a variety of current statistical methodologies: missing data, survival analysis and proportional hazards, false discovery rates, bootstrapping, and empirical Bayes analysis.
This textbook offers a compact introductory course on Malliavin calculus, an active and powerful area of research. It covers recent applications, including density formulas, regularity of probability laws, central and non-central limit theorems for Gaussian functionals, convergence of densities and non-central limit theorems for the local time of Brownian motion. The book also includes a self-contained presentation of Brownian motion and stochastic calculus, as well as Lévy processes and stochastic calculus for jump processes. Accessible to non-experts, the book can be used by graduate students and researchers to develop their mastery of the core techniques necessary for further study.
Based on a starter course for beginning graduate students, Core Statistics provides concise coverage of the fundamentals of inference for parametric statistical models, including both theory and practical numerical computation. The book considers both frequentist maximum likelihood and Bayesian stochastic simulation while focusing on general methods applicable to a wide range of models and emphasizing the common questions addressed by the two approaches. This compact package serves as a lively introduction to the theory and tools that a beginning graduate student needs in order to make the transition to serious statistical analysis: inference; modeling; computation, including some numerics; and the R language. Aimed also at any quantitative scientist who uses statistical methods, this book will deepen readers' understanding of why and when methods work and explain how to develop suitable methods for non-standard situations, such as in ecology, big data and genomics.
In a surprising sequence of developments, the longest increasing subsequence problem, originally mentioned as merely a curious example in a 1961 paper, has proven to have deep connections to many seemingly unrelated branches of mathematics, such as random permutations, random matrices, Young tableaux, and the corner growth model. The detailed and playful study of these connections makes this book suitable as a starting point for a wider exploration of elegant mathematical ideas that are of interest to every mathematician and to many computer scientists, physicists and statisticians. The specific topics covered are the Vershik–Kerov–Logan–Shepp limit shape theorem, the Baik–Deift–Johansson theorem, the Tracy–Widom distribution, and the corner growth process. This exciting body of work, encompassing important advances in probability and combinatorics over the last forty years, is made accessible to a general graduate-level audience for the first time in a highly polished presentation.
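The longest increasing subsequence problem mentioned above is easy to state concretely: given a permutation, find the length of its longest increasing subsequence. A minimal Python sketch (the function name `lis_length` is illustrative, not from the book) computes this length in O(n log n) time via the patience-sorting idea:

```python
import bisect

def lis_length(perm):
    """Length of the longest increasing subsequence of `perm`.

    Invariant: tails[k] is the smallest possible last element of an
    increasing subsequence of length k + 1 seen so far, so `tails`
    stays sorted and binary search applies.
    """
    tails = []
    for x in perm:
        i = bisect.bisect_left(tails, x)
        if i == len(tails):
            tails.append(x)   # x extends the longest subsequence so far
        else:
            tails[i] = x      # x gives a smaller tail for length i + 1
    return len(tails)

# Example: one longest increasing subsequence of this sequence is 1, 4, 5, 9.
print(lis_length([3, 1, 4, 1, 5, 9, 2, 6]))  # → 4
```

For a uniformly random permutation of n elements, this length concentrates around 2√n, the Vershik–Kerov–Logan–Shepp result referred to in the blurb.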
This introduction to some of the principal models in the theory of disordered systems leads the reader through the basics, to the very edge of contemporary research, with the minimum of technical fuss. Topics covered include random walk, percolation, self-avoiding walk, interacting particle systems, uniform spanning tree, random graphs, as well as the Ising, Potts, and random-cluster models for ferromagnetism, and the Lorentz model for motion in a random medium. Schramm–Löwner evolutions (SLE) arise in various contexts. The choice of topics is strongly motivated by modern applications and focuses on areas that merit further research. Special features include a simple account of Smirnov's proof of Cardy's formula for critical percolation, and a fairly full account of the theory of influence and sharp thresholds. Accessible to a wide audience of mathematicians and physicists, this book can be used as a graduate course text. Each chapter ends with a range of exercises.
This is a graduate-level introduction to the theory of Boolean functions, an exciting area lying on the border of probability theory, discrete mathematics, analysis, and theoretical computer science. Certain functions are highly sensitive to noise; this can be seen via Fourier analysis on the hypercube. The key model analyzed in depth is critical percolation on the hexagonal lattice. For this model, the critical exponents, previously determined using the now-famous Schramm–Loewner evolution, appear here in the study of sensitivity behavior. Even for this relatively simple model, beyond the Fourier-analytic set-up, there are three crucially important but distinct approaches: hypercontractivity of operators, connections to randomized algorithms, and viewing the spectrum as a random Cantor set. This book assumes a basic background in probability theory and integration theory. Each chapter ends with exercises, some straightforward, some challenging.
Communication networks underpin our modern world, and provide fascinating and challenging examples of large-scale stochastic systems. Randomness arises in communication systems at many levels: for example, the initiation and termination times of calls in a telephone network, or the statistical structure of the arrival streams of packets at routers in the Internet. How can routing, flow control and connection acceptance algorithms be designed to work well in uncertain and random environments? This compact introduction illustrates how stochastic models can be used to shed light on important issues in the design and control of communication networks. It will appeal to readers with a mathematical background wishing to understand this important area of application, and to those with an engineering background who want to grasp the underlying mathematical theory. Each chapter ends with exercises and suggestions for further reading.
The Poisson process, a core object in modern probability, enjoys a richer theory than is sometimes appreciated. This volume develops the theory in the setting of a general abstract measure space, establishing basic results and properties as well as certain advanced topics in the stochastic analysis of the Poisson process. Also discussed are applications and related topics in stochastic geometry, including stationary point processes, the Boolean model, the Gilbert graph, stable allocations, and hyperplane processes. Comprehensive, rigorous, and self-contained, this text is ideal for graduate courses or for self-study, with a substantial number of exercises for each chapter. Mathematical prerequisites, mainly a sound knowledge of measure-theoretic probability, are kept in the background, but are reviewed comprehensively in the appendix. The authors are well-known researchers in probability theory, especially stochastic geometry, and their approach is informed by their research.