Lectures / Seminars
Schedule
MINDS Seminar Series | Scalable Bayesian inference for high-dimensional regression
Date: 2022-10-25
Time: 10:00 ~ 11:00
Venue: Online streaming (Zoom)
Overview
| Date | 2022-10-25 | Time | 10:00 ~ 11:00 |
| --- | --- | --- | --- |
| Place | Online streaming (Zoom) | Host | |
| Speaker | Gyuhyeong Goh | Affiliation | Kansas State University |
| Topic | Scalable Bayesian inference for high-dimensional regression | | |

Abstract: The growing influence of high-dimensional regression modeling has led to many remarkable advances in Bayesian variable selection and shrinkage estimation. Owing to their computational convenience and theoretical relevance, Gaussian scale mixture priors have become standard practice in high-dimensional Bayesian regression settings. The conditional conjugacy of Gaussian scale mixtures enables posterior inference via Gibbs sampling. However, when the number of regression coefficients is very large, the computational cost of Gibbs sampling becomes prohibitively expensive, as the posterior sampling step requires iterative computation of a large inverse matrix. To address this scalability issue, we propose a scalable Bayesian inference procedure based on a new representation of Gaussian scale mixture distributions. The greatest merit of the proposed method is that fast posterior sampling is possible via a partially collapsed Gibbs sampling scheme, which does not require iterative inverse-matrix computation. As an illustration, we present results from simulation studies and real data analysis.

Zoom ID: 688 896 1076 / PW: 54321
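The bottleneck the abstract describes can be sketched in a few lines. This is a minimal illustration of the *standard* conditional draw of the coefficient vector under a Gaussian scale mixture prior, not the speaker's proposed method; the problem sizes, prior form, and variable names are assumptions made for demonstration. With prior beta_j | lam_j ~ N(0, sigma2 * lam_j), each Gibbs iteration must factorize the p x p matrix A = X'X + diag(1/lam), an O(p^3) cost that the talk's partially collapsed scheme is designed to avoid.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy high-dimensional setting (illustrative sizes, p > n)
n, p = 50, 200
X = rng.standard_normal((n, p))
beta_true = np.zeros(p)
beta_true[:5] = 2.0
y = X @ beta_true + rng.standard_normal(n)

sigma2 = 1.0        # error variance (held fixed here for simplicity)
lam = np.ones(p)    # local scale parameters of the Gaussian scale mixture

def sample_beta(X, y, sigma2, lam, rng):
    """One conditional draw of beta in a standard Gibbs sampler.

    Posterior: beta | y ~ N(A^{-1} X'y, sigma2 * A^{-1}),
    with A = X'X + diag(1/lam). The Cholesky factorization of A
    is the O(p^3) step repeated at every iteration.
    """
    A = X.T @ X + np.diag(1.0 / lam)
    L = np.linalg.cholesky(A)             # p x p factorization, O(p^3)
    mean = np.linalg.solve(A, X.T @ y)
    z = rng.standard_normal(p)
    # (L')^{-1} z has covariance A^{-1}, so this draw has the right law
    return mean + np.sqrt(sigma2) * np.linalg.solve(L.T, z)

beta_draw = sample_beta(X, y, sigma2, lam, rng)
print(beta_draw.shape)  # (200,)
```

In a full sampler this draw would alternate with updates of sigma2 and the local scales lam, so the p x p factorization recurs every sweep; that repetition is exactly what makes naive Gibbs sampling impractical when p is very large.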
Posted by Admin · 2022-10-22 21:40 · Views: 87