Korean Institute of Information Scientists and Engineers (KIISE) KCC2020
- Toward the World's Leading Software Event
July 2 (Thu) – July 4 (Sat), 2020, Online
Sponsors: 한국과총, Microsoft, 네이버, 디모아, NHN, 라인플러스, 삼성SDS, 인텔코리아, 카카오, KT, SK브로드밴드, SK하이닉스,
한글과컴퓨터, 지능정보산업협회, 아이티센그룹, 올포랜드, 이노그리드, 인프라닉스, 코난테크놀로지, 큐브리드
Speaker: Seungjin Choi, Vice President
Education: Ph.D., University of Notre Dame (1996)
Lecture Title: Variational Inference
Lecture Abstract
A probabilistic model is represented by a joint distribution over a set of hidden variables and a set of observed variables. Probabilistic inference involves calculating posterior distributions over the hidden variables given the observations. Two big hammers for probabilistic inference are: (1) sampling methods and (2) variational inference, the topic of this talk. I begin with the underlying mathematical preliminaries, such as Shannon's entropy, KL-divergence, and variational methods. Then we will see how the variational method is applied to Bayesian hierarchical models, leading to 'variational inference'. I will explain how variational inference is applied to a few exemplary probabilistic models, including the variational mixture of Gaussians, variational PCA, variational linear regression, and variational logistic regression. Finally, I will discuss deep probabilistic models in which variational inference plays a central role, from variational autoencoders to neural statisticians and neural processes. This will be a nice 3-hour tutorial for those of you who really wish to dive into the world of probabilistic models.
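As a concrete taste of the KL-divergence preliminaries the abstract mentions, the following is a minimal sketch (not taken from the tutorial materials) of the closed-form KL divergence between two univariate Gaussians, which is the quantity variational inference drives to zero between the approximate and true posteriors:

```python
import math

def kl_gaussian(mu1, sigma1, mu2, sigma2):
    """Closed-form KL( N(mu1, sigma1^2) || N(mu2, sigma2^2) ).

    KL = log(sigma2/sigma1) + (sigma1^2 + (mu1 - mu2)^2) / (2 sigma2^2) - 1/2
    """
    return (math.log(sigma2 / sigma1)
            + (sigma1**2 + (mu1 - mu2)**2) / (2 * sigma2**2)
            - 0.5)

# KL vanishes when the two distributions coincide...
print(kl_gaussian(0.0, 1.0, 0.0, 1.0))  # 0.0
# ...and grows as the approximating distribution drifts from the target
print(kl_gaussian(0.0, 1.0, 1.0, 1.0))  # 0.5
```

Note the asymmetry: KL(q || p) and KL(p || q) generally differ, and variational inference minimizes the former, which is what makes the evidence lower bound (ELBO) tractable.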
1. Hour-by-hour lecture plan
2. References
Prerequisites
Only basic knowledge of probability is required. Familiarity with machine learning is helpful, but anyone who wishes to learn probabilistic models is welcome to attend.