Tutorial


Seungjin Choi, Vice President
(BARO AI)

* Education
Ph.D., University of Notre Dame (1996)

* Career
Professor, Department of Computer Science and Engineering, POSTECH (2001-2019)
Advisory Professor, Samsung Advanced Institute of Technology (2018)
Advisory Professor, Samsung Research (2017-2018)
Advisory Professor, Shinhan Card Big Data Center (2016-2017)

Publications: Numerous papers at ICML, NeurIPS, AISTATS, AAAI, IJCAI, CVPR, etc.
Main research: Statistical machine learning, probabilistic models and inference
Interests: Meta-learning, Bayesian optimization, deep generative models

Jungtaek Kim, Ph.D. Student
(POSTECH)

* Education
Ph.D. Student, POSTECH (2015-Present)
B.S., POSTECH (2015)

* Career
SigOpt, Research Engineering Intern (2018)
Samsung Electronics Semiconductor R&D Center, Research Intern (2016, 2017)

Publications: Papers at ECML-PKDD, ICML, NeurIPS, AAAI, the ICML AutoML workshop, the NeurIPS BayesOpt workshop, etc.
Main research: Bayesian optimization, automated machine learning
Interests: Statistical machine learning, Bayesian optimization

Lecture Title

Bayesian Optimization

Lecture Summary

Bayesian optimization is a sample-efficient method for finding a global optimum of an expensive-to-evaluate black-box function. It has been widely used in applications such as hyperparameter optimization, neural architecture search, automated machine learning, experimental design, and active user modeling. This tutorial consists of a two-hour lecture introducing standard methods for Bayesian optimization, followed by a one-hour hands-on session with BayesO, a Python Bayesian optimization library developed by us. In the first two hours, Seungjin Choi will explain the two core ingredients of Bayesian optimization: (1) probabilistic surrogate models such as Gaussian process regression, and (2) optimization of acquisition functions such as expected improvement (EI), GP-UCB, and Thompson sampling. In the last hour, Jungtaek Kim will introduce the BayesO software, which runs on Python. Participants will have an opportunity for hands-on practice, running BayesO for hyperparameter optimization experiments on their own laptops.
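To illustrate how the two ingredients above fit together, here is a minimal Bayesian optimization loop in plain NumPy/SciPy. This is a sketch, not BayesO itself: the fixed RBF length scale, the jitter term, and the grid-based acquisition maximization are simplifying assumptions made for brevity.

```python
import numpy as np
from scipy.stats import norm

def rbf_kernel(a, b, length=0.3):
    # Squared-exponential kernel between two 1-D point sets.
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / length) ** 2)

def gp_posterior(x_obs, y_obs, x_test, noise=1e-5):
    # GP posterior mean and standard deviation at x_test.
    k = rbf_kernel(x_obs, x_obs) + noise * np.eye(len(x_obs))
    k_s = rbf_kernel(x_obs, x_test)
    k_inv = np.linalg.inv(k)
    mu = k_s.T @ k_inv @ y_obs
    var = 1.0 - np.sum(k_s * (k_inv @ k_s), axis=0)
    return mu, np.sqrt(np.maximum(var, 1e-12))

def expected_improvement(mu, sigma, y_best, xi=0.01):
    # EI for minimization: expected amount by which a candidate
    # improves on the incumbent y_best.
    z = (y_best - mu - xi) / sigma
    return (y_best - mu - xi) * norm.cdf(z) + sigma * norm.pdf(z)

def bayes_opt(f, bounds, n_init=3, n_iter=10, seed=0):
    # Evaluate a few random points, then repeatedly fit the GP
    # surrogate and evaluate f where EI (over a candidate grid)
    # is largest.
    rng = np.random.default_rng(seed)
    x = rng.uniform(bounds[0], bounds[1], size=n_init)
    y = f(x)
    cand = np.linspace(bounds[0], bounds[1], 200)
    for _ in range(n_iter):
        mu, sigma = gp_posterior(x, y, cand)
        x_next = cand[np.argmax(expected_improvement(mu, sigma, y.min()))]
        x = np.append(x, x_next)
        y = np.append(y, f(x_next))
    return x[np.argmin(y)], y.min()

# Minimize a toy black-box function on [0, 2]; its optimum is at x = 0.7.
x_best, y_best = bayes_opt(lambda x: (x - 0.7) ** 2, (0.0, 2.0))
```

In practice the acquisition function is optimized with a continuous local or global optimizer rather than a fixed grid, which is one of the topics covered in the second lecture.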

1. Lecture Schedule

Part 1. Black-box optimization & surrogate models (by Seungjin Choi)
- Why black-box optimization
- Overview of Bayesian optimization
- Gaussian process regression as a probabilistic surrogate model

Part 2. Acquisition functions & optimization (by Seungjin Choi)
- Expected improvement
- GP-UCB
- Thompson sampling
- Local vs. global optimization

Part 3. Hands-on practice with BayesO (by Jungtaek Kim)
- Open-source projects for Bayesian optimization
- How BayesO works
- Hyperparameter optimization with BayesO (requires a laptop for practice)
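To preview what the hands-on session targets, the sketch below casts hyperparameter selection as a black-box objective: the held-out error of ridge regression as a function of its regularization strength. The data, the search range, and the naive random-sampling loop are all illustrative assumptions; this is not BayesO's API, and a BO library would replace the sampling loop with surrogate-guided selection of each candidate.

```python
import numpy as np

def validation_error(lam, x_train, y_train, x_val, y_val):
    # Fit ridge regression with regularization strength lam and
    # return held-out mean squared error: the black-box objective.
    a = x_train.T @ x_train + lam * np.eye(x_train.shape[1])
    w = np.linalg.solve(a, x_train.T @ y_train)
    r = x_val @ w - y_val
    return float(r @ r / len(y_val))

# Synthetic regression data split into train and validation sets.
rng = np.random.default_rng(1)
X = rng.normal(size=(60, 8))
w_true = rng.normal(size=8)
y = X @ w_true + 0.5 * rng.normal(size=60)
x_tr, y_tr, x_va, y_va = X[:40], y[:40], X[40:], y[40:]

# Stand-in search loop: evaluate the black-box at sampled
# hyperparameters on a log scale and keep the best.
best_lam, best_err = None, np.inf
for lam in 10.0 ** rng.uniform(-4, 2, size=20):
    err = validation_error(lam, x_tr, y_tr, x_va, y_va)
    if err < best_err:
        best_lam, best_err = lam, err
```

Because each evaluation here requires a model fit, sample efficiency matters, which is exactly why Bayesian optimization is the tool of choice for this kind of tuning.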

 

2. References
- Bobak Shahriari, Kevin Swersky, Ziyu Wang, Ryan P. Adams, and Nando de Freitas (2016), "Taking the Human Out of the Loop: A Review of Bayesian Optimization," Proceedings of the IEEE.
- H. Kushner (1964), "A New Method of Locating the Maximum of an Arbitrary Multipeak Curve in the Presence of Noise," Journal of Basic Engineering.
- J. Močkus, V. Tiesis, and A. Žilinskas (1978), "The Application of Bayesian Methods for Seeking the Extremum," Towards Global Optimization.
- N. Srinivas, A. Krause, S. M. Kakade, and M. Seeger (2010), "Gaussian Process Optimization in the Bandit Setting: No Regret and Experimental Design," ICML.
- William R. Thompson (1933), "On the Likelihood that One Unknown Probability Exceeds Another in View of the Evidence of Two Samples," Biometrika, 25(3-4):285-294.
- Jungtaek Kim and Seungjin Choi (2020), "On Local Optimizers of Acquisition Functions in Bayesian Optimization," ECML-PKDD.
- Jungtaek Kim, Michael McCourt, Tackgeun You, Saehoon Kim, and Seungjin Choi (2019), "Bayesian Optimization over Sets," ICML Workshop on Automated Machine Learning.
- https://github.com/jungtaekkim/bayeso

Prerequisites

Open to anyone who wants to learn.



Copyright (c) KIISE. All rights reserved.