Submission No. | 0116 |
---|---|
Section | Special Session |
Session | (SS-19) Optimization and Machine Learning |
Time | 19th-A-11:00 -- 11:30 |
Title (Eng.) | Double-step alternating extragradient with timescale separation for finding local minimax points |
Author(s) | Kyuwon Kim (KAIST), Donghwan Kim (KAIST) |
Abstract | In minimization, gradient descent converges to a local minimum and almost surely avoids strict saddle points, under mild conditions. In contrast, minimax optimization lacks a comparable theory for finding local minimax (optimal) points. Recently, the two-timescale extragradient (EG) method has shown potential for finding local minimax points, improving over the two-timescale gradient descent ascent method. However, it is not yet stable enough to find \emph{any} degenerate local minimax points, which are prevalent in modern over-parameterized settings. We thus propose to incorporate a new double-step alternating update strategy that further improves the stability of the two-timescale EG method and remedies the aforementioned issue. This is a step toward establishing a theory in minimax optimization analogous to that in minimization. |
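The abstract's proposed double-step alternating update is not spelled out here, but the baseline it builds on, the two-timescale extragradient method, can be sketched on a toy quadratic minimax problem. The problem `f(x, y) = 0.5*a*x^2 + b*x*y - 0.5*c*y^2`, the step sizes, and the function names below are illustrative assumptions, not the authors' setup; the only ingredients taken from the abstract are the extragradient (extrapolate-then-update) step and the timescale separation (distinct step sizes for the min and max variables).

```python
# Illustrative sketch only: plain two-timescale extragradient (EG) on a toy
# problem f(x, y) = 0.5*a*x^2 + b*x*y - 0.5*c*y^2, minimized over x and
# maximized over y. The paper's double-step alternating variant is NOT
# implemented here; the abstract does not specify it.

def grad_x(x, y, a=0.1, b=1.0, c=0.1):
    # partial derivative of f with respect to x
    return a * x + b * y

def grad_y(x, y, a=0.1, b=1.0, c=0.1):
    # partial derivative of f with respect to y
    return b * x - c * y

def two_timescale_eg(x, y, eta_x=0.01, eta_y=0.1, steps=2000):
    # eta_x << eta_y gives the timescale separation: the min player (x)
    # moves slowly relative to the max player (y).
    for _ in range(steps):
        # extrapolation (half) step: look ahead along the current gradients
        x_half = x - eta_x * grad_x(x, y)
        y_half = y + eta_y * grad_y(x, y)
        # update step: apply gradients evaluated at the extrapolated point
        x = x - eta_x * grad_x(x_half, y_half)
        y = y + eta_y * grad_y(x_half, y_half)
    return x, y

x, y = two_timescale_eg(1.0, 1.0)  # converges toward the saddle point (0, 0)
```

On this problem the iterates spiral into the unique (nondegenerate) saddle point at the origin; the abstract's point is that plain two-timescale EG can fail to do so at degenerate local minimax points, which is what the double-step alternating strategy is meant to remedy.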
MSC number(s) | 90C47 |
Keyword(s) | Minimax optimization, nonconvex-nonconcave optimization, extragradient method, dynamical systems |
Language of Session (Talk) | Korean |