No. 0144
Section: Special Session
Session: (SS-19) Optimization and Machine Learning
Time: 19th-A-10:00 -- 10:30
Title (Eng.): Stochastic extragradient with flip-flop shuffling & anchoring
Author(s): Jiseok Chae, Chulhee Yun, Donghwan Kim (KAIST)
Abstract: The extragradient method has been extensively studied as a technique for solving minimax problems, as it outperforms the widely used gradient descent-ascent method, especially on convex-concave problems. However, the stochastic extragradient (SEG) method has seen limited success in demonstrating a decisive advantage over stochastic gradient descent-ascent in terms of convergence guarantees. In this talk, we present our recent work in pursuit of convergence guarantees for "shuffling-based" SEG, motivated by the recent progress in shuffling-based (i.e., without-replacement sampling) stochastic methods. Our analysis reveals that, for convex-concave problems, (a) modifying the sampling scheme alone is insufficient to resolve the nonconvergence of SEG, but (b) with an additional simple trick called anchoring, we are able to develop the "SEG with flip-flop anchoring" method, which successfully converges. Moreover, we also provide upper and lower bounds in the strongly-convex-strongly-concave setting, demonstrating that our new method has a provably faster convergence rate than other shuffling-based methods.
MSC number(s): 90C47, 90C15
Keyword(s): Minimax optimization, extragradient method, stochastic optimization
Language of Session (Talk): Korean
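For context on the base method the abstract builds on, the following is a minimal sketch of the classical deterministic extragradient update on the bilinear toy problem f(x, y) = x·y, a standard convex-concave example on which plain gradient descent-ascent diverges. This illustrates only the extragradient step itself; the toy objective and step size are illustrative assumptions, and the sketch does not implement the authors' stochastic, shuffling-based, or anchoring schemes.

```python
# Sketch of the deterministic extragradient (EG) method for
# min_x max_y f(x, y) with the illustrative choice f(x, y) = x * y.
# Here grad_x f = y and grad_y f = x; the unique saddle point is (0, 0).
# Plain gradient descent-ascent spirals outward on this problem,
# whereas EG contracts toward the saddle point.

def extragradient(x, y, eta=0.1, steps=200):
    for _ in range(steps):
        # Extrapolation ("look-ahead") step at the current point.
        xm = x - eta * y   # descent on x using grad_x f(x, y) = y
        ym = y + eta * x   # ascent on y using grad_y f(x, y) = x
        # Update step using gradients evaluated at the extrapolated point.
        x = x - eta * ym
        y = y + eta * xm
    return x, y

x, y = extragradient(1.0, 1.0)
print(x, y)  # both coordinates have moved closer to the saddle point (0, 0)
```

On this bilinear problem the EG iteration is a linear map with spectral radius below 1 for small eta, which is why the iterates contract; the same look-ahead structure is what the stochastic variants in the talk randomize over component functions.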