Jointly Improving the Sample and Communication Complexities in Decentralized Stochastic Minimax Optimization

Xuan Zhang, Gabriel Mancino-Ball, Necdet Serhat Aybat, Yangyang Xu

Research output: Contribution to journal › Conference article › peer-review

Abstract

We propose DGDA-VR, a novel single-loop decentralized algorithm for solving stochastic nonconvex-strongly-concave minimax problems over a connected network of agents, each equipped with a stochastic first-order oracle to estimate its local gradients. By incorporating variance reduction, DGDA-VR achieves O(ϵ⁻³) oracle complexity and O(ϵ⁻²) communication complexity without resorting to multiple communication rounds per iteration; both complexities are optimal, i.e., they match the lower bounds for this class of problems. Because DGDA-VR requires only a single communication round per iteration, it is applicable to a broader range of decentralized computational environments. To the best of our knowledge, this is the first distributed method to use a single communication round per iteration while jointly optimizing the oracle and communication complexities for the problem considered here.
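The abstract names two ingredients: a single gossip (communication) round per iteration, and a variance-reduced stochastic gradient estimator. The sketch below is NOT the paper's DGDA-VR as published; it is only a loose illustration of how those two ingredients combine in a decentralized gradient descent-ascent loop, run on an invented quadratic minimax problem. The problem data, step size, refresh period, and noise levels are all assumptions made for illustration, and the SPIDER-style correction here uses independent noise samples to keep the sketch simple.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4                                  # agents on a ring network
a = rng.uniform(1.0, 2.0, n)           # local curvature in x (invented data)
b = rng.uniform(-0.5, 0.5, n)          # local x-y coupling (invented data)
# local objective: f_i(x, y) = 0.5*a_i*x^2 + b_i*x*y - 0.5*y^2
# (nonconvex-strongly-concave structure is trivialized to a quadratic toy)

# doubly stochastic mixing matrix: lazy Metropolis weights on a ring
W = np.zeros((n, n))
for i in range(n):
    W[i, i] = 0.5
    W[i, (i + 1) % n] += 0.25
    W[i, (i - 1) % n] += 0.25

def stoch_grad(i, xi, yi, noise):
    """Noisy local gradients of f_i; noise stands in for minibatch sampling."""
    gx = a[i] * xi + b[i] * yi + noise * rng.standard_normal()
    gy = b[i] * xi - yi + noise * rng.standard_normal()
    return gx, gy

x = np.ones(n); y = np.ones(n)         # local primal/dual iterates
x_prev, y_prev = x.copy(), y.copy()
vx = np.zeros(n); vy = np.zeros(n)     # variance-reduced gradient estimates
eta, q = 0.1, 20                       # step size and refresh period (invented)
for t in range(200):
    for i in range(n):
        if t % q == 0:
            # periodic "large batch" refresh, simulated by a low-noise sample
            vx[i], vy[i] = stoch_grad(i, x[i], y[i], noise=0.001)
        else:
            # SPIDER-style recursive correction: v_t = v_{t-1} + g(z_t) - g(z_{t-1});
            # a faithful estimator would reuse the same sample at both points
            gx, gy = stoch_grad(i, x[i], y[i], noise=0.01)
            gpx, gpy = stoch_grad(i, x_prev[i], y_prev[i], noise=0.01)
            vx[i] += gx - gpx
            vy[i] += gy - gpy
    x_prev, y_prev = x.copy(), y.copy()
    x = W @ x - eta * vx               # single gossip round per iteration,
    y = W @ y + eta * vy               # fused with the descent/ascent step

print(float(np.abs(x).max()), float(np.abs(y).max()))
```

On this toy problem the unique saddle point is the origin, so the local iterates drift toward zero while the gossip step keeps them in rough consensus; the point of the sketch is only that communication and computation happen once each per iteration, rather than in nested loops.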

Original language: English (US)
Pages (from-to): 20865-20873
Number of pages: 9
Journal: Proceedings of the AAAI Conference on Artificial Intelligence
Volume: 38
Issue number: 18
State: Published - Mar 25 2024
Event: 38th AAAI Conference on Artificial Intelligence, AAAI 2024 - Vancouver, Canada
Duration: Feb 20 2024 - Feb 27 2024

All Science Journal Classification (ASJC) codes

  • Artificial Intelligence
