Continuous submodular maximization: Beyond DR-submodularity

Moran Feldman, Amin Karbasi

Research output: Contribution to journal › Conference article › Peer-review

Abstract

In this paper, we propose the first continuous optimization algorithms that achieve a constant factor approximation guarantee for the problem of monotone continuous submodular maximization subject to a linear constraint. We first prove that a simple variant of the vanilla coordinate ascent, called COORDINATE-ASCENT+, achieves a ((e-1)/(2e-1) - ε)-approximation guarantee while performing O(n/ε) iterations, where the computational complexity of each iteration is roughly O(n/√ε + n log n) (here, n denotes the dimension of the optimization problem). We then propose COORDINATE-ASCENT++, which achieves the tight (1 - 1/e - ε)-approximation guarantee while performing the same number of iterations, but at a higher computational complexity of roughly O(n^3/ε^{2.5} + n^3 log n/ε^2) per iteration. However, the computation of each round of COORDINATE-ASCENT++ can be easily parallelized so that the computational cost per machine scales as O(n/√ε + n log n).
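For intuition only, the following minimal Python sketch runs vanilla coordinate ascent on a toy monotone continuous submodular objective under a linear budget constraint. It is not the paper's COORDINATE-ASCENT+ or COORDINATE-ASCENT++ (which come with the guarantees stated above); the objective f, weights w, costs c, budget b, and step size are all hypothetical placeholders chosen for illustration.

```python
import numpy as np

# Hypothetical toy objective: f(x) = sum_i w_i * (1 - exp(-x_i)).
# It is separable and concave in each coordinate, hence monotone
# DR-submodular; it stands in for a generic continuous submodular oracle.
rng = np.random.default_rng(0)
n = 10                        # dimension of the optimization problem
w = rng.uniform(1.0, 2.0, n)  # weights of the toy objective (assumed)
c = rng.uniform(1.0, 2.0, n)  # linear-constraint coefficients: <c, x> <= b
b = 5.0                       # constraint budget (assumed)

def f(x):
    return float(np.sum(w * (1.0 - np.exp(-x))))

def coordinate_ascent(f, c, b, step=0.05, iters=200):
    """Vanilla coordinate ascent: at each iteration, increase the coordinate
    with the best marginal gain per unit of budget, while budget remains."""
    x = np.zeros(n)
    budget = b
    for _ in range(iters):
        base = f(x)
        # Marginal gain per unit cost for a small step in each coordinate.
        gains = np.array([(f(x + step * e_i) - base) / (step * c[i])
                          for i, e_i in enumerate(np.eye(n))])
        i = int(np.argmax(gains))
        cost = step * c[i]
        if gains[i] <= 0 or cost > budget:
            break
        x[i] += step
        budget -= cost
    return x

x = coordinate_ascent(f, c, b)
print("f(x) =", f(x), " budget used:", float(c @ x))
```

The sketch uses a fixed step size and a greedy gain-per-cost rule purely to convey the coordinate-ascent template; the paper's variants differ in how coordinates and step sizes are chosen, which is what yields the stated approximation factors.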

Original language: English
Journal: Advances in Neural Information Processing Systems
Volume: 2020-December
State: Published - 2020
Externally published: Yes
Event: 34th Conference on Neural Information Processing Systems, NeurIPS 2020 - Virtual, Online
Duration: 6 Dec 2020 - 12 Dec 2020

Bibliographical note

Publisher Copyright:
© 2020 Neural information processing systems foundation. All rights reserved.
