Abstract
In this paper, we propose the first continuous optimization algorithms that achieve a constant-factor approximation guarantee for the problem of monotone continuous submodular maximization subject to a linear constraint. We first prove that a simple variant of the vanilla coordinate ascent, called COORDINATE-ASCENT+, achieves a ((e − 1)/(2e − 1) − ε)-approximation guarantee while performing O(n/ε) iterations, where the computational complexity of each iteration is roughly O(n/√ε + n log n) (here, n denotes the dimension of the optimization problem). We then propose COORDINATE-ASCENT++, which achieves the tight (1 − 1/e − ε)-approximation guarantee while performing the same number of iterations, but at a higher computational complexity of roughly O(n^3/ε^2.5 + n^3 log n/ε^2) per iteration. However, the computation of each round of COORDINATE-ASCENT++ can be easily parallelized so that the computational cost per machine scales as O(n/√ε + n log n).
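To make the "vanilla coordinate ascent" baseline referenced above concrete, here is a minimal Python sketch of greedy coordinate ascent for maximizing a monotone continuous submodular function over [0,1]^n under a linear budget constraint. All interfaces here are illustrative assumptions (the oracle `F`, cost vector `c`, budget `b`, and step size are placeholders); this is not the paper's COORDINATE-ASCENT+ or COORDINATE-ASCENT++ procedures, which add the extra machinery required for the stated approximation guarantees.

```python
import numpy as np

def coordinate_ascent(F, c, b, n, step=0.05, max_iters=10_000):
    """Greedy coordinate-ascent sketch: max F(x) s.t. c @ x <= b, x in [0,1]^n.

    F : monotone continuous submodular value oracle (placeholder assumption).
    c : positive per-coordinate linear costs; b : budget.
    Repeatedly increases the coordinate with the best marginal gain per
    unit cost until no feasible improving move remains.
    """
    x = np.zeros(n)
    spent = 0.0
    for _ in range(max_iters):
        base = F(x)
        best_i, best_gain = None, 0.0
        for i in range(n):
            # Skip saturated coordinates and moves that would break the budget.
            if x[i] >= 1.0 or spent + step * c[i] > b:
                continue
            e = np.zeros(n)
            e[i] = min(step, 1.0 - x[i])
            # Marginal gain per unit of budget spent on coordinate i.
            gain = (F(x + e) - base) / (e[i] * c[i])
            if gain > best_gain:
                best_i, best_gain = i, gain
        if best_i is None:
            break  # no feasible improving coordinate
        d = min(step, 1.0 - x[best_i])
        x[best_i] += d
        spent += d * c[best_i]
    return x

# Toy usage with a separable concave objective, which is monotone submodular.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n = 5
    w = rng.uniform(1.0, 3.0, n)
    F = lambda x: float(np.sum(w * np.sqrt(x)))
    c = rng.uniform(0.5, 2.0, n)
    x = coordinate_ascent(F, c, b=2.0, n=n)
    print(x, F(x))
```

Plain coordinate ascent like this can stall at poor solutions on general (non-DR) continuous submodular objectives; the paper's variants are designed precisely to escape that limitation while keeping each iteration cheap.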
Original language | English |
---|---|
Journal | Advances in Neural Information Processing Systems |
Volume | 2020-December |
State | Published - 2020 |
Externally published | Yes |
Event | 34th Conference on Neural Information Processing Systems, NeurIPS 2020 - Virtual, Online |
Duration | 6 Dec 2020 → 12 Dec 2020 |
Bibliographical note
Publisher Copyright: © 2020 Neural Information Processing Systems Foundation. All rights reserved.