## Abstract

In this paper, we propose the first continuous optimization algorithms that achieve a constant-factor approximation guarantee for the problem of monotone continuous submodular maximization subject to a linear constraint. We first prove that a simple variant of vanilla coordinate ascent, called COORDINATE-ASCENT+, achieves a ((e−1)/(2e−1) − ε)-approximation guarantee while performing O(n/ε) iterations, where the computational complexity of each iteration is roughly O(n/√ε + n log n) (here, n denotes the dimension of the optimization problem). We then propose COORDINATE-ASCENT++, which achieves the tight (1 − 1/e − ε)-approximation guarantee while performing the same number of iterations, but at a higher computational complexity of roughly O(n^{3}/ε^{2.5} + n^{3} log n/ε^{2}) per iteration. However, the computation of each round of COORDINATE-ASCENT++ can be easily parallelized so that the computational cost per machine scales as O(n/√ε + n log n).
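To give a concrete feel for the kind of procedure the abstract describes, here is a minimal sketch of *plain* greedy coordinate ascent for maximizing a monotone objective f over [0, 1]^n subject to a linear budget constraint c·x ≤ B. This is an illustrative toy, not the paper's COORDINATE-ASCENT+ algorithm; the objective, step size, and cost vector below are hypothetical choices for the example.

```python
import numpy as np

def coordinate_ascent(f, costs, budget, n, step=0.05):
    """Toy greedy coordinate ascent: maximize f(x) over x in [0,1]^n
    subject to costs @ x <= budget.  Each iteration raises by `step`
    the coordinate with the best marginal gain per unit of cost."""
    x = np.zeros(n)
    spent = 0.0
    while True:
        base = f(x)
        best_i, best_ratio = None, 0.0
        for i in range(n):
            # Skip coordinates that are saturated or would exceed the budget.
            if x[i] >= 1.0 or spent + costs[i] * step > budget:
                continue
            y = x.copy()
            y[i] = min(1.0, x[i] + step)
            gain = f(y) - base
            ratio = gain / (costs[i] * (y[i] - x[i]))
            if ratio > best_ratio:
                best_i, best_ratio = i, ratio
        if best_i is None:  # no feasible improving coordinate remains
            break
        x[best_i] = min(1.0, x[best_i] + step)
        spent = costs @ x
    return x

# Hypothetical monotone DR-submodular objective (concave along coordinates).
w = np.array([1.0, 3.0, 2.0, 1.5])
def f(x):
    return w @ np.log1p(x)

costs = np.array([1.0, 2.0, 1.5, 1.0])
x = coordinate_ascent(f, costs, budget=2.0, n=4)
```

The per-unit-cost selection rule mirrors the standard cost-benefit greedy heuristic for budgeted problems; the paper's algorithms refine this template to obtain their provable approximation guarantees.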

Original language | English
---|---
Journal | Advances in Neural Information Processing Systems
Volume | 2020-December
State | Published - 2020
Externally published | Yes
Event | 34th Conference on Neural Information Processing Systems, NeurIPS 2020 - Virtual, Online. Duration: 6 Dec 2020 → 12 Dec 2020

### Bibliographical note

Funding Information: The work of Moran Feldman was supported in part by ISF grants no. 1357/16 and 459/20. Amin Karbasi is partially supported by NSF (IIS-1845032), ONR (N00014-19-1-2406), AFOSR (FA9550-18-1-0160), and TATA Sons Private Limited.

Publisher Copyright:

© 2020 Neural information processing systems foundation. All rights reserved.