Abstract
In this paper, we propose the first continuous optimization algorithms that achieve a constant-factor approximation guarantee for the problem of monotone continuous submodular maximization subject to a linear constraint. We first prove that a simple variant of the vanilla coordinate ascent, called COORDINATE-ASCENT+, achieves a ((e-1)/(2e-1) - ε)-approximation guarantee while performing O(n/ε) iterations, where the computational complexity of each iteration is roughly O(n/√ε + n log n) (here, n denotes the dimension of the optimization problem). We then propose COORDINATE-ASCENT++, which achieves the tight (1 - 1/e - ε)-approximation guarantee while performing the same number of iterations, but at a higher computational complexity of roughly O(n^3/ε^2.5 + n^3 log n/ε^2) per iteration. However, the computation of each round of COORDINATE-ASCENT++ can be easily parallelized so that the computational cost per machine scales as O(n/√ε + n log n).
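To make the setting concrete, below is a minimal Python sketch of plain greedy coordinate ascent for maximizing a monotone continuous submodular function f over [0, 1]^n subject to a linear budget constraint c·x ≤ B. This is only an illustration of the vanilla scheme the abstract refers to, not the paper's COORDINATE-ASCENT+ or COORDINATE-ASCENT++; the names `f`, `c`, `budget` and the fixed step size `delta` are assumptions made for the example.

```python
import numpy as np

def coordinate_ascent(f, c, budget, n, delta=0.05, tol=1e-12):
    """Greedy coordinate ascent for max f(x) s.t. c @ x <= budget, x in [0, 1]^n.

    f : callable mapping a length-n numpy array to a float, assumed to be a
        monotone continuous submodular function.
    c : length-n array of strictly positive coefficients of the linear constraint.
    Illustrative sketch only; no approximation guarantee is claimed for it.
    """
    x = np.zeros(n)
    spent = 0.0
    while True:
        base = f(x)
        best_i, best_ratio = None, tol
        for i in range(n):
            step = min(delta, 1.0 - x[i])            # stay inside [0, 1]^n
            if step <= 0 or spent + c[i] * step > budget:
                continue                              # saturated or over budget
            y = x.copy()
            y[i] += step
            ratio = (f(y) - base) / (c[i] * step)     # marginal gain per unit cost
            if ratio > best_ratio:
                best_i, best_ratio = i, ratio
        if best_i is None:                            # no feasible improving move left
            break
        step = min(delta, 1.0 - x[best_i])
        x[best_i] += step
        spent += c[best_i] * step
    return x
```

For instance, with a concave-over-modular objective such as f(x) = Σ_j log(1 + (A x)_j) for a nonnegative matrix A (a standard monotone continuous submodular example), each outer iteration costs n evaluations of f, which mirrors the per-iteration dependence on n discussed in the abstract.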
| Original language | English |
|---|---|
| Journal | Advances in Neural Information Processing Systems |
| Volume | 2020-December |
| Publication status | Published - 2020 |
| Externally published | Yes |
| Event | 34th Conference on Neural Information Processing Systems, NeurIPS 2020 - Virtual, Online. Duration: 6 Dec 2020 → 12 Dec 2020 |
Bibliographical note
Publisher Copyright: © 2020 Neural Information Processing Systems Foundation. All rights reserved.
Fingerprint
Below are the research topics of the publication 'Continuous submodular maximization: Beyond DR-submodularity'. Together they form a unique fingerprint.