It is generally believed that submodular functions (and the more general class of γ-weakly submodular functions) may only be optimized under the non-negativity assumption f(S) ≥ 0. In this paper, we show that once the function is expressed as the difference f = g − c, where g is monotone, non-negative, and γ-weakly submodular and c is non-negative modular, then strong approximation guarantees may be obtained. We present an algorithm for maximizing g − c under a k-cardinality constraint which produces a random feasible set S such that E[g(S) − c(S)] ≥ (1 − e^(−γ) − ε)·g(OPT) − c(OPT), whose running time is O((n/ε) log²(1/ε)), independent of k. We extend these results to the unconstrained setting by describing an algorithm with the same approximation guarantees and faster O((n/ε) log(1/ε)) runtime. The main techniques underlying our algorithms are two-fold: the use of a surrogate objective which varies the relative importance between g and c throughout the algorithm, and a geometric sweep over possible γ values. Our algorithmic guarantees are complemented by a hardness result showing that no polynomial-time algorithm which accesses g through a value oracle can do better. We empirically demonstrate the success of our algorithms by applying them to experimental design on the Boston Housing dataset and directed vertex cover on the Email EU dataset.
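The surrogate-objective idea described in the abstract can be illustrated with a minimal sketch. The sketch below assumes the simplest setting γ = 1 (g monotone submodular) and uses a step-dependent weight (1 − 1/k)^(k − (i + 1)) on the marginal gain of g, so that the cost c is weighed more heavily in early iterations; the weighting schedule and the names `distorted_greedy`, `g`, `c` are illustrative, not the authors' exact algorithm.

```python
def distorted_greedy(ground, g, c, k):
    """Sketch of a surrogate-objective greedy for maximizing g(S) - c(S)
    subject to |S| <= k, where g is monotone submodular (gamma = 1 assumed)
    and c is non-negative modular. The weight on g's marginal gain grows
    across iterations, varying the relative importance between g and c."""
    S = set()
    for i in range(k):
        weight = (1.0 - 1.0 / k) ** (k - (i + 1))
        best, best_gain = None, 0.0
        # sorted() makes tie-breaking deterministic for this illustration
        for e in sorted(ground - S):
            gain = weight * (g(S | {e}) - g(S)) - c(e)
            if gain > best_gain:
                best, best_gain = e, gain
        if best is not None:  # only add elements with positive distorted gain
            S.add(best)
    return S
```

For example, with a coverage function g and per-element costs, the sketch picks cheap, high-coverage elements and skips an element whose cost outweighs its coverage.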
Title of host publication: 36th International Conference on Machine Learning, ICML 2019
Publisher: International Machine Learning Society (IMLS)
Number of pages: 22
State: Published - 2019
Event: 36th International Conference on Machine Learning, ICML 2019 - Long Beach, United States
Duration: 9 Jun 2019 → 15 Jun 2019
Name: 36th International Conference on Machine Learning, ICML 2019
Conference: 36th International Conference on Machine Learning, ICML 2019
Period: 9/06/19 → 15/06/19
Bibliographical note: Publisher Copyright: Copyright 2019 by the author(s).