Regularized Submodular Maximization at Scale

Ehsan Kazemi, Shervin Minaee, Moran Feldman, Amin Karbasi

Research output: Chapter in book / report / conference proceeding › Conference publication › Peer-reviewed


In this paper, we propose scalable methods for maximizing a regularized submodular function f, expressed as the difference between a monotone submodular function g and a modular function ℓ. Submodularity is related to the notions of diversity, coverage, and representativeness. In particular, finding the mode (most likely configuration) of many popular probabilistic models of diversity, such as determinantal point processes and strongly log-concave distributions, involves maximization of (regularized) submodular functions. Since a regularized function can potentially take on negative values, the classic theory of submodular maximization, which heavily relies on a non-negativity assumption, is not applicable. We avoid this issue by developing the first one-pass streaming algorithm for maximizing a regularized submodular function subject to a cardinality constraint. Furthermore, we give the first distributed algorithm that (roughly) reproduces the guarantees of state-of-the-art centralized algorithms for the problem using only O(1/ε) rounds of MapReduce. We highlight that our result, even for the unregularized case where the modular term ℓ is zero, improves over the memory and communication complexity of the state-of-the-art by a factor of O(1/ε). We also empirically study the performance of our scalable methods on real-life applications, including finding the mode of negatively correlated distributions, vertex cover of social networks, and several data summarization tasks.
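To make the problem concrete, the following is a minimal illustrative sketch, not the paper's streaming or distributed algorithm: the "distorted greedy" rule of Harshaw et al. (2019) for maximizing g(S) − ℓ(S) under a cardinality constraint, where g is a monotone submodular coverage function and ℓ is a non-negative modular cost. The element names, sets, and costs below are made-up toy data.

```python
def coverage_gain(covered, items, e):
    """Marginal coverage gain of adding element e to the covered set."""
    return len(items[e] - covered)

def distorted_greedy(items, cost, k):
    """Pick up to k elements to maximize coverage(S) - cost(S).

    At step i, each element's marginal coverage gain is scaled by the
    distortion factor (1 - 1/k)^(k - i - 1), which counteracts the bias
    of plain greedy toward high-cost, high-gain elements early on.
    """
    S, covered = [], set()
    for i in range(k):
        w = (1 - 1 / k) ** (k - i - 1)  # distortion factor, grows toward 1
        best_e, best_val = None, 0.0
        for e in items:
            if e in S:
                continue
            val = w * coverage_gain(covered, items, e) - cost[e]
            if val > best_val:
                best_e, best_val = e, val
        if best_e is None:  # no element has a positive distorted gain
            break
        S.append(best_e)
        covered |= items[best_e]
    return S

# Toy instance: "documents", each covering a set of topics, with modular costs.
items = {"a": {1, 2, 3}, "b": {3, 4}, "c": {4, 5, 6, 7}, "d": {1, 7}}
cost = {"a": 0.5, "b": 2.0, "c": 0.5, "d": 3.0}
print(distorted_greedy(items, cost, k=2))  # → ['c', 'a']
```

Note that because the objective g − ℓ can be negative, the rule only adds an element when its distorted marginal value is strictly positive, so the returned set may contain fewer than k elements.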

Original language: English
Title of host publication: Proceedings of the 38th International Conference on Machine Learning, ICML 2021
Publisher: ML Research Press
Number of pages: 11
ISBN (electronic): 9781713845065
Publication status: Published - 2021
Published externally: Yes
Event: 38th International Conference on Machine Learning, ICML 2021 - Virtual, Online
Duration: 18 July 2021 to 24 July 2021

Publication series

Name: Proceedings of Machine Learning Research
ISSN (electronic): 2640-3498


Conference: 38th International Conference on Machine Learning, ICML 2021
City: Virtual, Online

Bibliographical note

Publisher Copyright:
Copyright © 2021 by the author(s)
