FaiRecSys: mitigating algorithmic bias in recommender systems

Bora Edizel, Francesco Bonchi, Sara Hajian, André Panisson, Tamir Tassa

Research output: Contribution to journal › Article › Peer-review

Abstract

Recommendation and personalization are useful technologies which increasingly influence our daily decisions. However, as we show empirically in this paper, the bias that exists in the real world and is reflected in the training data can be modeled and amplified by recommender systems and ultimately returned to the users as biased recommendations. This feedback process creates a self-perpetuating loop which progressively strengthens the filter bubbles we live in. Biased recommendations can also reinforce stereotypes such as those based on gender or ethnicity, possibly resulting in disparate impact. In this paper we address the problem of algorithmic bias in recommender systems. In particular, we highlight the connection between the predictability of sensitive features and bias in the recommendation results, and we offer a theoretically founded bound on recommendation bias based on that connection. We then formalize a fairness constraint and the price that one has to pay, in terms of alterations in the recommendation matrix, in order to achieve fair recommendations. Finally, we propose FaiRecSys—an algorithm that mitigates algorithmic bias by post-processing the recommendation matrix with minimum impact on the utility of recommendations provided to the end-users.
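The abstract ties recommendation bias to how predictable a sensitive feature is from the recommendations themselves. The sketch below illustrates that idea only: it trains a simple classifier to predict a binary sensitive attribute from users' recommendation vectors and reports balanced accuracy as a leakage signal. The synthetic data, variable names, and the choice of logistic regression are illustrative assumptions, not the paper's FaiRecSys algorithm or its bound.

    # Illustrative sketch (not the paper's method): quantify how predictable a
    # sensitive attribute s is from a binary top-N recommendation matrix R.
    # Balanced accuracy near 0.5 suggests little leakage; values near 1.0
    # indicate the recommendations strongly encode the sensitive feature.
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)
    n_users, n_items = 1000, 50

    # Synthetic sensitive attribute (e.g., gender) and a recommendation matrix
    # deliberately correlated with it, to mimic bias reflected in the data.
    s = rng.integers(0, 2, size=n_users)
    base = rng.random((n_users, n_items))
    item_skew = rng.random(n_items)
    R = (base + 0.3 * np.outer(s, item_skew) > 0.6).astype(int)

    # Adversarial predictability of s from R, estimated by cross-validation.
    clf = LogisticRegression(max_iter=1000)
    scores = cross_val_score(clf, R, s, cv=5, scoring="balanced_accuracy")
    print(f"Balanced accuracy predicting s from recommendations: {scores.mean():.2f}")

Under this toy setup, a post-processing step in the spirit of FaiRecSys would aim to alter R as little as possible while driving this balanced accuracy back toward 0.5.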

Original language: English
Pages (from-to): 197-213
Number of pages: 17
Journal: International Journal of Data Science and Analytics
Volume: 9
Issue number: 2
DOIs
Publication status: Published - 1 Mar 2020

Bibliographical note

Publisher Copyright:
© 2019, Springer Nature Switzerland AG.
