Selective Fairness in Recommendation via Prompts
Author: | Wu, Yiqing; Xie, Ruobing; Zhu, Yongchun; Zhuang, Fuzhen; Ao, Xiang; Zhang, Xu; Lin, Leyu; He, Qing |
---|---|
Year: | 2022 |
Document type: | Working Paper |
Description: | Recommendation fairness has attracted great attention recently. In real-world systems, users usually have multiple sensitive attributes (e.g., age, gender, and occupation), and they may not want their recommendation results to be influenced by those attributes. Moreover, which of these attributes should be considered in fairness-aware modeling, and when, should depend on users' specific demands. In this work, we define the selective fairness task, where users can flexibly choose the sensitive attributes with respect to which the recommendation model should be bias-free. We propose a novel parameter-efficient prompt-based fairness-aware recommendation (PFRec) framework, which relies on attribute-specific prompt-based bias eliminators with adversarial training, enabling selective fairness with different attribute combinations in sequential recommendation. Both task-specific and user-specific prompts are considered. We conduct extensive evaluations to verify PFRec's superiority in selective fairness. The source code is released at \url{https://github.com/wyqing20/PFRec}. Comment: Accepted by SIGIR 2022 |
Database: | arXiv |
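
The description above mentions attribute-specific prompts combined with adversarial training to make a sequential recommender bias-free with respect to a user-chosen set of sensitive attributes. Below is a minimal, illustrative sketch of that general idea, not the authors' actual PFRec implementation: all names (`PromptedFairSeqRec`, `prompt_len`, `n_combos`, the Transformer stand-in encoder, and the gradient-reversal adversary) are assumptions made for illustration.

```python
# Illustrative sketch (not the authors' code): a learnable prompt per
# sensitive-attribute combination is prepended to the user's item sequence,
# and gradient-reversed adversaries try to predict the selected attributes
# from the resulting user representation.
import torch
import torch.nn as nn


class GradReverse(torch.autograd.Function):
    """Identity in the forward pass; flips the gradient sign in backward."""
    @staticmethod
    def forward(ctx, x, lambd):
        ctx.lambd = lambd
        return x.view_as(x)

    @staticmethod
    def backward(ctx, grad_output):
        return -ctx.lambd * grad_output, None


class PromptedFairSeqRec(nn.Module):
    """Hypothetical selective-fairness model: one prompt per attribute
    combination, a shared sequential encoder, and one adversarial
    classifier per sensitive attribute."""

    def __init__(self, n_items, d_model=64, prompt_len=4,
                 attr_cardinalities=(2, 8, 21), n_combos=8):
        super().__init__()
        self.item_emb = nn.Embedding(n_items + 1, d_model, padding_idx=0)
        # Stand-in for a pre-trained sequential encoder (e.g. a Transformer).
        self.encoder = nn.TransformerEncoder(
            nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True),
            num_layers=2)
        # One learnable prompt per sensitive-attribute combination.
        self.prompts = nn.Parameter(
            torch.randn(n_combos, prompt_len, d_model) * 0.02)
        # One adversarial classifier per sensitive attribute.
        self.discriminators = nn.ModuleList(
            [nn.Linear(d_model, c) for c in attr_cardinalities])

    def forward(self, item_seq, combo_id, lambd=1.0):
        x = self.item_emb(item_seq)                      # (B, L, d)
        prompt = self.prompts[combo_id]                  # (B, P, d)
        h = self.encoder(torch.cat([prompt, x], dim=1))  # prepend prompt tokens
        user_repr = h[:, -1]                             # last state as user repr.
        # Adversaries see a gradient-reversed copy, so the prompt/encoder are
        # pushed to remove attribute information while the adversaries learn
        # to recover it.
        rev = GradReverse.apply(user_repr, lambd)
        attr_logits = [disc(rev) for disc in self.discriminators]
        rec_scores = user_repr @ self.item_emb.weight.T  # next-item scores
        return rec_scores, attr_logits
```

In such a setup the next-item recommendation loss and the adversarial attribute-classification losses would be optimized jointly; the gradient-reversal layer turns the joint objective into a min-max game, so the attributes selected by the user become hard to recover from the learned representation while recommendation quality is preserved as far as possible.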