2018 (English). Licentiate thesis, comprehensive summary (Other academic)
Abstract [en]
Recommender systems have become an integral part of virtually every e-commerce application on the web. These systems enable users to quickly discover relevant products, while at the same time increasing business value. Over the past decades, recommender systems have been modeled using numerous machine learning techniques. However, the extent to which commercial applications adopt these models remains unclear. We assess the receptiveness of the industrial sector to algorithmic contributions of the research community by surveying more than 30 e-commerce platforms and by experimenting with various recommenders on proprietary e-commerce datasets. Another overlooked but important factor that complicates the design and use of recommender systems is their ethical implications. We identify and summarize these issues in our ethical recommendation framework, which also enables users to control sensitive moral aspects of recommendations via the proposed “ethical toolbox”. The feasibility of this tool is supported by the results of our user study. Because of the moral implications associated with user profiling, we investigate algorithms capable of generating user-agnostic recommendations. We propose an ensemble learning scheme based on the Thompson Sampling bandit policy, which models arms as base recommendation functions. We show how to adapt this algorithm to realistic situations in which neither arm availability nor reward stationarity is guaranteed.
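The ensemble scheme described above can be sketched as a Beta-Bernoulli Thompson Sampling bandit whose arms are base recommenders. The following is a minimal illustrative sketch, not the thesis's actual implementation: the class name, the discount factor used to handle non-stationary rewards, and the per-round availability set are all assumptions introduced here for illustration.

```python
import random

class DiscountedThompsonSampling:
    """Hypothetical sketch: Thompson Sampling over base recommenders (arms).

    Each arm keeps a Beta posterior over its click probability. A discount
    factor decays old evidence toward the uniform prior (a common way to
    handle reward non-stationarity), and select() accepts an optional set
    of currently available arms.
    """

    def __init__(self, arms, discount=0.995):
        self.arms = list(arms)
        self.discount = discount  # 1.0 recovers standard (stationary) TS
        self.alpha = {a: 1.0 for a in self.arms}  # Beta(1, 1) prior
        self.beta = {a: 1.0 for a in self.arms}

    def select(self, available=None):
        """Sample a click probability from each available arm's posterior
        and play the arm with the largest sample."""
        candidates = (self.arms if available is None
                      else [a for a in self.arms if a in available])
        samples = {a: random.betavariate(self.alpha[a], self.beta[a])
                   for a in candidates}
        return max(samples, key=samples.get)

    def update(self, arm, reward):
        # Decay every posterior toward the prior so stale evidence fades;
        # this keeps the ensemble adaptive when reward distributions drift.
        for a in self.arms:
            self.alpha[a] = 1.0 + self.discount * (self.alpha[a] - 1.0)
            self.beta[a] = 1.0 + self.discount * (self.beta[a] - 1.0)
        # Bernoulli reward: 1 = click/conversion, 0 = no interaction.
        if reward:
            self.alpha[arm] += 1.0
        else:
            self.beta[arm] += 1.0
```

A usage round might look like `arm = ts.select(available={"item_knn"})` when only one base recommender can serve the request, followed by `ts.update(arm, reward)` once feedback arrives. The arm names here ("popularity", "item_knn") are placeholders for whatever base recommendation functions the ensemble wraps.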
Place, publisher, year, edition, pages
Malmö University, Faculty of Technology and Society, 2018. p. 168
Series
Studies in Computer Science ; 4
Keywords
recommender systems, e-commerce, recommendation ethics, collaborative filtering, Thompson sampling, multi-armed bandits, reinforcement learning
National Category
Engineering and Technology
Identifiers
urn:nbn:se:mau:diva-7792 (URN)
10.24834/2043/24268 (DOI)
24268 (Local ID)
978-91-7104-900-1 (ISBN)
978-91-7104-901-8 (ISBN)
24268 (Archive number)
24268 (OAI)
Presentation
2018-03-16, NIB:0E07, 13:00 (English)
Opponent
Available from: 2020-02-28 Created: 2020-02-28 Last updated: 2024-02-23 Bibliographically approved