Recent years have witnessed the great success of embedding-based methods in recommender systems. Despite their decent performance, we identify a potential limitation of these methods: the embedding magnitude is not explicitly modulated, which may aggravate popularity bias and training instability, hindering the model from making good recommendations. This motivates us to leverage embedding normalization in recommendation. By normalizing user/item embeddings to a specific value, we empirically observe impressive performance gains (9% on average) on four real-world datasets. Although encouraging, we also reveal a serious limitation of applying normalization in recommendation: the performance is highly sensitive to the choice of the temperature τ, which controls the scale of the normalized embeddings.
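
In this setup, scoring reduces to temperature-scaled cosine similarity: user and item embeddings are L2-normalized (fixing their magnitude) and the dot product is divided by τ. Below is a minimal PyTorch sketch of that scoring step; the function name `scaled_cosine_scores` and the value `tau=0.1` are illustrative assumptions, not the paper's Adap-τ method, which sets τ adaptively rather than as a fixed hyperparameter.

```python
import torch
import torch.nn.functional as F

def scaled_cosine_scores(user_emb: torch.Tensor,
                         item_emb: torch.Tensor,
                         tau: float = 0.1) -> torch.Tensor:
    """Score all user-item pairs with temperature-scaled cosine similarity.

    L2-normalizing both embeddings fixes their magnitude; dividing by the
    temperature `tau` then controls the effective scale of the normalized
    embeddings (smaller tau -> sharper score distribution).
    """
    u = F.normalize(user_emb, dim=-1)  # unit-norm user embeddings
    v = F.normalize(item_emb, dim=-1)  # unit-norm item embeddings
    return (u @ v.T) / tau             # cosine similarity scaled by 1/tau

# Example: 4 users, 10 items, 64-dim embeddings -> a (4, 10) score matrix
users = torch.randn(4, 64)
items = torch.randn(10, 64)
scores = scaled_cosine_scores(users, items, tau=0.1)
```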
Citation:
@inproceedings{chen2023adap,
  title={Adap-$\tau$: Adaptively Modulating Embedding Magnitude for Recommendation},
  author={Chen, Jiawei and Wu, Junkang and Wu, Jiancan and Cao, Xuezhi and Zhou, Sheng and He, Xiangnan},
  booktitle={Proceedings of the ACM Web Conference 2023},
  pages={1085--1096},
  year={2023}
}