With the advent of deep learning in the recommendation field, a lot of work has been and still is being done to bring deep learning-based models to their full potential. One line of work, Self-Supervised Learning (SSL), focuses on extracting the maximum potential of datasets to improve recommendation performance while simultaneously mitigating data-related problems, such as data sparsity, that commonly affect machine learning techniques. One branch of SSL for recommender systems uses predictive strategies to create new labels and examples for training by, for instance, adding new interactions to the user’s interaction history. However, the existing models that explore this idea are somewhat limited: they focus on adding new interactions at the start of sequences, ignoring the performance improvements that could be achieved by adding interactions in the middle and at the end of sequences. We propose Extrapolation-based Sequence Augmentation for Sequential Recommendation (ESA4SRec), a model that uses the sequence reconstruction capabilities of BERT4Rec to generate new data at any position of a sequence by extrapolating existing knowledge to unknown, novel interactions. The resulting augmented dataset is then used as input to a model-agnostic sequential recommender system. We compare our approach with related models, demonstrating performance improvements over training on the original datasets and the overall best performance for our method. ESA4SRec’s code is available at https://github.com/viniciusgm000/ESA4SRec.
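The core augmentation idea can be sketched in a few lines: insert a mask token at an arbitrary position of a user’s interaction sequence and let a masked-item predictor fill it in with a novel interaction. The sketch below is illustrative only; `toy_predictor` is a hypothetical stand-in for a trained BERT4Rec-style masked-item head, and the function names are assumptions, not the paper’s actual implementation.

```python
MASK = "[MASK]"

def augment_sequence(seq, predictor, position):
    """Insert a mask token at `position` (start, middle, or end) and
    fill it with the predictor's guess, yielding an augmented sequence
    one interaction longer than the original."""
    masked = seq[:position] + [MASK] + seq[position:]
    filled = predictor(masked, position)
    return masked[:position] + [filled] + masked[position + 1:]

# Hypothetical predictor: in ESA4SRec this role is played by BERT4Rec's
# masked-item prediction head; here a trivial heuristic keeps the sketch
# self-contained.
def toy_predictor(masked_seq, pos):
    context = [item for item in masked_seq if item != MASK]
    return max(context)  # placeholder, not a real recommendation

history = [3, 7, 7, 2]
# Unlike start-only augmentation, any position can be extrapolated:
print(augment_sequence(history, toy_predictor, 0))             # start
print(augment_sequence(history, toy_predictor, 2))             # middle
print(augment_sequence(history, toy_predictor, len(history)))  # end
```

The augmented sequences produced this way would then be fed, unchanged, to any downstream sequential recommender, which is what makes the second stage model-agnostic.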