Room “Sala Seminari” - Abacus Building (U14)
Transformers for Sequential Recommendation with Large Catalogues
Speaker
Prof. Craig Macdonald
University of Glasgow
Abstract
In this talk, we draw parallels between language representation and sequential recommendation, and show how large-scale sequential recommendation can be effectively addressed using the Transformer architecture. We first discuss the objectives used to train a Transformer-based recommender system, showing in particular how introducing recency during training helps the Transformer learn more quickly. Next, we consider large item catalogues, addressing first the interplay between loss functions and negative sampling for large numbers of items, and finally how item representations can be broken down into “sub-items” to allow the Transformer model to represent large item catalogues efficiently. This presentation combines work published at RecSys'22, RecSys'23, and WSDM'24. An extended version of this talk was presented at ECIR 2024, ESSIR 2024, and the RecSys Summer School.
Short Bio
Craig Macdonald is a Professor of Information Retrieval at the University of Glasgow. He has co-authored over 230 publications in information retrieval and recommender systems, on topics ranging from expert search, efficient and effective search engines, learning-to-rank, and dense retrieval to supervised models for multi-modal and sequential recommendation. He has received best paper awards at ECIR (2014), SIGIR (2015), and RecSys (2023). Craig has been joint coordinator of the TREC Blog, Microblog, and Web tracks, run under the auspices of the US National Institute of Standards & Technology, and is the lead maintainer of the PyTerrier platform. He has recently served as ECIR 2024 General Co-Chair, ECIR 2025 PC Co-Chair, WWW 2025 Short Paper Co-Chair, and SIGIR 2024 Short Paper Co-Chair.
Contact person for this seminar: leonardo.mariani@unimib.it