"Unifying Generative and Dense Retrieval for Sequential Recommendation"

The podcast on this paper was generated with Google's Illuminate.

Smart fusion of retrieval methods makes recommendations work better with less memory

This paper introduces LIGER, a hybrid model that combines generative and dense retrieval for sequential recommendation. It addresses the performance gap and cold-start item challenges of generative retrieval while maintaining its computational efficiency, showing significant improvements in recommendation quality on academic benchmarks.

-----

https://arxiv.org/abs/2411.18814

🔍 Original Problem:

Sequential dense retrieval models require storing unique representations for each item, leading to high memory costs. While generative retrieval offers a promising alternative, it struggles with cold-start items and shows performance gaps compared to dense retrieval methods.
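
To make the memory cost concrete, here is a minimal sketch of the dense-retrieval scoring step, assuming dot-product scoring over a per-item embedding table (the sizes and names are illustrative assumptions, not from the paper):

```python
import torch

# Illustrative sizes, not taken from the paper.
N_ITEMS, DIM = 1_000_000, 768

# Dense retrieval keeps one learned embedding per item: O(N) storage
# (~3 GB at fp32 for these sizes).
item_table = torch.randn(N_ITEMS, DIM)

def dense_retrieve(seq_embedding: torch.Tensor, k: int = 10) -> torch.Tensor:
    """Score the user's sequence embedding against every item and return top-k."""
    scores = item_table @ seq_embedding  # (N_ITEMS,)
    return scores.topk(k).indices
```

The table grows linearly with the catalog, which is exactly the cost LIGER tries to avoid paying at full scale.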

-----

🛠️ Solution in this Paper:

→ LIGER integrates sequential dense retrieval into generative retrieval, using semantic IDs to represent items efficiently

→ The model combines text embeddings with semantic ID generation to improve cold-start item recommendations

→ During inference, LIGER uses beam search over semantic IDs to retrieve candidates, then ranks them with dense retrieval (a minimal sketch follows this list)

→ The hybrid approach maintains the storage efficiency of generative retrieval while leveraging the ranking capabilities of dense retrieval
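
A minimal sketch of this hybrid inference path, assuming a generative model that beam-searches over semantic-ID tokens and a sequence encoder for dense scoring (all interfaces here, e.g. `generate_semantic_ids`, are assumed placeholders, not the paper's code):

```python
import torch

def hybrid_retrieve(seq_tokens, model, encode_sequence, get_item_embeddings,
                    beam_width: int = 50, k: int = 10) -> torch.Tensor:
    """Sketch of LIGER-style inference: generate candidates, then dense-rank them.

    model.generate_semantic_ids : assumed to run beam search over semantic-ID
        tokens and return a LongTensor of `beam_width` candidate item indices.
    get_item_embeddings : assumed to return embeddings for just those candidates
        (e.g. computed from item text, so no full O(N) table is needed).
    """
    # Step 1: generative retrieval narrows the catalog to a small candidate set.
    candidates = model.generate_semantic_ids(seq_tokens, num_beams=beam_width)

    # Step 2: dense retrieval ranks only those candidates.
    query = encode_sequence(seq_tokens)            # (dim,)
    cand_emb = get_item_embeddings(candidates)     # (beam_width, dim)
    scores = cand_emb @ query                      # (beam_width,)
    return candidates[scores.argsort(descending=True)[:k]]
```

Because dense scoring touches only `beam_width` candidates, ranking quality comes from dense retrieval while storage stays close to generative retrieval's.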

-----

💡 Key Insights:

→ Generative retrieval methods tend to overfit to items seen during training

→ Dense retrieval excels at cold-start items but requires significant storage

→ Semantic IDs effectively capture item relationships while reducing storage needs (see the residual-quantization sketch after this list)

→ Hybrid approaches can balance performance and computational efficiency
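
Semantic IDs are typically produced by residual quantization of item content embeddings (TIGER, which this paper builds on, uses an RQ-VAE for this). A minimal sketch of the residual-quantization idea, with illustrative codebook sizes:

```python
import torch

def assign_semantic_id(item_emb: torch.Tensor, codebooks: list) -> tuple:
    """Greedy residual quantization: map one item embedding to a short tuple
    of codeword indices, i.e. its semantic ID. Sizes below are assumptions."""
    residual, sid = item_emb.clone(), []
    for codebook in codebooks:                  # one (size, dim) tensor per level
        dists = torch.cdist(residual.unsqueeze(0), codebook).squeeze(0)
        idx = int(dists.argmin())               # nearest codeword at this level
        sid.append(idx)
        residual = residual - codebook[idx]     # quantize the remainder next level
    return tuple(sid)

# Illustrative: 3 levels of 256 codewords over 768-dim text embeddings.
codebooks = [torch.randn(256, 768) for _ in range(3)]
print(assign_semantic_id(torch.randn(768), codebooks))  # e.g. (17, 203, 96)
```

Similar items share ID prefixes, which is what lets a generative model generalize across related items while storing only the codebooks.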

-----

📊 Results:

→ LIGER achieves 13.06% Recall@10 on cold-start items vs 0% for baseline TIGER

→ Storage complexity is reduced from O(N) to O(t), where t << N (a back-of-the-envelope comparison follows this list)

→ Performance matches or exceeds state-of-the-art on both in-set and cold-start items
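
To make the O(N)-to-O(t) contrast concrete, a back-of-the-envelope comparison under assumed sizes (1M items, 768-dim fp32 embeddings, 3 levels of 256 codewords; the numbers are illustrative, not the paper's):

```python
# Illustrative storage arithmetic; fp32 = 4 bytes per value.
N, dim = 1_000_000, 768
dense_bytes = N * dim * 4                       # one embedding per item: O(N)
levels, codewords = 3, 256
codebook_bytes = levels * codewords * dim * 4   # shared codebooks only: O(t)
print(f"dense table: {dense_bytes / 1e9:.2f} GB")    # ~3.07 GB
print(f"codebooks:   {codebook_bytes / 1e6:.2f} MB") # ~2.36 MB
```

The per-item semantic-ID tuples themselves still have to be stored, but at a few small integers per item they are negligible next to a full embedding table.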