"Personalized Graph-Based Retrieval for Large Language Models"

Generated the podcast below on this paper with Google's Illuminate.

When AI lacks your history, it learns from your neighbors to personalize responses

PGraphRAG enhances LLM personalization by using graph-based retrieval instead of relying on user history alone, providing richer context even for new users with limited data.

-----

https://arxiv.org/abs/2501.02157

🤔 Original Problem:

Existing LLM personalization methods rely heavily on user history, making them ineffective for new users or those with limited interaction data. This creates a significant cold-start problem in real-world applications.

-----

🔧 Solution in this Paper:

→ PGraphRAG constructs bipartite graphs connecting users with items through interaction edges

→ The framework retrieves context using both direct user history and neighbor information from the graph

→ A query function transforms input into retrieval queries for finding relevant user profile entries

→ The system assembles personalized prompts by combining input with retrieved graph-based context

→ The framework optimizes retrieval by selecting the k most relevant entries from user profiles (a minimal sketch of the full pipeline follows below)
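
To make the retrieval flow concrete, here is a minimal Python sketch of such a pipeline. The data structures, function names, toy lexical scorer, and prompt template are illustrative assumptions, not the paper's released code.

```python
# Illustrative sketch of a PGraphRAG-style pipeline (not the paper's code).
from collections import defaultdict

def build_bipartite_graph(interactions):
    """interactions: list of (user_id, item_id, text) review triples."""
    user_to_items = defaultdict(set)
    item_to_users = defaultdict(set)
    texts = defaultdict(list)              # (user, item) -> list of texts
    for user, item, text in interactions:
        user_to_items[user].add(item)
        item_to_users[item].add(user)
        texts[(user, item)].append(text)
    return user_to_items, item_to_users, texts

def neighbor_profile(user, user_to_items, item_to_users, texts):
    """Texts written by neighbors: users who interacted with the same items."""
    entries = []
    for item in user_to_items[user]:
        for other in item_to_users[item] - {user}:
            entries.extend(texts[(other, item)])
    return entries

def retrieve(query, entries, k=4):
    """Toy lexical-overlap scorer standing in for BM25 / Contriever."""
    q = set(query.lower().split())
    scored = sorted(entries, key=lambda e: len(q & set(e.lower().split())), reverse=True)
    return scored[:k]

def assemble_prompt(query, user_entries, neighbor_entries, k=4):
    """Combine the input with the k most relevant graph-based context entries."""
    context = retrieve(query, user_entries + neighbor_entries, k=k)
    ctx = "\n".join(f"- {c}" for c in context)
    return f"Context from the user's graph neighborhood:\n{ctx}\n\nTask: {query}"

# Made-up usage example
interactions = [
    ("u1", "hotel_a", "Loved the quiet rooms and fast wifi."),
    ("u2", "hotel_a", "Pool area gets crowded in the afternoon."),
    ("u2", "hotel_b", "Breakfast was excellent."),
]
u2i, i2u, texts = build_bipartite_graph(interactions)
user_entries = [t for item in u2i["u1"] for t in texts[("u1", item)]]
neighbor_entries = neighbor_profile("u1", u2i, i2u, texts)
print(assemble_prompt("write a hotel review about a quiet stay", user_entries, neighbor_entries))
```

In practice the toy scorer would be replaced by a real retriever such as BM25 or Contriever, and the assembled prompt is what the LLM receives.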

-----

💡 Key Insights:

→ Neighbor information provides significant value beyond just user history

→ Retrieving 4 items generally yields the best performance (see the retrieval sketch after this list)

→ The method works effectively even with limited user data

→ Graph-based approach enables richer context for cold-start scenarios
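
As a concrete illustration of top-k retrieval with k = 4, here is a hedged sketch using the rank_bm25 package; the profile entries and query are made-up examples, not data from the paper.

```python
# Sketch: select the k = 4 most relevant profile entries with BM25 (rank_bm25).
from rank_bm25 import BM25Okapi

profile_entries = [
    "Great rooftop pool, but the checkout line was slow.",
    "Room was quiet and the bed very comfortable.",
    "Breakfast buffet had limited vegetarian options.",
    "Staff upgraded us for free on arrival.",
    "Gym equipment felt dated.",
]
query = "write a review about a comfortable, quiet hotel stay"

bm25 = BM25Okapi([e.lower().split() for e in profile_entries])
top_k = bm25.get_top_n(query.lower().split(), profile_entries, n=4)  # k = 4
print(top_k)
```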

-----

📊 Results:

→ Outperformed baselines across 12 personalization tasks

→ Achieved +32.1% ROUGE-1 improvement in Hotel Experience Generation

→ Demonstrated consistent gains in both long and short text generation

→ Maintained performance with both BM25 and Contriever retrievers (a Contriever sketch follows this list)
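
The retriever-agnostic result suggests the graph-based context, rather than the retriever itself, is doing most of the work. As an illustration of the dense alternative, this sketch uses the facebook/contriever checkpoint with standard mean pooling; it is an assumed drop-in replacement for the BM25 step above, not code from the paper.

```python
# Sketch: dense retrieval with Contriever as a drop-in replacement for BM25.
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("facebook/contriever")
model = AutoModel.from_pretrained("facebook/contriever")

def embed(texts):
    batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**batch).last_hidden_state          # (B, T, H)
    mask = batch["attention_mask"].unsqueeze(-1)           # (B, T, 1)
    return (hidden * mask).sum(1) / mask.sum(1)            # mean-pooled embeddings

def dense_top_k(query, entries, k=4):
    q, e = embed([query]), embed(entries)
    scores = (q @ e.T).squeeze(0)                          # dot-product relevance
    return [entries[i] for i in scores.topk(min(k, len(entries))).indices]
```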

------

Are you into AI and LLMs❓ Join my daily AI newsletter. I will send you 7 emails a week analyzing the highest signal AI developments. ↓↓

🎉 https://rohanpaul.substack.com/
