Efficient tensor factorization now possible on massive datasets through smart Bayesian modeling.
This paper introduces a scalable Bayesian Tensor Ring (SBTR) model for tensor factorization that combines a Multiplicative Gamma Process prior with efficient Gibbs sampling. It handles both continuous and binary data while remaining computationally efficient for large-scale tensor analysis.
-----
https://arxiv.org/abs/2412.03321
🔍 Original Problem:
Existing Bayesian Tensor Ring methods converge to sub-optimal solutions due to limitations of the Automatic Relevance Determination (ARD) prior, cannot handle discrete data, and are computationally expensive for large tensors.
-----
⚡ Solution in this Paper:
→ Introduces a weighted tensor ring decomposition with a Multiplicative Gamma Process (MGP) prior for better rank adaptation.
→ Applies Pólya-Gamma augmentation to handle binary data.
→ Develops an efficient Gibbs sampler that reduces computational complexity by two orders of magnitude compared to previous methods.
→ Creates an online Expectation-Maximization (EM) algorithm for processing extremely large tensors.
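To make the object being factorized concrete: a tensor ring (TR) decomposition represents each entry of a D-way tensor as the trace of a product of D core slices, with the last rank index wrapping back to the first. A minimal numpy sketch of TR reconstruction (illustrative only, not the paper's implementation; core shapes and ranks here are arbitrary):

```python
import numpy as np

def tensor_ring_reconstruct(cores):
    """Reconstruct a full tensor from tensor-ring (TR) cores.

    Each core G_d has shape (R_d, I_d, R_{d+1}), with R_{D+1} = R_1,
    so the chain of matrix products closes into a ring (a trace).
    """
    merged = cores[0]
    for core in cores[1:]:
        R1, n, Rm = merged.shape
        Rm2, I_d, Rn = core.shape
        assert Rm == Rm2, "adjacent TR ranks must match"
        # contract the shared rank index, flatten the mode indices
        merged = np.einsum('anb,bmc->anmc', merged, core).reshape(R1, n * I_d, Rn)
    # close the ring: trace over the first/last rank index
    full = np.trace(merged, axis1=0, axis2=2)
    return full.reshape(tuple(c.shape[1] for c in cores))

# Example: 3-way tensor of shape (4, 5, 6) with TR ranks (2, 3, 2)
rng = np.random.default_rng(0)
cores = [rng.standard_normal((2, 4, 3)),
         rng.standard_normal((3, 5, 2)),
         rng.standard_normal((2, 6, 2))]
X = tensor_ring_reconstruct(cores)
print(X.shape)  # (4, 5, 6)
```

Entry-wise, `X[i, j, k]` equals `trace(G1[:, i, :] @ G2[:, j, :] @ G3[:, k, :])`; the Bayesian model places priors on the cores and infers them from observed entries.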
-----
🎯 Key Insights:
→ The MGP prior identifies latent structures more accurately than the ARD prior
→ Computational complexity drops from O(DMR^6) to O(DMR^4)
→ The model automatically adapts tensor ranks during training
→ Continuous and binary data are handled in a unified framework
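The rank-adaptation insight comes from how the Multiplicative Gamma Process builds component precisions: each precision is a cumulative product of gamma draws, so later components are increasingly shrunk toward zero and effectively pruned. A minimal sketch of drawing MGP precisions (hyperparameter values are illustrative, not taken from the paper):

```python
import numpy as np

def mgp_precisions(R, a1=2.0, a2=3.0, rng=None):
    """Draw R component precisions from a Multiplicative Gamma Process.

    delta_1 ~ Gamma(a1, 1), delta_l ~ Gamma(a2, 1) for l >= 2, and
    tau_r = prod_{l<=r} delta_l. With a2 > 1 the precisions tend to
    grow with r, so later components get variances near zero -- the
    mechanism that prunes unnecessary TR ranks during training.
    """
    rng = rng or np.random.default_rng()
    deltas = np.concatenate([rng.gamma(a1, 1.0, size=1),
                             rng.gamma(a2, 1.0, size=R - 1)])
    return np.cumprod(deltas)  # tau_1, ..., tau_R

taus = mgp_precisions(8, rng=np.random.default_rng(1))
```

Unlike ARD, which gives every component an independent shrinkage parameter, the multiplicative structure imposes an ordering, which is what drives the more reliable rank recovery reported in the paper.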
-----
📊 Results:
→ Outperforms baseline models on continuous tensor completion (RMSE 0.560 vs. 0.590)
→ Achieves superior AUC (0.973) on binary tensor completion
→ Estimates ranks more accurately, especially at high missing ratios
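The binary-completion results rest on the Pólya-Gamma trick mentioned above: augmenting each binary observation with a PG-distributed latent variable makes the conditional update for the model parameters Gaussian, so standard Gibbs machinery applies. A minimal sketch in a logistic-regression setting (a stand-in for the tensor model; the PG draw uses a simple truncated-series approximation rather than an exact sampler such as the `polyagamma` package):

```python
import numpy as np

def sample_pg_approx(z, K=200, rng=None):
    """Approximate draw from Polya-Gamma PG(1, z) via its truncated
    infinite-sum-of-gammas representation (illustrative, not exact)."""
    rng = rng or np.random.default_rng()
    k = np.arange(1, K + 1)
    g = rng.gamma(1.0, 1.0, size=K)
    return np.sum(g / ((k - 0.5) ** 2 + z ** 2 / (4 * np.pi ** 2))) / (2 * np.pi ** 2)

def gibbs_step_logistic(X, y, w, prior_prec=1.0, rng=None):
    """One Gibbs sweep: draw omega_i ~ PG(1, x_i.w), then draw w from
    its now-Gaussian conditional -- the same augmentation SBTR uses
    for binary tensor entries."""
    rng = rng or np.random.default_rng()
    psi = X @ w
    omega = np.array([sample_pg_approx(p, rng=rng) for p in psi])
    kappa = y - 0.5                      # PG identity: (y - 1/2) plays the role of a Gaussian "observation"
    prec = X.T @ (omega[:, None] * X) + prior_prec * np.eye(X.shape[1])
    cov = np.linalg.inv(prec)
    mean = cov @ (X.T @ kappa)
    return rng.multivariate_normal(mean, cov)

# Demo on synthetic binary data
rng = np.random.default_rng(2)
X = rng.standard_normal((200, 2))
w_true = np.array([2.0, -1.0])
y = (rng.random(200) < 1 / (1 + np.exp(-X @ w_true))).astype(float)
w = np.zeros(2)
for _ in range(50):
    w = gibbs_step_logistic(X, y, w, rng=rng)
```

With the augmentation in place, continuous and binary entries share the same Gaussian-conditional update machinery, which is what the "unified framework" claim refers to.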