Bold claim in this paper: FAN (Fourier Analysis Network) can replace MLP layers in various models.
• Language modeling: up to 14.65% lower loss and 8.50% higher accuracy vs a standard Transformer
• FAN leverages Fourier analysis to model periodicity, demonstrating improved generalization and efficiency.
https://arxiv.org/pdf/2410.02675
Original Problem:
Existing neural networks struggle to model and reason about periodicity; they tend to memorize periodic data rather than learn the underlying principles.
-----
Solution in this Paper 💡:
• Proposes FAN (Fourier Analysis Network)
• Incorporates the Fourier series into the network architecture
• FAN layer: φ(x) = [cos(W_p x) || sin(W_p x) || σ(B_p̄ + W_p̄ x)] (a minimal code sketch follows)
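A minimal PyTorch sketch of this layer. The class name `FANLayer`, the `p_ratio` split, and the GELU activation are illustrative assumptions, not the paper's exact implementation:

```python
import torch
import torch.nn as nn

class FANLayer(nn.Module):
    """Sketch of phi(x) = [cos(W_p x) || sin(W_p x) || sigma(B_pbar + W_pbar x)]."""

    def __init__(self, d_in, d_out, p_ratio=0.25, activation=None):
        super().__init__()
        d_p = int(d_out * p_ratio)     # width of each periodic branch (cos and sin)
        d_pbar = d_out - 2 * d_p       # width of the non-periodic branch
        self.linear_p = nn.Linear(d_in, d_p, bias=False)  # W_p; periodic part has no bias
        self.linear_pbar = nn.Linear(d_in, d_pbar)        # W_pbar and B_pbar
        self.activation = activation or nn.GELU()         # sigma (assumed GELU here)

    def forward(self, x):
        p = self.linear_p(x)           # one shared projection feeds both cos and sin
        return torch.cat(
            [torch.cos(p), torch.sin(p), self.activation(self.linear_pbar(x))],
            dim=-1,
        )
```

Note that cos and sin share the single projection W_p, so periodicity is built into the layer rather than being something the non-periodic branch must approximate.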
-----
Key Insights from this Paper 💡:
• FAN outperforms baselines in modeling both basic and complex periodic functions
• Demonstrates superior performance on real-world tasks
• Reduces parameters and FLOPs compared to MLP (see the rough count after this list)
• Enhances generalization in cross-domain tasks
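Why the parameter savings? The cos and sin branches reuse one weight matrix and drop the bias, so a FAN layer of equal width carries fewer parameters than a dense layer. A quick check against the `FANLayer` sketch above (the width 512 is an arbitrary example, and the exact savings depend on the `p_ratio` split):

```python
d = 512
mlp_params = d * d + d                                    # nn.Linear(d, d): weights + bias
fan_params = sum(p.numel() for p in FANLayer(d, d).parameters())
print(mlp_params, fan_params)                             # 262656 vs 196864 at p_ratio=0.25
```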
-----
Results:
• Periodicity modeling: FAN significantly outperforms MLP, KAN, and Transformer
• Symbolic formula representation: FAN surpasses baselines as parameter count increases
• Time series forecasting: Transformer with FAN improves MSE by 14.3-15.0% and MAE by 7.6-7.9% (a sketch of the MLP-to-FAN swap follows)
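How might FAN replace an MLP block in practice? Below is a hypothetical drop-in for a Transformer feed-forward block, built from the `FANLayer` sketch above. The two-layer stacking and all names here are assumptions; the paper's exact integration may differ:

```python
class FANFeedForward(nn.Module):
    """Stands in for the usual Linear -> activation -> Linear feed-forward block."""

    def __init__(self, d_model, d_hidden):
        super().__init__()
        self.fan1 = FANLayer(d_model, d_hidden)
        self.fan2 = FANLayer(d_hidden, d_model)

    def forward(self, x):
        return self.fan2(self.fan1(x))

# Usage: swap this in wherever a Transformer block builds its MLP.
ff = FANFeedForward(d_model=512, d_hidden=2048)
out = ff(torch.randn(8, 16, 512))   # (batch, seq, d_model) -> same shape
```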