Renormalization-Inspired Effective Field Neural Networks for Scalable Modeling of Classical and Quantum Many-Body Systems
Xi Liu, Yujun Zhao, Chun Yu Wan, Yang Zhang, Junwei Liu · Feb 24, 2025 · Citations: 0
How to use this page
Provisional trust: This page is a lightweight research summary built from the abstract and metadata while deeper extraction catches up.
Best use: Background context only.
What to verify: Read the full paper before copying any benchmark, metric, or protocol choices.
Evidence quality: Provisional; derived from abstract and metadata only.
Abstract
We introduce Effective Field Neural Networks (EFNNs), a new architecture based on continued functions, the mathematical tools used in renormalization to handle divergent perturbative series. Our key insight is that neural networks can implement these continued functions directly, providing a principled approach to modeling many-body interactions. Testing on three systems (a classical 3-spin infinite-range model, a continuous classical Heisenberg spin system, and a quantum double-exchange model), we find that EFNNs outperform standard deep networks, ResNet, and DenseNet. Most striking is the EFNN's generalization: trained on $10 \times 10$ lattices, it accurately predicts behavior on systems up to $40 \times 40$ with no additional training, and its accuracy improves with system size, with a $10^{3}$ computational speed-up over exact diagonalization (ED) for the $40 \times 40$ lattice. This demonstrates that the EFNN captures the underlying physics rather than merely fitting data, making it valuable beyond many-body problems, in any field where renormalization ideas apply.
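To make the "continued function" idea concrete, here is a minimal toy sketch of how a network might evaluate a nested, continued-fraction-like recursion, where the effective field at each depth is refined by the field from the next depth. This is an illustrative assumption, not the authors' actual EFNN architecture; the function name `efnn_forward`, the `tanh` nonlinearity, and the specific recursion $h_k = \tanh\bigl(W_k (x + h_{k+1})\bigr)$ are all hypothetical choices made for the sketch.

```python
import numpy as np

def efnn_forward(x, weights):
    """Toy 'continued function' evaluation (illustrative only).

    Mimics the nested structure F = f1(x + f2(x + f3(...))) by
    iterating from the deepest stage outward: each stage's effective
    field feeds back into the stage above it.
    """
    field = np.zeros_like(x)
    for W in reversed(weights):
        # Refine the effective field using the deeper stage's output.
        field = np.tanh(W @ (x + field))
    return field

rng = np.random.default_rng(0)
dim, depth = 8, 4
weights = [rng.normal(scale=0.3, size=(dim, dim)) for _ in range(depth)]
x = rng.normal(size=dim)
y = efnn_forward(x, weights)
print(y.shape)
```

Because each stage reuses the same input `x`, the recursion resembles a truncated continued fraction rather than a plain feed-forward stack; deeper stages act as progressively finer corrections, which is the renormalization-flavored intuition the abstract describes.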