Neural PDE operator learning using transformers
November 14, 2024 — February 27, 2025
calculus
dynamical systems
geometry
Hilbert space
how do science
Lévy processes
machine learning
neural nets
PDEs
physics
regression
sciml
SDEs
signal processing
statistics
statmech
stochastic processes
surrogate
time series
uncertainty
Does operator learning work with transformers? Maybe. Let’s find out; a sketch of one candidate attention mechanism follows these notes.
Are all PDE foundation models transformers? If so, we could merge the notebooks.
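To make the question concrete, here is a minimal sketch of the softmax-free “Galerkin” attention that Cao (2021) proposes for operator learning: layer-normalise the keys and values, drop the softmax, and contract \(K^\top V\) first so the cost is linear rather than quadratic in the number of mesh points. The module name, head count, and toy shapes are my own illustrative assumptions, not code from any of the cited papers.

```python
# A minimal sketch of Galerkin-type (softmax-free) attention, loosely
# following Cao (2021). Names and hyperparameters are illustrative
# assumptions, not a reference implementation.
import torch
from torch import nn


class GalerkinAttention(nn.Module):
    """Linear-complexity attention: normalise K and V, then compute Q (K^T V) / n."""

    def __init__(self, dim: int, heads: int = 4):
        super().__init__()
        assert dim % heads == 0
        self.heads = heads
        self.to_qkv = nn.Linear(dim, 3 * dim, bias=False)
        self.out = nn.Linear(dim, dim)
        head_dim = dim // heads
        # Per-head layer norms on keys and values stand in for the softmax.
        self.norm_k = nn.LayerNorm(head_dim)
        self.norm_v = nn.LayerNorm(head_dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, n, d = x.shape
        h = self.heads
        q, k, v = self.to_qkv(x).chunk(3, dim=-1)
        # Reshape to (batch, heads, n_points, head_dim).
        q, k, v = (t.view(b, n, h, d // h).transpose(1, 2) for t in (q, k, v))
        k, v = self.norm_k(k), self.norm_v(v)
        # K^T V is (head_dim x head_dim), so cost scales linearly in n.
        context = torch.einsum("bhnd,bhne->bhde", k, v) / n
        out = torch.einsum("bhnd,bhde->bhne", q, context)
        out = out.transpose(1, 2).reshape(b, n, d)
        return self.out(out)


# Toy usage: batch of 2 functions sampled at 1024 mesh points,
# lifted to 64-dimensional features per point.
x = torch.randn(2, 1024, 64)
print(GalerkinAttention(dim=64)(x).shape)  # torch.Size([2, 1024, 64])
```

Because nothing here depends on the mesh points being on a regular grid, this kind of block is one reason transformer-style operators are pitched as more geometry-agnostic than, say, FFT-based neural operators.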
1 References
Alkin, Fürst, Schmid, et al. 2024. “Universal Physics Transformers: A Framework For Efficiently Scaling Neural Operators.”
Bodnar, Bruinsma, Lucic, et al. 2024. “Aurora: A Foundation Model of the Atmosphere.”
Cao. 2021. “Choose a Transformer: Fourier or Galerkin.” In Advances in Neural Information Processing Systems.
Duraisamy, Iaccarino, and Xiao. 2019. “Turbulence Modeling in the Age of Data.” Annual Review of Fluid Mechanics.
Gilpin. 2023. “Model Scale Versus Domain Knowledge in Statistical Forecasting of Chaotic Systems.” Physical Review Research.
Guibas, Mardani, Li, et al. 2021. “Adaptive Fourier Neural Operators: Efficient Token Mixers for Transformers.”
Hao, Wang, Su, et al. 2023. “GNOT: A General Neural Operator Transformer for Operator Learning.” In Proceedings of the 40th International Conference on Machine Learning.
Herde, Raonić, Rohner, et al. 2024. “Poseidon: Efficient Foundation Models for PDEs.”
Hoffimann, Zortea, de Carvalho, et al. 2021. “Geostatistical Learning: Challenges and Opportunities.” Frontiers in Applied Mathematics and Statistics.
Mialon, Garrido, Lawrence, et al. 2024. “Self-Supervised Learning with Lie Symmetries for Partial Differential Equations.”
Shih, Peyvan, Zhang, et al. 2024. “Transformers as Neural Operators for Solutions of Differential Equations with Finite Regularity.”
Xu, Gupta, Cheng, et al. 2024. “Specialized Foundation Models Struggle to Beat Supervised Baselines.”
Zhou, Ma, Wu, et al. 2024. “Unisolver: PDE-Conditional Transformers Are Universal PDE Solvers.”