MLPS @ NeurIPS 2025
Outline
- Base the paper on the research from the IPAC 2025 paper.
- Remove the section on eddy current compensation (but keep it as motivation for moving to a full time-series model).
- Elaborate transfer learning techniques for other magnet families (see the fine-tuning sketch after this list).
- Discuss other available covariates, such as voltage.
- Model comparison on a fixed dataset, with a discussion of attention.
- Show the usefulness for removing the precycle, as we will demonstrate in the dedicated MD on 2025-08-13.
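
For the transfer-learning item above, a minimal fine-tuning sketch: freeze a backbone pretrained on one magnet family and retrain only the output head on the new family. This assumes a PyTorch LSTM backbone; `HysteresisLSTM` and the checkpoint name are placeholders, not the actual codebase.

```python
# Hypothetical sketch of transfer learning to a new magnet family:
# keep the learned dynamics (frozen LSTM backbone), adapt only the head.
import torch
import torch.nn as nn

class HysteresisLSTM(nn.Module):
    def __init__(self, n_features: int, hidden: int = 64):
        super().__init__()
        self.backbone = nn.LSTM(n_features, hidden, num_layers=2, batch_first=True)
        self.head = nn.Linear(hidden, 1)   # per-step field / transfer-function prediction

    def forward(self, x):                  # x: (batch, time, n_features)
        out, _ = self.backbone(x)
        return self.head(out)

model = HysteresisLSTM(n_features=3)       # e.g. current, ramp rate, voltage
# model.load_state_dict(torch.load("pretrained_source_family.pt"))  # placeholder checkpoint

for p in model.backbone.parameters():      # freeze the pretrained dynamics
    p.requires_grad = False

optimizer = torch.optim.Adam(model.head.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

def finetune_step(x, y):
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    optimizer.step()
    return loss.item()

x = torch.randn(16, 100, 3)                # placeholder batch (batch, time, features)
y = torch.randn(16, 100, 1)
print(finetune_step(x, y))
```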
Machine learning techniques for hysteresis modeling
- Implemented new models EncoderDecoderLSTM, AttentionLSTM and TransformerLSTM (see the sketch after this list) and compared them to the Temporal Fusion Transformer on the v9.1 validation set with data from flattened MD1.
- Discuss method comparison with Neural ODEs and Neural Operators.
- Discussion on whether the models learn the hysteresis physics or merely fit the data.
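
For the model-comparison item, a minimal sketch of what an attention-over-LSTM model could look like; the actual AttentionLSTM architecture may differ, and all layer sizes and feature choices here are assumptions.

```python
# Minimal sketch, assuming AttentionLSTM applies self-attention over LSTM
# encoder states before predicting the field at each time step.
import torch
import torch.nn as nn

class AttentionLSTM(nn.Module):
    def __init__(self, n_features: int, hidden: int = 64, n_heads: int = 4):
        super().__init__()
        self.encoder = nn.LSTM(n_features, hidden, batch_first=True)
        self.attn = nn.MultiheadAttention(hidden, n_heads, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, x):                    # x: (batch, time, n_features)
        enc, _ = self.encoder(x)             # (batch, time, hidden)
        ctx, _ = self.attn(enc, enc, enc)    # self-attention over encoder states
        return self.head(ctx)                # per-step field prediction

x = torch.randn(8, 200, 3)                   # e.g. current, ramp rate, voltage
y_hat = AttentionLSTM(n_features=3)(x)       # -> (8, 200, 1)
```

Inspecting the attention weights of such a model is one concrete way to frame the discussion of whether it attends to hysteresis-relevant history or simply memorizes the dataset.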