This is the official repository of the paper
S. Brivio, N. R. Franco, Deep symmetric autoencoders from the Eckart-Young-Schmidt perspective (2025),
providing (i) a novel mathematical framework for symmetric autoencoders, (ii) suitable error estimates, and (iii) a brand-new data-driven initialization strategy.
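The Eckart-Young-Schmidt (EYS) theorem named in the title states that the truncated SVD yields the best low-rank approximation of a matrix in the Frobenius norm. The following NumPy snippet is a minimal illustration of that classical result, not part of the repository code:

```python
import numpy as np

# Eckart-Young-Schmidt: the rank-k truncated SVD is the best rank-k
# approximation of A in the Frobenius norm.
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 20))

U, s, Vt = np.linalg.svd(A, full_matrices=False)
k = 5
A_k = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

# The optimal error equals the norm of the discarded singular values.
err = np.linalg.norm(A - A_k, "fro")
opt = np.sqrt(np.sum(s[k:] ** 2))
print(np.isclose(err, opt))  # True
```

In this view, `U[:, :k]` and `Vt[:k, :]` play the roles of a linear decoder and encoder, which is the starting point for the data-driven initialization of symmetric autoencoders.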
We suggest installing the library dependencies in a clean conda environment:
```shell
conda create -n sym-ae python=3.11.9
conda activate sym-ae
conda install -c conda-forge fenics
pip install -r requirements.txt --no-cache-dir
```

The source code implementation is contained in `src` and is organized as follows:
- `src/activations.py` implements bilipschitz activations and related functionalities.
- `src/blocks.py` contains the implementation of the classes needed to build the neural network architecture skeleton.
- `src/modules.py` implements the AE, SAE, SBAE, and SOAE networks along with their initialization procedures.
- `src/NestedPOD.py` comprises the definition of the homonymous class, used for the EYS initialization.
- `src/training.py` contains the training loop function and related utilities.
- `src/utils.py` implements additional utilities for reading and saving files.
The scripts to run are contained in the main folder, whereas the folder `notebooks` contains the Jupyter notebooks.
- Generate the datasets by executing the notebook `notebooks/datagen.ipynb`; the saved data are then available in `data`.
- Run `python comparison.py` to generate the results for the comparison analysis (which are then saved in `results`).
- Execute the remaining Jupyter notebooks to visualize the numerical results and generate the paper figures, then available in the folder `results`.
If the present repository and/or the original paper were useful in your research, please consider citing:
```bibtex
@misc{brivio2025saeeys,
  title={Deep Symmetric Autoencoders from the Eckart-Young-Schmidt Perspective},
  author={Simone Brivio and Nicola Rares Franco},
  year={2025},
  eprint={2506.11641},
  archivePrefix={arXiv},
  primaryClass={math.NA},
  url={https://arxiv.org/abs/2506.11641},
}
```