img2physiprop (Image to Physical Property) is a Python package that maps medical image data to physical properties. This makes it possible to vary, e.g., material parameters in FE simulations according to patient-specific medical image data. The package includes the following features to ease the development process and ensure a high code quality:
- PyTest testing framework including an enforced minimum coverage check
- Automated GitHub CI/CD
- Exhaustive pre-commit framework to automatically check code formatting and code quality
- Automatically generated documentation based on the included Python docstrings
For a quick and easy start, an Anaconda/Miniconda environment is highly recommended. Other ways to install img2physiprop are possible, but the installation procedure here is explained based on a conda install. After installing Anaconda/Miniconda, execute the following steps:
- Create a new Anaconda environment based on the environment.yml file:
conda env create -f environment.yml
- Activate your newly created environment:
conda activate i2pp
- Initialize all submodules:
git submodule update --init --recursive
- Install all necessary third-party libraries for all submodules with:
git submodule --quiet foreach --recursive pip install -e .
- Install all img2physiprop requirements with:
pip install -e .
- Finally, install the pre-commit hook with:
pre-commit install
Now you are up and running 🎉
To execute img2physiprop, run
i2pp --config path/to/config.yaml
with your custom configuration file. A template configuration file containing all possible configuration options can be found in the folder templates/config.
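For illustration, a fragment of such a configuration file might look like the sketch below. The key names follow the interpolation options described later in this README, but this is only a hedged sketch; the authoritative and complete set of keys (including any input/output sections) is given by the template in templates/config.

```yaml
# Illustrative sketch only -- consult templates/config for the full template.
processing:
  interpolation:
    method: allvoxels_scaled      # nodes | nodes_scaled | elementcenter | allvoxels | allvoxels_scaled
    inverse_distance_power: 2     # p=1 linear decay, p=2 quadratic (default)
    filter_outliers: true         # modified Z-score filtering before averaging
    node_scaling_factors:
      surface: 1.0
      interior: 1.0
```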
To execute the tests locally and create the HTML coverage report, simply run
pytest
To create the documentation locally from the provided docstrings, simply run
pdoc --html --output-dir docs src/i2pp
- Interpolation methods (`processing.interpolation.method`):
  - `nodes`: Interpolates values at the element's nodes and assigns the element mean (ignoring NaN nodes). Fast and robust; respects node sampling.
  - `nodes_scaled`: Like `nodes`, but computes a scaled mean using node-specific scaling factors (`dis.nodes.scaling_factors`), which are set via `processing.interpolation.node_scaling_factors.surface` and `processing.interpolation.node_scaling_factors.interior`. This adjusts the influence of specific nodes.
  - `elementcenter`: Interpolates at each element centroid and assigns that value.
  - `allvoxels`: Collects all voxels whose grid coordinates lie inside the convex hull of the element nodes and assigns the mean value; optionally filters outliers.
  - `allvoxels_scaled`: Computes a voxel-weighted mean where voxel weights derive from node scaling factors and inverse node-to-voxel distances. The influence of the distance (decay) is controlled by `processing.interpolation.inverse_distance_power` `p` (`p=1` linear, `p=2` quadratic (default), `p>=3` increasingly like a step function); optionally filters outliers.
- Element and node value overrides:
  - `processing.interpolation.set_surface_node_value`: If provided, all nodes that belong to any surface receive the fixed value (the vector size must match the number of pixel channels); only relevant for the `nodes` and `nodes_scaled` interpolation methods.
  - `processing.interpolation.set_surface_element_value`: If provided, all elements touching any surface node receive the fixed value (scalar or vector); applicable to all interpolation methods.
- Outlier filtering (`processing.interpolation.filter_outliers`):
  - In `allvoxels` and `allvoxels_scaled`, if enabled and enough voxels are present (>5), outliers are removed using a modified Z-score (median/MAD-based, threshold 3.5) before averaging.
- Fallbacks and warnings:
  - If outlier filtering removes all voxels, the method falls back to the unfiltered mean.
  - If an element contains no voxels (`allvoxels` modes), interpolation falls back to the element center.
  - If interpolated points fall outside the image grid, the element data is set to NaN and a warning summary is logged after processing.
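The filtering and weighting steps above can be sketched in plain Python/NumPy. This is an illustrative re-implementation of the documented behavior, not the package's actual code; the function names are made up for this sketch:

```python
import numpy as np


def modified_z_score_filter(values, threshold=3.5):
    """Remove outliers via the median/MAD-based modified Z-score.

    Falls back to the unfiltered values when filtering would discard
    everything (mirrors the documented fallback to the unfiltered mean).
    """
    values = np.asarray(values, dtype=float)
    median = np.median(values)
    mad = np.median(np.abs(values - median))
    if mad == 0.0:
        return values  # constant data: nothing to filter
    # 0.6745 makes the MAD-based score consistent with the standard deviation
    z = 0.6745 * (values - median) / mad
    kept = values[np.abs(z) <= threshold]
    return kept if kept.size > 0 else values


def inverse_distance_weights(distances, p=2):
    """Normalized weights ~ 1/d**p: p=1 linear decay, p=2 quadratic."""
    d = np.asarray(distances, dtype=float)
    w = 1.0 / np.maximum(d, 1e-12) ** p  # guard against zero distance
    return w / w.sum()


voxels = [100.0, 101.0, 99.0, 100.5, 250.0, 100.2]
filtered = modified_z_score_filter(voxels)  # the 250.0 outlier is removed
element_value = filtered.mean()
```

Increasing `p` concentrates the weight on the nearest voxels, which is why `p>=3` behaves increasingly like a step function.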
To ease the dependency update process, pip-tools is utilized. To create the necessary requirements.txt file, simply execute
pip-compile --all-extras --output-file=requirements.txt requirements.in
To upgrade the dependencies, simply execute
pip-compile --all-extras --output-file=requirements.txt --upgrade requirements.in
Finally, performance-critical packages such as NumPy and Numba are installed via conda to utilize optimized BLAS libraries.
All contributions are welcome. See CONTRIBUTING.md for more information.
This project is licensed under the MIT License. For further information, check LICENSE.md.