heitorctmba/PROJ-01-Pellis
Pellis: Burn Image Classification

Experimental study on automated classification of burn images into five severity levels, comparing different convolutional neural network architectures with transfer learning and varying configurations (attention blocks, pooling, dropout, learning rate, optimizers). Developed at the VORTEX laboratory of Tec Unifor (Universidade de Fortaleza) as the AI backbone for the Pellis application, a clinical decision support tool for healthcare professionals treating burn patients.

Clinical reference project: Academic research creates application to assist therapy for burn patients


Overview

[Input image]
        ↓
[Pre-trained base model (ImageNet)]
   EfficientNetV2 / InceptionV3 / ResNet / ResNeXt
        ↓
[Optional attention block]
   CBAM or Squeeze-and-Excitation
        ↓
[Pooling]
        ↓
[Dense layers with Dropout (configurable)]
        ↓
[Classification: 5 classes (Softmax)]
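The pipeline above can be sketched as a Keras model-building function. This is a minimal illustration, not the code from `auxiliar/arquiteturas.py`; the function name, parameters, and defaults are assumptions:

```python
from tensorflow import keras
from tensorflow.keras import layers

NUM_CLASSES = 5  # burn_1, burn_2p, burn_2s, burn_3, burn_not

def build_classifier(base_model, dense_units=(1024, 256), dropout=0.3,
                     pooling="avg", attention_block=None):
    """Assemble the classification head on top of a pre-trained base.

    `attention_block`, if given, is a callable applied to the base's
    feature map (e.g. an SE or CBAM block). Illustrative sketch only.
    """
    base_model.trainable = True  # full fine-tuning: all layers unfrozen
    x = base_model.output
    if attention_block is not None:
        x = attention_block(x)
    # Global pooling collapses the spatial feature map to a vector
    if pooling == "avg":
        x = layers.GlobalAveragePooling2D()(x)
    else:
        x = layers.GlobalMaxPooling2D()(x)
    # Configurable dense stack with dropout after each layer
    for units in dense_units:
        x = layers.Dense(units, activation="relu")(x)
        x = layers.Dropout(dropout)(x)
    outputs = layers.Dense(NUM_CLASSES, activation="softmax")(x)
    return keras.Model(base_model.input, outputs)
```

In the actual experiments, `base_model` would be one of the pre-trained backbones (e.g. `keras.applications.EfficientNetV2S(include_top=False, weights="imagenet")`).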

Severity Classes

Class  Code      Description
0      burn_1    1st degree burn
1      burn_2p   2nd degree deep burn
2      burn_2s   2nd degree superficial burn
3      burn_3    3rd degree burn
4      burn_not  No burn / healthy skin

Dataset

Split       Images
Train        1,698
Validation     300
Test           503
Total        2,501

Metadata is stored in pellis_treino.csv and pellis_teste.csv with one-hot encoding per class. The gold column flags reference images validated by specialists, i.e., samples that fully meet the clinical criteria for each class.
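The one-hot layout can be consumed directly with pandas. The column names below are assumptions for illustration; the actual headers in `pellis_treino.csv` may differ:

```python
import pandas as pd

# Hypothetical metadata layout: one column per class code plus a `gold` flag.
df = pd.DataFrame({
    "image":    ["a.jpg", "b.jpg"],
    "burn_1":   [1, 0],
    "burn_2p":  [0, 0],
    "burn_2s":  [0, 1],
    "burn_3":   [0, 0],
    "burn_not": [0, 0],
    "gold":     [True, False],
})

class_cols = ["burn_1", "burn_2p", "burn_2s", "burn_3", "burn_not"]
df["label"] = df[class_cols].idxmax(axis=1)  # recover class name from one-hot
gold_only = df[df["gold"]]                   # specialist-validated subset
```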

Images are not included in this repository, for two reasons: file volume and, more importantly, the sensitive nature of the content, as these are real clinical photographs of patients with severe burns.


Evaluated Architectures

Family          Models
EfficientNetV2  B3, S, M, L
Inception       InceptionV3, InceptionResNetV2
ResNet          ResNet50V2, ResNet152V2
ResNeXt         ResNeXt50, ResNeXt101

All networks use pre-trained ImageNet weights with full fine-tuning.


Techniques

Transfer Learning and Fine-tuning: ImageNet weights as the starting point, with all layers unfrozen for training.

Data Augmentation: random rotation (±15%), translation (±10%), zoom (±20%), and horizontal flip.
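Using Keras preprocessing layers, this augmentation pipeline could look as follows. Note that Keras factors are fractions, so `RandomRotation(0.15)` means up to ±15% of a full turn; whether the project's "±15%" refers to this convention or to degrees is an assumption here:

```python
from tensorflow import keras
from tensorflow.keras import layers

# Sketch of the described augmentation, assuming Keras preprocessing layers.
augment = keras.Sequential([
    layers.RandomRotation(0.15),            # rotation ±15%
    layers.RandomTranslation(0.10, 0.10),   # translation ±10% in h and w
    layers.RandomZoom(0.20),                # zoom ±20%
    layers.RandomFlip("horizontal"),        # horizontal flip only
])
```

Applied with `training=True`, these layers perturb each batch at train time and act as identity at inference.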

Attention Blocks: optional per experiment, either CBAM (channel and spatial attention) or Squeeze-and-Excitation.

Training Callbacks: early stopping (patience of 10 epochs), ReduceLROnPlateau (factor 0.7, patience of 2 epochs), and ModelCheckpoint with periodic and best-model saving.
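With the values listed above, a Keras callback configuration might look like this. File paths and the `monitor`/`restore_best_weights` choices are illustrative assumptions, not taken from `auxiliar/callbacks.py`:

```python
from tensorflow import keras

# Callback setup matching the stated patience/factor values; paths are illustrative.
callbacks = [
    keras.callbacks.EarlyStopping(monitor="val_loss", patience=10,
                                  restore_best_weights=True),
    keras.callbacks.ReduceLROnPlateau(monitor="val_loss", factor=0.7,
                                      patience=2),
    # Best-model saving
    keras.callbacks.ModelCheckpoint("modelos_salvos/best.keras",
                                    save_best_only=True),
    # Periodic (per-epoch) saving
    keras.callbacks.ModelCheckpoint("modelos_salvos/epoch_{epoch:02d}.keras",
                                    save_freq="epoch"),
]
```

The list would be passed to `model.fit(..., callbacks=callbacks)`.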


Repository Structure

PROJ-01-Pellis/
├── pellis_treinamento.ipynb    # Main notebook: training and evaluation
├── pellis_treino.csv           # Training set metadata (1,998 rows)
├── pellis_teste.csv            # Test set metadata (503 rows)
├── auxiliar/
│   ├── arquiteturas.py         # Model definition and building
│   ├── attention_blocks.py     # CBAM and Squeeze-Excitation blocks
│   ├── callbacks.py            # Training callbacks configuration
│   ├── dataset.py              # Image loading and preprocessing
│   └── avaliar.py              # Evaluation, metrics and confusion matrices
└── README.md

Outputs generated during training:

redes/{NETWORK}/
├── modelos_salvos/             # Weights saved per epoch and best model
└── log_treino/                 # Per-epoch metrics log (CSV)

log_teste/
├── log_teste.csv               # Aggregated results across all experiments
└── previsoes/                  # Per-image predictions for each evaluated model

Installation and Usage

Requirements

pip install tensorflow pandas numpy scikit-learn matplotlib seaborn gputil classification-models efficientnet_v2

Training

Open the main notebook and run the cells in order:

jupyter notebook pellis_treinamento.ipynb

The notebook automatically configures the GPU, loads the data, instantiates the chosen model via auxiliar/arquiteturas.py and starts training with the configured callbacks.

Experiment Configuration

The notebook is designed to facilitate architecture and hyperparameter swapping between runs, allowing systematic comparison of different combinations. All parameters are set at the top of the notebook without requiring code changes:

Parameter      Options
Network        EfficientNetV2 B3/S/M/L, InceptionV3, InceptionResNetV2, ResNet50V2, ResNet152V2, ResNeXt50, ResNeXt101
Epochs         100 (with early stopping)
Learning rate  configurable
Optimizer      Adam, SGD, RMSprop
Dense layers   configurable (e.g. [1024, 256])
Dropout        configurable per layer
Attention      None, SE or CBAM
Pooling        Global Average Pooling or Global Max Pooling

Each run generates an automatic ID and results are consolidated in log_teste/log_teste.csv for cross-experiment comparison.


Team

Name                       Role
Heitor de Castro Teixeira  Development and data science
Rodrigo Bigas              Backend
Joel Sotero da Cunha Neto  Coordinator

Developed at the VORTEX Laboratory of Tec Unifor, based on clinical research conducted by Gilka de Albuquerque Forte Aguiar in the Professional Master's in Technology and Innovation in Nursing (MPTIE), supervised by Prof. Dr. Rita Neuma Dantas Cavalcante de Abreu and co-supervised by Prof. Dr. José Eurico de Vasconcelos Filho.


License

Property of the VORTEX Laboratory, Universidade de Fortaleza (UNIFOR).

About

As the final product of the dissertation by Gilka de Albuquerque Forte Aguiar, a graduate of the Professional Master's in Technology and Innovation in Nursing, the "Pellis" application was built and validated as a support tool that helps healthcare professionals obtain information about burns and associated treatments.
