In transfer learning, it is common to freeze the pretrained layers and add new ones when training on the target dataset. A subsequent fine-tuning step, in which the previously frozen layers are unfrozen, is a common way to adapt the model to the final dataset. SimbaML should be able to support this workflow.
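To illustrate the intended workflow, the following is a minimal sketch of the freeze, add, and fine-tune pattern in plain PyTorch; it does not reflect SimbaML's API, and the model architecture and layer names are purely illustrative assumptions.

```python
import torch
from torch import nn

# Hypothetical pretrained backbone (stand-in for a model trained on source data).
pretrained = nn.Sequential(
    nn.Linear(10, 64), nn.ReLU(),
    nn.Linear(64, 64), nn.ReLU(),
)
head = nn.Linear(64, 1)                  # new layer added for the target task
model = nn.Sequential(pretrained, head)

# Step 1: freeze the pretrained layers and train only the new head on the target dataset.
for param in pretrained.parameters():
    param.requires_grad = False
optimizer = torch.optim.Adam(head.parameters(), lr=1e-3)
# ... train on the target dataset ...

# Step 2: unfreeze the pretrained layers and fine-tune the whole model,
# typically with a smaller learning rate.
for param in pretrained.parameters():
    param.requires_grad = True
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
# ... continue training (fine-tuning) on the target dataset ...
```

The two-stage schedule first adapts the new layers without disturbing the pretrained weights, then refines all weights jointly once the new head has converged to a reasonable solution.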