79391143

Date: 2025-01-27 14:34:33
Score: 1
Natty:
Report link

I would recommend freezing the parameters of the base_model; otherwise they will be modified by the training process. Since your last layers are not yet trained, their random gradients will likely have a negative impact on the pretrained weights.
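As a minimal sketch of that first phase (assuming a Keras setup with a hypothetical ResNet50 backbone, a binary tumor/no-tumor head, and placeholder datasets train_ds/val_ds; your actual model may differ):

```python
import tensorflow as tf
from tensorflow import keras

# Hypothetical backbone; any keras.applications model works the same way.
base_model = keras.applications.ResNet50(
    weights="imagenet", include_top=False, input_shape=(224, 224, 3)
)

# Freeze the pretrained weights so the still-random custom head
# cannot corrupt them during the first training phase.
base_model.trainable = False

inputs = keras.Input(shape=(224, 224, 3))
x = base_model(inputs, training=False)  # keep BatchNorm layers in inference mode
x = keras.layers.GlobalAveragePooling2D()(x)
outputs = keras.layers.Dense(1, activation="sigmoid")(x)  # e.g. tumor / no tumor
model = keras.Model(inputs, outputs)

model.compile(optimizer=keras.optimizers.Adam(),
              loss="binary_crossentropy", metrics=["accuracy"])
# model.fit(train_ds, epochs=10, validation_data=val_ds)
```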

If you want to fine-tune them after training your custom layers, you can unfreeze the base_model layers and train for one or two epochs with a very low learning rate.
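Continuing the sketch above, the optional fine-tuning phase would unfreeze the backbone and recompile with a much lower learning rate (the rate 1e-5 here is only an illustrative value):

```python
# Phase 2 (optional): fine-tune the whole network.
base_model.trainable = True

# Recompiling is required for the trainable change to take effect;
# a very low learning rate keeps the pretrained weights from being destroyed.
model.compile(optimizer=keras.optimizers.Adam(learning_rate=1e-5),
              loss="binary_crossentropy", metrics=["accuracy"])
# model.fit(train_ds, epochs=2, validation_data=val_ds)
```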

That said, it is possible that the base_model simply does not suit your data: it was trained for a completely different task on data from another distribution, namely RGB photographs rather than brain tumor images.

See: https://keras.io/guides/transfer_learning/#transfer-learning-amp-finetuning

Reasons:
  • Long answer (-0.5):
  • No code block (0.5):
  • Low reputation (1):
Posted by: Juan Manuel Rodriguez