79334400

Date: 2025-01-06 21:37:04
Score: 1
Natty:
Report link

Yes, this kind of memory usage is completely normal as far as I have seen. I worked on a project doing a comparative study of a few custom-made models against transfer learning models. Every time I ran one of the models (or worse, all of them sequentially in a single notebook), it would use up all the VRAM available on the P100 on Kaggle, and memory usage shot up to over 22 GB the moment model fitting started.
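For what it's worth, if you are on TensorFlow/Keras (an assumption on my part), TensorFlow pre-allocates nearly all GPU memory by default, so the reported VRAM usage looks maxed out regardless of what the model actually needs. A minimal sketch to enable on-demand allocation instead, run before any GPU work:

    import tensorflow as tf

    # Assumes a TensorFlow/Keras workflow. By default TF grabs almost all
    # GPU memory up front; memory growth makes it allocate on demand.
    # This must run before the GPU is first used in the session.
    for gpu in tf.config.list_physical_devices("GPU"):
        tf.config.experimental.set_memory_growth(gpu, True)

With that set, the memory numbers you see reflect actual usage rather than the pre-allocation.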

I would suggest using a lower batch size during fitting to reduce resource usage slightly, though I am no expert and pretty much a novice. I would also suggest trying different learning rates and optimizers, and reducing the number of epochs, since in my experience there is barely any noticeable difference after about 40-50 epochs; see the sketch below for how these tweaks fit together.
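Here is a minimal Keras sketch of those suggestions (assuming a Keras workflow; the tiny model and random data are just placeholders for your own): a smaller batch size, an explicit learning rate, and early stopping so training ends once validation loss plateaus instead of always running a fixed 40-50+ epochs.

    import numpy as np
    from tensorflow import keras

    # Placeholder data and model standing in for your own.
    x_train = np.random.rand(1000, 32).astype("float32")
    y_train = keras.utils.to_categorical(np.random.randint(0, 4, 1000), 4)

    model = keras.Sequential([
        keras.Input(shape=(32,)),
        keras.layers.Dense(64, activation="relu"),
        keras.layers.Dense(4, activation="softmax"),
    ])

    model.compile(
        optimizer=keras.optimizers.Adam(learning_rate=1e-4),  # try a few rates
        loss="categorical_crossentropy",
        metrics=["accuracy"],
    )

    # Stop once validation loss stops improving, rather than running a
    # fixed number of epochs.
    early_stop = keras.callbacks.EarlyStopping(
        monitor="val_loss", patience=5, restore_best_weights=True
    )

    model.fit(
        x_train, y_train,
        validation_split=0.2,
        batch_size=16,   # smaller batches lower peak VRAM during fitting
        epochs=50,
        callbacks=[early_stop],
    )

The batch size and patience values here are starting points, not recommendations; halving the batch size roughly halves the activation memory per step at the cost of longer epochs.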

Reasons:
  • Long answer (-0.5):
  • No code block (0.5):
  • Low reputation (1):
Posted by: Subho