79461074

Date: 2025-02-23 10:39:29
Score: 2.5
Natty:
for i in range(100):
    optimizer.zero_grad()                          # clear gradients from the previous step
    output = model(**model_inputs, labels=labels)  # forward pass
    loss = output.loss
    loss.backward()                                # backward pass
    optimizer.step()                               # update weights
print('Fine-tuning ended')

I can run the above loop only once in Google Colab; the notebook crashes if I increase the range even to 2 or 3. I experience the same problem on a Databricks ML instance. What alternatives do you recommend for running this notebook in the cloud with the AdamW optimizer as suggested above?
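A crash this early in a training loop is often an out-of-memory issue rather than a problem with AdamW itself. One common memory-hygiene pattern is to record the loss with `.item()` (so the computation graph is not retained across iterations) and to use `zero_grad(set_to_none=True)` (which frees gradient tensors instead of zeroing them in place). The sketch below assumes a tiny stand-in `torch.nn.Linear` model and synthetic data, since the original model and inputs are not shown; it illustrates the pattern, not the asker's exact setup.

```python
import torch
from torch.optim import AdamW

# Hypothetical stand-ins for the real model and inputs from the question.
model = torch.nn.Linear(4, 1)
optimizer = AdamW(model.parameters(), lr=1e-3)
x = torch.randn(8, 4)
y = torch.randn(8, 1)

losses = []
for i in range(100):
    optimizer.zero_grad(set_to_none=True)  # frees grad tensors instead of zeroing them
    loss = torch.nn.functional.mse_loss(model(x), y)
    loss.backward()
    optimizer.step()
    losses.append(loss.item())  # .item() returns a Python float; the graph is released

print('Fine-tuning ended')
```

For a large transformer, the same pattern applies; if memory is still exhausted, reducing the batch size or enabling gradient checkpointing on the model are the usual next steps.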

Reasons:
  • Has code block (-0.5):
  • Ends in question mark (2):
  • Low reputation (1):
Posted by: Balachandar Ganesan