79712079

Date: 2025-07-23 14:25:36
Score: 1.5
Natty:
Report link
  1. Your model may be suffering from an insufficient number of training samples

  2. I think the problem may be in how the dataset is structured for training. Check again how you prepared the dataset -- there appears to be an issue with the tokenize function

  3. Also, I don't think you should rely on a global tokenizer -- pass the tokenizer explicitly instead
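
Point 3 can be sketched as follows. This is a minimal illustration of passing the tokenizer as an argument rather than reading a module-level global; the `tokenize_function` and `toy_tokenizer` names are hypothetical stand-ins (with HuggingFace `datasets`, the equivalent would be `dataset.map(tokenize_function, batched=True, fn_kwargs={"tokenizer": tokenizer})`):

```python
def tokenize_function(examples, tokenizer):
    # The tokenizer arrives as an explicit parameter, so the function
    # has no hidden dependency on module-level state.
    return {"input_ids": [tokenizer(text) for text in examples["text"]]}

# Toy whitespace tokenizer standing in for a real one
# (e.g. one loaded via transformers.AutoTokenizer).
def toy_tokenizer(text):
    return text.split()

batch = {"text": ["hello world", "fix the dataset"]}
tokenized = tokenize_function(batch, tokenizer=toy_tokenizer)
print(tokenized["input_ids"])  # [['hello', 'world'], ['fix', 'the', 'dataset']]
```

Passing the tokenizer explicitly also makes the mapping function easy to unit-test with a stub, as shown above.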

Reasons:
  • No code block (0.5):
  • Low reputation (1):
Posted by: Hadeynike