The GPT-2 tokenizer is not compatible with the newer OpenAI models (including davinci-002 and gpt-3.5-turbo): they use a different encoding, so the token IDs and counts it produces won't match what those models actually see. You'd have to use something like tiktoken instead, which resolves the correct encoding for a given model.

For example:

import tiktoken

# Resolve the encoding that gpt-3.5-turbo actually uses
tokenizer = tiktoken.encoding_for_model('gpt-3.5-turbo')
tokens = tokenizer.encode('airplane')
print(tokens)       # token IDs
print(len(tokens))  # token count
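
To see the incompatibility in practice, here is a minimal sketch (assuming the tiktoken package is installed, and using a sample sentence of my own) that compares the GPT-2 encoding with the encoding tiktoken resolves for gpt-3.5-turbo:

import tiktoken

text = 'The quick brown fox jumps over the lazy dog.'

# The encoding the GPT-2 tokenizer corresponds to
gpt2_enc = tiktoken.get_encoding('gpt2')
# The encoding resolved for gpt-3.5-turbo
turbo_enc = tiktoken.encoding_for_model('gpt-3.5-turbo')

print(gpt2_enc.name, len(gpt2_enc.encode(text)))
print(turbo_enc.name, len(turbo_enc.encode(text)))
# The token IDs (and usually the counts) differ, so GPT-2-based
# token counts won't line up with what gpt-3.5-turbo is billed for.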