The GPT-2 tokenizer is not compatible with newer OpenAI models such as davinci-002 and gpt-3.5-turbo, which use different encodings. You'd have to use something like tiktoken instead.
For example:
import tiktoken

# Look up the encoding that matches a given model name
tokenizer = tiktoken.encoding_for_model('gpt-3.5-turbo')
print(tokenizer.encode('airplane'))  # prints a list of token IDs
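
As a sanity check, you can compare the two encodings side by side; the same string maps to different token IDs under each one, which is why GPT-2 token counts don't transfer to the newer models. A minimal sketch (tiktoken ships a 'gpt2' encoding you can load directly):

import tiktoken

gpt2 = tiktoken.get_encoding('gpt2')
gpt35 = tiktoken.encoding_for_model('gpt-3.5-turbo')

# Different vocabularies, so the ID lists (and often their lengths) differ
print(gpt2.encode('airplane'))
print(gpt35.encode('airplane'))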