I think the issue is that you're using the same max_completion_tokens value for both GPT-4 and GPT-5.
GPT-5 needs considerably more completion tokens than GPT-4 to finish a response, since part of that budget goes to its internal reasoning tokens before the visible answer is produced.
In my opinion, you should raise max_completion_tokens to around 5000.
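Here is a minimal sketch of what I mean, assuming the OpenAI Python SDK and the chat.completions endpoint (the model name and prompt are just placeholders for your own):

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# GPT-5 spends part of the completion budget on reasoning tokens,
# so give it a larger max_completion_tokens than you used for GPT-4.
response = client.chat.completions.create(
    model="gpt-5",
    messages=[{"role": "user", "content": "Summarize the plot of Hamlet."}],
    max_completion_tokens=5000,  # raised from the value that worked for GPT-4
)

print(response.choices[0].message.content)
# Check how much of the budget was actually consumed:
print(response.usage.completion_tokens)
```

If the response still comes back empty or truncated, checking usage.completion_tokens should tell you whether the limit is still being hit.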
Please try that and let me know the result.