OpenAI GPT-3 API error: “This model’s maximum context length is 4097 tokens”
The model's token limit is shared between the prompt and the completion: the number of tokens in the prompt plus the max_tokens requested for the completion must not exceed the context length of the particular OpenAI model. As stated in the official OpenAI article: Depending on the model used, requests can use up to 4097 tokens shared between prompt and completion. … Read more
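A minimal sketch of the idea, not taken from the article: count the prompt's tokens with tiktoken and cap max_tokens so that prompt plus completion stays within the 4097-token limit. The model name, prompt text, and the legacy openai<1.0 Completion call are assumptions for illustration.

```python
# Sketch: keep prompt_tokens + max_tokens within the model's context length.
import tiktoken

MODEL = "text-davinci-003"   # assumed GPT-3 model with a 4097-token context
CONTEXT_LIMIT = 4097

prompt = "Summarize the following text: ..."  # placeholder prompt

# Count how many tokens the prompt consumes for this model's tokenizer.
enc = tiktoken.encoding_for_model(MODEL)
prompt_tokens = len(enc.encode(prompt))

# Whatever is left over is the most the completion can use.
max_completion_tokens = CONTEXT_LIMIT - prompt_tokens
print(f"Prompt uses {prompt_tokens} tokens; "
      f"max_tokens can be at most {max_completion_tokens}")

# With the legacy openai<1.0 client, the capped value would be passed like this:
# import openai
# response = openai.Completion.create(
#     model=MODEL,
#     prompt=prompt,
#     max_tokens=max_completion_tokens,
# )
```

In practice you would reserve a somewhat smaller max_tokens than the exact remainder, since requesting the full leftover leaves no margin if the prompt's token count is estimated with a different tokenizer than the one the model actually uses.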