ERROR: This model’s maximum context length is 8192 tokens. However, you requested 8875 tokens (7875 in the messages, 1000 in the completion). Please reduce the length of the messages or completion.