Hacker News

> Maybe something's timing out with the longer o1 response times?

Let me look into this. One issue is that OpenAI doesn't expose a streaming endpoint in the API for o1 models, so it's possible an HTTP timeout is occurring somewhere in the stack. Thanks for the report.
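For anyone hitting this: since the o1 API is non-streaming, the client has to hold the connection open for the model's entire thinking time, so a default read timeout can easily fire first. A minimal sketch of the workaround with the stdlib (the endpoint and model name are from OpenAI's REST API; the 600-second figure is just an assumed worst case, not an official number):

```python
import json
import urllib.request

# Non-streaming o1 calls return the whole completion in one response,
# so the HTTP read timeout must cover the full "thinking" time.
# 600s is an assumption for a worst case, not an official figure.
O1_READ_TIMEOUT_S = 600

def build_request(prompt: str, api_key: str) -> urllib.request.Request:
    """Build a non-streaming chat completion request against OpenAI's REST API."""
    body = json.dumps({
        "model": "o1-preview",
        "messages": [{"role": "user", "content": prompt}],
        # No "stream": True here -- o1 models rejected streaming at the
        # time of this thread, so nothing keeps the connection alive.
    }).encode()
    return urllib.request.Request(
        "https://api.openai.com/v1/chat/completions",
        data=body,
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
    )

# At the call site, pass the long timeout explicitly:
# with urllib.request.urlopen(build_request(prompt, key),
#                             timeout=O1_READ_TIMEOUT_S) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])
```

Note that any proxy or load balancer between you and the API can still cut the connection with its own idle timeout, so raising the client-side value alone may not be enough.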



I've gotten this as well, even on very short code snippets. I type in a prompt and sometimes it doesn't respond with anything and gets stuck on the thinking step; other times it gets halfway through generating the response and then stalls.

https://chatgpt.com/c/66e3a628-2814-8012-a6c5-33721b78cb99



