
Looks like the docs have a typo:

    Recommended context: 65,536 tokens (can be increased)
That should be the recommended output length, as stated in the official docs:

    Adequate Output Length: We recommend using an output length of 65,536 tokens for most queries, which is adequate for instruct models.
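For anyone wiring this up, here's a minimal sketch of setting that output length through an OpenAI-compatible endpoint (the base_url, api_key, and model name below are placeholders, not something from the docs):

    # Minimal sketch, assuming the model is served behind an OpenAI-compatible
    # endpoint (e.g. llama-server or vLLM); base_url and model name are placeholders.
    from openai import OpenAI

    client = OpenAI(base_url="http://localhost:8000/v1", api_key="none")

    resp = client.chat.completions.create(
        model="Qwen3-Coder-480B-A35B-Instruct",
        messages=[{"role": "user", "content": "Write a quicksort in Python."}],
        max_tokens=65536,  # the recommended *output* length, not the context window
    )
    print(resp.choices[0].message.content)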


Oh thanks - so the output can be any length you like. I'm also making 1 million context-length GGUFs: https://huggingface.co/unsloth/Qwen3-Coder-480B-A35B-Instruc...
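To make the context vs. output distinction concrete, a rough sketch with llama-cpp-python loading one of the GGUFs (the file name and n_ctx value are illustrative only; you'd need hardware to match):

    # Rough sketch with llama-cpp-python; the model file name is a placeholder
    # and n_ctx is illustrative (the 1M-token builds allow much larger values).
    from llama_cpp import Llama

    llm = Llama(
        model_path="Qwen3-Coder-480B-A35B-Instruct-Q2_K.gguf",  # placeholder path
        n_ctx=262144,  # context window: how many tokens the model can attend over
    )

    out = llm.create_chat_completion(
        messages=[{"role": "user", "content": "Refactor this function to be iterative."}],
        max_tokens=65536,  # output length: how many tokens it may generate in reply
    )
    print(out["choices"][0]["message"]["content"])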



