
You can use the distilled version on Groq for free for the time being. Groq is amazing but frequently has capacity issues or other random bugs.

Perhaps you could set up Groq as your primary and then fall back to Fireworks, etc., by using litellm or another proxy.
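As a rough sketch of what that could look like with litellm's Router (the model IDs, group names, and env vars here are assumptions, so check your providers' catalogs):

    # Groq-primary / Fireworks-fallback routing via litellm's Router.
    # Model IDs and env var names are assumptions -- substitute your own.
    import os
    from litellm import Router

    router = Router(
        model_list=[
            {
                "model_name": "deepseek-r1-distill",  # logical group name
                "litellm_params": {
                    "model": "groq/deepseek-r1-distill-llama-70b",
                    "api_key": os.environ["GROQ_API_KEY"],
                },
            },
            {
                "model_name": "deepseek-r1-fireworks",
                "litellm_params": {
                    # Hypothetical Fireworks model path; check their model list.
                    "model": "fireworks_ai/accounts/fireworks/models/deepseek-r1",
                    "api_key": os.environ["FIREWORKS_API_KEY"],
                },
            },
        ],
        # If the Groq group errors out (capacity, rate limits), retry on Fireworks.
        fallbacks=[{"deepseek-r1-distill": ["deepseek-r1-fireworks"]}],
    )

    resp = router.completion(
        model="deepseek-r1-distill",
        messages=[{"role": "user", "content": "Hello"}],
    )
    print(resp.choices[0].message.content)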



Do you know of any assistants for JetBrains that can plug into Groq + DeepSeek?


I do not, as I'm not in that ecosystem, but Groq is OpenAI-compatible, so any tool that speaks the OpenAI API (99% do) and lets you set your own base URL should work.

For example, many tools let you use local LLMs. Instead of pointing them at the local LLM's URL, you would just plug in the Groq URL and key.

see: https://console.groq.com/docs/openai
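Roughly, with the standard OpenAI Python client pointed at Groq's OpenAI-compatible endpoint (the model name below is an assumption; pick whichever distill Groq is currently serving):

    # Point the standard OpenAI client at Groq's OpenAI-compatible endpoint.
    # The model name is an assumption; check the Groq docs linked above.
    import os
    from openai import OpenAI

    client = OpenAI(
        base_url="https://api.groq.com/openai/v1",
        api_key=os.environ["GROQ_API_KEY"],
    )

    resp = client.chat.completions.create(
        model="deepseek-r1-distill-llama-70b",
        messages=[{"role": "user", "content": "Explain tail recursion in one paragraph."}],
    )
    print(resp.choices[0].message.content)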


Continue.dev is available for JetBrains, though the plugin is not as good as its VS Code counterpart. You can plug in any OpenAI-compatible API. Under experimental settings, you can also define an applyCode model (and other roles), which you could set to a faster, cheaper model (e.g. Sonnet).
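Something like this in Continue's config.json, as a sketch from memory rather than gospel (field names, model IDs, and the exact experimental key may differ by plugin version):

    {
      "models": [
        {
          "title": "DeepSeek R1 Distill (Groq)",
          "provider": "openai",
          "model": "deepseek-r1-distill-llama-70b",
          "apiBase": "https://api.groq.com/openai/v1",
          "apiKey": "<GROQ_API_KEY>"
        },
        {
          "title": "Claude 3.5 Sonnet",
          "provider": "anthropic",
          "model": "claude-3-5-sonnet-latest",
          "apiKey": "<ANTHROPIC_API_KEY>"
        }
      ],
      "experimental": {
        "modelRoles": {
          "applyCodeBlock": "Claude 3.5 Sonnet"
        }
      }
    }

The applyCodeBlock value should match the title of one of the entries under "models".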



