I don't, as I'm not in that ecosystem, but Groq's API is OpenAI-compatible, so any tool that speaks the OpenAI API (which is nearly all of them) and lets you set your own base URL should work.
For example, many tools let you use local LLMs. Instead of pointing them at the local LLM's URL, you just plug in the Groq URL and your API key.
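To make that concrete, here's a minimal stdlib-only sketch: the request body is identical whether you're talking to a local LLM or to Groq; only the URL and key change. The model name is just an example, and the key is a placeholder.

```python
import json
import urllib.request

# Groq's OpenAI-compatible endpoint; a local LLM would be something like
# http://localhost:11434/v1/chat/completions instead.
GROQ_URL = "https://api.groq.com/openai/v1/chat/completions"
API_KEY = "YOUR_GROQ_API_KEY"  # placeholder

def build_request(url: str, key: str, model: str, prompt: str) -> urllib.request.Request:
    """Build a standard OpenAI-style chat completion request."""
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode()
    return urllib.request.Request(
        url,
        data=body,
        headers={
            "Authorization": f"Bearer {key}",
            "Content-Type": "application/json",
        },
    )

# Same call shape as for a local model; only the URL/key differ.
req = build_request(GROQ_URL, API_KEY, "llama-3.1-8b-instant", "Hello")
print(req.full_url)  # → https://api.groq.com/openai/v1/chat/completions
```

Sending it is then just `urllib.request.urlopen(req)` (or swap in the `openai` client with `base_url` set to Groq).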
Continue.dev is available for JetBrains, though the plugin is not as good as its VSCode counterpart. You can plug in any OpenAI-compatible API. Under the experimental settings, you can also define an applyCode model (and others), which you could set to a faster, cheaper one (e.g. Sonnet).
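Roughly what that looks like in Continue's config.json (key names from memory, so double-check against their docs; the model name and apiKey are placeholders):

```json
{
  "models": [
    {
      "title": "Groq Llama",
      "provider": "openai",
      "model": "llama-3.1-8b-instant",
      "apiBase": "https://api.groq.com/openai/v1",
      "apiKey": "YOUR_GROQ_API_KEY"
    }
  ],
  "experimental": {
    "modelRoles": {
      "applyCodeBlock": "Groq Llama"
    }
  }
}
```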
Perhaps you could set up Groq as your primary and then fall back to Fireworks, etc. by using litellm or another proxy.
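With the litellm proxy, that could be a config along these lines (a sketch from memory; the model names are examples and the exact fallback syntax is worth verifying against litellm's docs):

```yaml
model_list:
  - model_name: fast-llama
    litellm_params:
      model: groq/llama-3.1-8b-instant
      api_key: os.environ/GROQ_API_KEY
  - model_name: fireworks-llama
    litellm_params:
      model: fireworks_ai/accounts/fireworks/models/llama-v3p1-8b-instruct
      api_key: os.environ/FIREWORKS_API_KEY

router_settings:
  # if a request to fast-llama fails, retry against fireworks-llama
  fallbacks: [{"fast-llama": ["fireworks-llama"]}]
```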