> As soon as the AI models run on an external server, a buy-once model no longer works (or at least not at an acceptable one-time price). The constant server cost is just too high.
Tell this to Rabbit and their R1 device, which comes with unlimited LLM usage. I guess they just rate limit through having a bad UX though. /s
But yes, you're right, this is what I was getting at: LLM usage is expensive enough to require a subscription model.