A better monetization model for LLM applications
Bring your own keys into an affiliate relationship
I’m noticing a problem as LLM-enhanced apps pop up everywhere. For coding you have Cursor, but now there’s also a terminal called Warp, and it costs $15/month. For individuals, consultants, and small or even medium-sized companies, this isn’t a workable pricing model. All apps were already turning into subscriptions, and the cost of LLMs is accelerating that.
What compounds the problem is that, because everyone is uncomfortable with the surprise bills that pay-for-what-you-use can produce, many of these apps charge a single flat monthly fee. But the cost of the LLMs is not flat: it grows linearly with usage. To stay within the margin of the flat fee and have a chance at a profit, the app has to throttle LLM usage. That means using cheaper models, which yield worse results on average.
I think this is why, when I compare Cursor to Claude Code, I find Claude Code better: I’m giving Anthropic a lot more money than I give Cursor. And I’m happy with that, because I can use Claude in many other ways, whereas Cursor is a single-purpose application.
I think from now on, for each LLM powered application, I either want to be able to put my own keys in, or have pay-as-you-go with a lot of transparency. When I need it to work, I want it to work well. When I’m not using it, I want it to cost nothing.
There’s another solution, though. LLM providers could run affiliate programs where other companies earn a commission on the token usage they generate. Using Warp as an example: Warp wouldn’t ask me for $15/month. Warp would ask me for my Claude API key, tag all of its requests as originating from Warp, and Anthropic would pay Warp a fee proportional to the usage generated.
This is a win-win-win: Anthropic gets more token usage (more customer expansion), Warp gets more customers (I won’t pay $15/month, but I would plug in my key), and users get another LLM tool they otherwise wouldn’t have.
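The mechanics could be minimal. Here is a sketch, in Python, of what the attribution might look like: the app forwards the user’s own key and adds an identifying header, and the provider pays out a share of the billed usage. The `X-Affiliate-Id` header and the 10% commission rate are my inventions for illustration; no provider offers this today. (The `x-api-key` and `anthropic-version` headers are the real ones Anthropic’s API expects.)

```python
# Hypothetical affiliate flow: the app sends the user's own API key (BYOK)
# plus an attribution header, so the provider can credit the app for the
# token usage it generates. The X-Affiliate-Id header and the commission
# rate are assumptions, not a real Anthropic feature.

AFFILIATE_ID = "warp"      # hypothetical ID the app registered with the provider
COMMISSION_RATE = 0.10     # hypothetical 10% revenue share on generated usage

def build_request_headers(user_api_key: str) -> dict:
    """Headers for a provider API call, attributing the usage to the app."""
    return {
        "x-api-key": user_api_key,         # the user's own key, not the app's
        "anthropic-version": "2023-06-01", # real Anthropic versioning header
        "X-Affiliate-Id": AFFILIATE_ID,    # hypothetical attribution header
    }

def affiliate_payout(billed_usd: float) -> float:
    """What the provider would owe the app for the usage it drove."""
    return billed_usd * COMMISSION_RATE

headers = build_request_headers("sk-ant-...")
print(headers["X-Affiliate-Id"])  # -> warp
print(affiliate_payout(100.0))    # -> 10.0 (on $100 of billed tokens)
```

The point of the sketch: attribution is one header, and the payout is one multiplication on usage the provider already meters for billing. The hard part is the business agreement, not the engineering.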


