
I don't believe this for a second. Inference margins are huge; if they stopped R&D tomorrow they would be making an incredible amount of money, but they can't stop investing because they have competitors.

It's all pretty simple.



Do we have any evidence of this, i.e. that inference costs are actually less than the subscription price?


No, and nobody claiming this seems to be posting evidence when confronted.

I guess we'll find out in 2-3 years.


I don't know if the site is just broken for me at the moment, but this has been used to track how much people were costing the companies (based on tokens per $, if I remember correctly): https://www.viberank.app/

One user in particular ran up a $50k bill for 1 month of usage, while paying $200 for the month: https://bsky.app/profile/edzitron.com/post/3lwimmfvjds2m
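A rough back-of-the-envelope sketch, taking those reported figures at face value (the $50k is the tracker's estimate of provider-side usage cost, which is itself an assumption about what it measures):

    # Sketch only: uses the numbers reported in the linked post, not audited figures.
    reported_usage_cost = 50_000   # estimated provider-side cost for the month, USD
    subscription_price = 200       # what the user actually paid for the month, USD
    ratio = reported_usage_cost / subscription_price
    print(f"usage cost is {ratio:.0f}x the subscription price")  # -> 250x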

Plenty of people are able to blow through resources pretty quickly, especially when output is non-deterministic and you have to hit "retry" a few times, or go back and forth with the model until you get what you want, with each request adding to the total tokens used in the interaction.

AI companies have been trying to clamp down, but have so far been unsuccessful, and it may never be completely possible without alienating all of their users.


This is unrelated to the original assertion: "If they charged users what it actually costs to run their service, almost nobody would use it."

5 million paying customers out of 800 million overall active users is an absolutely abysmal conversion rate. And that's counting the bulk deals with extreme discounts (like 2.50 USD/month/seat), which can only be profitable if a significant number of those seats never use ChatGPT at all.
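For context, a quick calculation with the figures quoted above (both numbers are taken from this comment, not from any audited source):

    # Conversion rate sketch using the quoted figures: 5M paying, 800M active.
    paying_users = 5_000_000
    active_users = 800_000_000
    conversion_rate = paying_users / active_users
    print(f"conversion rate: {conversion_rate:.3%}")  # -> 0.625%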


You're assuming that if ChatGPT suddenly cost $20 a month to access at all, those 795 million people would never talk to ChatGPT again?



