
The problem is it wouldn't be difficult for Amazon to start charging based on how much storage space you're actually consuming, which would pull the rug out from under them.


I'm wondering what the logistics of this would be. Say 1 customer is storing 100GB in a cloud storage service that does internal de-duping. A second customer uploads exactly the same 100GB of data. What is each charged? The whole 100GB*rate? Does each customer pay half? Do prices change for remaining customers when one "copy" of the data is removed? Without charging each customer for their total apparent storage use, I don't see how any customer can have any predictability as to their monthly bill.
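To make the unpredictability concrete, here's a toy sketch of one of the schemes mentioned above: each deduplicated block's cost is split evenly among the customers referencing it. All names and the pricing rule are hypothetical, purely for illustration.

```python
# Toy model of "split the cost" billing for deduplicated storage.
# The pricing rule and rate are hypothetical assumptions, not any real
# provider's scheme.

RATE_PER_GB = 0.02  # assumed flat monthly $/GB


def bills(block_refs):
    """block_refs: dict mapping block_id -> (size_gb, set of customer ids).
    Each block's cost is split evenly among the customers referencing it."""
    totals = {}
    for size_gb, customers in block_refs.values():
        share = size_gb * RATE_PER_GB / len(customers)
        for c in customers:
            totals[c] = totals.get(c, 0.0) + share
    return totals


# Customer A stores 100 GB; then customer B uploads identical data.
before = bills({"blob": (100, {"A"})})
after = bills({"blob": (100, {"A", "B"})})
# A's bill halves the moment B uploads, and would double back if B
# deleted the data -- so A's monthly cost depends on strangers' behavior.
```

Under this rule A pays $2.00/month alone but only $1.00/month once B shows up, which is exactly the "no predictability" problem: your bill moves when other tenants upload or delete data you can't see.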


This is exactly the point. Billing under such a dedupe scheme would be a nightmare. The only economy of scale to be had goes to the provider: either Amazon in this example, or some other service built on top of Amazon (or self-hosted) that keeps the vagaries of price fluctuation away from the customer.

Notice that the way Amazon does price arbitrage with their compute nodes allows them to pull the power on your instance at any time the spot price moves above what you bid for it. Anybody feel good about that happening to their data?



