
I don't think edge ML is competitive for now. It can handle simple things, but not the big, beefy LLMs. State-of-the-art AI chips are so expensive that you can't afford to let them idle. They need to be M:N shared - M chips serving N users to keep utilization at a maximum - and that fits the cloud perfectly.
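A rough back-of-envelope sketch of that utilization argument. All the numbers here are invented for illustration (the comment gives none): an assumed chip price, amortization window, and per-user duty cycle.

```python
# Hypothetical figures, chosen only to illustrate the M:N sharing argument:
CHIP_COST_USD = 30_000          # assumed accelerator purchase price
LIFETIME_HOURS = 3 * 365 * 24   # assumed 3-year amortization window

hourly_cost = CHIP_COST_USD / LIFETIME_HOURS

def cost_per_user_hour(users: int) -> float:
    """Amortized chip cost per user-hour when one chip is shared by `users`."""
    return hourly_cost / users

# Dedicated (1:1, edge-style) vs. shared (1:N, cloud-style):
dedicated = cost_per_user_hour(1)    # one chip sits mostly idle per user
shared = cost_per_user_hour(100)     # 100 users multiplexed onto one chip
print(f"dedicated: ${dedicated:.3f}/user-hr, shared: ${shared:.5f}/user-hr")
```

With these made-up numbers, sharing one chip across 100 users cuts the amortized hardware cost per user-hour by 100x, which is the economic pull toward the cloud.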

However, there is a middle ground: pluggable AI could potentially be a thing. The device would use an open protocol to access cloud AI. If the original company goes bankrupt, then someone else can implement the protocol and the devices can be repointed.



Yeah, definitely, especially for more complicated things like holding a conversation. The state of the art will probably always need a server.

For simpler things, small models can definitely handle them: transcription, object detection, simple classification tasks. I expect more and more tasks to fall into the category of “things ML can do on $X of hardware” as hardware and software improve.




