The fact that I can run this on mobile (iOS) devices using the C++ interface makes all the difference for me. I find that extremely refreshing among all the other Python/server/PC-only frameworks.

Running non-trivial ML workloads on the edge has been on my wishlist for years and it sounds like Apple has just the thing.



I’m somewhat concerned that it doesn’t support the ANE from the get-go (they list only CPU and GPU for now), not even for inference.

Other frameworks already have (non-Python) solutions for mobile in place, e.g. TFLite, LibTorch, and ONNX Runtime.



