Hacker News

Some of us have dipped our toes in local LLMs. To be sure, the ones I can run on my hardware always pale in comparison to the online ones. But perhaps in time the ones you can run locally will be good enough.

Or perhaps an Apple or Kagi will host an LLM with no built-in monetization skewing its answers.



You can run the model, but someone with vastly bigger resources needs to train it.


Sure. Hopefully decent pre-ad-injected models will still be around.



