Some of us have dipped our toes into local LLMs. To be sure, the ones I can run on my hardware pale in comparison with the online ones. But perhaps in time the models you can run locally will be good enough.
Or perhaps an Apple or Kagi will host an LLM with no built-in monetization skewing its answers.