That was laziness on my part: I read only your blog post. Had I clicked through to the tool, I would have seen that it says so clearly at the top. My apologies.
I'm very grateful for all the work and writing you are doing about LLMs.
Regarding your note about JSON mode with llama.cpp: I'm writing a wrapper for it in my katarismo project. It's basically the stdout suggestion from that comment, and it's working really well for me together with pocketbase.
https://gitlab.p.katarismo.com/katarismo/backend
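For anyone curious what the stdout approach looks like in practice, here is a minimal sketch. The invocation, binary name, and model path are assumptions (llama.cpp ships a `grammars/json.gbnf` grammar that constrains output to JSON); the extraction helper below is hypothetical and simply pulls the first balanced JSON object out of the raw CLI output.

```python
import json
import subprocess  # used in the commented-out invocation below


def extract_json(stdout_text: str):
    """Extract the first balanced JSON object from raw CLI stdout.

    Note: this counts braces naively, so it would be confused by
    unescaped braces inside strings; with a JSON grammar constraining
    the model's output that has not been a problem in practice.
    """
    start = stdout_text.find("{")
    if start == -1:
        raise ValueError("no JSON object found in output")
    depth = 0
    for i, ch in enumerate(stdout_text[start:], start):
        if ch == "{":
            depth += 1
        elif ch == "}":
            depth -= 1
            if depth == 0:
                return json.loads(stdout_text[start:i + 1])
    raise ValueError("unbalanced JSON object in output")


# Hypothetical invocation (binary name and model path are placeholders):
# result = subprocess.run(
#     ["./main", "-m", "model.gguf", "-p", prompt,
#      "--grammar-file", "grammars/json.gbnf"],
#     capture_output=True, text=True, check=True)
# data = extract_json(result.stdout)
```

The nice part of going through stdout is that the wrapper stays decoupled from llama.cpp's internals: any log lines or prompt echo before and after the JSON object are simply skipped over.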