This model works with GGUF-compatible llama.cpp wrappers such as the fully-local, fully-private Layla app, among other examples that get mentioned. With some prompt tweaking it's capable of a broad variety of... tasks.
It is very compliant and doesn't tend to fall back on "I'm sorry, as a large language model..." type replies.