
Citation needed, specifically regarding GPT-4 and not the other models, since that was your claim.

> That being said, you can run LLMs on-prem.

How is this relevant at all? It's just dodging the question, since the claim was about GPT-4's 32k tokens and not the crop of generic models now available on-prem (which don't support 32k tokens anyway).
