Hacker News
roarcher on Aug 15, 2023 | on: How Is LLaMa.cpp Possible?
We may "just" be neural networks that run on meat instead of silicon, but that does not mean that we're LLMs.
Waterluvian on Aug 15, 2023
Why doesn’t it?
marginalia_nu on Aug 16, 2023
It's a formal logical error. One does not follow from the other without affirming the consequent.
roarcher on Aug 16, 2023
Because not all neural networks are LLMs. A GAN is a neural network; does that make it an LLM?
hkt on Aug 16, 2023
We have inputs other than words, for a start.