scottndecker | 5 days ago | on: GPT-5.2
Still 256K input tokens. So disappointing (predictable, but disappointing).
coder543 | 5 days ago
https://platform.openai.com/docs/models/gpt-5.2
400k, not 256k.
nathants | 5 days ago
400k − 128k = 272k input tokens, per the Codex CLI source (it reserves 128k of the window for output).
coder543 | 4 days ago
If you want to be able to generate up to 128k tokens in one go successfully, then yes, that math checks out.
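For concreteness, a minimal sketch of the budgeting both replies describe: a client that reserves the model's maximum output length up front has that much less of the window left for input. The 400k and 128k figures are the ones quoted in the thread; the constant and function names are illustrative, not taken from the Codex CLI or the OpenAI docs.

    # Context-budget arithmetic from the thread (names are illustrative).
    TOTAL_CONTEXT_TOKENS = 400_000  # full context window per the linked docs page
    MAX_OUTPUT_TOKENS = 128_000     # reserved so generation can't overflow the window

    def effective_input_budget(total: int = TOTAL_CONTEXT_TOKENS,
                               reserved_output: int = MAX_OUTPUT_TOKENS) -> int:
        """Input tokens left after reserving room for the largest possible response."""
        return total - reserved_output

    print(effective_input_budget())  # 272000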
htrp | 5 days ago
It's much harder to train on longer context inputs.