
LLM n00b here.

My 2c - Prompts are the input you send to LLMs to get them to produce output. In general, LLMs are large black boxes, and the output you get is not always great. The output can often be significantly improved by changing the input, which usually means adding a ton of context - preambles, examples, etc.

A lot of the work of prompt rewriting is like boilerplate generation - it's highly repetitive - so it makes sense to write code to generate prompts. Prompt Engine is basically a way of making that prompt-rewriting work reusable.

Code Engine seems to be a way of rewriting prompts for LLMs that generate code in response to text prompts.

Chat Engine is the same for LLMs that generate chat/conversational responses.
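To make the idea concrete, here's a minimal sketch of that kind of prompt boilerplate generation - a hypothetical helper, not the actual Prompt Engine API - that assembles a reusable preamble and few-shot examples around whatever user input you pass in:

```python
# Hypothetical sketch of prompt assembly (NOT the real Prompt Engine API):
# the preamble and examples are the reusable "boilerplate"; only the
# final user input changes from call to call.
def build_prompt(preamble, examples, user_input):
    parts = [preamble]
    for question, answer in examples:
        parts.append(f"Q: {question}\nA: {answer}")
    # Leave the final answer slot empty for the LLM to complete.
    parts.append(f"Q: {user_input}\nA:")
    return "\n\n".join(parts)

prompt = build_prompt(
    "Answer concisely.",
    [("What is 2+2?", "4")],
    "What is the capital of France?",
)
```

The resulting string is what actually gets sent to the model; libraries like Prompt Engine add things like trimming old examples to fit a token budget, but the basic shape is this.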



Midjourney does not have contextual memory, but it does have a feature to always add a given suffix to any prompt. I guess this is a more powerful variant of the same sort of concept. I wonder who will "win" - specialised models or a single configurable one...



