Emphasis on "if you could simply type in English" (or insert native language here).
Sometimes the AI misinterprets what you want. Sometimes this is due to a lack of sophistication on the model's part, but I think the bigger issue at hand is the shortcomings of natural language. Even when working with an intelligent human being, you sometimes have to communicate your thoughts and ideas through pictographic means.
It's something I've thought a lot about lately. No-code/low-code is about as old as GUIs themselves, yet it has never become the de facto way of creating software. And yet for almost everything else we do, we use a GUI. No one creates illustrations on the computer programmatically (unless you're into generative art, like me). You probably aren't doing your taxes in a terminal. You get the idea.
This is a segue into my other critique of prompting as a UI. A lot of UX people think it is the ultimate UI, and natural language the premier way of communicating jobs to be done. I disagree. I think symbols and diagrams communicate more with less. Concerning language, I think that, for whatever reason, programming languages are the most natural way (that we've currently found) to communicate intent in a programmatic context, and markup languages are the most natural way to communicate intent in a layout context. That's why, despite 50+ years of GUI innovation, programmers still use text editors. The old guard has all aged out of the profession; the new guard still uses text editors.
All that being said, I think the most incredible AI demo I've seen so far is the tldraw work done by Steve Ruiz, namely Make Real. That's because you're communicating with the AI in a pictographic/symbolic context, and because it's a whiteboard, there's no limit to the symbols at your disposal to communicate your intent. The only limit is the AI's ability to interpret what those symbols mean.