A lot of game UIs are built from the ground up. Compared to the logic and math involved in 3D rendering or even 3D game logic, UI work is a pushover, though it can feel like a huge hassle because in web development you get the UI layer, more or less, for free. You notice very quickly in game development that, until fairly recently and with the exception of graphics and sound libraries, you have to code everything from scratch.
It really depends on the library/engine you are using (if you're using one!). OGRE3D integrates well with several GUI packages, Unity has a built-in GUI system that is stupendously inefficient, and the list goes on...
It's definitely a pain point for games that there isn't much choice out there for general-purpose UI. It's a big factor in why many indie games end up with really shitty, underdeveloped UIs.
Fortunately, you don't really need to reinvent Qt or CSS to get the job done. A font system and a few hand-coded controls usually suffice.
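To give a rough idea of what "a few hand-coded controls" can mean in practice, here's a minimal immediate-mode button sketch in C++. Everything in it is illustrative: UIContext, Button, DrawRect, and DrawText are made-up names standing in for whatever input state and draw calls your engine actually exposes, and the stubs just print so the snippet runs on its own.

    #include <cstdio>
    #include <string>

    struct Rect { float x, y, w, h; };

    struct UIContext {
        float mouseX = 0, mouseY = 0;
        bool  mouseDown = false;       // button state this frame
        bool  mouseWasDown = false;    // button state last frame
    };

    // Stubbed render calls: swap in your engine's sprite/text batcher.
    static void DrawRect(const Rect& r, unsigned rgba) {
        std::printf("rect (%.0f,%.0f %gx%g) color %08x\n", r.x, r.y, r.w, r.h, rgba);
    }
    static void DrawText(float x, float y, const std::string& s) {
        std::printf("text @(%.0f,%.0f): %s\n", x, y, s.c_str());
    }

    static bool Contains(const Rect& r, float px, float py) {
        return px >= r.x && px <= r.x + r.w && py >= r.y && py <= r.y + r.h;
    }

    // Draws the button and returns true on the frame the mouse is released over it.
    bool Button(UIContext& ui, const Rect& r, const std::string& label) {
        bool hovered = Contains(r, ui.mouseX, ui.mouseY);
        bool clicked = hovered && !ui.mouseDown && ui.mouseWasDown;

        DrawRect(r, hovered ? 0x666666FFu : 0x444444FFu);
        DrawText(r.x + 8, r.y + r.h * 0.5f, label);
        return clicked;
    }

    int main() {
        UIContext ui;
        // Per frame: update input, then call widgets right where you need them.
        ui.mouseX = 20; ui.mouseY = 20;
        ui.mouseWasDown = true; ui.mouseDown = false; // simulate a release this frame

        if (Button(ui, {10, 10, 120, 32}, "New Game"))
            std::printf("clicked!\n");

        ui.mouseWasDown = ui.mouseDown; // carry state into the next frame
        return 0;
    }

The appeal of this style is that there's no retained widget tree or layout engine to maintain: each control is just a function you call once per frame, which is usually plenty for a menu and a HUD.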
I think the bigger problem is in sound synthesis and processing. That arena remains mostly confined to reverb algorithms, crossfaded tracks, and pre-recorded samples, yet the subject runs about as deep as graphics does. So we're really missing out in terms of applying technology to creativity there.
"Stupendously inefficient" is a bit of a stretch for me in regards to Unity; I think it has a fairly well thought out - if admittedly rudimentary - GUI system (in its second iteration, upon which the editor application interface is also built). "Efficiency" is typically not the foremost concern for the interface in a 3d game, anyway; though in that regard I also don't consider Unity to be a slouch.