A lot still needs to be done to make the design pipeline accessible to normal people. Any clown can set up a Bambu and print objects they find or buy online, but it's a huge learning curve to get from there to actually creating your own parts, especially parts that have to integrate with the complex shapes of some existing object.
Recent examples from my life include wanting mounts for flashlights, a thing to attach to bike handlebars, a shelf bracket, a battery cover for a tool, a piece of a bird feeder, etc. Where is the interface that lets me scan the existing objects I need to integrate with and then quickly assemble prefab subcomponents into a printable design and seamlessly iterate on that?
No, VR goggles can't scan an object and produce measurements precise enough to generate even a basic 3D model that attaches to it. And they don't particularly help with refining that base model afterwards either, because the hardest part of 3D work isn't the visualization, it's the actual editing and interaction, especially for fine details.
What you need is a feedback loop between the coarse measurements from a low-precision tool (a ruler, or the goggles) and the prints you make. Does the tab you printed fit into the latch? Which part fails to fit?
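One cheap way to close that loop today is a tolerance sweep: take your rough measurement, generate a row of test tabs stepped slightly above and below it, print them all in one go, and keep whichever one fits. A minimal sketch in Python (the helper name, step size, and the 12.4 mm measurement are all made up for illustration):

```python
def tolerance_sweep(measured_mm, step_mm=0.1, n=5):
    """Generate candidate widths centered on a coarse measurement.

    Print one small test tab per width; whichever tab fits tells you
    the real dimension, however rough the original measurement was.
    """
    half = n // 2
    return [round(measured_mm + (i - half) * step_mm, 2) for i in range(n)]

# e.g. a latch slot measured at roughly 12.4 mm with a ruler
print(tolerance_sweep(12.4))  # [12.2, 12.3, 12.4, 12.5, 12.6]
```

The point is that the printer itself becomes the precision instrument: each batch of test pieces narrows the range, so two or three cheap iterations beat one expensive attempt at a perfect scan.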