I have to be careful about what I describe, but I don't think people care about speed or performance at all when it comes to tech, and it makes me sad. In fact, there are so many occasions where the optimisation is so good that the end user doesn't believe that anything happened. So you have to deliberately introduce delay because a computer has to feel like it thinks the same way you do.
At my current place of employment we have plenty of average requests hitting 5-10 seconds and longer: N+1 queries against the network rather than the DB. As long as it's within 15 or 30 seconds nobody cares; they probably blame their 4G signal (especially in the UK, where our mobile infrastructure is notoriously spotty, and entirely absent even in the middle of London). But since I work on those systems, I'm upset and disappointed to be working on APIs that can take tens of seconds to respond.
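To make the N+1-over-the-network shape concrete, here's a minimal sketch. Everything here is hypothetical (the `FakeClient`, the `/users/{id}/orders` and `/orders` paths, the batched `ids` parameter are all made up for illustration); the point is just that latency grows with one round trip per item instead of staying constant.

```python
class FakeClient:
    """Hypothetical stand-in for an HTTP client; it just counts round trips."""
    def __init__(self):
        self.requests = 0

    def get(self, path, params=None):
        self.requests += 1  # every call is one network round trip
        if path.startswith("/users/") and path.endswith("/orders"):
            return [1, 2, 3]  # the "1" query: a list of order ids
        return {"path": path, "params": params}  # an order payload

def fetch_orders_n_plus_one(client, user_id):
    # 1 request for the list, then N more requests, one per order
    order_ids = client.get(f"/users/{user_id}/orders")
    return [client.get(f"/orders/{oid}") for oid in order_ids]

def fetch_orders_batched(client, user_id):
    # 1 request for the list, then 1 batched request for all the orders
    order_ids = client.get(f"/users/{user_id}/orders")
    return client.get("/orders", params={"ids": order_ids})
```

With real per-request latency in the hundreds of milliseconds, the N+1 version is what turns an endpoint into a multi-second request while the batched one stays flat.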
The analogy is also not great because MPG is an established metric for fuel efficiency in cars. The higher the MPG the better.
> In fact, there are so many occasions where the optimisation is so good that the end user doesn't believe that anything happened. So you have to deliberately introduce delay because a computer has to feel like it thinks the same way you do.
I never liked this view. I can't think of a single legitimate use case that couldn't be solved better than by hiding your true capabilities, and thus wasting people's time.
> they probably blame their 4G signal for it
Sad thing is, enough companies thinking like this and the incentive to improve on 4G itself evaporates, because "almost nothing can work fast enough to make use of these optimizations anyway".
> I can't think of a single legitimate use case that couldn't be solved better than by hiding your true capabilities, and thus wasting people's time.
Consider a loading spinner with a line of copy that explains what's happening. Say it's for an action that can take anywhere from 20 milliseconds to several seconds, based on a combination of factors that are hard to predict beforehand. At the low end, showing the spinner will result in it flashing on the screen jarringly for just a frame. To the user it will appear as some kind of visual glitch since they won't have time to even make out what it is, much less read the copy.
In situations like this, it's often a good idea to pad the action with an artificial delay up to a minimum duration, a floor that gives the user time to register what's happening and read the copy.
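The "delay up to a floor" idea can be sketched in a few lines. This is a toy illustration, not anyone's actual UI code: assuming an async runtime, you race the real work against a minimum-duration timer and return only when both are done, so a fast action still leaves the spinner on screen long enough to read.

```python
import asyncio

async def with_min_duration(task, min_seconds=0.3):
    """Await `task`, but don't return before `min_seconds` have elapsed,
    so a spinner shown alongside it stays visible long enough to read.
    A slow task is unaffected: the sleep finishes first and adds nothing."""
    result, _ = await asyncio.gather(task, asyncio.sleep(min_seconds))
    return result
```

Note the asymmetry: the 20 ms case gets padded to the floor, while the several-second case pays no extra cost, which is what distinguishes this from simply "making things slow on purpose".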
This doesn't work well in apps, but games do incredible things to hide loading state, partly as a consequence of avoiding a patent on minigames inside loading screens.
e.g. back in the 90s with Resi 1, the loading screen was hidden by a slow and tense animation of a door opening. It totally fit the atmosphere.
Plenty of games add an elevator or a scripted vehicle ride, or some ridiculous door locking mechanism that serves the same purpose without breaking immersion, especially as those faux-loading screens can be dynamic.
It's pretty much the exact same technique used in cinema when a director wants to stitch multiple takes into a single shot (e.g. that episode in True Detective; that other one in Mr Robot; all of Birdman).
The flash is good. If the state transition is "no indicator -> spinner -> checkmark", then a spinner flashing for one frame just assures the user that the task was actually performed.
It's a real case, actually. I don't remember the name, but I've encountered this situation in the past, and that brief flash of an "in progress" marker was what I used to determine whether clicking a "retry" button actually did something, or whether the input was just ignored. It's one of those unexpected benefits of predictability in UI coding: the fewer special cases there are, the better.
> In fact, there are so many occasions where the optimisation is so good that the end user doesn't believe that anything happened. So you have to deliberately introduce delay because a computer has to feel like it thinks the same way you do.
I see this argument coming up a lot, but this can be solved by better UX. Making things slow on purpose is just designers/developers being lazy.
Btw users feeling uneasy when something is "too fast" is an indictment of everything else being too damn slow. :D