I heard Google also found that increasing speed increased users' perception of the quality of the search results. The reason was that when the site was faster, users did more searches. Naturally they found more stuff. But they attributed the gain to Google's cleverness rather than the extra work they'd done themselves.
I have spoken to tons of folks about websites they think are fast, websites they would benchmark against -- in the majority of cases, the perception of a fast-loading website trumps an apples-to-apples comparison of the time taken to download & render a page. Perception plays a huge role when it comes to winning the mind games in users' heads regarding page download times.
Here's an example which made me quite a bit of money. My website basically looks like this:
AAAAAA
CBBBBB
CBBBBB
CBBBBB
CDDDDD
CDDDDD
CDDDDD
CDDDDD
Most users will have their screen break a bit above the end of the B block. (By design.)
The heaviest part of the page is D. Eliminating it is not an attractive option -- the overall conversion rate suffers. The most valuable parts of the conversion pathway are in B and C. The first thing the customer sees is invariably A.
Solution: a bit of technical magic to make sure that the website loads elements in A->C->B->D order (there is a competing technical imperative which means that in the HTML it makes more sense to order them A->B->D->C), and significant micro-optimization of how quickly B and C are minimally interactive to the user. (We're talking about improvements measured in kilobytes or tens of microseconds.) Since customers have to actively scroll down to engage the D block, and since most do not engage the D block before first engaging the B block, the fact that D is heavy and slow very rarely impacts their experience.
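For what it's worth, here is a minimal sketch of one way to get that kind of load ordering (my own illustration, not the commenter's actual code; the block IDs and fragment URLs are made up): keep the markup in its natural order, hydrate C and then B as soon as possible, and only fetch the heavy D block when the user scrolls near it.

```typescript
// Hedged sketch: HTML source stays in A->B->D->C order, but each block
// becomes usable in the order that matters to the user. Block IDs and the
// /fragments/* endpoints are hypothetical.

async function hydrateBlock(id: string, url: string): Promise<void> {
  const slot = document.getElementById(id);
  if (!slot) return;
  const html = await (await fetch(url)).text(); // assumed per-block endpoint
  slot.innerHTML = html;
}

async function loadPage(): Promise<void> {
  // A is static HTML and already visible; hydrate the rest in C-then-B order.
  await hydrateBlock("block-c", "/fragments/c");
  await hydrateBlock("block-b", "/fragments/b");

  // D is heavy and below the fold, so only fetch it when the user
  // actually scrolls near it.
  const d = document.getElementById("block-d");
  if (!d) return;
  new IntersectionObserver((entries, observer) => {
    if (entries.some((e) => e.isIntersecting)) {
      observer.disconnect();
      void hydrateBlock("block-d", "/fragments/d");
    }
  }, { rootMargin: "200px" }).observe(d);
}

document.addEventListener("DOMContentLoaded", () => void loadPage());
```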
Oh yes, we've made extensive use of Souders' work. But I was asking about something different: what things (besides progress bars and loading animations) have the psychological effect of making software feel faster?
One of the key things I've found works for us is differentiating what we WANT to have from users vs. what we NEED to have.
We used to have a webform that (in summary) asked the user what they wanted to download, how they got there, and how we can contact them later (as sales leads). Conversion ratio was fairly poor, so I refactored the priorities of what we were asking for into 2 forms.
The first form consisted of questions we HAD to know the answer to to service the user's request, and the second form consisted of info we wanted from them (name, email, contact info, etc.)
So between the first form and the second form, we'd kick off the download for the user (and show a progress meter at the top) -- which got them what they needed immediately, making the site seem faster overall.
Plus, since they were waiting for the download to complete anyway, the conversion goals for the information we wanted went way, way up as well.
Again, I didn't actually MAKE things faster; in fact, having to fill out 2 forms was significantly slower. But we kept users engaged and let them waste their idle cycles making us happy, since we were already accommodating their request.
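Roughly, the flow looks like the sketch below (my own reconstruction, not their code; the element IDs and the download URL are hypothetical): submit the required form, kick off the download immediately, and only then present the optional form while the user waits.

```typescript
// Hedged sketch of the two-form flow described above.

function show(id: string, visible: boolean): void {
  const el = document.getElementById(id);
  if (el) el.style.display = visible ? "block" : "none";
}

document.getElementById("required-form")?.addEventListener("submit", (ev) => {
  ev.preventDefault();

  // Step 1: we have everything we NEED, so start the download immediately
  // (via a hidden iframe so the page itself doesn't navigate away).
  const frame = document.createElement("iframe");
  frame.style.display = "none";
  frame.src = "/download?file=whitepaper.pdf"; // assumed endpoint
  document.body.appendChild(frame);

  // Step 2: while the browser handles the download, show the optional
  // "tell us about yourself" form. The user is waiting anyway.
  show("required-form", false);
  show("progress-note", true); // e.g. "Your download has started..."
  show("optional-form", true);
});
```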
Loading animations make people perceive it as fast loading?
It has exactly the opposite effect on me; if a site has to show me a loading animation, it's not fast enough. I'm thankful for the animations, though, in situations where the delay can't be avoided.
That article also links to a great article called "Maximizing Human Performance" (http://www.asktog.com/basics/03Performance.html) which goes over a few topics, including response times and their perceived meaning and optimizing user flows for the human thinking process.
We use the Google Web Toolkit to defer loading large pieces of data that compose slots around the page. The result, which you could achieve with any asynchronous mechanism, is that the page is up fast, and then details pop in as they're ready. If we waited to retrieve all of the data and render the HTML on our server, pages that currently start presenting on the browser in one second or less would take 4 or more seconds. Basically, if it's below the fold for most of our users, we defer loading it.
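In plain TypeScript (rather than GWT) the pattern looks roughly like the sketch below; the slot names and endpoints are made up, just to show the shape: serve the shell, then let each slot fill itself in whenever its own request returns.

```typescript
// Hedged illustration of "render the shell first, fill slots as data arrives".
// Slot IDs and API endpoints are hypothetical.

interface Slot { id: string; url: string; }

const slots: Slot[] = [
  { id: "recommendations", url: "/api/recommendations" },
  { id: "reviews",         url: "/api/reviews" },
  { id: "related-items",   url: "/api/related" },
];

// The page shell paints immediately; each slot pops in whenever its own
// request finishes, instead of everything waiting for the slowest one.
for (const slot of slots) {
  fetch(slot.url)
    .then((res) => res.text())
    .then((html) => {
      const el = document.getElementById(slot.id);
      if (el) el.innerHTML = html;
    })
    .catch(() => { /* leave the slot's placeholder content in place */ });
}
```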
If I understood prakash's comment, the first way is actually making it faster, either by generating it faster or by caching. I usually launch four or five sites at once. HN is the first to load, so I usually start browsing it instead of the (slower) others.
Known tricks are explicit sizes for tables and images. The HTML file itself is loaded fast so you can start reading while images are loading.
There's also the opposite method: preloading images so, once the needed items are in the browser, the page renders faster.
I wonder how it would work to preload the images for the initially visible part of a page and leave for later the ones that you must scroll to see.
I've read recently about using sprites for the graphics: one big image containing a mosaic of the little images used, which is then split up using CSS. This trick reduces the number of HTTP requests.
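A rough sketch of the preload-the-visible-part idea above (the image URLs and the data-src convention are mine, purely for illustration): eagerly warm the cache for the handful of above-the-fold images, and give everything below the fold its real URL only once it scrolls near the viewport.

```typescript
// Eagerly fetch the images that are above the fold so they render instantly.
const criticalImages = ["/img/logo.png", "/img/hero.jpg"]; // hypothetical
for (const src of criticalImages) {
  const img = new Image();
  img.src = src; // browser fetches and caches it immediately
}

// Below-the-fold images carry their real URL in data-src and only get it
// copied into src once they approach the viewport.
const lazyObserver = new IntersectionObserver((entries, observer) => {
  for (const entry of entries) {
    if (!entry.isIntersecting) continue;
    const img = entry.target as HTMLImageElement;
    if (img.dataset.src) img.src = img.dataset.src;
    observer.unobserve(img);
  }
}, { rootMargin: "300px" });

document.querySelectorAll<HTMLImageElement>("img[data-src]")
  .forEach((img) => lazyObserver.observe(img));
```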
You could separate it into 2 segments: one bucket of users who had never used a website in a particular category, and another bucket of folks who had used such websites before. Let's take online travel websites as an example.
In both cases people had a good experience using the website, which in turn was a combination of great design: crispness and information architecture, variable font sizes, great presentation of the most important information, a lot of user empathy, generally not making the user think and not surprising the user, heat maps of which areas of a webpage users are likely to look at, etc.
The technical aspects included smaller webpage sizes; tons of caching by abstracting away the personalized portions of the webpage, and thereby caching close to 70-95% of the page in the browser; using a CDN; compression; reducing RTT; using distributed DNS; not hitting the database for requests when possible; having as many static components as possible; use of persistent connections; caching URL query strings; and all the things mentioned in the Y! performance guidelines.
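To make a couple of those items concrete, here's an illustrative sketch (not their actual setup; the paths, cache lifetimes, and the choice of Express are assumptions) of long browser caching for static components, gzip compression, and keeping the personalized bits out of the cacheable page.

```typescript
import express from "express";
import compression from "compression";

const app = express();

// Compress responses so pages and fragments ship fewer bytes over the wire.
app.use(compression());

// Static, non-personalized components get a long cache lifetime so repeat
// visits never re-download them.
app.use("/static", express.static("public", { maxAge: "30d", immutable: true }));

// Personalized bits are kept out of the cached page and fetched separately.
app.get("/api/user-box", (_req, res) => {
  res.set("Cache-Control", "private, no-store");
  res.json({ greeting: "Hello!" }); // placeholder payload
});

app.listen(3000);
```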
Folks who had never used an online travel website compared its speed to the sites they used regularly, like Google, Rediff, Yahoo, etc. Folks who had used an online travel website before compared it with the competition and with how much better you were on the above aspects.
Things that make me think a site is too slow are staring at a blank screen while waiting for doubleclick and trying to find the skip button for an intro ad. A site should display its content first and then fill in the ads later. Engage your users while all the clutter catches up.
There may be a psychological effect too. I've noticed that once people think of a piece of software as fast, they're more likely to think of it as good in other respects, even unrelated ones.
Couldn't one come to another conclusion here? In fact several other conclusions. For one, since users were getting more results on the page they didn't need to go to a second results page, thus causing the drop-off in traffic.
If you're going to come to some grand "Aha!" moment, you should define your experiment. Were these the same users? on the same day? searching for the same stuff? What browsers were they using? Did they have their windows maximized? What was their screen size? ... and so on.
Though I've never been able to find the results and complete study methodology online (can anyone?), Mayer is a CS grad from Stanford, as well as head of the UI team at Google. I'm guessing she and her group know what they are doing.
I'd be more than willing to bet that they controlled for these variables you mention, else the test would be useless. Sure, it's right to be skeptical, but they do this for a living.
Given Google's stake in having accurate data, they probably bothered to do it correctly.
I'm getting more questions as I start to think about this. They get revenue from ads, not searches. Did they vary the number of ads displayed? Did people look at the first page of 10, not find what they were looking for, then check the ads, find what they were looking for (which may be a page-two organic result), and click the ad?
If that is what is happening, then I can see why Google would lose. That is, their main conclusion would be that people tend to check organic results first. If that's true, then when a company that had to pay to be on page 1 is moved there organically by increasing the number of results displayed, Google loses.
They could check this by seeing if revenue goes up when they limit the results to 5 ads.
How does it not make more sense that traffic dropped because the average user only wants to see 30 results and saw them all on a single page instead of navigating to three separate pages? Revenue dropped because they showed 1/3 the number of AdWords ads. Did I miss something here?
Too true. I like the optometrist who made me read tiny letters again and again instead of asking "better like this... or like this?" Because even I can't tell the difference at first sight.
I think the magnitude of impact might be due to the search engine market. Results of different search engines are usually indistinguishable (to most web users), so the most important factors become design and speed. In other markets the balance may be different.
I'm not saying that speed isn't important for everyone, but having that killer feature, even if it slows down your site, might be what sets you apart from your competitors.
Maybe PG should try this for kicking his HN habit: introduce throttling features that increase the time taken to render HN rather than focusing on optimizations. I know it would help a lot of other folks as well ;-)
I hope he doesn't make the site suck for everyone just to "help" the users who seem to need baby-sitters. Suburbs suck for the same reason. I know you are joking but I am really tired of all the Nanny State type stuff.
"Oh I am just doing this for your own good. You will thank me later." is one of the most evil attitudes that I know. I am not able to express clearly why I think so and why I feel so strongly[#] about it. May be it is coz I find it patronizing. Also I find it extremely unhackerly.
[#] My mind is screaming obscenities and I am feeling pretty angry.
Sorry for being obscure. I was responding to your line:
""Oh I am just doing this for your own good. You will thank me later." is one of the most evil attitudes that I know."
I completely understand your objection to that phrase in a lot of adult situations. (Safety fence after safety fence, stopping people from hurting themselves with technical products, not letting you hack stuff). I get that completely. Lowest Common Denominator design sucks.
What I was asking was whether you apply that thought across the board. My mind particularly went to a Parent>Child relationship. Learning by experience (i.e., getting burnt) is a great thing, but sometimes you can/should learn from others' experience. Sometimes someone older, more experienced, or with more wisdom can stop you from doing something for your own good.
Yeah, that is a totally different situation, and I only have knowledge of one side (the child's POV) so far. My thought on the topic at the moment is that parents should definitely safeguard children from life-threatening and other irreversibly catastrophic situations. Other situations will mostly be a judgment call, erring on the side of freedom rather than caution.
As Paul Buchheit said: advice = limited life experiences + over generalization
I live in Australia - we might be getting a bigger firewall than China's - I know all about Nanny States.
When I have kids, one of my aims as a parent will be to try and err on the side of freedom. (Although I can see that will be a hard choice to make at points). Also - this was a good link I picked up off here/reddit recently: http://freerangekids.wordpress.com/
I can understand this. Walmart is the most painful website to browse on the internet, because you usually go on looking for something, or the price of something, and it takes what feels like an eternity to get anywhere.
So, where does this fit into net neutrality? ISPs can certainly sell QoS services to customers.
For some applications, this is important (e.g. gaming). For others, it could be a nice competitive advantage. But then with more & more of the ATM frames dedicated to QoS, that'll leave more contention (thusly, higher latency & drop) for the rest.
I'm just worried about who's going to use these facts for what ends.
I agree that everyone was probably aware that speed is important, but I always thought that as long as pages loaded in less than 1 or 2 seconds, it wouldn't make much of a difference to most users. This post changes my point of view; I think I'm going to work harder to reduce page load times as much as possible instead of stopping at something that "seems fast".
So, why not load the first few results quickly, then fill in the rest later through Ajax? It'll take more than 0.5 seconds to eyeball the first ten results.
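Something like the sketch below would do it (the endpoint, query parameters, and markup are hypothetical): the initial HTML carries the first ten results, and the remainder is fetched asynchronously and appended while the user is still reading.

```typescript
// Hedged sketch: append results 11+ after the initial page has rendered.

async function appendRemainingResults(query: string): Promise<void> {
  const list = document.getElementById("results");
  if (!list) return;

  // The initial HTML already contains results 1-10; ask for the rest.
  const res = await fetch(`/search?q=${encodeURIComponent(query)}&start=10&count=20`);
  const extra: { title: string; url: string }[] = await res.json();

  for (const r of extra) {
    const li = document.createElement("li");
    const a = document.createElement("a");
    a.href = r.url;
    a.textContent = r.title;
    li.appendChild(a);
    list.appendChild(li);
  }
}

// By the time the user has eyeballed the first ten results, the rest
// should already be in place.
window.addEventListener("load", () => void appendRemainingResults("example"));
```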
Sorry to bring back the tables vs. CSS layout debate, but since Google uses tables for some parts of their layout, can we assume that this is because it renders faster (and therefore makes more money for them)?