This sounded like a potentially useful feature for web developers, to improve perceived loading speed without having to use JS (which may be desirable in some situations). But reading Google's notes on the subject is like a splash of cold water: http://code.google.com/chrome/whitepapers/prerender.html
TL;DR: Only one page may be prerendered at a time per-instance of Chrome (not per-tab).
It would be nice to know if this just limits simultaneous prerendering, or if only one link may be prerendered and it must be clicked on or "evicted" before any more prerendering will occur.
I can't speak definitively, but I doubt the prerendered page needs to be visited before another one can be prerendered. If that were the case, I could open a google search and leave it in a background tab and never ever get prerendering ever again.
Yes, that's what I'm afraid of. It's not clearly stated if that is or isn't the case. Their wording is ambiguous. I'd test this myself using the dev build of Chrome, but I can't install it where I am.
Somewhat concerning: "What looks like a real pageview from a modern browser might be a browser downloading your page resources in the background before possibly being presented to an actual visitor. Websites that care about separating eyeballs from machines should add new JavaScript to their pages to create awareness of the current loading state."
So won't this be confusing, as non-discriminating sites choose to inflate their page views by ignoring whether those were prefetches or real loads?
Nearly everything that has to do with ads (both tracking and serving) is already standardized client side script, so it won't make much difference for publishers/advertisers. Ad agencies already want data vetted by someone like Comscore, and most publishers use tools like Google Analytics for their own purposes. I suspect those tools already account for invisible pre-rendering.
This doesn't really affect much, to be honest. People who want to inflate their stats already do so by counting bot traffic, iframe requests, etc. Google's been doing partial javascript evaluation for a long time as part of their spidering.
So won't this be confusing, as non-discriminating sites choose to inflate their page views by ignoring whether those were prefetches or real loads?
There are a lot of sites out there that use JavaScript reloads every X minutes to do this sort of thing already. More trackable but I bet they still include those numbers in their media packs ;-)
TechCrunch, for example, has this near the bottom of the source:
setTimeout('location.reload(true)',1200000);
So that's every 20 minutes.
A more insidious trick to "inflate" your views would be to detect when a tab showing your site is open but not visible, and to refresh a lot more often in that case (since the user won't notice).
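A minimal sketch of that trick, using the Page Visibility API (the interval values are purely illustrative, and `refreshIntervalMs` is a hypothetical helper, not anything from the article):

```javascript
// Hypothetical sketch: refresh far more aggressively when the tab is
// hidden, so the extra "pageviews" never interrupt a visible reader.
function refreshIntervalMs(hidden) {
  // 20 minutes while visible, 2 minutes while the tab is hidden
  return hidden ? 2 * 60 * 1000 : 20 * 60 * 1000;
}

if (typeof document !== 'undefined') {
  document.addEventListener('visibilitychange', function () {
    clearTimeout(window._reloadTimer);
    window._reloadTimer = setTimeout(
      function () { location.reload(true); },
      refreshIntervalMs(document.hidden)
    );
  });
}
```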
The article outlines a technique to check whether your page was preloaded or not using JavaScript. So the JavaScript would need to be run in order to make that check.
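Roughly, the check looks something like the following. This is a sketch, not the whitepaper's exact code: the visibility property was vendor-prefixed (`webkitVisibilityState`) in the Chrome builds the whitepaper describes, and `logPageview` here is a hypothetical analytics call:

```javascript
// Only count a pageview once the page is actually shown to a user,
// not while Chrome is merely prerendering it in the background.
function shouldCountPageview(visibilityState) {
  return visibilityState !== 'prerender';
}

if (typeof document !== 'undefined') {
  var state = document.webkitVisibilityState || document.visibilityState;
  if (shouldCountPageview(state)) {
    logPageview(); // hypothetical analytics call
  } else {
    // Wait until the prerendered page is actually presented.
    document.addEventListener('webkitvisibilitychange', function () {
      if (shouldCountPageview(document.webkitVisibilityState)) logPageview();
    });
  }
}
```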
When some random site throws my pages into an iframe, at least the user sees what I've served, and I at least have some control -- I can break out of it with Javascript, or I could even serve completely different content.
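The "break out of it" part is the classic frame-busting idiom, sketched here (a common pattern, not code from the thread; note that determined framers can defeat naive versions of this):

```javascript
// If this page is loaded inside someone else's frame, replace the
// top-level window with our own URL.
function isFramed(win) {
  return win.top !== win.self;
}

if (typeof window !== 'undefined' && isFramed(window)) {
  window.top.location = window.self.location;
}
```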
It's a bit different though when the biggest fucking site on the internet joins in.
Why should I be forced to expend bandwidth every time Google decides a page on my server is relevant to a search result? This could double or even triple my bandwidth costs.
What? Use robots.txt and exclude yourself from the Google index. It's not like they're preloading the first 200 results for obscure searches. If you're one of the top few results for a very active search then I'm sure someone else is more than willing to take over if you can't handle the load.
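For completeness, excluding Googlebot entirely is a two-line robots.txt (this blocks crawling of the whole site, which is a much bigger hammer than just avoiding prerender traffic):

```
# Keep Googlebot off the entire site
User-agent: Googlebot
Disallow: /
```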
If you're worried about Google being a bad web citizen, they've already said that they only use prerendering for the top search result, and only if they think it has a very high probability of being clicked. (Based on my limited testing, this only ever seems to happen with navigational searches.) So you'd be spending that bandwidth anyway, and your users are getting a better experience out of it.
If you're worried about the technology itself being abused to DDOS people, I don't see how it enables any attack methods that didn't already exist.
If it is as you say, only the #1 result for searches where it has a high probability of being clicked, then it's generally fine with me. If they were blindly pre-loading the first few results, I'd have a problem.
I guess I should give Google more credit, and assume they're not idiots.
You don't, just like you don't get to opt out of being the target of a 302 redirect or hidden iframe. But just like both of those, you can detect it and use that information as you see fit.