
The deficit is software. Hardware has gotten crazy fast, but software is struggling to make use of that speed in ways that are valuable to average users (e.g. reducing perceived latency).

At the risk of stating the obvious, a computer feels snappy when it responds to user actions (clicks, keypresses) quickly. That is a holistic problem -- one that concerns all parts of the system -- so it's hard to optimize. Software tends to evolve "within the lines"; it's hard to make changes that cross abstraction boundaries. But that's where performance is.

On the other hand, hardware has gotten faster unevenly. Disk seeks are the slowest thing your computer does besides WAN access, but those haven't sped up much over the last decade. Disk throughput (and size) have increased orders of magnitude faster than seek time.

But we're still using the same software abstractions (and the same software!) for these radically different machines. For example, stat() is still a synchronous call. In the old days it wasn't all that slow relative to CPU operations. Now it can take 30ms, which is 30,000,000 cycles on a 1 GHz CPU. So arguably the default should be async.
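To make the point concrete, here's a minimal sketch (in Python, purely for illustration) of treating stat() as async by pushing the blocking call onto a worker thread, so the event loop stays free to handle other work while the disk seek is in flight:

```python
import asyncio
import os


async def stat_async(path):
    # os.stat() is a blocking syscall; run it in a worker thread so the
    # event loop (and anything latency-sensitive on it) keeps running.
    return await asyncio.to_thread(os.stat, path)


async def main():
    # Issue several stats concurrently instead of blocking on each in turn.
    paths = [os.getcwd(), os.sep]
    results = await asyncio.gather(*(stat_async(p) for p in paths))
    for p, st in zip(paths, results):
        print(p, st.st_mode)


asyncio.run(main())
```

This is just the thread-pool workaround, of course; a genuinely async default would push the asynchrony down into the kernel interface itself rather than papering over a synchronous call.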

C is another abstraction that had a reasonable performance model when it was invented, but no longer does. It's hard for the average developer to have a decent (portable) model of a CPU these days, especially with regard to concurrency.



I generally agree, but would present a specific software-architecture cause for most responsiveness issues: the simplest way to perform a UI update is "toss and rebuild" - remove all the elements and replace them with new ones containing changes. This method is straightforward and much more maintainable than the performant alternative: hold everything in place and manipulate the minimum needed to update.

It's a particularly important trade-off as you start considering, for example, more complex layouts where the size and positioning of one element may affect any number of others, elements get added or removed, etc. There's a spectrum of variations between these two extremes that let you choose between maintainable and fast, but neither end really has an ideal combination of both. Language and coding style changes can help, too, but a truly complete solution still seems absent.

The initial load is the second major source of peril for responsiveness. Even if a custom cache for the load is used, the user's first-run impression is still going to be of an uncached experience. Here the disk seek, as you mention, is of huge importance. But I find that I'm personally less irritated by load times than I am by software with intermittent spikes of high CPU usage, which will slow down the whole system at unexpected moments.


Well, I agree that there are specific program architectures that can cause problems, but I haven't had the experience where "toss and rebuild" is the bottleneck.

One of the other responses nailed a more common problem IME -- network activity and UI activity on the same single-threaded event loop. Firefox and Chrome both have this problem (or something that essentially amounts to it). For example, in Chrome:
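The failure mode is easy to reproduce in miniature. This sketch (hypothetical names, Python standing in for a browser's event loop) runs a "UI" task that should tick every 10ms alongside a "network" call, and measures how long the UI freezes when the network call blocks the loop versus when it's moved off:

```python
import asyncio
import time


async def ui_tick(ticks):
    # Stand-in for UI event handling: should run roughly every 10ms.
    for _ in range(20):
        ticks.append(time.monotonic())
        await asyncio.sleep(0.01)


async def blocking_network_call():
    time.sleep(0.2)  # blocks the whole loop: the "UI" freezes


async def offloaded_network_call():
    await asyncio.to_thread(time.sleep, 0.2)  # loop keeps running


async def demo(network):
    ticks = []
    await asyncio.gather(ui_tick(ticks), network())
    # The largest gap between consecutive ticks is how long the UI froze.
    return max(b - a for a, b in zip(ticks, ticks[1:]))


frozen = asyncio.run(demo(blocking_network_call))
smooth = asyncio.run(demo(offloaded_network_call))
print(f"worst gap, blocking: {frozen:.3f}s; offloaded: {smooth:.3f}s")
```

The blocking variant shows a ~200ms stall in the tick stream; the offloaded one stays near 10ms. That stall is exactly what a user perceives as jank.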

http://code.google.com/p/chromium/issues/detail?id=113389

This bug has gone unfixed for YEARS (older ones were marked dupes of later ones).

The way we write software now is just too difficult to get right. Once you have 1M lines of C++ code there are going to be crazy performance bugs that nobody can fix. Chrome has some of the best engineers on the planet and they're not immune to this by any means.

I guess I would amend my original answer to say that "software size" is the problem. Everybody thinks they know how to write performant software. But that's only when you can fit everything in your head.

I would say that "toss and rebuild" isn't the #1 culprit because the web browser is the ultimate "toss and rebuild" architecture -- that is, every request and response stand alone. Yes, web pages can be ungodly slow and unresponsive. But it's possible to make them responsive if you really simplify -- e.g. Google or craigslist.

In theory we could have a stateful web and make it faster. But I think our collective programming skill and our languages/tools just aren't up to the task. I think the web exists as it is, and is as popular as it is, because lots of people with "domain knowledge" could write dirty PHP scripts and such. They aren't fast, but they get done the jobs that users want done.



