Ajax, high latency and user experience

Most of us have probably used at least one website that bases some major functionality on Ajax. But have you ever used such a site, done something, then moved on to another page, only to come back later and realize that what you just did was never saved?

So what happens to XMLHttpRequests in a high-latency situation, and how do you deal with it?

Bad things

That’s right, bad things start happening. In the worst-case scenario, the request never reaches the server, and what you thought you just did never actually happened anywhere other than on your end.

Why is this? When you open a page in your browser, it sometimes takes a while for anything to happen. You can see this and wait until the page has loaded, but with an asynchronous request handled in the background, how will you know?

Often the user interface is updated well before the request has actually finished. This is even suggested as a way of making the user interface seem snappier, hiding the fact that it’s not actually instantaneous. Combined with a sudden spike in latency, this can cause some headaches for the site’s users.
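To make the pattern concrete, here is a minimal sketch of such an “optimistic” save. The element IDs and the endpoint URL are made up for illustration; the point is only that the UI reports success before the request has gone anywhere.

    // Hypothetical "save" that updates the UI first and sends the request afterwards.
    function saveComment(text) {
        // 1. Update the UI immediately so the action feels instantaneous.
        document.getElementById('status').textContent = 'Saved!';

        // 2. Only then send the actual request to the server.
        var xhr = new XMLHttpRequest();
        xhr.open('POST', '/save-comment');   // made-up endpoint
        xhr.setRequestHeader('Content-Type', 'application/x-www-form-urlencoded');
        xhr.send('text=' + encodeURIComponent(text));

        // If the user leaves the page before the request completes,
        // that "Saved!" message was a lie.
    }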

Incomplete Ajax requests get canceled when you click a link on a page. So if one of the requests still hasn’t talked to the server, it never will!
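You can observe this for yourself. Attach handlers to a deliberately slow request (the URL below is just a placeholder) and click a normal link before it completes; depending on the browser you’ll see the abort handler fire, or nothing at all, before the page is torn down.

    var xhr = new XMLHttpRequest();
    xhr.open('GET', '/some-slow-endpoint');   // made-up URL that takes a while to respond
    xhr.onload  = function () { console.log('finished: ' + xhr.status); };
    xhr.onabort = function () { console.log('aborted - the browser gave up on it'); };
    xhr.onerror = function () { console.log('failed - it never reached the server'); };
    xhr.send();
    // Now click any regular link on the page: instead of "finished" you will
    // typically get "aborted" (or no log at all), because navigating away
    // tears down the page along with its pending requests.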

You can test this yourself using TMnetsim, an excellent Windows app for simulating slow network connections, packet loss and so on.

Consequences

Ajax is often thought of as a good technique to speed up page loading and other actions. While this is true, it can actually make your site annoying to use if you haven’t planned things properly.

This is something that isn’t discussed enough. Most people talk about how to use JavaScript and Ajax to make the user experience better, but ask yourself: what is more frustrating than noticing that the edit you just made was never actually saved? At that point you may start to doubt the site: okay, so I just saved this, but did it really save? Maybe I’ll wait a moment to be absolutely safe…

Personally I hate that. If the UI says that everything is okay and you can go now, that should actually be the case.

Surviving the lagocalypse

So how do you keep the user interface fast, but keep the user from accidentally killing off any slow XMLHttpRequests?

The simplest approach is probably to show your typical Ajax loader until the request has actually finished. From the user’s point of view this is likely the best option, as they get obvious feedback right away.
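A rough sketch of that approach, assuming a spinner element and a save endpoint that are both placeholders: show the loader when the request starts, and only report success once the request has genuinely completed.

    function saveWithLoader(body) {
        var spinner = document.getElementById('spinner');   // hypothetical loader element
        spinner.style.display = 'inline';

        var xhr = new XMLHttpRequest();
        xhr.open('POST', '/save');                            // made-up endpoint
        xhr.setRequestHeader('Content-Type', 'application/x-www-form-urlencoded');
        xhr.onload = function () {
            spinner.style.display = 'none';
            document.getElementById('status').textContent =
                xhr.status === 200 ? 'Saved!' : 'Saving failed, please try again.';
        };
        xhr.onerror = function () {
            spinner.style.display = 'none';
            document.getElementById('status').textContent = 'Saving failed, please try again.';
        };
        xhr.send(body);
    }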

The downside, of course, is that the UI no longer feels very quick. If you simply update the UI immediately where possible, it will seem faster, but the users may fall victim to lag.

There are two solutions to this I can think of: either build your site fully Ajax-based, so that there are no links or other things that lead the user away from the current page and stop the requests, or disable all outgoing links until all requests are complete.

I don’t think either of these is a perfect solution though. Building a site purely on JavaScript is probably overkill in most cases, and it would mean the site won’t necessarily work so well on older PCs, older browsers, or mobile phones, which are getting more important to support nowadays.

Disabling all links until the requests are done may be the better choice of the two. It can be confusing for the user though – links usually don’t get disabled – so the user needs to be told why they can’t go ahead just yet.
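One way to sketch this, assuming all your Ajax calls go through a wrapper of your own: keep a counter of pending requests, and intercept link clicks while the counter is above zero. Everything here (the wrapper, the message) is illustrative rather than a finished implementation.

    var pendingRequests = 0;

    // Send an XMLHttpRequest while keeping count of how many are still in flight.
    function trackedSend(xhr, body) {
        pendingRequests++;
        xhr.onloadend = function () { pendingRequests--; };   // fires on load, error and abort
        xhr.send(body);
    }

    // Block navigation while something is still pending, and tell the user why.
    document.addEventListener('click', function (event) {
        var link = event.target.closest ? event.target.closest('a') : null;
        if (link && pendingRequests > 0) {
            event.preventDefault();
            alert('Still saving your changes - please wait a moment.');
        }
    });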

There is one more trick though: artificially slowing down the links. By this I mean the same as above: disable the links, but instead of doing nothing, display a loader animation, and once the requests have completed, automatically forward the user’s browser to their destination. Is this the optimal solution? Possibly, but it may be tricky to implement.
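Building on the pending-request counter from the previous sketch, the trick could look roughly like this: intercept the click, show the loader, and keep checking until nothing is in flight before following the link. Again, the spinner element is hypothetical and this replaces the alert-based handler above.

    document.addEventListener('click', function (event) {
        var link = event.target.closest ? event.target.closest('a') : null;
        if (!link || pendingRequests === 0) {
            return;                                    // nothing in flight, navigate normally
        }
        event.preventDefault();
        document.getElementById('spinner').style.display = 'inline';   // hypothetical loader

        // Poll until every tracked request has finished, then go where the user wanted.
        var timer = setInterval(function () {
            if (pendingRequests === 0) {
                clearInterval(timer);
                window.location.href = link.href;
            }
        }, 100);
    });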

Summary

When you combine techniques that make a page seem faster with a high-latency connection, you suddenly have a trap that can annoy your users to no end. There are some tricks to deal with this, the best of which seems to be delaying any links until the Ajax requests in the background have completed.

Can you think of any more good examples of how to deal with this?