
User Interface Design

UI Leaps of Faith

By Matt Stephens
January 2, 2006

Now that’s a phone

In every software company where I’ve ever worked, an interesting thing happened whenever a phone call came in to the wrong person. A highly intelligent software engineer would pick up the phone, say “Sorry, you’ve come through to the wrong person, I’ll transfer you,” then stare around the office like a lost sheep.

"Plastic nemesis"

This person, who could write a million lines of well-structured Java code and pinpoint exactly which class was responsible for a given function, mysteriously failed to “get” the phone’s hardware UI, his immediate nemesis. For all its sleek plastic power, the evil phone device was standing in his way and preventing him from simply transferring the call so that he could get on with his programming.

I started out on this article thinking that I would be blaming the phone’s interface for the programmer’s moment of confusion. Transferring calls is, after all, a fairly major function, yet there’s nothing on the phone that even hints at how a call should be transferred. No “Transfer” button. But then I realised that other, non-technical people appear to have no problems at all operating the phone.

Receptionists typically put us programmers to shame. They can stack waiting calls, forward to people’s voicemails, instantly get back to caller no 7 (for example) and divert all incoming calls for extension 472 to extension 331, all the while one-handedly operating Windows Solitaire so fast that the screen’s a blur.

The receptionist is obviously building a model in her head of the phone’s internal state, with its several patiently waiting callers. The phone has an “implied” UI, not at all evident in its keypad or tiny monochrome LCD. But if you know how it works, it’s possible to make that lump of plastic sing soprano. It’s a bit like playing chess without a chessboard: just keep track of the pieces in your head, and occasionally blurt out comments like “Rfd1, rook takes knight!”

"Painting reality"

Whether the receptionist has realised it or not, she has painted her own sphere of reality, her own persistent, highly consistent virtual universe in which her consciousness regularly sets up camp at 9AM each morning; you can almost see this holographic spinning “microverse” hovering several inches above that phone, her gateway to our corporeal world. She’s a 9-5 workaholic demigod.

Which brings us back to our programmer (oftentimes me) fumbling awkwardly to transfer a call. He has been dragged out of his own virtual universe, and is faced with the horror of having to interact with this non-software, (shudder) physical device with its unfriendly UI. What the hell does the “Ans.Rel.” button do? Where are the tooltips?

Of course, it’s the receptionist’s job to be good at using this device; you could safely say that she’s an advanced user, unlike the programmer, who flops onto the phone (or fax machine, or photocopier, or whatever other device he is suddenly required to figure out) like a fish out of water. Likewise, asking the receptionist to fill in for the programmer on his coffee break isn’t likely to turn out too well, either.

But there does seem to be an impedance mismatch between the software universe that the programmer has grown so used to, and the physical world in which he must visualise an entirely different type of microverse atop any even slightly complex device, and map the device’s physical constraints into his universe’s virtual parameters; all that just to transfer a phone call, or send a fax, or to bathe the cat in the washing machine.

Supposedly “non-technical” people do this all the time, even if they don’t realise it. But it’s almost as if we computer geeks have become so embroiled in the software world that we’ve become deskilled in our ability to visualise any hardware device’s implied UI and make sense of it. We have difficulty even making a device grunt, let alone sing like a diva.


Hanging on the Telephone

The need to maintain a cognitive model of the device’s internal state is one problem. But, as if that weren’t enough, there’s another showstopper issue: With software we’ve grown used to having an Undo option; or at least some trusted mechanism for saving our skins if we get into a scrape. Deleted a file? Go to the trashcan. Or failing that, go to the backup disk; or to CVS, or whatever. Closed a file by mistake? Just re-open it, its contents will still be there; no big deal.

But when transferring a call, there are no such safety nets. Mess it up, and the caller is cut off in his prime, left wounded and probably angry because you hung up on him. So you’d hope that the function you’re trying to achieve is at least made really obvious, so that you know that you’re not accidentally doing something else.

"It's that bit about putting the phone down."

With our phone system, the lack of an actual Transfer button supposedly makes the task easier. You “just” dial the person’s extension number, wait until they answer, then put the phone down. Nothing easier! (explains the exasperated receptionist to a red-faced programmer). But it’s that bit about putting the phone down. How do you know for sure that the call will be transferred? So far, nothing on the phone’s little LCD has suggested that that’s what will happen. Usually, when you put the receiver down, you hang up: terminate the call, cut the line dead. Who’s to say that this time will be any different? The user must make a serious leap of faith and just go for it, slam the phone down, trust the system to transfer the call successfully, knowing that if he’s got it wrong then there will be no going back.

The receiver’s inconsistent behaviour is, of course, that most dreaded of all UI antipatterns: a “mode”.

In the HCI world, an interface is modal if the same action does different things depending on what state some other part of the UI is in. It’s highly confusing, depressingly common, and generally frowned upon. Like this. (I’m frowning right now).
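
To make the phone’s particular mode concrete, here is a minimal sketch in Python. The Phone class and its methods are entirely hypothetical (not any real telephony API); the point is simply that one physical action, putting the receiver down, means two very different things depending on hidden state:

```python
# A minimal, hypothetical sketch of the mode described above: the same action
# (hanging up the receiver) either terminates the call or transfers it,
# depending on internal state the keypad and LCD never show you.

class Phone:
    def __init__(self):
        self.active_call = None       # the outside caller
        self.dialed_extension = None  # set once you dial a colleague mid-call

    def answer(self, caller):
        self.active_call = caller

    def dial_extension(self, extension):
        # Dialling an extension while a call is active silently switches modes.
        if self.active_call is not None:
            self.dialed_extension = extension

    def hang_up(self):
        # Same physical action, two meanings: the dreaded mode.
        if self.dialed_extension is not None:
            print(f"Call from {self.active_call} transferred to ext. {self.dialed_extension}")
        else:
            print(f"Call from {self.active_call} terminated. Click.")
        self.active_call = None
        self.dialed_extension = None


phone = Phone()
phone.answer("angry customer")
phone.hang_up()                 # mode 1: cuts the caller off

phone.answer("angry customer (again)")
phone.dial_extension(472)
phone.hang_up()                 # mode 2: exactly the same action transfers the call
```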

But you can understand why a phone, with its limited set of hardware buttons, would have to be modal in order to pack in all of its super-powerful functions. Never mind that the majority of users only really need a tiny subset of those functions: making a call, ending a call, checking their voicemail, recording a new “Hi, this is Bob from accounts, leave your complaint after the beep” voicemail message, transferring a call, putting the call on hold. Each of those functions could quite easily have its own button.

All those other, highly esoteric functions (if they really must be available at all) can be virtual. “Ans.Rel.”, “Group”, “Program” etc. It doesn’t matter if they’re hidden away in a mode, they would never be used anyway.

So until computer geeks learn to re-adapt to the physical world around them, we can only hope that device manufacturers learn some lessons from software UI design, eliminate modes, and make their devices more forgiving, and more… well, obvious.

What’s more likely to happen is that phones and other devices will become more like computers. They’ll consist primarily of a big VGA touchscreen and a receiver, and naturally the phone will be Linux-based. It’s already happening, of course. So techies can rejoice: the mountain has come to them.

Instead of the user creating their own cognitive model of the phone’s internal state, the model is shown on the screen for them: “7 callers on hold: Switch to Caller 1, Switch to Caller 2, ... Show callers in chronological order”, that sort of thing.
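
As a rough illustration of that “model on the screen” idea, here is a short Python sketch. The names and layout are purely illustrative (no real phone runs this): the device keeps the hold queue itself and simply renders it, so the user no longer has to:

```python
# Illustrative only: the PC-phone holds the queue of waiting callers and
# renders its own state, instead of the user keeping it all in their head.

from collections import deque
from datetime import datetime

hold_queue = deque()  # callers waiting, oldest first


def put_on_hold(caller_name):
    hold_queue.append((caller_name, datetime.now()))


def render_screen():
    print(f"{len(hold_queue)} callers on hold:")
    for position, (name, since) in enumerate(hold_queue, start=1):
        print(f"  [{position}] Switch to {name} (holding since {since:%H:%M})")


put_on_hold("Caller 1")
put_on_hold("Caller 2")
render_screen()
```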

"PCs in disguise"

But, ironically perhaps, non-techie users will, I’m sure, soon be seen fumbling with these “PCs in disguise”, cursing the logical, dynamic interface with its disappearing virtual buttons, drop-down lists and so forth. The poor receptionist is being airlifted into a foreign land where her amazing ability to construct in her mind these virtual holograms (“mindgrams”?) of the phone’s highly complex internal state is no longer needed. Instead the evil PC-device has taken over this job for her. It must be like watching the movie of the book, and discovering that the sets and the lead characters look completely different to how you’d imagined them. (As for VoIP applications, don’t even go there!)

I’m not suggesting for a moment that receptionists will be the next victim of the technological revolution, thrown onto the scrapheap of obsolescence along with cobblers, petrol pump attendants and philosophers. My God, we still need receptionists and will do for a very long time. Nothing kills programmer productivity quite like a ringing telephone.

But, for a while at least, the receptionists’ lives are going to be made harder, not easier, by the “advanced” new PC-phones. You'd expect there to be an adjustment period after which they learn to teach this new, friendlier device to sing, but there's more to it.

However much more obvious to use (aka “intuitive”) the phones may be, they’ll only slow down an advanced user. It takes longer to fiddle with a bunch of drop-down menus, spin-buttons and so forth than with nice chunky physical buttons (as long as you’re familiar with them); Fitts’ Law governs UI design like gravity governs aeroplane design. In fact, it’s more like Fitts’ Law cubed, because to operate a “soft” UI the user must cross mediums, from the physical to the virtual; “step into” the screen with their fingertips. (I would also mention GOMS here if it didn't sound like somebody clearing their throat).
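
For the curious, Fitts’ Law (in its common Shannon formulation) predicts the time to hit a target as MT = a + b * log2(D/W + 1), where D is the distance to the target and W is its width. The sketch below uses made-up coefficients purely to show the shape of the argument: a big dedicated key under your hand wins over a small on-screen widget.

```python
# Back-of-the-envelope Fitts' Law comparison. The coefficients a and b are
# placeholder values, not measured data; only the relative difference matters.

from math import log2


def movement_time(distance_mm, width_mm, a=0.1, b=0.15):
    """Predicted time (seconds) to acquire a target of width_mm at distance_mm."""
    return a + b * log2(distance_mm / width_mm + 1)


# Chunky physical "Transfer" key right under the resting hand:
print(f"Physical button:  {movement_time(distance_mm=60, width_mm=20):.2f}s")

# Small virtual button buried in a touchscreen menu, further away:
print(f"On-screen widget: {movement_time(distance_mm=180, width_mm=6):.2f}s")
```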

Similarly, creating the model of the phone’s state on behalf of the user – and showing that model on the screen – makes the phone more usable for the casual user who rarely uses the phone except to answer an incoming call from his boss.

But the ostensibly "user-friendly" UI is also less efficient for experienced users because it creates more “stuff” that they must deal with and navigate through. Experienced users find “helpful” UIs frustrating because to achieve a task they must press maybe 5 buttons instead of just one. Most likely they’ll still map the PC phone’s “helpful” state-model onto their own cognitive model anyway.

What it boils down to is this: Different types of users need different interfaces, because we want to get different things out of the device we’re using. Our goals are different; our experience with the product is different; our level of patience with the device is hugely different.

Hopefully sometime soon, phone designers will figure this out and target different phone UIs for different breeds of people: simple, obvious, hand-holding EezeeFones for us poor fumbling technogeeks who use the desktop phone maybe once a week, and even then only grudgingly; and scary-looking, ultra-efficient, modal (if need be) Rubik’s Cube devices for the super-users for whom it’s their main tool used every minute of the day.

 

Other UI Design Articles by Matt Stephens


The Oyster Gotcha

Toolbars in the Dock

Persona Power
(Software Development Magazine, March 2004 - free reg required)
