I’ll be showing art at the upcoming Dallas Indie Festival happening at White Rock Lake in North Dallas. A beautiful location!
I’ll have some large pieces in prominent places at the film festival, plus art of various sizes set up for sale. Look for me near the lake. I’ll have plenty of MySparcLine on hand. Come and pick out the one that is you. See you there!
What’s worse than having to understand a lot of crap on the screen? Memorizing a lot of crap that isn’t even there. Bring back affordances. The essence of the graphical user interface was not graphic design by graphic designers for graphic designers. It was to enable and empower users to interact with computers through a visual representation of the functionality and of the means to interact with it.
Why am I seeing a loss of affordances everywhere and a surge in reliance on guesswork and memorization for successful interaction? Affordances are, of course, those cues built into the design of things that offer clues as to what a thing is and how I am to interact with it. We depend on them to get through the day, and more and more of them are aspects of software. So why are they disappearing? Sigh. It’s a case of minimalists vs. simpletons.
Invisibility is an effect achieved when the user is put in touch with the subject matter to the extent that awareness of the UI itself fades. This is a form of minimalism, and it is not a new idea. I recall news anchor Jim Lehrer weighing in on design in an ID magazine article in the late ’80s. He defined the best design as invisible, citing how the suit he wore on the NewsHour should not detract from the subject matter of the news. Obviously that idea is lost on this generation of news anchors: Robin Meade, Soledad O’Brien, the Fox & Friends guys, etc.
A gross fallacy is to think that this effect of invisible design is achievable simply by removing anything visible of the UI. That would be Simpleton design.
This new modernist movement contains all of the truth and fiction of previous ones. Designers will play Jenga with design, removing pieces until it all comes crashing down, then start putting some pieces back until it is stable once again. It is actually a pretty good exercise, but a painful one to put users through.
Einstein’s advice to “Make things as simple as possible, but not simpler” can be violated in either of two ways. Trading one mistake for the other is not much of an improvement.
A chronic challenge, and often a frustration, for designers and innovators is gaining acceptance of changes to existing systems by current users, even when the improvements appear substantial. A recent online discussion centered on the hypothesis that users will only upgrade to a better interface if it gives them at least 50% better productivity: https://www.quora.com/User-Interface-Design/Upgrade-an-interface-When-will-users-change
Fifty percent is the key to successful change about as much as 42 is the meaning of life. Neither designer ego nor marketing’s need to be fashionable is sufficient to justify the cost of change from a user’s point of view. And yes, this is a designer speaking. Even an apples-to-apples time comparison in a lab fails to reflect the real-life costs of change.
Applying user empathy almost always deflates assumptions and egos. User perception has many facets and counts for far more than time scores or numbers of clicks (good heavens). Disruption of mental models is huge. Does your time study account for that? What is the cost of changing one’s mind?
On the other hand, keeping things the same for sameness’ sake is also a terrible strategy. Not only can it undermine the benefits of real improvements; it can mislead and misrepresent as it tries to hide change, only to surprise the user with it at some inopportune moment.
All change costs something. But if the user perceives an appreciable improvement from their point of view, they will forgive you for causing them pain. And, if the user sees the stuff they care about in the design, presented in a better way than ever before, they will love you.
The best example of this I can think of is the reality show “Extreme Makeover: Home Edition”. Anyone who has seen the show knows how they take a deserving family’s run-down home and rebuild or replace it with a new one custom-built just for them. The kicker is that when the family sees the new home for the first time, they immediately recognize it as “home”. Why? Because it contains all their favorite stuff, arranged and presented in a beautiful setting and a functional framework.
Do the same with software design and you can stop whining about inflexible users. In every study I’ve ever done where I asked users what they want, need, or expect to see in the UX, the answer has been the same. It is invariably “My stuff”. Of course, exactly what that stuff is depends on who the users are and why they use the system.
When users discover that their mental models of the tasks, data, and domain are better represented in the new design than in the old one, and that the system-induced clutter that used to cause them grief, confusion, and extra work has been reduced or eliminated altogether, they won’t mind that the colors, fonts, and even the layout have changed. In fact, they may not even see it. The more “invisible” the new design is, the more invisible the pain of change. “Invisible design” is not a new idea, but it is as valid, and as elusive, as ever.
That’s how to win at change management design. Just make the change invisible. That’s all. How? Remember, it’s all about the user’s stuff. It’s not really even about change. That’s a fallacy. The new design either makes the user’s stuff more visible or less visible. More, they love you; less, they hate you.
“All my stuff’s here anyway” – It is the reason for going or staying.
That’s the secret. Shhhh. Don’t tell anyone.
Part 1 – The Enchantment
I was actually in the room at CHI 2001 when Bill Gates introduced the Tablet PC. Taking notes on my Pocket PC, I was all over mobile and tablets. I became an early adopter soon afterward, using my convertible notebook/tablet in my UX design consulting. And let me tell you, in those days jaws would drop in amazement when folks saw you draw on the screen. Nothing impressed the natives more. “Look, him draw on screen! Him must be a god!” Alias SketchBook, Microsoft OneNote, and all those new app user interfaces were going to change the way we interacted with computers…
Part 2 – The disappointment
except that it didn’t happen. None of the MS Office apps ever budged to utilize pen input, nor did Adobe, or anyone else. Instead Microsoft abandoned us early tablet adopters like freedom fighters at the Bay of Pigs.
Then a few years later, Apple reinvented both the pocket PC and the tablet. And everyone swooned. And for good cause: this time it worked. Plus it had the Apple and third-party support to make it really productive, at a price point lower, not higher, than a regular laptop. Google joined in the fun, stealing Microsoft’s role as the “other leading brand” to be compared with Apple, mimicking their every move, yet with an open hardware platform.
Part 2b – More Disappointment
Fast-forward again to 2012. Microsoft introduces Windows 8 and the Surface. There has probably never been someone so late to their own party and so awkwardly dressed for the occasion. Microsoft launches an Apple-esque store in the mall with Kool-Aid-drunk salespeople mimicking the weirdos at the Apple store. So I stopped in to check out the Surface. The name itself speaks of another great concept that couldn’t find a market and so left its name to be adopted by this iPad wannabe.
Somewhere in the Windows 8 mix, I was hoping to find my old Tablet PC reborn with contemporary vigor. No such luck. What I discovered was a lesser knock-off of tablets that are already too dumb for my professional taste. That may sound lofty, but this was my daily work tool for four years, constantly with me in airports, airplanes, hotels, coffee shops, and offices everywhere.
Part 2c – Even More Disappointment
Windows 8 — Disappointing Usability for Both Novice and Power Users http://www.useit.com/alertbox/windows-8.html
Disappointment. That is the word Jakob Nielsen uses to describe the Windows 8 experience, and I am compelled to agree. Not that I am, or have ever been, a firm Jakob follower. I had just hoped we would see the high end supported, with trickle-down benefits for less demanding users. Instead, what I see at every turn is the computing environment reduced to a contest between Dumb and Dumber.
Looking at Nielsen’s article, it is confirmed: power users have been thrown under the Windows 8 bus. The Modern-style usability problems Nielsen describes in Windows 8 sound all too familiar. It seems Microsoft has confused minimalist with primitive. Can you say pre-Win 95? No, wait, more like pre-Win 3.1! It’s as if Microsoft has unlearned all the lessons of the past twenty years. I wouldn’t mind, except that I depend on their products to do actual work, not just goof off.
It makes me think of the “Apple Wheel” as reported by The Onion.
It feels like Microsoft is sacrificing the power-user desktop, which still dominates, to become a tag-along in the tablet space. There may be a clear opening here for a high-performance user-experience operating system environment. Silicon Graphics IRIX, where are you?
Part 3 – The Enlightenment
On the other hand, if Microsoft believes that the desktop is vanishing from the earth no matter what, then it might seem prudent to use their window of time to convert that desktop lead, while it exists, into a tablet contender. That still leaves many power Windows users in a hard spot, and it opens the door for a new aggregator to jump in and direct the larger virtual platform.
Pondering this a bit more, it may mark a milestone in Microsoft’s abandonment of the desktop as something they see cannot be held onto. The computing environment that was once the virtual desktop metaphor invented at Xerox PARC is being replaced by a ubiquitous, heterogeneous environment that exists both in real space and in the cloud. An aggregation of real and virtual devices is needed to play the role that the proprietary desktop once played. I can see mobile devices having their own avatars in this virtual space.
Who will define this space? Who will own it? How about me and you?
I am pretty sure this is the theme of my next techy art piece. “MyFavoritemachine, In the Cloud” or maybe “Escape from Desktop”.
copyright 2012 r.e.belveal