I am left in the bittersweet position where a) I 99% agree with the article and b) scratch one post in my WordPress drafts, because it would add very little to the topic now.
I also sell software to customers with substantially the same technical aptitude as Andy's. My comments:
1) Copy/paste: they can't reliably copy/paste. Of those customers who can copy/paste, a number of them know exactly one way to copy/paste, and will fail if it does not work in the context they need it in. (i.e. they either know what an MS Word copy/paste button combo looks like, OR they know how to right click, OR they know the keyboard shortcut -- and it is, far and away, the keyboard shortcut which is the most widely supported and least widely understood option)
Andy's suggestion is to make it easier to type things out longhand. I suggest making it unnecessary instead -- you can do some client/server trickery to avoid this (discussed here: http://www.resultsjunkies.com/blog/back-office-exposed-bingo... under the heading "When a sale comes in, can you walk us through the process?")
9) My sole point of disagreement with Andy: this type of user really wants to relate to their computer like they relate to a toaster. No one reads a toaster's documentation, and no one should have to. If the UI needs external documentation, that is probably a bug.
> I am left in the bittersweet position where a) I 99% agree with the article and b) scratch one post in my WordPress drafts, because it would add very little to the topic now.
Post it anyway? Your articles and style fill me with a mix of joy and inspiration, and I think the same goes for hundreds or thousands of other people.
> you can do some client/server trickery to avoid this
I saw quite a neat trick the other day when I bought EagleFiler[1]. Having already used the demo, the program had registered a URI protocol handler, and included a link in the registration email that looked like:
x-eaglefiler://p?n=<my name>&sn=<my serial number>
Mail.app made this clickable, so it would have been unnecessary for me to even copy&paste the details (I didn't actually need to as I'd already registered, ironically by copying the serial off the receipt webpage).
Although, the text in the email makes no mention that I could just click on the link to register the program, so the point of the original article still stands...
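Out of curiosity, here's a minimal sketch of how the receiving application might pull the details out of a link like that, so the user never has to copy/paste anything. The query-parameter names n and sn come from the example URL above; the function name and sample values are assumptions, not EagleFiler's actual code:

```python
from urllib.parse import urlsplit, parse_qs

def parse_registration_link(url):
    """Pull the name and serial number out of a registration URL of the
    form x-eaglefiler://p?n=<name>&sn=<serial>."""
    query = parse_qs(urlsplit(url).query)
    name = query.get("n", [""])[0]
    serial = query.get("sn", [""])[0]
    return name, serial

# Hypothetical example values:
print(parse_registration_link("x-eaglefiler://p?n=Jane%20Doe&sn=EF-1234-5678"))
# ('Jane Doe', 'EF-1234-5678')
```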
It doesn't make sense to distinguish "technical" from "non-technical" users. Plenty of users with no interest in programming and no formal training in computing handle these tasks easily. Many users understand that a file exists in a physical location, that it can be copied from a hard drive to a flash drive, and copied from that flash drive to a different computer, and they understand the necessity for that. And they understand that when they use a web app that relieves them of that burden, that the data is now stored "out there" somewhere instead of locally.
From the other side, I had a boss with a Master's in Computer Science who had lots of trouble with normal desktop usability. He certainly never learned to guess which actions were reversible or not. My sister, who teaches history and is barely technically self-sufficient, could run rings around him on the Windows desktop, not to mention my friend the accountant who has never written a program but could be a passable professional programmer with six months' full-time training.
So instead of saying, "this is what non-technical users don't understand," it's better to say, "these misunderstandings are part of the normal spectrum of user savviness that you will have to deal with." Whether a user is technical or non-technical only allows you to make a rough guess about the mistakes they'll make.
His Master's was about algorithms and was entirely mathematical. His undergrad was in electrical engineering, and it involved plenty of hands-on work... but he got both his degrees in the 1970s. There's a pretty good chance Windows was not part of the curriculum.
Years ago I had the opportunity to host a professional astronomer at a star party at a dark site. He was really excited because he had never seen the Milky Way before; at least not with his own eyes. He was totally blown away. Most of the regulars were complaining about the transparency that evening, but to this guy it was a revelation.
oh, if you want them, most places have plenty of courses that are all reading papers, or talking about managing development, or crossover with cogsci departments. then you do a thesis on how developers are affected by some lights which change color depending on the number of bugs in their tracking system.
This article reminds me of two things I've observed in ordinary computer users:
1. Many users don’t understand how or where data is stored or even that it is separate from the application.
That is correct. My photographer neighbour keeps his photos 'in' Adobe Bridge, another neighbour keeps them 'in' Adobe Elements. They have no notion that those programs only offer a view on the pictures, which are stored in some directory on their hard disk (they don't understand the hierarchical file system either). When they have to import photos from their camera or email a photo, they always do it from their photo cataloging program, never from Windows Explorer.
Likewise, a few of my colleagues think their texts exist solely 'in' Microsoft Word, and that the 'Internet' is the same as the WWW.
2. reversibility
One day I was helping a friend of a friend with transferring some texts over to another computer. We were reviewing a text when, by accident, I deleted a paragraph and promptly restored it with Control-Z (undo). The person looked at me, flabbergasted: 'How did you do that?' He told me that often, when some part of his text 'disappeared' magically without apparent reason, he had to type it all over again. He didn't know that Word has an Undo. I told him he could benefit from a basic Windows and Word training, but he wouldn't hear of it. He did remember the Undo trick, however ;-)
Beginning with Windows 98, it has been more trouble than it's worth to try putting your files where you want them in the filesystem, instead of letting the application find a place for them. After I realized this, I tried putting My Documents on drive D - that didn't go so smoothly either.
I recently helped my aunt install a new scanner. She asked me how to send a document to someone. After exploring the driver software for a while, I advised her to start the scanner wizard/pamphlet printer/photo editor/document manager, click "scan", click "email", and let Outlook take it from there.
I honestly believe that this is the difference between Apple and the rest of the competition. You see, the secret they have to their products is that they realize that most people don't have any idea whatsoever what the hell a frikkin' mp3 file is. Neither do they know the difference between lossy and lossless conversion. They are intimidated by all those fancy buttons on the screen and are too scared to try anything just in case they break something.
Steve Jobs recognizes this and that's his genius in a way. So, instead of creating a widget that does everything, they choose to make a widget that does everything the user wants. There's a subtle but important difference here. Only a small subset of their users want extensive options so that they can put FLAC and other stuff onto it. Those are the users who fret about the product to the nth degree, and Apple knows it doesn't exist to serve them.
So, they pick a feature and implement it in a way that those other users (80%+ of the market) know how to use, and that makes all the difference. Just look at enterprise software and how hideously broken it is. You have SaaS products that try to do everything at once, which directly results in their users doing nothing at all. This causes grumbling against the software, and in the long run your sales go down. The hardest lesson in business, I think, is learning how to bite the bullet. No one wants to admit that their software can't do something. So they rush and put the feature in to satisfy themselves, without checking whether it works or not. Big mistake. What's worse is that this is so ingrained in us that we don't even realize what damage we have done.
What I am trying to say is that it's perfectly fine to not have the insanely great feature X, but it's inexcusable that the user has to read a 300 page manual to use feature X.
Then some OS X features completely baffle me. Like installing software. I'm not a "Mac guy" and each time I ask someone how to install software downloaded off the web I get a different answer, all preceded by "You just ..." and then 10 or 11 simple steps. Apparently they have never heard of installers. I'd guess from this that the average Mac has never had ANY software installed after it left the factory.
I find it pretty hard to believe that this is really the most typical response. Most Mac software downloaded goes like this: Double-click to mount, drag Application into Applications folder. Two steps, usually with big arrows to make it as clear as possible. The only other use case I know is: Double-click to mount, Double-click installer, click through till you're done. And most Mac users I know (non-techies) have installed several third party applications.
You left out the most important and confusing step: Running the app. That requires not clicking on the icon you see in the dmg folder, but opening the applications folder and scanning all the apps looking for the unfamiliar one that was just installed.
I've had several friends come to me after installing their first OS X application, saying they thought they had a virus.
"Whenever I try to run this application, I get the warning 'This was downloaded from the internet', etc., etc. I have to click 'Okay' every time I use the app. It doesn't seem safe."
The problem? They had dragged the app straight from the disk image onto the dock.
That said, OS X installation is about as simple as you can get. (Though I wish Apple would finally include Homebrew as its default package manager.)
Apparently the drag/drop install thing is not intuitive even to technical users. I don't remember what my reaction was to this the first time I encountered it, but I'd guess it wasn't entirely positive.
There's a perfectly good system wide Installer on the Mac, but nobody uses it unless they have to, because installers suck (they ask, among other things, which drive you'd like to install the software to -- what normal user knows the answer to that question?).
A number of vendors are working to improve application downloads by distributing applications as self-extracting zip files that put the original archive in the trash on extraction, and that, if the application is launched from the Downloads folder or the desktop, politely ask on first launch whether you'd like to "install" the application. This is pretty foolproof. Even if the user declines to move it, the application will still work from the Downloads directory or the desktop, it just won't be as easy to find.
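For illustration, here's a rough sketch of that first-launch check, in Python just to show the logic (a real Mac app would do this natively). The directory list and the ask_user prompt callback are assumptions, not any particular vendor's code:

```python
import os
import shutil

APPLICATIONS = "/Applications"
CASUAL_LAUNCH_DIRS = (
    os.path.expanduser("~/Downloads"),
    os.path.expanduser("~/Desktop"),
)

def maybe_offer_install(app_path, ask_user):
    """If the app was launched from the Downloads folder or the desktop,
    politely offer to move it to /Applications. Either way the app keeps
    working from wherever it ends up."""
    here = os.path.dirname(os.path.abspath(app_path))
    if here not in CASUAL_LAUNCH_DIRS:
        return app_path      # already somewhere sensible; do nothing
    if not ask_user("Move this application to your Applications folder?"):
        return app_path      # user declined; keep running from here
    dest = os.path.join(APPLICATIONS, os.path.basename(app_path))
    shutil.move(app_path, dest)
    return dest
```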
There's a perfectly good system wide Installer on the Mac, but nobody uses it unless they have to, because installers suck (they ask, among other things, which drive you'd like to install the software to -- what normal user knows the answer to that question?).
It didn't help that PackageMaker used to be one of the oddest, most incomprehensible pieces of software you ever had the misfortune to use. It's a bit better now, but most developers I know have sworn off it for good.
Apple does have a technology called "Internet-Enabled DMGs", which automatically mounts, copies the application, then unmounts and deletes the dmg. It's not widely known about or used.
That sounds like the feature that was needed. The software I've had to suffer under had me do that by hand.
No, drag-and-drop is NOT unintuitive, and if that were all it took then great. But Mac guys are so used to the rigmarole that they forget this part: "Oh yeah, then mount the dmg, then open the virtual drive, then drag one of the various icons (which one???) to your Applications folder, oh yeah open that first in the Finder, then close the virtual drive and unmount the dmg, then find the downloaded wad (what was it called again?) and delete it." All of those operations accessed from different menus/tools.
A few things used to make this process a little less painful.
First, Safari used to download things to the desktop. This meant the DMG file was sitting right there, looking ugly, taking up space.
Second, Safari used to, by default, open "Safe" attachments (which included disk images), and open the Finder window to the newly mounted image automatically.
The upshot was that, assuming the developer had made an effort to indicate what to drag where with a big honkin' arrow, after your download was done there would be a pretty clear sign of what to drag where to install the app. Unmounting and deleting the DMG has always been an issue (and the internet-enabled DMGs are better for this), although only the unmounting bit is unique to OS X. Under Windows you're still going to have an installer file lying about that has to be deleted.
Both of these things went away, though, and many developers haven't caught up with the times.
I guess the users who don't care about which drive the installation targets don't have multiple drives. Few things irritate me more than installers which don't ask that question, as space on my primary drive (SSD) is precious.
Actually, I get the impression that most Mac users have laptops.
Meanwhile, junctions on Windows largely solve the installation location problem, except for those apps that use hard links behind the scenes (especially for transactional installation changes) and expect those hard links to work, especially across temporary directories etc. In practice, I've found this is only Microsoft apps.
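For anyone who hasn't used them: a junction is created with the mklink /J built-in and makes a folder on one drive transparently appear at a path on another. A small sketch of scripting that (the paths are illustrative, not from any particular app):

```python
import subprocess

def create_junction(link_path, target_path):
    """Create an NTFS junction so link_path transparently points at
    target_path. mklink is a cmd.exe built-in, so it has to be run via
    cmd /c, and link_path must not already exist."""
    subprocess.run(["cmd", "/c", "mklink", "/J", link_path, target_path],
                   check=True)

# e.g. keep the app's bits on the roomy D: drive while the installer still
# sees its usual location on C: (illustrative paths):
# create_junction(r"C:\Program Files\SomeApp", r"D:\Apps\SomeApp")
```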
Really? I use Windows most of the time, but I didn't have any trouble installing applications on the Mac. All I did was load the disk and drag the application to the Applications folder. One step. Granted, Windows installers aren't exactly rocket science either, but the process of installing software isn't difficult at all on the Macintosh.
Given that, I agree with you in thinking that the majority of Macs haven't had any applications installed on them since they left the factory. Why? Macs come with a much wider range of applications than Windows boxes. As long as it's not common for consumer PCs to ship with Office and Firefox pre-installed, there'll be more software installation on Windows than on Macs.
I use Windows most of the time, but I didn't have any trouble installing applications on the Mac. All I did was load the disk and drag the application to the Applications folder. One step.
I'm curious - coming from a Windows background, how did you know to do this on a Mac?
It sounds so simple when you describe it, but I think it's one of the last things a power Windows user (who knows about extracting zips, running installers, exes and dlls, etc.) would think to do, since dragging and dropping a single icon would work for a very small percentage of Windows apps.
OS X disk images (DMGs) can be created with a background image. Most developers take advantage of this fact to make it really, really obvious what you're supposed to do. For example, here is what DoubleTwist's DMG looks like: http://cl.ly/cd6c0478bef684fffaed.
How could any user be confused by that? (Also, Safari is by default configured to open DMGs automatically, so all the average user needs to do is click the download link and follow the arrows).
Because it looks more like a static image than something you can interact with?
Not that I have the problem now, but I remember the first app I downloaded on OS X was Quicksilver; at the time, its DMG opened to a grey background and a giant Quicksilver logo. My Windows/Linux brain assumed the DMG was empty due to a corrupt download or something, and I ended up downloading it a second time. Of course it turned out the logo WAS the app folder.
Which doesn't make it look any less like a static diagram.
Specifically it doesn't do anything more to imply there are two objects presented, like say "drag the (image|icon|folder) below to (your|the) Applications folder to install."
Well, that's a great feature and smart of developers to take advantage of. But what I am really curious about is the thought process (if it's possible to remember and elaborate on it) of someone going through this for the first time, who was not presented with such a user-friendly DMG background image but was still able to intuitively figure out what to do.
It seems to me that while it might be a great way to install something - incredibly simple - years of Windows use would completely program someone to not consider something so simple.
It's not actually simple - that's the key. The alleged simplicity is an abstraction, and most PC power users are more familiar with the underlying mechanisms. Efforts at hiding these mechanisms are all well and good, but if and when something breaks, or you need to do something unorthodox (e.g. automated installation - many setup.exe's take a '/?' flag - or moving installed apps from one drive to another, etc.), lower level knowledge is needed.
I got a mac a year ago, and the first time I had to install an application I was completely baffled. It took me an hour to figure it out. I'm a software developer by the way.
If I were more demanding I would assume that, by now, a Mac would automatically copy the files from the .dmg over to the Applications directory when the user runs the program bundle for the first time. The programs would automatically get purged out after a timeout if the user never ran them again from the hard disk.
Or better yet, something akin to Zero Install (http://0install.net/) coated with OS X system sugar. That would also resolve the problem of software updates, which are a pain on OS X unless the program handles them by itself.
My benchmark for non-technical users is my father: he's 60-something and he started using computers & internet 3 years ago when I bought him a laptop.
I can recognize in him pretty much all the behaviours described in this article, especially the fear of setting everything on fire with a click, but I've also noticed that he tends to use the machine with a weird sequential approach.
Click the fox icon, go to thataddress.com, click there, type this, click yes etc.
For some complicated tasks (such as burning a DVD) he's got all the buttons and actions he needs to perform written down on a piece of paper.
The amount of struggle and effort he puts in it always fascinates me.
Witness what happened when Google results for "facebook login"[1] returned an article on ReadWriteWeb, which confused thousands of visitors who thought that Facebook had redesigned. So this is not an isolated phenomenon.
My assorted elderly/older relatives all exhibit the same pattern, and after years of not understanding it, I've finally learned enough about how people learn to explain it. Basically, they haven't developed any kind of abstracted internal mental model of what the computer or software are doing, and, as a result, treat it as a rote incantation. To put it in education-geek terms, they're stuck at the first level of Bloom's Taxonomy, and while they have some "knowledge" (where to click, etc.) they have no "comprehension" and therefore don't really know what their knowledge means or represents in any kind of larger context.
As far as they're concerned, each step in their "get my email" checklist is equally important and equally arbitrary. You could literally tell them that, after clicking on the fox icon and going to thataddress.com, the next step is to spin around counter-clockwise three times in their desk chair while singing "God Save the Queen", and they'd believe you- and then, one day when they only spun around twice in their chair, they'd be calling you in a panic to ask if that's why their "internet is broken".
This isn't a matter of stupid vs. smart, or anything like that - it's a matter of learning. Getting past that first step is hard for a lot of people, especially with computers[1]... and, while I'm on my soap box, I gotta say that I'm not convinced that GUIs do a good job of making it any easier (although I certainly can't think of anything that does a better job). GUIs make it really easy to acquire "knowledge" about how to use a program, but they can impair their users' abilities to get past that first step and really understand what's going on.[2]
I think that this is because GUIs sort of imbue the computing experience with a sense of, for lack of a better word, "false concreteness"- they make what are, in reality, highly abstract tasks appear to be very concrete, and let novice users get away with treating them as such... until, of course, something breaks or changes, at which point the user panics: since they don't have any sense of the larger context surrounding their use of the computer (i.e., why they need to click on one button as opposed to another, what the address in the address bar really means, etc.), they have no way of telling what "broke", or whether the changes they're seeing are important or not, and they don't have the mental tools they need to reason effectively about ways to get around whatever problem they've encountered.
Neal Stephenson touches on this subject a little bit in his essay, "In the Beginning, There Was the Command Line." It's more than a little dated, but I actually re-read parts of it a week ago and I'm happy to report that it's held up better than a lot of things that were written in 1999 about computers. Not surprisingly, the things that have held up the best are the parts that aren't tied to any specific piece of technology, but rather are about the abstract concepts and theories underpinning modern (i.e., post-1984) UI design.
[1] Anybody who's tried to teach somebody how to program will tell you that, while students have trouble with syntax and what-not, their bigger problems are almost always related to learning to think about problems in an abstract and generalized way.
[2] Here, I'm not necessarily talking about understanding the algorithms and data structures behind the program- I don't think it's particularly important for most users of most programs to understand the software at that level, although there are some cases where it might be desirable (certain medical or industrial applications, for example). What I'm referring to is the abstract understanding of how the different parts of the user interface fit together with one another and with the task that the program is trying to accomplish for the user- sort of like what Joel Spolsky's talking about in his "Figuring Out What They Expected" essay, but at an even more basic level.
To me, the interesting thing is that the entire culture of usability seems to be focused on widening this gap. Perhaps the biggest rule of user interface is "the user shouldn't have to care how it works". This extends even down into programming. You bring in a library, you look at the function specs, call a function, and if it works without you having to grok a single thing happening beneath the surface, it's resounding success. The same thing but worse happens at the application level.
Of course, good luck competing in a market where you're the only guy trying to (gasp) make the user understand what's going on. I don't even know if I expect the problem to get worse (as interfaces become more and more abstracted) or better (as users increasingly grow up with computers in their lives) as time goes on.
A mental model is essential to successfully use _any_ machine.
You don't need to have a correct mental model of how it works, just a mental model that given the most common input, predicts the output.
From your comment, I can see you have a mental model of how the car works. You press on the gas, and expect the car to go faster. You press on the brakes, and expect the car to slow down until it eventually stops. You turn the steering wheel to the right and you know that the front wheels point rigthwards, making the car go in that direction. This mental model allows you to predict the outcome of pressing the gas and the brakes at the same time, and therefore know that you should not do it.
You do not need to know the inner workings of the car. You might as well think there are midgets under the hood doing all the work.
The problem is that the mental model that most people form about computers is so wrong, that it doesn't predict anything. So they can't use a computer properly.
And this is mostly the fault of interface "designers". An interface to _anything_ should allow the user to form a mental model that predicts the outcome of the operations the user will need to perform. This does not mean exposing the inner workings of the machine, but it also doesn't mean over-simplifying and using metaphors excessively. Like I said, the mental model does not need to be accurate, it just needs to help users do whatever they need to do.
Car Analogies? Even Stephenson's book that the GP references doesn't pull that off.
Why do cars need to be refilled with gas? Why do their engines require regular oil changes? Why do their doors and ignition devices have keyed locks? Why do you need to learn to drive it, coordinating your arms, legs, eyes, and ears? Why do you need to learn the legal and social rules of traffic?
There's essential complexity here. The "mental model of how a computer works" is learning the UI paradigm (rules of the road), not busses and syscalls (drive shafts and carburetors).
The iPad doesn't have a 35-year legacy of physical removable media for user data. It also requires syncing to iTunes as the sole means of initialization and backup.
Exactly. Getting a working mental model, even if it's not very exact, does wonders for the ability to experiment. The problem is, most people get an explanation of what the different parts of the computer are, but not a working model of how it works.
And here lies the challenge of designing computers that are useful for people without the mental model, and efficient for people with it. Gadgets tend to do one or the other, but accommodating both is a good bit harder.
The "writing down instructions" approach is very effective for complex tasks that you are unable to script and don't want to invest the time to learn properly (such as, for example, every task on a computer for a certain class of user).
I use this approach all the time for bureaucratic "machines". It's so much more efficient than trying to actually figure out how the bureaucracy works.
But if you build an abstract semantic model of the bureaucratic "machine", you'll be able to "program" the organization as if it were a computer. Imagine what you could accomplish!
Sure, it's a lot more difficult due to the non-deterministic nature of most social systems, but even a loose, stochastic model allows you to "hack" organizations in a way that is likely to make your endeavors within an organization that much more effective.
Interestingly, the fuzzier nature of social systems gives it a kind of "undo" function - you can, in most circumstances, make up for mistakes and misjudgements - so the cost of experimentation is more similar to software than to, say, structural engineering.
In the golden age of the British pop industry, record companies used to play their newly pressed singles to the elderly doormen and cleaners in the offices. If these old duffers could whistle the tune after hearing the record once, they knew the song stood a good chance of becoming a hit. They called it "the old grey whistle test".
I am increasingly of the opinion that the single most valuable asset to any consumer software company is at least one "old grey". The more time I spend with ordinary end-users, the more I realise that I cannot even begin to comprehend their mental processes. They are a total black box to me, with a completely different set of instincts and intuitive responses. I think that the majority of software developers, even professional UX folks, massively misunderstand their users.
Yup. I'm lucky in some respects as a UX designer, because I do UX for set-top boxes, not computers - my users are even less technically aware than computer users, and they have a very strong expectation that their television is not going to be hard/confusing to use.
As STBs have got more and more complex, though, taking on major media centre capabilities and interfacing to Internet services such as YouTube, Netflix and Facebook, we are really struggling to keep the interface simple. But at least we are aware that this is a critical aspect of our software - computer app developers can sometimes lose sight of that.
#23: Users will not read dialog boxes. Especially on Windows. They will click them away and not even realize they existed.
Try it.
1) Put a dialog popup in your MS desktop app that says, "Clicking OK below will destroy the chair beneath you. Clicking Cancel will make a box of chocolates appear on your desk."
2) Observe when a user performs a task that produces the pop up.
3) Ask the user if they were disappointed that no chocolate appeared.
The user will look at you with incomprehension.
To clarify, I am not criticizing users. Users are users, we're not going to change their behavior with training or help manuals. We need to design around them.
That's a habituation to the reality of the Win32 style — "Find reached the end of the document | OK"
Users don't have an a priori instinct to dismiss dialog boxes out of spite — on the contrary, the elderly noob reads every one carefully as if the wrong choice would actually destroy the chair beneath them. A new dialog box like that would ruin their day, and mine when I get their panicked phone call now that they've gone off script. If they ever do build a mental model of the UI, they'll have learned otherwise.
But because of the way UI is designed, one of the first elements of the UI mental model your elderly noob will learn is that dialog boxes are to be ignored.
It seems like an easy way to avoid many of these issues is simply to make web applications where all the software is stored in the cloud. Most of the arguments I've seen for web apps are from the devs' perspective (you control the hardware, you can pick whatever programming language you want, etc), but I guess web apps are better for users, too.
There's a whole nuther article about "what users don't understand about your web app"
That network/DNS/router/etc issues aren't your fault and you can't fix them.
Everything regarding passwords and account security.
URLs, esp. typos, and that The Google (esp. its search bar) != the Internet.
HTTPS
Phishing, and a slew of other attacks. Also that punching the monkey is probably a bad idea.
Browsers, different browser versions, different browser manufacturers, cross-browser incompatibility. Also that you did not create the browser and you can't fix it. Also why they have to download and install a new version of the browser to use your software.
Why your webapp quits working when they aren't connected to The Google.
Number 11 (only applies to software for non-English speakers): English.
Related to number 4, English expressions can be as bad as technical jargon. There's no reason to call a file 'file' if the localised 'Datei' or 'Fichier' is better understood.
> [techies explore & fiddle] Non-technical users aren’t so confident and won’t try things in the same way. In fact some of them seem to think that a wrong move could cause the computer to burst into flames.
This is the fundamental difference between a "techie" and a "non-techie". In fact, the inclination to explore, to fiddle, to hack is largely how someone gains the knowledge, experience, and craft to become a "techie".
I've noticed that my own depth of comprehension is far better when I explore and experiment first, and not study the formalized knowledge from a textbook/documentation until I'm satisfied that I have at least grasped the fundamental patterns on my own. This works not just in computing, but in almost every context (or at least those that don't have the potential to cause serious damage or injury).
Consequently, when they install a desktop app on a new machine they are often surprised that it can’t automatically access the documents they created on a previous machine
Really!? I've never, ever observed this.
In the words of Zoolander, the files are IN the computer.
I once worked at an independent hotel where they backed up their reservations nightly, by dragging and dropping the reservation application shortcut from the desktop to a zip drive. I wasn't sure whether to laugh or cry when they asked for help retrieving the data after several reservations were lost. They kept pulling the shortcut back off the zip drive, but the data wouldn't come back. :) (I ended up examining the original corrupted data file, found a phone number that the owner recognized, and all was made right.)
Back when I had downloadable software prominently available I'd get this about, oh, five times a month and twenty times right around the start of the school year.
Some years ago, a relative of my wife bought a new PC and gave her old one to the doorman in her building. She then asked my brother-in-law to "put the data back" in the new machine. She apparently hadn't realized that "the data" was now at the doorman's, inside her old machine.
I think my brother-in-law was able to get a hold of the old machine before it got re-installed.
Wow. That's like buying a new suitcase when you arrive at the hotel, and wondering why your clothes aren't in it. Packing? Don't confuse me with your jargon!
That makes me wonder if we should use that exact metaphor to explain computers/software. Your computer is a suitcase. Everything you do packs your suitcase. If you want to get a new suitcase, you have to unpack the old one and pack the stuff into the new one.
My (non-technical) users don't understand version control at all. Which is weird as it was them who insisted on installing Sharepoint. But every folder has file_v1, file_v2, _v3, _v4 and so on.
That is why I am against using analogies (water through pipes to understand electricity). All analogies are false. The only question is when it is going to mislead, not if.
Files and folders are an analogy. Files aren't really in a folder, they're just associated through clever indexing. Is there any reason we should stop using that analogy?
Analogies are useful when they allow you to not think about something low-level so you can work on something high-level. If I had to think about electrons every time I sat down to write code, I'd never get anything done. If I had to learn it before I started learning to write code, I'd still be in school.
I am not so sure that this is an analogy. These are different names for new things. Except for 'folder' as opposed to 'directory'.
You sort of lost me there with the electrons thing. Perhaps you really mean the abstractions that enable you to not have to think about electrons while writing code?
This article talks about consumer software, but most of the issues are applicable even to relatively technical enterprise software. The end users of this kind of software generally have a lot of expertise in their daily routines, but stumble when they are asked to use the software to do something outside of their typical use patterns. It's important to stay humble and realize that the software you're offering to a customer is, in all likelihood, going to have a relatively small (but hopefully positive) impact on the user's life, and in these cases the users are not going to want to spend a lot of their time mastering your software.
Nice one. Regarding point 7 ("what changes can be reversed"), tendencies are different for the younger cohort compared to those who grew up with DOS. Younger people actually try a lot without fear of serious damage, so if you're building for them, you can exploit this .. provided you ensure safety under such trials as well.
This article raises good points, but I'd like to argue / disagree with it in several points:
It doesn’t mean that your users are stupid, just that they haven’t spent the thousands of hours in front of a computer that you have.
Like office workers spending 8 hours a day in front of a computer? They easily rack up that much time on work hours alone.. not that it helps.
..they don’t know how to (or even that they can) copy and paste text.
Some users don't. But what should I do, send a copy of Computing 101 with every email I send?
Many users are used to web applications and don’t understand that they need to download and install new versions of desktop software to get access to the features in a new version.
So, what's the maintenance staff there for? In my experience regular office users don't set up or update anything, ever!
Data storage
File system hierarchies and network mount points are so far beyond the average user's horizon that you shouldn't bother anyway. You just point to it in the file explorer. Everything else is completely pointless.
As for file types and converting: OK, that's a valid point. It's a good idea to send instructions with it, as it is a somewhat uncommon task.
The jargon you use
I agree. Talk the user's language, not computer slang.
You should therefore never put something only in a right click menu or anywhere else that it can’t easily be discovered.
Please, I beg you. Put stuff where you would normally expect it from your WM / OS. If it would be in the context menu, then put it there. Normally you also put stuff in the menu bar / ribbon / icon bar depending on conventions. Don't try to invent new interface guidelines, you'll probably fail.. horribly.
Concurrency
Try not to expose them or avoid it entirely (rework the workflow if necessary). Otherwise build the interface in a way that clearly notifies the user of this circumstance.
Non-technical users aren’t so confident and won’t try things in the same way. In fact some of them seem to think that a wrong move could cause the computer to burst into flames.
Most of them don't try anything at all. They will ask you how to do it. If you have some kind of system that has tweaks: train your users (that's what tech training is for).
So try to stick to conventions they will understand
Absolutely.
The need for backups
I preach this all the time. People also always agree with me in this respect. It doesn't help; they never actually do it, much less spend money on it. Might as well talk to the wall..
That they should read the documentation
I totally agree. No one will read your precious documentation. If you roll out a new software, train them with the workflow. You do train your workforce to use the tools they use, right?
Unskilled users often don’t realize how unskilled they are. Consequently they may blame your software for problems that are of their own making.
May? They always do. I figure this must be human nature or something.
One just has to be as polite as possible in such cases.
In other words, stay professional.
-------------
Otherwise I found the article as entertaining as irksome. It raises some good points.