I take it back. Unity is cool. (lunduke.com)
181 points by swah on Oct 13, 2011 | hide | past | favorite | 86 comments



Sadly, it still does not work with 2 monitors. E.g., when attaching a projector for a presentation, all your windows get moved to the projector, which can be quite a problem if you are in a meeting. In addition, an external monitor can only be placed to the right, since the dash is fixed on the left.

If you want to drag something inside an application to the left side, the dash shows up and blocks the left side of the application. There is no way to move the dash.

Just two major usability bugs, ignored for a long time.


>Sadly, it still does not work with 2 monitors.

I've been using it with two monitors both at home and at work since launch, with zero configuration.

The second issue is certainly annoying, but 'it does not work with two monitors' is a pretty strong claim.


I also have been using two monitors since 11.04 came out. It works great.


What kind of graphics card do you have?


One nVidia and one intel, not that it matters. I'm arguing with his blanket claim that 'it still does not work with 2 monitors', not that there are no driver issues on ubuntu.


Yeah, I think this is highly dependent upon the graphics card and the available drivers. I always run into driver hell with Linux.


I switched to using just my monitor when using my laptop at work for this reason. It worked all right in 10.10 for me, but switching to 11.04 caused some issues when running 2 screens at different resolutions.

As I understood from my googling, it has something to do with my laptop's integrated i815 graphics chip (and possibly with the chip itself). I do plan to upgrade to 11.10, but not until after my Monday deadline.


I can't go back to Unity until it supports three monitors (with Xinerama). I know compiz won't support this, but even Unity2d doesn't work well at the moment, and 11.10 broke basic functionality in Gnome Classic. I switched to KDE a few weeks ago and it worked across the three monitors with zero extra configuration. I've also been very impressed with the polish the environment seems to have gotten since I first tried 4 years ago. What was once riddled with bugs is now slick and very pragmatic.

When three monitor support comes along, I'll give Unity another shot, but for now, I'm on KDE.


Do you know if it works with two monitors? That is my standard setup, and haven't tried unity because so far it seems to only support one.


Yes, it works fine with two (I use Nvidia's twinview for that). It's when you hit three that there's a problem.


Good to know, thanks!


I think it's right that Canonical focuses on the user interface, so Unity is a good way for them to concentrate on what matters most: the end users. As a developer I don't care much about desktops; I find them all interesting.


That's an interesting point of view. In the past, I have been annoyed by what I saw as Canonical's over-modification of everything.

However, seeing a nice desktop UI as their main product (as opposed to just another repackaging of Debian) makes that all make much more sense: you want to build your main product yourself.


I used this for 5 minutes a while ago, realized you couldn't get rid of that odious sidebar and wrote it off permanently. Why on earth designers would prevent users from reconfiguring their screen is beyond me. Even if they fixed it, you still have to wonder about their overall judgement.


Shuttleworth once stated that they are removing configuration options on purpose in order to decrease the total number of existing configurations, to build a consistent "configuration brand". The logic is that every time 2 users choose 2 different settings, it is harder for them to talk and learn from each other, so removing options artificially levels the field and eases information exchange.


Yes, even more annoying is that the sidebar can autohide, like when you open a full-screen window. They just disable that ability otherwise and provide no way to enable it. I'm not a Mac user, but I was under the impression you could even hide the dock in OS X?


It's autohiding for me, and I don't think I did anything to it. Though, it might just be a bug, I've run into a handful of near showstoppers in the last few days.

edit: Though, now that I think about it, when I first installed, I got the sidebar not autohiding with fullscreen apps, so that it blocked the left inch of whatever was up. Annoying as hell.


Install CCSM: sudo apt-get install compizconfig-settings-manager

Open and search for Unity. Find options to mess with.


Totally agree. I switched to Linux Mint after that last release and have no interest in ever going back, even if they fix all of the many issues with Unity. I lost so much time on that last "upgrade" that I will never trust Canonical's judgment again.


Unity is going to be Ubuntu's killer app that will have people leaving Windows and Mac behind for a different desktop experience that can better blend itself with different modern devices.

Soon enough Ubuntu will be pluggable into our tablets and smartphones, or whatever hybrids come forth in the next few years.


Totally agree. I hope Unity will shine by adding more and more goodies. On my tablet I would really want to have a full Linux, if that is at all possible.


You DO NOT want to run a desktop user interface on a tablet.

Was Unity specifically designed for a tablet? Does the window system have multitouch support? Accelerometer support? Do any of the applications themselves support multitouch and accelerometers?


I'm not saying that the same desktop OS will run on both devices, but rather that the overall UI and functionality can be shared.

Though, in a couple of years, who's to say that Ubuntu won't be able to run on most devices? Maybe there are challenges, but I doubt it would be impossible. It's just a matter of time and maturity.

I think people are already asking for one user environment across devices, so I bet it's something they might have in mind.


Unity is a nice ripoff of the Mac UI circa 2003 or so. That puts it way ahead of Windows, which was a poor copy in 1990 and has been going the wrong way since.

But I see no reason a Mac user would find Unity more attractive than the Mac UI.

If they'd copied the Mac UI exactly then they'd have achieved parity.

If they'd done their own UI research and come up with something new, they might have been able to come up with something better than the Mac UI. Certainly there is some room for this, given Apple's need to support old paradigms. But if you've noticed, Apple isn't exactly nostalgic.

But by copying the obvious ideas without thinking through why Apple made the choices they did, they miss the mark... which seems to happen with everybody who tries to copy Apple's innovations. They end up with something that on the surface looks the same, but doesn't actually work the same.

I think this is because it's too easy to "improve" what Apple did based on their preferences, without understanding why what Apple did the first time was the right way.

Given that Linux has a much higher cost[1] than OS X, to gain a lot of market share it will need a much better UI. We saw how things went for Apple for many years, despite having a much better UI (and cheaper machines, actually) than Windows... it wasn't until the iPod lowered the awareness cost that people started switching to Macs in droves.

[1] Not all economic costs are measured in dollars. Lack of awareness of a product creates a barrier to entry, and while many people have heard of Linux, few are aware of it as a desktop operating system. This lack of awareness is a cost that needs to be overcome, or reduced, to get a lot of switchers.


I'm predominantly a Windows user and I hate the Mac UI. For me at least, it's hideous. The way you imply that the Mac UI is the best in the game somewhat annoys me, considering that UI quality is a matter of personal preference, not fact. I might be downvoted for this, but that's my honest opinion.

BTW, I had trouble adjusting to Unity in 11.04 but I lived through it. Now I don't find it troubling at all, and this is the longest streak I've been using Linux without moving back to Windows. For me, 11.04 might not have been a game changer, but it was a great foundation. 11.10 is updating in the background and I can't wait to check how Unity has turned out.


Bill Buxton put it well: it is an unworthy design objective to aim for anything less than trying to do to the Macintosh what the Macintosh did to the previous state of the art.


The problem faced by Ubuntu and other open source projects is that they can't afford to do that.


> I think this is because it's too easy to "improve" what Apple did based on their preferences, without understanding why what Apple did the first time was the right way.

IIRC, the Unity team has been actively doing user testing, and there have been several changes in the UI as a result.


It annoys me that I cannot make the top panel (I have no idea what to call it) either go away or have windows be drawn over it.

The only way to get rid of it is when an application has an actual full-screen mode, but then half the time it reverts if I move the mouse to a different screen with Synergy.

It does not make sense to make hard and fast UI decisions based upon some idealistic vision of how people will use things (such as not being able to move the top bar or whatever) that get in the way of how people actually want to do things.


I hope Unity will attract to Linux the kind of app designers who currently use Macs. I've tried using Linux several times in the past, and one of the reasons I didn't like it is that the apps (programs) looked so ugly and rough compared even with the ones on XP. For me, at least, that's kind of a deal breaker.


Most user interface designers and programmers gave up on the Linux desktop and moved over to Macs years ago. The Linux desktop is like Perl 6. Don't hold your breath. And if it finally arrives, hold your breath and don't breathe in, because it stinks.

Even if good user interface designers attempted to contribute to projects like Gimp, they would quickly be driven away because of politics and NIH.


True and I have no idea why you were downvoted.


1) Open Firefox, browse to http://www.ubuntu.com/tour/
2) Open another Firefox. It isn't happy about browsing to http://www.ubuntu.com/tour/ :(

Looking forward to installing and checking out the improvements!


If you google "ubuntu tour" with the fake firefox, you can get there, and then go deeper and deeper within each instance.

Well I thought it was fun anyway.


After working with unity in beta2 over the last week, I'm not a fan. (And I realize this is beta, and things may have changed. But... We'll see tonight when I get a chance to muck with it)

My current project is to put together a little machine for the kids (ages 2, 4, and 7). So I need to limit options and generally not put in anything that's going to confuse them. For reference, I've got a 10.04 desktop and a 10.04 Netbook Remix, as well as various OS X and iOS devices around. Ideally, I'd like something like 10.04 NBR, but with current Flash/web stuff, that works with the wifi dongle I have. (This is replacing a 1st-gen PPC Mini that's got hardware issues. Yay for kids' Flash games turning off the machine.)

For a machine that's got constrained purposes, for lack of a better word, I find the netbook launcher interface is really good. There are a bunch of categories on the side, there are a handful of apps for each category, and everything launches full screen. I can cut back on most of the clutter, so that they don't accidentally click on something that's going to lead them astray.

Unfortunately, the netbook launcher is end-of-lifed at 10.04 with GNOME 2, and Unity is the way forward. In my brief time with Unity I've:

* had the dock not autohide, and had Firefox launch fullscreen with the left half inch under the dock.

* gotten all the windows to launch behind the dash, only visible blurred. Had to ssh in and restart.

* completely killed the Unity system: no sidebar, no top bar, only Nautilus running. This persists through logout/login. Or rather, restart, since I can't figure out how to log out without finding a terminal in the filesystem and rebooting.

There are also some design head scratchers:

* There are categories of apps when you look at internet apps in dashboard, but not when you look for more apps.

* How do I change the big three apps on the dash screen?

* Unity keeps offering to install apps. I don't want that. Maybe it's because I'm a sudoer, but I don't really want this for the other users. And it's really confusing to be looking for something to change a setting, have the wrong name for which there are no hits, and have the top line be things to install, not things that are on the drive already.

* Why does the menu bar sometimes appear, and sometimes show only the name of the app?

And then there's probably a hardware issue: for some reason, my desktop thinks it has a laptop panel mirroring at 1024x768, so the 20" LCD starts up at that resolution. I know about xrandr; it can't do anything about the phantom panel.

Some of this can be traced to "It's not the netbook launcher", some to "It's not even as well thought out as the nbl", and some to "It looks like you have grand designs on my desktop, and you're not telling me what they are, so the decisions look really strange right about now".

Sometimes, I think I should have just spent the extra $500 for a mini. It would have been smaller, quieter, lower power, and it would have worked out of the box.


Looks like it'll be interesting to play with in this version. I also thought unity in 11.04 was far too buggy to be usable as a daily environment. I guess it's time to check out this new release.


"In fact every piece of hardware on this rig worked perfect without no fiddling around." Holy crap! Where is the fun now?


"Let’s face it, most applications don’t need to be 64bit. How much ram does a calculator really need?"

This passage shocked me. This guy seems to think that 64 bits is only useful for addressing more RAM. Who is using 4G of RAM on Linux, seriously?


To rehash a tired argument:

Removing the menu from the window to a common location is still a big usability fail. Not because it's worse than having the functionality bound to the window itself (personally I believe it is, but I could get used to it), but because Unity is the only major Linux desktop that does this, and as such most of the apps anyone is going to run are going to be built with the window menu in mind. It's usually not such a big deal to overload the menus with GNOME and KDE, but with Unity it is.

And having it double as a title bar is just stupid. Screen real estate isn't at the premium they think it is.

I tried Unity with 11.04 and hated it. The bugs didn't bother me but the design did. I'll be sticking with KDE.


I disagree; on most Linux UIs I've seen, screen real estate is at a premium because of over-generous borders, oversized icons, oversized buttons, oversized listbox item sizes, all generally looking like a 96dpi UI rendered on a 75dpi device (I've often wondered if X Window historical baggage and ancient bitmap font sizes had something to do with it). Everything is about 33% oversized.

(I'm a fan of the relative information density on Windows. It's really noticeable in file explorer views in details mode; Windows Explorer packs in a lot of information with very thin borders so the text on each line stays large (though the icon for the file is only 16x16), while the nearest equivalent on pretty much any Linux shell file explorer I've used has large inter-item borders that force the detail text to be tiny; and it still doesn't get in as many items as Windows.)


At least in GNOME and XFCE you can decrease the default font size, which is what I did right after installing Debian with XFCE. XFCE also has really slim buttons and borders. Sadly the file manager, Thunar, is not slim at all.

So if you have not already given XFCE a try, I recommend you do. Personally I won't use Unity until its skin is as slim as XFCE's.


> Screen real estate isn't at the premium they think it is.

It is on netbooks and tablets perhaps...

My beef with it currently is that it doesn't work well with focus-follows-mouse. If the menu is at the top and you want to access it, your mouse will eventually bring other apps to the forefront before it reaches the top, so you end up with the menu of a different application than what you intended.

But that is just me, with my hand-hacked preferences that have lingered around since 4 or 5 versions ago. I bet new users don't have a way to enable focus-follows-mouse...


http://askubuntu.com/questions/10481/does-unity-support-disa...

The setting to change the focus style is somewhere in gconf-editor.


Thank you. Yeah that is where I enabled it.

Sounds like someone suggests accessing the menu using a key (F10) and not moving the mouse at all. That is not a bad solution; I will try that!


Quoted for truth (viz: FFM)

haet haet haet teh top / unified menu.

It also really f*cks with rapid multitasking between multiple applications/windows, which is a mode of operation far more common on Linux systems than Windows/Macs, in my experience.


Compiz supports focus follows mouse, though it is broken in 11.04. Maybe they fixed it in 11.10.


"It is on netbooks and tablets perhaps..."

And there goes the entire problem with Unity. It is a touchscreen tablet UI offered to mouse-based computer users. I would have applauded Ubuntu if they had made a separate tablet (and perhaps netbook) distro, but why force me to use it on my desktop?


>but why force me to use it on my desktop? //

There are quite a few different WMs in the ubu' repos. You can choose not to use Unity, and do so very easily.


I really prefer a single menu location, actually. It makes much more intuitive sense to me. Also, screen real estate is always at a premium for me. I need to see all my code. :)


Apple studied this back in the 1980s. This is why the menubar is at the top of the screen. What they found was that people were able to pick the menu item they wanted much more quickly and accurately, because they didn't have to be as precise in their targeting when the menu was at the top of the screen as when it was attached to the window.

If you've grown up with poor copies of the Mac UI (like Windows) then you're used to targeting menu items on a window, and you don't realize that you're slower at it.

But you are.

It's a lot like the one-button mouse issue. People think it's worse when they've always had to suffer with 2-button mice.


Apple studied this on a 512x342 1-bit display, on a machine architecturally incapable of multitasking more than one app at a time. It's just not relevant anymore, sorry.

The ease of aiming is a real effect, but lost on a 1920x1080 display. And the usability disaster of trying to figure out which of the three "main" windows is in the foreground (and thus owns the menu) is something that was not studied, nor accounted for by Apple in the 80's.


The effect is not lost at all. It's Fitts's Law: http://en.wikipedia.org/wiki/Fitts%27s_law -- the Mile High Menu Bar is still a mile high on a 1920x1080 display. http://joelonsoftware.com/uibook/chapters/fog0000000063.html


Sure, though in the interests of full disclosure, on a large screen the menu bar is actually "a mile high, and a mile away". Still, the global menu on the Mac is probably still at least a wash for all but the most expansive desktops.

With Unity, though, the global menu bar on a large desktop is "a mile high, a mile away, and invisible". You get to guess where you should be aiming.

Are you really arguing that hitting a target you can't see is easier than a target you can see?

All that said, I do prefer Unity over other options on my netbook. The savings in real estate is well worth the marginal cost in usability. I don't use the menu bar much, anyway.

That last bit is probably Unity's saving grace: most apps used by most people are no longer designed with the menus as the primary interface.

Since they're less used, it makes sense to optimize screen use by tucking them away at the top of the screen.

I wouldn't at all mind if the menu bar had an auto-hide mode, though. Actually, what I'd like is a partial auto-hide. Tuck away over to the top left, with only the window widgets and the indicators showing. I'd gain a line or two in any editor or term on the right side of the screen.


No, it's not "a mile away", it's a few inches away. You're performing thought experiments instead of actually measuring, and that's a no-no. Your intuition is no good; the scientific method is. If you can prove by a controlled experiment that the menu bar is hard to hit because it is as far away as it is tall, then by all means publish your work in peer-reviewed journals. A lot of professional HCI researchers will be astonished, and you will be very famous for proving something so counter-intuitive that it breaks Fitts's law and flies in the face of all the other studies that have been done, and the hands-on experience of millions of Mac users.

To test your intuition (by quoting one of Tog's favorite puzzles): Name the five points on the screen that are the easiest to hit with the mouse.


It's a mile and four intervening windows all activated via focus-follows-mouse away.

On Macs, fer love of Pete, the Mile High Menu ... is on the other display.

Menus just f@cking suck anyway. I've canned my browser menus via Vimperator (on Firefox / Iceweasel). Sure, I'm a power user and I know what I want to do and I've got finger memory five miles deep (plus command completion). So suck on that teat.

Fitts's Law optimizes for one case: mouse navigation. Sure, it's nice to have a big fat landing zone when you need it. But often you don't, and the optimization unambiguously and indisputably breaks numerous other optimizations. Which, frankly, I care a whole f@ck of a lot more about.

We're talking about desktop (or large laptop) displays here. For tablets and small-factor handhelds, there are other considerations. Which is why UI design is complicated, and a task and discipline worthy of research and nuanced understanding.

The 1980s were 30 years ago. Go ahead and pop up a 512x342 window on your desktop. On my not-extravagant dual-head display, I can stack those up 6.5 across and three high. With window decorations.

Y'know, I credit Jobs with some good stuff, and he was nothing if not persistent in believing what he believed in. But some things really have to go.


Actually, with pie menus, it's quite easy to hit a target you can't see, because you can "mouse ahead" and be very sure of the direction you're moving, enough that you can reliably select between 8 different directions without looking at the screen. With four items it's almost impossible to make a mistake, unless you're holding the mouse sideways or upside down.


And no, I'm not arguing about Unity's invisible menu bar, or whatever it has. I haven't used Unity, and I have no plans on using Unity, because all of the X11 based Unix and Linux desktops have always sucked, and they always will.


The bar is a mile high, but each menu column is still only an inch wide, so distance matters, even if you have mouse acceleration so that a large vertical throw is painless.


Even with inch wide menu bar items, the fact that the menu bar is a mile high still completely overwhelms the cost of the distance of moving the cursor to the menu bar. And you can move back and forth while still moving up, to switch between menu bar items without losing the menu bar target. Do the math. Do the experiments. Measure the results. Read the papers. Thought experiments are no good.


"Do the math. Do the experiments. Measure the results. Read the papers. Thought experiments are no good."

Use the software.


"Fitts's law (often cited as Fitts' law) is a model of human movement primarily used in human-computer interaction and ergonomics that predicts that the time required to rapidly move to a target area is a function of the distance to the target and the size of the target."

How does that do anything but confirm the GP's point, that it's dumb to place a menu bar potentially thousands of pixels away from the window it applies to?


Because Fitts's law relates both the target size and the target distance to the speed and accuracy of hitting the target. Not just the target distance. You can move the mouse very quickly to cover the large distance, without worrying about the accuracy, thus reducing the negative contribution of the distance, because the target size is practically infinite. The target area of the menu bar at the top of the screen extends infinitely up above the screen, because when your mouse hits the edge, it stops moving and stays in the target. Try it yourself. It's EXTREMELY easy to move the cursor to the top of the screen, no matter how far away it is. The distance doesn't matter, because the target size overwhelms it. That's what is meant by the "Mile High Menu Bar" -- calling it a mile high is an understatement!
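The tradeoff described above can be sketched numerically with the Shannon formulation of Fitts's law, T = a + b * log2(D/W + 1). The constants and pixel sizes below are hypothetical, chosen only to illustrate why a screen-edge target beats an in-window target at equal distance:

```python
import math

def fitts_time(distance, width, a=0.1, b=0.1):
    """Shannon formulation of Fitts's law: T = a + b * log2(D/W + 1).

    a and b are device/user constants; the values here are
    hypothetical, for illustration only."""
    return a + b * math.log2(distance / width + 1)

# An in-window menu item: roughly 24 px tall, ~400 px away.
in_window = fitts_time(distance=400, width=24)

# A screen-edge menu bar: the cursor stops at the edge, so the
# effective target depth is practically unbounded. Modeling it with
# a huge width shows the distance penalty nearly vanishing.
edge = fitts_time(distance=400, width=10_000)

assert edge < in_window  # the edge target is faster at equal distance
```

The point is that width and distance enter the formula together: inflating the effective target size wipes out the cost of the longer throw.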

This is also why pie menus have improved time and error rates over linear menus: linear menu targets are very small, and increasing distances away from the cursor, but the pie menu items all start out adjacent to the cursor, and extend all the way out to the screen edge, so you can trade off increased distance of movement for increased target size. The target area of the pie-slice shaped wedges get wider and wider as you move out further away from the menu center. (I don't mean they dynamically change size as you move, I mean that as you move out, you're in a much larger part of the slice. So with a four-item pie menu, each target area gets about 1/4 of the screen real estate, and you can keep moving the mouse even further when you hit the edge and still be in the same target slice.) Pie menus also minimize the distance, but around the center, the targets are at their smallest, but you can always move out further to get more "leverage" and directional accuracy.
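The "mouse ahead" property of pie menus comes from selection depending only on drag direction, never distance, which a few lines of geometry can sketch. The 8-item layout and the up-is-slice-0, clockwise convention here are assumptions for illustration, not taken from any particular pie-menu implementation:

```python
import math

def pie_slice(dx, dy, n_items=8):
    """Return the index of the pie-menu slice selected by a drag of
    (dx, dy) from the menu center, in screen coordinates (y grows
    downward). Slice 0 is centered on 'up'; indices run clockwise.
    Any drag distance selects, so each wedge effectively extends to
    the screen edge, widening as the cursor moves outward."""
    angle = math.degrees(math.atan2(dx, -dy)) % 360  # 0 deg = up, clockwise
    slice_width = 360 / n_items
    return int(((angle + slice_width / 2) % 360) // slice_width)
```

For example, a flick straight right selects slice 2 and a flick straight down selects slice 4, no matter how short or long the stroke; with 4 items each wedge spans a quarter turn, which is why errors are so rare.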


> How does that do anything but confirm the GP's point, that it's dumb to place a menu bar potentially thousands of pixels away from the window it applies to?

That menu bar is effectively a billion pixels tall. You can throw the mouse pointer to the top of the screen and only concentrate on accurate horizontal positioning, since the mouse will not leave the top edge.

Putting the menu bar at the top sells out the distance side of the function to dramatically increase the target size.


So now that you've thrown the mouse to that easy to find top of the screen mile high menu bar, and completed your mouse action, don't you now have to find your teeny weeny window over in some far off portion of the screen and move that mouse back into the window you're actually working in?


That is correct, and it is one of the largest consistent usability complaints from new OS X (and, prior to that, MacOS) users.

On multiple monitors, the menu can be 1 or more monitors away from your app window. It might not be just up at the top of the screen; it might be at the top of the screen one over and two up.


Fitts's Law is certainly not lost on large displays; set up Expose to use hot corners and you'll be hitting infinite-width targets hundreds of times per day.

Same for screen edges: Taskbar buttons or the tabs in Chrome on Windows are awesome, extremely easy to target, because they're on the edge of the screen. (The New Tab Button was recently made Fitts'ier as well: http://code.google.com/p/chromium/issues/detail?id=48727 )

The issue with the Menu Bar is that it's the wrong thing to be putting on the edge of the screen. The menu items are still extremely easy to hit... it's just that they're no longer useful. No need to have them any more. GUI advances such as the Ribbon or just the concept of removing useless features are leading to apps like Chrome having few menu items at all.

On modern apps the cognitive load of the "which-app-is-in-focus?" is the main problem, which you mentioned. But that's a ding against the usefulness of the menubar, not Fitts' Law in general.


> Apple studied this on a 512x342 1-bit display, on a machine architecturally incapable of multitasking more than one app at a time. It's just not relevant anymore, sorry.

... how do these technology changes render their results irrelevant? The menu bar problem sits at the point where a user's ability to aim the mouse in physical space on the screen interacts with how the menu is represented in physical space on the screen. I don't see how any of the changes you've listed affect that problem, except for screen resolution, which only seems to make it worse by shrinking the physical size of click targets.


It's now more of an argument over the usefulness of the menubar, not which is easier to hit. Of course the Mac menu bar is easier to hit. The question to ask is why do we need a menubar at all?

By not having a global menubar, Windows has permitted the development of infinite-width tabs-on-top in Chrome, the Office Ribbon, getting rid of the menubar entirely in Windows Explorer, etc. It's impossible to do things like that in OS X, because we're stuck with a menubar from the 1980s.


If anything, technological changes have made it easier to hit the target, because mice no longer have physical balls that get jammed with dirt, and they are much more accurate. The kid you're replying to has probably never had to pick the black ring of scum off of the wheel inside a Mac mouse, when it stops responding to movements.


I have had to do this. Although I'm absolutely thankful that those days are over, I'm not convinced higher precision mouse control alleviates enough of the problem.


Can you explain why the one-button mouse is better than the 2-button mouse? One of my pet peeves with the Mac is that my Magic Mouse takes 2 clicks and a mouse movement to open a new tab in my browser, as opposed to 1 click of the scroll wheel on a PC. (Plus I have to waste a mental thread wondering if it's going to register as a primary or secondary click.)

Usually when I find something on the Mac annoyingly unusable, I blame it on industrial design. As in, 'boy, the Magic Mouse really hurts my hand after a while and the secondary click is difficult to use, but it really looks great' (the finger scroll is really why I use it, FYI), or 'boy, my MacBook cuts sharply into my wrists in a just-about-but-not-quite painful way, but damn, that unibody is sleek'.

I always just assumed the 1-button mouse (specifically the Magic Mouse) came about because it just looks so good.


You can configure a Magic Mouse to emulate middle buttons using MagicPrefs. It's a pretty useful add-on.


I will still take the tactile feedback of real buttons.


Cmd+Click. You can even do it one handed on a trackpad. BTW that's what I do on PCs too since the third button is so unreliable across both software and hardware.


Like many of those 20 year old interface assumptions, things like the 1 button mouse are more or less gone from Appleland. Although they really really want you to believe it's still there. Try two finger tapping on a pad, or on the magic mouse a right click is just like a right click on a regular mouse (if you turn on right clicks or secondary clicks or whatever in system prefs).


Apple does studies for these things. They studied multi-button mice and found that they slowed people down. The reason is that people clicked on the wrong button some percentage of the time.

What throws people off is that when you make a mistake with a GUI, you correct it and forget about it. You don't account for your time because your mind is focused on the task. But an independent observer, who watches you doing the same task in both situations, will have a stopwatch and recognize objectively which takes longer.

In some cases-- such as people's preferences for keyboards over mousing-- they perceive that they are faster accomplishing tasks with the keyboard than the mouse. They think this because they press keys faster and the brain accounts for each key press as a bit of an accomplishment... but the stopwatch shows differently.

It's the same thing with the two-button mouse.

Trackpads have changed the situation: as a more direct form of control, two- and three-finger gestures take less cognition. And of course, millions of people prefer two-button mice, and don't care if they lose a minute or two each day as a result.

So, while there is a reason for the choices Apple makes, those reasons can be less significant over time.

But my point was, there is always a reason, and you shouldn't try to cut new ground unless you understand the reason for the original method.


I agree that finger gestures make up for whatever I have lost from multiple buttons. But I still lose a lot of time every day trying to use the right-click on my mouse. Opening new tabs is a pain, and opening new instances of an application is a pain when I go to right-click the dock (yes, I know you can cmd-click or cmd-N, but that's not always an option).

edit: one last point before this thread ends: trackpads are awesome, and computer interaction without one has become much more difficult for me. However, they are much too modal. It's fine to make them simple for beginning users who misclick a bunch, but throw a bone to the experienced users who want some power in their interface.


On my Windows laptop, I have configured 1+1 click to act as a middle click. It is quite convenient: I move the pointer around with one finger, and when I want to open a link, I just tap with another, making the workflow quite pleasing.

Perhaps you can set up the same gesture on a Mac?


>It's a lot like the one button mouse issue. People think its worse when they've always had to suffer with 2 button mice.

It is objectively worse. Cmd-clicking or two-finger-clicking is a pain in the ass. Many things become more annoying on osx simply because of the one button policy.

OSX is generally nicer to use than windows, but it's hardly perfect and there are some things it just plain gets wrong.


Both of your examples are definitely worse on OS X than on basically every other OS in existence. And as proof, OS X is slowly but surely moving in the direction away from at least one of those, and I'd bet that within two or three revs of OS X away from both.

Menus just aren't that hard to hit; otherwise, all clickable items in a program should be on a screen edge. In fact, according to Fitts's law, they should be jammed into the corners of the screen, since those are even easier to hit than a screen edge. But they aren't, because it's not that hard for anybody who's bothered to use a pointing device in the last 20 years. More important than that, trackpads are becoming the de facto pointing device on Apple-sold computers, and Fitts's law works differently on a pad vs. a moving device like a mouse. The edge of the touch surface is the infinite target, not the screen. Since touch devices are not 1:1 mapped to screen area, all clickable interfaces should be at the edge of the pad relative to wherever the cursor starts.
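The Fitts's law argument here can be made concrete. In the Shannon formulation, movement time grows with log2(D/W + 1), where D is distance to the target and W its width; a screen-edge target effectively has a huge width along the direction of travel because the cursor stops there. A minimal sketch (the constants a and b are illustrative, not measured values for any real device):

```python
import math

def fitts_mt(distance, width, a=0.1, b=0.15):
    """Shannon formulation of Fitts's law: MT = a + b * log2(D/W + 1).

    a and b are empirically fitted, device-dependent constants;
    the defaults here are purely illustrative.
    """
    return a + b * math.log2(distance / width + 1)

# A 20 px menu item 800 px away, floating mid-screen:
interior = fitts_mt(distance=800, width=20)

# The same target pinned to the screen edge: overshoot is impossible,
# so the effective width is enormous and the difficulty collapses.
edge = fitts_mt(distance=800, width=2000)

print(f"interior target: {interior:.2f}s, edge target: {edge:.2f}s")
```

With these illustrative constants, the interior target takes several times longer to acquire than the edge target, which is the usual argument for a screen-top menu bar; the comment's point is that on a relative pointing device like a trackpad, the "infinite" edge belongs to the pad, not the screen, so this advantage weakens.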

Likewise, a second mouse button turns out to have been a great idea, so great that decades later Apple not only supports right mouse buttons: their default mouse ships with support for it, they've managed to cram a touch surface into it, and their trackpad recognizes a two-finger tap as a right-click. Why? Because decades into the great GUI experiment, it finally dawned on somebody that interface complexity requires more than one button -- otherwise half of your interface gets buried behind a modifier button (or two or three) or a pile of menus and your sole button.

I've watched many dozens of users move to OS X, and one of the first things they ask is "why is the menu bar way the hell over there?" -- while pointing to the top of the screen (or to an entirely different monitor, depending). That's usually preceded by a series of questions about how to do some function that is clearly on the menu bar; but since it's not coupled to the actual program window, they assume it has a function decoupled from the program and don't realize what they are looking for is there.

It's an embarrassingly repeatable user interface experiment that's left me convinced that the only reason it's still part of the OS is to differentiate OS X from Windows.

Physically decoupling software interfaces from the software is almost always a bad GUI idea if you can help it. It repeatedly confuses users, particularly new users. It's like putting the steering wheel of your car in your house, and the gas pedal in your back yard shed.

Everything from MDI to full-screen apps is now slowly creeping into OS X, because time and time again it's shown that users find those alternatives more usable than the old Apple standby.

Let's stop tooting this "everything Apple does in UI is best" horn. Lots of stuff Apple does in UI is great, some of it is even the best; these things are simply not.


It isn't the 80's anymore. People often have more than a single window visible on the screen. Refocusing to that window before hitting the menu is a waste of time, and especially aggravating and confusing if you forget.

But again, my main gripe is simply that, right or wrong, having the broad functionality of the window embedded in the window is what people are used to, and what a designer will have in his mind when putting it together, and old habits die hard.

I suspect that if you were to conduct that same experiment with current and prospective users of Unity, you would not get the same results.


If you've grown up using the poorly crafted Mac OS interface, you probably don't realize that almost everything else is slower, despite that one really fast menu access (by mouse only).

However, I can Alt-Tab to an app and bring up its main menu (and hunt through the menu) with only the keyboard. You just can't do that on the Mac. There's a reason most other desktop systems copy the much more logical Windows style instead of the illogical Mac OS style.


On the Mac, you can Cmd-Tab to an app and then hit Ctrl-F2 to navigate the menu from the keyboard: http://support.apple.com/kb/ht1343

I concede that this doesn't address other objections to a single menu bar, such as identifying the active window / application.


The reason most desktop systems copy Windows is simply because Windows is the dominant desktop OS. Similarly, most desktop GUIs ape Windows' white cursor rather than the Mac's black cursor and tile desktop icons from the left rather than the right.


Because the Windows GUI is based on intuition, and the Mac GUI is based on science?


The mile-high toolbar thing is right - a target at the side/corner of the mouse catchment area is easier to hit than a target you have to navigate to stop inside. But this is only right because no usability metric is attached at this point.

If the mouse-button story were just that it takes some milliseconds longer to click a control when there are two, then yes. Likely it does. But you go farther to attach usability claims to it and that's where you become wrong.

For one, the one-button mouse thing was tested on people who aren't used to it. Maybe it takes a few days to get the hang of, but it's not the kind of thing where someone who knows what they're doing hits the wrong button.

And even if it were more than microseconds slower for the base physical action, it gets more done in that time. You have to set down your drink and reach for the keyboard with your other hand, and I've already right-clicked and am done. Without knowing our workloads you can't make blanket statements like that, and I think that almost anyone could benefit from using more complex controls, macroing, etc., but only once they have a solid understanding of the thing they're trying to do.

"Had to suffer with 2-button mice." Hilarious. Stockholm syndrome.



