
I found the vision behind gestures very intuitive.

The main idea you had to understand was the split between 'edge' gestures and 'inner' gestures. 'Edge' gestures cover functionality that concerns the phone as a whole and are available at all times; 'inner' gestures cover functionality specific to the app currently in use, and any available actions are clearly signposted in the app. And the apps were explicitly designed with this interface in mind.

I thought the onboarding tutorial was very clear, and the learning curve to start using the phone effectively was negligible once the 'phone vs app gestures' distinction above was understood.

By contrast, Android has attempted to copy these gestures, but in my view they are severely lacking, with no unifying theme. (I cannot speak for iPhones since I do not own one; from my limited interaction with them they don't seem any better, and when I used a relative's iPad, my impression was that the gestures effectively had to be 'learned' and didn't make intuitive sense to me.)

Effectively, Android doesn't make the distinction between 'edge vs inner' (or 'phone vs app') gestures clear, and it is left to the user (and/or app developer) to figure out what works where by trial and error: a horizontal swipe from the edge acts as a 'back' button (but only if you keep it 'pressed'), a vertical swipe acts as an 'apps list' button (but only if you keep it 'pressed'), and a vertical swipe without keeping it pressed acts as a 'home' button. The only thing they've kept from Sailfish is the top-to-bottom edge swipe showing notifications. It shows that instead of making gestures a first-class citizen, they've just asked "how can we add gestures that act as buttons", so it's still a button-centric experience rather than a genuinely intuitive gesture-based UX.

As a result, most people I know tend to turn these off and use the software buttons instead, despite the slight cost in screen real estate. I've chosen to keep them on, but whenever I hand my phone over to my wife, the first thing she asks is whether I can enable the buttons so that she can actually do what she wants.




> I found the vision behind gestures very intuitive.

Two things:

1) Only goes to show that "intuitive" isn't the same for everyone. (Stands to reason; neither is intuition.)

2) Look at the length of your post. Anything that needs that much explanation can hardly be called "intuitive".

I didn't know we were married, but it seems I'm your wife. ;-)



