Whatever Happened to UI Affordances? (shkspr.mobi)
412 points by pimterry on June 26, 2021 | 203 comments



For more of these daily annoyances: https://grumpy.website

The growing user hostility of UIs makes me want to stop developing software and even stop using computers. I started using computers because they made more sense to me than a lot of things. Now, I'm encountering these little moments of illogic and unreason every hour of the day.

It's not just beginners who need little cues. I'm deeply familiar with the APIs behind many of these monstrosities and yet I still find myself annoyed or momentarily confused by UIs that minimize, obscure, and hide in the name of Design.


Thanks for that link. A related one just got me on DVD Netflix's payment settings site.

https://grumpy.website/post/0VkVVuQ6t

There is no button to remove your card info. It had my old ZIP code and updating it wasn't working (it would still fail address verification). So to update the value, you are somehow expected to know to clear the field and then save, then enter the correct value and save a second time.

Except it doesn't work.[0] What does work is entering a different credit card, saving it, then reentering your original card with the correct ZIP code.[1] So bad design with no affordances and a broken help function led to a long-time customer wanting to unsubscribe out of frustration.

[0] Neither did the chat function for asking for help. I had to talk to a support person who told me how this was supposed to work. A phone call to support is an expensive cost for a business at volume.

[1] Which made me think they were storing extra info using my card number as a key. And that's not great.


I tried reading the footer on https://grumpy.website to get the contact information, but it keeps doom-scrolling and the footer is only visible for a fraction of a second. What an irony.

I dream of a day when we go back to pagination. Not just page numbers at the bottom of the page to flip through, but pagination where the URL reflects the page, so you can share the 7th page with someone as a link.


I hate it when people share page numbers like that with me, because many sites have things that are either "ranked" somehow, or in reverse-chronological order, which means that the contents of page 3 are constantly changing.


Still remember trying to browse fonts on Google Fonts and my browser crashing as I scrolled too far. Shame on me for wanting to look at too many! At the very least they need a way to unload the earlier entries and reclaim resources.


For some experiences, content needs to be updated on page 1 (Google is a good example). How would you then share page 7? That requirement could only be met if the content were static (no updates) or the results were ordered oldest to newest. Am I missing something?


That's true. Pagination makes sense when the content is static or ordered oldest to newest, thanks for pointing that out. But the problem still exists that I cannot get to the footer, where there seems to be some contact information for the author :-)


You're right, the behaviour of the footer clearly creates a major usability issue. I personally dislike infinite scrolling with a passion. I can't see any big advantages and there are a lot of cons. I keep my hopes high that one day I will see it die... like Adobe Flash :)


The classical solution is to have a "content before ID_OF_LAST_ITEM_ON_PREVIOUS_PAGE" URL, though there isn't really a way to do that and also display page numbers at the bottom.
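For anyone curious, here's a minimal sketch of that idea (keyset or "cursor" pagination). The names are made up and the in-memory array just stands in for a database query; it's an illustration, not any particular site's code:

    // The "older" link carries the id of the last item already seen (e.g. ?before=1234),
    // so the server can answer "the next N items before that id" even while new items
    // keep being added at the top.
    struct Page {
        let itemIDs: [Int]
        let nextCursor: Int?   // goes into the next "older" link; nil means last page
    }

    func page(before cursor: Int?, size: Int, allIDs: [Int]) -> Page {
        // Stands in for: SELECT id FROM posts WHERE id < :cursor ORDER BY id DESC LIMIT :size
        let candidates = allIDs.sorted(by: >).filter { cursor == nil || $0 < cursor! }
        let ids = Array(candidates.prefix(size))
        return Page(itemIDs: ids, nextCursor: ids.count == size ? ids.last : nil)
    }

Because the link points at an item rather than an offset, the contents of "the page after item 1234" stay stable as new items arrive, which is exactly why you can't cheaply label it "page 7".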


Many thanks for the insight. I am now going down a rabbit hole to discover what that solution implies, how it works, etc. :)



You can just turn the site over with the button on the top, and all the information will stay fixed (:

You can also just open any individual post; there is no scrolling there either.


Or, if you have javascript off, you get no pagination and no scroll, making it look like there are only 6ish items


I can see how this one might be confusing- https://grumpy.website/post/0Vp0pSilq

but the site is misrepresenting the state. That user's state is still "muted": the red line indicates that it is a client-side mute (the user who checked the mute box can't hear them, but everyone else can), while the mute icon without a red line indicates that the other user has muted themselves and isn't transmitting audio.

There is a third state: server mutes turn the entire mute icon red and are a "far-end" mute. No one in the voice channel can hear that user, and the user probably can't unmute themselves.

Looking more at the site I feel like the point it's trying to convey would be much stronger if it went for a quality over quantity approach and/or proposed solutions to some of these problems. A lot of them seem to almost intentionally miss the point of the UI.

For example, the airpods post[0] notes, "Want to listen to the podcast on your iPhone while playing a game on your iPad? Well tough luck using your AirPods for that," which is just completely false.

Sticking to posts like this[1][2] would lead to a much higher quality website imo.

[0]: https://grumpy.website/post/0VaJdRL-y

[1]: https://grumpy.website/post/0VlYhfUMg

[2]: https://grumpy.website/post/0VfEvLg0j


The Discord mute icon having three states is no excuse for it being bad.

Unmuted -> The mic, no diagonal line.

User muted -> Red diagonal line.

Server muted -> Could be the muted icon with a lock next to it; or the diagonal red line replaced by vertical bars like a jail cell; or even a cross instead of a diagonal, maybe within a circle for further distinction.


Those are still three states, just with different shapes. If you hover over the symbol it tells you what it is, and if Discord changed to your system tomorrow I'd still need to hover to figure out what everything means. Unless the icons are replaced by text, I don't see how changing icons that all mean some form of "muted" really does anything.


I post this every time, but for the love of UX, even if you're the most hardcore code/terminal junkie… and never see a user near your program… read the book About Face: The Essentials of Interaction Design!


This is hilarious, but I'm puzzled by his reaction to a hamburger menu:

> Oh-oh. Drag-n-drop icon used for a dropdown menu - May 8, 2021

https://grumpy.website/post/0V_dyk3EP


You're only puzzled because of its ubiquity and because you're accustomed to it. Nothing about it inherently says 'opens a menu from the side'.


Late "edit": Turns out I'm the confused one! (Bit of sleep dep...) I thought this was the main menu for a page, just in a weird low position, but it's the menu that appears on hover for each item and in this context indeed would be interpreted as a drag handle. In this context perhaps something unambiguous like three vertical dots would have been better?

Also I agree with you -- perhaps the three lines represent a pictogram of a list (vertically stacked menu items?). And the triple dot version is either a compacted version of the same, or a vertical ellipsis?


Sure, but that's not an affordance. If it were vertical lines instead of horizontal ones, for example, it'd read as a grip for horizontal dragging.


Horizontal lines (or sometimes an array of dots) are sometimes used as a "grip" icon to represent reorderable list rows.


I know, I've used them in my own designs. It just sounded like he saw a hamburger menu for the first time 5 weeks ago! Or maybe it is some kind of joke?


I don't think I'd seen a hamburger menu rendered on a list, for each item, before. Normally a "hamburger menu" appears once, probably at the top of the page, in some kind of navigation bar. In that context, that icon definitely reads "drag to re-order list item" to me.


We have found that people are afraid to mess things up and don't know what the three lines mean, so they never click them. A lot more people than you would think.


Everyone sees the hamburger icon for the first time. It certainly isn't obvious what it means. You need to decide what is more important, that your site look contemporary or that your users can find the information/product/service they are looking for.

https://www.nngroup.com/articles/hamburger-menus/


Hamburger menu icons usually have a fixed position and are part of a header or something of that sort. Floating lines like that would always read as "drag grip" to me.


This is the case for many people. Another explanation is that he was being empathetic.


As a designer himself, he probably tries to approach design examinations from many different perspectives (we do this at work with new designs: if I were a 70-year-old woman, what would I see here?) to find edge cases where the design doesn't work.


There used to be a Hall of Shame for GUIs on the web that I read around 2002 or thereabouts. Very similar to this, only for native programs.



Yup! Wasted an hour there just now :-)



> Apple are slowly undoing their great usability work in the name of elegance.

Not slowly, or recently.

The big removal of affordances happened with iOS 7 and Jony Ive's "flat" UI aesthetic. Don Norman and Bruce Tognazzini, former Apple UI researchers and champions of UI affordances, wrote an essay[1] complaining about this problem in 2015.

It seems like this era might be slowly _ending_ now that Ive is gone.

[1] https://www.fastcompany.com/3053406/how-apple-is-giving-desi...


Even less recently than that. As a critique[1] from 1999 of QuickTime 4.0 puts it:

"The new interface represents an almost violent departure from the long established standards that have been the hallmark of Apple software. Ease of Use has always been paramount to Apple, but after exploring the QuickTime 4.0 Player, the rationale behind Apple's recent 'Think Different' advertising campaign is now clear."

[1] http://hallofshame.gp.co.at/qtime.htm


What about when MacOS started hiding scrollbars? Ugh.


I think the hidden scrollbars are great. They remove clutter, yet are always readily available when needed.

Scrolling in MacOS is one of my favourite things about the UI compared with various versions of Linux and Windows I've used. This is due to the two-finger swipe gesture being really well done and the scroll bar being available when needed.

I used Linux as my primary desktop non-stop for 19 years, and towards the end the scrolling had got worse. The Ubuntu desktop's weird scroll handle thing was awful. MacOS scrolling was so much more ergonomic than that it convinced me to switch. I still develop primarily on Linux, but prefer to do it over SSH even though I have a Linux desktop when I want it.

It doesn't take long using MacOS to develop an instinct for scrolling things with two-finger swipe, and scrolling is generally a very safe thing to do so trying it is harmless. The way swipe scrolling is independent of the focused window is just right, and the scrollbar which appears is easily grabbed on the occasions it's needed.


Problem is the scrollbars often don't appear when needed, so the user has no idea they are there. Worse, even when they do appear, they sometimes disappear when you reach for the thumb to move many pages. It's excruciating on documents with hundreds of pages.

Disappearing scrollbars also violate the original Mac UI principles, which were based on an enormous amount of human factors research.


I don't have a Mac to try it, so would you care to explain how the hidden scrollbar works? Like, when you open a dialog, do the scrollbars of its scrollable elements always appear and then slowly fade out, or how do you know something is scrollable without trying to scroll?


Yes, scrollbars are supposed to flash when a view appears. This can sometimes not happen, or be hidden (say you open something in a tab and the code is not correctly written to handle visibility, or the more obvious case that you just didn't see it happen) but it's intended to let you know that scrollers are there.
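If it helps, an app can also re-trigger that flash itself. A tiny AppKit sketch, assuming a scroll view outlet (the names here are mine, not Apple's sample code):

    import AppKit

    final class DocumentViewController: NSViewController {
        @IBOutlet weak var scrollView: NSScrollView!   // hypothetical outlet

        override func viewDidAppear() {
            super.viewDidAppear()
            // Briefly shows the overlay scrollers so the user knows the content scrolls.
            scrollView.flashScrollers()
        }
    }

So when the scrollers don't flash on a view that just appeared, that's more likely the app's bug than the intended system behaviour.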


MacOS hiding scrollbars isn't too bad. Google trying to make all scrollbars like 1 pixel wide, that's the worst.


I think that's around the same time? Not sure. Anyway, at least that one is easily overridden in System Preferences >> General.


I kind of long for the old skeuomorphic look of iOS; something about the in-your-pocket form factor and the touch-centric interface made it really work.


The original Mac would pop up a dialog with a threatening icon of a bomb with a lit fuse, whenever it crashed!

https://en.wikipedia.org/wiki/Bomb_(icon)

>The Bomb icon is a symbol designed by Susan Kare that was displayed inside the System Error alert box when the "classic" Macintosh operating system (pre-Mac OS X) had a crash which the system decided was unrecoverable. It was similar to a dialog box in Windows 9x that said "This program has performed an illegal operation and will be shut down." Since the classic Mac OS offered little memory protection, an application crash would often take down the entire system.

Unfortunately, the Mac's bomb dialog could cause naive users to jump up out of their seat and run away from the computer in terror, because they thought it was going to explode!

https://www.youtube.com/watch?v=zQGX3J6DAGw&ab_channel=Caitl...

And Windows' "This program has performed an illegal operation and will be shut down" error message was just as bad: it could cause naive users to fear they might get arrested for accidentally doing something illegal!


Not sure if you’re being serious or not, but these stories seem awfully apocryphal to me.

I can’t imagine anybody - not even a total novice - running away from their computer upon seeing a cartoony image of a bomb. It reminds me of the made-up but often used story of terrified audiences running for the exits upon watching a film of a train for the first time.


Yeah, nah, it's not an exaggeration. Even as a kid, the first time I saw "illegal operation" on a Windows machine I thought it meant something actually unlawful had occurred.

In the early 90's there was a prank Extension for Mac OS called "Radiation" and a partner application called "Trigger". Trigger would allow you to pop up a system alert dialog box over the Appletalk network on whichever machine had the Radiation extension installed, allowing some nice pranking opportunities. The default message was "The radiation shield on your Macintosh has failed. Please step back five feet." … It worked, and some people would literally jump away from their desk when they saw this message! Ahh, the good old days hahaha :)

(If you're still skeptical, I can probably dig up a copy of this software since I surely have it on an old disk somewhere, but of course I can't prove people moved away from the computer -- there's only my anecdote to go by here haha)


> It reminds me of the made-up but often used story of terrified audiences running for the exits upon watching a film of a train for the first time.

The cultural recontextualization of stories like this in my lifetime has been one of the most fascinating aspects of our modern world for me.

Of course, as I get older, I've seen my own stories and stories about events I participated in get told, re-told, and twisted and exaggerated without absolutely any malice. We also all see how people misrepresent news stories, social policies, and scientific studies (with various degrees of malice). So I think I know almost exactly how the Film of Train anecdote happened.

1. Theater owner starts showing Train film. Stands outside and yells "Come inside and see the wondrous train show. This new 'cinema' is so real that audiences have reportedly been running for the exits in terror!"

2. It's an obvious exaggeration and a joke. Passersby laugh, but are intrigued nonetheless.

3. Someone writes a newspaper story about the cinema and the train film. Includes the quote as directly attributed to the theater owner. Everyone reading is aware of the context and the situation and the implicit tone, and chuckles appropriately.

4. The story gets picked up by another newspaper, which drops the context, removes the quote, and instead presents it as a factual retelling of what happened.

5. Years later, someone writing a book uses the newspaper as a primary source, and then a cascade of books repeat and propagate the "fact".

Perfectly reasonable sequence of events. But here's where things get even more interesting to me.

There are *dozens* of similar stories that we've all heard and taken as gospel growing up, and it only required one person to stop and ask "Wait, does this really hold up to scrutiny?" for the whole house of cards to come crashing down. I'm talking about "NASA spent $10M to design a space pen, the Russians used a pencil", "Water flushes in the opposite direction in the southern hemisphere", and the like.

But why did it only happen just now? What prevented people from being more introspective and curious about these subjects 10, 20, 30 years ago? I guess the answer is we needed the Internet to hit a certain critical mass for enough people with sense to be able to reach the rest of us, but I don't know.


> What prevented people from being more introspective and curious about these subjects 10, 20, 30 years ago?

Perhaps we should be asking, what subjects are we failing to be introspective about now, that will fall into this category 10, 20, 30 years in the future?


I remember seeing this dialog as a kid (probably ~5 years old) and being scared the computer would blow up. Actually, for years, I assumed I must have dreamt it.


It seems kinda hyperbolic, but then again the death chime on Power Macs scared the hell out of me as a kid.


Some of those death chimes were pretty ominous! Depending on the model, there were a few different dirges, breaking glass, car crashes, and others. You knew you were in for a bad time.

https://512pixels.net/2021/04/mac-chimes-of-death/


They still scare me!


Nope. I too have seen users freak out about illegal operations.


And cookies:

>>Almost every time, however, something unexpected would occur, causing her to panic and call her daughter for help.

>>"It could be almost anything," Widmar said. "She goes apeshit whenever a pop-up window comes up. And one time, she paged me because she got a message about accepting cookies. She was all freaked out because now she thought she was being charged for actual cookies."

https://www.theonion.com/getting-mom-onto-internet-a-sisyphe...

Yes, a satire article, but representative of how people reacted to such cryptic browser messages.


>"For practice, I logged onto Yahoo! and had her search for cheesecake recipes," Widmar said. "She got totally confused by the fact that we were searching within a web site for other web sites. She kept typing her keyword searches into the Internet Explorer address bar, not into the Yahoo! search bar. Then, when she accidentally typed 'cheesecake' into the Explorer box, it actually worked, because there happened to be a web site called that, so that just confused her even more."

This part was pretty prophetic for 2002.


When I was a young kid, my family visited another family for a party at their house. I was trying to play SimCity or something on their computer, and the family's son, a bit younger than me, was there with me. Some error was coming up and I was trying to make it retry, but the computer kept beeping. The son was like "you're hurting it!" and got really upset that I was making the computer beep. I mean, I was a bit older and had a real laugh, but he was genuinely concerned for the well-being of the computer! IIRC he even ran to his parents because I was "hurting" the computer hahah


I was that user as a kid.


As a child I remember shielding my face and being scared when I tried booting a computer and seeing the Windows 95 boot logo (with the red/blue/green/yellow squares) on screen as it reminded me of CDs and I heard that lasers were harmful. I was probably 4 at the time though.


As a kid I pulled a lot of pranks. There's always a moment of plausibility; I recently heard this described as "defaulting to truth", a hard-wired trait which allows most of humanity to coexist.

I once changed a buddy's autoexec.bat to prompt something like "illegal software detected, press [enter] to begin scan" and then repeatedly pipe a directory listing to null (causing the hard drive to make noises). He leaped up and unplugged his computer.

Retelling that story now, I feel just a bit of remorse.

--

Oh, another example, probably apocryphal.

At the dawn of cinema, audiences were said to recoil the first time they saw a disembodied head, like when you cut away for a close-up.


Ahhh yeah, a friend of mine made a boot floppy that echoed something like "Erasing C:\ ..." and maybe listing out filenames (hard to remember). I think he set some option so it wouldn't listen to keyboard input so you couldn't interrupt it? He found it absolutely hilarious, though I never got to see the prank in action, haha


I definitely encountered novice users puzzled and worried about the "illegal operation" message.


I have always hated this phrasing. I prefer invalid or something that doesn't invoke legality.


When I first got a Nigerian scam email, I thought it was a genuine request for help...

Happens to us all.


> it could cause naive users to fear they might get arrested for accidentally doing something illegal!

Oh yes, and I know of a real-life case of this. A friend of mine contacted me about that message when her browser crashed. She made it clear to me she was not on a porn site when it happened. Programmers for so long just didn't think about non-technical users, and lots of them still neglect to consider that what seems obvious to someone steeped in jargon and software conventions is completely meaningless to laypeople.


That reminds me that our Mac (circa 1992) had a menu option to “Erase Hard Disk”. My Dad would routinely remind us to never click this option and (unfortunately?) we never tried it.


>the Mac's bomb dialog could cause naive users to jump up out of their seat and run away from the computer in terror

Whereas the Amiga would just instruct you to meditate on the error code:

https://en.wikipedia.org/wiki/Guru_Meditation


Honestly I don't think minimalism is the issue. And I love skeuomorphism, even the overdone Apple version. Minimalism is ugly, but when done right it's actually one of the clearest UXs possible: only the content and guides are in focus, there are no weird extras, and every design decision is meaningful because you only get a few.

The issues with today's UX are: more dark patterns; app development getting easier (= more inexperienced developers who don't understand UX); and companies trying to "stand out" but hurting UX in the process (because they still want minimalism, and all of the good ways of standing out with minimalism are taken). In the author's case Android's share menu just happens to suck; he even shows Apple's menu, which is still minimalist but actually clear.


I think the issue is that modern interfaces are self-centered. Or rather, the designers are self-centered: they believe that the world revolves around them, and Their Magnificent Design is the One Thing that everyone will want to learn, appreciate and admire.

In reality, their design is one more thing among the hundreds or thousands of things that a user manages, and should mostly serve to get other things done.

It's more about hubris than minimalism.


That's extremely ungenerous, and no more true about designers than coders.

Perhaps some designers start out with this immature attitude, but the professional designers I've worked with are user-centered, not self-centered. That's just a professional prerequisite, and what you learn in design school as well.

And in any case, you could just as easily say about programmers starting out that they too often believe "Their Magnificent Program is the One Thing" and ignore what users actually need as well.

In any case, when it does happen (to anyone), it's not hubris, just immaturity. If something seems like the best solution to you, it takes experience and perspective to realize it's not always the best solution to others. But people generally learn that fairly early on.


> If something seems like the best solution to you, it takes experience and perspective to realize it's not always the best solution to others.

I think this is exactly the point I was making.


Yeah, the problem seems to be a lack of qualified designers in the first place. Or maybe designers who are not given enough time to do the job.


A lot of times, the approach that the designer took is the right one, and the approach the engineer proposed is flawed when tested with real people.


Either way, time wasn't spent iterating on the design until it was good. Designers are good, but they make mistakes too.


As a UX designer: too few companies understand the importance of UX and the difference between UX and UI. It is still too difficult to convince the stakeholders of the importance of user research, adherence to usability and accessibility guidelines, and user testing. Those activities have a cost, both in terms of money and time. But they can save a project, strongly reducing the risks of failure.


UI affordances have been going away for a long time. FlatUI was the first nearly universal obviously-user-hostile movement. It happened out of boredom and the slow intrusion of the contempt-for-the-audience attitude of art and architecture into web design.

Unfortunately, unlike art, we can't ignore architecture or UI design.


I don’t think flat design is inherently hostile. The same way I don’t think heavy skeuomorphism is inherently patronising. We went from a violent explosion of visuals to ambiguous interactive elements. Both shite, both a pain to use, both bad design. Both can be avoided within their respective philosophy.

As OP shows, iOS hinting at more content by partially showing the next item is a good example. You can very well create depth and hierarchy with minimal visual cues.

However, what has increased since flat design became the hottest shit ever (2013/2014) are downright malicious interaction patterns, no understanding of the platform (e.g. button vs link), and an ever-increasing lust for engagement. The design flavour is merely coincidental.


I think it takes a lot of work to ensure that removing depth and shading doesn't remove information, too. I think flat design inherently "wants" to be, if not user hostile, at least less user-friendly than what preceded it.

It's a win if the depth and shading was just noise, before. If it was signal, well, now you've got to find something to replace that with, or you're harming UX. And you've got less "bandwidth", if you will, to work with.


Exactly. Flat design removes signals in order to let the UI appear simpler. But the removal of meaningful signals makes the UI more ambiguous and harder to use. Flat design is an obfuscation, trying to cover up the actual complexity behind a visual semblance of simplicity.


But going back to the comment above, I don’t think that flat design is inherently to blame here. Flat design doesn’t require the removal of shadows etc. in order to favor aesthetics over usability. That removal of signal is just bad design. If the designers of flat systems did better at replacing the lost affordances and visual cues, flat design has the potential to be incredibly powerful at making clear, accessible user experiences.

Now that I’ve said that, I do want to make it clear that I haven’t seen an excellent exemplar of flat design. I remain optimistic though.


Just to add, Android's Material Design is full of shadows and 3D encoding while still being flat. It's not a great design by any means, but it's proof that flat design does not imply a 2D UI (despite its name).

The single largest problem is that when we took 3D buttons away, we got no icon for the idea that some object is clickable. An icon does not need to be skeuomorphic, just unambiguous and easy to recognize, but it does need to exist to be useful. Flat design uses a high-contrast background to encode that, which is extremely ambiguous, but there's nothing prohibiting people from creating a better icon.


Flat design is the definition of art because it creates something without the intention of usability. Functionality that is hidden isn't usable by definition. It always annoyed me in early Mac UX that there wasn't a specific affordance for additional interactive behavior.

Option this, Ctrl that. Yes, they can be discovered socially and informationally, but that goes against intuitiveness.


The original Mac or Lisa was good in some ways, but it also had problems with faux minimalism. In particular, Jobs insisted on a single-button mouse for 'simplicity', but that required inventing the entirely undiscoverable double click, which had to be taught, and still causes confusion today.


Yes. I agree. Also, the insistence on a "single task at a time" notion when real work required context switching.

Simplicity can go too far in the search for design advancement; you need a balance. Form + function: not an industrial beige box bucket of parts, and not a confusing 2001 monolith, but something in the middle.


Windows 3.1 in EGA mode was fine. It had no depth (you can't afford that when you have 16 colours for the whole screen), but it worked, because it had consistent button borders and things along those lines.


Consistency is key, I think. Part of the problem now is that every app tries to redesign the entire interface from scratch, meaning there are no general interaction principles that a user can rely on anymore. I hate it.


I think the saddest thing about “flat” designs is that technology is incredibly capable of delivering so much more. Sure, in 1988 we only had a few colors available in hardware so maybe a button had no choice but to be one boring solid color. In 2021, though?!? Ultra-high resolutions, massive color palettes, photo-realism has never been easier to achieve (even in 3D!!!), all this computing power, and then these overpaid “designers” give us: boring square buttons with one color with unreadable contrast. We deserve so much better from modern UIs.

UIs should not be so plain and spartan that they are literally unusable sometimes. Every Single Button should look like a button, with way more detail than UI buttons have ever had before (why not have a ridiculous number of colors in gradients to make buttons truly beautiful?). When something is highlighted, I want it to be obvious and, again, beautiful (why not glow with photo-realistic lighting effects, for example, since we clearly have the ability and can spare the processing power?).


I have been wishing for a return to monochrome (green and orange) so that we can actually read our UIs again. You know it's bad when people are fantasizing about mainframe terminals.


So are those vast swathes of wasted whitespace in modern UI design an attempt to turn a UI into a gallery wall on which to display those intricately crafted icons?


And, as with the Yale box[1], all icons will eventually converge on the same exact design.

Finally, the perfect UI (and modernist building) will have no differentiation at all. An exercise in pure, platonic intellectual/aesthetic navel gazing.

You are starting to see the post-modernist reaction in some places in web design, but, as with architecture, I anticipate it will play with forms out of boredom rather than do the hard, self-abnegating work of drawing out the good ideas of the past and humbly driving them forward.

Perhaps I am too cynical.

[1] - https://www.goodreads.com/book/show/41001.From_Bauhaus_to_Ou...


My personal theory is that a lot of universities got into "UI design" but basically just had print design teachers teaching the courses. Plus, UI design with depth and complex elements requires more talent than flat design. IMO it's a generation of lowering the bar so people with less technical ability can participate in design.


It happened out of an explosion of pretentious eccentricity flowing out of people who never sat with a user.


This, 1000x over. Lots of times I think to myself: there is absolutely no way whoever designed this has actually tried to use it.


Never thought I'd see the day when libXaw would be at the cutting edge of UI design again, but here we are.


Revell (Japanese) model kit instructions: We will show you every necessary step clearly and precisely using line drawn actions to provide as much accessibility to all people as possible. Only for very complicated concerns will we use language.

Windows 95 UI: It might be fuggly, but you know where the bodies are buried.

IKEA: We will show you how to assemble this sawdust into a crappy bookcase that doesn't sit square or level using line drawings and language inconsistently.

FlatUI: Physical products should be delivered as a white box inside another white box without instructions. Tech support is an unnecessary expense.


That's one of the most interesting points to me:

Revell insists on the clearest, most hand-holdy instructions possible, for a process that is long and finicky, and they assume people will patiently go all the way through anyway.

And that's not just model kits; other Japanese companies take the same approach. Finicky and painfully long, but clearly explained.

In comparison, IKEA spent decades making its furniture quicker and clearer to assemble, so they can get by with less explanation.

Current-day IKEA furniture is way sturdier, easier and faster to deal with than any other flatpack company's I've seen, Japanese companies included (and I went through a bunch recently…)


> IKEA

A friend of mine posted on her Instagram about a cabinet where she thought the drawer slides were messed up and she couldn't fit the drawers. I looked and just kind of guessed, "is it upside down?". Yep, that was the problem. Apparently the instructions or parts were unclear enough that it was possible to put the feet on the wrong end or something.


This light sometimes blinks green. Sometimes it blinks orange. What does it mean? Who knows.


Why does everything need to have a little blue or red LED now? <engage grumpy old man>... I am pretty strict about not letting any electronic devices into my bedroom, because when I turn the lights out to go to sleep I don't want to see a constellation of LEDs, some blinking. The blue ones are the worst, for lots of reasons. My phone goes face down or in the nightstand drawer. I got a new window A/C unit a couple of years ago and it has LEDs and a remote control with an LED that blinks. Stop it already.


I wish I could upvote you more than once. I have to stick electrical tape over all of the LEDs in my bedroom, otherwise they disturb my sleep.


It's either an Ethernet port light or one of those cryptic things from AliExpress without a manual.

----

There was once someone who knew.

It did have a useful purpose at one point for expert users, technicians, and engineers.

A large group came along chanting "What does it mean?" louder and louder until it was a thunderous war-cry.

The one who knew was trying to shout the answer but he was whispering in a tornado.

Then, someone else said loudly: "I don't understand. Let's just get rid of it."

Most everyone said: "Yeah! It's useless!"

No one listened, and now that understanding is lost to the sands of time, like the Antikythera mechanism and Damascus steel.


> Modern design is so beautiful to look at - but an absolute nightmare to use. You either need to use trial and error on every element, or hope that someone else can tell you what you need to do.

This. Aesthetics has overtaken usability. Back in the web-design days, a good interface was defined as one that was easy to use first and pretty to look at second. During the rise of mobile interfaces, this got flipped: now pretty to look at ranks highest, and easy to use is secondary.

Is this because of real-estate? Mobile screens are smaller, and something had to go.


What's crazy is that at one point the new wave of designers was hailed as digital native designers replacing the bad old print-trained designers. They'd save us from treating the screen like paper, unlocking the true potential of these interfaces.

The worst sins of the print-designer era may have been pretty bad, but I'd say the average actually got a lot worse when the new crowd took over. First for the Web, then for native when Web-trained designers started working there, too.


The idea of grounding UX in physical analogues, in my view, is incredibly important. No matter how much we interact with digital screens, we still walk on ground, pick things up with our hands, etc. Just because we can do anything in a digital experience doesn’t mean we should.

Skeuomorphism may have gone too far, but the pendulum seemed to swing back too far.


At some point skeuomorphism becomes anachronism, though. The canonical example is the continued use of the floppy disk icon for save. A mechanical stopwatch for a timer. A telephone with a switch hook and dial. A desk calendar with rings and pages. A pencil and paper for text editor.


The problem with trying to get rid of those "anachronistic" symbols is that when you task someone with it, they will come up with a bunch of very arbitrary symbols that are also simple permutations that are hard to distinguish.

There was an example in a previous HN thread on this topic. For "save", they had a rounded rectangle and an arrow, and then a bunch of operations had something similar, but with the orientation and details of the two elements changed.

Or, another example that made an impression on me when OS X first came out, was when the basic window controls became circular, jewel like bubbles, only distinguished by color - instead of each having a distinctive pattern/symbol.

It's so easy to fall into the trap as a designer of applying a pleasing uniformity that undermines the user experience. Imagine the nuclear power plant controls with hundreds of identical switches in rows and columns. Or an aircraft.

Nothing would mean anything without the past providing context, so complaining about anachronism is paradoxically complaining that a meaning is too well established.

The letter A is thought to come originally from a diagrammatic ox head, but should we eliminate it because we no longer use it in relation to livestock?

Just because people don't like a floppy disk icon shouldn't count for anything unless it can be demonstrated there is in fact some other symbol that is more associated with saving.


If we still used an ox head to mean the phoneme for 'A', that would be closer to the floppy disk icon for save. Consider, also, that the word 'save' predates computers. What, if the computer or floppy disk didn't exist, might we use to signify 'save'? Perhaps a currency symbol? A money bag? A bank? A life preserver? A piggy bank? Or maybe we don't even need a 'save' action any more (the best programs have autosave already), but some kind of "mark this state so I can come back to it (undo) later". Apple's Time Machine is one paradigm to consider as well: there's never a "save" step, but you can go back to a previous state with not much effort.


App companies are optimising for sales, and pretty sells better than usable.

This isn't just true for software. There are lots of things, even trivial kitchen items, where you can't appreciate just how badly designed they are until you've used them for a bit. It is infuriating!


Sorry for this pedantic comment, but he is asking for signifiers not affordances. He wants affordances to be more perceptible. See Design of Everyday Things.


I upvoted your comment because the blog author even references that book, and the book goes to great lengths to make this distinction between the two. So I think it's only fair to point it out.


And yet, all the upvotes showered on nitpicks again prevent HN from using the top comment for substantive discussion on the subject of the article.


That's a very fair comment. I must go back and read the book again some day.


The difference in terminology depends which edition of the book you read. In the second edition he includes a long passage explaining that a lot of readers of the first book misunderstood the word "affordances" and so he was now introducing a new term, "signifier," which means what people thought "affordance" meant.

In short, the app already affords you the ability to scroll to the right to view more options, but there's no signifier telling you this.


Is there a more intuitive word for "affordances" or similar? I find its definition is always debated or confused in the comments so I prefer to avoid it where possible.


The affordances of an object are with respect to the observer: the object affords opportunities to the observer. It's like "resonance", which is not in the object or the perceiver but describes the relationship. Signifiers are on or in the object.


In what settings do you find these words useful though? Where do they save time and increase clarity?


Nope, that's the nomenclature and it means something very specific in the cases of architecture and UX. Affordances "afford" cues to the viewer that something has perceived action possibilities. It's the opposite of a hidden passageway door or a flat rectangle that doesn't have any cues to indicate that it's a button.

https://en.wikipedia.org/wiki/Affordance


From the link:

> The different interpretations of affordances, although closely related, can be a source of confusion in writing and conversation if the intended meaning is not made explicit and if the word is not used consistently. Even authoritative textbooks can be inconsistent in their use of the term.


I don't see any specific evidence given that any good textbook is "inconsistent", merely vague accusations cast without evidence. A straw man.

Affordances have a specific meaning in design. Affordances themselves are subjective because they depend on prior user training. Signifiers are nonessential, supportive adjuncts to affordances to reduce cognitive load (fewer uncertainties and more clarity).

"PUSH" sign on a door that already had a door crash bar facing the observer. A crash bar already indicates it is both a door and opens outwards. A further signifier for a clear wall and door would be a faux door-jam around the perimeter of the door so that people can tell where the door is more easily. If a door blends-in completely to a wall, then any indication of it is an affordance.. it's additional, supportive cues that would signifiers. Putting bright orange around a "PUSH" sign or some aspect of an opaque doorway would likely make it a signifier.


> I don't see any specific evidence given any that good textbook is "inconsistent," merely vague accusations cast without evidence. A straw man.

Evidence from your link:

   Human–Computer Interaction, Preece et al. (1994, p. 6): The authors explicitly define perceived affordances as being a subset of all affordances, but another meaning is used later in the same paragraph by talking about "good affordance."

   Universal Principles of Design, Lidwell, Holden & Butler (2003, p. 20): The authors first explain that round wheels are better suited for rolling than square ones and therefore better afford (i.e. allow) rolling, but later state that a door handle "affords" (i.e. suggests) pulling, but not pushing.


So?

An affordance affords. It's in the word.

> pulling, but not pushing.

Because of prior ubiquitous, universal training. Something with a place for fingers to grasp must be for pulling, because pushing has no such requirement.

If you want to split concept hairs or justify common-sense, you're going to have to delve into linguistics.

Have a happy weekend.


I'm not debating the definition of the word. I'm saying I avoid using it where I can because I've personally found it hard to get multiple people to agree on the definition - you're proving the point by debating against your own link.


I would second that take. "Affordance" is commonly used to mean at least two different things all the time. I think Don Norman himself recommended against using the word at some point.


If you present me with a completely blue screen, there are no affordances.

If you present me with a blue rectangle on a black screen, it's still confusing and cognitively-loading what the hell it's for or if it does anything. There are no affordances, only questions.

If you present me with a blue rectangle with an outset, stippled border on a black screen, it's clearly a button. Without that border that meshes with familiar previous training, there's no way to know it was a button. That's an affordance.

If a highlight or animation were added to the blue rectangle that already had an affordance, that would be a signifier.

Affordances are major indicators of action potential while signifiers are minor, helpful reinforcers of affordances.

The messy, subjective discussion is: what happens when all signifiers are thrown away, and then affordances too? Baby/bathwater defenestration.

https://ux.stackexchange.com/questions/94265/whats-the-diffe...


The share sheet on Android is super confusing.

It doesn't help that several apps implement their own, and some of them scroll horizontally while others scroll vertically.

Edit: apparently they're forcing apps to use the native share sheet in Android 12? https://www.androidpolice.com/2021/06/01/android-12-will-spe...


For reference, this issue was brought up to the chromium maintainers but it was closed as wontfix without giving a real reason.

https://bugs.chromium.org/p/chromium/issues/detail?id=112301...

> We are not going to migrate to the system share sheet any time in the near future, so this bug itself is WontFix / WAI. We are also not likely to have an option to disable the chrome share sheet.

And yes, due to API changes you'll no longer be able to replace the system share sheet in Android 12, because it basically amounted to a loophole that shouldn't have existed.

https://sharedr.rejh.nl

> We had never actually intended to allow apps to replace the share dialog, that Intent is for apps to launch the share dialog.

Now I will have no choice but to eat the 2 seconds of loading every time I want to share something as it loads a set of irrelevant share targets that I will never use anyway.

I really wish that feature was configurable but you can't turn it off.


No, they're preventing 3rd-party apps from actually replacing the system share pane for other apps. An app can still choose to implement their own in-app share menu instead of using the system one.


Only one problem with the Share dialog:

Doesn't "hint" at horizontal scrollability, just shows four apps as if that's all there is.

I've seen this trip up advanced users and parents alike.


UI designers hate scrollbars. They must be eliminated. But why? I don't understand.


I build mobile and web apps, full stack on many platforms, and I was severely tripped up by that shitty share menu. If you have to think about it on your first use, the UI sucks. Period.


It’s because apps want to track where you’re sharing links to.


It always amazes me when I see a post on HN or elsewhere with the vintage Mac GUI - it was visually very clear and attractive, with lots of nice, well-delineated affordances!

1980s/1990s-style graphical user interfaces were remarkable in terms of how much of the available screen real estate, CPU power, and memory they were willing to dedicate to the user interface. It reduced the visible content and available computing resources, but it made the UI very clear.

As much as I like multitouch, it isn't visually discoverable, even after you learn the basics (tap, tap-and-hold, tap-and-drag, swipe). It's nice in a way that the whole of your tiny phone screen is used for content, but it can be frustrating trying to discover the control methods.


Not just on phones/tablets. If you have an application running in dark mode and your Windows desktop is dark, you can't see the edge of the window to resize it. Also, with Windows 10 you don't grab the edge of the window, you grab the shadow, which is not visible because shadows on dark desktops have no contrast.


In another well-known OS, the pointer can be somewhere inside the corner, and you can press a meta key and click and drag the corner without being on it exactly.

I used to miss that functionality quite a bit when using Windows, but I think I use window snapping more now.


I loved that too! I use AltDrag[0] to get that functionality in Windows. It's a bit old and you may have to tweak it a bit to make it work with HiDPI, but it's now become a must-install when I set up a new machine.

[0] https://stefansundin.github.io/altdrag/


AltDrag is amazing. Plus it's becoming a necessity because the window bar is now getting filled with elements too, so you can't rely on being able to use it to drag windows around by clicking and holding on any spot in the bar.


There's a new fork that is supposedly more up to date. (I don't use Windows anymore though, so I haven't used it myself.)

https://github.com/RamonUnch/AltDrag


Hip drop shadows and hip dark mode just don’t mix. Adding a dark mode after the fact is about as fun as writing tests for legacy code.


I would expect windows in dark mode to have, I guess, subtle lighting behind them (as people sometimes do with objects in physical space).


These UI standards aren't taught any longer, so unless you grew up with the 90s, when at least first-party developers were somewhat consistent about following them, you might not even know what you're supposed to do.

I came to programming with VB6, and that was an era where even the trash "Teach Yourself Programming in 24 Hours" books made a big deal about tab order and mnemonics and cues like ellipses on buttons that launched new windows.


I actually ran into the same problem with YouTube's interface on iOS, it made me so angry I actually recorded a video to show people how stupid this is:

https://youtu.be/wGKIz0bWVVU


You ask how you are supposed to find it.

You are not supposed to find it.

Then in the next iteration the designers can tell management that only 0.3% of users use settings, and then they can get rid of it altogether.

Just like Mozilla and the settings to remove the top tab bar after you have enabled Tree Style Tabs or Sideberry.

I'm only partially joking here.


Oh that's just infuriating!


Is the distinction between Android and iOS here maybe that iOS supports only a limited number of horizontal resolutions? At least on my Android phone, the rightmost element ends up at the screen edge with no padding, with a similar effect as the cut off icon.

(I.e. a design that's appropriate in a tightly controlled homogenous environment might not be any good in a mixed one.)


Ideally, the layout would adjust the spacing between items, so that one item is always partially cut off. I don't know whether Apple does that or not (or if it's their small number of possible iOS device resolutions, like you suggested). However, that seems like something that Google should consider as a way to improve the Android experience.
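Roughly, the sizing rule could look like this (a sketch with made-up names and numbers, not Apple's or Google's actual layout code):

    // n full items + half of the next one, plus a gap after each full item, fill the width.
    func itemWidth(screenWidth: Double, spacing: Double, fullyVisible n: Int) -> Double {
        (screenWidth - Double(n) * spacing) / (Double(n) + 0.5)
    }

    // e.g. a 393 pt wide screen, 8 pt gaps, 4 fully visible targets -> roughly 80 pt each,
    // so the fifth target always peeks in from the edge as a scroll hint.
    let width = itemWidth(screenWidth: 393, spacing: 8, fullyVisible: 4)

The point is that the "cut-off item" hint survives any screen width, instead of depending on a lucky resolution.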


Seems like old-fashioned computer-literate people are becoming the new computer illiterate. Perhaps kids learn these things socially from each other?

I recently discovered the iPhone Notes app has an undo capability. There's no indication of it on any of the screens; you have to shake the phone. Shake to undo? Is that common now? How do you learn this stuff?


Shake to bug report or "rage shake" is another.

Apple's gestures have always been in that category for me though, including touch screen and touch pad gestures, though people used to them seem to love them. Meanwhile I'm just there like "So, how many fingers do I need to see my wifi toggle?"


The "shake to undo" feature is actually very old, I remember discovering it around 2010. It works in all text fields system-wide, not just in the Notes app!


I remember back around the time of the iPhone 1 seeing a “this is the future!” commercial featuring a lady using a tablet. She was making lots of vague gestures that didn’t correlate to anything on the screen and the tablet “just knew” to do lots of diverse actions correctly. I thought it was ridiculous.

Well, here we are. What the commercial didn’t cover was that you had to guess and poke semi-randomly to discover all the magic gestures and were never sure if you were missing something important because it’s right there, but invisible.


Mac OS tripped me up with its new-ish "group things by month" view in the Recents folder. I didn't realize those buckets were side-scrollable. Scrolling is so easy to do that I'm not sure why it's even necessary to limit the number of items in view. But I've been a computer user for a LONG time and was seriously stumped by this behavior, because there was zero affordance to indicate there were more items, let alone horizontally, which is very unexpected behavior.


Windows 11's rollout made me want to barf. Lots of marketroidy "Simple, clean, beautiful!" and "We put the Start menu at the center because we put YOU at the center!" Yeah, those are nice inspirational sound bites, now how are you going to make Windows less of a pain in the ass to work with, day by day? Windows 9x put the Start menu down in the corner to effectively give it infinite width and height per Fitts's law. It also gave us buttons that look like buttons. The beveled edges did more than make things look pretty, they signalled availability for user interaction and roughly delineated the boundaries where such interaction could take place. It was a massive UX improvement over Windows 3.x. Is Windows 11 less fraught with friction than Windows 10 in real terms? All the indications say "no, but it is prettier!" No affordances, no signals to the user, just plain white panels that don't look like anything and now the Start button -- the Schelling point for telling Windows what to do -- is harder to hit with the mouse. Oh, and how do you put to use Windows 11's new tiling and virtual desktop features? Hover over maximize! So easy to figure out!


I'm looking for balance, here. Let me provide an example:

I have a "left-swipe-to-delete-from-list." That's the standard way that Apple has decreed that items should be deleted. They provide no affordance.

You can add an "Edit" button to the navbar, but that means you don't have room for other, more important (and frequently-used) items. In my case, this is a non-starter. I need the room for more important stuff.

So far, I have added the left-swipe, but no affordance. It relies on the fact that the platform standard is left-swipe to delete. I need to make sure that the screen shows a fairly standard list (not getting too fancy), so users that are trained on platform standard will know that they can left-swipe.

In some cases, affordances can actually interfere with usability.

I can develop my own affordance, but I am not sure what a suitable one would be.

Implied training is also a big part of usability. I think it was Tufte who talked about that. He shows some really strange UIs that don't make sense until you learn them, and then you don't ever want to go back.

In my experience, train maps in Tokyo are like this. They are a fearsome mess, when you first look at them, but, once you understand them, they are marvelous.
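For concreteness, the platform pattern I'm relying on is roughly this in SwiftUI. A minimal sketch with a placeholder list, not my actual app; the point is that swipe-to-delete and an explicit Edit button are two surfaces of the same delete action:

    import SwiftUI

    struct EpisodeList: View {
        @State private var episodes = ["Episode 1", "Episode 2", "Episode 3"]

        var body: some View {
            NavigationView {
                List {
                    ForEach(episodes, id: \.self) { Text($0) }
                        // Enables the platform-standard left swipe with a red Delete button.
                        .onDelete { episodes.remove(atOffsets: $0) }
                }
                // The explicit, discoverable path to the same action,
                // at the cost of a toolbar slot (the trade-off discussed above).
                .toolbar { EditButton() }
            }
        }
    }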


I get that. But how do you first learn about the standard way?

Admittedly, it has been several years since I used an iPhone - but I don't remember it ever telling me that slide-to-delete was a thing. I literally had to ask on Twitter to find out - https://twitter.com/search?q=delete%20podcast%20from%3Aedent...

I agree that there are some things which work best hidden - pull to refresh, for example - but it needs to be consistent and explained.


I think Apple’s argument for why slide to delete is not explained would be that it’s a shortcut — the explicit way to delete is to tap “edit” or “select” in a view, which exposes an explicit delete button on each row, or the ability to select rows and delete multiple. This is all explicit through buttons in the UI, and all of Apple’s apps tend to have this behavior (I can’t speak as to whatever UI you were trying to use 12 years ago). I would call that the “standard” way, whereas slide to delete is the fast way.


The old Macintosh computers came with a floppy that walked the user through using the computer. How to click. How to drag. Very basic stuff, but it was all new to the world at the time. Now, you are assumed to know all that.

The iPhone/touch paradigm never had such an easy on ramp for consumers. Or if it did, they dismantled it before I got on that ride.

Personally, I think this reflects pure hubris on the part of Apple, and it’s one reason that I have not owned a mac for 20 years.


True, UI action discoverability is harder these days, but somehow I learned to swipe to delete, though I don’t know how. Maybe I was told, or saw a video? My point is that the interactions are common enough that I think most people learn them from other people, whereas in the early days of personal computers guides were important because users had no previous knowledge, and no one else to turn to.


One method I’ve seen of introducing the behavior is to have the list element slightly “bounce” sideways on load, giving a peek of the red/green color underneath.


Hey, that's a great idea!

It's a bit of a pain to implement, because I'd need to make sure that it only did it for displayed rows, and only once, but that's pretty cool.
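Something along these lines, maybe — a very rough sketch limited to the visible rows and gated by a one-time flag (the UserDefaults key, offsets, and timings are all made up):

    import UIKit

    // Sketch: nudge each visible row sideways once, revealing a sliver of red,
    // then spring back. A UserDefaults flag keeps the hint to a single run.
    extension UITableViewController {
        func showSwipeHintIfNeeded() {
            let key = "didShowSwipeDeleteHint"   // hypothetical flag name
            guard !UserDefaults.standard.bool(forKey: key) else { return }
            UserDefaults.standard.set(true, forKey: key)

            for cell in tableView.visibleCells {
                cell.backgroundColor = .systemRed               // colour exposed by the nudge
                cell.contentView.backgroundColor = .systemBackground

                UIView.animate(withDuration: 0.3, delay: 0.4, options: .curveEaseOut, animations: {
                    cell.contentView.transform = CGAffineTransform(translationX: -40, y: 0)
                }, completion: { _ in
                    UIView.animate(withDuration: 0.3, animations: {
                        cell.contentView.transform = .identity
                    }, completion: { _ in
                        cell.backgroundColor = .systemBackground
                    })
                })
            }
        }
    }

Calling it from viewDidAppear would cover the first screenful of rows, which is probably enough.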

Thanks!


Another option is to go to Settings->General->Storage->Podcasts and delete from there.


If you are having trouble revealing all the functionality then your app is already too much for the platform.

Making your users learn a UI is stupid. You aren't building Autocad for the phone.


> Making your users learn a UI is stupid.

I disagree. It's pretty much how everything (not just software) works.

Driving is learning a UI. So is riding a bicycle, or a horse. Heck, using a toaster is learning a UI.

Every standard GUI has platform conventions. In fact, becoming familiar with these conventions is one of the most important tasks that we all do, when starting out.

It's also why so many of us get "fixed" on one or two platforms. I generally suggest to folks that are thinking of switching platforms, to consider just upgrading their device on their current platform, instead.

Many HN readers are probably quite used to bouncing around a dozen different UI systems, but that is quite rare. Most folks like to find their rut, and then furnish it.


The author links out to a writeup on Safari 15 which I found interesting: https://morrick.me/archives/9368

One of the main bizarre design choices was making the address bar shrink as you add more tabs. It made me wonder if these designs reflect the diminution of sites visited in a modern internet user's session. Seems like we are moving from a mode of research-and-explore to residing in one of a few home bases (reddit, twitter, etc) and everything else is reached via Google search -> first result. Since information delivery is now so heavily tailored to a person's filter bubble, there's not as much need to stray. I'd love to read more about something like this.


I disagree with that article on the subject of Safari tab groups. I had already been grouping tabs by subject using separate windows, and now that tab groups let me rotate out sets without them all sitting in memory I’m using grouping more than ever. It’s been very effective at keeping the number of tabs in any given group/window low.

I still do old style internet search-and-explore, and that is also enhanced because I can tuck away my “everyday” tabs and tabs related to other subjects and let the topic at hand dominate my browser, with as many tabs and windows as needed being opened with no worries about having to separate them out from the other stuff.


Oh for sure, and to clarify I'm not against the design changes! Tab groups are the logical next step in organizing thoughts-as-tabs and Firefox is sorely lacking in this capability in 2021.


Previous HN discussion on the topic:

https://news.ycombinator.com/item?id=27559832

That's true regarding 'residing' in home bases. I still remember how much personality each of the different special-interest forums had, but that niche is mostly filled by subreddits now.


You should try teaching someone without tech literacy when to single left click vs double left click vs right click in Office for Windows.


Affordances are one of those core UI concepts that everyone should know and implement to guide the user smoothly all the way through. Reading Design of Everyday Things and taking one HCI course pays off so much in the long run.


The obsession modern developers and marketers have with trying to convince people using a computer they're not using a computer is so pointless and perverse I'm tempted to call it a mental illness.


> But there's evidence that Apple are slowly undoing their great usability work in the name of elegance.

Tangent, but am I the only one who is really bothered by the update they made to the way you set a clock (eg for an alarm or appointment) a few years ago? It’s now only digits which I am supposed to scroll vertically, which is incredibly tedious; before, it was an actual analog clock I could drag around and it was awesome.

To this day I still don’t understand why they did this, it seemed to serve no purpose to get rid of the old UX.


Are you talking about the time input, like when you create a new alarm? The new one is a text field with little digit spinners you can drag, but primarily you just type in it.

Being able to put times in with a proper numeric keypad is a big usability improvement IMO.


TIL you can actually just use the numeric keypad to type it in. I must be getting old.


As much as I'm used to Android (though still adjusting to the changes in Android 12), I admit that every time a major change pops up, it takes me forever to get used to it. I remember when they removed the navigation buttons in favor of sliding your finger over the edge of the display. I can't describe how much I hated that. Sometimes I wonder what was wrong with ncurses for user interfaces and why on earth people bothered making and using anything else.


Need more people with industrial design backgrounds in UI teams. I keep seeing more and more people with only graphic design or illustration backgrounds doing these jobs.


https://tyler.io/perfectly-cropped/

https://news.ycombinator.com/item?id=21353920

Here it is, I knew I'd seen a nearly identical story before.


On my Android phone, and probably the vast majority of Android phones, there is a UI affordance that the share options are scrollable. The rightmost icon is only partially visible, which makes it clear that there is more content that lies offscreen. This is the same affordance that's used on iOS devices. The author's phone happens to have a screen width that displays an exact number of icons, so this affordance breaks for him.

This is clearly a tradeoff. Either you tightly control the set of valid screen sizes (like iOS does) and guarantee that the affordance works across all devices. Or you allow for flexibility and accept that on some percentage of devices, this affordance will not work as intended.
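(A third option, if the icon size is allowed to flex a little, is to derive the item width from the container so the peek survives any screen width. The arithmetic is platform-agnostic; here is a rough sketch in Swift with made-up names and numbers:)

    import UIKit

    // Sketch: size items so that, whatever the container width, half of the
    // next item always hangs off the trailing edge, hinting at horizontal scroll.
    func peekingItemWidth(containerWidth: CGFloat,
                          nominalItemWidth: CGFloat,
                          spacing: CGFloat) -> CGFloat {
        // How many whole items roughly fit at the nominal size?
        let wholeItems = max(1, (containerWidth / (nominalItemWidth + spacing)).rounded(.down))
        // Re-divide the width across those items plus half of the next one.
        return (containerWidth - spacing * wholeItems) / (wholeItems + 0.5)
    }

    // Illustrative use with a horizontal flow layout:
    // layout.itemSize.width = peekingItemWidth(containerWidth: collectionView.bounds.width,
    //                                          nominalItemWidth: 72, spacing: 8)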


Are you sure you're looking at the native share sheet? It scrolls vertically, at least on Pixel on Android 11.

There's some intent: links on https://paul.kinlan.me/sharing-natively-on-android-from-the-... that you can use to trigger it. The first one there doesn't work for me, but the second one does.

     intent:#Intent;action=android.intent.action.SEND;type=text/plain;end


Or you pick a better way to indicate more options, and not have to worry about a trade off at all.

Side note, I personally hate sideways scrolling stuff 99% of the time. I can't imagine I'm alone.


Not on my Pixel 4a. And not in the screenshot on the Android Developer site.

https://developer.android.com/training/sharing/send https://developer.android.com/images/training/sharing/shares...


Wow, I did not realise that was a horizontal slider either. (I've used Android since the G1, so certainly since whenever that was introduced. I've seen it a lot, just never scrolled it, because I didn't know it did.)

Another book I'd recommend to anyone interested in this sort of thing is Spence's Information Visualisation: Design for Interaction. I also had the pleasure of taking the author's course at university; apart from anything else, it was an excellent 'something a bit different to think about' that I often recall when I'm frustrated by things (as in the submission), and perhaps should recall more often when working on frontendy things.


There's a great deal of fashion over function - especially in consumer software.

I used to program nuclear power plants, destroyers, air traffic control, etc. UI design by engineers not artists.

Here's an interesting question. What will people in 50 years think of today's UI fashions?


It’s very funny to me that people think “the past” was a UX wonderland.


I mean… in the '90s, software didn't harass me to update every 15 minutes or pop up extraneous notifications irrelevant to my interests. Interactions were, in general, far more discoverable. You could come to rely on simple, consistent design cues to understand how something might work. A whole new piece of software was instantly usable because it adhered to the expected interface guidelines[0] defined for the OS.

In recent years, a lot of this has gone out the window and each dev house seems to invent its own paradigms and complete UI toolkits, so while there are similarities, they are few. The cohesive, simple nature of software UI is generally a thing of the past, one that most of us who experienced it are quite nostalgic for.

Generally speaking, most application UX problems were "solved" ~30 years ago, and every time someone tries to reinvent the wheel, they miss subtle, important aspects of those earlier solutions. Like the importance of consistency across time, and "principle of least surprise", and things like that. I can really go on and on ad nauseam about this because it's a huge pet peeve of mine. I cannot stand having to relearn major aspects of how to use the same software, the same OS, every 6mo because some company decided it's time to completely overhaul their design/UI, for Reasons™.

[0] http://interface.free.fr/Archives/Apple_HIGuidelines.pdf


It's like how, no matter when you were born, the best music was always what came out in your teenage years (when your taste in music was being formed). And how music sucks these days because kids don't know how to make good music (and not because musical tastes have changed around you and you don't like that).

Same thing with UI I guess.


I don't think the comparison works.

Music generally is what people want to listen to for its own sake.

UI is disconnected from what people want; everybody wants to do something, and UI is how you get there. As is in-app advertising, electricity, and so on.

And not only is UI tied to other things that aren't available a la carte, but the whole package is frequently chosen by someone (in the case of corporate software) who will never experience the UI and never talked to anyone who will.

So I think it's a fallacy to assert that UI is in general what the public prefers. The same fallacy as saying consumers want binding arbitration, or melamine in their milk, or surveillance advertising.


As soon as I read that he discovered the icon rows slid horizontally, I thought in my head "the design should have hinted at this by showing a bit of the next icon in the row", not realizing that's exactly what the iOS screenshot below would show. That was a nice bit of validation! Sometimes you don't need to do a whole lot to make a design more understandable and accessible.


Note that the default share UI in Android 11 (or is that LineageOS specific?) does indeed show more by swiping up. Firefox implements their own sharing UI and does side-swiping as well, but at least they went the Apple route of cutting some options off.


> ...went the Apple route of cutting some options of.

Is this implying there's more to the comment if I side-swipe?

Hrm. I think it's bugged.


Heh, fixed ;)


That's true for Google's Android too, not just Lineage.


UI became user hostile because companies became user hostile.

Software no longer serves the user; rather, it is the user who serves the company behind the software.


Recently there have been posts from designers about bringing back blurred/gradient edges on scrollable content. Hopefully the trend gains momentum (pun not intended).


They became less necessary over time as people got used to computers. Nearly everything on your screen right now is touchable or interactive (you can select the text in this comment), but trying to show visual affordances for all that would make the actual content harder to digest.


> Ideally, all doors would look like this

Ideally all doors would swing both ways.


Don't even get me started on "long press".


People without much life experience, knowledge, mastery, humility, or expertise don't understand what things are used for, and decide for everyone else to throw them away as "unnecessary."

Put another way, it may well be some sort of Dunning-Kruger arrogance that the self-esteem crowd foists on the rest of us without our permission.


"Doesn't familiarize themselves with the UI, complains about not being familiar with it." That's almost meme-ish, to be honest.

What's a discoverable UI? Does it count to have options in a context menu? That's very discoverable to someone like me, much less so for some of my less computer-savvy relatives. A toolbar with unlabeled icons is not that different in that regard. I think it really boils down to how familiar someone is with the UI already, and half of the solution is being willing to familiarize themselves with it.


I've been using Android since the pre-release versions.

Every Google app has a different share UI. If I hit share in YouTube, the panel scrolls vertically. Google Drive's share looks different, but also scrolls vertically.

Google Chrome's share panel looks identical to Drive, but behaves completely differently.

So, I'd say that I'm very familiar with Android's UI - but I don't think Google is.


There is no escaping Conway’s Law.

A disjointed company produces a disjointed product. Mind you, this might well be a deliberate choice made by people cleverer than me.


This is fixed in Android 12


I don't understand exactly what the writer is complaining about. The thesis of the page is never explicitly stated.

I see a complaint, "Hmmm. It didn't have the share destination that I wanted" yet the author never states what option they were looking for.

I would agree that, going by the screenshot, the default options are all over the place and not prioritized well. However, the "Copy link" option is clearly visible. Since it's a website, I'm fundamentally unclear about what share option the writer could want beyond the ability to copy the URL.


Author here. In this case, I wanted to share to Twitter. That would have resulted in the page title and URL being shared - so copying the URL wasn't suitable.

But the actual destination is irrelevant. How was I (or any user unfamiliar with the interface) supposed to intuit that the panel was horizontally scrollable?


The actual share destination isn’t the point, it’s how it’s not obvious how to get to more destinations than are initially visible.


It doesn't matter what option he was looking for (maybe Nearby Share, messaging, etc., as shown in a later image). The point being made is that it wasn't on the first screen, and the fact that there were more options (and how to reach them) was not hinted at by the interface in any way. On top of that, the expected way of interacting didn't work, and only by chance did he manage to figure it out.

From the article:

> There's no scrollbar, no handle, no "more" icon, nothing.

...

> I tried swiping it up - that's what I've learned most panels do in Android. But it did nothing. So I gave up.

...

> my thumb slipped transversely (...) The fucking thing was a horizontal slider!


The thesis is that the UI is not intuitive - that's exactly what the writer is complaining about.



