
We have 'instinctive visual cues' for depth and for light coming from above, which is why button gradients are so immediately effective: our visual system recognizes them in milliseconds. We don't have "instinctive visual cues" for refraction and lensing; that's why we misjudge distances underwater, and why magnifying glasses make us dizzy. I just can't believe this is coming from Apple.





Plus, to be truly realistic, it would also need to take into account the ambient lighting surrounding the device displaying it.

As it stands, it’s really just another attempt at recreating glass, which never made sense in a UI.

It is beyond me how this got chosen as a way forward: taking visual design that makes sense in a VR/AR environment and using it to ruin their rectangular display UI.

It will make implementation far more complex than it already is, and worse, it will set off an avalanche of badly done imitations, creating a mess across touchpoints at every company that will take years to clean up - just as I thought UI design had finally reached an acceptable level of maturity.

Sad, really sad for a company like Apple to throw out precision, clarity and contrast for “effect”.

Sad.


It's not actually glass; the Apple engineers and designers are basically simulating the surface tension of liquid droplets. Unfortunately, the refraction at the edge of a droplet doesn't tell you whether the droplet bulges inward or outward (i.e. whether it's toggled on or off), hence the additional highlights and shadows to indicate the 3D structure. The liquid effect is a total gimmick. And they added insult to injury with color changes and movement, which is totally distracting when you're scrolling through a difficult paper.

I know most people couldn’t care less about this, but those gimmicky animations probably consume more computing power than the entire Apollo project, which strikes me as unnecessary and wasteful. Given the choice, I’d much rather have a clean, efficient interface.

I tend to like Material Design in comparison. It’s clean, efficient, and usable. I just hope Google won’t try to "improve" it with annoying gimmicks and end up making things worse, like Apple did here.


"Flat" design is equally offensive by not demarcating controls as controls, or their state in an intuitive way.

Just as we were finally seeing UI step away from that BS, Apple jumps all the way back into much-scorned, cheesily-excessive skeuomorphism... adding a raft of failed ideas from 20 years ago.


Since this is in contrast to "wildly not flat and full of visual gimmicks": the modern "flat" style has severe (and very stupid) issues, yea. But "flat" has been around for a very long time in touch UI with clear control boundaries - just draw a box around it, maybe give it a background color.

That's better than plain text that just happens to be a hidden control, but text with a background color might just be... text with a background color, for emphasis. Or it's text with a background color, to distinguish it from editable text. A background color does not tell the user that it's a control.

A box around it? Slightly better, but still doesn't convey state. Sure, you can fill it in when it's "on," but that's still guesswork on the part of the user if he arrives to find it filled in already.


I'm pretty sure my Amiga 1000 had more computing power than the entire Apollo project. I mostly used it for games.

Historically, design as a priority worsened UI for average and new users, and Apple has prioritized a feeling of elegance over ease of use.

Liquid Glass puts UI (feature cues) second in favor of UX (an interesting experience), harkening back to skeuomorphism but misprioritizing UI. In Jobs's time, I appreciated how skeuomorphism was used to reveal features and give new users simple cues.

Now there is this idea that a higher percentage of users are advanced, but since there are now MORE users (anyone with a screen) and continual change, I think a large percentage of less advanced users are still "harmed" by prioritizing UX over UI.


It's also ironic that so much effort has been spent on the "liquid" feel of the phone ... which is mostly lost when it's in a case.

I think they refuse to pick a shade of grey for their UI's background, so we're stuck with transparent elements.

You know the dominant apps used on phones have large full screen user-generated video and imagery, right?

These are UI elements designed to work great over scrolling content feeds, full screen product images, album artwork, and thirty second videos of people doing meme dances. There is no room for ‘a gray background’.


This doesn’t justify applying a suboptimal design to everything else.

UI on content is a special case, just like AR, and here it might be OK. But why add “glass” as a background on icons, or on panels of text that would be served much better by a single-colored transparent background without the noise glass brings to the table - if a background is needed at all?

The visual noise-to-signal ratio is being cranked up to 11 for novelty’s sake.


I think you’re watching a way different video about this than me.

In the design guidance they’re explicitly saying Liquid Glass is for selective elements in the navigation layer. When those elements pop up modals, the modals use a very subdued, opaque glass that loses the adaptive contrast but still physically embeds them in that same floating navigation layer.

They’re not saying everything needs to be made of glass. They’re explicitly saying don’t make everything from glass.


For highlighting text?

> These are UI elements designed to work great over scrolling content feeds, full screen product images, album artwork, and thirty second videos of people doing meme dances

Liquid glass also seems terrible for this type of application. TikTok's overlays are much less intrusive and distracting.


Now give the laptop back to your parents and go touch grass.

This has no place on the desktop.


What a patronizing and shallow response.

Liquid glass ends up being vital for windows in AR. The Vision Pro has this, and it really helps you see behind the windows you've placed. While it's a shit experience on a phone, I do think liquid glass is a useful choice in the AR world.

Back in my day (as far back as a month ago), we just called that effect “transparency” or “translucency”. Hell, there are types of AppKit popup windows that have the effect on by default, that have existed untouched since the early days of Mac OS X. Don’t give Apple more credit than they deserve here.

No question about that - see above.

What works for augmented UI doesn’t work in a desktop, mobile, or 10-foot experience.

Porting something to an environment where transparency doesn’t help, but instead has the opposite effect, is a terrible mistake.


That’s an interesting point, never thought about it.

These complicated lenses distorting light from all directions look fancy in a designer’s portfolio; having them almost everywhere… I’m not sure how that will work out.

In contrast, the original Material Design was quite intuitive; IIRC they based it on sheets of paper - much simpler, and much more common in our day-to-day lives.

I still have some hope it will work out great: if Apple takes the accessibility and visibility issues seriously, and developers use it in moderation, it can be great.


I see no way around all that optics physics sucking up computation and battery. Perhaps Apple will add liquid glass silicon to the mix to do that physics in hardware. Using glass to compute glass, LOL.

Liquid glass can't possibly be that much more expensive than vibrancy (if it even is). The refraction effects are effectively just a displacement map (probably calculated in real time, but still).
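To make "just a displacement map" concrete, here's a minimal toy sketch in Swift - purely illustrative, no claim that this is how Apple actually implements it, and the lens center, radius, and cubic falloff are all made up. Each output pixel just samples the backdrop at an offset position, with the offset largest near the lens rim:

    // Toy 1-D "refraction" via a displacement map, over a text backdrop.
    // All numbers here are invented for illustration.
    let backdrop: [Character] = Array("ABCDEFGHIJKLMNOPQRST")
    let lensCenter = 10.0
    let lensRadius = 6.0

    var refracted = backdrop
    for i in backdrop.indices {
        let d = (Double(i) - lensCenter) / lensRadius  // -1...1 inside the lens
        guard abs(d) <= 1.0 else { continue }          // outside the lens: untouched
        // Displacement is zero at the center and strongest near the rim,
        // which is what makes the edge read as a glass bulge.
        let offset = Int((d * d * d * 4.0).rounded())
        let source = min(max(i + offset, 0), backdrop.count - 1)
        refracted[i] = backdrop[source]
    }
    print(String(backdrop))   // ABCDEFGHIJKLMNOPQRST
    print(String(refracted))  // unchanged in the middle, "smeared" near the rim

The GPU version is presumably the same idea with a 2-D offset texture per pixel, which seems cheap next to the large-radius blur that vibrancy-style materials already do.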

My initial thought is that Apple is preparing to launch physically deforming screens that will create bumps similar to this liquid.

Or using cameras to render what’s behind the phone as a background. That would help explain the continued focus on thinness.

The continued focus on thinness betrays a lack of useful ideas... a hallmark of the Jony Ive school of enshittification.

I remember finding this super cool when it first came out: https://www.youtube.com/watch?v=JelhR2iPuw0

I'm not sure I'd want that on my daily devices, but I would like it on my car's head unit, where tactile feedback from actual buttons is preferable for keeping your eyes on the road. At least it would be better than nothing.

I wouldn't want it on my daily devices either, but mainly because I prefer my touchscreens to be perfectly flat and durable glass.

A physically tactile screen would change my opinion about Liquid Glass. And it would make screens more usable for the visually impaired.

Then it would make no sense to simulate them.

Unless you want the same look on your non-tactile and tactile surfaces.

But I think the theory is too far-fetched.


Paper sheets do not have controls on them. That's why "Material Design" sucks too, as does all the rest of the "flat" design fad.

Minimal visual cues, analogous (not photo-faithful) to real-world physical objects, made GUIs a revolutionary advance in computer use. Both flat design and this new Apple junk (which, let's face it, is a return to hated and nonsensical skeuomorphism) ignore the very concepts that made GUIs so usable.


The linked video gives the explicit human interface guideline of don’t use it everywhere.

> I just can't believe this is coming from apple.

Apple has prioritized style over usability for decades now. Remember the godawful hockey puck mouse and how stubbornly they clung to it? It shouldn't be a surprise when Apple picks a solution that looks cool but is worse to use; that's who they are.


I agree that their keyboards and mice are the worst. Even the cheapest no-name peripherals do not invoke the same kind of anxiety that Apple's stuff does. I don't have any issue with their UIs however. I think they are generally very good. Things are not perhaps as discoverable as they could be but the alternative would probably be worse as it would lead to more clutter.

> I agree that their keyboards and mice are the worst.

They are now. A couple of weeks ago I bought an Apple keyboard from the late 80s on eBay (an M0116). After a quick solder job wiring half an S-Video cable to a ProMicro, it now talks USB, works perfectly, and is one of the best-feeling keyboards you could hope to use. (One of the later iterations with saner cursor-key placement would be better still, though...)


Holding Apple to a high standard this long after the death of the industry’s one and only true UI/UX purist is folly.

It’s regular “you”s and “me”s there now.

US corporate structure absolutely kills the spirit in the kind of people who could make a difference. And when it doesn’t, it kills the ability of those people to be promoted to a position of influence.

I am not a huge fan of Steve Jobs, but he did understand UI and UX better than just about anyone, and he stuck to his guns.

“I can’t believe this is coming from Apple” is something I said when I saw iPhones with a camera bump. Camera bumps are a fucking abomination.


The accessibility angle is what concerns me. The demos of the Music app, for example, seemed much less clear. You’re gonna have to mess around with whatever settings they provide to turn it off if you have impaired visibility.

It gives off a weird 2.5D HUD effect that works well enough in first-person games (which is basically simulating AR), but is just harder to read and kind of unmoored from the main UX on a flat screen.


Their accessibility settings actually seem decent. You can turn off the animation, increase contrast, go nearly opaque… I still don’t think I’ll love this new paradigm, but it looks like I can mostly mitigate my concerns.

The end of the linked video highlights the accessibility settings.

Apple is the company that makes laptops without power LEDs so you can't even tell if they're on.

“Think different”

Joking aside, “design” is clearly supplanting ease of use.


I have a feeling it's a bit of the cart driving the horse. Look at all this GFX power we have - how could we harness it for UI instead of boring old compositing and alpha?

At the same time remember how much of a struggle it was in the 90s to show transparent layers? Good times
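For contrast, the "boring old compositing and alpha" is essentially the Porter-Duff "over" operator. A sketch in Swift (not any particular framework's API, just the math):

    // Porter-Duff "over": composite a translucent source pixel onto a
    // destination pixel. This is the 90s "transparent layers" math.
    struct Pixel { var r, g, b, a: Double }  // all components in 0...1

    func over(_ src: Pixel, _ dst: Pixel) -> Pixel {
        let outA = src.a + dst.a * (1 - src.a)
        guard outA > 0 else { return Pixel(r: 0, g: 0, b: 0, a: 0) }
        // Each color channel is the alpha-weighted mix of source over dest.
        func channel(_ s: Double, _ d: Double) -> Double {
            (s * src.a + d * dst.a * (1 - src.a)) / outA
        }
        return Pixel(r: channel(src.r, dst.r),
                     g: channel(src.g, dst.g),
                     b: channel(src.b, dst.b),
                     a: outA)
    }

    // 50%-opaque white over opaque red gives pink.
    let pink = over(Pixel(r: 1, g: 1, b: 1, a: 0.5),
                    Pixel(r: 1, g: 0, b: 0, a: 1))
    print(pink)  // Pixel(r: 1.0, g: 0.5, b: 0.5, a: 1.0)

Per pixel that's a handful of multiply-adds - trivial today, even though doing it in real time genuinely was a struggle in the 90s.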


How is this wrong?

Our visual system is optimized, rather extremely, for understanding 3D scenes under the simple perspective model that our eyes are based on: x' = (x * f) / z

Outside of that 99.999% experience norm that our brains are so used to lies disconnect and discomfort. If you've ever put on a new pair of glasses with a different prescription, you'll understand exactly what he's talking about: depth offset and dizziness.

The disconnect is why refraction and lensing are interesting to look at: what your eyes see of the world behind the thing doesn't match the model they're used to.
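To make that formula concrete, a tiny Swift sketch of the pinhole model (the focal length and distances are arbitrary illustrative values):

    // Pinhole perspective: x' = (x * f) / z.
    // For a fixed x, doubling the distance z halves the projected
    // position x' - the depth cue our visual system is tuned to.
    // Refraction re-maps what's behind the glass and breaks this rule.
    func project(x: Double, z: Double, focalLength f: Double) -> Double {
        (x * f) / z
    }

    for z in [1.0, 2.0, 4.0] {
        print("z = \(z): x' = \(project(x: 1.0, z: z, focalLength: 1.0))")
    }
    // z = 1.0: x' = 1.0
    // z = 2.0: x' = 0.5
    // z = 4.0: x' = 0.25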


I wonder if this is linked to the reason that so many people become nauseous with 3D glasses.

When we see 3D movement that doesn't correlate with what our inner ears sense, our body assumes something is wrong - that we have ingested a toxin - and produces a nausea/vomit response.

There is something visually jarring about this Liquid Glass UI, and it's possible it's related to movements not correlating with an internal frame of reference.


I get car sick quite easily, same with VR, but I actually like the design language of Liquid Glass over the first iteration of Material (I like the new updates to Material too). I think people should watch from minute 13 onward if they're short on time and want the gist of it.

I guess I'm a weird outlier and that's fine.


I can't use 3D glasses because my eyes don't converge properly. Maybe one day I'll have surgery to correct that.

The fact that it's surprising does not make it a visual cue. A cue to what? I am not aware of any psychophysics study showing that we have perception of droplets or lens transformations (in contrast to shadows, gradients, etc., which are well studied). There also doesn't seem to be an evolutionary reason for it, because the natural world does not have lenses and glass. And UIs are usually based on intuitive features.

Not saying this makes the UI good, but it should go without saying that the natural world has water, which acts as a lens.

Also, of course we have perception of droplets. What we don’t have is an intuitive understanding of how light interacts with droplets.

I suspect that Apple are trying to leverage this lack of intuition to make their UI interesting to look at in an evergreen way: new backgrounds mean new interesting interactions. I’m not confident that they’ve succeeded, or that that’s actually a good goal to have, though. I have it on my iPhone 13, and personally I find it annoying to parse; I feel relief when I go back to traditional apps untouched by the update, like Google Maps.


Droplets of water are not lenses without glass behind them, and we couldn't see substantial effects behind them before we had glass windows. There was little evolutionary reason to develop any perception of refraction in water droplets. In contrast, shadows are instant indicators of distance, and gradients instantly distinguish concave from convex surfaces under light coming from above.

(Water doesn't do lensing unless it's a droplet.)


I get that your point is that we don’t have a strong intuition for lenses, and that this is tied to a lack of evolutionary reason to have one. I agree, and I suspect that might be exactly why Apple are using the lens effects. We don’t need to go so far as to say the natural world is completely devoid of such phenomena: of course they’re there, but they were largely not relevant to survival throughout human history.

Is there any study saying that user interfaces should use visual effects for which our brains have hardware acceleration? It seems a reasonable premise, but is there data?

Taking advantage of innate perceptual cues is smart, and our interfaces have always taken advantage of them: https://en.wikipedia.org/wiki/Depth_perception

We shouldn't need a manual to interpret a UI.


I don’t entirely disagree, but that is still an intuition, not a proof that our interfaces should always work that way.

We used to ride animals with legs, which worked a lot like our legs do. Does that mean the wheel is wrong? We don’t have wheels, and they don’t occur in nature.

I don’t think Apple has invented the wheel, and I’m inclined to agree that leveraging our hardware acceleration makes sense. But I haven’t seen anything beyond blind assertion that of course it has to work that way.


I think things are more "differential" than that. Since many of us look at these interfaces more than any other visual stimulus, our perception will be optimized around them. The ideal system, in the short term, will involve familiarity more than anything.

I assume he's saying the "disconnect" is easy to see.

If our brain understands one thing, it's that glass is a wall between our body and what we see. You can't touch that, or you'll hurt yourself.

That's... an odd thing for your brain to believe about glass.

That - in case it wasn't clear, and I can see why it wouldn't be - refers to the thing you're looking at through glass.

The sample interfaces and use cases seem highly legible and match my instinctive visual understanding of transparent materials. They look attractive and well separated from their surroundings. Not sure where this objection is coming from - have you looked at the results?

> we don't have "instinctive visual cues" for refraction and lensing

Do you have anything to back this up? Seems a lot of your argument is hinging on this point. I’m skeptical that 1. this is true, and 2. Apple wouldn’t have considered it if it were true


Nailed it.

Apple's vaunted UI has always been crippled by some stupid decisions and practices. But exhuming the idiotic "transparent" UI fad that died 20 years ago must rank among the worst.

What Apple just rolled out is embarrassing and depressing. You know it's bad when a thread like this is full of well-written, incontrovertible takedowns and nearly devoid of apologist drivel.




I've never wanted Asahi to be more of a success. Their hardware is nice, but the desktop and window management sucks the big one.


