Hacker News
hiccupFX.js (telnet.asia)
452 points by rvieira on Jan 15, 2022 | 78 comments



Love it, feels like browsing the web about 20 years ago. (Or a few bad websites now…)

So this is an example of Cumulative Layout Shift (CLS)[0]. It is a metric (coined and named by Google) which indicates how much “movement” there is during the initial render of a page, caused by reflowing the layout. Ideally you have very little or none of it for better UX. I believe it can have an impact on organic search ranking and potentially ad rank.
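For concreteness, each layout shift is scored as impact fraction times distance fraction, per the definition on web.dev/cls. Here is a hedged sketch of that arithmetic for the simple case of a full-width block shifting vertically; the viewport and rectangle values are made up for illustration:

```javascript
// Sketch of a single layout-shift score, per web.dev/cls:
// score = impact fraction * distance fraction.
// Only a vertical shift of a full-width block is handled here.
function layoutShiftScore(viewport, before, after) {
  // Impact fraction: area of the union of the element's old and new
  // positions, as a fraction of the viewport area.
  const top = Math.min(before.top, after.top);
  const bottom = Math.max(before.top + before.height, after.top + after.height);
  const impactFraction = Math.min(
    ((bottom - top) * before.width) / (viewport.width * viewport.height), 1);
  // Distance fraction: how far it moved, relative to the viewport's
  // largest dimension.
  const distanceFraction =
    Math.abs(after.top - before.top) / Math.max(viewport.width, viewport.height);
  return impactFraction * distanceFraction;
}

// A 400x200 block at top=100 pushed down to top=300 in a 400x800 viewport:
console.log(layoutShiftScore(
  { width: 400, height: 800 },
  { top: 100, width: 400, height: 200 },
  { top: 300, width: 400, height: 200 }
)); // 0.125
```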

The most common cause is images included without width and height attributes, therefore the page has to reflow after the image has been downloaded.
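This works because modern browsers use the width/height attributes to derive an aspect ratio and reserve vertical space before the image file arrives. A small sketch of that arithmetic (the image dimensions are example values):

```javascript
// The reserved height for an image rendered responsively at some
// display width is displayWidth * (attrHeight / attrWidth), where
// attrWidth/attrHeight come from the width/height attributes.
function reservedHeight(attrWidth, attrHeight, displayWidth) {
  return displayWidth * (attrHeight / attrWidth);
}

// <img src="hero.jpg" width="1600" height="900"> rendered 800px wide
// gets 450px of height reserved, so no reflow when it loads.
console.log(reservedHeight(1600, 900, 800)); // 450
```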

Other things that can cause it are multiple CSS files overriding each other and loading slowly, or JavaScript affecting the layout after it has loaded.

You can use Google Lighthouse [1] to measure this (and other things) on a site.

0: https://web.dev/cls/

1: https://web.dev/measure/


The irony being that Google itself doesn't even get good scores on Lighthouse. Check out the report for a simple search for "tomatoes":

https://i.imgur.com/qcaMTBh.png

...although I guess they got a green light on CLS, lol.

URL for transparency: https://www.google.com/search?q=tomatoes&sxsrf=AOaemvKVS3_Ep...


Google search results have these annoying info boxes that automatically slide open, then they score themselves green


> Love it, feels like browsing the web about 20 years ago. (Or a few bad websites now…)

Facebook: scroll down your timeline, scroll back up and everything has changed ...

Google: see a nice ad, accidentally click on a different link, click "back", ad is gone ...


Honestly, this stuff bothers me so much. I’m now subconsciously nervous about clicking the wrong thing and missing the thing I wanted.


Perhaps we should be running a screenrecorder all the time, so at least we can get some of the information back.

Or run our software in a "multiverse" VM, where every click/scroll operation makes a snapshot of the entire OS, and allows us to roll back state.


feels like browsing the web about 20 years ago. (Or a few bad websites now…)

A few? You and I must use very different web sites.

Both of the too-big-to-fail bank web sites I went to today have this problem.

My health insurance web site does this.

I just checked, and unless you're on a really fast connection and a newish computer, Starbucks, Instructables, Pinion Coffee, my local honey farm, Target, PetSmart, Chewy, Amazon, Twitter, and PetCo all do this.


Check out https://nypost.com/ - it shifts and redraws constantly, regularly freezes completely and needs the page killed, consumes high CPU while completely idle, makes an iPhone hot if left open on an NYPost page (I swear their ad network must be mining crypto or something), and routine usage crashes iOS browsers. No additional CSS needed!


Not only the NYPost, I have an old gen1 or gen2 iPad and it can't render many modern pages. The browser crashes when it tries to load the ads on many pages. Still perfect for reading HN though.


Doesn't do this for me, but then all the ad blocking stuff I have installed probably helps.


Definitely - it’s when their advertising is loading that all the problems manifest, and it’s the advertising that hammers the CPU once it is loaded. On a desktop, where I can filter everything out, it’s just another site; on iOS, where there is no filtering, it causes nothing but problems. Do I want to read about local news and probably lose all my tabs when the browser crashes, or avoid the site and keep my tabs another day? Their app has the same issue. I’m sure it’s just a thin wrapper around the iOS web widget (it does not even seem to cache or pre-load content); it freezes solid for minutes while loading advertising and pounds the CPU as long as it’s in the foreground.


I am using iOS, with Firefox and various ad blockers, and it behaves fine there. Time to upgrade your browser, my friend!


Also web fonts with fallbacks that have different sizes.


>a few


This definitely hits a nerve. Google does that nowadays and I cannot count how many times I have clicked by mistake on these suggested links that show up after all results have already loaded.


It's hilarious to me how much Google is hammering on content layout shift as a web vital for ranking websites when they're one of the most annoying offenders. I can't tell you how often I have accidentally clicked that "People also search for" box that sometimes loads in beneath the first result, when I am actually trying to click the second result.

It's infuriating, and makes me wonder if anyone at Google actually uses their own product.


There's an extension [1] and a ublock filter [2] for this

[1] https://chrome.google.com/webstore/detail/remove-people-also...

[2] google.*##[id^="eob_"]


I feel like this is a symptom of Google etc. not having enough real competition. If Google and Twitter were scared for their survival, they wouldn't tolerate slack like janky sites, or disrespect their users' free speech as much as Twitter does.


Twitter IS scared for its survival though, they just have truly incompetent product leads (+ all the damage from Dopesey's leadership).

Google is only playing deaf while revenue keeps coming.

I don't know which one is worse.


It happens when you click on a result, then hit the back button and want to click the next link. Then, just before you click, the suggested links appear.

Maybe they measured clicks and thought this must be a really useful feature.


It's not an accident at all. I'm sure some sites are moving content around to make it likely you'll click on an ad. Other forms of click fraud are rife:

This site has ads that cover the content

https://tonyortega.org/

and when you try to close them you will frequently click on the ad instead of the tiny close button. Back in the day the Scientologists would have reported him for click fraud but today they are a shadow of the terror they once were.


"Suggested searches" are not ads. I'm pretty sure Google does not financially benefit from a user mis-clicking on suggested searches.


This kind of jank should be unforgivable in a modern webapp. I am happy to see it being shamed appropriately.


Reminds me of the time someone put a string concatenation function behind a server as a dependency just to see how many unskilled programmers would connect to it

They even made a docker package


Have you got a name? I'm curious.


Maybe they were thinking of http://left-pad.io/. I found a left-pad Docker image, too, but it’s by someone else: https://hub.docker.com/r/wbinnssmith/leftpad (source: https://github.com/wbinnssmith/leftpad-cli).


My favorite is when the link you’re trying to open jumps away 1 nanosecond before you click on it and you wind up clicking something else entirely.


This actually happens on the mobile website during the amazon.com one click checkout when you try to change your payment method.

It's absolutely insane to me that they haven't fixed it yet.


I find the chrome android browser to be the worst for that.

Not only does the page load new things, but google resizes everything just as you click


Google does this by popping up a bunch of "related links" as soon as you're trying to click on the second link... I honestly thought all those well-paid engineers and UX designers who work there could make something better than that


I wonder if that’s the same team responsible for making embedded youtube videos start playing on mobile safari if your finger touches it while you’re just trying to scroll.

Aka, the Evil Department.


If video plays per page load is the metric that gets Google's techbros promoted, then you can be sure that's the reason behind it.


Yeah what’s up with that, it wasn’t like that mere weeks ago.


> I honestly thought all those well-paid engineers and UX designers who work there could make something better than that

Don't forget that you are not Google's customers. Their advertisers are their customers. You're just a product they sell. So the UX does exactly what it is supposed to do: get eyeballs on advertisements.

Even if you click back, at Google's scale inadvertent clicks are big money. Besides you seeing something advertised/promoted, they got some free tracking data to sell. So long as whatever they linked to managed to load some tracking pixels or even full tracking scripts before you hit the back button, that's more tracking data hoovered up. Even just the request for the resource is data for tracking purposes.


This is yet another reason why I’m glad to have switched away from using Google search. Not even for privacy/advertising/tracking considerations, just general quality of life things like being able to click the whole search result bounding box (not just the top link), never having to suffer with related links appearing on back navigation, no AMP pages, etc.


I am sure they can fix it, but the fix would bring revenue down, so some heads would get chopped. Who wants to put on their CV: "improved the Google search results user experience by increasing ad revenue by -3%"?


They get paid the big bucks to make it worse in the way that benefits the owner


This is so bad I refuse to believe it's inadvertent.


A lot of this happens due to, at best, a basic lack of understanding of how documents flow, and at worst pure negligence. Ads and lazy-loaded content are dropped into elements without min-heights, causing everything to suddenly shift down. The other thing some of these sites do is take raw lazy-loaded JSON and render a whole view from it (who would have thought an on-demand slideshow is jarring if you didn't account for how it's going to be plopped onto the page). They are also dynamically formatting dates, author names, and titles, plopping in client-side A/B testing, etc.
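A hedged sketch of the fix implied here: render lazy-loaded slots into containers that already reserve their eventual height, so dropping content in later shifts nothing. The slot names and sizes below are hypothetical, standing in for the known creative sizes an ad network would publish.

```javascript
// Hypothetical map of slot types to their known heights in pixels.
const AD_SLOT_SIZES = { leaderboard: 90, 'medium-rectangle': 250 };

// Build a placeholder that reserves the slot's height up front, so
// injecting the ad later causes no layout shift.
function placeholderHtml(slotId, slotType) {
  const h = AD_SLOT_SIZES[slotType];
  return `<div id="${slotId}" class="ad-slot" style="min-height:${h}px"></div>`;
}

console.log(placeholderHtml('ad-top', 'medium-rectangle'));
// <div id="ad-top" class="ad-slot" style="min-height:250px"></div>
```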

Some people really should be fired honestly.


i didn't come here to be attacked like this /s


Bonus: Can this be a pure CSS library?


I thought CSS was generated by JS these days? Apparently it was too difficult to remain its own thing.


Yes, most of it can. I remade this effect using CSS animations, as an exercise:

Demo: https://css-demo-client-side-render-jankiness.roryokane1.rep...

Source code: https://replit.com/@RoryOKane1/CSS-demo-client-side-render-j...

However, a limitation of not using JavaScript is that I can’t randomize the changed properties: the page changes in the same way after every reload.
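One hedged workaround for that limitation: keep the animations in pure CSS, but have a few lines of JS set randomized CSS custom properties at load time, which the keyframes can then read via var(). Names like --shift-0 are invented for this sketch.

```javascript
// Generate randomized CSS custom properties, one per animated element.
// Each value is a pixel offset in [-maxPx, maxPx].
function randomShiftVars(count, maxPx = 20) {
  const vars = {};
  for (let i = 0; i < count; i++) {
    const px = Math.floor(Math.random() * (2 * maxPx + 1)) - maxPx;
    vars[`--shift-${i}`] = `${px}px`;
  }
  return vars;
}

// In a browser you would apply them to the root element, e.g.:
// for (const [k, v] of Object.entries(randomShiftVars(5)))
//   document.documentElement.style.setProperty(k, v);
console.log(randomShiftVars(3));
```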


I think you could do it with keyframes, though the "randomness" would have to be hard-coded (the same every time)


Forgive me for saying this but I find all these complaints shallow, or a sign of youth.

Hear me out, though. The reasons are real. The modern web sucks bad but it's been like that for many, many years. Certainly longer than a decade.

From my point of view, as someone who writes js for a living, I started using NoScript a while ago and the web is generally very fast and nothing shifts position on my screen as things (ads) load. Yeah, I don't do social media and some sites are nothing but a blank page. It's a blessing.

I guess my point is: avoid doing things that annoy you. If you don't like js heavy sites - stop using them. Or just adopt NoScript. No, ublock isn't the exact same thing. The granularity is counter productive in this respect.


"Don't use some of the most popular sites on the internet if shifting around during loading is annoying to you." isn't very productive advice.


Productive? I don't have a short response to the implications here.

In any case, doing things because they are popular doesn't sit well with me. Fast food is popular, and yet I would advise staying away from most of it.


Productive as in helpful. As in something that is of any use to anybody.


Now we're just missing that feature from Google Search which randomizes the order of buttons on the page.


We've been using this in production for about six months, with various reactions from users. The results are mixed -- some really like it but others have sent us terrifying emails and we've lost two support staff in the great resignation over it. Your mileage may vary.


Was literally just dealing with this issue while trying to use Gilt. It happens all the time nowadays, with less and less server-side rendering.


Lowes.com has the worst jankiness of any site I've ever used.


The hero we need.


Someone must send this to the Google core web vitals (or whatever it's called) team. Their CLS guys/gals will have a fit :-)


OK this is amazing


sometimes I even click on the wrong thing


where's the code?


You can remove the `.min` from the URL and it's easily readable

https://cdn.jsdelivr.net/npm/hiccupfx@1.1.1/hiccupfx.js


Why the leading ;?


Some build techniques simply concatenate scripts instead of transpiling etc. And there is no telling how the previous script ended (or not). I seem to remember this is done to mitigate some issues that can arise from poorly written scripts that come before this one.


It's for avoiding syntax issues, when you import multiple self-invoking functions.

Here is a stackoverflow that explains it https://stackoverflow.com/a/7365214/4690715
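The hazard can be demonstrated directly: if the previous file in a naive concatenation forgot its trailing semicolon, the "(" that opens an IIFE continues the previous expression instead of starting a new statement. A small sketch using eval to simulate the two concatenation outcomes:

```javascript
// Simulated file contents; note prevFile has no trailing semicolon.
const prevFile = 'var a = [1, 2]';
const iife     = '(function () { return "ok" })()';

// Without the defensive ";", this parses as [1, 2](function...)() -
// calling an array as a function, which throws a TypeError.
let broken;
try { eval(prevFile + '\n' + iife); broken = 'no error'; }
catch (e) { broken = e.name; }

// With the leading ";", the statements stay separate and it runs fine.
let safe;
try { eval(prevFile + '\n;' + iife); safe = 'no error'; }
catch (e) { safe = e.name; }

console.log(broken, safe); // TypeError no error
```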



I do love thinking about all the tree shaking, minifiying and bundling that went into slimming down a website as my browser struggles to decode a few MB of json and makes 20 async calls before fully rendering the screen.


These 20 async calls don't actually need to reestablish the connection nowadays, so you don't actually get a significant performance penalty anymore for doing multiple requests.

Unless the requests are triggered after another request finishes, of course. That still significantly impacts loading times.


> These 20 async calls don't actually need to reestablish the connection nowadays, so you don't actually get a significant performance penalty anymore for doing multiple requests.

why is everything so terrible to use then :-(


Because in practice, the 20 async calls are never actually made in parallel. The site makes one call, waits for it to finish, downloads a few more scripts, makes another call, parses the results of that call using an accidentally-quadratic-time algorithm, sends a bunch of tracking events over WebSockets and waits for them to finish, makes another call, uses that call to trigger a full-page React/Redux re-render, schedules another call in an asynchronous render effect handler...
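The difference being described can be sketched with a fake fetch that just waits: twenty sequential awaits pay the latency twenty times, while Promise.all pays it roughly once. (fakeFetch stands in for real network calls; the numbers are illustrative.)

```javascript
// A stand-in for a network call: resolves after the given latency.
const fakeFetch = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

// Waterfall: each call starts only after the previous one finishes.
async function sequential(n, latencyMs) {
  const t0 = Date.now();
  for (let i = 0; i < n; i++) await fakeFetch(latencyMs);
  return Date.now() - t0; // roughly n * latencyMs
}

// Parallel: all calls start at once; total time is roughly one latency.
async function parallel(n, latencyMs) {
  const t0 = Date.now();
  await Promise.all(Array.from({ length: n }, () => fakeFetch(latencyMs)));
  return Date.now() - t0; // roughly latencyMs
}

(async () => {
  console.log(await sequential(20, 10)); // ~200ms
  console.log(await parallel(20, 10));   // much less
})();
```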


Ads, ads, and more ads. The Internet with an ad blocker vs. without is like a completely different universe.

Most sites load pretty quickly with ads blocked, even complicated over-engineered SPAs.


I used to run without an ad blocker to support the websites I visited, but then too many would rev up my CPU and freeze my browser just because of all the ads they loaded. The experience of clicking a link started to become unbearable, and I went back to blocking ads.

I know that circa 2018 the ad industry suffered a shock as major players started measuring more precisely what value ads brought, and CPC went down. I think things just got worse at that point, because websites crammed in more ads in response.


Reddit and Medium are awful for this. I don't know what they're doing, but every page has a small chance of infinitely looping to max out a CPU core forever, unless you block their ads. I'll feel my laptop heating up and know that some Reddit thread somewhere is to blame.


I've had to dial-back my ad blocker over the last six months or so.

More and more web sites stall, or refuse to load at all with ad blockers enabled. Even ad-free ones like government web sites, because they use the same tracking mechanisms.


That is just terrible design. A third party shouldn't be able to block first party functions like that.

Though you are right, it does happen. If Google's tracking code doesn't load you can't buy tickets on Northern Rail's¹ site for instance.

[1] Northern is one of the franchises running bits of the UK's "imaginatively" arranged railway system


Whenever I do this, I think of the old joke about the prison where all of the prisoners knew the same bunch of jokes by heart, so instead of telling a whole joke, they would just call out a number and everyone would crack up.

You should not have to look this one up: xkcd 624.


Are we... are we all inmates in Randall's prison?


But you didn't finish the joke:

Then one day a new prisoner arrives.

Later that day, when everybody is in their cells, some other guy yells: "194". And, as usual, everybody laughs.

Another one says "315". Even more laughter. Another: "873"; the place is a madhouse.

Emboldened and wishing to fit in, the new guy yells: "242".

Total silence. Not even a little chuckle. Embarrassed, the new guy turns toward his cellmate and asks him: "What did I do wrong? Isn't 242 a good joke?" His cellmate says: "It is indeed a good joke; it is hilarious. But you see, son, a joke is not just the story. The most important part is the way you tell it."


I wasn't trying to tell that joke--I figured that everyone here knew it already. Maybe there was someone who hadn't, though! So I'm appreciative that you helped them enjoy the joke. Xkcd 1053.


Not having to reestablish the connection is a nice thing, but 20 round trips is still 20 round trips. Lots of opportunity for delay there.


Yeah, this is something people love to gloss over. On good 5G the connection latency might be 15-20ms to some topologically "close" server. Realistically that's an absolute minimum, and you'll see 50-100ms latency to actual servers.

When work depends on those resources, it can't even start until all the data is returned. Far too often nothing can happen until a cascade of resources gets returned. Then there's the fun outlier of some slow AdTech script that finally returns, screws with the DOM, and causes a layout and repaint.

It's just today's "What Andy Giveth, Bill Taketh". Great job all around everyone.


LMAO



