
Somewhat related series of PSAs:

Nitter is an open-source alternative front end for Twitter that requires zero JavaScript to run. It's really good, way nicer than Twitter's own UI, and y'all should use it. I see Twitter threads submitted to HN all the time, and I kind of wish more of the posters would link to Nitter's public-facing instance instead of Twitter. I think its interface is just objectively better than Twitter's for browsing.

Invidious is an open-source alternative front end for YouTube that is also really good and requires zero JavaScript, but unfortunately it suffers from a lot of performance problems, and videos will occasionally refuse to load -- YouTube is, I think, more hostile about throttling/blocking that kind of thing, and proxying videos is probably harder than proxying tweets. You can donate to Invidious, though, which might help them afford more powerful hardware.

Regarding Medium: if you disable JavaScript, those popups will all vanish, and for the most part all you'll be missing out on is comments and the occasional lazy-loaded image. It's amazing how much of the web (particularly news) works fine without JavaScript, and getting rid of it gets rid of a lot of these annoying popups and requests. Use uMatrix; it makes it really easy to turn JavaScript off by default and whitelist as necessary. And every person who disables JavaScript is one more tiny audience metric making it ever-so-slightly harder for news sites to ignore progressive enhancement.




  > It's amazing how much of the web (particularly news) works
  > fine without Javascript,
It's amazing that any content would require Javascript. Web applications I understand, but content (particularly news) should have no need for scripting.


They use it to auto play all their shitty videos. That auto play nonsense is ruining so many things for me. I don’t want any movement or sound on a page that I’m trying to read. It’s insane that content producers think it’s a good idea.


It’s insane that a certain demographic responds positively to overstimulation, affirming these choices.

I’m an avid cook and I fucking hate the autoplay video on Serious Eats.

I like to think the people reading that content are not as susceptible to annoying webpages but I think I am wrong.


Analytics don’t provide context. My wife hates autoplay videos and those “blog posts” that now appear before all the recipes. Unfortunately turning off the video and having to scroll around trying to find the recipe is increasing her “engagement”.


Recipes can't be copyrighted and are basically undifferentiated. I suspect the resulting strategy is to try to hook you with the commentary, so that you like or relate to the author, making future visits more likely.

The relevant metrics would be things like returning visitors, not necessarily time on site or clicks on "mute".


The problem isn't so much that videos autoplay.

The problem is that there's no way for people to set it to "off" as an option, and that any time this option is suggested or implemented, someone will see how they can override it, because "people don't really want to block the videos on our site".


I set up a whole separate style with Stylus to get rid of the autoplay videos on Serious Eats. Also fixed up their site in a couple other ways as well.


You can block autoplay on Firefox without having to use any extensions.


I have ADHD and discovering that setting was a godsend. It's definitely reduced the number of ublock origin rules I've had to create.


about:config

media.autoplay.enabled = false

(Newer Firefox versions removed this pref; the equivalent there is media.autoplay.default = 5 to block both audio and video.)


If it's an embedded video that sits there waiting for me to click it to play, it may actually get a chance of being viewed. If it starts playing and distracts me, then chances are I will just click the Back button.


It was surprising to me that disabling JS not only keeps most sites usable, but significantly improves the experience on most "modern" sites.


I've always had JS off by default but one day, many years ago, fell for one of the "you have JavaScript disabled, please enable it for a better experience" banners (on a site that did not need it at all) that you'll see everywhere you go without JS, and tried it on for a moment. The page that was perfectly usable without JS became filled with mounds of ads (with sound!) and all kinds of other distracting moving blinking shit. Selecting text caused more bullshit to appear. Nope. Not falling for that again... Since that time, I've become even more suspicious of sites that ask you to do something and promise a "better experience", and have developed better intuition for when a site actually requires JS (honest-to-God interactive web applications, good ones exist but not many) or just wants to use it to shove more crap down your throat.


I wonder how much of that is some marketing department spamming up the page with Google Tag Manager and not caring about the impact on the design (i.e., they block the ads themselves or never read the site on mobile).

Most of the worst web UX I’ve ever seen has been through browsing a news site in landscape mode on a mobile. You’re lucky if you can actually read anything at all with the sticky headers, popup banners, modals that don’t fit the screen and break scrolling, and floating videos.


Or a marketing department worked with an agency.

Not representing my company, etcetera, but some agency ours used turned on all the default settings, and rarely goes back to turn off trackers for campaigns they're no longer running. When someone in house was managing he had only turned on what was required, and cleaned up old campaigns.

I also worked with a user on Friday who had noticed that links in emails were no longer working. It turned out uBlock/Firefox was stopping at the iContact tracking URL. She wasn't sure who to report it to, and wasn't using iContact for her emails, so she sat on it until she ran into it while we were meeting.

Based on personal experience, marketing groups usually use whatever their IT department approves, and if they're managed, certain browser settings are being pushed to whitelist these based upon users saying things aren't working. Easier to just push a setting than explain best practices.


More likely they just never read the site in the first place.


The best is when I start reading an article, make it to the second page, and then it starts autoplaying. Now I have to scroll back to the top to shut it off.


NoScript solved this problem for me.


A long time ago we used to cache static content and then use ajax to bring in anything dynamic like user preferences, comments, etc.

Those were mostly superfluous additions to the experience on top of the content. It’s been a while, but I’d like to believe that if you didn’t have JS, our content still functioned.

Now with things like Vue and React I see simple web pages with a loader in between what should be static content. Seems crazy to me.
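The old pattern described above (content already in the server-rendered HTML, ajax only for extras) can be sketched in a few lines. The `/api/comments` endpoint, the `comments` element id, and the markup are all made up for illustration; HTML escaping is omitted for brevity:

```javascript
// Progressive enhancement: the article body is already in the HTML, so this
// script only adds the optional extras (comments). If it never runs -- JS
// disabled, blocked, or failed to load -- the content still works.

// Pure helper: turn a list of comments into markup. Kept separate from the
// DOM wiring so it is easy to test on its own.
function renderComments(comments) {
  return comments
    .map(c => `<li><b>${c.author}</b>: ${c.text}</li>`)
    .join("");
}

// Only wire up the DOM when one exists (i.e., in a browser, not in Node).
if (typeof document !== "undefined") {
  fetch("/api/comments")
    .then(res => res.json())
    .then(comments => {
      document.getElementById("comments").innerHTML = renderComments(comments);
    })
    .catch(() => { /* comments are optional; the article still reads fine */ });
}
```

The point of the split is that the fetch failing (or JS being off) degrades to "no comments", not to a blank page behind a loader.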


Infinite scrolling needs Javascript, and it's pretty useful for some sites that have feeds.


Oh, infinite scrolling pages with links in the footer, like Privacy Policy and About, which you're never able to click. I always wonder who thought that was a good idea.


Fuck no it isn't. Infinite scrolling has no valid use case.


Its only valid use case is to keep the sheep glued to their phones. In that aspect it's totally succeeded.


I use Reddit Enhancement Suite, and one of the features it adds is autoloading the next page when I get to the bottom of the current one. It's really nice.


I think it might be handy if you want a web game rather than a document.


Please give me paged results instead of infinite scroll.


I agree. Paged results can be used instead. However, what might be good is a new kind of HTTP range request, and possibly some way to programmatically delimit each result in an HTML document.


That's already in HTTP/1.1, the Range header comes with units. The standard defined "bytes", but a Web developer can also use more granular units, for example "items". (See <http://www.iana.org/assignments/http-parameters#range-units>...) I've seen that in the wild a couple of times in the context of JSON based request/response pairs.
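To make the idea concrete, here is a sketch of a server-side parser for such a custom unit. The `items` unit name and the function are illustrative assumptions; only `bytes` is defined by the HTTP spec itself, with other units left to registration or private agreement:

```javascript
// Parse a Range header of the form "items=10-19" into start/end indices.
// "items" is a custom range unit: HTTP standardizes only "bytes", but the
// grammar allows other units a client and server agree on.
function parseItemsRange(header) {
  const match = /^items=(\d+)-(\d+)$/.exec(header);
  if (!match) return null; // not a range we understand
  const start = Number(match[1]);
  const end = Number(match[2]);
  if (end < start) return null; // invalid range
  return { start, end };
}

// A server accepting this could then answer with:
//   206 Partial Content
//   Content-Range: items 10-19/5000
```

A paginated feed could serve page N simply by answering `Range: items=N*20-(N*20+19)`, which would give paged results without any client-side scripting.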


I was aware that the Range header has units; I did not know of any units other than "bytes", though.


Speaking as someone who browses the web with JavaScript disabled:

To the sort of person who installs NoScript, breaking infinite scrolling is one of several advantages. Infinite scrolling is a great tell that there is an infinite amount of content with no marginal value.


You know "infinite scroll" isn't literally infinite right? It's just a next page button triggered at a scroll offset.
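That "next page button triggered at a scroll offset" is essentially all there is to it. A minimal sketch, with a made-up `/feed?page=` endpoint and `feed` element id:

```javascript
// Pure helper: are we within `margin` pixels of the bottom of the page?
// Split out from the DOM wiring so the logic is testable on its own.
function nearBottom(scrollTop, viewportHeight, contentHeight, margin) {
  return scrollTop + viewportHeight >= contentHeight - margin;
}

// Browser-only wiring: when the user scrolls near the bottom, fetch the
// next page and append it -- a "next page" click, fired automatically.
if (typeof window !== "undefined") {
  let page = 1;
  let loading = false;
  window.addEventListener("scroll", async () => {
    if (loading) return;
    if (nearBottom(window.scrollY, window.innerHeight,
                   document.body.scrollHeight, 200)) {
      loading = true;
      const html = await fetch(`/feed?page=${++page}`).then(r => r.text());
      document.getElementById("feed").insertAdjacentHTML("beforeend", html);
      loading = false;
    }
  });
}
```

Which is also why it breaks without JavaScript: remove the listener and nothing ever requests page 2.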


Of course it is not literally infinite, but when viewing a stream like LinkedIn for example, it may as well be infinite in practice.


It's nowhere near as good as a next page button.


I didn’t say it is (I think it’s conditional on the type of interface and data involved)


Visit any infinite scrolling site and click on the link associated with an item in the list. Hit the back button.

Q: Where are you in the list now?

A: It is impossible to know because no two sites behave the same way. You will not be anywhere near the item you clicked on in the majority of cases.


I've run into too many sites and apps with poorly implemented infinite scrolling from a usability and resource consumption viewpoint.


Infinite scrolling is an appalling travesty of a UI pattern.


don't need infinite scrolling if the end of the page has a huge button that says "more"


No - this is still infinite scrolling


An objective person reading this comment, who otherwise knows nothing about this subject, might ask, "If so much of the web works without Javascript, then why is Javascript enabled by default?"

Google just made changes that require users to enable Javascript in order to log on, e.g., to check their email via the web. The rationale for compelling users to make themselves easier for Google to track can be explained away as a "security measure".

Google also recently made changes to Chrome that make it impossible to globally disable Javascript for all websites in Guest mode.


Nitter might not require Javascript to run, but it's anything but fast.

Every page that's not cached takes several seconds to generate. It's not much better to show a white page for seconds than it is to show a skeleton page and then load the data using javascript.

Also they have taken the decision to proxify all images/avatars through their own servers. Why? And if you do that, at least set appropriate cache headers! Every time I visit a page inside Nitter every single image loads again even if I've seen it already.
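For what it's worth, "appropriate cache headers" for proxied avatars could be as simple as the following. This is a generic sketch, not Nitter's actual code; it relies on the assumption that Twitter image URLs typically change when the underlying image changes, which makes a long, immutable cache lifetime safe:

```javascript
// Sketch of the header a proxy could set on avatar/image responses so the
// browser never re-requests an image it has already seen.
function setAvatarCacheHeaders(res) {
  res.setHeader("Cache-Control", "public, max-age=31536000, immutable");
}

// Minimal fake "response" object to show the effect without a running server;
// a real Node http.ServerResponse exposes the same setHeader method.
const res = {
  headers: {},
  setHeader(name, value) { this.headers[name] = value; },
};
setAvatarCacheHeaders(res);
// res.headers["Cache-Control"] is now "public, max-age=31536000, immutable"
```

With that header in place, repeat visits would load every previously seen image from the browser cache instead of hitting the proxy again.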


> at least set appropriate cache headers!

This seems like some pretty good actionable advice. It might be a good idea to make this an issue on their github page.

https://github.com/zedeus/nitter


> Also they have taken the decision to proxify all images/avatars through their own servers. Why?

Privacy. I trust the devs of the Nitter instance I use more than I trust Twitter, and I like that Twitter has (at least somewhat) more difficulty knowing what IP is requesting what.

Better caching would be nice, but in general, I've found Nitter's speed to be fine. It's not amazingly snappy, and because it's proxying from Twitter it's probably never going to be faster than Twitter. But I've never been tempted to load Twitter to get around speed issues. Invidious on the other hand...

My guess is speed might also be affected by your region? I would be very surprised if Nitter's instance is trying to set up global CDNs or anything. It's probably just a couple Linode servers somewhere.


Invidious isn't proxying the video, just embedding it. If you block third-party requests from invidio.us to googlevideo.com, you won't see any clips.


I'd like to add that we can use uBlock Origin as well for JS-off-by-default.



