Nitter is an open-source alternative front end for Twitter that requires zero JavaScript to run. It's really good, way nicer than Twitter's own UI, and y'all should use it. I see Twitter threads linked on HN all the time, and I kind of wish more of the posters would link to Nitter's public-facing instance instead of Twitter. Its interface is just objectively better than Twitter's for browsing.
Invidious is an open-source alternative to YouTube that is also really good and requires zero JavaScript, but unfortunately it suffers from a lot of performance problems, and videos will occasionally refuse to load. YouTube is, I think, more hostile about throttling/blocking that kind of thing, and proxying videos is probably harder than proxying tweets. You can donate to Invidious, though, which might help them afford more powerful hardware.
Regarding Medium: if you disable JavaScript, those popups all vanish, and for the most part all you'll be missing is comments and the occasional lazy-loaded image. It's amazing how much of the web (particularly news) works fine without JavaScript, and turning it off eliminates a lot of these annoying popups and requests. Use uMatrix; it makes it really easy to disable JavaScript by default and whitelist sites as necessary. And every person who disables JavaScript is one more tiny audience metric making it ever-so-slightly harder for news sites to ignore progressive enhancement.
> It's amazing how much of the web (particularly news) works
> fine without JavaScript,
It's amazing that any content would require JavaScript. Web applications I understand, but content (particularly news) should have no need for scripting.
They use it to autoplay all their shitty videos. That autoplay nonsense is ruining so many things for me. I don't want any movement or sound on a page I'm trying to read. It's insane that content producers think it's a good idea.
Analytics don’t provide context. My wife hates autoplay videos and those “blog posts” that now appear before all the recipes. Unfortunately turning off the video and having to scroll around trying to find the recipe is increasing her “engagement”.
Recipes can't be copyrighted and are basically undifferentiated. I suspect the resulting strategy is to try to hook you with the commentary, so that you like or relate to the author, making future visits more likely.
The relevant metrics would be things like returning visitors, not necessarily time on site or clicks on "mute".
The problem is that there's no way for people to set it to "off" as an option, and any time such an option is suggested or implemented, someone will figure out how to override it, because "people don't really want to block the videos on our site".
I set up a whole separate style with Stylus to get rid of the autoplay videos on Serious Eats, and fixed up their site in a couple of other ways while I was at it.
If it's an embedded video that sits there waiting for me to click it to play, it may actually get a chance of being viewed. If it starts playing and distracts me, then chances are I will just click the Back button.
I've always had JS off by default, but one day, many years ago, I fell for one of those "you have JavaScript disabled, please enable it for a better experience" banners (on a site that did not need it at all) that you see everywhere without JS, and turned it on for a moment. The page that had been perfectly usable without JS filled up with mounds of ads (with sound!) and all kinds of other distracting, moving, blinking shit. Selecting text caused even more bullshit to appear. Nope. Not falling for that again. Since then, I've become even more suspicious of sites that ask you to do something and promise a "better experience", and have developed better intuition for when a site actually requires JS (honest-to-God interactive web applications; good ones exist, but not many) versus when it just wants to use it to shove more crap down your throat.
I wonder how much of that is some marketing department spamming up the page with Google Tag Manager and not caring about the impact on the design (i.e., they block the ads themselves, or never read the site on mobile).
Most of the worst web UX I’ve ever seen has been through browsing a news site in landscape mode on a mobile. You’re lucky if you can actually read anything at all with the sticky headers, popup banners, modals that don’t fit the screen and break scrolling, and floating videos.
Not representing my company, etcetera, but an agency ours used turned on all the default settings and rarely goes back to turn off trackers for campaigns they're no longer running. When someone in-house was managing it, he had only turned on what was required and cleaned up old campaigns.
I also worked with a user on Friday who had noticed that links in emails were no longer working. It turned out uBlock/Firefox was stopping at the iContact tracking URL. She wasn't sure who to report it to, and wasn't using iContact for her own emails, so she sat on it until she ran into it while we were meeting.
Based on personal experience, marketing groups usually use whatever their IT department approves, and if their machines are centrally managed, browser settings get pushed out to whitelist these trackers whenever users report that things aren't working. It's easier to just push a setting than to explain best practices.
The best is when I start reading an article, make it to the second page, and then it starts autoplaying. Now I have to scroll back to the top to shut it off.
A long time ago, we used to cache static content and then use Ajax to bring in anything dynamic, like user preferences, comments, etc.
Those were mostly superfluous additions layered on top of the content. It's been a while, but I'd like to believe our content still worked if you didn't have JS.
Now, with things like Vue and React, I see simple web pages that show a loader in place of what should be static content. Seems crazy to me.
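A minimal sketch of that old pattern, to make the contrast concrete. The `/comments` endpoint, the `comments` element id, and the comment shape are all hypothetical, not from any particular site; the point is that the article HTML arrives static and readable, and the script only layers comments on top if it happens to run:

```javascript
// Progressive-enhancement sketch. Everything named here is made up
// for illustration: the endpoint, the element id, the comment fields.
function renderComments(comments) {
  // Pure function, so the enhancement is easy to test without a DOM.
  return comments.map(c => `<li>${c.author}: ${c.text}</li>`).join("");
}

function enhance(doc, fetchFn) {
  const slot = doc.getElementById("comments");
  if (!slot) return; // no slot in the markup, nothing to do
  fetchFn("/comments")
    .then(res => res.json())
    .then(comments => { slot.innerHTML = renderComments(comments); })
    .catch(() => { /* on failure, the static page is still fully readable */ });
}
```

If the script never loads, readers still get the whole article; the loader-first approach inverts that, making the static part depend on the script.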
Oh, and infinite-scrolling pages with links in the footer, like the privacy policy and About page, which you can never actually click. I always wonder who thought that was a good idea.
I use Reddit Enhancement Suite, and one of the features it adds is autoloading the next page when I get to the bottom of the current one. It's really nice.
I agree. Paged results can be used instead. What might be good, though, is a new kind of HTTP range request, plus some way to programmatically delimit each result in an HTML document.
That's already in HTTP/1.1: the Range header comes with units. The standard defines "bytes", but a web developer can also use more granular units, for example "items". (See <http://www.iana.org/assignments/http-parameters#range-units>...) I've seen that in the wild a couple of times in the context of JSON-based request/response pairs.
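To make the idea concrete, here's a sketch of how a server might honor such a request. "items" as a range unit is the assumption from the comment above, and the parsing is deliberately minimal:

```javascript
// Sketch: paging via a custom Range unit. Parses a header value like
// "items=20-39" and slices a result set accordingly. Real code would
// also handle open-ended ranges ("items=20-") and send 206 / Content-Range.
function parseRange(header) {
  const m = /^(\w+)=(\d+)-(\d+)$/.exec(header || "");
  return m ? { unit: m[1], start: Number(m[2]), end: Number(m[3]) } : null;
}

function pageOf(results, rangeHeader) {
  const r = parseRange(rangeHeader);
  if (!r || r.unit !== "items") return results; // no usable range: send everything
  return results.slice(r.start, r.end + 1);     // HTTP ranges are inclusive
}
```

A client asking for `Range: items=20-39` would get results 20 through 39 and could render a normal paged view, no infinite scroll required.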
Speaking as someone who browses the web with JavaScript disabled:
To the sort of person who installs NoScript, breaking infinite scrolling is one of several advantages. Infinite scrolling is a great tell that there's an infinite amount of content with no marginal value.
An objective person reading this comment, who otherwise knows nothing about the subject, might ask: "If so much of the web works without JavaScript, then why is JavaScript enabled by default?"
Google just made changes that require users to enable JavaScript in order to log in, e.g., to check their email via the web. The rationale for compelling users to make themselves easier for Google to track can be explained away as a "security measure".
Google also recently made changes to Chrome that make it impossible to globally disable JavaScript for all websites in Guest mode.
Nitter might not require JavaScript to run, but it's anything but fast.
Every page that's not cached takes several seconds to generate. Showing a white page for seconds isn't much better than showing a skeleton page and then loading the data with JavaScript.
Also, they've taken the decision to proxy all images/avatars through their own servers. Why? And if you do that, at least set appropriate cache headers! Every time I visit a page on Nitter, every single image loads again, even if I've already seen it.
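For what it's worth, "appropriate cache headers" could be as simple as this. I haven't read Nitter's code, so this is just a sketch of what the image proxy could send, on the assumption that a tweet image or avatar at a given URL effectively never changes:

```javascript
// Hypothetical helper for an image proxy's response headers. Since a
// given upstream media URL doesn't change, a long-lived immutable
// cache policy is safe for the proxied copy too.
function cacheHeadersFor(contentType) {
  return {
    "Content-Type": contentType,
    // Let browsers keep it for a year and skip revalidation entirely.
    "Cache-Control": "public, max-age=31536000, immutable",
  };
}
```

With that in place, revisiting a Nitter page would pull every already-seen avatar straight from the browser cache.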
> Also, they've taken the decision to proxy all images/avatars through their own servers. Why?
Privacy. I trust the devs of the Nitter instance I use more than I trust Twitter, and I like that Twitter has (at least somewhat) more difficulty knowing what IP is requesting what.
Better caching would be nice, but in general, I've found Nitter's speed to be fine. It's not amazingly snappy, and because it's proxying from Twitter it's probably never going to be faster than Twitter. But I've never been tempted to load Twitter to get around speed issues. Invidious on the other hand...
My guess is speed might also be affected by your region? I'd be very surprised if Nitter instances are trying to set up global CDNs or anything; it's probably just a couple of Linode servers somewhere.