Nitter is an open-source alternative front end for Twitter that requires zero JavaScript to run. It's really good, way nicer than Twitter's own UI, and y'all should use it. I see Twitter threads posted on HN all the time, and I kind of wish more of the posters would link to Nitter's public-facing instance instead of Twitter. I think its interface is just objectively better than Twitter's for browsing.
Invidious is an open-source alternative to YouTube that is also really good and requires zero JavaScript, but unfortunately it suffers from a lot of performance problems, and videos will occasionally refuse to load. YouTube is, I think, more hostile about throttling/blocking that kind of thing, and proxying videos is probably harder than proxying tweets. You can donate to Invidious, though, which might help them afford more powerful hardware.
Regarding Medium: if you disable JavaScript, those popups all vanish, and for the most part all you'll be missing out on is comments and the occasional lazy-loaded image. It's amazing how much of the web (particularly news) works fine without JavaScript, and getting rid of it gets rid of a lot of these annoying popups and requests. Use uMatrix; it makes it really easy to turn JavaScript off by default and whitelist as necessary. And every person who disables JavaScript is one more tiny audience metric making it ever so slightly harder for news sites to ignore progressive enhancement.
> It's amazing how much of the web (particularly news) works
> fine without Javascript,
It's amazing that any content would require Javascript. Web applications I understand, but content (particularly news) should have no need for scripting.
They use it to autoplay all their shitty videos. That autoplay nonsense is ruining so many things for me. I don’t want any movement or sound on a page that I’m trying to read. It’s insane that content producers think it’s a good idea.
Analytics don’t provide context. My wife hates autoplay videos and those “blog posts” that now appear before all the recipes. Unfortunately turning off the video and having to scroll around trying to find the recipe is increasing her “engagement”.
Recipes can't be copyrighted and are basically undifferentiated. I suspect the resulting strategy is to try to hook you with the commentary, so that you like or relate to the author, making future visits more likely.
The relevant metrics would be things like returning visitors, not necessarily time on site or clicks on "mute".
The problem is that there's no way for people to set it to "off" as an option, and that any time this option is suggested or implemented, someone will see how they can override it, because "people don't really want to block the videos on our site".
I set up a whole separate style with Stylus to get rid of the autoplay videos on Serious Eats. Also fixed up their site in a couple other ways as well.
If it's an embedded video that sits there waiting for me to click it to play, it may actually get a chance of being viewed. If it starts playing and distracts me, then chances are I will just click the Back button.
I've always had JS off by default but one day, many years ago, fell for one of the "you have JavaScript disabled, please enable it for a better experience" banners (on a site that did not need it at all) that you'll see everywhere you go without JS, and tried it on for a moment. The page that was perfectly usable without JS became filled with mounds of ads (with sound!) and all kinds of other distracting moving blinking shit. Selecting text caused more bullshit to appear. Nope. Not falling for that again... Since that time, I've become even more suspicious of sites that ask you to do something and promise a "better experience", and have developed better intuition for when a site actually requires JS (honest-to-God interactive web applications, good ones exist but not many) or just wants to use it to shove more crap down your throat.
I wonder how much of that is some marketing department spamming up the page with Google Tag Manager and not caring about the impact on the design (i.e., they block the ads themselves or never read the site on mobile).
Most of the worst web UX I’ve ever seen has been through browsing a news site in landscape mode on a mobile. You’re lucky if you can actually read anything at all with the sticky headers, popup banners, modals that don’t fit the screen and break scrolling, and floating videos.
Not representing my company, etcetera, but an agency we used turned on all the default settings and rarely goes back to turn off trackers for campaigns they're no longer running. When someone in-house was managing it, he turned on only what was required and cleaned up old campaigns.
I also worked with a user on Friday who had noticed that links in emails were no longer working. It turned out uBlock/Firefox was stopping at the iContact tracking URL. She wasn't sure who to report it to, and wasn't using iContact for her own emails, so she sat on it until she ran into it while we were meeting.
Based on personal experience, marketing groups usually use whatever their IT department approves, and if they're managed, certain browser settings are being pushed to whitelist these based upon users saying things aren't working. Easier to just push a setting than explain best practices.
The best is when I start reading an article, make it to the second page, and then it starts autoplaying. Now I have to scroll back to the top to shut it off.
A long time ago we used to cache static content and then use ajax to bring in anything dynamic like user preferences, comments, etc.
Those were mostly superfluous additions to the experience on top of the content. It’s been a while, but I’d like to believe our content still functioned if you didn’t have JS.
Now with things like Vue and React I see simple web pages with a loader in between what should be static content. Seems crazy to me.
Oh, and infinite-scrolling pages with links in the footer, like the privacy policy and About, which you're never able to click. I always wonder who thought that was a good idea.
I use Reddit Enhancement Suite, and one of the features it adds is autoloading the next page when I get to the bottom of the current one. It's really nice.
I agree. Paged results can be used instead. However, what might be good is a new kind of HTTP range request, and possibly some way to programmatically delimit each result in an HTML document.
That's already in HTTP/1.1, the Range header comes with units. The standard defined "bytes", but a Web developer can also use more granular units, for example "items". (See <http://www.iana.org/assignments/http-parameters#range-units>...) I've seen that in the wild a couple of times in the context of JSON based request/response pairs.
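As a rough illustration of such a custom range unit, here is a sketch of how a server might parse and answer an "items" range. The function names and response format are illustrative, not taken from any real framework; only the header syntax follows the HTTP spec.

```javascript
// Parse a Range header that uses a hypothetical "items" unit
// (RFC 7233 allows units other than "bytes"), e.g. "items=0-9".
function parseItemRange(header) {
  const match = /^items=(\d+)-(\d+)$/.exec(header);
  if (!match) return null; // unknown unit or malformed range
  const first = Number(match[1]);
  const last = Number(match[2]);
  if (first > last) return null;
  return { first, last };
}

// A server could then slice its result set and answer with
// "206 Partial Content" plus a Content-Range like "items 0-9/137".
function sliceItems(allItems, range) {
  const page = allItems.slice(range.first, range.last + 1);
  const contentRange = `items ${range.first}-${range.last}/${allItems.length}`;
  return { page, contentRange };
}
```

A client asking for `Range: items=10-19` would get the second page of ten results, with the total count in the Content-Range so it knows when to stop.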
Speaking as someone who browses the web with JavaScript disabled:
To the sort of person who installs NoScript, breaking infinite scrolling is one of several advantages. Infinite scrolling is a great tell that there is an infinite amount of content with no marginal value.
An objective person reading this comment, who otherwise knows nothing about the subject, might ask: "If so much of the web works without JavaScript, then why is JavaScript enabled by default?"
Google just made changes that require users to enable JavaScript in order to log on, e.g., to check their email via the web. The rationale for compelling users to make themselves easier for Google to track can be explained away as a "security measure".
Google also recently made changes to Chrome that make it impossible to globally disable Javascript for all websites in Guest mode.
Nitter might not require JavaScript to run, but it's anything but fast.
Every page that's not cached takes several seconds to generate. It's not much better to show a white page for seconds than it is to show a skeleton page and then load the data using javascript.
Also they have taken the decision to proxify all images/avatars through their own servers. Why? And if you do that, at least set appropriate cache headers! Every time I visit a page inside Nitter every single image loads again even if I've seen it already.
> Also they have taken the decision to proxify all images/avatars through their own servers. Why?
Privacy. I trust the devs of the Nitter instance I use more than I trust Twitter, and I like that Twitter has (at least somewhat) more difficulty knowing what IP is requesting what.
Better caching would be nice, but in general, I've found Nitter's speed to be fine. It's not amazingly snappy, and because it's proxying from Twitter it's probably never going to be faster than Twitter. But I've never been tempted to load Twitter to get around speed issues. Invidious on the other hand...
My guess is speed might also be affected by your region? I would be very surprised if Nitter's instance is trying to set up global CDNs or anything. It's probably just a couple Linode servers somewhere.
The majority of content on Medium is low-quality, varying from personal blog post opinions stated as fact to yet another career-boosting tutorial on beginner React fundamentals. I advise most people to keep the content for their own blogs and let that site fall under the weight of hosting their crap content.
Twitter and Facebook are known cesspools that suck away your time and add very little value. I don’t know anyone sane who uses Twitter regularly. Facebook seems to be used most by those who understand its business model least.
Guard your time, and be careful what you feed it.
BTW this also applies to modern journalism sites like the New York Times and the Washington Post. Use archive.is and let them rot from years of neglectful journalism.
I use Twitter to follow developers and artists who post interesting work. It's the best resource I've found for surfacing that sort of niche content (if anyone knows a better one, please let me know).
I would put Medium, Pinterest, Wired, WSJ, Quora, etc. in the same basket of websites that hinder the free flow of information regardless of how useful their content is.
It's a fad; Medium used to be usable, and then everyone posted only there until people soured on it. HN is very Silicon Valley-based, so all the trends in that tech world show up here.
I can understand sites not wanting to give things away forever. I cannot understand what is served by interrupting new visitors almost immediately, before they can practically do anything with your site. All that does is remind me how little I really “need” your services.
So true. Or my current pet hate, noticing that I'm using the mouse to close the tab and throwing up a begging popup - a great way to convert me from feeling a site was useful and interesting to finding it annoying and toxic.
They do that because they know human nature (well, post-popup windows from the '90s) is to move your mouse down to close the popup and then close the tab, as if it makes any difference. That buys them another 2-4 seconds of time-spent-on-page analytics metrics. It took me a bit to untrain myself from that behavior. I'd quite like a browser extension that stops reporting when my mouse moves out of the tab.
Here's the thing: why should I need to keep changing my behavior to stay ahead of the advertisers? On some websites I interact primarily through the keyboard, but on others I use the mouse because I'm doing a lot of scrolling and highlighting.
Yeah, those exit-intent pop-ups can go suck it. I mean, what are people thinking? It’s stupid, really. But then again, that’s what people do: in the pursuit of money they dump chemicals in rivers and don’t clean up their garbage. Or they toss garbage in the recycling bin.
I’m not sure exit-intent pop-ups are the same as polluting finite natural resources, and the more I learn about the recycling industry, the more I believe that everything is just trash anyway (reduce and reuse are the only viable paths).
Also, as a quant marketer I can tell you: exit intent works. Most traffic bounces and never returns. Being able to capture email and nurture leads is a profitable way of maximizing the return of your traffic.
There’s a reason why every furniture store is going out of business and has someone twirling signs: we are evolved to pick out movement and change. An exit intent that inverts the color scheme is obnoxious but only because you have to see it.
The interesting question (for me) is how obnoxious do you go before you start to hurt returns.
This comment is (oddly) unlikely to be popular on HN (odd because of HN’s heritage as an offshoot of YC), but exit intent works. It’s one of the first things I put in place at any new company.
Notably absent from this comment: the slightest awareness that "your traffic" is composed of people who have preferences.
Yeah, no doubt, "exit intent works", if the only thing you care about is "the return of your traffic". This mindset is exactly how we got to the situation described in the OP.
You are making things worse for your users. You don't care, because you don't think of them as users, as people, as human beings, you think of them as "traffic" whose "return" you want to "maximize".
Content has a cost; I don’t apologize for maximizing the return of that investment.
I’m personally not offended by exit-intent pop-ups. They are an expected part of the browsing experience. I don’t think of them like the pop-ups (or pop-unders!) of the early '90s. An exit intent is limited to the window displayed.
That said, I do strongly believe in respecting user preference. If a person clicks the “no thanks” button I do cookie that preference and suppress additional exit intents. And I have problems with how some companies hijack the back button on mobile to show exit intent style content.
But I disagree with the sentiment that surfacing an offer to a user is hostile.
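A minimal sketch of the kind of suppression described above. It uses localStorage instead of a cookie for brevity, takes the storage object as a parameter so it can be exercised outside a browser, and all names here are made up, not any particular site's implementation:

```javascript
// Hypothetical key under which a visitor's "no thanks" choice is kept.
const DECLINED_KEY = 'exitIntentDeclined';

// Show the exit-intent popup only if the visitor has never declined it.
function shouldShowExitIntent(storage) {
  return storage.getItem(DECLINED_KEY) !== 'true';
}

// Called from the "no thanks" button's click handler: remember the
// choice so future exit intents are suppressed.
function recordDecline(storage) {
  storage.setItem(DECLINED_KEY, 'true');
}
```

In a real page the exit-intent trigger (a `mouseleave` listener near the top of the viewport, typically) would first consult `shouldShowExitIntent(window.localStorage)` and bail out if it returns false.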
I definitely agree that the pop-ups work, but consider plastic: it works really well also. The problem is that once everything is packaged in plastic, the streets are littered with it.
Every page we visit has the same things to generate revenue littered all over their sites because it works—but what about the content? It’s becoming secondary to the littered advertising.
Ads used to be strategically similar to the content presented—but now things follow us around the Internet based on purchases in grocery stores etc. The pop ups want me to spend my time helping companies improve their products for free. Where’s the content going? Anything but the primary content on websites is litter in my opinion.
I suppose it’s the frequency of them, too: with exit-intent pop-ups on ten sites, the frequency becomes like plastic bags following me from the shopping centers to the park, to the rivers, etc. You can’t escape them until they’re eliminated.
Still not close to the same. I pollute a river and generations struggle. I distract you while you’re browsing online and the real impact rounds to zero.
Eventually you’ll die and no one will care about that time you got annoyed. But no one is ever going to swim in the Gowanus... at least not without billions of dollars being spent...
Me? Sure. Multiply that by the number of internet users globally, and the number of interactions they have with these sites. A half second of distraction is double-digit man-years every day.
Or you can think of it in the context of how easy it is for some people to lose their train of thought. An unexpected distraction of even a split second can mean minutes of trying to remember what was forgotten, and the mental effort involved therein.
There's a reason why people complain about pop-ups: they represent a real cost to our limited focus and therefore productivity. That means a lot of good ideas and a lot of work gets delayed or goes unrealized towards efforts like "how to clean up polluted environments."
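A quick napkin check of the "double-digit man-years" figure above. The global user count and per-interaction loss are assumptions for illustration, not measured numbers:

```javascript
// Convert a per-user distraction into aggregate man-years per day.
function distractionManYearsPerDay(users, secondsLostEach) {
  const secondsPerYear = 365 * 24 * 60 * 60; // ~31.5 million seconds
  return (users * secondsLostEach) / secondsPerYear;
}

// Assuming ~2 billion users each losing half a second once a day:
// 1e9 seconds lost, which is roughly 32 man-years, every day.
const manYears = distractionManYearsPerDay(2e9, 0.5);
```

Even with conservative inputs the result lands comfortably in double digits, which is the parent's point.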
I think, because if you give a little bit away, you need cookies to keep track of when / how much, and then the private browsing trick becomes an easy way to keep getting more.
> There is a solution: To buy a disposable phone number and use it to complete the verification. But this was the breaking point. At this point, I no longer bother.
This may not be enough. A case in point: two months ago my grandma decided to make herself a Facebook account to chat with her IRL friends. She had one made in 2014, but she only logged in to it a few times. It was made using a fresh email, and never breached. So she logged in to that account on her phone, using the newest Facebook app on a brand new phone, provided her real name, birth date, and photo, and linked a real phone number, never used before. She got permanently banned within a week, by just chatting on Messenger and sending friend requests on Facebook. I suspect some of her IRL friends reported the friend requests they received. Or maybe some AI decided she types too fast for an elderly woman, or uses too few emoji. We will never know.
The ban appeal form requires you to send photos of government-issued ID, and she decided it's not worth the risk.
It’s especially bad because you can use Facebook to log in to countless other sites, which you will also lose access to. I even had to upload my photograph to Facebook once to ‘verify’ myself, and my photo was rejected with no reason given.
Yes, it is true. The modern web really is becoming an unusable, user-hostile wasteland, and it is also very slow, very complicated, very messy, etc. The web developer tools help a bit, at least, but not quite enough. Web browsers should also offer better configuration. (Of course, there are some good web pages too, but they are making a lot of bad ones as well.)
But (as another comment says) documents should not be apps; they should be separate. (Although there is the case where sometimes you want to embed apps in documents, such as to make an interactive demonstration of some algorithm, to include inline currency conversion, or to validate or autocalculate a form. But the document needs to be readable even if the code doesn't run, and the code probably shouldn't run by default. Also, I think such embedded apps should perhaps be separate frames even when enabled, and they don't need to do as much as is now exposed through JavaScript interfaces.)
The modern web is also overused. Often there are better file formats (e.g. plain text), protocols (e.g. IRC, NNTP, Gopher), etc., that can do the job as well or better. Or even existing features of HTML and so on, which you could use, but they use worse ones instead.
For better or worse, NNTP and Gopher are effectively dead in the mainstream—and probably outside of extremely narrow niches. Wanting people to use them more is going to be an exercise in frustration.
I can at least understand Facebook (that I'm on), the NYTimes (that I'm not on), or the twenty other big sites being paywalls, even if I hate it.
But a random newspaper from some random little place telling me to subscribe to their meager content after three articles? What do they really think? I mean, even if I subscribed to my own local newspaper in my own little burgh, am I really going to go to the trouble of logging on and so forth?
All the paywalls seem more like excuses publishers make for their failings than anything they could reasonably expect to make money from. It's a lot like scholarly sites charging $25/article. My guess is that roughly 0% of visitors pay this. It's not a price for people; it's a price to justify other bullshit.
The funny thing is that the local burgh paper needs that subscription, or it will not be there in a couple of years. They need it a lot more than the NYTimes, which has a bigger global base.
My answer to this is yes, absolutely, I will log in and pay for good local news coverage, and I do every day.
If you don’t live there but the story explodes for some reason, why not try to get some new subscribers? Asking for money is not a big deal in my opinion. Asking for it by compromising your UX or user privacy, however (“only paid subscribers can view in incognito”) is terrible.
Why isn't there some sort of coalition between local news sites that allows e.g. a paid subscription to oregonlive.com to allow you access to m[ichigan]live.com, nj.com, and other "local" news across the country?
Sinclair Media already owns half the local news broadcasters around the country; why isn't such a setup already done?
Small publishers don't really have the resources to think much about their monetization. It's sad that there aren't more companies working on monetization, really; I was hoping browsers like Firefox would step in to help.
I know the “safe” decision in a boardroom is to suggest whatever makes the numbers go up, but this post is right. People push it too far and damage their products in the process.
Examples: I no longer click on Quora in search results. I’ve stopped using Reddit on mobile. I used to love Imgur, but now I hesitate because they force you to run 1MB+ of React and download 553 resources for a subpar experience of viewing a single image [0].
I understand that these are businesses and the lights must be kept on, but certainly there are better ways to do this. Maybe a more humane strategy wouldn’t be as profitable in the short term, but it would certainly be better for building a sustainable business that actually lasts.
One particular annoyance (present in this post) is the FB pages.
Many businesses only have an online presence on FB, and nowadays I don't think you can even check reviews or, e.g., the lunch menu (perhaps it depends on which A/B test bucket you fall into).
I left FB many years ago, and there's no way I'm creating a dummy account, for the same reason shown in the post (it requires a phone).
So I just give up. I can't blame the business owner, since they probably went for the straightforward free choice and didn't have to learn anything new, but FB, I truly hate it nowadays.
Is this really still true? I have not been in the US for a few years, but at least in many of the European and Asian countries I've been in, this just is not a thing. I have noticed that in places like Korea and Japan, local aggregator sites are still popular/crawled by Google as the results for search/maps, but even my colleagues, Tokyo natives, use the same aggregator sites when I'm visiting and we're looking for a place to eat.
Facebook feels pervasive, sure, but from my POV it's still contained to North America, and the influence elsewhere is waning.
I think he means that for small businesses, Facebook is the standard nowadays.
A lot of small shop owners in Eastern Europe have only a Facebook page with a location, hours, and some updates.
And I can't blame them, but there's the fact that every platform turns to shit after gaining popularity, or goes down if it doesn't.
And people without technical skills are feeding Facebook, sustaining this situation.
It would be nice for domain sellers to offer a free single-page generator with basic info, and free hosting for that HTML page attached to the domain.
> One particular annoyance (present in this post) is the FB pages.
Or how about Pinterest?
I've tried to make Google keep Pinterest pages out of my search results, but somehow Google does not want to improve my "search experience" even though they explicitly say they do.
The big divide is between apps and pages. Pages are just documents and should not contain any code. Apps are fine to contain code. The problem is that we now have pages that believe they are code and apps that have to be styled using the mechanisms for pages. This duality is here to stay for at least a while to come, and maybe for a long time.
But it would have been good, security-wise and privacy-wise, if this divide were made more explicit.
> Pages are just documents and should not contain any code. Apps are fine to contain code. The problem is that we now have pages that believe they are code and apps that have to be styled using the mechanisms for pages.
Javascript was intended to script HTML pages long before the concept of a "web app" was a thing. A divide between static documents and "apps" never even existed in theory, and with very few bleeding-edge exceptions, apps are also documents.
The reason the distinction isn't present in the web standards is that the web never intended for apps to be in scope at all. Being able to make something like Google Maps available on the web is pretty much a hack.
Intent for the ability to run code on the web goes back at least as far as discussions around HTML3, and you can find discussion about it on mailing lists from 1995[0]. Before javascript took over, the intent behind the <SCRIPT> tag was to support multiple scripting languages[1], and of course <APPLET> isn't exactly new. It isn't true that the web was "never intended for apps to be in scope at all," rather it is true that HTML didn't support anything of the sort from its initial version. Of course, HTML initially didn't support much of anything.
The reason the distinction isn't present in web standards is that likely no one foresaw the massive complexity of modern js applications using HTML as a GUI analogue - it was assumed that code on the web would be like shell scripts or utility scripts, and javascript was just intended for light housekeeping, nothing that would necessitate forking the entire web to create an "app" space.
However I have yet to find much evidence of hostility towards the concept of running code on the web from the people who actually came up with it - the "javascript delenda est[2]" attitude seems to be a modern ideology. Rather, the discussions I've read were about how to make it possible, what languages to use, etc. Not how to prevent it.
Among other things including DOM scripting, yes... but there was never any historical intent to consider HTML pages which used javascript at all as being something fundamentally separate to pages which didn't, or something other than documents.
But the distinction wouldn't have even made sense at the time, and for most of the web, it still doesn't make sense. It may only make sense in the future, when WASM matures more, though.
All of the privacy violations and dark patterns that javascript gets employed for are the result of choices made by developers and corporations... they're not fundamental and innate features of javascript or of having scripting in the web to begin with, so they're not problems that would be solved by quarantining the language to some kind of "app" space.
It's not like all of the problems on the web are caused by technology. Adding a <script> tag to your HTML doesn't automatically make your page weigh 10MB and snoop on its users. All those problems are caused by business decisions. When you decide to add an analytics script, or put an ad on the page, or hire a designer to make it look appealing to customers - that's where things go wrong.
In other words: even if we had a page/app split on the web from the beginning, businesses would still fuck it all up, and we'd have the same problems with bloat, ads, dark patterns and surveillance capitalism as we have today.
What I have been wishing for a while is having a well-defined, purely document-oriented subset of HTML5 standardized. In practice it would probably mean different levels or profiles being defined. It could help with many use cases beyond just the web, email coming to mind as an example.
The real challenge here is that document and application tend to get mixed up on the web.
Looking at even a good old-fashioned blog, there are really three nested things on your screen: the blog post document, the blog post-viewing application, and the browser.
Why a blog post-viewing application? In theory you could skip it and just have a plain reader view, but in practice you do want to have navigation and comments, and that's what the application part is for.
Many modern systems surface that split in the form of an API: the API is what gives you access to the underlying document.
The challenge would be making that split accessible to the user. For example, could a browser natively have two address bars? One for the document, and one for the application that is used to access the document?
This might be a brilliant idea, on par with asm.js.
Mozilla can implement it like they did with asm.js; after all, it is a subset, not a superset.
It will then work in all browsers but (I guess) can be made to work much faster if the browser knows it can ignore the piles of old hacks that full html has to consider.
Edit: and like asm.js it might lead to something even better going forward.
I have had thoughts around that same idea as well. Could be a start to just define a subset of existing good semantic HTML elements to use for a blog, a news site etc.
My dream is to be able to browse a subset of the web, that only uses this kind of document format. Something like docuweb.example.com or whatever :)
Yes, true. It is important to understand that computers brought us further than paper. We can now have "interactive documents", and we should respect that, understanding that this often dissolves the border between document and application. I see no problem with that. In the end, it all comes down to this: sometimes you need program logic (JavaScript) in the document. If that's the case, then just use it. The problem, as so often, starts with business and marketing. For them it's neither a document nor an application; it is "presence". And this is how it tends to get nasty...
That's true; there are places where code improves the page. But these are the exceptions that prove the rule: your average news site, blog, or scientific paper is typically improved without JS, not diminished.
And the exception use case could be handled the same way that Flash was during its decline -- a box showing that there is some special content on the page, but it doesn't run until you explicitly click on it.
Then it's there when you need it, but the friction discourages it from being used frivolously and gives you the opportunity to decline when the context makes it obvious it's not being used for anything good.
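A minimal sketch of that click-to-activate pattern, in the style of the old Flash placeholders. The document and container are passed in so the wiring can be exercised outside a browser; all names here are hypothetical:

```javascript
// Replace an embed with a placeholder button; the real iframe is only
// created after the user explicitly clicks.
function makeClickToPlay(doc, container, embedUrl) {
  const button = doc.createElement('button');
  button.textContent = 'Click to load embedded content';
  button.addEventListener('click', () => {
    const frame = doc.createElement('iframe');
    frame.src = embedUrl; // nothing is fetched until this point
    container.replaceChild(frame, button);
  });
  container.appendChild(button);
  return button;
}
```

Until the click, the page contains no third-party frame at all, so nothing autoplays and no tracking request is made on the embed's behalf.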
This article is a great headline that doesn’t even begin to scratch the surface of what’s wrong. I’d say the internet has become an Amazon-style marketplace of fake doo-doo. Anything you search for turns up nothing but endless fake domains with fake reviews designed to pitch you more ads and track you even more. Finding an actual, original, legitimate source of anything is essentially impossible at this point.
How do you find an authoritative source on the web, if you're not one of the older generation who figured them out earlier?
Google directs you to a handful of big-name sites and a large pile on the spectrum between blogs and content farms. Facebook will only give you other Facebook pages. When I tried to track my passport application last year, my government passport agency was only available via Twitter.
Good luck getting to an actual authoritative organisation that runs their own website, or a college professor who actually has specialist knowledge about the subject.
I do find it incredibly hard to find anything resembling a trustworthy source of info on a bunch of topics.
Travel? Full of nonsense
Diet? Don’t even try
Exercise? Maybe
Health/Medical? No
Anything I might purchase? Depending on the category, everything is an advert by someone who has barely done anything with the product.
Every month a new site is born that lifts content off Stack Overflow and somehow manages to top it in Google results, and that's an extremely niche audience with laser-focused searches. I can't fathom what the web looks like to the non-technical folks who don't know how to filter by domain or add keywords to remove the most egregious spam.
Sure, the good stuff is out there if you know how to find it, but discovery is getting worse by the year.
The author is probably in the US, if you're in the EU you get a bonus:
The cookie popups that cover half the page, or all of it, are usually dark-patterned so that it's very easy to opt in but hard to impossible to opt out, and worse, they come back every few days if you didn't opt in.
I simply can't follow some of the stuff that gets posted on HN because I'm not about to spend a day going through, for example, Yahoo->Oath->Verizon's popup about agreeing to tracking...
Presumably they don't store a cookie indicating your opt-out choice because you chose to opt out. That's a small price I am willing to pay as a reminder that I don't really need to read the article on their damn website anyway. There are multiple tech 'news' websites that are often posted on HN that I haven't read articles from in over a year. I find personal tech blogs to be much less invasive and more interesting anyway, which is the silver lining to the current trend of news websites being so unusable.
I don't have a Twitter account, so I browse it by going to mobile.twitter.com, which allows me to view tweets and replies. I assume this is an oversight by Twitter, and it will be locked down at some time, but it serves me for now.
Similarly, I only ever browse reddit by going to old.reddit.com. I don't expect that to be supported for much longer either.
I maintain a separate website for my sole proprietorship that is fairly tiny made with MkDocs. I have a blog running on ikiwiki. The sole proprietorship’s Facebook page mostly points you back to one of those two sites. There is a tiny amount of JavaScript on the MkDocs site but it is manageable.
Living in rural Ohio with flaky cable broadband means uBlock stays almost permanently on. I’m sure high engagement ads are nice things. Waiting a couple minutes for them to load makes for a poor user experience out here on the fringe of the qualitative digital divide.
I mean it’s only $2/month. Less than what you’d pay for a cup of coffee. Or $340/year; here’s hoping you’re very bad at math.
I suspect the 340 is deliberately designed to combine with the psychology of "bulk prices are always lower"; when shopping, I've seen plenty of examples where the larger quantity and more expensive item is shown boldly as "on sale" with a prominent reduction from a "non-sale price" when its per-unit price is still higher than the smaller quantity one.
No, $2/month is the introductory rate for 3 months, then it goes back up to $34 or $39 a month, so if you're subscribing long-term, it still makes sense to go for the yearly package after your introductory price.
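That reasoning is easy to check with the figures from this thread (assuming the $34/month post-intro rate; the exact numbers are from the comments above and may not match current pricing):

```python
# Cost comparison sketch: $2/month intro for 3 months, $34/month
# afterwards, versus a $340 yearly package (figures from the thread).
INTRO_RATE, INTRO_MONTHS = 2, 3
MONTHLY_RATE = 34
YEARLY_PACKAGE = 340

# First year on the monthly plan: 3 intro months + 9 full-price months.
first_year_monthly = INTRO_RATE * INTRO_MONTHS + MONTHLY_RATE * (12 - INTRO_MONTHS)

# Every subsequent year on the monthly plan, with no intro discount.
steady_state_monthly = MONTHLY_RATE * 12

print(first_year_monthly)    # 312
print(steady_state_monthly)  # 408, well above the $340 yearly package
```

So the monthly plan can edge out the yearly package in year one thanks to the intro discount, but long-term the yearly package wins, which is the point being made above.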
Could also be a business account thing. Lots of companies make you manually file an expenses claim for each payment, so if you're not paying anyway why not pick the easier option?
1. Customize a decent browser eg Firefox to my own safety standard (disable cookies, Autoplay, etc.)
2. Treat every website like an application. If a site is usable under my settings, fine.
3. If a site needs an additional capability, e.g. a login credential or cookies, only provide it for that site if it’s compelling enough; otherwise blame the site and move on.
After customizing a few dozen sites, and not being able to use countless others, the modern web becomes a lot more manageable for me.
I do similar things and it kinda works for people like us. But, firstly, what about the rest of the world that's less knowledgeable or technically inclined; and secondly, the fact that we (figuratively) have to put on armor and carry an arsenal of weapons before we can think of using the web still leaves a bad taste in my mouth.
I don't know what browser you use, but I'm not familiar with any that both allow notification requests and don't allow you to turn them off somewhere in settings.
I’ve noticed sites putting up a fake notification modal first, and then triggering the real browser pop up only if you click yes. I believe this is so they can ask you again on your next visit if you say no, which you can’t do if you deny the browser pop up.
Fool me once, shame on you. Fool me twice? Not going to happen because JS got disabled for the site the first time around. Site doesn't work after that? Too bad. It's not like there aren't a bajillion alternatives.
Why was draw_down's comment killed? A quick glance at his profile indicates it's a four-year-old account with more than 1500 karma, yet every recent comment by him is flagged/killed [1]. Yes, some of them appear trollish, but not all of them.
Comments which were killed due to flags are distinguishable: they say [flagged]. His comments are just [dead], meaning either they were caught by the spam filter or draw_down was shadowbanned; presumably the latter in this case. HN’s policies around shadowbanning continue to mystify me.
While you guys think Facebook, Twitter, and Medium are bad enough, we in China are on another level.
Most top websites in China will "ask" (if popping up a download window counts) users to download their apps if the user is on a phone, and some websites are intentionally left dysfunctional to force users into the app. And guess what's the first thing those apps ask you to do? "Please log in (and... maybe send ALL your information to us, plus a secretly taken screenshot of your portrait)".
Since almost all of them are doing this, I assume it is a crucial strategy for them to keep their users active. Facebook, Twitter, etc. may be up to the same thing, just not bold enough.
This happens everywhere. Privacy considerations aside, asking a user to download an app is just like placing a brand name on his/her main page, so that they're not just constantly exposed to it, but will be tempted to use the app when in need, real or ad-induced, of something to purchase.
For an Internet company, users are what generate value. So it makes sense for those companies to grab their users as tightly as possible, especially when the competitors are doing the same.
On the same note, if a user just doesn't want to log in or use the app, then the user is of no value or even negative value to the company. For that, a company can intentionally annoy (the actual word used is "convince") those users until they give in.
With all that being said, what I wanted to point out is: sometimes, parasites can be successful. This is why we have so many types of parasites in the real world. Automatically assuming "parasites will just die" is not going to help if the end goal is to kill all the parasites.
"we can't possibly have apps for anything that spark our interest. that just won't work!"
Exactly. That was how the internet worked before the web (and still does today): POP/SMTP, FTP, NNTP, Gopher, etc. each had its own client because they were different protocols doing different things. But today, when phones have a working web browser that does what 99% of apps can do and more, it defies all logic to force people to install an app for every brand, sometimes for every product. That would be OK for mail, news, file transfer, IoT, music, video, etc. because they're very different things, but installing 5 apps to read 5 newspapers or buy 5 products from 5 different vendors is like throwing into the toilet all the innovation brought since the web was invented.
I can guess only three reasons for this: sticking a brand name onto the user's screen, bypassing browser ad blockers, and getting a foot in the door for nefarious purposes, since most apps require insanely dangerous privileges which one day could be exploited.
I wish this were true, but unfortunately cheaters win and growth (of money, power, or both) trumps all other motivations for some people.
Competing is hard. Why compete when you can buy elected officials and change laws in your favor?
The sad truth is that lots of value-destroying and/or cheating companies succeed for decades.
Examples:
- every management consultancy
- IBM
- record labels + RIAA
- Outbrain and all the other "related links" companies
- many pharma examples, like the Sackler family
- all of the heavily subsidized US industries like corn and oil
- Nestle, Coca-Cola, and other companies monopolizing and selling public water
- many grad school programs
- Mercola
- countless Chinese knockoff brands on Amazon
- the House of Saud
- Susan G. Komen Foundation
- Intuit/TurboTax
- Amway
- almost all fund managers
- HerbaLife
- Donald Trump
- most ISPs, cable companies, and mobile carriers
- many (if not all) major defense contractors
That's a great analogy, but I'm not sure about the conclusion. Being a parasite on man is not a good evolutionary move, since we'll pick the little bastards off. But attach to something stupid, and that's a better prospect.
So I guess it depends on how bright the host is, and that's questionable.
Looking for restaurants as I'm traveling is a difficult process. I used to land on Yelp and its imitators, thinking that I was getting a solid list of what was available, only to find that there is so much more that isn't listed. And the reviewers are generally awful.
People search sites. Years ago, I was able to track down quite a few people from my long-distant past for free. Then those sites were bought by companies like Intellius, which is about as helpful as classmates.com.
Early on Classmates.com had as pretty much their sole value proposition that they'd actually input insane numbers of class rosters and set up pages for the schools as gathering points. The social network side of things was actually more successful than I would have thought for a long time, but Facebook has pretty much eaten their lunch on that side so they're back to needing to be a good contact method. They're not.
Worse, they're getting creepier. They'd already collapsed into a nearly pure scam by 2010 or so, but apparently they're doing some very creepy stuff with their data sources now. I had to work with them while trying to connect another site (ick) and never had an account that had both my real name and my high school (in fact my school was only in test data), but at some point in the last few years they connected the two and are trying to convince me that people are looking for me under a name they'd never know. I guess the inevitable devolution of something selling getting back in contact is selling the idea of contact without the reality.
It's gotten to the point where when I read an article, I will move my mouse around to try and trigger the "Subscribe to our Newsletter" popup, so I can close it instead of accidentally triggering it as I read the article.
This behavior drives me nuts. I'll be like 3 minutes into reading and, bam, a modal popup blocks out what I was reading and I have to click it away. The ones that really make me angry are when I move towards the back button and they show a popup (nope, I'm not gonna spend another millisecond here after that).
Or even worse, the ads that come in from the top and push everything down the page then leave a minute later.
It’s the browser vendors’ fault - they not only failed to provide us with options for disabling/circumventing this abuse, they are complicit with the ad industry and actively support their interests rather than their users’. They’ve also made browsers so complex that it’s impossible for small developers to provide a more useful alternative. This is where I am hoping we will reach the breaking point soon, so we can replace the whole mess with something simple and user-friendly.
While I agree that we need a community's browser (one that fully supports XML, ideally bringing back the original SeaMonkey/Firefox way of doing things) rather than a corporation's browser, I doubt it could be made more simple. Today's technologies are here for a reason; it's always what you use your kitchen knife for. Change will not happen. But what we can do is keep "our" web alive and active, and communicate and socialise outside of the big corps' wet dreams.
Just create your part of the web the way you want it to be, and as long as we keep it up, we will not vanish. We need to understand, though, that we must support non-profits, like Archive.org, with our money.
On the surface I agree with this, but you don't necessarily need to make a new browser to provide an alternative. Writing a browser is a fool's errand, but there are like a bajillion Chromium forks all focused on "privacy" and whatnot that block things by default.
It's pretty concerning that the lines between desktop and web are blurring and the OSes are turning into platforms. Some additional things I feel are pretty hostile:
* PayPal - defaulting to fee-based payment options rather than free ones, and not allowing the user to set a default.
* Google maps - Searching for "jiffy lube" will show you the business entry and its reviews. If you search for "oil change" it will show you a jiffy lube "ad" that looks the same as the entry but if the reviews are bad they are hidden. In this particular case the location had several thousand bad reviews.
* Google maps - restaurants "menu" button takes me to a page of a bunch of customer uploaded images.
* Amazon app - why isn't "returns" a main menu item in the Amazon mobile app?
* Uber Eats/Grubhub/Favor - app carts hide fee increases to account for minimum spends, meaning you are losing out on another item for the same amount, and they make it harder than it should be to remove items from the cart.
* NordVPN desktop app - hides the logout button in right click -> preferences.
* Apple/Google stores - make it difficult to view recurring subscriptions, and to request a refund.
* Chrome - over the last year I've had to go into settings and tell chrome to not allow sites to offer me notifications so many times. I don't understand why I have to keep resetting it with sync and similar.
* Google Images - pretty worthless thanks to Pinterest, and the fact that I can't download images because of BS corporate IP laws.
* Image sharing - still getting more difficult in 2019.
* App notifications - more and more apps push unimportant notifications or spam for engagement while lacking granular controls. I find myself disabling more and more notifications outright only to miss a notification that I actually care about.
Remember that we are techies, "hyper" aware of these decisions and their implications. What about nana? We are losing digital agency.
When the general web user finally revolts against sites handing them arbitrary programs to execute, the people that rely on them running their code (ahem, Goog/FB, cough) will have a "next gen" HTML that just happens to also be Turing complete.
The only solution is to refuse to execute their code, and in the cases where we actually want to, grind the GET response through a halting engine a la BPF.
What world is the author living in? Probably just the world of a web developer. Most people use the modern web all the time, it sucks, but sucks != unusable. Very clickbait headline.
I realized, while making a Christmas list this year that I don't know how to shop for things on the internet anymore.
I am not a web developer, but I have grown up as the internet did and I've seen more of it than most people.
Amazon: It's tolerable if you know exactly which reputable-brand item you want, but if you don't you immediately find yourself in an unusable cesspool. Fake reviews. Chinese knockoffs. Low quality promoted items. I searched for "clamps" in the "tools and home improvement" category and there in my Amazon search results were trashy supermarket novels.
Amazon alternatives (e.g. Jet?): The selection is so poor that they are not usable for general shopping unless you are willing to settle for whatever one item is available.
Google/Bing/etc: Haha no. Sure you can find 1 bazillion retailers for any item you can think of, but they feel solidly sorted by "amount of money they pay google." Google shopping is more-or-less useless, and regular google search results don't allow you to compare products. Also, since it is not hard to find decent web developers, it is next-to-impossible to judge the reputation/quality of the bazillion retailers by their website polish.
Ebay/craigslist: Oh god no. Paypal hates me for some reason (it never accepts my perfectly ordinary credit cards) so payment can be literally unusable. All these sites feel like you're just selecting from a long list of scams. The amount of BS you have to manually filter out is completely ridiculous, and you are clearly competing with people who scrape these sites for items to flip for a profit.
Facebook: Unusable by anyone who actually cares about their privacy.
Chinese importers (ali/dx): shipping times of multiple weeks is unusable for many things. At least you know that you will be getting cheap Chinese products, so you can plan accordingly.
I have a similar impression of the current Amazon experience, but what works for me is shopping at small-ish stores using Amazon Pay. Hassle-free, no registration needed and I get proper service if required.
ITA: OMG! I'm required at many huge sites to register for the site to view content! Some of them aren't even free, I have to pay!
If you don't like those sites, don't use them. Sites as described do not constitute the "entire web". There are plenty of great still free still usable websites. There is an entire internet of them.
Believing that the whole web is just the monopolistic giants is the problem here.
Even simple stuff like the back button is now hopelessly broken on a lot of the news sites I frequent. The correct behaviour is to bring me back to wherever on the page I had scrolled to, but that's no longer possible if you need to click a link to view the whole article. Even without that dark pattern it can still take 10+ seconds with all of the JS bloatware infesting the "modern" web.
I've started noticing in-page search breaking more and more, for the same kind of reason. Sometimes you'll see more search results reported than are visible and/or can be navigated to.
I know it's unoriginal to attack the 'hypocrisy' of such an article, but I found it frustrating that the page includes a 'kudos' control which a) could just be a click but, for some reason, is a 'hover for a very short number of seconds' action which cannot subsequently be undone b) works with the mouse only.
What we have today is: email addresses are valuable and people give them out.
The problem is that Gmail has failed to introduce one-time-use or per-recipient email addresses. That would immediately solve the problem of coerced email collection.
I use a subdomain for this. *@subdomain is forwarded to me. Anytime I need to give anyone an email address, it's a unique one (usually their name or a derivation thereof). Anytime I tire of email from a given source, I configure my postfix to reject the message with something like "550 Go fuck yourselves."
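For anyone wanting to replicate this, here is a minimal sketch of what the Postfix side of that rejection can look like, assuming a catch-all subdomain already forwards to you; the map path and the burned address are placeholders:

```
# /etc/postfix/recipient_access -- one line per burned address.
# A numeric action in an access(5) table sets the SMTP reply directly.
acme-news@sub.example.com   550 Go fuck yourselves.

# main.cf -- consult the map before accepting mail for a recipient.
smtpd_recipient_restrictions =
    check_recipient_access hash:/etc/postfix/recipient_access,
    permit

# Rebuild the hash and reload after each change:
#   postmap /etc/postfix/recipient_access && postfix reload
```

The nice property of per-sender addresses plus a recipient-access map is that the sender is rejected at SMTP time, before the message body is even transferred.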
The modern web reminds me of the days leading up to the .com bubble burst: adverts being crammed everywhere, spying on users, and other user-hostile tactics in an attempt to make a dollar.
Engagement-focussed platforms and publications are of course trying to sign up as many people as possible using dark patterns and other annoyances to make people subscribe to their services to be spared from all the bullshit. I'm not surprised: They actually provide useful services for free and the bigger the audience the higher their revenue from advertising. Over the next few years content on the Internet will go the same route as music and video: It's free if you endure the ads and there's one subscription for a majority of the publications with each getting their share.
However, what the author describes isn't "the web". The web is about 2 billion web pages with just a couple of percent of them being actively maintained. I'd be willing to bet that just a tiny minority of these pages are actually serving the annoyances this post highlights.
We've seen the abuse of new technologies come and go; people just don't think about what they're building, they simply use what's available: blinking text, text with drop shadows, Comic Sans, pages full of GIFs, pages built with tables, iframes, pop-ups, link-hijacking ads, Flash ads, embedded Java code - the web was never the clean and organized place people make it out to be. Never. It's a romanticised notion that just doesn't die. It's the same rhetoric as when people claim that life was easier at some point in the past.
It's all about finding common ground on what's necessary and what isn't and it usually starts with a couple of independent publishers trying out something new and the big players following their lead. HTML5 video killed Flash and Silverlight with WebGL/Javascript & modern CSS features being the last nails in their coffins.
The web isn't just a couple of hundred big players and we will probably survive annoying push notification requests, overly complex Javascript and EU-mandated cookie opt-in hints.
We will have to live with new stuff being abused for a while until browser vendors decide that it's gotten annoying enough to do something about it.
I wouldn't be too surprised if we find that all browser vendors decide that they'll disable cookie support in their browsers with a clear prompt to allow 1st party cookies globally only to then do some local machine learning magic to click all custom cookie acceptance dialogues without user interaction or something similar. Of course we'll then have to endure our browsers asking for permission to access Bluetooth devices in our vicinity or to run advertisements disguised as tiny WebAssembly programs. But even that will sort itself out at some point.
The web has always been user hostile and that won't change. The level of hostility however, will.
It used to cost a heck of a lot of money to host a website. At one point I was responsible for sendmail.org, which was a pretty popular site for the time (2000/2001). Especially when we had a new release and got "slashdotted".
To build a site that could stand up to that kind of traffic, we had to pay Dell $10,000 for a beefy enough box, and then host that 5U monster in a cabinet in a datacenter. That kind of datacenter space cost about $500/mo.
So accounting for inflation, that was about $700/mo for the datacenter space (assuming I had use of the remaining 37U that I was paying for), as well as an initial investment in hardware of about $15,000 in today's dollars.
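The inflation adjustment above is straightforward, assuming a cumulative CPI factor of roughly 1.45 between 2001 and 2019 (an approximation; the exact factor depends on the index and dates used):

```python
# Rough 2001 -> 2019 dollar conversion, assuming a cumulative
# CPI inflation factor of about 1.45 (approximate).
INFLATION_FACTOR = 1.45

colo_2001 = 500       # $/month for the datacenter cabinet
server_2001 = 10_000  # one-time cost of the Dell server

print(round(colo_2001 * INFLATION_FACTOR))    # 725, i.e. "about $700/mo"
print(round(server_2001 * INFLATION_FACTOR))  # 14500, i.e. "about $15,000"
```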
This was a static website.
To run that same site today to handle an equivalent burst of traffic, I could run it on AWS for about $15/mo and have much better reliability, or on a $5/mo VPS and have the same reliability as I had in 2001.
All with no upfront investment nor the need to rent a bunch of cabinet space I may or may not use.
these days, the hosting costs of an "internet of yore" site is literally only a couple bucks a month on a vhost provider. In the good old days it was really expensive.
It's literally free to host mostly static sites these days. Just put free Cloudflare in front of free App Engine or other free host. Unlimited traffic, $0.