
> 22.82 Mbps will reliably download very complex web pages nearly instantaneously.

The author may be unaware of how ridiculously huge web pages have gotten. I just loaded Wired.com, scrolled down and let it sit for a few seconds. It downloaded 96.2 MB, requiring over 33 seconds on one of those average connections. On a pay-as-you-go data plan, it would have cost about a dollar just to load that one page. The front page has about 500 words of content. It also covered 100% of the content with ads, twice.
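
For anyone who wants to sanity-check those numbers, here is the back-of-the-envelope arithmetic; the ~$0.01/MB pay-as-you-go rate is an assumption, not a quoted price:

    // Rough check of the figures above; runs in any JS console.
    const pageMB = 96.2;     // observed download size
    const linkMbps = 22.82;  // the "average connection" from the article
    const costPerMB = 0.01;  // assumed pay-as-you-go rate in USD (not a quoted price)

    const seconds = (pageMB * 8) / linkMbps;  // ~33.7 s
    const dollars = pageMB * costPerMB;       // ~$0.96
    console.log(seconds.toFixed(1) + ' s, $' + dollars.toFixed(2));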

This is unsustainable. Web developers have utterly squandered all efficiency gains of the last 30 years, and then covered their grossly inefficient work with impossibly annoying, privacy-invading advertising. Google should be applauded if they make these wasteful developers suffer monetarily until they shape up. They've already stolen untold amounts of time and energy from us all.




>Web developers have utterly squandered all efficiency gains of the last 30 years, and

Loading Wired.com with uBlock and no JavaScript, the page comes in below 1.5MB for me, with most functionality seemingly intact (in that the front page looks mostly normal and I can load articles that appear to have their text completely intact). The bulk of that seems to be fonts and images, which are probably unavoidable for a media site.

Some reasonable noscript whitelisting for Wired.com and a few others (out of 12 total that noscript shows me) gives a page size that's still under 5MB.

Looking at the full page with everything loaded and unblocked, the biggest offender here seems to be not web design, but an aggressively downloading autoplay video on the front page. Without that the page itself is - while not necessarily great - at least reasonably bearable.

Truth be told, I'd started this post intending to blame advertisers, and there is still some merit to that since even before the video kicks in the various third party scripts balloon the page size several times over from the minimal functional one that loads with everything blocked. But in this case, it does simply seem to be a wasteful media stream playing without regard to whether anyone wants it to or not.


With uBlock Origin it was 1.5MB for me, and without it 3.8MB, on Firefox, coming from Germany. Both numbers are still pretty ridiculous for what's actually visible on the page.

Once you scroll, however, things get messy no matter what, because of the "Scott Adkins Answers Martial Arts Training Questions From Twitter" auto-play video they have right now. That quickly ate up another 30MB, and the video wasn't even visible (I had scrolled past it).


I browse without 3rd-party and 1st-party scripts [0]. I wanted to praise my setup but it does not work well with Wired:

1.37 MB / 722.93 KB transferred, Finish: 6.57 s

versus uBlock default

8.62 MB / 4.49 MB transferred, Finish: 28.38 s

A clean setup mostly increases the load time:

11.58 MB / 5.79 MB transferred Finish: 1.13 min

^ checked with "Disable Cache".

Not much content is delivered for such a big HTML file:

660.46 / 167.60 KB transferred

because it's mostly inline script:

    // Run in the devtools console; $$ is the console's shorthand for document.querySelectorAll.
    $$('script').map((s) => s.textContent).join('').length
    568681
And fonts.css is an inlined font:

127.70 / 100.15 KB transferred

[0]

uBlock Origin:

    * * 1p-script block
    * * 3p block
    * * 3p-frame block
    * * 3p-script block
    * * inline-script block
or uMatrix:

    * * * block
    * * frame block
    * 1st-party css allow
    * 1st-party frame allow
    * 1st-party image allow


That's still a lot. The longest pages on my website download around 200kb of data. 1.5MB is a lot of data for text-based information.

Of course, some space should be left for enhancing the text with images, but the Wired front page doesn't have that many of those.


We all use ad blockers but most people don't.


25 years ago I used to read Texas Monthly magazine (on paper) because it was some of the best journalism around. I finally unsubscribed when I had to flip through 16 pages of ads to get to the table of contents, and several ads per issue contained scratch-n-sniff perfume samples which meant that reading an article required you to smell like a bordello afterwards.

Wired is the digital equivalent of that now. There's some great journalism in there, but no journalism is good enough to put up with a hostile attack on one's sensibilities just to read it.


Similar to https://www.google.com/search?q=mattress where 100% of all the content you see there is an ad (turn off your ad-blocker just to witness the monstrosity).

Isn't this a direct insult to intelligence? And where and when do you draw the line?


There isn't much difference with adblock on or off for this page for me (using Chrome on Windows). With adblock off, there are three obvious ads on the top right of the page, search results on the left. With adblock on, these are gone.

In both cases, the top of the search results is a map with a few shops listed, and under this are normal search results.

Repeating this on mobile, the ads are moved inline above the map. They take up about 50% of the screen. That is a bit intrusive. However, I have muscle memory that automatically scrolls down a bit after searching and to be honest, I would not have even noticed them if I wasn't looking.

I don't know where the claim that 100% of the content is ads is coming from. Perhaps it's different in different regions?

I'm not trying to be an apologist for Google here, by the way. However, when it comes to intrusive ads there are far worse offenders (although perhaps not when it comes to behind the scenes data collection, which is a bigger issue).


Here is what I see: https://imgur.com/a/OLSWA3K

As far as I can tell pretty much everything on that screen (including map results) is an ad. That's 100% commercial results on a fairly large screen. Plus there is more below the fold.


Wow, that's very different from what I see.


You're right, by turning off my ad-blocker, my mobile screen was fully covered with ads.

What constantly worries me, however, is what happens when most visitors use ad-blockers. My guess is that a war would be declared against current web standards, under which users can still modify the DOM and remove unwanted content. Edit: reworded for clarity.


I used to get Wired in print in the late '90s, and I can tell you, Wired then was the print equivalent of Wired.com now!

Pages and pages of ads before the contents, and then each article would usually be one or two beautiful full colour pages, before a 'continued on page 94' which took you to the remainder formatted as cramped commieblock-looking text at the back - which you got to by skipping over a few dozen more pages of ads.


Indeed. Ads are just noise that should be filtered out. To visit a website and be served 100 megabytes of noise along with less than 1 megabyte of actual information... It's an absurdly low signal-to-noise ratio, such an incredible waste.

We need to fix the web by blocking these things. No compromises. If a website can't exist without advertising, it should disappear. Eventually, only good websites will remain. Websites made by people who actually have something to say rather than websites made purely to attract audiences for advertisers.


There are two types of noise being served here:

- The advertising
- The tracking

The tracking is being used to:

- tell advertisers how well their advertising is working
- tell the site how well articles are working
- give unscrupulous sites the possibility of selling that data to others, which are probably advertisers but maybe other companies.

As far as the ratio of advertising to content goes, this article https://www.editorandpublisher.com/news/higher-ad-to-edit-ra... about that ratio in newspapers assumes 60/40, where I believe the 40 is supposed to be the content (although I find the wording not 100% clear).


How will these remaining good websites make enough money to sustain themselves?


Why should they have to "sustain themselves"? If an author wants to put their ideas out there, maybe they should pay for it themselves so they can have their own unmoderated space on the internet.

Authors that rely on advertising have an inherent conflict of interest: they simply won't write anything that offends the advertisers, because they're afraid of losing their revenue. Sites like Reddit will nuke entire communities if they prove controversial enough, not because they're offended by them but because it causes advertisers to refuse to associate with them. Activist groups can attack and censor anyone these days by putting pressure on advertisers and sponsors and causing them to pull out.


Why should they have to not "sustain themselves?" If a reader wants to read what an author puts out, maybe they should be allowed to be subject to advertising so that the author doesn't have to pay for it.


With a few exceptions, I learn more from user comments on sites like this than I do from today's "journalism".

Turns out, people willing to spout random ideas on a topic are not in especially short supply and 99% of them are willing to do it for free. The best part is, these free users usually get right to the point.

Long-form and investigative journalism need to be funded, but the kind of information in the junk articles on the homepage of CNN or Fox is usually better hashed out (and much less biased) in the comments section than in the article itself.

In a sentence, most media doesn't have much value-add. Even less so if I have to click through 6 ads and be exposed to malware to see it.


> If a reader wants to read what an author puts out

Why would anyone want to read stuff like sponsored articles which are nothing but thinly veiled advertising? Articles that were pretty much written by PR firms? Why would anyone trust journalists with conflicts of interest? Social media "influencers"?

I want real information. Real thoughts from real people. Not some censored, manipulated corporate viewpoint created to maximize profits. People who actually have something to say go out of their way to tell as many people as possible about their ideas. They don't need to get paid for it. I'm not getting paid to post here.

> maybe they should be allowed to be subject to advertising so that the author doesn't have to pay for it.

Allowed by whom? The user is in control, not the author. It is the user who owns the computer that runs the browser. If any ad gets shown on the screen, it is because the user generously allowed it to happen. Most people do this out of pure good will only to end up being mercilessly abused for the sake of someone's business model. Nevertheless, it is a privilege which can be unilaterally revoked and there is next to nothing that websites can do to stop it. After content has left the server, the author is no longer in control.


Sometimes the best things aren't commercially self-sustaining. Blogs, paid for by the writers' day jobs. Professors' sites on .edu domains, paid for by research budgets.

As for professional journalism, the absence of the conflicts of interest that ads create is essential for it to be considered "good", so no good journalism website should be clouded by advertising. Yes, that probably means subscriptions.


Why must websites sustain themselves financially?

It can literally cost you $30 a year to host a website.

Many people will spend more on coffee. Per month.


True, but I think the hosting cost is negligible compared to the cost of the time invested.


Is it though? Nowadays you have plenty of good free CMS-es which integrate directly with Netlify. Sure, it might be half a centimeter more complex than WordPress admin but it's still really easy to grasp for many non-tech people (checked).

But even if it was very complex -- which it isn't -- I still fail to see how that supports a model of an ad-supported web hosting.


I think the main cost is producing the content. Researching, creating, and editing the actual words and images.


Sure. Not disagreeing. But those people can make money today as well -- via stuff like Patreon.


I hate capitalism. I'm fully an anti-capitalist.

But I'm also a realist. We live in a capitalist society and free content has come out of advertising since before the web. It's an annoying part of the present world but not the most annoying part imo. I don't know how people denouncing just advertising expect the publication of free information to work.

This society has created a vast plenty. I don't see why advertisers, the public and publishers couldn't reach a truce where a moderate amount of semi-relevant text ads get shown to the reader in exchange. But everyone wants total control, wants to club all competitors to death, and that seems to be the way this world of plenty is ending.


It is hard to believe that 100MB of payload is needed to show me the same amount of ads on a page. Merely optimising that, without even changing your ad volume or model, would go miles toward re-establishing the trust that has been lost.


I agree, but somehow when I google for something, Wired and all these websites are the ones that appear as the first results, so what's the deal? Does Google not care if your website is heavy, as long as that domain provides a lot of ad income?

I've made a pure-HTML website with just a small CSS file and no JS, with really great content. Google doesn't take it into account. So I don't know if they are really pushing for a lighter web, or maybe, I don't know, it is just easier to convert these websites into AMP?


High network utilization doesn't have to mean an unresponsive website.

Google cares more about responsiveness. A site is responsive if you can start reading it quickly, regardless of how much network traffic is being used by ads, as long as the ads are loaded asynchronously after the primary content.
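
As a minimal sketch of what "loaded asynchronously after the primary content" means in practice (the ad URL here is a made-up placeholder):

    // Inject the ad script only after the primary content has loaded,
    // so it never blocks the first render. The URL is hypothetical.
    window.addEventListener('load', () => {
      const s = document.createElement('script');
      s.src = 'https://ads.example.com/tag.js';
      s.async = true;
      document.body.appendChild(s);
    });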

Penalizing for high total network traffic is short-sighted and would prevent most video hosts from ranking well in Google Search.


Having worked specifically on website responsiveness improvements for a medium-size publisher, I can certainly attest that ads heavily influence performance, even if added asynchronously after the first paint, or after the site becomes interactive; we tried every trick in the book, and then some.

The main reason is that ads are coded by people who have no interest in performance. I've seen huge JS libraries loaded to display a static image. I've seen multiple versions of the same library loaded in the same ad. I've seen things that wouldn't fly anywhere else.

Why? Because the ad agencies are under a lot of time pressure to make things quickly, and there is NO penalty if the quality is terrible. So they take what worked last campaign, add new tags to satisfy the new customer, and ship it out. It displays? Perfect. It's huge and slow as hell? Who cares, it's not their website that gets slow as molasses.


Do you know how they decide whether someone is able to start reading quickly? Personally, I find asynchronously loading ads around the text so distracting that I basically can't focus on reading until the page is done loading, so for me the whole site effectively needs to load anyway. In the same vein, I would prefer Google to down-rank all sites with anything but entirely static ads.


One of the performance checks they use is how quickly the body text renders into a readable format as a page loads. If your site has slow-loading web fonts AND you haven't specified a fallback font (e.g. serif or sans-serif), Google will penalise it.
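
If you want to see that number for your own site, a rough console sketch using the standard paint-timing API is below; this is the browser-side metric, not necessarily Google's exact check:

    // Log first-paint / first-contentful-paint times in ms.
    new PerformanceObserver((list) => {
      for (const entry of list.getEntries()) {
        console.log(entry.name, Math.round(entry.startTime) + ' ms');
      }
    }).observe({ type: 'paint', buffered: true });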


They decide by the Wired guys having a coffee with the Google guys, or a chat at some party.

Google is people, after all. Corrupt as fuck. That's why plenty of people call it the gulag.



You're being downvoted but this is actually a useful suggestion. I use millionshort.com all the time when I want to strip out the low-value SEO crap that dominates search results.


Thanks for the appreciation


Google also considers page and domain authority, which your website likely doesn't have.


"Google should be applauded if they make these wasteful developers suffer monetarily until they shape up."

As the browser author with the dominant market share, Google could start by not enabling these "wasteful developers". These ridiculously huge web pages can lock up a user's computer when the user's browser is a "modern" and "supported" one but not when the user agent is something simpler.

I can read wired.com really fast using various tcp clients to make the http requests and links to view the html. If user A reads the articles with Chrome and user B reads them using something simpler, how is user A advantaged over user B? All things being equal, if we quiz them on the readings, would user A score higher than user B?

The difference is the advertising, which almost always requires graphics (the more dazzling the better), and the detailed tracking, which requires JavaScript and the presence of other "modern" browser "features". There is a strong argument to be made that Google's browser caters more to the online ad industry (who want to show ads and do tracking) than to users (who want to read wired.com quickly and efficiently).

Software developers have long been squandering users' resources, beginning with Microsoft Windows. Hardware manufacturers were Microsoft's first customers, and there was an incentive to get users to keep "upgrading" their hardware, i.e. buying new computers.

Web developers are simply following the tradition.

A user can get those 500 words of content in an instant with zero ads, using the right user agent, even on an "obsolete" device. However, there is zero incentive for the ad-industry-supported companies/organisations maintaining "modern" browsers, for the web developers writing code to run on them, or for hardware manufacturers to help the user do that.

The easiest way to change the "UX" for the web user is to change the user agent. Trying to get web developers to change what they design to only use a small fraction of what Chrome and the user's computer can do is far more difficult, if not outright impossible.


> Trying to get web developers to change what they design to only use a small fraction of what Chrome and the user's computer can do is far more difficult, if not outright impossible.

Not only is it improbable that web developers will change their malicious behaviour, it's also the user agent's fault for allowing it.

Why is there no connection speed detection in the browser? Why does the browser allow media playback by default? Why is there no mechanism that reflects the expectations of the user? Is the user expecting videos on the news website or just to read text and images?

I personally think that user agents are not really user agents anymore, as there's not even the idea of offering the user a choice like this.

And personally, I do not agree with the concept of trusting random websites on the internet - by default. Any website on the web should be distrusted by default, with the user having the choice on what to load and what to expect when navigating to that specific resource.

If I visit i.imgur.com/somename.jpg, why am I then redirected to an HTML page with megabytes of files just because the user agent accepts HTML? Should this not be outright impossible?

But please take my comment with a grain of salt, I am building a web browser concept that allows filtering and upgrading the web (which is superhard to build) [1] and it's still a ton of work to get anything working in a stable manner.

[1] https://github.com/cookiengineer/stealth


All those things you mention are design choices.

Perhaps one of the impediments to the development of new user agents is a feeling that they must be complex and meet some "standard" of features. A standard that is nigh impossible to meet for the average programmer. On top of that, web developers demand the ability to create complex web pages full of executable code.

However, we have no proof that users would reject a cornucopia of different agents that did not all have the same set of features. User agents do not need to be designed to satisfy web developers. User agents can be designed to satisfy users.

They can be designed to do one thing well instead of do everything.

No user agent need be intended to "replace" any other, and certainly not to replace "modern" browsers. The intent is to create choice and user experimentation.

It is still possible to access the web with simple programs. It is not gopher or gemini, but it can still work in a relatively simple way. Web developers probably do not like that, but it remains true. The complexity of today's web sites is optional. It is a design choice. Made possible by... "modern" browsers.

Godspeed.


I find it ironic that you bring up Windows as an example, when the amount of data the parent comment mentioned -- ~100MB -- is enough for a full install of Windows 3.11 and Office 4.3... and will yield many times more enjoyment than the front page of Wired.


It is sad that Microsoft would never acknowledge that some users might want to run older software on newer computers. IMO, it is easier to see the performance improvements in new hardware when running older software than it is when running "the latest" software. I would have run 3.11 for many years on newer hardware. However the goal of the company and the message pushed to its software users was/is always "upgrade".^1 Today, it is "update".

1. Over time almost all user choice in "upgrading" has been removed. "Forced upgrades" is a thing.


You are not the market share Microsoft is aiming for. But you are not obliged to run Microsoft Windows either. Linux runs perfectly on old hardware.

42 MB RAM without graphical system

64 MB RAM with graphical system

You may run Windows applications on Wine, or Windows 3.11 in a virtual machine.

The netbook I bought in 2008 was underpowered for Windows XP but was perfect for Linux. I still have it around. With an up-to-date Firefox or Chrome it feels slow, but in console mode it's snappy.

No need even to install, with a LiveUSB. Everything is there; countless people made it possible. Would you use it?


"Linux runs perfectly on old hardware."

I prefer NetBSD. I do not need graphics. I make custom images that boot from USB or the network.

As for Windows, there was a time, in the 32-bit era and before the widespread availability of virtual machines and Windows 3.11 images, when users were compelled to upgrade hardware and Windows versions. It was not made easy for a non-technical user to buy new hardware and use 3.11 if the hardware came with a more recent Windows version pre-installed. Microsoft will not facilitate installing older Windows versions on newer hardware ("metal", not a VM) and may actively discourage it. In contrast, I can easily install any version of NetBSD I want on new hardware. I am not compelled to install the most recent version. There is user choice.

How easy is it today to run Office in a VM on Linux?


Super, Linux was just an example I use.

I am running a rolling-release distribution on my desktop and Ubuntu LTS on my server. My choice of secure installations is limited [1]. It looks similar for NetBSD [2]. Microsoft had no interest in supporting this - better if the customer buys a new version, and support has a significant cost.

VirtualBox solved running Windows in a VM at least ten years ago. Office support in Wine ranges from platinum to garbage [3], [4]; I have not tried it. I can imagine running outdated versions behind a firewall. Running newer versions requires newer hardware. And internet-facing applications should be up to date, so modern browser support is limited - Firefox ESR at best. I run w3m from the console only in an emergency.

Speaking of hardware - Moore's law is dead. I do not think a 2020 notebook differs much from a 2014 notebook, except for a better display and battery.

[1] https://en.wikipedia.org/wiki/Ubuntu_version_history#Version...

[2] https://en.wikipedia.org/wiki/NetBSD

[3] https://appdb.winehq.org/objectManager.php?sClass=applicatio...

[4] https://appdb.winehq.org/objectManager.php?sClass=applicatio...


> Microsoft will not facilitate installing older Windows versions on newer hardware

It usually works though nowadays, unless you go nuts and try to boot Windows XP or something. Are there any processors that flat-out can't run Windows 7 atm?

(Older versions of macOS, on the other hand, absolutely will not run on newer processors.)


If trying to get this to work, is Windows 7 a good choice?

Have you ever successfully imaged Windows 7 from an older laptop and installed it on a new computer?

I only need Office. I do not necessarily need the latest version, so long as documents are XML-based.


I have never imaged OSes -- I'm sure it's a fine practice since lots of people do it, but it feels "unclean". I always do clean installs.

That said, I was able to pretty quickly install Windows 7 on a then-just-released Ryzen 3950X last October. I do remember there being one hitch: I think I had to slipstream in USB 3.0 drivers.


You can run many old operating systems, including esoteric ones, in a virtual machine. Modern computers incur very little overhead for virtualization.


This is not a solution to avoiding the resource consumption of running a "modern" OS on a "modern" computer. Your comment completely misses the point.


I just opened it and it downloaded 9MB with DuckDuckGo Privacy Essentials ON, and 26.3MB without. 589 requests. :-(

A smaller download would be: https://www.gutenberg.org/ebooks/100

Also FIFY: s/Web developers/Suits/


Downloads of this size will also have an extremely large power footprint if you add up all the hops across the Internet.


A lot of sites even keep downloading new content continuously after the page has been fully rendered! They swap out silent media ads one after another, in the hope that you minimized and ignored the page.


Even with adblocking it downloads 30 MB almost instantly, and while typing this reply it's now up to 48 MB. Oh wait, it's at 50 now. It keeps going up.
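
A quick console sketch for watching that counter climb, using the standard Resource Timing API; transferSize reads as 0 for cross-origin responses without Timing-Allow-Origin, so treat the total as a lower bound:

    // Tally bytes as resources keep streaming in after the page has loaded.
    let total = 0;
    new PerformanceObserver((list) => {
      for (const entry of list.getEntries()) total += entry.transferSize || 0;
      console.log((total / 1048576).toFixed(1) + ' MB so far');
    }).observe({ type: 'resource', buffered: true });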

Looking at the requests it seems to be downloading a video from Cloudfront. And yes, in the middle of the page there is a video playing. I'm sure people with metered connections will love that.

That said, with adblocking at least the design looks clean enough. I'm willing to bet that this is what their designers see, and then another team adds all ad overlays on top of the existing design.


My browsing these days looks like

- JS off by default

- Web fonts off by default

- Media elements larger than 100KiB not loaded by default

uBlock origin is the god of internet.

Edit(s): formatting


> Google should be applauded if they make these wasteful developers suffer monetarily until they shape up.

So you want Google to use their dominant position to force webmasters into a new paradigm that probably(?) benefits Google more than today's status quo? And when people start yelling for antitrust provisions, will you still back Google?


Ironic read from Wired. Why Don’t We Just Ban Targeted Advertising? https://www.wired.com/story/why-dont-we-just-ban-targeted-ad...


>Web developers have utterly squandered all efficiency gains of the last 30 years

It's true, whenever a company has pleaded with me to bring a site in with better performance, I have adamantly refused to do so.

When companies say "guess what, Bryan, we are going to focus a sprint on just making sure everything downloads as fast as possible and we get rid of anything getting in the way of the best possible user experience", I have spent that sprint watching quirky animations, and sometimes turned an animation into a base64-encoded gif and put it in a div with a z-index low enough that it would not be seen by people but would still have to be downloaded by the browser!

I do all these things of course, because it was decided by the Secret League of Obnoxious Web Developers (SLOWD) that we should do the utmost in our power to make the web slower for everyone.

OR - it could be that I have in fact repeatedly asked project management at these sites to focus on performance (and accessibility, another thing that always gets ignored) and been told that nobody cares about that stuff.

I guess it's up to you to determine where the fault lies.


I prefer fast, non-bloated websites, but your example does remind me of a tangential one: video game load times. People complain, but they really only complain if the content is not great. If the content is great, no complaints. Case in point: any game by Valve has atrocious load times, up to and including their latest, Half-Life: Alyx. And yet Valve's titles are among the highest-rated games. And so load time is rarely prioritized, because it clearly doesn't matter. What appears to matter is content people want. (T_T)


I think the context of what you're serving matters. In this case the game content is good enough that the performance probably doesn't matter.

The worst example of a refused performance improvement I can think of was in relation to improving a help site. It had generally bad reviews from users (it is very difficult to get good reviews for a help site, because if a user is coming to your help site they are already in a bad mood), but obviously if you are on a site that you are already mad about being on, and it then takes a long time to load all the data so you can try to figure out your problem, you are going to be steaming.

The project manager wouldn't prioritize the three performance-improvement tickets I made, with lots of cogent description of why it needed to be done. Somehow, though, this is my fault.


Looks like you’re saying that developers can’t discuss priorities with management.

In some companies this might be true.

But in many, when a CTO or a senior dev demonstrates business value of a speedy website, it would get attention.

I’m sure project management doesn’t care about keeping packages up to date either. But most devs can successfully communicate that this is necessary and that the alternative is way worse.


Yes, when I was CTO I did a lot to make our website speedy.

When I was consulting I would often do a performance analysis of the sites I consulted at, show how performance improvements could be made, and put in links to relevant studies on performance improvements and their effects on the bottom line, with nice quotes - only to have the task of cleaning everything up put in the backlog and forgotten forever.

>I’m sure project management doesn’t care about keeping packages up to date either.

How many days or weeks does it take you to update a package? It generally takes me a minute; sometimes problems happen and I need to spend some hours, but those cases are infrequent. If a package update is going to take too long, it becomes something for project management to be aware of, and sometimes updating the package is not allowed.

But generally, issues with site performance need handling over a longer period of time than updating a package; I would think that was evident to anyone who has ever updated a package or done a performance analysis of a site. The comparison between something that generally takes a minute and something that takes weeks seems insincere.

But I can give an example where an update that would take a significant amount of time was accepted: the reason was that it would fix certain bugs in the old package, and it needed to be done. Either the package was updated, or the bugs were allowed to remain, or we fixed the bugs ourselves in more time than the update would take. I think it is obvious how this is a different argument than the site-performance one: the package update is an argument that "this way we will fix the problems that you, the business, have pointed out", while the site-performance one is an argument that "this is a problem we want you, the business, to acknowledge".


I once worked for a place where the business unit got us to deploy one of those crappy client-side A/B testing scripts that block rendering. Conversion rates started to dip and the same people came back to us complaining about it. I was able to pull together plenty of evidence to show them what the issue was. All they cared about was whether we could A/B test it using the aforementioned client-side script. Some people just can't be reasoned with; so glad I got out of that place.


I don't think anyone is under the impression that web developers can't make excuses; we have plenty of evidence of that.

I think discussions like this are capable of serving as valuable reference material when we are engaged with project managers on this topic.


I don't think there is a paucity of research showing that higher-performing sites get more users and are generally more valuable for the site owners. The problem is not a lack of reference material; the problem is that nobody (generally) considers the problem that important when presented with that reference material.

Now, the thread is quite big so I have not read everything, but I have skimmed through it all and have not seen anything that would be a really useful argument for getting someone to consider that maybe we should try to improve site performance. A bunch of people complaining about Google, Wired, and a few other things they like to complain about is not as impressive as the hits you get if you just search for "effects of site performance on user retention" (replace user retention with other useful metrics to get sources for the argument being written up).


What if my audience is people in the US on desktop devices? Does it make sense for me to build for data-capped networks, which aren't a thing for 99% of US desktop devices? Does it make sense that I now have to spend time and money optimizing my company's site for an audience that I don't care about, just so we don't get down-ranked?


FYI, for a huge consumer brand we see over 60% of traffic on mobile.


100% understand. I am not suggesting that this is true for everyone, and I agree that this is a good goal for many. My point was that my site targets an audience (corporate employees on their work computers). These policies ask us to spend a lot of time and money building out a site for an audience we don't have.


" Web developers have utterly squandered all efficiency gains"

It depends on how you measure efficiency.

Edit: I should add what I would have thought would be obvious, which is that the bloat is related to advertising, which drives their bottom line, and I'm doubtful there is any existing material alternative to wired.com's already problematic ability to exist. Ergo, the bandwidth is spent optimising for things that developers perceive as inefficient, but other members of the team do not.


Web-designers and web-developers design and build the things that the people with the money ask them to. It's that simple.


You should build what they need - though a lot of the problem is that those specifying websites have very little idea about how the web works and what is needed.

Unfortunately this isn't the '60s at Sterling Cooper, where you can brainstorm an idea like "go to work on an egg" or "a Mars a day helps you work, rest and play".


In my experience, even if you start by "building what they need".

When you present your V1, "they" come back and say something along the lines of "can't you make it look nicer/cooler? You know, add some wow factor!"

What they mean is: "I just took a pile of money from the client for this and you didn't make it look expensive enough"


The people that 'ask them to' are bound by an inexorable set of rules that appear in the system, for the most part. That system is observable and a little bit predictable; it's just more economic than it is technical, so it's worthwhile for devs to take one step out of their zone sometimes to see how that math works. I don't really think anyone at Wired actually wants a slow site or tons of crap ads.



