I'd like to call out the somewhat related problem of website rot. Meaning, the website is still online and once worked perfectly, but it becomes increasingly dysfunctional due to technical deprecations.
The soft obligation to use HTTPS these days has deranked old HTTP-only websites in search, making them hard to find. These websites are also "defaced" with browser warnings, and some of their subresources may not load at all.
Embedded maps no longer work, since Google regularly breaks their API.
Facebook login and other FB plugins no longer work, since they require a yearly checkup of your account, and there's the newer requirement of having a privacy policy.
Those are just some examples of websites partially breaking through no fault of their creators, if you agree that the web should be backwards-compatible.
Hi, I happen to work on Google's Maps JavaScript API, and we actually take great pains to avoid breaking our APIs as much as possible. We take very seriously the implications that breaking changes can have for websites all across the internet, and know that many sites are not under active development.
This page covers the breaking changes from the last few years: https://developers.google.com/maps/deprecations#completed_de...
(Keep in mind that the top portion of the page covers features still in the deprecation period, meaning they still work, and that the page also covers deprecations in other APIs, like Android and iOS.)
My favorite: the deprecation period for v2 of the Maps JS API lasted 11 years after the introduction of the v3 Maps JS API.
Happy to hear about experiences to the contrary, but I thought a little insight might be appreciated.
Indeed, I'm looking at this over a long time frame. From recollection, the main breaking changes I'm aware of are:
1. The change from v2 to v3, like you mentioned.
2. The new requirement to always include an API key.
3. The visual update to touch-sized controls, which especially distorts tiny embedded maps.
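For reference, here's roughly what change #2 looks like in practice today; MY_API_KEY, the element id, and the initMap callback name are placeholders I picked for the sketch, not anything from this thread:

    <!-- Minimal Maps JavaScript API embed as it works today; every request
         now carries an API key via the key= parameter. -->
    <div id="map" style="width: 300px; height: 200px;"></div>
    <script>
      // Called by the Maps JS API once it has finished loading.
      function initMap() {
        new google.maps.Map(document.getElementById("map"), {
          center: { lat: 52.37, lng: 4.90 },
          zoom: 12,
        });
      }
    </script>
    <script async
      src="https://maps.googleapis.com/maps/api/js?key=MY_API_KEY&callback=initMap">
    </script>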
Appreciate the feedback. #2 is actually a pretty interesting one - while clearly a change in the API’s output, we were careful to make it so that maps without API keys actually still work (though they do get a heavy visual indication that they need to get an API key).
That being said, when we are talking about such large timeframes, is it fair to portray that as “regular” breaking changes?
Yes, within the context of the type of sites I was discussing. You already indicated that you know they are not actively maintained. Knowing that, it doesn't really matter if a breakage happens every year, three years, or five years. There's no point at which it becomes "OK", if you believe in the long-term preservation of content.
I'm not picking on maps specifically. It's a compounding effect of many individual parts slowly breaking. I would even agree that you did a reasonable job of maintaining backwards compatibility, but that doesn't matter...broken is broken. Content gone is content gone.
Also, even if those older HTTP sites get a certificate, any embedded scripts that point at http:// URLs will not load and will break the page, even when those URLs are also available over https. That's especially hard to fix if you have user-generated HTML on there.
Same for plain HTTP download links on HTTPS websites: they won't work anymore.
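To make the mixed-content point concrete, a toy sketch (example.com stands in for any third-party host):

    <!-- Page served over https:// -->
    <!-- Active content over plain http:// is blocked outright by modern browsers: -->
    <script src="http://example.com/widget.js"></script>

    <!-- Passive content (images, etc.) is auto-upgraded or flagged, depending on the browser: -->
    <img src="http://example.com/photo.jpg">

    <!-- The fix is trivial in hand-written HTML, but painful in stored user-generated HTML: -->
    <script src="https://example.com/widget.js"></script>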
Anything that used cookies on embedded resources will also be broken because of the missing SameSite attribute (same for third-party cookies, obviously).
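A minimal sketch of what that means for whoever serves the embedded resource (a made-up Node server, not anything from this thread): cookies that used to "just work" in third-party contexts now have to opt in explicitly.

    // Toy Node server for an embedded widget. Browsers now default cookies to
    // SameSite=Lax, so a cookie meant for cross-site embeds must declare
    // SameSite=None and Secure explicitly (and be served over HTTPS in practice).
    const http = require("http");

    http.createServer((req, res) => {
      // Old, attribute-less cookie: silently dropped in third-party contexts today.
      // res.setHeader("Set-Cookie", "widget_session=abc123");

      // What it takes now:
      res.setHeader("Set-Cookie", "widget_session=abc123; Secure; SameSite=None; Path=/");
      res.end("ok");
    }).listen(8080);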
If Google really goes through with removing alert/prompt/etc. from their browser, like they said they were planning to a while back, so, so many things will break.
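If that ever lands, the defensive option for an old site is a wrapper along these lines (my own sketch, nothing official):

    // Fall back gracefully if window.alert is ever removed from this context.
    function notifyUser(message) {
      if (typeof window.alert === "function") {
        window.alert(message);
      } else {
        // No native dialog available: render the message into the page instead.
        const box = document.createElement("div");
        box.textContent = message;
        box.setAttribute("role", "alert");
        document.body.appendChild(box);
      }
    }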
I'm uniquely bothered by this problem as I visit many such dysfunctional sites.
I'm active in the (hobbyist) field of documenting species. There are thousands upon thousands of websites created by amateurs containing unique niche content. For example, somebody might have made it their lifelong hobby to document every species of bee in their territory.
It's a fragmented mess of incompetently produced websites, but I find it incredibly charming and in the spirit of the original web. Above all, it is their content that has lasting value.
The people behind it are good, generous people. That's why it makes me so angry when their work is cast aside like this. It's not just things technically breaking; in many cases sites simply disappear from search results altogether.
But they are the fault of the creator -- first, for including content entirely reliant on the whims and generosity of third parties wholly outside the author's control; and secondly, for putting the content up and not doing the minimum needed to maintain it. Would you also say it is "no fault of the creator" if s/he did not renew the domain registration?
It's a bit like mowing your lawn once and assuming you're entitled to a perfectly manicured lawn forever.
These people are amateurs with near-zero resources generously sharing their content with the world. They may have all kinds of reasons for being unable or unwilling to continuously update their site.
The bigger point is that when you create a website according to web standards, it should keep working indefinitely. As for third-party integrations, especially when they're enormous (Google, Facebook), they should do the maximum possible not to break anything. The very point of an API is that it's a contract.
Some of my favorite websites about cycling were created by now deceased webmasters. Luckily they are mirrored, but I’m very thankful that they used regular old HTML and images and aren’t rotting after a mere 11 years.
No. Static vs. dynamic is a concern regarding how the HTML / CSS gets generated by the backend. HTTPS certificates, embedding of third-party scripts, outdated HTML / JS / CSS, and so on are concerns that live entirely in the browser.
Whether you use an SSG or a CMS, you still need to do upkeep on all of those: make sure your certificates are up to date, make sure you have the latest embed scripts, make sure your site isn't built on top of brittle APIs and frontend frameworks, make sure your site keeps following changing SEO practices, etc.
On another note, if you use a CMS - e.g. WordPress - the availability of your site is directly tied to the operational availability of the CMS and the stack it depends on. So upkeep and maintenance of the CMS become immediate concerns. If you use an SSG that only spits out HTML / CSS, that's far less of a concern: if your SSG breaks, the site itself doesn't go down.
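To illustrate that last point with a toy build script (not a real SSG, just a sketch): the build step depends on Node, but the files it emits are plain HTML that stay servable even if the script itself can never run again.

    // Toy "SSG": the build depends on Node, the output doesn't depend on anything.
    const fs = require("fs");

    const pages = [
      { slug: "index", title: "Bees of my region", body: "<p>Species list goes here.</p>" },
    ];

    for (const page of pages) {
      const html = `<!doctype html>
    <html><head><meta charset="utf-8"><title>${page.title}</title></head>
    <body><h1>${page.title}</h1>${page.body}</body></html>`;
      fs.writeFileSync(`${page.slug}.html`, html);
    }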
Of course, YMMV in the age of hybrid solutions, where a static layout plus a popular JS framework is sent to the client, which then fetches fragments via GraphQL APIs from a backend; that implies a coupling between frontend and backend.
This struck me as a dog whistle, and the immediate follow-on paragraph confirmed that suspicion. HTTPS enforcement is a major migration, yes, but other than IPv4 deprecation (sadly a long way off) it's the only "technical deprecation" of any kind I can think of that websites have had to deal with. Implying these kinds of deprecations are commonplace, rather than two in 30-40 years, is a little odd.
> Google [...] Facebook
Embedded maps & Like/Share buttons are not "technologies", they're product offerings from service-providers. Businesses change their offerings over time, this is common in all industries & not a problem specific to websites / the internet / tech.
If you want to get that technical, sure, why not. I don't understand what point you're trying to make though.
Ultimately, ask yourself this question: if I set out to make a website in 1992 that I hoped would still be around in 2022 after zero maintenance, would that be possible, and if not, why not? The only reason it wouldn't be is that visitors' browsers are starting to enforce HTTPS. There are literally no other blockers. The backward compatibility of the web as an ecosystem is extreme.
Going back to the original point - if I'm building a long lasting website in whatever year Google Maps embeds became available, should I reasonably expect a map embed pulling from someone else's server to work forever?
That's the same for literally any other programming language ecosystem. The discussion here was about the web being distinctly more rot-inclined as an ecosystem. I'm purely arguing that it's absolutely not. Comparatively it's much easier to avoid rot on the web than it is off the web.
> will not be able to build within a year's time
Fwiw, and I know this is a nitpick, but anything built purely on open source libraries should be technically buildable forever. Especially if you keep it local (e.g. cached node_modules for npm)
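In the npm case that roughly looks like this (standard npm commands; the archive name is just an example I made up):

    # Pin the exact tree, then install from the local cache without hitting the registry.
    npm ci --offline

    # Or vendor the dependencies outright so the build survives the registry/network going away:
    tar czf deps-snapshot.tar.gz node_modules package-lock.json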
That's simply not true. I have WordPress sites I haven't touched in years and they all run auto updates.
Sure, I guess it's technically avoiding rot by updating, but the fact that I don't need to touch it makes the point moot.
> Fwiw, and I know this is a nitpick, but anything built purely on open source libraries should be technically buildable forever. Especially if you keep it local (e.g. cached node_modules for npm)
"Technically" isn't very useful when my server needs to update npm/node for security reasons.