I thought AMP was an anti-ad-blocking play. It smells to me like Google is trying to use its dominant position in search to reduce consumer control of the browser; it's a risky road to go down for everyone, including Google.
For me, as long as I disable JavaScript, the modern mobile web is plenty fast enough. That is, the bits of the web I want to use on the move: mostly text, occasional pictures. It takes real effort to break this with JavaScript disabled. Most JS on e.g. news sites, blogs, and aggregators is tracking- and ad-related, dynamically adding links to viral content and all that stuff you don't need, and it dramatically slows down sites through repeated reflow.
Yep, this was also incredibly obvious to me when Google introduced it. It's such a typical Google move to market something as "better for the web" when what they really mean is "better for our pockets."
Fortunately for anyone with Apple phones, content blockers should still work on AMP pages.
> It's such a typical Google move to market something as "better for the web" when what they really mean is "better for our pockets."
It might be better for their bottom line (I haven't thought about it much, but it makes sense that it would be, so I'll just go with that), but that doesn't mean it isn't also better for the web and users. Same thing with Facebook Instant Articles. I now find myself, as a user, vastly preferring these sites on mobile, because they are far less likely to hang or hijack my browser.
I've seen this assumption made just about every time AMP is mentioned: that it's somehow about preventing adblock. AMP requires that all your ads are specifically tagged as ads, and that no JavaScript modifies the page after it has loaded. Far from making ad blocking more difficult, it seems to go out of its way to make ad blocking easier to do and harder for publishers to prevent.
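If memory serves, an ad in an AMP page has to be declared with the amp-ad component, which makes it trivially machine-identifiable; something like this (the network type and slot value here are placeholders):

    <!-- every ad must go through <amp-ad>; arbitrary ad scripts are not allowed -->
    <amp-ad width="300" height="250"
            type="doubleclick"
            data-slot="/1234/example-slot">
    </amp-ad>

A content blocker could hide every one of these with a single element rule, which is rather the opposite of an anti-adblock design.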
No, but I can imagine a future where ad blockers start blocking all non-essential JS. It would likely be more conservative than someone who uses Policeman or uMatrix, but I have no doubt users would appreciate the speed increase -- especially on mobile.
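Even today you can approximate that with static filters; something like these uBlock Origin rules (standard ABP-style filter syntax; the domain is a placeholder):

    ! block third-party scripts from a known tracking host
    ||tracker.example.com^$script,third-party
    ! hide AMP's explicitly tagged ad slots everywhere
    ##amp-ad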
How about we fix the root problem instead? I know that's out of Google's control, but instead of adding and pushing more JS-heavy sites, they should be pushing lighter sites. Let's work with the tools we already have, rather than adding new ones.
AMP is a strategy to take control away from the publishers, to prevent them from cluttering up the site with obtrusive ads and other heavy clutter around the actual content. The carrot that Google is offering the publishers in order to get them to relinquish this control is better placement in search results.
The best way to think of AMP is as a list of things you're no longer allowed to do when you make a website. It's just HTML with rules.
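If I remember the skeleton correctly, a minimal valid AMP page looks roughly like this (the required boilerplate style block is long and must be copied verbatim from the spec, so it's abbreviated here):

    <!doctype html>
    <html amp> <!-- or the ⚡ attribute -->
    <head>
      <meta charset="utf-8">
      <link rel="canonical" href="https://example.com/article.html">
      <meta name="viewport" content="width=device-width,minimum-scale=1">
      <style amp-boilerplate>/* required verbatim boilerplate, abbreviated */</style>
      <noscript><style amp-boilerplate>/* required no-JS fallback */</style></noscript>
      <script async src="https://cdn.ampproject.org/v0.js"></script>
    </head>
    <body>
      <h1>Hello, AMP</h1>
      <!-- a plain <img> is banned; amp-img must declare its dimensions up front -->
      <amp-img src="photo.jpg" width="600" height="400" layout="responsive"></amp-img>
    </body>
    </html>

No author JavaScript, no external stylesheets, inline CSS capped at 50kb: that's most of the "rules" part.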
A lot of the crap isn't loaded by the publisher's site directly but by third-party tracking/analytic/ad code. Many publishers would love to have faster loading sites but can't give up those third-party functions easily. Creating an ecosystem like AMP allows Google to also arm-twist the third-parties to clean up their act, if they want to be supported in AMP.
They do prioritize faster sites over slower sites, and they announced it at least a year ago, but I think we can all agree that despite Google's announcement, the mobile web is still pretty slow. (Like, the average page can take 8s to load.)
Google noticed that the mobile web is still pretty slow, and they also noticed Facebook's Instant Articles initiative. Under Instant Articles, news sites provide a special RSS feed with limited HTML features to Facebook, who caches the results and shares ad revenue with the news site. As a result, viewing participating news sites in Facebook is faster than viewing them in a web browser, e.g. in Google search results.
Google saw this as a strategic threat, and said: "How can we do what Facebook's doing, making news sites provide fast lightweight HTML to Google and our users, but in a more Open way? At least somewhat more open?" AMP was the result.
When you use AMP on your site, a third party (in principle anybody, but in fact just Google for the moment) can easily verify that the site will load quickly, by design. They don't have to launch a browser and run a speed-index test; they know, for sure, that the page is going to be pretty fast.
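That check is mechanical. If the tooling is as I remember it, you can run the official validator against a page yourself (package name from memory; appending #development=1 to an AMP URL in Chrome also logs validation errors to the console):

    npm install -g amphtml-validator
    amphtml-validator article.amp.html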
A third party also knows that all AMP pages allow third parties to cache them. (Again, only Google is doing this today, but it could be anybody, e.g. Facebook could present AMP pages from a fast Facebook cache, too.)
Google takes no ad revenue from AMP pages, and supposedly doesn't rank AMP pages higher (except by virtue of the fact that AMP pages tend to be faster, and faster pages rank better). AMP pages do also get the lightning-bolt badge in search results, which might encourage users to click on them. And AMP has a GitHub repo and accepts actual PRs, so AMP is open in that sense, too, except that it's officially governed by Google, so it's not really very open in governance.
Folks here seem to be pretty skeptical of AMP; I think it's a mixed bag. One positive outcome of AMP would be if its HTML components eventually became official elements in a future HTML spec, built into browsers. Then AMP would be nothing more than a set of guidelines and a validator; that sounds pretty good to me.
I guess I'm still not seeing any advantage to using AMP, except as the obvious power-grab. I don't see anything in your statement that couldn't be done by just punishing sites for being bloated and slow, and stating why. Adding another layer onto something to combat complexity is two steps back.
I hate heavy sites probably more than the next guy, but pushing your own standard is the wrong way to go about it, IMO.
From what I can see, AMP is really less about "here's some cool new tech", and more "here's a buzzword and some conventions" to encourage sites to write lighter-weight, better designed web pages (or versions of web pages).
It seems like a sugar-coating over the "use modern HTML and make less bloated pages" pill.
> I don't see anything in your statement that couldn't be done by just punishing sites for being bloated and slow, and stating why.
They tried that; they're still doing it. But it didn't work, and at this point I think it's naive to think that the mobile web will get much faster purely because Google ranks fast sites higher.
Maybe their failure is not prioritizing it enough in the algorithm. If I look for weather, there are tons of shitty, JS-heavy sites. Yet http://myfreeweather.com/ (which still has some JS, but not a ton) is nowhere to be found.
The first few pages of sites that google offers me are all terrible choices.
I guess it's easier to measure the amount of crap when you've defined a list of no-nos and yeses, and on the other side it's also easier for publishers to understand.
Because Google will treat your content preferentially if you do. To quote an earlier post of mine from a different thread [1]:
AMP can impose constraints because it's a corporate effort to re-frame content in a way that provides centralized ad serving, tracking, and the like. It gives the content publishers the same tools (ads, tracking) they have to provide themselves with on the open web, while simultaneously benefiting those like Google who will put their own viewport and context around that content. It's not that much different from when an individual wants to publish on Medium or Tumblr, so the platform provides the author with a box to put text in, a dashboard, view tracking, and stuff the author wants, while they get to host interesting content with which to attract an audience for ads. It's a win-win, symbiotic relationship, while on the self-hosted web, every publisher is on their own.
This isn't about HTML vs AMP at all. It's about business models, which are given in AMP, but left as an exercise for the publisher on the open web.
It's more of a mutually-consented handholding with small amounts of arm-twisting. Your content needs to make money for you to stay afloat -- if it doesn't, you're not in the target market for AMP -- so Google gives you some tools so that they can surface your content, and you get their ad and analytics infrastructure and their preferential treatment; the premise is that both Google and the publisher win more than they lose.
But this is clearly only a plus for Google, won by abusing the relationship with the publisher. The logic is circular here: if Google wants people to make better sites, why don't they just rank sites higher when they aren't full of bloat?
If I read the above posts correctly, it sounds like it's not just Google that benefits; supposedly Google provides ads and such, which provides the publisher with revenue -- the publisher would otherwise have to find ad networks and such themselves. If a publisher already has advertising revenue sorted out, and they're OK with the expense in terms of time spent managing that, then there's little incentive and so they pass.
How much of that is correct? I don't know. I'm just explaining what (I think) the posts above were trying to say.
For some, it helps keep your own content lightweight: when your ad sales team tries to get you to make some change you know is bad (or, if you don't know, AMP tells you), you have a very explicit "we can't do that because it violates AMP guidelines" response.
I'm pretty sure that value mostly applies to the Forbeses and TechCrunches that have really started suffering from page weight and multiple re-layouts after load, not to the individual blogs of Hacker News readers.
That JavaScript enables privacy and security exploits is neither a position nor an opinion but the truth.
> please acknowledge there are a lot of tech-savvy people who do want a javascript-enabled web.
Sure there are; I have no problem acknowledging it. They, apparently, have a problem acknowledging that they do not value privacy and security as much as they value novelty.
I could just as easily say you don't value your privacy and security as much as novelty since you are going to arbitrary websites.
People find zero-days in image renderers; how do you justify not disabling image rendering? Your user agent and your browser's supported TLS configs leak who you are; how do you justify sending that info to every random web server?
Even just being on Hacker News right now proves you are accepting a negative security impact for some novelty value.
> People find zero days in image renderers, how do you justify not disabling image rendering?
Rather more rarely than they do in JavaScript. But that's why lynx, links, elinks, w3m, emacs-w3m, eww & friends are so important!
But yes, if one wishes to render an image, then one must render an image. But why would one wish to execute JavaScript when one only wishes to read text? I've no objection to executing JavaScript when it's required for an app (although I do object to apps which could be more cleanly delivered as pages).
> Your user agent and your browsers supported TLS configs are leaking who you are, how do you justify sending that info to every random web server?
Because it's a requirement to use TLS.
JavaScript is not a requirement to read articles or listicles (which are the vast majority of the pages targeted by AMP); people who demand JavaScript in order to display text and images are breaking the Web, and endangering their users' security and privacy.
I really am curious what the folks who are so eagerly downvoting me are thinking. Are they thinking (i.e., do they have persuasive counterarguments), or are they just feeling (i.e., are they reacting emotionally, without a rational basis)? I genuinely wonder what possible objection they can have to 'that JavaScript enables privacy and security exploits is neither a position nor an opinion but the truth'; AFAICT it's as objectionable as pointing out that the sky is often blue or that fire is hot.
> They, apparently, have a problem acknowledging that they do not value privacy and security as much as they value novelty.
Belittling the contribution of scripting to the web as mere 'novelty' is rather disingenuous. I could list other benefits but I suspect you're already aware of them and discount them because they don't apply to you.
> Belittling the contribution of scripting to the web as mere 'novelty' is rather disingenuous. I could list other benefits but I suspect you're already aware of them and discount them because they don't apply to you.
I don't believe that scripting does contribute to the web (i.e., the interlinked web of hypertext documents we all use every day), or at least not enough to be worth the cost. The web is about documents, and documents are eminently readable without scripting (ever since writing was invented and displaced oral tradition …).
The cost does apply to me. Every page which requires me to enable JavaScript (and thus forfeit the security of the computers I do my banking and work on, and forfeit more privacy than that necessary to request a document) costs me. Every page which displays nought but a white page costs me.
I have — as I've noted — no objection to web apps qua web apps. Some of them are quite cool, and some are even useful. It's definitely nice to be able to use Linux and run programs written by people who have never used it. I do wish that browsers implemented a better language than JavaScript (which is an embarrassment to our profession) to that end, but what really gets me is the needless proliferation of apps which are really just document readers. I already have a document reader: it's my web browser.
I remember what the web was like when it was just a bunch of folks writing about things they liked and linking to one another. That was a pretty awesome web. I hate that it has been drowned out by folks who think that in order to read their documents I should give them execute privileges on my workstation.
This is nonsense. I think there are issues around AMP and search placement, anointing one specific solution to the problem, but in the end they're just web pages. You might as well say React is an anti-ad-blocking play. It makes no sense.
Any reason you couldn't still use an ad-blocking extension or proxy for AMP pages? If anything it should make it even easier, since there are standardized tags for all ads...
I haven't examined the technical details very deeply, but from what I understand it's the content producer that has to use specific tags. What Google does to produce the page served to the end user is then up to them. By intermediating the experience, they have a lot of control over the output, and they can integrate it under the rationale of making everything faster. (And integration is exactly what would make it faster, make no mistake: all the ads, third-party DOM elements, trackers, etc. being dynamically constructed rather than integrated is what makes modern news sites especially slow.)
In extremis, they could do what Opera Mini used to do, and supply a pre-rendered image to the end user. I don't think they will (it wouldn't be very trackable, for one), but it's possible.
This is exactly what I came to say. I just keep JS disabled on my mobile; most things I want to read work, and it reduces load times by an order of magnitude. If I really want to read something that won't work without JS, then I'll turn it back on for that site, but I rarely if ever have to do that.
> it's a risky road to go down for everyone, including Google
What's risky about it for Google? They can back down at any time they want (this includes injecting ads directly in AMP pages, precluding competitors from doing the same because they control Chrome).
They have shown again and again that they don't give a shit about lost investments for web developers.
Large companies attract a particular type of executive: they see what a company's current strength is, and try to leverage that strength into advantages in other areas by sheer force, rather than competing and producing better products or services. It's perceived by a lot of people as ramming a product or service down people's throats, it creates resentment, and at the margin - where the leveraged strength approaches monopoly - it's illegal.
I'm of the opinion that infestation of upper ranks of companies by these kinds of people is what ultimately leads to their demise (or rather, regression to the mean of mediocrity). These people use up goodwill and generate resentment. When they're given too much leeway, they can push hard enough to attract government anti-trust attention and tie the company up in knots for years. They suck the soul out of companies by promoting a nihilistic vision of capitalism. They encourage attributing cynical motivations about any future action of the company (think Microsoft); it can take decades to shake that off.
So, the risks are both reputational and potentially regulatory, and they're cumulative. Yes, Google has cost goodwill in the past; but they don't have zero goodwill yet, and nor do they have infinite goodwill.
Working at a media conglomerate I probably have a different take than some - I love AMP, compared to the alternatives. For personal sites I don't see it as a big deal, but for news sites it's amazing.
The devs working on news sites understand the current state of bloated pages is bad and are eager to create fast, sexy pages. Unfortunately there are entities such as BizDev and AdOps that want to add more stuff to the pages to make more money, and it's easier to quantify ROI for a new ad placement compared to shaving 100ms off page load times. So we end up with dozens of scripts and script loaders and all sorts of other things. Editors and product managers want responsive pages with ad placements and enhanced functionality.
Along comes AMP, and all of the above are freaking out at the loss of control while I'm gleefully looking at super fast loading times and pages that still have most of the important functionality and ads.
Yes, the script size is large, but it's cached across all AMP pages, and it's refreshing not to have to think about implementing responsive pages, lazy loading, etc., because AMP handles all that.
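For example (file names are placeholders), a responsive, lazy-loaded image is just a declaration; the runtime picks the source and defers the load:

    <amp-img src="hero-1080.jpg"
             srcset="hero-480.jpg 480w, hero-1080.jpg 1080w"
             width="1080" height="720"
             layout="responsive"
             alt="Story hero image"></amp-img>

And because the dimensions are declared up front, the page never reflows when the image arrives.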
I have a different media take. Any attempt at making something other than a text article with a photo slideshow attached now carries an SEO penalty. I think that's a shame - not that every article should be some crazy one off creation, but in an industry that desperately needs to innovate and provide value to readers, closing off that many doors strikes me as dangerous.
AMP would be commendable if it had come about as an open standard and won in the marketplace.
But it's come out controlled by the major search engine, which will promote and highlight pages in its search rankings if you adhere to their new "open" standard.
It may still be good, but leaves a bad taste in my mouth.
Has there been any work at all done on staking out workable subsets of HTML? I looked around a fair amount a while ago and the closest thing I could find was AMP.
I think this is an important issue because the HTML spec is crazy long, and it would be awesome for new technologies (say, a new desktop GUI library or whatever) to be able to say "we allow and will interpret this subset of HTML" without committing to the entire byzantine spec.
I don't think HTML itself is the problem; what sucks up all those CPU cycles is JavaScript and CSS. So remove (almost) all of JavaScript and big parts of CSS and we're good to go.
We need a name for this and since it's HTML5 with stuff removed we can call it HTML5--, so HTML4 ;)
JS and CSS are obviously super messy, but that's a whole different issue.
I'm thinking more of cases like letting people write some HTML in a local note-taking program or something like that -- what's a nice subset of tags you should give them so they can make text look good, but that doesn't commit you to supporting the Lovecraftian being that is the entire HTML spec?
Also your last line made me laugh. Nice delivery=)
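To make my question concrete, I'd imagine the answer has the shape of a sanitizer allowlist, e.g. with DOMPurify (the tag list is just my guess at a sane display-only subset):

    import DOMPurify from 'dompurify';

    // untrusted input from the note editor
    const userHtml = '<p onclick="steal()">hello <script>evil()</script></p>';

    const clean = DOMPurify.sanitize(userHtml, {
      // structure and inline emphasis only; no scripts, no forms, no embeds
      ALLOWED_TAGS: ['p', 'a', 'em', 'strong', 'ul', 'ol', 'li',
                     'h1', 'h2', 'h3', 'blockquote', 'pre', 'code', 'img'],
      ALLOWED_ATTR: ['href', 'src', 'alt', 'title'],
    });
    // -> '<p>hello </p>' (the handler and script are stripped)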
I mean, AMP is a bunch of custom elements and instructions not to use certain features that degrade performance. I'm not sure what part of this you want to be an "open standard", can you be more specific? AMP is more of a best practices guide than anything else.
I mean google will show a lightning icon next to pages that follow the AMP standard (and plausibly use it as a positive signal in their ranking algorithm). A competing standard would not have these benefits.
These carrots drive ad dollars, hence my distaste for calling this an "open standard".
As an example, browser vendors and CAs got together and agreed that solid HTTPS websites get a green indicator in the URL bar, and that banks etc. get a more prominent green treatment for a special class of (EV) certificates. Those are open standards that everyone agreed on.
I hope this makes it more clear why AMP is not an open standard.
The unfortunate thing about AMP from my point of view as a browser developer is that it disincentivizes me from making performance improvements. What's the point in making something faster in a competing browser if Google is preventing developers from using that feature until it's fast in Blink?
Having a new gatekeeper, in addition to the standards bodies, that all new features have to get past holds back the Web.
Google isn't a new gatekeeper; there's tons of documentation on their domain already instructing developers to use or avoid certain features for the sake of search ranking. Look at PageSpeed, for example.
It would be great if we didn't have to worry about performance, I hope WebRender gets us there some day, please do continue your work and hopefully the fact that some subset of news sites aren't using features doesn't hold you back.
In the meantime I'm going to avoid using with(), use requestAnimationFrame to prevent DOM thrashing, put my scripts at the end of the body, and do all of the things that maybe some day I won't have to do.
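For anyone unfamiliar with the requestAnimationFrame trick, it's basically separating DOM reads from DOM writes so you don't force a synchronous reflow on every iteration; a sketch (the selector is made up):

    // Bad: alternating a read (offsetWidth) with a write (style.width)
    // forces a layout recalculation on every loop iteration.
    // Better: do all the reads first, then batch the writes in one frame.
    const rows = document.querySelectorAll('.row');
    const widths = Array.from(rows, el => el.parentNode.offsetWidth); // reads
    requestAnimationFrame(() => {
      rows.forEach((el, i) => { el.style.width = widths[i] + 'px'; }); // writes
    });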
A good place to start would be admitting just one person to AMP's governance board who is not on Google's payroll. No single for-profit organization should control something as fundamental as the Web's markup.
I think they wanted to move quickly to offer a substitute to Facebook Instant Articles without waiting for other players in the market (Bing? Mobile browsers?) to adopt it in their own time…
And tiny phones as a status symbol. And making sure you are the same network as your mates to call them for free after 6pm and at the weekends. Those were the days.
Has anyone else noticed how Google is using AMP to provide content for searches without you ever leaving google.com? Here's an example of a quick search I did for "Sausage Party review" [0]. You can quickly flip through AMP articles by comicbook.com, USAToday, and others while clearly staying within Google.
It seems to me like they're trying this a lot lately; Structured Data is the other example that comes to mind. Creating "web standards" that are supposed to help a website but are really most helpful to Google and their crawler. Then giving an SEO boost to anyone who uses those standards. And then, once enough people are using those standards, just displaying the relevant parts of the results directly on google.com.
I get that it being hosted on Google allows the servers to respond more quickly, but do you think news websites that added these features with the promise of better performance and an SEO boost were aware their articles would be one thumb-swipe away from a competitor's articles?
It frankly wouldn't surprise me if they made a move to "suggest" publishers allow their articles to be displayed in full in the SERPs/Google News while giving them a cut of related AdSense revenue, a la FB Instant Articles. Google would fully own the experience, could curate the feed to maximize earnings, provide a significantly less bloated experience, and gain even more of an advantage in the SSP space to strengthen the moat for DoubleClick.
I'm not convinced by AMP. I've written before on HN about my own webpage[0], where the average page weighs in at around ~10kb. At one point I was working on integrating AMP, and some of AMP's constraints have made it into the site. Ultimately, though, the 170kb weight of AMP (17x my average page) is a no-go. I think imposing the same constraints AMP imposes on you, without actually using their JS, is a good idea if you aren't after the special treatment from Google. Another downside of AMP is that it displays a completely blank page if the user doesn't have JS enabled or blocks third-party JS.
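The blank page isn't an accident, as far as I can tell: the mandatory boilerplate hides the body until the AMP runtime loads, roughly like this (abbreviated from memory):

    <style amp-boilerplate>
      /* keep the page invisible for up to 8s, until the runtime cancels it */
      body{animation:-amp-start 8s steps(1,end) 0s 1 normal both}
      @keyframes -amp-start{from{visibility:hidden}to{visibility:visible}}
    </style>
    <noscript>
      <style amp-boilerplate>body{animation:none}</style>
    </noscript>

So a fully JS-disabled browser should eventually see the page via the <noscript> fallback, but merely blocking the third-party cdn.ampproject.org script leaves you staring at nothing for eight seconds.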
The AMP GitHub page claims that "AMP HTML is a way to build web pages for static content". It's ironic that we now need a 170kb JavaScript file in order to generate static HTML web pages.
When Sourcegraph's Checkup (https://news.ycombinator.com/item?id=12240380) hit HN, I discovered that the meaning of "static content" has changed from "raw HTML with no backend and no frontend rendering" to "HTML + JavaScript with no backend but possibly some frontend rendering".
It can be even looser than that; it can just mean any server-side call that is inherently cacheable, meaning the same request will always return the same data.
"Static" meaning "Not changing over time" rather than "Dynamic" meaning "changing over time". That usage is not common outside of math, but I have on occasion seen it used this way when discussing web programming.
If your pages are 10kb, then I think you aren't the target audience for AMP. They say themselves that AMP pages will never be the most lightweight of all pages, and never faster than perfectly hand-tuned pages -- just way, way better than the current median website that loads MBs from 75 domains and re-lays-out 50 times before it's done loading.
You have a good point, although I'd like to put forth the argument that many websites could just as well focus on reducing their page sizes and external resources instead of, or before, adopting AMP.
As an outsider (not remotely in any position to use AMP or not), it looks like one of the positive aspects of AMP is that it's a utility, with some associated carrot and stick, to get publishers to realize those things really are bad and need to be taken seriously.
I loaded this on a 4g connection and it took 14 seconds from click to progress bar disappearing. AMP talks about sub second load times on mobile. What might be in the way of that being realised?
Thank you for this. Google validates pages against the AMP rules; if a page doesn't validate, they won't cache it. Error on my part (misuse of their amp-analytics tag).
I just opened the site in Chrome and did a hard refresh on the Good 2G setting of the Chrome dev tools; it only took 3 seconds. Maybe your cell phone dropped the connection or something.
I don't think it's AMP; I think it's my fault. Out of the box, you can't place analytics on the page -- there is a special tag for any analytics. It seems I'm getting an error, and so Google has not cached it. Thank you for telling me; I need to debug that.
That being said, I just received 239 ms with Pingdom.[1]
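For reference, in case anyone else hits the same error: analytics has to go through the amp-analytics extension, declared as markup plus a JSON config, roughly like this (the account ID is a placeholder):

    <!-- the extension script goes in <head> -->
    <script async custom-element="amp-analytics"
            src="https://cdn.ampproject.org/v0/amp-analytics-0.1.js"></script>

    <amp-analytics type="googleanalytics">
      <script type="application/json">
      {
        "vars": { "account": "UA-XXXXX-Y" },
        "triggers": {
          "trackPageview": { "on": "visible", "request": "pageview" }
        }
      }
      </script>
    </amp-analytics>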
Yes. There are limits to what you can place in an AMP site. Basically, it's all static assets, so the entire AMP version is supposed to be cached by Google.
> If an AMP page is valid and is requested (so the Google AMP Cache is aware of it), it will get cached. Any resources in AMP pages, including AMP images, also get cached.
> Can I stop content from being cached?
> No. By using the AMP format, content producers are making the content in AMP files available to be cached by third parties. For example, Google products use the Google AMP Cache to serve AMP content as fast as possible.
> Can the cache be crawled?
> No. The Google AMP Cache is roboted to crawlers. We recommend that search engines process cache links according to the guidelines for crawling Google AMP Cache URLs. For more information, see robots.txt.
I still don't get AMP. It's a different markup. Am I supposed to detect mobile devices (which is a massive pain) and serve them this separate HTML/JS/CSS? Doesn't that defeat the purpose of the responsive design that we've been fighting for? And why is it for mobile only? If it's so fast, why not make it general purpose?
Google are very good at pumping out "standards" that fill short-term commercial goals while in the meantime making a fragmented pig's breakfast of the rest of the web. Aside from HTML5 (which I'm coming to accept was justifiably hostile to the W3C process), I have yet to see anything from QUIC, SPDY, WebSockets, and now AMP that demonstrates any kind of long-term thinking.
WebSockets for a long time could not run over a SPDY/HTTP 2 connection (is that still true?). SPDY itself had multiple fundamentally incompatible versions in the wild before things settled down (they changed how negotiation was implemented), WebSockets and SPDY duplicate many of the same concepts (yes, really), QUIC seems utterly ignorant of the early history of the Internet and the importance of flow control, and now AMP.
AMP, it seems to me, is a standard designed around the current-era shortcomings of the implementation of Google Chrome (and similar browsers). It's hard to get more myopic than that.
> Am I supposed to detect mobile devices (which is a massive pain) and serve them this separate HTML/JS/CSS?
No. It's the responsibility of search engines and apps to choose whether to send visitors to AMP pages. If Google crawls your site and finds AMP pages, it'll send mobile users to them from search.
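The linkage is just a pair of <link> tags, if I recall the discovery mechanism correctly (URLs are placeholders):

    <!-- on the regular page -->
    <link rel="amphtml" href="https://example.com/article.amp.html">
    <!-- on the AMP page, pointing back -->
    <link rel="canonical" href="https://example.com/article.html">

No user-agent sniffing on your end; the crawler follows rel=amphtml and decides what to serve to whom.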
> the bad one, in a suit and wingtips, that jets from office to office and shuts down Google Reader
Ouch, that hit a nerve. Too early in the day, man.
I've seen more AMP pages on the web than I have the Facebook version of it, for what it's worth. The article went into detail about how the pages were designated as AMP; it seems somewhat elegant. It's opt-in, but there's definitely an advantage to adding the markup to your page.
"The Google AMP Cache is a proxy-based content delivery network for delivering all valid AMP documents. It fetches AMP HTML pages, caches them, and improves page performance automatically. When using the Google AMP Cache, the document, all JS files and all images load from the same origin that is using HTTP 2.0 for maximum efficiency."
So the proxy learns all about my traffic patterns, usage stats and userbase.
It looks like you can't opt out of this feature, or at least I couldn't find a way to.
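For what it's worth, the cached copies live on Google's own origin; if I remember the URL scheme correctly, it looks something like:

    # your page
    https://example.com/article.amp.html
    # the same page served from the Google AMP Cache ("/c/s/" marks an HTTPS document)
    https://cdn.ampproject.org/c/s/example.com/article.amp.html

Every hit on the second URL is a request Google sees and your server doesn't (unless they forward logs, which I haven't seen offered).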
I like this write-up, but it raised a question for me: if Google is serving AMP pages directly, how quickly will updates to your page get reflected in Google's cache?
Harkening back to "only two hard problems in programming" and all that.
"We're currently working on a programmatic solution for invalidation. We respect max-age headers for deciding when to refetch a page, but will keep serving the last known version until we have done so."
This is the big no-go for me here: why would I need an AMP library hosted on a third-party domain I have no control over whatsoever? They can inject what they like, track the initial script load, etc.
The same goes if you want to embed further content such as tweets, etc.
What do other folks think about these requirements?