SRI doesn't really protect against man-in-the-middle attacks. It protects against somebody with access to manipulate the data on the CDN, or against a malicious CDN.
If an attacker is in a position to be man-in-the-middle and can already get around HTTPS, they might as well compromise the page loading those resources.
If you serve your start page yourself, or via a different CDN, then SRI would protect against malicious code from other CDNs such as Pika.
If, say, you serve example.com from one source and link JS via pika.cdn, then if Pika is compromised, your site would be too. With SRI, the example.com infrastructure itself would have to be subverted.
All things being equal, MITM-ing SSL or subverting a well-run CDN is hard, but if you double the number of CDNs in play, you make compromise of one of them twice as likely/easy. With SRI, you get to keep your single point of compromise, but can still leverage the benefits of a bespoke CDN for your JS. If any.
Not sure if it can be considered a MITM attack, but without subresource integrity, developers have to blindly trust the Pika CDN to keep hosting the same script file at that URL.
SRI might be impossible to implement in this case, not only because of the differential-serving feature but also because, judging from their examples, developers are meant to link to major versions of projects, which means the content at a given URL will change.
This is where a reliable IPFS-like CDN would shine.
- Strip SSL, for instance by blocking port 443 and hoping clients fall back to HTTP.
- Get your own root certificate installed on the equipment of the user you are attacking. This is fairly common in corporate environments for instance.
- MD5 collision attacks (although almost every certificate is SHA-signed these days)
My company would buy into this if there were some kind of boilerplate contract you sold to businesses/institutions, vs. the Patreon page you have right now.
Just FYI, you're missing out on some dollars because of that. For better or worse, the bean counters at my workplace won't approve anything less. I have a feeling I'm not alone.
If you can quickly whip up some boilerplate business checkout with an invoice, you'd make more than a few dollars today.
Forget the bean counters, the technical people shouldn't approve of this without some sort of SLA or business agreement. How would you feel comfortable depending on a 3rd party service you have no actual business relationship or SLA guarantees with?
They're seriously leaving money on the table; businesses would have no problem dropping 50k/yr on a service like this.
This is something we've been looking for at my job, but we don't have the technical expertise to do it ourselves (a CDN is tricky business, not in our core competency). If you offered an SLA that included support and/or customization, as well as a direct line to high-level support for feedback, you could easily net 50K or more.
Seriously. I know my organization would be willing to pay even more than that. If you're reading this Pika founders, you should really give this some thought.
This is fantastic - exactly what I've been looking for but didn't know I was looking for it! I've been re-importing libraries by doing something weird like: `import './three.js'; export default window.Three;` so I can use it as a normal module.
I love not having to use build tools for my personal projects anymore - everything feels so light and "old school". Here's my Minecraft-ish clone in native modules and WebGL2: https://github.com/mrspeaker/webgl2-voxels. No dot files, nothin' to build... just view source!
This is a very good development for frontend development. Build systems like webpack were useful technology in earlier days, but today they present a big hurdle for newer, less experienced developers entering the frontend space. I would love to see a future where we can again serve a frontend in development just by running a web server from a folder.
I do wonder how modular CSS fits into the picture of ES modules, though.
I have been coding frontend, backend, and other system things since at least 2007, and the state of web development scares me. I shouldn't need a build tool to get JS on a website; I should be able to just run JS on a website. The web went from extremely simple and usable to "oh my, what the hell is this?" It gets worse when you code frontend on proprietary systems that are hard to extend. Ever had to override something in Bootstrap with your own CSS files?
You don't need to use these tools if you don't find it worth it. If you want to write the same sort of JS as we wrote in the old days you can still do that. If you want to write modern JS, you'll either need to restrict your audience to the most recent browsers, or you will need tooling to help you.
But it's perfectly possible to write a modern PWA using modern js, modules, and a library like react (loaded from a cdn) without any build step at all. I've done it, and it's not all that hard as long as you don't mind only targeting the latest browsers (and writing out a bunch of file names manually).
Honestly, I'm looking forward to WebAssembly maturing; just seeing what Microsoft has done with Blazor[0] is impressive, and that's just scratching the surface.
I think the reason it is like that is because there were many problems with JS "back in the day", and so people felt they had to come up with solutions.
Just look at Svelte. It's a library about compiling JS to get back to the "just JS" days. I mean, it's more than that, but it's about reducing runtime complexity. So another way to think about it is that writing "old style simple JS" is so convoluted that the author felt the need to write a translation layer from modern frameworks to old style JS. Sort of a mindblower to me haha (though, I love and agree with Svelte, to be clear).
The great thing though is that you can still use plain old JS, right? Nothing has changed for you if you don't want it. So is there really a problem?
This "modern web" stuff is just people solving problems. Some of these problems are the fault of old JS/web. Some of them are problems of our own making. Remember how amazing modern UI frameworks were? It's because we had PTSD from horrible jQuery codebases. A problem of our own making.
So stick with what you like, and other people can use the more complex stuff. It's a win win, no?
I did classic server-side rendered apps, but experimented with pure JS apps early on, when the only way to get persistence was to store things in cookies. The "app" was just an HTML file on the desktop that you double-clicked to open in the browser. Then came AJAX, and during the last fifteen years more and more pieces have fallen into place to make web apps viable, like service workers, local storage, add to desktop, etc. But JS apps are much harder to develop than server-rendered apps, because you have to manage state, while server-rendered apps are just a snapshot of the database.
I don't know your situation, but ISTM you want to override Bootstrap with sass, not css? Also I wouldn't really call npm a "build tool". If you need a build tool, use parcel.
I haven't used Bootstrap in a while, but all I was saying above is that it rebuilds itself with a simple "npm run dist", which under the covers is a call to node-sass. That compiles Sass files, but it's still not what I consider a build tool. (Attempting to override Bootstrap with a CSS file is DOING IT WRONG.) For build tools, people used to use Grunt and Gulp. Now they use webpack. Parcel is better than any of those, because it does the right thing automatically. That could include calling node-sass, if you want.
> Parcel is better than any of those, because it does the right thing automatically.
Parcel does seem nice, because it does a lot of things automatically. But I'm not sure that can always be "the right thing". For example, it seems a bit surprising that plain JS will always be compiled to support IE11, while TypeScript will not.
Neither option is "always the right thing".
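For what it's worth, Parcel decides its transpile targets from a browserslist field, so the IE11 default for plain JS is at least configurable. A minimal sketch, assuming it goes in the project's package.json:

```json
{
  "browserslist": [
    "last 2 Chrome versions",
    "last 2 Firefox versions"
  ]
}
```

With only evergreen browsers listed, Parcel's babel pass has much less to do, which also speeds up builds.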
And as far as I could figure out, there's no easy way to target deploying to separate cdns?
Not trying to move the goalposts here; it's just that some configuration is to be expected in the complex reality of the modern web stack.
And a benefit of npm is that it'll pretty much always be part of the stack anyway.
Did come across this, which (if it isn't outdated) fills in some information that wasn't obvious from the official documentation:
That's a nice link that points to some details I haven't had to consider before. I'm not sure what's going on with the "public-url" flag, because I've never needed to use that and when I look up the default [0] it says "/" which is what one would expect. Maybe this is just out of date?
I don't care much about typescript or ie11, but if there are specific improvements that could be made you could open a PR.
I don't think parcel concerns itself with deploying to separate CDNs. You can do that in npm! Does anything need to be built differently to support that? Maybe just keep files bound for different CDNs in different directories?
The accessibility of the ecosystem is just not good enough for more junior engineers. This used to be a real strong suit of web development, but it has degraded over the years. My belief is that ES modules could reverse the trend and simplify web development.
ES modules could also bring sanity to dependency management within the ecosystem. It is interesting to see the approach Deno JS is taking here as well.
Webpack's complexity seems born out of necessity since back in the day a lot of things were just not there.
Parcel has no baggage, and it shows in its very lean offering that works without you needing to trawl through Stack Overflow, old GitHub issues, and forum links.
I've tried rollup a couple of times so far and always got stuck on certain modules like react and react-dom. I'd have to set up namedExports using the commonjs plugin, which seemed tedious. Not sure if anything has changed recently.
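For context, the workaround looked roughly like this in a rollup.config.js; the `namedExports` option lived in rollup-plugin-commonjs at the time, and the export lists below are illustrative, not exhaustive:

```javascript
// rollup.config.js sketch of the namedExports workaround for CommonJS
// packages like react that rollup can't statically analyze.
import resolve from 'rollup-plugin-node-resolve';
import commonjs from 'rollup-plugin-commonjs';

export default {
  input: 'src/main.js',
  output: { file: 'dist/bundle.js', format: 'esm' },
  plugins: [
    resolve(),
    commonjs({
      // Tell the plugin which named exports each CJS module provides.
      namedExports: {
        react: ['Component', 'createElement', 'useState', 'useEffect'],
        'react-dom': ['render']
      }
    })
  ]
};
```

The tedium is that the list has to be maintained by hand and updated whenever you start using a new named export.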
I've never understood why webpack got so much attention either. Both rollup and webpack are much slower than browserify, so I just keep using browserify. And I can get tree shaking with browserify by using a plugin too!
There are tradeoffs either way. For popular modules, the file might already be in your browser cache if another app used the module recently. Plus, being on a CDN, it's probably served by a machine closer to the user. And if your whole site is running off a CDN, then you're relying on a 3rd party anyway.
- Safari double-keys its cache to prevent third parties from tracking users across sites
- Usage of common libraries is just too low for there to be a critical mass
A whole site running on a CDN still involves only one connection, gains throughput as the TCP congestion window grows, and a decent CDN is likely to be more reliable than the origin.
I think that HTTP/2 + Push + ESM will eventually become very interesting. Without push, there are a lot of separate requests for a lot of users on a lot of sites/systems. I think some breakout platforms will come from this.
`curl -i https://cdn.pika.dev/preact` redirects me to a dist-es2019 package (I assume because it detects my user-agent supporting that) but isn't showing anything like a `Vary: User-Agent` header.
Won't this break for any situation in which users with different browsers share a proxy server?
(also tried with Chrome, didn't see a Vary there either)
It describes it in the first paragraph. It acts as a proxy to serve ESM JS files (for any top-level package that's already in ESM syntax) that modern browsers can use natively without a build/transpile step. Looks like they recently added automatic polyfills for older browsers too.
CDNJS is just a standard CDN that serves up files as they're packaged, but if you want a high hit-ratio then https://jsDelivr.com has the most marketshare currently, and more features.
I think you need to treat "commonly searched" and "commonly hit by users while browsing" very differently. It's not popularity of searching or articles written about a CDN that determines the cache hits.
Google search can only tell you how popular something is to search and how much content there is about it. They aren't going to be showing you a website just because there's a src="cdn.example.com" in the code, only if it's in the text.
To actually know then you either need them to report their # of users and trust them, or scrape tons of websites and check their source for which CDN they use.
You may be right that CDNJS is more likely to result in cache hits, it's just that none of these figures actually help in finding out the answer.
CDNJS being in 1000 small github blogs could easily be less impactful than a single large website using jsDelivr. We have no idea if the github projects are even used.
Again, it may well be the case that it's better in this way, but these figures show nothing really either way.
Here's a bit of an attempt at looking more at it, though I don't know their methodology:
This isn't some game to win, I wanted to show that there may be more useful metrics that correspond to what we care about here, and talk of the nuances that might be important. I don't really care who said the right thing first but I do care what the reality is.
We aren't arguing for the contrary. We're just trying to make sure you're evaluating them correctly.
IanCal found some nice links that support your claim, but he even added that depending on what and how you use it, either may be better for you (You wanted to count cache hits).
The thread is about cache hit ratio, which is not that simple. A file is either cached at an endpoint or not, regardless of whether it's downloaded once or a billion times. CDNJS only supports ~3k libraries and gets most of its usage from jQuery and FontAwesome.
jsDelivr has an automated backend proxy to support any NPM package, Github repo, or Wordpress.org plugin. It also uses Cloudflare as one of its backends, so at worst it's at parity with CDNJS in cache hits or far better due to more network partners, more global regions, and more packages from more origins.
Anything on CDNJS is also likely cached by jsDelivr, but most of what's cached on jsDelivr is not even available on CDNJS.
I struggled to see the value-add; I think it does automatic bundling and inserts polyfills based on UA strings. The whole Pika project looks like a rewrite of/alternative to the npm servers, kind of like MetaCPAN is for Perl's CPAN + search.cpan.org.
I struggled because the landing page and about page were very light on the overall product overview: too details-focused for me, and I suppose for anyone else who wasn't already aware of it.
The differential serving sounds like a neat idea. Naturally, everyone not using the newest version of Firefox or Safari will go to hell eventually, but until then it could really improve the web for a lot of people.
This is way cool. I recently started a new app and decided to see how far I could get without a build tool. My early impressions left me wanting to write a blog post, "ES Modules Make JavaScript Fun Again." The whole development cycle felt clean and simple.

Ultimately, though, I got hung up on dependencies. For a while I was just including things directly from node_modules/. But npm flattens things, so a library's location is not predictable (this crops up when an ES module dependency tries to look in its own node_modules/ directory for another ES module dependency, but that dependency has actually been flattened to the top level). So you're basically stuck downloading all your dependencies (and their dependencies) manually.

This isn't 100% a bad thing. It pushes you to use smaller dependencies with fewer sub-dependencies. You're also stuck using libraries that export an ES module. Pika could be just the ticket to bridge these gaps.
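Bridging that gap is essentially a specifier-rewriting problem: browsers only understand URLs and relative paths, not bare names like 'lit-html'. A toy sketch of the kind of rewrite a service like Pika performs server-side (all names here are illustrative, and a real implementation would use a proper parser rather than a regex):

```javascript
// Rewrite bare import specifiers (`from 'pkg'`) to CDN URLs, leaving
// relative paths and full URLs untouched.
function rewriteImports(source, cdnBase = 'https://cdn.pika.dev/') {
  return source.replace(/from\s+(['"])([^'"]+)\1/g, (match, quote, spec) => {
    const isBare =
      !spec.startsWith('.') && !spec.startsWith('/') && !spec.includes('://');
    return isBare ? `from ${quote}${cdnBase}${spec}${quote}` : match;
  });
}

const out = rewriteImports(
  `import { html } from 'lit-html';\nimport x from './local.js';`
);
console.log(out);
// The bare 'lit-html' becomes a CDN URL; './local.js' is left alone.
```

Once every specifier is a URL, the browser can resolve the whole module graph itself with no bundler in the loop.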
It varies a lot depending on the browser features you're using, total script size, minification vs gzip compression, cache hit rate, etc.
My company is in adtech so our final bundles are 14kb (single TCP congestion window) for modern browsers, 30kb for Safari/iOS 10, and 75kb for IE11/IE10. We've seen similar doubling-of-size in other libraries for backwards compatibility, although we can probably drop IE10 soon and cut IE11 down by half.
This wouldn't work with a standard React project, though, right? Because you still need to transpile JSX. You could transpile JSX in the browser, I guess, which is slower, but that's not something you want to ship.
I'd love to use something like this for teaching, tutorials, and even small projects, but there's some things I still need a transpiler for.
I also realize I could use the `htm` package instead of JSX, which gives a lot of benefits over JSX, including not requiring transpiling, but, since it's not widely used by the wider ecosystem, I'd be a little hesitant to include it in my projects.
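It's worth remembering that JSX is only sugar for nested function calls, so skipping the transpiler is always an option. A sketch, where `h` is a minimal stand-in for a real createElement (a real vnode has more fields than this):

```javascript
// What JSX compiles down to: plain function calls that run anywhere,
// no build step required.
const h = (type, props, ...children) => ({ type, props: props || {}, children });

// JSX version (needs a transpiler):   <button class="save">Save</button>
// Plain-JS version (runs as-is):
const vnode = h('button', { class: 'save' }, 'Save');
console.log(vnode);
```

Writing raw createElement calls gets verbose for deep trees, which is exactly the ergonomics gap that `htm` fills with tagged template literals.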
jQuery was originally designed and built in a bygone era (on the web / front-end timescale), includes many features that have landed and/or normalized on the web platform and also does not leverage the modern JavaScript module system.
I think what they mean by modern is that every new project nowadays doesn't involve directly manipulating the dom and everyone uses some new library or framework of their choice.
This is a great recent talk about this problem by the former NPM CTO in which she tells her story about NPM and proposes a new decentralized package manager:
> Love Pika? Go Pro! Pika CDN will always be free, but you can support the project with a Pro Membership donation on Patreon. Get early access to upcoming production-only features.
What if, in 5-10 years, the volume is too big to be funded by donations? Will Pika sell to a malicious company? Will it shut down and leave everyone that depends on it stranded?
Well yes, it will probably shut down, just like the official Python package repo will shut down if its sponsors can't meet the budget. Nothing is guaranteed to last forever.
Looks great, but I think the homepage should do more to convince me that I can trust it. Who runs it, how is it funded, is there any guarantee they won't run out of money and shut down, etc.
Pika CDN seems to facilitate user tracking by the CDN better than the current JS CDNs can (with simple browser privacy features that browsers should be doing already).
Also, wasn't clear to me whether they support SRI or an equivalent supported by the browser. If they don't, it could also be a centralized vulnerability for user-targeted injection.
(Solution: the best sites will pay to serve their own JS.)
I love the idea of a more efficient CDN for JS (and code overall!), but it isn’t clear to me how this handles the multitude of versions. None of the examples seem to include versioning, which is a huge oversight IMO.
A future I see is IPFS for this sort of thing. All objects identified uniquely, but cacheable by multiple entities.
I built a repo like this, but for require (CommonJS), where package dependencies were sent along with the first request using HTTP/2 push. The only problem was that browsers didn't cache the pushed files and re-requested them. Hopefully browsers will fix this, or latency will be a huge problem with dependency trees several layers deep.
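The server-side half of that idea is just computing the transitive closure of an entry point's dependencies so everything can be pushed with the first response. A sketch with an illustrative dependency map:

```javascript
// Given a dependency map, collect the full set of files to push along
// with the first request (entry point plus all transitive dependencies).
function collectDeps(entry, deps, seen = new Set()) {
  if (seen.has(entry)) return seen;
  seen.add(entry);
  for (const dep of deps[entry] || []) collectDeps(dep, deps, seen);
  return seen;
}

const deps = {
  'app.js': ['lodash.js', 'ui.js'],
  'ui.js': ['lodash.js'],
};
const toPush = [...collectDeps('app.js', deps)];
console.log(toPush);
```

The `seen` set both prevents infinite loops on circular dependencies and deduplicates shared ones, so lodash.js is pushed only once.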
From what I understand, the idea is that you don't use a bundler, but let the browser download all the modules that your app needs - hence some features like the "differential serving" (a.k.a. polyfills added if necessary with UA sniffing).
Well, that's exactly what every CDN does: if you're a CDN, you don't need to inject anything. People give away all their users' browsing histories by making them download files from your servers; the analytics are just your log files.
It's possible to check a subresource loaded as an ES6 module, but only if you know the hash first (https://stackoverflow.com/questions/45804660/is-it-possible-...).
Even webpack won't handle it with webpack-subresource-integrity (https://www.npmjs.com/package/webpack-subresource-integrity).
Of course, HTTPS is strong, but it's not a foolproof solution against man-in-the-middle attacks.