But (I think, I’m on my phone and can’t check), they won’t replace the method if it already exists.
So the problem is that if the spec differs in behavior at all from the MooTools implementation, existing websites will break (the site was written expecting the behavior of MooTools’ implementation and now gets the spec version instead).
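For what it’s worth, the guard being described is usually just an existence check, something like this (a rough sketch; the flatten name and the fallback body are only illustrative, not any particular library’s actual code):

  // Install the method only if the environment doesn't already provide one.
  if (!Array.prototype.flatten) {
    Array.prototype.flatten = function () {
      // Illustrative fallback: flatten one level.
      return this.reduce(function (acc, item) {
        return acc.concat(item);
      }, []);
    };
  }
  // Once a browser ships a native flatten, the block above is skipped, and
  // code written against the library's semantics gets the spec behavior instead.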
Even if TC39 didn’t care about breaking old websites, the browser vendors do. So they simply wouldn’t implement the spec if it broke a bunch of websites. Backwards incompatible changes don’t really hurt developers as much as they hurt users. And the browser vendors don’t want people to start complaining about how websites are broken in [X].
This is a terrible idea, but if it really is the browser vendors fighting this, then perhaps the browser should detect MooTools (blacklist the affected websites?) and disable the newer versions of those methods there. Perhaps Google’s crawler already has such a list.
Existing websites will break if they depend on whatever the implementation difference is. Does anyone know what that difference actually is? Seems significant.
Even if their implementation is identical to the standard, it is still problematic. The key problem is the way MooTools tries to copy over that method (and many other methods):
> Currently, Array.prototype.flatten = mooToolsFlattenImplementation creates an enumerable flatten property, so it’s later copied to Elements. But if we ship a native version of flatten, it becomes non-enumerable, and isn’t copied to Elements. Any code relying on MooTools’ Elements.prototype.flatten is now broken.
> Although it seems like changing the native Array.prototype.flatten to be enumerable would fix the problem, it would likely cause even more compatibility issues. Every website relying on for-in to iterate over an array (which is a bad practice, but it happens) would then suddenly get an additional loop iteration for the flatten property.
> The bigger underlying problem here is modifying built-in objects. Extending native prototypes is generally accepted as a bad practice nowadays, as it doesn’t compose nicely with other libraries and third-party code. Don’t modify objects you don’t own!
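To make the enumerability point concrete, here is a rough sketch of the two situations described above (the Elements stand-in and the for-in copy are simplified illustrations, not MooTools’ actual source):

  // Stand-in for MooTools' Elements type (illustrative only).
  function Elements() {}

  // Case 1: no native flatten exists yet. Plain assignment creates an
  // *enumerable* own property on Array.prototype...
  Array.prototype.flatten = function mooToolsFlatten() { /* ... */ };

  // ...so a for-in style copy sees it and Elements gets flatten too.
  for (var name in Array.prototype) {
    Elements.prototype[name] = Array.prototype[name];
  }

  // Case 2: the engine ships flatten first. Built-ins are defined
  // non-enumerably, roughly as if by:
  //   Object.defineProperty(Array.prototype, 'flatten', {
  //     value: nativeFlatten, writable: true,
  //     enumerable: false, configurable: true
  //   });
  // Assigning over an existing writable property keeps its attributes, so
  // MooTools' version stays non-enumerable, the for-in copy never sees it,
  // and Elements.prototype.flatten is never created.

  // Making the native method enumerable instead would "fix" the copy, but
  // then every for (var k in someArray) loop would see an extra 'flatten' key.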
On Chrome and Firefox, at least, as long as I delete Array.prototype.whatever first, I get an enumerable property when I polyfill it.
Not saying that's the ideal solution, and there are certainly other issues with extending built-ins and clobbering the original name. The Chrome blog also says that forcing legacy websites to patch is an unacceptable solution (though I'm not sure I agree with that, unless we want to maintain name compatibility with every library forevermore).
However, it's certainly not a matter where MooTools would have to completely rework what they're doing, or where the browser would have to make the property enumerable from the start.
In fact, it'd be easy enough to use a compatibility shim on the site side that'd nuke all the conflicting properties from Array.prototype before MooTools ever loads, if they know they're on ES5 and would never use the native ones (a sketch follows the console session below).
  > Array.prototype.map = () => { console.log('map, yo'); }
  function Array.prototype.map()
  > [].map()
  map, yo
  undefined
  > for (k in []) console.log(k)
  undefined
  > delete Array.prototype.map
  true
  > Array.prototype.map = () => { console.log('map, yo'); }
  function Array.prototype.map()
  > [].map()
  map, yo
  undefined
  > for (k in []) console.log(k)
  map
  undefined
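And the site-side shim mentioned above could be about this small (a sketch, assuming the page controls its own markup, runs this before the MooTools script tag, and never needs the native methods; the name list is just an example):

  // Delete conflicting built-ins so that MooTools' later assignments
  // recreate them as enumerable own properties (keeping its Elements
  // copying intact), at the cost of losing the native implementations.
  ['flatten'].forEach(function (name) {   // extend the list as needed
    if (name in Array.prototype) {
      delete Array.prototype[name];       // built-ins are configurable
    }
  });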
I definitely agree with that last point. I get the desire to keep the top 100 sites going--I used to be on test for Mozilla, and you really don't want to be the browser that can't view a significant chunk of the web.
But that sort of fear-based commitment to backwards compatibility didn't do Windows any favors. I'd really hate to see it become endemic to the web.
I can see both sides of the argument here — from the technical point of view, it is certainly unappealing to maintain such kludges, and from the business point of view, "it's been working fine so whatever changed on the other end is broken, we won't spend money fixing it"…
Do you have any thoughts on how this quagmire can be avoided on the web? I can imagine solutions similar to Ghostery's stub scripts, but putting that in a browser, let alone many browsers, sounds like a large legal problem.
Aside from something like forced namespacing, I don't, really. I suppose being able to pin the JS version in the browser might help, similar to quirks mode of old, but that leads to another version of the BC issue.
But ultimately this is a question of the lesser of two evils, and I do sort of think that the real solution here is that site owners own their sites and their decisions as to what libraries to use, and thus own their own downtime. Browser vendors can't reasonably try to take that responsibility on themselves in an uncurated ecosystem, especially in cases like MooTools where the library vendor did something long known to be questionable.
That may be idealistic, but I think it's the only path that really works in the long run. Unfortunately, browser market share is king, so I doubt that attitude will be adopted. But I definitely wouldn't buy any argument that it's for the user's benefit--it's all about not wanting to get blame splashed back.