Is it Now Acceptable to Require JavaScript? (mondaybynoon.com)
53 points by toni on June 25, 2010 | 77 comments



Back in the day we’d all have scoffed at such a thing... What’s changed since then? Why is this now an acceptable practice?

The only reason it wasn't acceptable to require Javascript many years ago was that each implementation was so different that cross-compatibility was impossible to guarantee. If you used JS, you would almost certainly render your site useless to a significant group of your users.

Those times are over; Javascript is so consistent across browsers that it's extremely rare to see cross-browser issues unless you're on the cutting edge of web development. And those implementations are converging, so these problems are quickly going from rare to nonexistent.

So the answer is yes, it's acceptable.


One major reason it wasn't acceptable to require Javascript many years ago is that many people had it disabled for good reason. Malicious code, popups, and browser crashes were quite common. Furthermore, browser security settings didn't always allow users to enable javascript site-by-site, so enabling it to have a good user experience on your site meant they'd also have a very bad user experience elsewhere on the web.

Most of those issues have gone away. It's still better to have a slightly degraded user experience for your non-javascript users, rather than a complete site failure, but it's not nearly as important as it was in 1997.


Umm... malicious code, popups and browser crashes/attacks are still common, and still rely on Javascript. It has become much less convenient to disable it, but...


Crashes are now mostly the domain of Flash, though. JS hangs in an infinite loop sometimes, but that's not a crash. Actual JS-originated crashes are exceedingly rare in my experience, and are usually the mark of an experimental-build browser.

Popups are largely handled by every major browser having popup blocking that works reliably enough.

Malicious code is what the ad-supported internet runs on. The only thing it can hit is your already-public data, and the use of that data is the only thing that can be construed as malicious.


It's still better to have a slightly degraded user experience for your non-JavaScript users, rather than a complete site failure

I would suggest that a company look at their web analytics and make that decision. I know that for most of my clients, their trends show less than 1% of visits with JavaScript disabled. Maintaining a gracefully degrading programming model to support that 1% did not make economic sense, given that those dollars could be used for multilingual translation (a far larger segment).

This is really a question that is best answered by looking at your site's trends. For us it made more economic sense to do a JavaScript detect and, if it was not enabled, pop a page suggesting FF, Safari or Chrome.
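One lightweight way to do that detect, as a sketch: content inside <noscript> only takes effect when scripting is off, so a meta refresh placed there (in the document head) bounces non-JS visitors to a static suggestion page. The page URL here is hypothetical.

    <noscript>
      <!-- Only applies when JavaScript is disabled: redirect to the suggestion page. -->
      <meta http-equiv="refresh" content="0; url=/enable-javascript.html">
    </noscript>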


I would suggest that a company look at their web analytics and make that decision. I know that for most of my clients, their trends show less than 1% of visits with JavaScript disabled.

If you do take this approach, remember to be a good scientist by making sure you respect the control aspect of the scientific method. Specifically, if you already require JavaScript for your site, you should not even venture to extrapolate an answer, even if you think you're going about it in a reasonable way.

I recently had a conversation (although it was less like a conversation and more like a few frustrated, snarky comments from me) with a developer who claimed that 99% of his users browsed his page with screens at least 1024 pixels wide. Even ignoring for the moment that screen width is not the same as the width of the browser window on that screen, I asked why he would expect otherwise from users of a web site which needs that kind of width to avoid frustrating things like horizontal scrolling.
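(For anyone measuring this themselves, the two numbers are easy to record side by side; a minimal sketch, where the beacon URL is hypothetical:)

    // screen.width is the monitor; the viewport is what the layout actually gets.
    var screenWidth   = screen.width;
    var viewportWidth = window.innerWidth ||                  // most browsers
                        document.documentElement.clientWidth; // IE fallback
    // Report both with a simple image beacon so they can be compared later.
    new Image().src = '/track.gif?screen=' + screenWidth +
                      '&viewport=' + viewportWidth;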


> it's extremely rare to see cross-browser issues unless you're on the cutting edge of web development

Or you have IE6 users.


Use jQuery or another framework. The only issues with JavaScript in IE6 are in interacting with the DOM; the actual JavaScript implementation is fine.
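As a small example of what the library smooths over: event binding is one of the places where IE6's DOM diverges from everyone else's. A minimal sketch, with a hypothetical button id:

    // By hand you need both the W3C and the IE branch:
    function addEvent(el, type, handler) {
      if (el.addEventListener) {                 // Firefox, Safari, Opera, Chrome
        el.addEventListener(type, handler, false);
      } else if (el.attachEvent) {               // IE6/7/8
        el.attachEvent('on' + type, handler);
      }
    }

    // With jQuery the same binding is a single cross-browser call:
    $('#save').click(function () { alert('Saved'); });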


>The only reason it wasn't acceptable to require Javascript many years ago was that each implementation was so different that cross-compatibility was impossible to guarantee. If you used JS, you would almost certainly render your site useless to a significant group of your users.

You couldn't pick two or three or n browsers with 99% marketshare and make sure your website worked in them? I don't believe the browser market has been so fragmented in the last ten years that this was infeasible.

One objection used to be that requiring javascript locked out disabled (particularly blind) users. Has that changed? Does anyone know the state of screen-reading browsers these days?


You couldn't pick two or three or n browsers with 99% marketshare and make sure your website worked in them?

No, you couldn't. Back then, the web was split between two major browsers: IE and Netscape. However, the feature set supported in JS was not only completely different between the two, but also completely different between minor versions of each. Things that worked on Netscape 6.x would be broken on 6.(x+1), and work again in 6.(x+2), although with a different calling convention. It was mind-numbingly frustrating, and I don't think it would have been humanly possible to write JS that supported every minor iteration of each browser and maintained any sort of complex functionality.

By contrast, in 2006, I wrote a JS library that worked on IE6, Firefox 1.0+, and some early version of Safari. It has worked ever since, among dozens of versions of those browsers. That's the difference.

(Edit: I'm struggling to remember what version Netscape was around 10 years ago; the version number above might be wrong.)


The notion of making an awesome experience for 99% of users vs. a mediocre one for 100% of users seems a bit disconnected from reality, which is that apps are getting increasingly complex. We could talk about graceful degradation and how that works for some particularly simple apps, but some APIs are simply not available without javascript (e.g. canvas).
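To make the canvas point concrete: the only thing a script-less visitor ever gets from a <canvas> element is its fallback text, because the drawing itself happens in script. A minimal sketch, with a hypothetical element id:

    <canvas id="chart" width="300" height="150">
      Sorry, this chart is drawn with JavaScript and cannot be shown without it.
    </canvas>
    <script>
      // Nothing equivalent exists in static HTML: the drawing only happens in script.
      var ctx = document.getElementById('chart').getContext('2d');
      ctx.fillStyle = '#36c';
      ctx.fillRect(20, 40, 120, 60);
    </script>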

If you want to use the power of these APIs to give your users something great by today's standards (think Google Maps, etc), yes I'd say requiring Javascript is acceptable.


I use NoScript, which disables scripting (JavaScript and Flash plugins) by default in Firefox, but allows me to easily enable it for scripts which come from the domain of the site, or for all scripts on the page.

My general policy is: if I'm linked to an article, I should be able to read it without enabling JS. If I'm linked to a video, I should be able to watch it by enabling only the Flash plugin. If these assumptions are violated I become offended, and I usually press Back unless I have a very good reason to need to see that content.

I am not offended by web applications which require Javascript, as long as they're linked to by a source I trust.

I won't enable Javascript for domains which I know only serve ads. I also won't enable Javascript if I think I'm on a "sketchy" page -- a page which I believe might have malicious code.


You’re saying you won’t enable JavaScript unless necessary. But why? What are you afraid of JavaScript doing? Issuing an endless supply of alert boxes? Changing the background color of the page every second? Slowing your computer to a crawl with an infinite loop? I don’t know any sites that do that. I know JavaScript can be used to facilitate the display of ads, but AdBlock can handle that without blanket disabling JavaScript. What’s the worst that JavaScript might actually do, based on the sites that are out there?


What’s the worst that JavaScript might actually do, based on the sites that are out there?

I've been using NoScript since I barely escaped being infected by some quite sophisticated malware a few years ago.

An innocent site (local public transport) had had a single malicious line injected into its content management system (most probably after an untargeted "fishing for vulnerabilities" botnet scan). That one line was a JavaScript include, which in turn triggered a series of jumps to servers all over the world (mostly in China and Russia).

In the process, multiple layers of heavily obfuscated JS code were both loading hidden frames full of ads (presumably for click/impression fraud) and trying to load hidden PDF embeds which carried a payload of hidden executables for infecting the local computer (exploiting Acrobat Reader security holes).

I only found out because Acrobat is such a huge resource hog that I noticed it and managed to kill it in the task manager before it finished loading.

I was using an up-to-date Firefox and up-to-date antivirus, AND I had disabled the Acrobat plugin.

Upon inspection, the malicious code appeared to be cross-platform and multi-target, using vendor-specific JS extensions. I remember that besides Acrobat it also targeted Silverlight (and this was at a time when Silverlight was a very new technology).


Every browser exploit in the last few years that I've heard about requires Javascript enabled, except for history sniffing trickery.


This is the main reason I also use NoScript. Just today I saw two very tech savvy friends get caught out by a Facebook clickjacking attack. With javascript off on untrusted domains, it didn't work for me.


Wasn't there a recent study that concluded that Flash was the source of over 90% of all vulnerabilities last year?


I'm not sure, but NoScript also blocks Flash for me, so the outcome is the same.


Sure, but Flash is also straightforward to disable. It's not even installed by default in most browsers.


Flash and Acrobat were in the top 5, but I don't think Flash can take credit for a full 90%.


There are more subtle ways to be annoying with JS these days, e.g. popping up a new window every time you highlight some text in an article, or slowing the page load down while it highlights the key words on the page and turns them into annoying semi-popup links.


For me it's mostly sites tracking way too much information at my end. It just creeps me out. Consider the kind of information tracked by Google [1]. There is no way to dodge stuff like that other than NoScript.

[1] http://entertainment.slashdot.org/comments.pl?sid=629659&...


XSS should be noted here, too.


Infecting malware, exploiting holes in the browser, storing data via local/session/appCache, tracking me (google analytics), loading flash, displaying ads, redirecting me to see an ad before the content...

The list goes on and on. Sure, you can do most of these without JavaScript, but by that point I've moved on and blacklisted your site.

Is it better to be overly secure, or not secure enough?


If I'm linked to a video, I should be able to watch it by enabling only the Flash plugin.

You do realize that, due to a patent ruling a few years ago, almost all Flash content these days is placed after load with JavaScript to circumvent ActiveX's click-to-interact restriction? How often are you actually able to enable only Flash to view your content?
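For context, the usual workaround was to write the embed from script once the page had loaded, typically with SWFObject; a minimal sketch, where the file names and container id are hypothetical:

    <div id="videoContainer">This video needs Flash (and JavaScript) to play.</div>
    <script src="swfobject.js"></script>
    <script>
      // Inserting the <object> from script sidesteps click-to-activate,
      // which is why a "Flash only" NoScript whitelist rarely suffices.
      swfobject.embedSWF('player.swf', 'videoContainer', '640', '360', '9.0.0');
    </script>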


Not often. But I can usually enable scripting for only the domain where the video is served (youtube.com) and then enable the one instance of Flash which looks like the content.


What happens with libraries downloaded from the Google mirror?
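(Site authors, for their part, sometimes guard against a blocked or unreachable CDN copy with the well-known local-fallback snippet; a sketch, where the local path is hypothetical:)

    <script src="http://ajax.googleapis.com/ajax/libs/jquery/1.4.2/jquery.min.js"></script>
    <script>
      // If the Google-hosted copy was blocked, load a same-domain copy instead.
      window.jQuery || document.write('<script src="/js/jquery-1.4.2.min.js"><\/script>');
    </script>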


Users capable of disabling Javascript are capable of programming versions of my site which will support them.


false.


Well, mostly false anyway. A guy in systems programming could still know how to do it, as could a game programmer, and neither might be versed in the multi-tiered art of web development. It's not even a very hidden feature in some browsers, and there are plugins available that serve only that purpose. You should have stated your reasoning for disagreeing, however, because "false." doesn't cut it here.


I completely agree with his sentiment, although I would rephrase it:

Users who disable javascript are responsible for dealing with the problems caused by lack of javascript.

As for why it's false, I think it's obvious. You can turn off javascript with the click of a user-friendly UI button, whereas programming ability is a much harder skill to attain.


I think he was deliberately using hyperbole. The point is that people who turn off JavaScript are a tiny, highly technical minority.


It's acceptable to require javascript to provide functionality that cannot be achieved otherwise. It's acceptable to use it to improve the user experience, in what is known as "progressive enhancement". But whatever _can_ be done without it should remain accessible (i.e. not require javascript). It's not acceptable to require it to replace basic functionality like links, navigation, scrolling, etc. in an inaccessible way.

The question that remains in my mind is… is it still acceptable to CamelCase javascript?


> "It's not acceptable to require it to replace basic functionality like links, navigation, scrolling, etc. in an inaccessible way."

This depends on what you're building. For websites, you are absolutely correct. For web applications, however, it is acceptable to use JavaScript for every jot & tittle. Take, for instance, 280 Slides; there is no graceful degradation for an application of that complexity & power. You either use it or you don't.


> For web applications, however, it is acceptable to use JavaScript for every jot & tittle.

Fuck everything about that idea. And fuck web "applications" which consider it cool to screw with my middle clicks.


camelCase is the de facto standard for JavaScript - all of the DOM APIs and major libraries use it.

edit: oh, you probably mean the language name. Again, JavaScript is the correct formatting, even if it looks silly to most eyes.


Indeed it is. I just don't like it. But that's the standard.


I'd try to estimate four factors:

1) the percentage of users that have JS disabled

2) the cost associated with developing a non-JS version of your site

3) the revenue you stand to lose if you require JS

4) the aesthetic/functional impact on the user experience for non-JS users

Put all of that together and you'll have your answer. For me, it's not worth maintaining two versions of what I build.


Exactly. Follow the cost-benefit analysis through to completion. For my startup, it's not worth it to create a second version of the site.

I'd love to serve up the site in a way that lets any screen reader or NoScript user see the site as it is intended, but the costs are too high for the relative gains.


It's been acceptable for a while. It's also acceptable to require images.


But just as images should have alt text for people who prefer not to load them, sites requiring JavaScript should have some obvious message to indicate the need for JavaScript when a user decides to not run scripts by default.

It's insanely trivial to do, and just seems polite.
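A minimal sketch of that politeness, using the <noscript> element every browser understands:

    <noscript>
      <p>This page needs JavaScript to work. Please enable it for this site
         (or whitelist the domain in NoScript) and reload.</p>
    </noscript>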


Also, search engines still aren't to the level of a "full browser", meaning that your site should fall back so it can be indexed properly.


It really depends on your projected userbase these days.

For instance, in a couple of my projects, we aimed at a small subset of users (autistic users and/or adults with disabilities in this case) who would have a good reason to have JS disabled (fewer issues with some screen readers when JS is off, for instance). For this reason our app uses no JS unless absolutely necessary - and in those cases we check for JS and let users transparently skip the sections of the site that need it.


Luckily for me, some of my core features require javascript. So I decided to ditch the no-scripters entirely, and consequently my site rocks and my code is cleaner!


This is an unusual use case, but for systems management type web applications, if you can make them degrade gracefully you've just built an often very usable terminal application at the same time: just fire it up in lynx.

A social network or something probably doesn't need that, but us sysadmins love our terminals, and it's surprisingly easy to make an application very usable in lynx.


We're not even trying to build a non-js site for my startup.


Yes. It's like asking if it's ok to require CSS 5-10 years ago. Javascript is pretty ubiquitous and consistent (even in mobile browsers now). The web has moved on and so should you.


No. And can we please stop talking about "graceful degradation" and start talking about "progressive enhancement"? If accessibility isn't your first concern, you're doing something wrong (with some obvious exceptions -- building an online image editor for the blind is probably taking the accessibility thing a bit too far).


If accessibility is a higher priority than quality or even getting the product finished, you're doing something wrong. Losing accessibility is bad, but having something people don't want to access is worse.


I want to agree with you, but then again Target got sued over this: http://www.sitepoint.com/blogs/2008/08/29/target-settles-acc...


Oh my god that's horrible! Next they're going to get sued because they're not donating to research aimed at improving blind people's vision.

Some of the comments on that article make me want to cry.


One of the great things about providing a fully-featured API for your application is that, given enough demand, someone can create a JavaScript-free implementation of your interface. (Just like desktop apps tend to crop up around popular web APIs -- 37signals apps in particular.)


I think it's perfectly acceptable for web applications like Google Docs, where a usable non-JS alternative is just not feasible.

However I think it's not acceptable for ordinary web pages. You don't need JS to show text, images and links.

If you use progressive enhancement it's easy to have a basic non-JS version working, and you don't have to sacrifice any features for JS users.
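A minimal sketch of that pattern, assuming jQuery (as suggested upthread) is already on the page and with hypothetical ids and URLs: the markup is an ordinary link that works everywhere, and script, when present, upgrades it in place.

    <div id="articles"><!-- server-rendered list --></div>
    <a id="more" href="/articles?page=2">More articles</a>
    <script>
      // With JS: fetch the next page and append it without a reload.
      // Without JS: the href above still works as a plain link.
      $('#more').click(function () {
        $.get(this.href, { fragment: 1 }, function (html) {
          $('#articles').append(html);
        });
        return false;   // suppress normal navigation only when the script ran
      });
    </script>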


On mobile webapps/sites, absolutely. I have built a gracefully degrading/progressively enhanced mobile website here: www.zocalodesign.com (hit/spoof with mobile browser)

The effort it took me to make this site gracefully degrade was completely not worth it, even though it wasn't THAT difficult for this site. However, my client actually only wanted it to work "on the iPhone".

Furthermore, now with plans for Google to expose native hardware APIs in their mobile browser via JavaScript, how do you gracefully degrade access to the camera or accelerometer?

Finally, for desktop browsing experiences, this argument won't even be around much longer, mainly because the current battle is native versus browser (engine), not JS versus non-JS. We are building apps in the browser that give us native desktop functionality. You simply cannot do this at the level of user experience and functionality that people expect without JavaScript.
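(One partial answer to the camera/accelerometer question is plain feature detection: enhance only when the API object exists. A sketch, where the two app functions are hypothetical and the event details should be treated as an assumption for 2010-era mobile browsers:)

    if (window.DeviceOrientationEvent) {
      // Sensor API available: enhance the UI with tilt control.
      window.addEventListener('deviceorientation', function (e) {
        rotateMap(e.alpha);        // hypothetical app function
      }, false);
    } else {
      // No sensor API: fall back to on-screen rotate buttons.
      showRotateButtons();         // hypothetical app function
    }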


Why have you built that website using Flash?


The word "acceptable" is used as if there's some sort of moral code to building web pages. There's not. As long as it's legal, do what you like. I do, and I couldn't care less if 2-3% of people can't use it as long as the majority are having a good experience.

That many sites in the 2002-2006 period were built ONLY in Flash wipes away any guilt you could have about using JavaScript.

(Of course, if you have certain audiences or need to meet legal accessibility standards, that's a different ballgame, and if you don't meet the requirements, another developer/company will.)


I’ve tried to wrap my head around these poorly implemented Web applications to find out the real inspiration behind them.

Trying to implement something like GMail without javascript would just be a waste of time. It makes sense to talk about building sites focused on static content and simple forms without a dependence on javascript, but the new breed of web applications require javascript. To dismiss them as "poorly implemented" is to put dogma before practice.


Bad example. Gmail provides a not-as-pretty but very usable non-javascript version of the app.


The HTML-only version is a completely different app. I think the gp meant that sometimes it is not possible to code an application to degrade gracefully. If you want a no-js version you have to code that (almost) from scratch. And gmail is a perfect example for that.


> Trying to implement something like GMail without javascript would just be a waste of time.

You're so wrong even google disagrees with you...

> but the new breed of web applications require javascript.

That simply isn't the case. Unless you're johnny and you can't code, maybe.


As an earlier poster pointed out, GMail's non-js app is a totally different app, and let's be honest, it's a giant pain to use compared to the js version, and for most people it wouldn't have been worth the expenditure to make. GMail wouldn't have caught on if it were just the pure HTML version. Degradation is nothing but a nice-to-have for an app like that.


You're so wrong even google disagrees with you...

Yes, I'm aware of the HTML-only version. No, gmail wouldn't matter if it didn't leverage javascript to build a client that competes with, and in some ways surpasses, desktop mail clients.

Apps are moving off the desktop and onto the web thanks to the new HTML standards and modern javascript environments. To talk about "degrading" these gracefully is to live in the past. These aren't "sites" - they're applications.


One of the best current reasons to have a fallback is performance. It turns out that 100% JavaScript web sites have a horrible performance profile. Having just got back from the Velocity conference, I can say half the performance talks were about postponing as much JavaScript on the page as possible to get faster render times.
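A common postponement technique from those talks is to inject non-critical scripts only after the page has rendered; a minimal sketch, where the file name is hypothetical:

    // Load analytics/widget scripts after onload so they never block first render.
    window.onload = function () {
      var s = document.createElement('script');
      s.src = '/js/non-critical.js';
      document.getElementsByTagName('head')[0].appendChild(s);
    };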

Accessibility used to be a decent argument (at least to those of us who cared) but it isn't any more. I say that as a co-author of WCAG2.


Would you expand a bit on the accessibility aspects of this? One of the only reasons I even worry about using Javascript heavily in my sites these days is due to accessibility concerns -- are you saying that folks who are blind are able to use Javascript now?


There have been a few JavaScript accessibility arguments over the years. The main thrust goes like this:

1a) Screen readers (e.g. JAWS) don't support JavaScript properly

1b) Only very modern screen readers support JavaScript properly (JAWS 10+)

2) Upgrading screen readers is expensive

3) People with disabilities statistically have lower incomes

QED we can't expect disabled users to have JavaScript.

To address the points:

1a was true a long time ago, but modern screen readers do support JavaScript and things like ARIA.

1b is somewhat true, but that shows it was a technology problem rather than an "accessibility" problem: the screen readers weren't implementing JavaScript properly, rather than there being an actual accessibility issue with the technique. As such, WCAG2 made the decision to treat it that way.

2 and 3 are somewhat true; however, many people with disabilities have access to non-profit, state or federal programs that support them with technology. Also, free, open-source projects like NVDA are increasingly advanced.
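For readers who haven't run into ARIA: the idea is to annotate script-driven regions so a screen reader announces updates it would otherwise miss. A minimal sketch, with a hypothetical status element:

    <div id="status" role="status" aria-live="polite"></div>
    <script>
      // A screen reader that understands ARIA live regions will announce this
      // change even though it happened from JavaScript after the page loaded.
      document.getElementById('status').innerHTML = 'Your message was saved.';
    </script>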


For mobile browsers, it's still iffy.

But for desktop browsers, I don't even really think about it anymore.


There's a divide in mobile browsers: rich browsers like Mobile Safari and the Android browser, and less capable browsers that you'll find on Nokia handsets, Blackberries, and Mobile IE.

I don't think I'd build a web app targeted at iPhone/Android without using JavaScript. If you want to make your web app feel anything like a native app, it's pretty much essential.


The main reason to make a site work without JavaScript is, in my view, accessibility. Try running a typical Ajax-driven app with a screen reader some time. Standards are emerging, but they just aren't there yet.

If you want the visually impaired to be able to use your site, you should make a noscript version with as much functionality as you can. (Side benefit: That's going to make SEO a lot easier, too.)


Yes, it is.


For rapid development, I lean towards "absolutely". For large-scale applications with many users, I lean towards "absolutely not".

If you have to service everybody, the answer is a flat "no". Doing otherwise is irresponsible, however understandable. If you don't, take the 0.1% user hit (assuming some mobile devices here) and require it and save yourself a lot of trouble.


If you have a web site that delivers information, I'd say it should degrade gracefully. But if you have a web app, requiring javascript is no different than having a desktop app that first requires you to download the desktop app before you can use it (duh). Notice that every example they use is a web app.


Pithy answer: Yes for apps (and only apps).


+1 Insightful. Requiring js for navigation is poor practice, especially if javascript is not needed for anything else. If you're only using it for animations/effects, requiring it can probably be avoided as well.

I think the difference is that before, javascript was more of an "add-on" feature for web pages, whereas now we are able to create things with javascript that just aren't possible with static HTML alone, or that are far more usable, responsive, etc.


It depends. Some mobile devices can't handle some of JavaScript while web browsers have varying support for it (I know that Firefox supports some JS code that Chrome doesn't). It's only acceptable if you're alright with abandoning part of your potential audience.


HTML and CSS are not optional. I don't see why JS should be.

The only two things I'm concerned about are mobile platforms that don't have a complete JS interpreter, which presumably will be less of a problem in the near future, and accessibility for screen readers.


Doesn't requiring javascript start to tread on MVC? I would prefer the presentation (HTML) to be decoupled from the logic (javascript). I run NoScript and all the pages I visit load way faster.


Generally the view is all the browser-side components: HTML, Javascript, and CSS. The only change in MVC with AJAX is the ability to make calls to the controller without a full page load. Using MVC is still good practice.
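Concretely, the only new moving part is the view asking a controller action for a fragment without a full page load; a minimal sketch, where the URL and element id are hypothetical (IE6 would need new ActiveXObject('Microsoft.XMLHTTP') instead of the constructor below):

    var xhr = new XMLHttpRequest();
    xhr.open('GET', '/orders/recent', true);   // controller action returning an HTML fragment
    xhr.onreadystatechange = function () {
      if (xhr.readyState === 4 && xhr.status === 200) {
        document.getElementById('orders').innerHTML = xhr.responseText;
      }
    };
    xhr.send(null);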





