It's not a silver bullet, it's a method to keep your sanity. What often happens is you start with a traditional server rendered web application. Maybe you have a couple ajax-y interactions for speed or UI responsiveness and everything is happy-go-lucky for a while.
Over time, you find that you start wanting more data in more views, or large collections of items that you want to manipulate visually on the client side, with no need to bother on the server side. One by one, each view starts accumulating async, non-event based cruft. The callbacks become nightmarish.
The JSON API makes it easier to build those data-heavy, complex views and makes them more responsive. And they can change on the fly! The JavaScript MVC is a way to manage the complexity, and hopefully it pairs well with a good event-driven system.
The point is, if you know your application is going to be sufficiently complex, it makes sense to build it as client side rendered to begin with. Other great benefits can include:
- JSON API leaves you primed and ready to build native mobile apps or 3rd party integration
- Your js developers don't have to wait for backend devs to modify half-complete ajax views if you have a good api
- Your template rendering happens in one place instead of two, making it easier to isolate logic
Obviously it's not worth the trouble if you only have a few models/resources and you are dealing with a very read heavy application. It is also a lot more work to implement two MVC layers instead of one, but you may sleep better at night.
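To make that concrete, here's a minimal sketch of the client-side half of that arrangement: a hypothetical Backbone model and collection backed by a /api/messages endpoint (the endpoint and names are made up for illustration).

    // Hypothetical Backbone setup talking to a JSON API; names are illustrative.
    var Message = Backbone.Model.extend({
      urlRoot: '/api/messages'                  // save()/destroy() map onto REST verbs
    });

    var Messages = Backbone.Collection.extend({
      model: Message,
      url: '/api/messages'
    });

    var messages = new Messages();
    messages.on('add remove change', render);   // re-render whenever the data changes
    messages.fetch();                           // same endpoint a mobile app could consume

    function render() {
      // client-side template rendering goes here; the server only ever sees JSON
    }

The same API that feeds this view is what leaves you "primed and ready" for mobile or third-party clients.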
We DID build Charm with client-side MVC from the get-go. That's exactly the experience Thomas wrote about as being a problem.
Note that Charm was not "read heavy" -- it was totally interactive and doing a lot of real work; we did live (email) thread locking and all kinds of crazy stuff. If we were to start from scratch, you can bet your ass we would keep the client-side as dumb as possible.
>client-side processing and decoupling is detrimental to both the speed of development, and application performance
I can only guess then that it was a bad design choice for this particular application and for the developers working on the project.
My own experience has been the opposite. Client side mvc has resulted in speed of development INcreasing and made various parts of application performance better. I described some of the reasons above. For example, if you have dedicated back and front end developers, you can develop very quickly, indeed!
Another point that seems particular to this project:
> Especially when you’re starting out and need to stay flexible you don’t want to have too much code around—and Rails is great for that, but… adding a JSON API layer and basically a second application that runs on the client is annihilating this advantage for you.
If building a JSON API layer and a client MVC adds significant complexity, rather than decreasing it, then it was likely a poor choice. For other products, building those decoupled pieces can reduce the complexity quite a bit. One example would be when you are creating a platform, of which a web application is only a single piece.
OK, Charm was a live interface for handling support threads. New messages would show up when we swept them in from your inbox via POP/IMAP (max lag: 5 min). You'd leave Charm open, messages would appear. If you work in a team, and say your team member Bob opened the message to respond, it would show as locked on your screen, to prevent duplication of effort. If Bob canceled his reply, it'd unlock. If Bob sent his reply, the message would disappear from the queue.
I understand how that would add a bit of overhead, but it doesn't seem all that crazy. Certainly a well architected client app wouldn't have any issue with this. Maybe I'm missing something?
"I’ve come to the realization that this much client-side processing and decoupling is detrimental to both the speed of development, and application performance"
While Fuchs is a credit to the early JS community, he's been railing against larger JS frameworks (including jQuery) for the better part of 4 years.
He hasn't been proven right in practice. His approach of 100% server side generation may work well for some apps, but the clear trend is client side apps, with shared REST apis between mobile and web.
Client devices are powerful, why do everything on the server? Let the server be the data store, and the UI rendering work on the client. Makes more sense, and is going to happen no matter how much Fuchs and his followers wish against it.
Powerful? I gave up using my ipad 1 to surf the web. From "the next web" to blogger.com, every website is miserably slow and simply opening tabs causes memory warnings and the eventual safari crash.
Also, those frameworks do way more than just 'UI rendering', they are full applications which are usually REST clients, UI is just a part of it.
My company is writing an application that requires the use of some framework. We have been evaluating both Ember and Angular for some time and will definitely use one of them. There are heavy downsides, and we had many meetings, built demos, and hacked things together until we were sure that using a framework at all was a good decision.
That's the implementor's fault, and usually the result of loading two dozen scripts, blocking ads, external webfonts, share buttons and other crap. A framework like Backbone is very lightweight and fast if used correctly (typed from an ipad 1)
Conceded, Backbone shouldn't add stress to any website. But if you put together AngularJS directives on external HTML rendered in an ng-repeat (50 items or so), you will have to put the MBP on a table because it gets really warm.
Rendering anything but the simplest DOM templates from JSON can be extremely slow. We also wrote a book on JavaScript performance -- runtime performance, not simply asset delivery. In that nearly 4yo book, we recommended against templating. A couple years later, we thought things had gotten better and faster and that it wasn't an issue any more. We thought wrong.
It is usually more than fast enough. The project I'm currently working on renders completely on the client with around ten different templates/partials, and the rendering adds up to just a couple ms - that's using handlebars which is not the fastest engine around. 99% of the application time is on data retrieval and processing.
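This is easy to measure for yourself; a rough sketch using Handlebars' standard compile/render flow (the template, data, and #list element are made up for illustration):

    // Rough timing sketch; everything here is illustrative.
    var source   = '<ul>{{#each items}}<li>{{name}}</li>{{/each}}</ul>';
    var template = Handlebars.compile(source);    // compile once, reuse on every render

    var data = { items: [] };
    for (var i = 0; i < 50; i++) data.items.push({ name: 'item ' + i });

    console.time('render');
    var html = template(data);                    // produces a string, not DOM yet
    console.timeEnd('render');                    // a handful of ms at most for a list this size

    document.getElementById('list').innerHTML = html;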
I understand you had issues with your project, and email apps are complicated beasts, but let's not generalize. The truth is in between. I bet rendering a complex template server-side on Ruby can be slower than js.
I'm not sure if you intended it, but this comment seems extremely egotistical.
Are you genuinely interested in his/her experience and having a conversation about it, or are you saying (as it seems to read), "How many apps have YOU developed?"
If it's the latter, you're being kind of a dick. Just saying.
I can see how my comment could be taken this way. It's not what I wanted to express. I wanted to state, as a disclaimer, that I'm the author of the post.
As for answering the comment, it's really more of an ad hominem so that's why I wrote the snarky "That's interesting".
I am interested in what he developed though, especially as he's so sure of his one true way.
NB: My husband isn't a native English speaker. His point was that he wrote about our experience with a specific app and that the bulk of jacquesc's argument was "Thomas rails like a zealot" and implies Thomas is an old dead horse (?), and otherwise was basically "nuh-uh." Without supporting detail.
It was an invitation to provide details instead of ad hominem attacks (speaking of being kind of a dick). Because his native language isn't English, this got a little bit mangled. I already pointed that out to him.
Also he wanted to say he was the original author, and it ended up sounding like a line from an action movie.
Bummer. Well, I guess all I can say it's nothing personal. I thought I was arguing against Thomas's opinion (which I disagree with), and his constant attacks against any JS framework over 2K in size.
In case anyone reading this exchange isn't privy to the facts of the matter:
Thomas doesn't "constantly" "attack" libraries "over 2k in size."
He's a core team member of Prototype, wrote/"founded" Scriptaculous, scripty2, and Zepto. All significantly larger than 2k. All different types of frameworks, libraries. All of which we use.
What Thomas does is promote smaller, more modular libraries -- based on experience. He promotes them because large libraries dominated the market utterly, and monocultures are not productive for hackers. So he started http://microjs.com.
Some people like to characterize him as a zealot for promoting an alternative and talking about why huge frameworks/libraries often have more tradeoffs than benefits.
I guess after writing Prototype with its 2GB source it's understandable that one would become a promoter of smaller, modular libraries ;)
I myself am a supporter of small libs (I actually wrote a Zepto "competitor" to be compatible with IE9 [1]), but I get the impression everyone is being a little too defensive (offensive?) on the matter.
Not when you consider the original blog post, which addressed the question of "Why not let the server be the data store?" Among the other "points"[1] jacquesc raised.
[1] anyone who wants to pretend to have a serious, rational discussion -- don't start it by calling the person a zealot. Bam! Insta-noncredibility.
To be fair, he said he was "railing like a zealot" -- a subtle but important difference. Calling someone a zealot is attacking them as a person, calling their behavior "zealot-like" is only attacking the behavior.
Let us not declare client side rendering as the winner prematurely.
Client devices are powerful, but they are constrained by the client browser and the JavaScript language, whereas server-side rendering is constrained only by the operating system, with the flexibility to use many languages.
If the same view is going to be rendered for multiple clients, it makes sense to create it on the server once and send it to each client, rather than caching only the data on the server and rendering it every time on the client. It will improve the performance of the client.
The ideal approach would be to have both client-side and server-side rendering available and to decide based on the requirements, rather than being constrained to only one or the other.
Although client-side MVC may indeed have been a bad choice for the development of Charm, I disagree with the author's conclusion that it's an inferior design pattern for SaaS web apps as a whole.
Most of his criticisms are invalid:
>> ...and if you use something like Ember (which we didn’t), it’s even worse as all applications using it practically look the same
This seems strange to me. Client-side MVC frameworks like Ember and Backbone have absolutely no effect on the appearance of your application at all.
>> We’ve spend a lot of time getting Backbone to work properly, and the ease-of-use quickly deteriorates when your models get more complex
Backbone is imperfect, but this is hardly a point against client-side MVC as a whole.
>> What you end up with is building a layer cake that doesn’t add any value and slows down development. Especially when you’re starting out and need to stay flexible you don’t want to have too much code around—and Rails is great for that, but… adding a JSON API layer and basically a second application that runs on the client is annihilating this advantage for you.
This, I think, is his strongest point. There certainly is extra code you need to write for applications that rely heavily on front-end code. But, on the flip side, there are advantages, too. For example, you tend to have a significantly snappier user interface without having to resort to complex and error-prone nested caching schemes. There's also the beneficial side-effect that you'll end up testing your JSON API very robustly.
At the end of the day, the "correct" answer is likely the boring one: the pattern you should use depends heavily on personal constraints such as who you are, what you're familiar with, and what you're building.
I'm not sure who is right (having been on both sides of the issue), but we can all agree on one thing: client-side MVC frameworks like Backbone.js, Knockout, and Angular have helped front-end devs earn a "lot" more money than ever before. So you really can't fault them for using these frameworks, regardless of whether it's good or bad. :)
> For example, you tend to have a significantly snappier user interface without having to resort to complex and error-prone nested caching schemes.
Depends on how you define "snappy"… the ability to use a lot of seemingly live interactions? Yes, but…
We found that all the client-side templating was a client-side slow-down. Charm did not have an ultra-simple UI like Twitter (not that they're a fabulous example of client-side MVC success, but there you go).
Which is why sending packets of HTML would actually be faster. Insertion of HTML is faster on the client-side. The exception would be if you're sending a LOT of HTML, vs the amount of bits the JSON formatting takes up. Depends on where the complexity is.
We didn't have server-side performance problems for rendering.
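For anyone unclear on the two paths being compared, here's a sketch (the URLs and the renderThreadRow helper are hypothetical, jQuery assumed):

    // Path 1: the server renders the fragment; the client just inserts HTML.
    $.get('/threads/42/row.html', function (html) {
      $('#thread-42').html(html);                       // plain insertion is cheap
    });

    // Path 2: the server sends JSON; the client renders a template first.
    $.getJSON('/threads/42.json', function (data) {
      $('#thread-42').html(renderThreadRow(data));      // hypothetical client-side template call
    });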
While I have tended to be suspicious of Client-side MVC on the grounds they talk about...
...I'm curious why they are recommending Rails 2.3 and RJS, which are pretty much deprecated technologies. Rails 2.3 will stop getting even security patches when Rails 4.0 is final (first Rails 4.0 beta was released today). RJS has been discouraged in Rails for a while, and I believe is now a 'third party' gem in current versions of Rails.
I had to double-check the timestamp in the URL to make sure this wasn't somehow an ancient post resurrected.
After years on the bleeding edge, our motto is: "Use what works. Don't faff about with shiny new toys which don't benefit the customer." Ask any of our customers if they care that Freckle is on the simpler Rails 2.x. Or ask patio11's customers the same thing; he agrees with us on how awesome Rails 2.x is.
Also, Rails 2.x is NOT deprecated -- hence the spate of patches. Perhaps it will be, some day when Rails 4 is final, but given its popularity we'll all be extremely surprised if Rails 2.x doesn't live on as an independent and thriving fork.
RJS is not everyone's cup of tea but it's not defective. What ships default with Rails reflects all kinds of political & other realities and isn't necessarily what is "best."
.. and after years developing, maintaining and administering rails apps, my motto is "never let your app get out of date".
Rails 2.x is still functional, and still being updated, yes. It will very soon not be. And you've let such a gap open up that it's going to be a lot of work to catch up to where everyone else is.
As an example failure situation you are opening yourself up to: one day, your server crashes. You quickly order up a new one, but the DB version will be newer and it won't work with your current db gem. You'll try to install a new one but it won't work with your old, old version of rails. You'll hurriedly look into what it takes to upgrade and realise it will take days, and you don't know anything about the weird new stuff. You'll try to install the old version of the DB but it's not even in the repo and so you ask for a couple earlier versions of ubuntu but they don't support that anymore and you've been down 6 hours now and is that "benefiting the customer"?
Stay up to date.
(edited to make it clear I was not talking about live-coding on a production server)
How on earth do you get from "Rails 2.x will be deprecated sometime in the future, they say" to "your server is down 6 hours"? Do you think anyone reading this is coding (and updating libraries) live on the server? This is Rails, not PHP we're talking about.
If you don't have a system / recipe in place for rolling out new servers, having the newest framework is really not going to help you all that much.
That said, we run our current Rails 2.x products on shared servers (Rackspace). Their default set up works just fine.
You built up a horror scenario to prove your point, unfortunately it doesn't reflect reality / relies on a lot of assumptions about dev/ops incompetence that you didn't make clear.
Actually I've seen very similar things happen in practice. "Horror scenarios" happen all the time.
Look, you can sing the "works for me!" song all you like. The fact is, you are years out of date and this is very bad practice.
You will:
- have trouble even installing a compatible environment from scratch
- have trouble searching for information on issues you encounter
- have trouble hiring anyone good to work with your decidedly legacy code
With a fast moving platform like rails, you allow yourself to fall years behind the mainstream to your sorrow. It's like keeping backups. Yeah, you're fine without them, until the "horror scenario".
All the info I can find says that Rails team will stop releasing even security patches for Rails 2.3 as of Rails 4 release.
Do you have different info, am I wrong? I guess it's semantics on what "deprecated" means, but I'd be personally nervous using a product that is not even having security patches released by any maintainers.
"if you use something like Ember (which we didn’t), it’s even worse as all applications using it practically look the same (many people choose using Twitter’s Bootstrap library, for example)"
wat? How does a choice of frontend framework like ember relate whatsoever to the look and feel of the application?
For the record, and this is supported by its absence in the linked post, nobody ever said it was a silver bullet. It's a rhetorical strawman that he doesn't bother knocking down, so don't get hung up on the title.
That said, Charm appears to have had quite a bit of unnecessary platform spread. I smell cowboy-architecting (vis a vis cowboy coding), but it's hard to tell because he doesn't describe the process by which they chose to go the route they did, which would have added a ton more value to the post.
Lastly, it's a little odd to me that he seems to contradict himself at the end, saying "don't repeat yourself," immediately after recommending the tools he's used for a long time. By my interpretation his implied moral to the story is, "we should have repeated ourselves."
'Lastly, it's a little odd to me that he seems to contradict himself at the end, saying "don't repeat yourself," immediately after recommending the tools he's used for a long time. By my interpretation his implied moral to the story is, "we should have repeated ourselves."'
That's not what "don't repeat yourself" means. Building things using tools and patterns you've used before, refined by the experience, is a great thing (other things being equal). "Don't repeat yourself" means primarily "don't put the same info in multiple places such that differences between those copies is yet another thing that can break."
If you're building the kind of app that Ember is designed for, all of the above will be true. But we never anywhere claim that Ember is the right solution for every webapp. There's a definite type of app that doesn't benefit from Ember and I'll happily tell people that. There are also other simple widget based apps where I think Backbone is a good option.
Ember is not marketed as being for a specific type of app and not for others. The sole hint as to the kind of app is "ambitious apps." As a reader, that seems like a challenge. Oh, I don't think Ember is right for me? Must not be ambitious. Nowhere on the site that I can see is there a discussion of fit, or tradeoffs.
I'm not attacking Ember specifically. All the MVC frameworks' sites are the same in this regard: breathless boosterism, slathered in benefit-speak.
That's a good point. We probably should add a section on helping users decide whether Ember is a good fit for their goals. While there's obviously some disagreement in this area, we can at least make it clear what sort of apps we intend Ember to be used for.
Basic Marketing 101: if you try to attract everyone, you will inevitably get a lot of people for whom the product is not ideal, and then they will be angry. And then there will be backlash.
Better to build your fences and make them clear so people are educated, instead of seduced.
You're quoting from the Ember website but your bad experience was with Backbone.js. In fact, Ember is designed to avoid the kind of deficiencies you hit in Backbone. Sounds like you should give Ember a try in your next app. ;)
How could you know enough to accuse us of being "cowboys"?
For the record, we chose carefully… after deciding to build fresh from what we had, to start with a live (client-side MVC) code base instead of shoehorning in AJAX interactivity here and there.
The simple fact is: that was also the wrong choice. But it was carefully done. That's the whole point of the post.
Unfortunately, the mainstream view favors client-side rendering.
Good solutions come not only from good problem definition, but also from the constraints imposed on the solutions.
My constraints for my solutions were separation of concerns (MVC/MVVM) and separation of skills (conceptualizer/designer/developer/native mobile developer), leading to the same conclusion as the article.
I wanted my back-end server (which needed to run on Android/iOS/shared hosting) to be purely API driven (REST/RPC), my view templates to be Mustache only, and the flexibility to merge the model into the view template on both the client and the server. As I spent more and more time on this architecture, I moved entirely to server-side merging of models, with type-safe view templates generated from plain HTML-based Mustache templates.
As my dependency on JavaScript shrank, I was amazed to discover that performance started improving, that I could independently regression-test the API back end automatically just by capturing the API's inputs/outputs, and that I could think about the functionality independently of the UI.
My belief is that any architecture which allows both client-side and server-side rendering over a Web API (REST/RPC) may be the ideal architecture, as long as the view templates can be reused. This means we can mix both client-side and server-side rendering in the same application.
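As a rough illustration of the template-reuse point (the template and data are made up; mustache.js on the client, any Mustache implementation on the server):

    // The same plain-HTML Mustache template can be merged on either side.
    var template = '<li>{{title}} ({{status}})</li>';   // e.g. stored as item_row.mustache

    // Client-side merge with mustache.js:
    var html = Mustache.render(template, { title: 'Invoice #12', status: 'paid' });

    // A server-side merge would feed the very same template file to the server's
    // Mustache library, so only the "who renders it" decision changes, not the template.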
This is the approach I have taken to develop a cross-platform solution which works in mobile/server/PC environments, using both browser/WebView and OpenGL view rendering of the UI.
Interesting post. I agree with one statement: many (not all) of the popular client-side MV(not really C) libraries do lock you into a particular pattern. Especially Ember, Angular, Enyo... any with the scent of magic. That is what is most compelling about Backbone: it gives you the bare minimum, stuff you likely need anyway (the router and event emitter particularly).
Not knowing about their setup, I can't comment on why it didn't work for them. I have to wonder if they truly were developing 2 apps, as they said, or whether it was a mish-mash. I find that when you try to blend a single-page app with a traditional multi-page app you run into the most issues. You duplicate a lot of logic because it's hard not to.
In a true single-page app, the server doesn't need to be aware of the client at all. The server code should contain no views. It shouldn't even serve index.html in my opinion; nginx can do that far better.
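A minimal sketch of that split, assuming a Node/Express back end (the route and data are illustrative): the app process speaks only JSON, and nginx serves index.html and the assets in front of it.

    // Hypothetical API-only server: no views, no index.html, just JSON.
    var express = require('express');
    var app = express();

    app.get('/api/items', function (req, res) {
      res.json([{ id: 1, name: 'first item' }]);   // data would come from a real store
    });

    app.listen(3000);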
I'd love to know what others are thinking in this space. I've flip-flopped a lot on this issue myself, looking at both sides I think good interactive web design is just simply hard no matter how you slice it.
I've done Rails+PJAX along with minimalistic, simple jQuery components (similar to Bootstrap's javascript). I think it works quite well for very simple content-focused designs but falls short as interactions get more complicated. Lots of conveniences that I find in modern JS MVCs (Angular and Ember come to mind) are missing, like model-DOM bindings (so not Backbone).
Now I'm playing around with AngularJS on something new and building that on top of Rails as an API (and for asset pipeline support which works quite well with Angular). This is also pretty good. Angular's style fills in a lot of gaps on the client-side that jQuery components did not. However I also see myself duplicating efforts in the views that would be more effortless on the Rails side. This is a side effect of different tools doing different things. I can't really take advantage of certain rails-isms if I choose this route, but the trade-off seems ok for me as I do feel it makes much of the interactive design on the client-side easier (so far, so good anyways).
That said I don't yet feel like I can really come down on either side of the debate in terms of what is better for me. I do think if you swap Rails in for "client-side MVC", most of the author's statements are ironically still true. Particularly the statement "they lock you into certain patterns" is arguably more true for doing everything the "Rails way" than for a good client-side MVC library.
I do want to say two things. One, I've found Backbone to be the least productive of the choices I described above. It was a step up for a team with minimal JavaScript experience, but once I understood the lay of the land I left it, since I didn't really feel it was doing anything for me. Two, Backbone is definitely not what I would call a client-side MVC pattern. It lacks model bindings, for starters.
I'd like to see what the author thinks after some more time looking at the latest Ember or Angular. I'm also looking to see what happens with Flight as that is sort of a beefed up version of the components+events approach I had been taking before. Again, perhaps both approaches have their place depending on what your needs are.
Addendum: When I was doing Rails+PJAX I sort of envisioned a concept where I delivered both data, layout and style with HTML+CSS (as semantic as needed) and used JS to handle interaction. Interestingly none of the MVC frameworks seem to support doing this very well (including Angular which seems best suited to the concept). Flight seems to which is why I find it interesting.
PJAX-rails has some unfixable problems, which goes back down to layouts.rb in the actionpack. I had to fork it to make it usable to me. (github/eduardordm/pjax) A nice post from Yehuda about turbolinks:
Most frameworks do support initial payloads (discourse does that with Ember, for instance)
This discussion is really important to the internet as a whole, some frameworks are potentially dangerous, not only to search engines but to the long-term availability of information.
I agree I found PJAX a pain in the ass in a number of scenarios. It's great for basic navigation but not so easy for anything more. Perhaps we just tried to do too much with it, but in the end we were able to get it to work for our needs. That said our app was very read heavy and content driven so it made sense. I would be very hesitant to consider it for anything more than that though which is what makes me a bit skeptical of the author's conclusions. I'm arguably no expert though.
I do client side "MVC" with backbone every day. With proper architecture the possibilities are limitless and far beyond what one can do with a simple server side architecture. Using ASP.Net SignalR + Rest API I can have a fully reactive front end with little to no effort on my part. I work in the transportation industry and I can track a large number of buses and react to situations entirely with javascript being fed by a Redis Pub/Sub. I believe this is the direction the web is heading and one should not simply brush it off.
While I don't agree with the post, I have a feeling that we are not quite there yet with either Backbone or the data-binding pack (ember, angular, batman etc). After building three different products, two using Backbone, the least painful one was the one without a framework, purely event-based.
One thing that I believe he is missing is the value of these frameworks paired with node.js: when the REST API behind the app is also JavaScript, it all feels like a cohesive project that crosses the server/client boundary, not a duplicated effort.
It's crazy how it went from "creating DOM nodes in JS is slow and shouldn't be done too often" to "lets do everything via the DOM and JS" in a matter of 2 years.
Looks like the author is simply taking a dig at what is becoming a popular technical cliché these days. MVC is a proven design pattern for a web application's server side, and the server side tends to be less context-specific than the client side.
Client-side design needs to be as context-specific as possible, not only for performance but also for the effort required by the learning curve. If an application doesn't have the complexity of many views, then client-side MVC doesn't make any sense. Future scalability is just another argument that is thrown around without keeping the context in mind.
However, I do realize that doing an entire business app in JavaScript could be tough for people new to the technology. For those people it becomes a bit easier to understand and estimate the effort if they know a fancy framework that can do a few things out of the box for them. So using client-side libraries does have its benefits too.
So, I guess it all depends on the context and the existing skill-set of your development team.
I am currently working on a SPA using knockout.js and asp.net MVC in the back end. There is no server side rendering whatsoever.
The performance is pretty good. The page does take some time in html parsing + js execution. But I think I can be smarter in the way I load my page to get some performance improvement while still enjoying the power of model/DOM binding from knockout.js
Where the performance really starts to degrade is on mobiles, but I don't know if loading the html through ajax would be faster.
Again, I think I need to be smarter in the way I load my DOM. I could have an index.html with just the initial page, and a remaining.html that I would download via ajax as soon as my initial page has loaded. This would allow the user to start using the app faster.
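A rough sketch of that idea, assuming jQuery is already on the page (the file name and container element follow the description above and are otherwise hypothetical):

    // After the minimal index.html has rendered, pull in the rest of the markup.
    $(function () {
      $.get('remaining.html', function (html) {
        $('#deferred-content').append(html);   // hypothetical container element
        // ko.applyBindings could then be applied to the newly inserted nodes
      });
    });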
All these issues would be solved if you work in the same language on the server and client with an integrated framework. The reason RJS and Turbolinks are so nice to develop with is that they allow you to do exactly that, though in a very limited way. For many applications you want to have a more featureful client-side framework. Ideally you'd share models and templates on the server and the client side: models on the client synchronized with models on the server, and templates automatically updating as models on the client side get changed (as you already have in many JS frameworks).
Overall I tend to agree with the author's thoughts. However, I would like to say that just a dash of backbone and client-side MVC can make your site _feel_ more responsive without all the baggage of a single-page app.
No approach to technology architecture is a "silver bullet," especially with a stack spread across client and server.
I always find myself returning to the "boring, old-fashioned" way of doing things (like server-side processing or relational databases) and these do indeed seem like the best choices for many applications.
But I don't see how this calls client-side MVC into question, except that perhaps it's considered as a default choice too often.
Silver bullets have existed only in vampire fiction. In technology X->!SB is true for all X.
Client-side MVC is good enough in many cases: where the app is small and rich in interactions, where the UI needs to evolve rapidly, and where search engine optimization and browser compatibility (IE7 etc.) don't matter much.
Completely depends on the type of app. Plenty of apps couldn't really be done any other way, and the client env doesn't have to suck; it just does because we keep building apps like it's 2000. Moar modules!
If it's double the work, you're doing it wrong. Possibly it's not a good fit for your specific case, or you haven't fully understood the best way to use it.
That is definitely true; there are different cases and I'm still figuring out the best way to use it.
But I think even if your server is just a RESTful API (which I believe is the best way to work with bbone) the code you have for your models (both bbone and server) gets duplicated. The application logic must be on the server as a requirement for the API and if you want the logic to inform the UI, you need to write logic of the model in backbone too. It's similar to how you have to do form validation on the client/server.
Take for example the simple model that some sort of a collection can only have 5 items. This rule has to be set somewhere in a model on the server. It also (should?) be set in a bbone model. If it's not, you could rely on the API's response code when you try to POST the 6th element, but isn't one of the benefits of a client side app supposed to be that you DON'T have to rely on the server for this kind of logic?
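A sketch of how that rule ends up living in two places (the collection and the 5-item limit are the hypothetical example from above):

    // Client-side copy of a rule the server also enforces: max 5 items.
    var LimitedList = Backbone.Collection.extend({
      add: function (models, options) {
        if (this.length >= 5) {
          this.trigger('error', this, 'list is full');   // mirrors the API's error response
          return this;
        }
        return Backbone.Collection.prototype.add.call(this, models, options);
      }
    });

    // The server still has to enforce the same invariant, because the API can't
    // trust the client, which is exactly the duplication being described.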
This is actually something I think can be solved. There have been some experiments in Ember with building Ember model definitions from the server model definitions. However, in practice I haven't found this to be much of an issue in my development.