That doesn't solve the problem at all; it just moves it from the generate-HTML request to a follow-on get-the-content-appropriate-to-this-user AJAX request. Generating the information for each page is complicated, especially at Kink, where your ability to interact with each "shoot" is determined by subscription rights, admin rights, microcurrency purchases, and who knows what else these days (I haven't seen the codebase in over four years).
Hibernate's clustered 2nd-level cache is still pretty magical. It means that the vast majority of web page requests are serviced out of RAM with zero database hits - without writing any special caching code. And it's transactional. For a certain set of scaling problems, this feature is golden.
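The setup is mostly declarative, something like this (a rough sketch; the exact property names and region factory class vary by Hibernate version and cache provider, and the Shoot entity is just an example):

    // In hibernate.cfg.xml / persistence.xml (names vary a bit by version):
    //   hibernate.cache.use_second_level_cache = true
    //   hibernate.cache.region.factory_class = org.hibernate.cache.ehcache.EhCacheRegionFactory

    import javax.persistence.Entity;
    import javax.persistence.Id;
    import org.hibernate.annotations.Cache;
    import org.hibernate.annotations.CacheConcurrencyStrategy;

    @Entity
    // TRANSACTIONAL needs a JTA-capable provider (Infinispan, etc.);
    // READ_WRITE is the usual choice outside a JTA environment.
    @Cache(usage = CacheConcurrencyStrategy.TRANSACTIONAL)
    public class Shoot {
        @Id
        private Long id;
        private String title;

        // After the first load, session.get(Shoot.class, id) is served
        // from the clustered cache region with no database hit.
    }

That's the whole point: the caching is an annotation plus two config lines, not a layer of hand-rolled invalidation code.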
I've worked for a gaming site under the same constraints (probably stricter because real money was involved with nearly every transaction, no caching allowed).
Hibernate caching can surely be helpful, but you make it sound like a silver bullet and like there are no other approaches. There are plenty that don't tie you to J2EE hell. A little bloom filtering in front of memcached or Redis can work wonders, and it can be more predictable than an opaque caching layer that makes you very unhappy once your working set exceeds a "magical" threshold (been there, with Hibernate).
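Roughly the shape of it (a sketch assuming Guava's BloomFilter and the Jedis client; loadFromDatabase and the key registration are stand-ins for your real data layer):

    import java.nio.charset.StandardCharsets;
    import com.google.common.hash.BloomFilter;
    import com.google.common.hash.Funnels;
    import redis.clients.jedis.Jedis;

    public class FilteredCache {
        // Populated with every valid key (at startup and on writes), so a
        // negative answer is definitive: skip Redis AND the database.
        private final BloomFilter<String> validKeys =
            BloomFilter.create(Funnels.stringFunnel(StandardCharsets.UTF_8),
                               10_000_000, 0.01);  // ~1% false positives
        private final Jedis redis = new Jedis("localhost");

        public void register(String key) {
            validKeys.put(key);  // call on every insert/update
        }

        public String get(String key) {
            if (!validKeys.mightContain(key)) {
                return null;  // definitely absent; no network hop at all
            }
            String cached = redis.get(key);
            if (cached != null) {
                return cached;
            }
            String value = loadFromDatabase(key);  // hypothetical loader
            if (value != null) {
                redis.setex(key, 300, value);  // cache for five minutes
            }
            return value;
        }

        private String loadFromDatabase(String key) {
            return null;  // stand-in for the real query
        }
    }

The nice property is that lookups for keys that can't exist never touch the network, and the failure modes are ones you chose and can reason about.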
Oh, I don't claim it is a silver bullet. It can be frustrating as hell at times. And of course you can build your own distributed, transactional caching layer.
My point is that once you reach a certain level of sophistication and scale, you create your own hell either way. Hibernate is fairly refined technology for exactly this situation. Homebrewing your own solution is like wandering around in the desert: if you're smart, you'll make it out; if not, you'll be the next Friendster.
And FWIW, Java EE is not hell if you avoid the overengineered pieces like JSF.