Warner Bros Suspends Sales of Batman: Arkham Knight PC Version (wbgames.com)
164 points by minimaxir on June 25, 2015 | 151 comments



The suspension is likely happening due to many, many technical issues with the port: http://kotaku.com/batman-arkham-knights-pc-version-is-a-mess...

The average rating for the game on Steam was 31%. Since this was the first AAA game to be released on Steam after the release of Steam Refunds, it's also suspected that a high number of returns could force drastic action.


Steam could be a major force in turning around the recent slew of absolutely craptastic, completely broken game releases.

QA/QC should not happen at the customer level.


"We don't need unit tests because we have integration tests. We don't need integration tests because we have QA acceptance tests. We don't need QA acceptance tests because end users will report any bugs to us. We don't listen to user feedback because declining sales will tip us off to the problems. We don't pay attention to sales numbers because creditors seizing our assets will alert us of the shortcoming."


I know you're joking, but it's pretty much infeasible to test the kinds of bugs typical in games using automated testing. So much is content driven, and the state space is far too large.

Even for stuff that's fairly testable, like navigation, collision, etc., you have the issue of there being a million edge/degenerate cases, and it's very hard to know that the game code or content never hits one of them.

So the best you can usually do is test library code.

Not that this has anything to do with AK, which sounds like plain old sloppiness.


Ok, but in this case they chose to put in a 30 FPS cap. Even ignoring that performance is well below that cap, they should have known it was unacceptable.


Why is that unacceptable? From the reports I've read, re-enabling the 30fps cap removes many of the performance problems. Some of the game logic can be hard-coded to accept 33ms frame times, especially if the console versions are capped at 30 and the PC version is a port of them. They made a technical decision to cap it at 30, and there's nothing wrong with that. If you don't agree with it, then don't buy it.


I'm a developer, but not a game developer. Making the decision to hardcode game logic to only accept 33ms frame time sounds like a pretty dumb choice. I would wager that if your fix is to artificially cap performance of your software, then something in your code base sucks majorly. Call me crazy.


> I'm a developer, but not a game developer. Making the decision to hardcode game logic to only accept 33ms frame time sounds like a pretty dumb choice.

I'm not a game developer, but I am a realtime system developer. We could hem and haw about all the different hypothetical ways all the implementation details can suck, but locking and/or hardcoding for certain framerates is a perfectly fine decision. Doubly fine when those frames aren't graphical.

Easy but contrived example: If I hard-code a purpose-specific audio pathway to only operate at 44,100 Hz, that can be a perfectly acceptable design decision given the purpose. How deeply the code assumes that rate can be indicative of code quality, but if the assumption is hard to excise from the code in performance-critical areas, well, that happens.

Here is the part where I have to qualify I'm not saying more than I've said. I'm not defending the use of 30fps cap here, or saying that this game's code is any good overall. The gaming customer has higher realtime expectations than 30fps, especially on PC. And although I haven't played this game, the market is showing it is a bad product beyond that.

> I would wager that if your fix is to artificially cap performance of your software, then something in your code base sucks majorly. Call me crazy.

In a realtime system, consistent performance is a more important goal than maximum performance. For games that want high-end graphics, the goal is maximum performance on data of highly variable complexity, pushing the limits of what can be done while rarely (if ever) whiffing on the real-time deadline. I won't disagree that most game codebases have pockets of major suck or that a locked rate can be a bandaid for sucking, but it is not indicative of such.


Very interesting, you are correct. Thanks for this reply. My experience has always been in the world of 'performance is king', so it's easy for me to lose sight of the idea that systems do exist in which such a limit is beneficial.


I used to be a game developer and every game I've worked on, pretty much, has had a fixed frame rate, both for rendering and for game updates. (The two rates don't have to be the same.) A fixed rendering rate tends to make the game better to play (though of course this is a bit subjective), and a fixed game update rate avoids nasty timing-dependent bugs (e.g., due to parameters that work fine until you have overly long or short timesteps). Both have to cater for the commonly-encountered worst cases rather than the best ones.

(30Hz is somewhat common as rendering these days tends to involve a lot of fixed-cost full-screen passes for lighting and postprocessing. So by halving your frame rate you get over double the effective rendering time, which you can spend on having more stuff, higher res, etc.)

Even for 30Hz games, during development it's very common for the game not to run reliably at even 30Hz, until towards the end, at which point stuff is taken out, and assets and code optimised to make it all work. So it's not a cap that artificially limits performance so much as a floor that the developers are striving to hit :)

(I have no idea what the problem with Batman is specifically.)


Yeah, I'm thinking now that I was totally out of my element and a little foolish with my original comment. Were you developing console games or PC games? I'm guessing console as my impression is that fixed frame rates are much more common in the console world.


I worked nearly entirely on console games. Generally the frame rate would be decided early on (either after prototyping, or because you've been given a pile of old code to start from and that runs at a given rate), and would stay fixed. The games I worked on were split near enough 50/50 between 30Hz and 60Hz, with one game update per render, and usually some provision for handling one-off frame drops due to rendering taking too long. (That's not supposed to happen, but it can, particularly if you're trying to hit 60Hz.)

I can't say with any confidence what people tend to do for PC - I don't play PC games, and I last worked on a PC game in 2006 - but supposedly one tactic I've heard is used is to interpolate object positions and properties when it comes to rendering. So you run the logic at a fixed rate (to avoid the problems I alluded to in my previous post), and save both current object states and previous object states. Then, when it comes to render, you'll be needing to render the game state as it was at some point between the most recent game state you have and the one before that - so you can imagine your latest state as a keyframe, and the previous state as a keyframe, and just interpolate between them.
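
Roughly, a minimal sketch of that scheme (placeholder Game/Renderer types here, not any particular engine's code) might look like this:

    // Stand-ins for whatever the engine actually provides.
    interface Game {
        boolean isRunning();
        void saveCurrentStateAsPrevious(); // keep the old "keyframe"
        void update(double dtSeconds);     // logic always sees the same timestep
        Object previousState();
        Object currentState();
    }

    interface Renderer {
        // Draws a blend of the two states; alpha is in [0, 1).
        void draw(Object previous, Object current, double alpha);
    }

    class FixedStepLoop {
        static final double STEP = 1.0 / 30.0; // fixed logic timestep (~33 ms)

        static void run(Game game, Renderer renderer) {
            double accumulator = 0.0;
            long previous = System.nanoTime();

            while (game.isRunning()) {
                long now = System.nanoTime();
                accumulator += (now - previous) / 1_000_000_000.0;
                previous = now;

                // Run zero or more fixed logic steps to catch up to real time.
                while (accumulator >= STEP) {
                    game.saveCurrentStateAsPrevious();
                    game.update(STEP);
                    accumulator -= STEP;
                }

                // Render as often as the display allows, interpolating between
                // the last two logic states by the leftover fraction of a step.
                renderer.draw(game.previousState(), game.currentState(),
                              accumulator / STEP);
            }
        }
    }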

I imagine this can be a bit tricky to retrofit, though, so no doubt many ports from console to PC just decide to use the same frame rate as used on the console. But I'm guessing a bit here. Unreal and the like might just do it all for you these days anyway. Maybe the Batman programmers just forgot to tick the right checkbox ;)


Well, it's not like running at anything above 60 FPS makes any sense due to the refresh rate on the vast majority of displays being capped at 60 hz. You can slam geometry through the graphics card above that rate, but some of those frames won't make it to the screen. 30 FPS is not great, but it can be better than having wildly variable framerates. Consistent 30 FPS is better than traditional films, which mostly ran at 24 FPS.

Now, for a fast-twitch action game, processing input at 30 FPS is unacceptable; input lag sucks giant donkey balls. But I would expect that a AAA development studio knows that running input-handling, networking, physics, AI, etc in lock-step with the rendering loop is not the smartest idea. These things should run on their own schedules, independent of the graphical framerate.


30fps is horrid and basically unplayable for a first person PC game.

This isn't a movie, it is active, not passive. "Better than traditional films" is completely irrelevant.

Take a game where you can cap the framerate and bind a key to switch between capped at 30 and capped at 60; if you're playing the game you'll immediately feel the transition, and 30fps will feel relatively unplayable.


This isn't a first-person PC game, and until quite recently 30fps was considered a playable framerate. It's only in the last couple of years that "60 has become the new 30".

Yes, of course a frame lock sucks, but it's not the end of the world and many people wouldn't even notice if not for everyone complaining about it.


That is complete nonsense about the "last couple of years"; I remember wanting ~40fps for Half-Life over a decade ago.

Yes, at that time (10-14 years ago), 30 fps was considered playable. But that is "playable" and even then wasn't recognised as a great framerate for most of that period.

For the last decade, less than that has been seen as bad. In fact, a few years after Half-Life's release (when it was still popular but hardware had matured), 100fps was the goal, to match the 100Hz that most monitors back then could do. 60fps only even entered the discussion with the switch to TFT/flat panels and their lower refresh rates.


> In fact, a few years after Half-Life's release (when it was still popular but hardware had matured), 100fps was the goal.

Half-Life also exhibited strange behavior if the 100 fps frame cap was lifted by turning on developer mode. If I recall correctly (forgive me as this was almost a decade ago), weapon mechanics or movement speed changed.


There is so much wrong with this post.


When the office staff go home for the day, their rubbish hodgepodge of PCs of various makes and specs should spend the evening playing the game with a game bot.


Even if it is impossible for automated testing, that doesn't account for QA and other manual testing. Obviously the bugs aren't too hard to find for most of the people playing it.


I don't know if you're quoting or referencing something, but that's fantastic.


Thanks! Just something I dreamed up one day when someone said "you shouldn't have unit tests for private methods because unit tests of public methods will catch them" and I extrapolated from there.


If you have a unit test on private members, then you can't change the implementation safely without breaking tests.

The point of a unit test is to test the interface in isolation, not the implementation - e.g., when testing a hash map, do you really expect the elements to be in a specific order, or do you test for their presence?
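
For example (JUnit 4 style, with a made-up word-count snippet), a test like this asserts on which keys and values are present rather than on iteration order, so swapping the map implementation won't break it:

    import static org.junit.Assert.assertEquals;
    import static org.junit.Assert.assertTrue;

    import java.util.HashMap;
    import java.util.Map;
    import org.junit.Test;

    public class WordCountTest {
        @Test
        public void countsEachDistinctWord() {
            Map<String, Integer> counts = new HashMap<>();
            for (String w : "to be or not to be".split(" ")) {
                counts.merge(w, 1, Integer::sum);
            }
            // Assert what must hold for any correct implementation...
            assertEquals(4, counts.size());
            assertEquals(Integer.valueOf(2), counts.get("to"));
            assertEquals(Integer.valueOf(2), counts.get("be"));
            assertTrue(counts.containsKey("or"));
            assertTrue(counts.containsKey("not"));
            // ...and say nothing about iteration order, which is an
            // implementation detail of the map.
        }
    }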


Really?

> If you have a unit test on private members, then you can't change the implementation safely without breaking tests.

This makes absolutely no sense

You change the private implementation, you change the unit test. It's that simple

Then you keep your API the same so that external users don't break.

Funnily enough, it seems to come from the same people who like to whine about "missing tests and lack of coverage". They seem to like to nitpick and idolize tests instead of shipping.


Depends. If we take the example of a library, then there are some tests which should never break (except if the major version increases). But it is entirely sensible to check if you broke something internally during development.


I've been a game developer for 15+ years (mainly C++) and never used unit tests, just good old plain asserts, and sometimes ad-hoc code that creates/simulates errors or slowdowns. Pretty much QA people testing your game/tools and some form of automated tests (run this level, expect this to happen). Then I changed jobs, started writing in Java (+ Google Web Toolkit and JavaScript), and was exposed to unit, integration, and end-to-end testing. Do I know it properly? Hell no. I'm still confused.

But this is what I seem to be getting out of it: You are given a black box, with inputs and outputs. There is also a spec (it could be in your head for all I know) that defines that for certain inputs, certain outputs are expected. This spec also tries to cover quite a lot of distinct cases. Each such representative case of input and output is a unit test. (If your spec was really in your head, your unit tests kind of become it, or I like to think about it this way - a Unit Spec :)).

The tricky part is when this black box is internally working with other black boxes. Unit testing is all about testing the black box in isolation from the other black boxes, so one needs to isolate them away. Currently what I'm using is DI (Dependency Injection) with Guice/Gin/Dagger to achieve that.
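
A hand-rolled sketch of the same idea (hypothetical PriceFeed/PortfolioValuer names; Guice/Gin/Dagger mostly just automate this kind of wiring):

    import static org.junit.Assert.assertEquals;

    import java.util.Map;
    import org.junit.Test;

    // The black box under test depends on an interface, not a concrete service.
    interface PriceFeed {
        double priceOf(String symbol);
    }

    class PortfolioValuer {
        private final PriceFeed feed;

        PortfolioValuer(PriceFeed feed) { // constructor injection
            this.feed = feed;
        }

        double value(Map<String, Integer> holdings) {
            double total = 0.0;
            for (Map.Entry<String, Integer> h : holdings.entrySet()) {
                total += feed.priceOf(h.getKey()) * h.getValue();
            }
            return total;
        }
    }

    public class PortfolioValuerTest {
        @Test
        public void valuesHoldingsAgainstTheInjectedFeed() {
            // A fake feed isolates the unit: no network, no real price service.
            PriceFeed fake = symbol -> "ABC".equals(symbol) ? 10.0 : 2.5;
            PortfolioValuer valuer = new PortfolioValuer(fake);

            assertEquals(45.0, valuer.value(Map.of("ABC", 4, "XYZ", 2)), 1e-9);
        }
    }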

Thanks for all comments, it seems I have to fill my gaps in what I know.


This is true for totally isolated unit tests but false for tests that require stubbing. You can theoretically use the same tests for a map implemented with one version of a hash table or the other. But once you start testing classes with dependencies, your tests inevitably touch implementation details. For example, unit testing a hash map that is backed by a memcached server would require some form of stubbing.

The metatesting reason for this confusion is that in the example you used, the scope of unit testing is the same as the scope of integration testing. If a class has 0 dependencies, a unit test is also an integration test, because it tests all the dependencies of the class, of which there are none.

So going back to your original point, your claim is true for full integration tests, i.e. tests that do not stub any dependency. If you do not stub your network connection to the memcached server, the same test - albeit with a different setup - can be used for a local hash table implementation.
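
A sketch of that "same test, different setup" point, with hypothetical names (CacheClient standing in for a memcached client):

    import static org.junit.Assert.assertEquals;

    import java.util.HashMap;
    import java.util.Map;
    import org.junit.Before;
    import org.junit.Test;

    interface SimpleCache {
        void put(String key, String value);
        String get(String key);
    }

    // Purely local implementation: no dependencies, nothing to stub.
    class InMemoryCache implements SimpleCache {
        private final Map<String, String> map = new HashMap<>();
        public void put(String key, String value) { map.put(key, value); }
        public String get(String key) { return map.get(key); }
    }

    // Remote-backed implementation: it needs a client, so testing it in
    // isolation means stubbing the client rather than talking to a server.
    interface CacheClient {
        void set(String key, String value);
        String fetch(String key);
    }

    class RemoteCache implements SimpleCache {
        private final CacheClient client;
        RemoteCache(CacheClient client) { this.client = client; }
        public void put(String key, String value) { client.set(key, value); }
        public String get(String key) { return client.fetch(key); }
    }

    public class CacheContractTest {
        private SimpleCache cache;

        @Before
        public void setUp() {
            // Swap this line for `new RemoteCache(someStubbedClient)` and the
            // test body below stays exactly the same -- only the setup changes.
            cache = new InMemoryCache();
        }

        @Test
        public void returnsWhatWasStored() {
            cache.put("k", "v");
            assertEquals("v", cache.get("k"));
        }
    }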


The method should still do the same thing, regardless of whether it is private or public. If you purposefully create a new method and get rid of the old one, regardless of whether this is done by creating and deleting or by modifying, then the unit test should be changed as well.

In short, you are testing the interface of that private method, not the implementation.


By that logic you could test a single line and say "I'm testing the interface of that line, not the implementation".


You could if your line had a syntactical construct that presented an interface, like say, a name and a parameter list.


You really shouldn't. The public methods are the interface to the surrounding code, and that is what you want to make sure works. How you implement it, with private methods or 3rd-party libs, is up to you. If a bug in a private method makes it past your unit tests of public methods, there was an edge case you didn't test for.

It's also a matter of practicality - I simply don't have time to write tests for each and every method.


You have some tests to ensure the behaviour of your public methods doesn't change.

That isn't the purpose of all tests.

A complex public API should consist of smaller private parts. When you change those smaller parts of code, you would like to know if you break something and specifically what you broke. Testing of a small, isolated chunk of code is the 'unit' in the term 'unit tests'.

Unit tests on actual units of code allow you to more quickly isolate failures.


I test all public methods. Anything private is part of the implementation for those methods, and rarely would I want to test that. If I make a breaking change in a private method and the public unit test does not catch it, it's an edge case I didn't test for, or the public method is too broad.


The public implementation consists of smaller private parts.

If the public test fails, how do you know what specific part of the public implementation was responsible for the change?


I don't and that is not why I'm testing either. I'm testing so the contract (i.e. public methods) other developers rely on does not change. That said, I can see in the build history what checkin caused the test to fail, so I would still know where to look.


Well yeah that's the view being taken to its logical conclusion here:

- Private methods are just an implementation detail with respect to the public methods. (You really care about whether the public methods reply correctly.)

- Public methods are just an impdet with respect to the module API. (You really care about whether the module's public surface replies correctly.)

- The module's API is JAID with respect to the user/game API. (You really just care about whether the gamer gets the right responses to controller inputs.)

- The user/game API is JAID wrt the code/gamer interface. (You really just care about whether the gamer likes what you've written.)

- The code/gamer interface is JAID wrt the company/customer interface. (You really just care whether people still give you money.)

- The company/customer interface is JAID wrt the investor/company interface. (You really just care whether the venture throws off more money than you put in.)

Nevertheless, you'd still like to catch the failures as soon and as narrowly as possible!


Absolutes and exaggerated conclusions are no good. As long as a test ensures we do not introduce old bugs (regression) and helps me refactor, to the point where the time spent writing and maintaining the test is less than the time I save, then it's golden. I just never had a need to test a private method - I do have to deliver a working piece of code, and unit and integration tests help me know I have met the spec.


Tests that help you fix failures are great - just be sure they're helping more than they're hurting.

Beyond that I think the key is to test the things you don't expect to change. The user always needs to be able to log in. But I expect developers will rearrange the private method boundaries.


To go further: define an API, write tests against that API, and then do a pass of dead-code analysis on the resulting library-plus-test-suite. Any private functions left uncalled by your public API can just be removed!


Let's just hope that function doesn't end up being the one that's invoked to adjust for leap years.


Funny, but good point.

To counter that I'd say you should have a test case to cover that the leap year handling works as expected. If you aren't testing that because it wasn't in the spec, then why would you have the code at all?


My approach when I find myself wanting to test private methods is often to extract a new class, for which the methods under question form the public interface (working with Java here).
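
For example (made-up OrderService/OrderParser names):

    // Before: the parsing logic is a private helper, only testable indirectly
    // through processOrder().
    class OrderService {
        void processOrder(String rawOrder) {
            int quantity = parseQuantity(rawOrder);
            // ... the rest of the workflow uses `quantity` ...
        }

        private int parseQuantity(String rawOrder) {
            return Integer.parseInt(rawOrder.split(";")[1].trim());
        }
    }

    // After: the helper becomes the public interface of its own small class,
    // which can be unit-tested directly; the service just delegates to it.
    class OrderParser {
        int parseQuantity(String rawOrder) {
            return Integer.parseInt(rawOrder.split(";")[1].trim());
        }
    }

    class RefactoredOrderService {
        private final OrderParser parser = new OrderParser();

        void processOrder(String rawOrder) {
            int quantity = parser.parseQuantity(rawOrder);
            // ... the rest of the workflow uses `quantity` ...
        }
    }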


Reminded me a lot of "for want of a nail..." (Look it up)


I wish I could fit that quote on a regular coffee mug!


> Steam could be a major force in turning around the recent slew of absolutely craptastic, completely broken game releases.

This is an interesting comment, because the primary suspected reason for the implementation of Steam Refunds was Steam Greenlight, which allows anyone to upload a game to Steam; after community vetting, the game can be released on the store proper.

Of course, many of those games were either poorly tested or used assets which the developers claimed as original but were not. Steam Refunds fixes this in most cases.

Jim Sterling has a good showcase of these games: https://www.youtube.com/playlist?list=PLlRceUcRZcK17mlpIEPsV...


The absolutely dreadful launch of Assassin's Creed: Unity probably did not help. Consumer confidence was dropping fast; having a refund option encourages trying new things and takes some of the risk out of AAA pre-orders.


> Steam could be a major force in turning around the recent slew of absolutely craptastic, completely broken game releases.

> QA/QC should not happen at the customer level.

With the exception of games running under Linux https://news.ycombinator.com/item?id=9757382 :).


Several console-first game companies are outsourcing their PC port development.

Sometimes it doesn't work out, like with Batman. It lacks various shader effects like rain and looks graphically worse than on PS4. So the developers already knew the performance was a problem in their PC port and left at least two advanced effects out.

In general, give the PC more love; it is more popular than one might think (from an international viewpoint). The PS4 is a pretty strong platform (22+ million sold), which cannot be said of the competitors' console platforms (around 10 million or less).


Exactly. There are only twenty people working on the PC version according to Batman's credits.


That's not a small team for a PC port...


http://www.shacknews.com/article/90184/batman-arkham-knight-...

There's a screenshot circulating that has 12 people listed. And I assume those aren't all developers.

But count is neither here nor there. Who knows how much was multi-platform middleware and how much custom code needed to be reimplemented.


Sorry, twelve. Not sure where twenty came from.


If this sort of thing is par for the course for buggy PC ports going forward, then I guess Steam refunds will be a positive force. I gather Valve added the new refund system in response to EU consumer protection laws. I'm glad they exist, maybe the US should have laws requiring refunds on game purchases as well.


I think they added it in response to increasingly bad PR about their atrocious customer support.


[deleted]


I never noticed any issues with GTA V; it's gotten quite good reviews on PC too: http://www.metacritic.com/game/pc/grand-theft-auto-v


Might have meant GTA IV, which was a terrible PC port.


GTA IV was a port of the Xbox 360 version, which has 3 CPU cores. GTA IV was the first triple-A game that maxed out all three cores. Most PCs at release time still had dual (2) cores, albeit a lot faster cores than the Xbox 360's. So GTA IV never ran fine on a PC with just two cores.

Another thing was the bad driver compatibility with ATI/AMD graphics cards. For Nvidia cards, it was the first time ever that they released a special launch beta driver specifically for GTA IV on PC.

Besides that, to run at very high settings one needed a high-end graphics card. GTA IV was, along with Crysis 1, the last PC game that pushed hardware to the limits. Even today both still look great, and with certain mods they look as good as GTA V (though in the latter case the engine is more optimized). Nowadays, GTA V is an Xbox One port and it doesn't push PC hardware to the limits; it uses just 25% of my CPU and 50% of the GPU running at 60 fps in 1080p.


Part of releasing any game on a range of hardware should be addressing the question "what type of systems are actually in the hands of our users?" and making the game run well at all of the settings the game supposedly offers. GTA IV failed at scaling down to the minimum (or even "medium") requirements they listed.

If you're releasing a game that only works well for next-generation specs but advertising current generation support, you get the reputation you deserve.


Maybe the reason everything runs so well for you at 1080p these days is that the cream of the crop in PC gaming now is way higher? GTA V on PC will do 4K, but you'll need a beefier computer.


What was wrong with GTA V? The only issue I've heard of was the Unicode user name issue, which is arguably an edge case.


I, and many other people regardless of graphics card, CPU, or RAM, are experiencing an issue where the game slows to a stuttering crawl after a while. Adding RAM extends the playable time period, but only staves off the inevitable. There are many lengthy threads on the topic, with people suggesting things like putting the pagefile on an SSD to ameliorate the bottleneck, or using third-party programs to aggressively clear caches. For every homegrown solution, there are people who claim it doesn't work for them. Google "gta v pc stuttering" for an overview.

Seriously considering asking for a refund myself.


I don't have the game to confirm, but a lot of folks are reporting that the actual game files are ~20 GB less than the recommended space requirements.

It makes one wonder if the game is actually missing a huge chunk of files.


Maybe they added a bunch of headroom because storage is cheap and they want to make sure you don't run into problems with caches, swapping, DLC, etc.


Could be that the installer needs a certain amount of swap space.

Alternatively, it could be "we tested it with this much free and it worked, we don't have time/budget to test again with less space."


We still have people planning to "port" software? I thought we had all learned how to design things cross-platform by now. This isn't 1995.


Games make more extensive use of the hardware (and hardware specific libraries) than your typical CRUD application.

A layer of abstraction above the different libraries can help, but often at the cost of a performance hit that you don't want in games.

Different input mechanisms (mouse and keyboard vs gamepad) need different input schemes to be fun.

Console games are usually played sitting on a couch in front of a TV. Sitting at a desk with a computer monitor much closer can also make a difference in what UIs (eg game HUDs) work well.

I have no clue what was involved in porting the Batman game, but your blanket statement about porting as a solved problem doesn't cut it.

An interesting example of a well-done port from PC to iPad is the game FTL. They changed the interface to feel native on the iPad. Recompiling the code was probably the least of their worries.


The same company that failed at porting Batman: Arkham Origins to PC was hired to port Arkham Knight.


Does anyone remember what the X in Xbox stood for? =)


I'm not sure what you're suggesting here.

The X in Xbox is a leftover from its early development when it was the DirectX Box.

The X in DirectX came about as shorthand for all the various DirectWhatever APIs that Microsoft was introducing in the late 90s.

EDIT:

If the suggestion is that the porting should be relatively easy because the Xbox runs DirectX, well, kind of. It's not a 1:1 mirror of the API available on Windows, and the actual hardware is fairly divergent from what you find in PCs, even if it is x86 again.


Depending on one's definition of "fairly divergent", that's not really true. It is a near-standard GeForce 3 chipset, just without some legacy stuff like a keyboard controller. Other than that it has a 733 MHz Celeron CPU, an Nvidia GeForce 3-class GPU, 64 MB RAM, an 8 GB hard disk, an (admittedly non-standard) DVD drive and an nForce Ethernet card. Controller ports are USB underneath.

So yeah, some hardware differs but it is mostly cosmetic and as a software developer it shouldn't matter to you. See https://web.archive.org/web/20080209140715/http://www.xbox-l... for more information.


As castell pointed out, I was actually making reference to the Xbox One there.

I did reference the original Xbox in a later post, and I mostly had the GPU in mind there. The NV2A is close to the GeForce3 in design, but it's still not quite the same chip that was available to consumers. It was kind of a stepping stone between the GF3 and GF4 and was designed to support some things that would later show up in DX9 while not truly being a DX9 GPU.

Close, but not quite the same as standard PC hardware, and that in-between generations nature of the hardware and APIs means that optimizing for it as a target platform is going to require extra time to get it working as well on "true" DX8.1 or DX9 hardware.


Your parent meant the Xbox One (i.e., the third Xbox) in his last sentence; you wrote correctly about the original Xbox.


Wow, my bad! I had no idea they switched back to an x86 architecture after the 360.


Both the Xbox One and the PS4 are running on very similar AMD APU platforms, with Sony opting for a variant with considerably more GPU power.


The overarching point was that Xbox was supposed to eliminate all this hardware/driver/platform incompatibility, wasn't it?


I don't know that Microsoft has ever made that claim.

The original Xbox was the closest to standard PC hardware, and even it diverged somewhat from what was available on standard PCs at the time at both a hardware level and API level.

The common APIs make it easier to develop cross-platform, but the platforms are still different.


Nope. Citation needed. That never was the point.


Yes, but the PlayStation 4, SteamOS, and OSX all use OpenGL instead.


OpenGL was available on the PS3 but wasn't used by AAA games. LibGCM was the PS3's main graphics API. For the PS4, GNM is the primary graphics API:

http://www.eurogamer.net/articles/digitalfoundry-how-the-cre...


> OpenGL was available on the PS3 but wasn't used by AAA games

Not even real "desktop" OpenGL, but OpenGL ES 1.1 (branded as PSGL) implemented on top of LibGCM.

Also, not to say you're wrong (almost no games used any of PSGL), but at least several games used some of it, e.g. Techland titles like Call of Juarez / Dead Island.


Not even OpenGL ES, since PSGL did not support the OpenGL shader language, using Cg for shaders instead.


Just checked the documentation. Actually it is OpenGL ES, but based on OpenGL ES 1.0, which doesn't really have any shader support.

So it's GLES 1.0 + some 1.1 features + more additions.


Is there any wonder that it wasn't widely used then?


Lol no. The PS4 doesn't use anything close to OpenGL; the overhead would be too large. I mean... you can use OpenGL if you really, really wanted to, but no game developer actually would - you would be losing performance for nothing.


They are using a modified version of Unreal Engine 3. (Much like X-COM, and its upcoming sequel.)

So this is less of a port, and more of a deployment to a platform that widely varies in terms of performance. And this is much easier to do when your graphics aren't so "next gen".


Thanks; targeting wildly different levels of performance is another thing. Yes, there's not just one PC platform.


And issues like these are why you should never preorder PC games. There's an unlimited number of copies, so it's not like you will be waiting weeks for something. Is a tiny preorder bonus really worth the risk of gambling your money?

Now that Steam has introduced refunds, you can refund a game if you haven't played more than 2 hours and a few weeks haven't passed.


The only time I'll pre-order a game is when the game won't get made otherwise (like Kickstarter or Early Access), and I have loads of faith in the developer. Luckily I haven't been burned yet (though I consider Star Citizen a near-run thing until it delivers). Bonus if the game is already in a playable and enjoyable state.

Pre-ordering a AAA game? No thanks. Most pre-order bonuses aren't worthwhile because they can't offer people too much or retail purchasers will get pissed off.


If it's a small company, you might want to give them money up front to help them stay afloat. Steam has early-access games for that reason.

(It's better than pre-order, since you get to play the unfinished game earlier, if you want to. If you choose early-access but wait for the official release before you run it, it's exactly the same as pre-order.)


Right, but you should only give them money if the product that they have on Early Access is already fun, right?

Take Rust (the game, not the language) for instance. They cobbled together what is now known as "Rust Legacy" as a sort of demo of what they wanted the game to be. Thousands of people bought it. Now Facepunch is using the funds from all those sales to completely rework the game from the ground-up. Now it's better than Rust Legacy, and only going to get better and more interesting.

But if Rust Legacy wasn't fun, nobody would've bought it. TL;DR: don't preorder (including early access!) games purely on promise alone.


What if what they have so far isn't fun, but you think it might become very fun if they were given enough money to finish it? Isn't that (sometimes) worth taking the risk on?


To steal an idea from Bungie, when it comes to games you want "thirty seconds of fun" -- a base gameplay loop (in the case of Halo, where it originally applied, it's the loop of taking out baddies from a distance, then the stronger baddies from mid-range, then mopping up the scattered weak baddies). An early demo (like the aforementioned Rust Legacy) is a proof-of-concept of that base gameplay loop, and if players like it, they can fund it.

Once it's funded, they can build out from that base loop, and eventually you'll get a finished product.


Interesting concept! I wonder how it would apply to other genres, eg old school point-n-click adventures?


Then why not preorder to get the bonus and refund if it's broken? Looks to me like Steam solved the preorder problem when they introduced refunds.


They usually give away free DLCs when you preorder.


If the game is terrible you won't play those DLCs anyway. If the game is good you will have reason to purchase the DLCs when they are released.

Pre-ordering is designed to specifically work around bad reviews a.k.a. releasing poor quality games and getting away with it. The publisher is screwing you and you are paying them to.

Big publishers have no other reason to implement pre-orders. It is a shit-game risk mitigation strategy.


Pre-orders also reduce the amount of working capital required to make a game. This is, obviously, doubly important if you're a small company with very little working capital...


Do pre-orders (non-Steam) go to the developer? My understanding is that they stay with the retailer, who will use that to order stock. For Steam, they have to keep some cash on-hand somehow to handle refunds.


40% off at GMG is one reason.

Limited editions are another. AKA, I WANT MY PIP BOY.


I'm not surprised they suspended it, but I wonder why they even released it. The vast majority of games that are released for consoles end up having a delayed PC version, so why didn't they just delay it, even for a short time?

For what it's worth the game is really fun to play on a console.

Slightly off topic: am I the only one who had to open developer tools to hide the huge spoiler modal because no key combination / refresh / mouse clicking would get rid of it? That was frustrating.


Should've clicked the big link at the bottom of the modal which says "Continue to forums".


This was not displayed for me, smart ass. Their CSS doesn't work very well at all resolutions. No matter what I tried, I couldn't see the bottom edge of the modal.


Delayed PC versions, like GTA V's, have mainly business reasons. The PC version is usually the most advanced version, with the highest graphical detail. Some people bought GTA V for PS3/X360, later for PS4/Xbox One, and later again for PC.


I bet it was an executive decision. The game was already delayed. Twice. They probably decided to push out what they had for PC just so they wouldn't have to announce another delay. For the PC version only, of course. While simultaneously releasing the console versions.

My guess is, they thought that even though it is buggy, they could fix it with patches and would only get minor flak for it from PC users, and that this would be a better choice than angering every PC gamer by announcing a third delay.

Well, now it is clear that this was the wrong decision. They underestimated both how broken the game was and how incomplete it was compared to the console version, and they forgot about Valve's newly introduced refund system.

WBGames can claim that they suspended this game for PC but if I were to venture a guess, I'd say it was Valve that suspended this game because they were tired of processing so many refunds. They would have said "A game that gets this many negative reviews and all are claiming horrible performance and missing effects must clearly be broken and therefore we shouldn't continue to sell it as it will only lead to even more refund claims."

A couple of interesting things to note:

1. They removed the PC logo from all Arkham Knight pages.

2. The company itself tells people how to request refunds.

3. The company also asks for patience while this is being worked out.

The last two are somewhat contradictory to me. When a company, of its own free will, tells customers how to get refunds, it sounds like an admission of failure and that they do not believe this can be fixed in a timely manner. And yet they still ask for patience, which sounds like they are working on a solution. Paired with them removing all the PC logos from their promo pages for this game, it really seems weird.

While this announcement was leagues better than the marketing speak of their first announcement, I think customers still need more, and clearer, information about what is going to happen. People who deal in software already know this, but the average customer has no idea how long such things can take. Therefore they should explicitly announce what they are going to do.

To me, it doesn't sound like patches will do the trick. From all accounts, this PC version sounds broken by design, and the root causes are stuck deeeeeep in there. So, if they plan to fix this, then they should announce that an entire rewrite is needed and that this takes time. While at it, they should also tell us who will do this rewrite. I am sure no one wants Iron Galaxy at this again. We want Rocksteady to do it, because we know they actually can make a Batman game - as evidenced by the PS4 version of this one and by the first two games in general, both of which ran fine on PC.

I have already heard speculation that they might try to release this in the fall, when the SteamOS and Mac versions of the game are scheduled. One can only hope, for the sake of the Windows version, that Rocksteady will not be busy writing those versions and continue to outsource the Windows version to the same crack squad that arsed it up this time.


The Mac and Linux/SteamOS versions are being outsourced to Feral. They've done a good job with XCOM on Mac/Linux and the previous Batman games on Mac so I have confidence in them doing a decent job there. It'd be interesting if their OpenGL implementation actually ends up being better than the Windows DirectX version, but I suspect the issues are at a higher level.


Didn't Valve pull Brink shortly after release?


I was on mobile and I found no way to close that modal.


With my low-end computer (previously mid-range), I've found that even with driver updates and patches, launch performance in games never improves by more than a few FPS. Now, to my knowledge, the problem with Knight is that it runs poorly and is capped to 30. If the pattern continues and Rocksteady can't reliably get it up to 60 (a 2x gain!!), then what happens? The game remains suspended forever?

(In other news, I was really hoping to maybe eke by playing Knight at low/30fps on my Macbook w/discrete Nvidia graphics, but I guess that's not going to happen.)


Here's the thing: you're not the person who's raging out over this. Yes, you're part of the demographic who might be upset, but it's people like me, who have stuff like 2x(or more) 980/295x (or better), 3000+MHz DDR4, x99 Processors/Motherboards, all overclocked, with custom cooling solutions and hooked up to G-Sync/FreeSync displays, that are upset. While graphics are not everything, and I play plenty of non-AAA games, a good chunk of the rationale behind even buying AAA games is beautiful graphics and high framerates. With the shit drivers from NVidia, and the absolute tripe so many studios are pumping out these days, I'm questioning the ongoing viability of high-level desktop gaming solutions.

Then, to have the absolute audacity to call a game AAA when it's hard capped at 30fps on PC is just an absolute act of spitting in our faces.


Yeah, it sucks. Unfortunately, it's been sucking for mid-range gamers for a lot longer than that! The idea that a gaming PC can last you for an entire console cycle is a dirty, dirty lie. I guess the bloat is finally seeping up to the high-end.


A decent PC lasted a long time during the previous console generation but with this new generation and its unified 8 GB of memory the VRAM requirements on PC have gone up a lot. WB claims a minimum of 3 GB for AMD users to play Arkham Knight. A new graphics card with > 6 GB of VRAM will probably last quite a few years from now.


> A new graphics card with > 6 GB of VRAM will probably last quite a few years from now.

Isn't the Titan the only card with more than 6 GB of RAM? You'd hope this would last a few years!


Nvidia's 980ti and AMD's R9 390 and 390X (and some 290X) have 6 or 8 GB.


On the plus side, at least with that hardware you can probably run the game at a playable framerate. Of course, "playable" isn't really acceptable on hardware like that, but it sounds like at the moment anything less than dual high-end graphics cards gives unplayable performance.


I'm a gamer too, and the trend of craptastic console ports is troubling. There are exceptions. GTAV is awesome even on ageing hardware. I have a 660ti and a phenom II, and the game runs wonderfully.

I'm hopeful that the coming VR PC exclusives will lift the quality of PC gaming.


GTA 5 is the standard to which all console ports should be held. It's almost flawless. Not only a great game, but impressive from a technical standpoint.

Except for Rockstar Social Club...


GTA5 took a long time to come to PC.


I can see how that would bother impatient kids.


Interesting that you're spending orders of magnitude more on the hardware than on the games.


I actually do make full use of the hardware decently often for data crunching/analysis, and I've messed w/ CUDA a bit before as well. Ultimately, my steam account has probably had more money spent on it than this box...but not by much. ;)


Why is that interesting?


Because the effect achieved is a function of game+hardware. For a total budget of X dollars to achieve the best possible result, the hardware is a much larger component of the budget. Would people spending thousands on hardware also be willing to spend more and wait longer in order to get higher-quality video in games? Is that what the PC version is "supposed" to be about, compared to the standardised console product?

Audio climbed this curve and has topped out. Video games haven't really explored it.

(evidently it's uninteresting enough that someone's downvoted it ..)


I didn't downvote, but the whole point of the PC as a game platform is to get the best possible experience. That usually comes with a higher price tag that many are willing to pay.

Very high-end systems are obviously more niche, and the prices involved there shouldn't be indicative of anything other than enthusiasm by hobbyists. Enthusiasts in any field or hobby are usually willing to spend lots of money to get a 'superior' experience.

As for waiting for higher quality I think yes, many would be willing to wait. GTA V came out nearly two years after the console releases and I was one of those who waited for it.

It's not like I was aching with anticipation for it, I had more than enough games in my backlog to play and when it did eventually release it was much better than the console versions.

It's not like GTA V is indicative of multiplatform releases though. Most will release on PC at the exact same time and not suffer the kind of issues we have seen with Arkham Knight.

Arkham Knight's abysmal PC release was due only to a shitty port by a small third-party studio that was obviously incapable or unequipped to handle it. I don't think I'd even really blame Iron Galaxy for this either; I blame Warner Bros and possibly even Rocksteady for thinking outsourcing the port was a good idea.


There's a nice side benefit here for the fact that I'm always at least 6 months behind games.

I don't have enough time to play games as much as I'd like (i.e. as much as I did as a teenager), so I only play a few selected highlights, and given that, it makes no difference to me if I'm playing the cutting edge or what came out 6 months ago - it normally takes me 6 months to finish something.

This has lots of benefits: You get the community consensus on what is worth playing, not the marketing hype. The game is usually cheaper. The major bugs have been filtered out.


Totally agreed. Why pre-order to get the exclusive content when you can have it 6 months down the line in the "Game Of The Year Edition" - and usually much cheaper and with all of the bugs ironed out. What's not to like? :)


Steam refunds, in conjunction with the recent political mobilization of gamers, will change the gaming landscape for the better, allowing developers and consumers to stand in a relationship that begins to approach good faith. This seems to be the first AAA reaction to the new situation, but we have already witnessed the complaints of Steam crapware/shovelware producers, as well as the exit from games of some bad actors (such as Tale of Tales).


That forum must be really heavily moderated/censored. Based on eg the Battle.net forums I'd have expected to see much angrier posts.


The corresponding announcement on Steam has what you're looking for: http://steamcommunity.com/games/208650/announcements/detail/...


Technical question - I've seen reports that, "...people who have installed the game to SSD's are apparently getting better performance" [1].

Which implies, to me (both intuitively and having worked on a lot of high performance I/O limited use cases) that part of the issue is read/write speed from the disk? But if that's the case, what kind of software architecture makes sense for a game to have (apparently) real time data processing pulling resources from the disk?

Obviously you need to load maps, game assets, cutscenes, etc., but this makes it sound like there is constant polling every nth frame?

[1] - http://www.pcgamer.com/batman-arkham-knights-launch-appears-...
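
To make the question concrete, I'd imagine something vaguely like this (purely hypothetical names, not Arkham Knight's actual code): a worker thread pulls chunk-load requests off a queue, while the game loop just asks each frame for whatever has finished loading.

    import java.util.concurrent.BlockingQueue;
    import java.util.concurrent.ConcurrentHashMap;
    import java.util.concurrent.ConcurrentMap;
    import java.util.concurrent.LinkedBlockingQueue;

    class ChunkStreamer {
        private final BlockingQueue<String> requests = new LinkedBlockingQueue<>();
        private final ConcurrentMap<String, byte[]> loaded = new ConcurrentHashMap<>();

        ChunkStreamer() {
            Thread loader = new Thread(() -> {
                try {
                    while (true) {
                        String chunkId = requests.take(); // wait for work
                        loaded.computeIfAbsent(chunkId, this::readFromDisk);
                    }
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                }
            }, "chunk-loader");
            loader.setDaemon(true);
            loader.start();
        }

        // Called from the game loop every frame (or every nth frame) for the
        // chunks around the player; it never blocks on disk I/O itself.
        byte[] chunkFor(String chunkId) {
            byte[] data = loaded.get(chunkId);
            if (data == null) {
                requests.offer(chunkId); // queue a load; show low detail meanwhile
            }
            return data;                 // may be null this frame
        }

        private byte[] readFromDisk(String chunkId) {
            // Stand-in for the real file read -- this is where HDD vs SSD
            // latency would show up.
            return new byte[0];
        }
    }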


I think this is an honest and fair decision towards the consumer base. Perhaps they can offer one of the two previous Batman games to make up for the time lost before the update arrives. Let's hope the patch resolves the issues.


They must've just run this one through a shell script to port it. Seems like they put the least amount of effort they could into the PC version.


It seems there are quite a few issues with this port. I wonder if this is a result of rushing QA?


Stuff this dramatic would have been noticed by QA, and almost certainly known to the programmers. The publisher decided to ship it in this state.


It should be noted that Warner Bros also recently published MKX on PC, which had similar issues. http://www.gamespot.com/articles/mortal-kombat-x-pc-patch-pu...


Someone uncovered a Steam ID that suggests the PC developer Iron Galaxy only received the game 2 months ago: https://steamdb.info/sub/45433/


All that means is that they started the steam integration two months ago. That isn't an unreasonable time frame.


A lot of game companies develop their AAAs with PS4/X1 in mind and the PC port is more of an afterthought. Though not the case here, it is usually released up to 6 months after the console version. I believe PC versions of games are often done by an external (or at least different) studio, and they are also assigned less Dev/QA resources. The responsibility for the quality of the PC port does not always fall on the management in charge of the console version, so a lot of faults can slip through the cracks.


The previous two Rocksteady games (Batman: AA and Batman: AC) were developed concurrently for 360, PS3, and PC. The first-party development resulted in excellent games that were highly regarded on all the platforms.

Origins was developed console-first and ported by Iron Galaxy. The port was atrociously bad. You could not finish the game; the main storyline was bugged so hard that you could tell the developers had never done anything more than walk around the first mission.


Not at first though.

They had to patch to get it into a good state.

I had one game-breaking bug with Origins, but I found a workaround on YouTube. I think Origins gets too much crap, personally; I had fun.

This port is one of the worst in a long time. This isn't Bethesda, where it is just bugs on bugs; they capped the FPS.


[deleted]


Regardless of how competent the studio was, if they decided they wanted to hit a ship date for the original PS4 game and the two ports no matter what, and the port wasn't ready, this is what you get. People in QA or Programming over at a porting house don't have the ability to tell a publisher to slow down.


Not true: Iron Galaxy, who did the Arkham Knight PC port, also did the Arkham Origins PC port: https://en.wikipedia.org/wiki/Iron_Galaxy_Studios

IIRC, that port had technical issues, but not to the scale of Arkham Knight.


Rocksteady didn't do Arkham Origins. Rocksteady is only responsible for Arkham Asylum and Arkham City.


"Technical issues" is an understatement. The main storyline wasn't completable.


Which they eventually fixed, only to then say 'no more patches. We're moving on to DLC'


[deleted]


Runs great? It's hard-capped at 30 FPS as far as I understand, and higher-quality textures/effects that are in the console versions were intentionally removed.

Not everyone is used to the difference in appearance, but that's hardly what most people would call "great" for a $60 AAA title.


Especially when you have 'ports' like GTA V, which in reality was a different version of the game for PC, which ran like a dream.


The 'price' you had to pay for that was waiting 6 months after the console version came out before getting the PC port.


It was over 12 months I'm pretty sure. And it was absolutely worth it. I already owned the game on Xbox 360, and was more than happy to buy the PC version.


It was just under 2 years.

    PlayStation 3, Xbox 360
    17 September 2013

    PlayStation 4, Xbox One
    18 November 2014

    Microsoft Windows
    14 April 2015


Does anyone know whether the Xbox version is ok?


was there a confederate flag on it?!



