
Dwarf Fortress will crush your CPU because it isn't, and likely never will be, multithreaded.


Odd considering that several of the CPU-intensive procedures described in the article (rejection sampling of terrain, generation of independent feature fractals) are easily parallelized.
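For anyone curious what "easily parallelized" means here: rejection sampling just generates candidates and tests them independently, so worker threads never need to coordinate. A minimal C++ sketch, with an invented Region type and acceptance test (nothing to do with DF's actual internals):

    #include <atomic>
    #include <cstdio>
    #include <mutex>
    #include <optional>
    #include <random>
    #include <thread>
    #include <vector>

    // Hypothetical stand-in for whatever worldgen actually samples.
    struct Region { double elevation_variance; };

    Region generate_candidate(std::mt19937 &rng) {
        std::uniform_real_distribution<double> d(0.0, 1.0);
        return Region{d(rng)};
    }

    // Invented acceptance test, e.g. "mountainous enough".
    bool acceptable(const Region &r) { return r.elevation_variance > 0.9; }

    int main() {
        std::atomic<bool> found{false};
        std::optional<Region> result;
        std::mutex m;

        unsigned n = std::thread::hardware_concurrency();
        if (n == 0) n = 2;
        std::vector<std::thread> workers;
        for (unsigned i = 0; i < n; ++i) {
            workers.emplace_back([&, i] {
                std::mt19937 rng(i);  // per-thread RNG: no shared state at all
                while (!found.load(std::memory_order_relaxed)) {
                    Region r = generate_candidate(rng);
                    if (acceptable(r)) {
                        std::lock_guard<std::mutex> lk(m);
                        result = r;
                        found = true;
                    }
                }
            });
        }
        for (auto &t : workers) t.join();
        if (result) std::printf("accepted variance %.3f\n", result->elevation_variance);
    }

The only synchronization is at the very end, when a candidate is accepted; the sampling itself is embarrassingly parallel.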


Those take place during worldgen, though, which is a one-time event before gameplay begins. Where most players actually experience CPU problems (and where it matters) is during gameplay. The world is effectively frozen after world creation (though I think the latest version does allow events to keep happening), but what has historically been the source of performance problems is dwarf and animal pathing, large numbers of items, and water/lava flow. Even though you can generate these huge regions, the average fortress is built on a pretty small map.
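To put the fluid cost in perspective: DF-style liquids behave roughly like a cellular automaton, so the cost per tick tracks the number of active wet tiles, not the size of the world. A toy C++ sketch with invented names and deliberately simplified rules:

    #include <array>
    #include <cstdint>

    // Toy model: each tile holds a water depth of 0-7 (as in DF's
    // visible depth levels), and each tick every sufficiently deep
    // tile levels out into its shallower 4-neighbours.
    constexpr int W = 48, H = 48;
    using Map = std::array<std::array<uint8_t, W>, H>;

    void flow_tick(Map &m) {
        for (int y = 1; y < H - 1; ++y)
            for (int x = 1; x < W - 1; ++x) {
                if (m[y][x] < 2) continue;  // too shallow to spread
                const int dx[4] = {1, -1, 0, 0}, dy[4] = {0, 0, 1, -1};
                for (int i = 0; i < 4 && m[y][x] > 1; ++i) {
                    uint8_t &nb = m[y + dy[i]][x + dx[i]];
                    if (nb + 1 < m[y][x]) { ++nb; --m[y][x]; }
                }
            }
    }

    int main() {
        Map m{};               // all dry
        m[H / 2][W / 2] = 7;   // dump a column of water in the middle
        for (int t = 0; t < 100; ++t) flow_tick(m);
    }

The point is the shape of the work: every active tile is visited every tick, which is why breaching an ocean or aquifer hurts so much more than a big but static map.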


Doesn't that mean that DF will only "crush" one core rather than all of them?

Is it really a big concern how well DF will scale? It's a game, it is made to be played. It isn't really meaningful to criticize games for whether they are multithreaded or not.


Yes, it will crush a core.

It actually is a concern, as DF in-game (dwarf mode, anyway) does a huge amount of pathfinding for the little buggers and other creatures, not to mention item tracking and other bits.

This means that once your fortress gets somewhere around 200 dwarves you start getting into FPS death, because the game doesn't scale. So it actually is a problem, and development of the game is going against the grain of modern computing.

I'm not downing on it at all; DF is Tarn's forever project, and a damn good one. He, by design, doesn't owe anyone anything. But the reality of it is that it only gets less and less playable over time, and it will not be able to reach the mass civilization simulation that many of its players want as more features are added and single-core speeds remain relatively unchanged.


> He, by design, doesn't owe anyone anything.

A commercial game would have a tutorial mode and other bells and whistles to make it more approachable. Thing is, I have one hour a day before sleep to screw around, so is it Bob's Burgers or Dwarf Fortress?


> A commercial game would have a tutorial mode and other bells and whistles to make it more approachable.

Does Jackson Pollock have a tutorial? Duchamp? The beauty in Dwarf Fortress is that it doesn't cater to the lowest common denominator. It's the vision of Tarn and Zach Adams manifested without the filter of commercial interests.

Nobody's opposed to a tutorial (and there are plenty of community-created ones: http://dwarffortresswiki.org/index.php/DF2014:Tutorials ).

Commercial games are like corporate art. A company would never create a game like Dwarf Fortress.

Alternately, you can spend your hour before bed watching reality television.


I'm confused by what you're getting at here.


Back when I was hacking on DF ~4 years ago, I believe it was actually the item tracking that was ruining performance in big forts. The main loop had to touch every item in the game every frame, including things like every rock mined by your dwarves. You can usually get a significant speedup in big forts just by destroying all the spare rocks (I think the best way to do this in-game was to put them under a bridge and then close it on top of them).
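For anyone who hasn't profiled it: the pattern described above is a linear scan over the entire item list every frame, so the only real fix is shrinking the list. A C++ caricature with invented types (not DF's actual internals):

    #include <vector>

    struct Item {
        bool needs_temperature_check;
        // position, material, wear, container, ...
    };

    // The pathological pattern: every item, every frame, including
    // inert mined-out rocks that will never do anything interesting.
    void update_items(std::vector<Item> &items) {
        for (Item &it : items) {
            if (it.needs_temperature_check) {
                // recompute heat transfer, freezing/melting, ...
            }
            // check decay, hauling eligibility, and so on
        }
    }

    int main() {
        std::vector<Item> items(2'000'000, Item{false});  // a mature fort
        update_items(items);  // this much work, every single frame
    }

That's also why the bridge trick works: atom-smashing spare stones makes the list itself shorter, rather than making any single item update cheaper.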

I ended up having more fun hacking on DF than playing it, but it is an interesting game.


Pathfinding was the other big CPU problem; better fortress design means better pathfinding, which means longer playtime before death by slowness.
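The cost driver in grid pathfinding is how many nodes the search expands, and that is exactly what fortress layout influences: central stairwells and wide, direct corridors keep searches short. A toy 2D grid A* with a Manhattan heuristic (DF's maps are really 3D, with traffic designations and walkability caching, so treat this as the shape of the problem rather than the game's algorithm):

    #include <climits>
    #include <cstdio>
    #include <cstdlib>
    #include <queue>
    #include <vector>

    constexpr int W = 64, H = 64;
    bool walkable[H][W];

    struct Node {
        int f, x, y;
        bool operator>(const Node &o) const { return f > o.f; }
    };

    // Returns path length, or -1 if unreachable; counts expansions,
    // which is where the CPU time actually goes.
    int astar(int sx, int sy, int tx, int ty, long &expanded) {
        std::vector<int> g(W * H, INT_MAX);
        std::priority_queue<Node, std::vector<Node>, std::greater<Node>> open;
        auto h = [&](int x, int y) { return abs(x - tx) + abs(y - ty); };
        g[sy * W + sx] = 0;
        open.push({h(sx, sy), sx, sy});
        while (!open.empty()) {
            Node cur = open.top(); open.pop();
            ++expanded;
            if (cur.x == tx && cur.y == ty) return g[cur.y * W + cur.x];
            const int dx[4] = {1, -1, 0, 0}, dy[4] = {0, 0, 1, -1};
            for (int i = 0; i < 4; ++i) {
                int nx = cur.x + dx[i], ny = cur.y + dy[i];
                if (nx < 0 || ny < 0 || nx >= W || ny >= H || !walkable[ny][nx]) continue;
                int ng = g[cur.y * W + cur.x] + 1;
                if (ng < g[ny * W + nx]) {
                    g[ny * W + nx] = ng;
                    open.push({ng + h(nx, ny), nx, ny});
                }
            }
        }
        return -1;
    }

    int main() {
        for (auto &row : walkable)
            for (bool &t : row) t = true;  // open field: near-ideal layout
        long expanded = 0;
        int len = astar(0, 0, W - 1, H - 1, expanded);
        std::printf("path length %d, %ld nodes expanded\n", len, expanded);
    }

Now multiply one search like this by a couple hundred dwarves and animals, re-pathing whenever jobs change or doors lock, and the FPS death mentioned upthread follows.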


This is the only thing that disappoints me about Dwarf Fortress: my CPU has gotten more and more powerful over the years, and despite that my FPS crashes faster and I have to drop to smaller and smaller worlds with every release.


What are you talking about? Of course it's meaningful to criticize a game for how well it uses the resources available. Not many applications are as resource-dependent as a video game. Word can be a little slow, but as long as it recognizes what I type, it's fine. Excel can hang for seconds while it calculates my formula. 3D rendering can take days, and that's to be expected. No one talks about frames per second in productivity applications. But look at how much crap WatchDogs got when it came out and people realized it was only using 70% of the available resources on AMD cards. Look at how many hundreds of dollars people spend on video cards and quad-/octo-core processors to get a little better performance. Meanwhile the workstation on my desk has a Core 2 Duo with 2GB of RAM, and I couldn't care less.

If video games shouldn't be developed to high levels of efficiency, what should be?


Yeah, sadly Toady has basically said, "I don't get multithreading, and it is so far down the list of things to do that it might happen in... who knows when."


I just wonder why this isn't 64-bit. It seems like it would benefit vastly from being able to use modern amounts of RAM, plus the extra registers.


Multithreading is a last-ditch effort, since you can only get your speedup from it once, and it adds a whole new class of hard bugs and dramatically complicates everything forever. For something like DF it is much better to work on smarter code first.
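The "only once" point has a name: Amdahl's law. If a fraction p of each tick can be parallelized across n cores, the whole tick speeds up by at most 1/((1 - p) + p/n), which flattens out fast, whereas algorithmic wins (path caching, fewer tracked items) keep compounding. Assuming, purely for illustration, that 60% of a DF tick could be parallelized:

    #include <cstdio>

    // Amdahl's law: overall speedup from parallelizing a fraction p
    // of the work across n cores.
    double amdahl(double p, int n) { return 1.0 / ((1.0 - p) + p / n); }

    int main() {
        double p = 0.6;  // assumed parallelizable fraction (made up)
        for (int n : {2, 4, 8, 64})
            std::printf("%2d cores -> %.2fx\n", n, amdahl(p, n));
        // Prints 1.43x, 1.82x, 2.11x, 2.44x: capped at 1/(1-p) = 2.5x
        // no matter how many cores you throw at it.
    }

Meanwhile a better algorithm, say turning an O(n^2) interaction check into O(n log n), keeps paying off as the fort grows.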



