A great example of not getting bogged down in analysis paralysis. Adams chose and continues to choose "sub-optimal" technologies for developing and running DF. Heck, last I checked he is still manually doing version control with folders and flash drives. As he states in the article, he is infinitely more interested in the design side than the technology side. The takeaway is that he is motivated to make his vision a reality and ships regardless of the tech choices. An imperfect project that ships is better than a perfect one that doesn't.
I always take the opposite view of the development of Dwarf Fortress. I think it is a shame how much the game has stagnated, and it is frustrating when the creator brags about not bothering to use version control.
I love Tarn and I think there is some wisdom in the way he develops, and he is clearly brilliant.
But I look at the state of Dwarf Fortress now versus Dwarf Fortress 5 years ago, and I do not see a huge amount of additional features or new design space that has been explored.
Even when Tarn decided "Hmm, maybe I should make some money from a product I am proud of and work extremely hard on" and added the Steam version, it has been a long road to get there.
The game still has FPS death when fortresses get to a certain, actually relatively modest, size. This is a big deal: it means people cannot dig in as deep as they want with their fortresses. Imagine if your Skyrim game choked to death after you had invested months into a character, because of an unspoken, undetectable limitation.
It is an impressive project, but the lesson I always take from his talks is: "Take your time. Build it right. Someday you may have to make a new GUI front-end, and I don't want that to take multiple people multiple years to accomplish."
> Even when Tarn decided "Hmm, maybe I should make some money from a product I am proud of and work extremely hard on" and added the Steam version, it has been a long road to get there.
"Not long ago, he explains, Zach had a cancer scare. Even with a good insurance plan, the costs to tease things out were high. When Adams looked at his own plan he was shocked.
“We were looking at the health care prices,” Adams says. “If it had been me — we grew up in the same town, right; we have all the same doctors and everything — I would have been wiped out.”
“It’s just scary,” Adams says, referring to the health care system. “The whole thing. So that was on our minds. But also, the whole crowdfunding environment has been a little bumpy, just in terms of how Patreon changes. [...] That’s just how it is in games in general. People just feel kind of a little squeezed right now.”
And so the brothers at Bay 12 are doing something they thought they’d never do. Nearly 16 years into the project, they’ve pulled the trigger on a plan to make a proper commercial release. But how they’re planning to do it is the remarkable part."
FPS issues are what eventually killed Dwarf Fortress for me. If you want to do anything interesting, be prepared to watch the game slow to a crawl.
That, and construction of anything being a nightmare, as your dwarves will get stuck in every conceivable way trying to build even a simple wall more than a meter high.
That said, I have spent hundreds of hours on the game with some great stories to tell so overall I can't complain too much.
It is certainly not. It's a well-known limitation.
And it's not even specific to Dwarf Fortress: with most "simulation" games you really can't have your cake and eat it too.
Some games sidestep it with "make-believe" simulation, where they reduce simulation resolution to arrive at something plausible, but not necessarily correct.
But that's not what people are playing DF for. I like my pinkies damn well simulated.
That entirely depends on your objectives; pragmatically, of course there are worthwhile optimizations to be made, from _my_ perspective, but my idea of optimization is not Tarn's.
Over the entire development of DF an afternoon or two to try out git or similar would have been a drop in the bucket. There's a lot of (useful!) ground between using flash drives and trying out a hot new version control workflow every week.
Absolutely agree. The single-master-branch, commit-when-you-feel-like-it workflow is very simple, and still miles better than just copying files in the file system.
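The whole thing, more or less (a sketch; the commit message is whatever you feel like):

    git init                                     # once, in the project folder
    git add -A && git commit -m "today's work"   # whenever you feel like it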
He should care. The effort to learn git would have saved him time. Time he could have used to improve DF or to sleep, whatever. But instead he's still moving folders around.
I believe it's a balance. You don't want to spend your life just adjusting your IDE settings to the point that you never work on your actual project. But you shouldn't completely neglect anything that could improve your life with little effort.
I confess I'm guilty of "procrastination by configuration". I probably spent a year of my total career life just messing with my settings or getting the perfect git or vim config files.
I don't disagree he needs a source code control system. But it's his intellectual property, and right now he has some protection through redundancy. At least I certainly hope he keeps some USB sticks with a friend, because if not, one house fire and his livelihood is gone.
If he goes online he has to worry about someone hacking his account; if he keeps it at home, we're back to what his backup strategy is and what his redundancy model looks like, and that's just not losing anything he already has. Next he needs to learn git well enough to be effective. Not that it's hard, but if you've never used any SCM before it can be intimidating, and all of it takes time.
It may just be easier to say later, later, later because you don't want to take however long it'll be to move, and you're thinking of all the lost development time. Of course, if he ever really needs it (house fire, tornado, hurricane) the price will then seem quite cheap.
In the meantime I'd at least find a way to backup off-site even if it's scp'ing passworded zip files to an Azure VM.
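For instance, something like this sketch would do (the host and paths are made up, and I've swapped gpg in for the passworded zip, since classic zip encryption is weak):

    # encrypt locally first, so the remote machine never sees plaintext source
    STAMP=$(date +%F)
    tar czf - df_src/ | gpg --symmetric --cipher-algo AES256 -o "df_$STAMP.tar.gz.gpg"
    scp "df_$STAMP.tar.gz.gpg" backup-user@some-azure-vm:/backups/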
> But, who cares? If this worked for the developer, what's wrong?
Chances are it didn't work for the developer. I've been in places where people were versioning by folder and pretending it was fine, because they didn't want to learn how to do it properly. They'd come up with all manner of excuses that git/svn/hg/etc had already dealt with. "It's easier to just copy paste", "It's a waste of my time to spend time on a VCS", "What if we both change the same file". I've heard them all, and when I look back at the colleagues who said this, they produced a lot less than they ought to have. Not just because of version control; it's one among several software red flags (never commenting anything, not having tests, not making an effort to make things work on other people's machines, etc).
I'm not going to say anything bad about the DF guy, this seems like a strange foible, but in general not using version control is a major red flag that someone is actually a bit of a novice who is low on productivity.
Version control is the big one, because if you're productive you tend to write a lot of code, and you'll tend to have hit the same issues that everyone has when they're writing a lot of stuff.
> I'm not going to say anything bad about the DF guy, this seems like a strange foible, but in general not using version control is a major red flag that someone is actually a bit of a novice who is low on productivity.
Sure, not using version control in an environment where collaboration is to be expected (like in teams/companies where you work with others), I'd agree with you. But that's not Dwarf Fortress; it's just a single person doing the work. Obviously Tarn is not "low on productivity" or "a bit of a novice", since he has done something many of us will never do in our entire lives: create entertainment that will forever live in some people's memories. Just because he's not using Git doesn't mean it's a red flag, since using Git would basically give him zero benefits and just add more to learn for, again, zero benefits.
It depends whether the process is part of the art itself, or just the result. If someone wants to be the Excel artist (https://www.spoon-tamago.com/2013/05/28/tatsuo-horiuchi-exce...), that's cool even if other tools would give them more possibilities. But in this case, the 80s development style is neither part of the art, nor is it better in any way at all, so people keep pointing out that this decision is just objectively worse than doing the absolute minimum of modern version control.
I get your point, but it's also not entirely accurate to say the workflow and tools aren't part of the art.
Creativity is a fickle thing, and it's unlikely that steps X,Y,Z would produce the same inspiration as A,B,C.
I agree that modern version control would offer benefits, but maybe in moving to git Tarn would decide to sha1 everything internally, which would have some ultimate impact on the game that results.
> not entirely accurate to say the workflow and tools aren't part of the art.
That workflow and those tools are not something they advertise, are proud of, or concentrate on. We know about it from a random interview only, so I don't see a reason to count that as part of the DF-as-art.
I think you're talking about the artistic _product_, and the ethernet bridge is talking about artistic _process_. Like, you don't care what kind of paintbrush a painter uses, but it matters a lot to them.
* Ability to roll back to some older state. Often I have dismissed some solution only to later realize that I need it again, or parts of it. If I have ever committed it, it's still around. Even if it is an orphan commit now, or in the extreme case, if it was once in the staging area, you can still recover it later, at least until garbage collection has deleted it (see the sketch below).
Also the ability to see when each feature of the code was introduced.
That can help a lot with understanding code bases: people seldom comment their code properly, so any context you can get is good.
(In my opinion, really good code can often answer 'what' and 'how' without comments, but it has a hard time answering 'why', and an almost impossible task answering 'why not', i.e. 'why not this alternative approach?'
Whether the answer to that one is "we tried it and it didn't work" or "we didn't think of it, you are a genius for bringing it up" or something else can be important.)
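For the record, the recovery commands being alluded to look roughly like this (the sha placeholder is whatever reflog/fsck turns up):

    git reflog                      # every state HEAD has ever pointed at
    git fsck --lost-found           # dangling commits/blobs, incl. once-staged files
    git checkout -b rescued <sha>   # resurrect an orphaned commit on a new branch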
There is barely a week that goes by that I don't start off with a good idea but halfway through realize I've taken a wrong turn or bitten off the wrong chunk first. If it weren't for git, or more frequently JetBrains' local history, my output would be lower quality.
And what the Mikado Method teaches us is that when a refactoring gets out of hand, there's a much smaller refactor hiding inside it waiting to get out. Revision history is both a blessing and a curse here, but generally there are bits you can salvage without falling into Sunk Cost.
It prevents collaboration with others, the ability to roll back to a specific release to test a bug, and partially developing multiple ideas in parallel.
All of that is possible manually, but it is doing version control by hand. There are plenty of great tools out there to simplify that process. And if you start automating your manual custom process, you just end up with your own version control system, which is something additional you have to maintain.
Nothing until a catastrophe causes lost work, or if they wanted to collaborate with someone else.
In fact, right now the main DF developers are working with a partner studio to put a better UX on DF to sell on Steam. I bet an industry standard version control would be extremely helpful right about now.
If I remember right, svn used to be a bit more involved than e.g. git for just starting to track a local project. I think you used to need to set up a (local) server etc.?
They seem to have fixed that, so the barrier to entry is lower than it used to be.
Almost any version control is better than manual mucking around with files.
Git is probably the better investment if you are new to version control systems, even if just because everyone else uses it these days.
But svn would work ok-ish, too. And you can always convert your history later, if necessary.
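For what it's worth, the asymmetry (and the later conversion) looks roughly like this, with made-up repo paths:

    git init                                      # git: one command, done
    svnadmin create ~/svnrepo                     # svn: create a repository first...
    svn checkout file://$HOME/svnrepo project     # ...then check out a working copy
    git svn clone file://$HOME/svnrepo converted  # later: pull the svn history into git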
Maybe people are wired differently but I have always thought that git is way more intuitive than Subversion and I was more productive after 1 week of git than I previously had been after 1 year of using Subversion. I am sure Subversion has improved since then, but so has git.
* Syntax - in svn, different commands use different conventions for how to specify a revision
* In svn moving directories breaks "subdirectory as branch" abstraction. I.e. there are no actual branches. Only files and versions of them.
* "Svn up" can fail, leaving working copy in broken, unpredictable state. This breaks abstraction, revealing that svn is little but copying files around. Git forces you to save your work and only then merge. That doesn't fail even when it fails to merge cleanly.
* ahem. Revision is always a revision in SVN. It’s rN and all commands accept it.
* ahem. All branches in SVN are copies of an existing tree at a particular revision. Call it a branch or whatever you want. But if you make a copy of your main development tree, you make a branch and you can work on it in the same fashion as if it was git. Merge it back to trunk when done or continue your work and sync with trunk or other branches. The principle is the same.
* Failing “svn up” is an unexpected technical problem that occurs due to problems with the server or network. The problem should never happen in normal cases.
Two styles of revisions are super confusing to users like me. Git confuses me a bit, but not in command syntax. Hope an illustration helps to see what I mean.
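For example (revision numbers made up; the two styles are -r for "at a revision" versus -c for "the change made by a revision"):

    svn log -r 1234           # -r: "at revision 1234"
    svn diff -c 1234          # -c: "the change made by r1234" (same as -r 1233:1234)
    svn merge -r 1233:1234 .  # ranges exclude the left endpoint
    git show 1a2b3c4          # git: one spelling of "this commit", used everywhere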
You forgot to mention the "click on this one specific checkbox or your source code is now leaked to the entire internet forever" part. So actually, not that simple and potentially much much worse.
Except you also get a better history (assuming you write good commit messages), and it's all in the same folder.
You could also use git-send-email to make a mailing "list" with just yourself, and that way make it much easier to move between machines as you work. Or if you don't trust your email hosting provider, self host gitea or sourcehut (or have the option to later).
Finally, you also get the ability to better organize development with branches.
I don't even know git that well and came up with that off the top of my head.
Git doesn't require an internet connection. With git you can keep doing backups the same way and store the git repository in flash drives, Dropbox, Amazon Glacier, or anywhere you want. The main difference here is that you don't need one folder per version anymore.
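E.g. you can keep the exact same sneakernet habit, just with full history along for the ride (paths made up):

    git init --bare /media/usb/df.git    # a bare repository living on the flash drive
    git remote add usb /media/usb/df.git
    git push usb --all                   # every branch, every version, one folder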
Git has no checkboxes. It is purely command line. Are you talking about Github or Gitlab? I know Gitlab used to have this bug where you could easily accidentally give everyone access to all your repos.
Probably, but it will always be true that "one afternoon of X will make things better"; if it's working for somebody, I'm not going to tell them their workflow is wrong.
Normally lauding a "sub-optimal" tech means the optimal tech isn't a fit for the problem space or the cost of adoption is higher than the benefits. Something like Tarn doing everything in C/C++ which is arguably good and arguably bad but works. It is unclear that switching languages would be a net win or loss.
In the case of not using a VCS system "sub-optimal" means he's making his own life difficult for no gain. It is difficult to see that it is anything other than a mistake - a 20 year old project with 700k lines of code is going to benefit from version control. The only saving grace is that Dwarf Fortress development predates git so at least there is a good excuse for how it happened this way, but it is still a mistake.
We all make mistakes though, so the world will continue to turn if Tarn sticks to it.
I've noticed that people tend to differ in their mental maps of organization. If you view a codebase as largely monolithic, with only a few states such as previous/backup, current, and future/editing, the addition of version control may not help workflow. Editing is always a rolling-release model, be it backwards or forwards (at least, that's what I take from "there is kind of an active molten core that I have a much better working knowledge of" in the article).
If version control were introduced, you would now have to actively divert workflow to the versioning software, such as when reverting commits, etc. You don't create a repo for trivial objects, such as a simple shell script you write for your personal desktop, and it's easy to see this applied at scale, especially with a single-person codebase like DF. For the script example, a file will typically have undo/revert history in an editor, which is itself a limited form of version control. So there obviously has to be some form of version control used by the author other than file copying, albeit not as explicit as git.
I think it's unfair to call it a mistake unless you're directly affected. In the 20 years of (solo?) development of DF, do you not suppose a rational person would adopt versioning were a lack thereof impacting them noticeably? So obviously it's a non-issue as far as scope of the codebase goes.
I understand your example, but as a counterpoint: I do keep simple scripts in version control, too.
The really simple stuff usually starts out in my misc scripts folder, but that's more for filesystem organisation, not because creating a git repo has any (mental) overhead.
> In the 20 years of (solo?) development of DF, do you not suppose a rational person would adopt versioning were a lack thereof impacting them noticeably?
If they never got around to learning about version control, it's totally imaginable that they are sort-of rational, and that version control would have improved their lives.
Not sure. I use git at work, but I would never use it for my solo projects. It just doesn't add anything I need, but it takes mental effort to manage it.
If nothing else, to see and learn from your own history. Imagine a rich 20-year history of commits. Even if it was all the most basic system with just branches and commits and merges. I am still a source control simpleton after 20 years, but having a few branches using git is rather trivial.
Well there must be 20 years worth of backups. I wonder whether there is a way to turn 20 years of zip files storing different versions into a git repository, using the dates of the zip files, the dates of the files inside and perhaps the name of the zip files. Bonus points for detecting branches.
> I wonder whether there is a way to turn 20 years of zip files storing different versions into a git repository, using the dates of the zip files, the dates of the files inside and perhaps the name of the zip files.
'course there is. While Linus didn't bother, there are people who rebuilt the Linux history from the tarballs, and you can `graft` the historical sequence and the "real" one to get something of a continuous history from 0.01 to today.
It's a lot of work though; old archives often contain partial garbage data (many of the rebuilt Linux histories incorrectly attribute various historical commits to 2007).
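With modern git, the stitching itself is one command per seam, something like this (commit names are placeholders):

    # pretend the first "real" commit has the imported tarball history as its parent
    git replace --graft <first-real-commit> <tip-of-imported-history>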
I did exactly that at work, when I inherited maintainership of a legacy internal software program, which until then had been developed with the same "version control" approach as DF - a folder/zipfile per release or per feature branch.
As a bonus, it is something that has to run on three different operating systems, and the source code for each had been kept separately and synced manually every now and then.
Granted, it was only like a dozen or so releases and not 20 years worth of development like DF, but that's just a matter of scale, not of principle. :)
Yes, PostgreSQL did this when they switched from CVS to git. They used it as an opportunity to prepend the history with the contents of some ancient tarballs that they had.
Assuming there is a folder for each release, all you would have to do is commit the first release then each subsequent release.
So after the first release, for every release delete all the files in the folder except the .git folder and copy the next set of source files in and commit.
Doesn't account for branches, but seems like this would work and could be done quickly with some shell script.
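A sketch of that script (untested; assumes GNU date, that the zip names sort chronologically, and that each zip unpacks the source tree directly with no wrapper folder):

    #!/bin/sh
    # turn a pile of release zips into a linear git history, oldest first
    git init df-history && cd df-history
    for zip in /backups/df_*.zip; do
      # wipe everything except the .git folder, then unpack the next release
      find . -mindepth 1 -not -path './.git' -not -path './.git/*' -delete
      unzip -q "$zip" -d .
      git add -A
      # backdate each commit to the zip's modification time
      when=$(date -r "$zip" +%FT%T)
      GIT_AUTHOR_DATE="$when" GIT_COMMITTER_DATE="$when" \
        git commit -m "Import $(basename "$zip")"
    done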
I managed code in version control long before Git, and before DF existed in 2002 we had decent and sometimes great tools (ClearCase?). Still, I personally do not care how other people work. Different strokes for different folks, etc.
Linus Torvalds famously preferred tarballs over CVS.
For further discussion, keep in mind that thanks to competition from git etc, modern svn is substantially more bearable than it used to be before distributed version control systems became popular.
You don't remember when your CVS admin would send out the email announcing the repo would be frozen for the next 48 hours so a branch could be created for the golden master? I don't know why it was such an onerous task but the disruption creating a branch would cause was significant given you would only commit every few weeks when all your changes were ready. Compared to modern VCSes it was like having no VCS at all.
You know what happened when you did that one simple command that every sysadmin hates? It copies all of the thousands of files containing all the millions of lines of production code from spinning rust-coated glass to spinning rust-coated glass on systems that run at hundreds of megahertz. If anyone accidentally checks in a change between when the copy starts and when it ends, the entire source history going back years gets horribly and irretrievably corrupted, and you need to spend a couple of days restoring everything from backup, which is inevitably rust-coated mylar tape to rust-coated spinning glass disks, rebuilding indexes as you go. That is assuming the restore from backup actually works, which is rarely if ever tested, and you know full well that even if it is tested, that's the day you experience not one but multiple disk failures, probably because of all the intense activity. Meanwhile you lost your slot at the manufacturing facility in Taiwan because your golden master wasn't delivered in the brief window required, and they have already retooled the line for another customer. Maybe you can recover from the financial loss, or maybe your sysadmin just takes the reasonable precaution of freezing access to the CVS repo for the days the branch takes to create, and schedules it over a weekend to minimize downtime.
You have to wonder why, twenty five years ago, the sysadmin didn't just sync the entire network to the cloud through his iPhone. That, of course, would just show how none of this was your lived experience.
Maybe at one point ClearCase could have been considered state of the art, but that ship has long since sailed. I used it professionally from 2004-2013, and it was awful. Branching was slow. Well, everything was slow. It was way too easy to miss new (view-private) files. Non-trivial merges were atrocious. We were begging to migrate to svn...
I'd be cautious to take a 20-year project with only a single developer the entire time as an example for other projects that are not that.
But the particular advantages of a single-developer project are worth pointing out; I do think you can "get away" with a lot of... things that would be messiness on a multi-developer project, but aren't on a single developer project when they match whatever predilections of the single developer. There are real "organizational" costs to adding more developers, with the biggest spike at adding a second. There are also obviously disadvantages and limitations to having only one developer. (in this case, not even only one developer "at a time", literally only one developer over the lifetime of a project that's lasted longer than most still-alive software projects!)
But the other side of the argument is that the game is amazingly complex, and this level of complexity may not have been possible with a 2nd developer added in - their visions would conflict, or compromises made early on would prevent future complexity from being added.
AKA, a 20 year project with 1 developer could take more than 20 years with two (to reach the same level of complexity)!
Right, my point was meant to be that some things are indeed possible with a single developer that aren't with a team and that adding more developers adds organizational challenges. (But there are of course other things not possible with a single developer).
I agree that there are advantages to a single developer project that I don't see talked about a lot.
I see the opposite. DF is stagnant as hell, and other games have eaten its lunch by building and iterating at light speed compared to the comparative snail's pace of DF.
Using git as a single developer takes tens of minutes to understand at most. It is hugely helpful over time to see what changed and why. The fact that he isn't doing this is a net negative.
I'm really looking forward to the steam version. When I tried the game back in college (admittedly this was 12 years ago), my conclusion was that the beauty and complexity of the simulation engine was effectively inaccessible from the user interface. Not the fact that it was text based, but that it was limited in its ability to expose the state of the system, and provide an interface to perform the mutations of the state that are an intended part of the game experience. Presumably, these are the main bits of polish that are going into the steam version.
To get the full benefit of Dwarf Fortress I found it was necessary to use 3rd party tools. One of the big ones is dwarf therapist [1]. This tool makes it far easier to toggle job preferences for all your dwarves according to their abilities. It presents a big table of your roster and lets you see all their skill levels and job assignments at a glance. It also has convenient features for grouping dwarves based on criteria such as migration wave which lets you tackle the mustering in one shot whenever migrants show up.
Therapist seems like an important one; IIRC they showed a similar interface in a Steam version demo. Without it, the jobs and aptitudes of all the different dwarves are all over the place, and woe to you if you forget the key combo, because by the time you figure it out you will probably have forgotten the friggin' guy's name.
"Shouldn't" as in: the developers of dwarf fortress should be ashamed?
Or "shouldn't" as in: the guy who uses mods is not purist enough?
Or something else?
Since dwarf fortress is free as in beer, I wouldn't complain too much.
I've played dwarf fortress a bit about a decade or so ago, and had fun. But I didn't stick with it. For me, the main benefit was in all the other games, like RimWorld, that dwarf fortress inspired.
"shouldn't" as in the game is obviously lacking an important component, if the essential game experience requires a mod made by a different party than the actual game developer.
It's not officially endorsed by the creator of the game. It's not mentioned on the download page or the "links" page.
The game is clearly missing something - a gap is being filled by Dwarf Therapist, and it would be a better game, accessible to more people, if the game just had a better UX built-in.
I'm not saying something wacky. The dev team has the exact same opinion as me, and that's why they're putting a bunch of effort into the UX for the Steam version.
Yes, aside from a few problems (like exposing a character’s real identity, when the game only wants to show the player their assumed identity). Also, Dwarf Therapist can make some complex deductions from the available data that aren’t shown in the game, such as showing which dwarves would be good for particular jobs by considering their stats, traits, skills, and preferences.
This was the tool that finally made DF playable for me, with it I was able to get into the game enough to enjoy myself and realize what everyone was going on about :)
What state are you looking to access/mutate? To me, a large part of the fun from the DF experience came from only being able to twiddle the high level knobs and watch the shenanigans emerge.
I also got the impression that low level interventions may not be usefully realizable in the codebase.
Awesome interview. DF is certainly a fascinating project. But I always felt like it is held back by its poor accessibility.
A newer game, RimWorld, takes most of the core ideas from Dwarf Fortress but packages them up in a nice GUI with help text, so you can learn to play in the game rather than with a wiki.
That is what I have been told. I have tried both, and I ended up getting a lot of enjoyment out of RimWorld, while in DF I spent several hours studying the getting-started wiki and still didn't really feel like I understood what was going on.
Would be amazing to see all this work that went into DF exposed to a wider audience with some kind of UI improvements.
Rimworld is an accessible game and while it has a certain level of depth compared to other games, especially on the emotional side of your characters, it's shallow compared to DF.
The problem with DF is, it's hardly accessible. This is why so many people hope for the Steam version. It's often more fun to read people's stories about playing the game than to play it yourself.
DF offers you the illusion of a living world that is writing its own history.
Dwarves discover some kind of creature living on a mountain edge. They gain interest in it: throwing stuff at it, talking about it, giving this creature a funny name. Later on, your mason will carve stories into the walls of your castle, and somewhere in one of those stories, maybe, he will write about this creature.
Dwarf Fortress goes into depth to a truly magnificent, ridiculous degree. (I guess also in the non-ridiculous degree of having a 3D world instead of 2D ;)) It simulates a lot more, and thus more is possible in game and more weird and wonderful chain reactions can happen.
Are those stories written by a character in Dwarf Fortress or a dramatization of the emergent behavior? The first story I saw could’ve happened in Rimworld or even Prison Architect, given literary license to dramatize.
Both are driven by systems, so they're both effectively simulations, but Dwarf Fortress is much, much more in depth. It's basically a "real life" simulator, but set in a fantasy universe, where you control a colony of dwarves. The interplay between systems is unheard of in any other game; the closest you get is some roguelikes.
And similarly to DF it's also getting a more accessible Steam version (Space Station 14 as well as Unitystation) to make it easier to start playing it.
In my play experience SS13 wins out in terms of funny stories and variety, mainly because the speed of development is so much faster (especially on big servers like tg), it is multiplayer, and there exist many spin-off forks from the usual space station gameplay.
To be fair, this is basically just a texture pack, which is already something you can use. The real problem is the highly inconsistent UI design, differing in things as simple as whether you have to use a keyboard or mouse to navigate a list.
Toady is understandably uninterested in putting time into that element of the game, and I think he had a bad experience the last time he 'outsourced' it.
It's not just a texture pack in any way - just read the dev blog. They are also overhauling all of the UIs, adding mouse support to everything and so on.
I'm super hyped for this. I've managed to play DF a bit successfully thanks to Dwarf Therapist, but I'm really excited to come back and play again with the UX overhaul.
Are there any articles/posts/resources that delve into a technical overview of how DF achieves its simulation complexity? For example, something like: how does the game constrain its randomness or sampling process for new procedural generation, and how is this woven into the story development?
I realise this is somewhat of a vague question. I've never played DF (or any video/PC games), but I find descriptions of its development fascinating, and would like to learn more about how the creator actually designed and evolved this game.
It should be noted that Tarn would probably agree with a lot of the criticisms leveled here regarding the UI. The focus so far has been on getting the systems working (and there are many more to go and be further fleshed out: magic, the organic development of economies in the game, world history, warfare, etc). The Adams brothers essentially want to construct a general fantasy simulator capable of constructing rich worlds all on its own. Wouldn’t make much sense to pour tons of energy into making and refining the UI over and over again when the game isn’t even halfway done yet.
If that is the goal, they should make it trivial for others to build UIs for it. I know there are many UIs for it; is it easy to make those, or is it reverse engineering, and if the latter, why? I don't know; I personally don't mind the UI.
DF has always been an interesting game to me, but I find it such a shame that such a deeply complex game remains closed-source. I've fortunately since found a number of open-source games that scratch the same itch - Cataclysm: Dark Days Ahead, Space Station 13, Space Station 14, KeeperRL. I hope the dev one day does free his code to be examined and fully modified.
Where could one learn more about the Entity Component System as described in the article? Is it as simple as an object containing an array of components?
ECS or "Entity, Component, System". You have Entities (like a soccer ball or a player, or any object in the game) which are objects, that contain a list of Components (like phenomenons on the entity, like physics, sound tracks, health bar, etc) and a game loop that calls out a list of Systems (like Physics System, Sound System, HealthBar System), which are "static" objects that hold a reference to all Components and which update them to move the game forward (eg. Physics systems moves position down by vt + a/2 t^2)
The point in the article was to prefer composition over inheritance, which is sort of like using OOP “the right way”.
It didn’t discourage OOP as a whole, nor do I think it should - making a game without objects would be a nightmare.
Plenty of games were made using plain C and even Assembler.
What is really needed are not objects in an OOP sense, but a way to make a mutable, composite data type. Otherwise you would have to pass tens of arguments to even the simplest of functions.
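E.g. in plain C (names invented):

    /* without a composite type, every signature explodes:
       void drink(int x, int y, int z, int thirst, int mood, ...);  */
    struct Dwarf {
        int x, y, z;
        int thirst, mood;
    };

    /* mutable and composite, no inheritance or methods required */
    void drink(struct Dwarf *d) { d->thirst = 0; d->mood += 10; }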
Bit off topic, but what's going on with the pronouns next to names in the video chat recording?
Is this some new standard? Should I start adding he/him just in case I offend someone by not being explicit about something that is rather obvious in 99% of cases?
Some people's preferred pronouns might not necessarily match what others might pick to refer to them with on their own. For example, in my social circle a few AFAB friends prefer the "he/him" pronouns which might not be immediately obvious to others.
Normalizing putting our pronouns next to our names seems like a good way to approach this, without needing to explicitly ask others, or make them correct you, which seems nice, as it is pretty much effortless to do.
I wouldn't call it weird, just something that takes a bit of getting used to, but it addresses shortcomings in the language. Much like how the Mrs. and Ms. thing is not obvious by looking at someone and therefore needs to be pointed out.
For the few people out of thousands that this really matters to - sure, they can use this to help address them properly. But I think what we're seeing here goes a bit too far. What's that saying... You can't see the forest for the trees - yeah, that's the one.
Dr Clifton is using (he/him) when he has no reason for doing this. Of course You can try to explain this as solidarity, but for me it treads dangerously close to some kind of strange conformity to woke culture (I still don't understand why this is the minority that was granted this kind of political power - there are lots of other minorities that could also use this kind of recognition).
> Dr Clifton is using (he/him) when he has no reason for doing this. Of course You can try to explain this as solidarity, but for me it treads dangerously close to some kind of strange conformity to woke culture.
One possible reason in my eyes is indeed what you're hinting at - normalization of doing this and thus solidarity in a way, at least that'd be my best guess.
I find it odd that many websites out there feel like they should ask me for my gender when that has nothing to do with the services that they'll offer me, most likely being some advertising/data mining scheme or just a data point that's ingrained in our culture that they think they're deserving of. Similarly, i don't see why pronouns are that different.
It's just that in non-online mediums you can attempt to figure those out based on how people present themselves, at least sometimes, should they be relevant, however that's not always possible online, hence being asked to point those out. Of course, if the aforementioned AFAB friends want to use masculine pronouns, people around them presenting theirs, even when they're the "default" ones makes it easier for them to not feel awkward about their self expression.
On the opposite end, might as well use gender neutral pronouns for everyone, if that doesn't matter that much, even though that most likely won't be done either because of information loss. Some languages out there have even more pronouns and feminine/masculine forms of words - languages and social aspects are sometimes curious and puzzling. Oh, also Japanese honorifics are as interesting as they are overcomplicated, cool stuff.
> I still don't understand why this is the minority that was granted this kind of political power - there are lots of other minorities that could also use this kind of recognition.
This may or may not have something to do with the fact that it's relatively easy to do and doesn't take large amounts of effort to be nice to someone in that regard. The whole Git master to main branch thing and master/slave to leader/follower and whitelist/blacklist to allowlist/denylist thing probably took more effort than that. Sometimes these newer terms actually make more sense than the older ones, though.
As for other minorities that could use recognition - sure! I don't get up in the morning and think: "Hmm, today i'd like to be kind to LGBTQIA+ people, but i'll only care about racial issues on Wednesday." There's no reason not to be nice to most people, it's not like our capacity to do that is so limited for it to be hard.
Now, admittedly, i don't agree with everything that's happened in the world, like how Richard Stallman was cancelled over some claims that to me read like expressions of him not being entirely neurotypical and perhaps caring only for the facts (which isn't wrong) whilst lacking social tact (which isn't good, but also probably not deserving of the scale of backlash).
But for the things that seem like they might make sense and aren't actively making the world a worse place, i guess i can get with the times and be supportive of them.
From my perspective (conservative part of Europe) in America it's not seeping - it is total downpour (If I were a believer I would say: where's Noah when You need him?).
I'm not sure why, but I started thinking about "Ceterum censeo Carthaginem esse delendam" when I saw this (he/him) thing for the first time.
LOL. I saw the title, saw that it was on a developer blog... and took it literally. I wanted to know the build tools, the testing tools, the whole DevOps of DF. But, no, the title is completely misleading. Oh well! Enjoyed the too brief article anyway.
I had the same expectations, and the article was weirdly light on technical details. Aside from the fact that there's nothing much going on tooling-wise.
Let’s see the source! Tarn teases it, but this game has a dedicated community already built in that could do wonders. A crowd fund to buy him out and FOSSify the code might work.
Weird fact: DF doesn’t have version control, except for ‘mkdir’ and ‘cp’.