Your users will do what you make easy (handmade.network)
160 points by lerno on Nov 7, 2021 | 108 comments



> Your users will do what you make easy

This is why game designers must remove all repetitive, boring, yet effective strategies in order to make good games.

If there's something the player can do to gain an advantage, no matter how boring or tedious it is, they will do it, they will ruin their own fun, and they will lay all the blame for it on you, the designer.


I remember playing the Pokémon GB games on an emulator and actively battling lower-level Pokémon in the wild, or other NPC trainers, to level up my own. I enjoyed it, though, for some reason.


I understand what you are getting at and I think that applies to the majority of games. That being said, how do you approach games like RuneScape where the grind is part of the appeal?


It's not about removing anything repetitive - it's about trying to find a balance that works best for the player and for you as the developer/producer.

MMOs and games in general have you experience content in roughly 5 different ways: authored (expensive), procedural (complex and generic), emergent (problems explode with size), repeated (the grind), or user generated. Each game, and each system and region inside a game, is a balance between those.


In RuneScape the social aspect is the appeal, and you get things to click and numbers going up while you chat with people.


Exactly. I played RuneScape a bunch in high school and would spend some weekends just grinding. Sounds boring af, but the fun thing was talking with my friends, or with complete strangers. MMOs may be able to use social interaction to offset the problems of grinding, although it's probably not for free. Some incentive to interact with others, and safety features that make people comfortable reaching out, are needed.


The grind is absolutely not the appeal, but the other cool stuff outweighs it.


I really don't see the connection you're trying to make. There are huge swaths of the gaming industry which are based entirely on repetitive and boring tasks, including some of the most popular games in history, such as Bejeweled and all of its Candy Crush offspring. Or idle, incremental and clicker games. Or slot machines, for that matter. All of those gamers are perfectly happy to spend money on boring repetition -- sometimes for years on end.


All these games are singleplayer games. In multiplayer games the point of the GP post is much more pronounced. If a single strategy beats all others, approximately nobody will play the other strategies and a significant part of your game content may as well not have been written. You need multiple viable strategies (preferably in some sort of rock/paper/scissors loop) to keep the game interesting.


Yours is the only reply that didn't use absolutes to describe human behavior, and I genuinely appreciate that. I agree that there are situations where a similarly-structured playing field becomes important to the main function of the game, but the OP made sweeping generalizations about every game type needing to be developed in a hyper-specific way (his way), otherwise the game is automagically bad -- and I hope we can both agree that's pretty silly.


I guess it depends on the definition of "boring". The games you mention cater to different audiences than strategy games, and they make the repetition a core part of their experience. Special thought went into making their loops feel engaging and rewarding.

So I guess the lesson is more like: Game designers need to remove all effective strategies which short-circuit the core gameplay that your audience expects.


Except there is a huge market of strategy gamers who buy them strictly for the single player campaigns, and many enjoy the slow and boring aspects of maximizing output before decimating a particular level. So, there's no actual differentiator in your example.

Much like the parent commenter, you appear to be insisting that there can only be one type of gamer per game type, and that a developer is making a "mistake" of some kind by allowing people to play the game in their preferred manner. But outside of FPSs, that just isn't the case.


Bejeweled and Candy Crush are not good games, and we click through the repetitive parts in games to get to the point.

The point is very rarely repetition itself, and even when it is, it's not actually performing the task but rather figuring out how to automate it.

Nobody enjoys clicking on a unit and then clicking on where they want it to go. Our brains reward us for successfully moving units to where they need to be, whatever the action necessary to get there.

The same reason family abuse is so effective - your infantile survival depends on your parents. If your parents are abusing you, you develop a liking for it, because your survival odds improve by doing so (abusive parent is better than no parent at all). You seek partners to repeat that behavior even if it makes you miserable, because brain chemistry.

A good game is one where such mechanisms aren't exploited for profit.


Again, you're asserting your personal preferences as universal fact and using absolutes to describe human behavior, which makes this less of a conversation and more of a series of opportunities for you to repeat the same claims. You don't get to make the decision that Bejeweled and Candy Crush are bad games on humanity's behalf. You also don't get to decide which game mechanisms aren't enjoyable, especially when a billion other people enjoy them.

You're effectively saying that a baseball player can't enjoy the act of swinging their bat or throwing the ball, just because it's not the end goal. This is obviously untrue for so many reasons, and I'm struggling to understand why you're so caught up on there only being one way to enjoy something -- and that way has to be your way.

> The same reason family abuse is so effective - your infantile survival depends on your parents. If your parents are abusing you, you develop a liking for it, because your survival odds improve by doing so (abusive parent is better than no parent at all). You seek partners to repeat that behavior even if it makes you miserable, because brain chemistry.

This is both disturbing and another attempt at using absolutes to compress the behavioral spectrum into a singular idea, and unsurprisingly, that idea matches your own personal views. Unfortunately, none of what you said in that paragraph is representative of any psychology book I've ever read, or of my own experiences, or of the experiences of anyone I know.

So, if you'd like to link to some research backing up your claim, I'd love to read it. Otherwise, please avoid making outlandish assertions such as "if your parents abuse you, you will like it, and you will like it forever, and seek it out, and there is no other possible avenue for you." That's a movie you saw one time, not real life.


> you're asserting your personal preferences as universal fact [...] You don't get to make the decision that Bejeweled and Candy Crush are bad games on humanity's behalf. You also don't get to decide which game mechanisms aren't enjoyable

Yeah, I do. I've researched video games for decades on a scale and detail comparable to the cutting edge of any field. I'm a professional designer. My opinion is not a matter of preference, but expertise.

> You're effectively saying that a baseball player can't enjoy the act of swinging their bat or throwing the ball, just because it's not the end goal. This is obviously untrue for so many reasons, and I'm struggling to understand why you're so caught up on there only being one way to enjoy something -- and that way has to be your way.

Video games about baseball do not come with bat controllers to swing - you press buttons to play. So clearly swinging the bat is not the point of baseball, or baseball video games would include that aspect.

> if your parents abuse you, you will like it, [...] and there is no other possible avenue for you.

Your words, not mine. I didn't rule out other possible avenues, that's your mind betraying you with the very absolutes you accuse me of. I said it makes abuse more effective, and a simple thought exercise will lead you to the same conclusion.

It's called Stockholm Syndrome, look it up. Widely recognized in psychology.


I suggest you look more into Stockholm Syndrome. It is widely known, but not recognized. With what we currently know, it is a "contested illness" (as Wikipedia calls it) at best and useless media frenzy at worst.


> Yeah, I do.

This is all we need to know about you. Everything you've said comes from a place of extreme arrogance, while also possessing an immense ignorance on so many subjects. I'm done engaging with your psychopathy.


Haha oh yes, now you're "done" engaging with me, out of personal choice and disdain for me of course, not the dozen rebuttals I just made to your falsehoods you are unable to address :)


Yes, but that is how the game is designed to be played. The point is if there is a most effective path that is different from the most fun path then the game designer has got room to improve.


Again, I feel like you're stretching to make an analogy that doesn't actually reflect reality. You can't admit that people enjoy boring effectiveness as the premise of half the world's games while also insisting that it's mutually exclusive with having fun in the other half.

I'll give you an example. Inarguably one of the most successful games of all time, World of Warcraft has multiple different paths and styles of leveling, gold farming, and most other tasks. The least effective paths are absolutely the most boring and involve hundreds of hours of grinding for very little gain, and yet, that's the consciously chosen path for a majority of players. Not because they can't figure out more proficient methods, but because they enjoy the grind (for any number of reasons).

The lesson learned from game designs like WoW is that the implemented paths shouldn't be either-or propositions, because singular or linear paths alienate far more gamers than optional paths. Whereas what you're insisting is that the developer has done something "wrong" simply by presenting multiple paths, and they need to "improve" upon it by removing the ones which don't produce some arbitrary amount of dopamine and adrenaline per button press.


I don't think the argument here was to provide a single path with maximal dopamine-per-action.

The original statement was to make sure repetitive and boring paths weren't highly effective.

Your own characterization of grinding in WoW fits that description. It may be repetitive and boring, but it's not highly rewarding, so those who engage with it do so because they are enjoying it for its own sake. This is fine; the players can do it if they enjoy it and won't make themselves suffer through it if they don't.

(My own experience with WoW, and with the term 'grinding' in general, does not make me inclined to use it as an example of an unrewarding activity that players don't force themselves to engage in entirely for the sake of the in-game results, but maybe things have changed... it has been a while.)


No, the original commenter repeatedly used absolutes to make a baseless claim that a game is automatically bad if it includes any functionality that they, personally, perceive as a "boring and repetitive, but effective strategy." Using absolutes to describe human behavior is absurd, and they were clearly trying to assert their view as universal fact, but offered no foundation for that assertion -- because there really isn't one.

So, as I've established several times now, if many (possibly even most) gamers find repetitive and boring gameplay enjoyable, then it's rather silly to insist that the developer "must" do something to "fix" it, otherwise the user "will definitely not have fun" and the game "can't be good."


You seem to think that game design is a purely subjective field in which no expertise can be gained, and that, as such, all games are equally fun and game analysis and review are futile professions.

Of course people exist who know what makes games better. For whatever reason, you can't entertain the possibility that it's me specifically?


I literally said none of those things, or even suggested them, so now you're just making up arguments for the purpose of repeating the exact same flawed claim -- and yet, somehow I'm the one who can't entertain a possibility? You're honestly starting to sound like a troll now.

I never said a game couldn't be made better. All I did was point out that there is no singular "best" set of game mechanics, as you have repeatedly suggested. You're obsessed with using absolutes to describe behaviors that vary wildly, and insisting that's "how it has to be" and any other approach is somehow lesser than yours. But the only actual argument you've made so far is that you, personally, have discovered the absolute truths of the gaming universe, and anyone who doesn't agree with you simply doesn't understand game design.

So, it has nothing to do with your expertise, and everything to do with your insistence that you are THE expert of EVERYTHING gaming, and there is no room for any other opinion or development path.

Sorry to burst your bubble, but you're not the only person with experience in the gaming industry, and I don't actually know ANY developers who share your mentality toward game design.


> I literally said none of those things, or even suggested them

Every single one of your comments explicitly discredits a game designer's opinion from having any merit on what makes games good. That's your entire argument against my statements.

> I never said a game couldn't be made better. All I did was point out that there is no singular "best" set of game mechanics, as you have repeatedly suggested.

I would love to read this argument of yours, presumably from years of research into the subject, for why it's literally impossible for a set of mechanics to be better than all others. If you haven't found it, that's fine; but others are looking for it.

> the only actual argument you've made so far is that you, personally, have discovered the absolute truths of the gaming universe, and anyone who doesn't agree with you simply doesn't understand game design.

Yeah that's how expertise works. I could record a series of lectures explaining everything, or maybe try to write a book on the subject, but that'd take years I'd rather spend applying what I know and making better games. You can ask me specific questions to challenge my self-declared expertise, if you want. That's discussion.

> I don't actually know ANY developers who share your mentality toward game design.

You literally know lots of game developers, all of whom think that Bejeweled and Candy Crush are good games? And you think I'm living in a bubble??


> what you're insisting, is that that the developer has done something "wrong" simply by presenting multiple paths, and they need to "improve" upon it by removing the ones which don't produce some arbitrary amount of dopamine and adrenaline per button press.

Nowhere in any of my comments will you find any mention of dopamine or adrenaline. I said "good", which is a point along the dimension of quality. You know, that thing we all strive for because it's literally better?

I'm exactly arguing *AGAINST* exploiting brain chemistry for profit in the video game industry. You are defending that practice as a valid form of art and entertainment.


And therefore some would say that those games are deficient in something important. Of course they're "successful", but usually through exploitative and disrespectful designs.


How can it be exploitative and disrespectful if the person is enjoying it? You're projecting your own preferences on others, and insisting that anyone who doesn't fall into that arbitrary category is either "wrong" or being manipulated into enjoying something.


For many it might be just a type of relaxed enjoyment (shutting the brain off) - and that's fine!

The OP is likely talking about what those games _exploit_: It's the few that will play the game to an extreme degree and likely pay tons of money to do so (often called "whales" but not all of them are).

The mechanism behind this is grounded in how our brain rewards us if we find seemingly rare patterns of success. And some percentage of people are susceptible to that kind of manipulation.

Whether you put the blame on the game or the players is another question, maybe a philosophical or ideological one. But to dismiss the mechanism as simply "enjoying something" implies that you don't recognize the problem as an addiction that may or may not be willingly exploited here.

For comparison: Walk into a casino or similar which has one-armed bandits. Tell me that the people hanging out there are truly enjoying it.


I hear you, but like the OP, your argument is still based entirely around absolutes, and is rather dismissive of the vast spectrum of human behavior. Just because some people get lost in the addictive mechanisms of a game, doesn't mean everyone does, and an addictive mechanism is not universally malicious just because someone becomes addicted to it. If that were true, every company and product would be considered malicious, because everything on the planet is designed to be addictive to some degree.

My grandmother and aunt, for example, visit a local casino once in a while and exclusively play slots. They each bring $20 and play until they've either doubled it or lost it. I've gone with them on a few occasions, and can personally attest that they genuinely enjoy the experience of playing the game -- my grandmother even says "wheeeee" out loud while pulling the lever. I, on the other hand, do not enjoy casinos at all, and easily get caught up in the addictive aspects. Is the casino malicious because I, personally, am more susceptible to addictive behavior? Should my grandmother not be allowed to have her fun because of someone else's addiction?

At the end of the day, my experience is not my grandmother's experience. And your experience is not my experience. And the OP's experience isn't a universal truth, no matter how much they want to insist it is.


I very much agree with the general point that you make. What I'm personally concerned with is the power relation between the casino or game publisher and the group of players that are susceptible to addiction (specifically not your grandmother - in fact I found it very fun to imagine how she enjoys playing).

In my opinion when there is an asymmetric power relation like that, then there is a question of responsibility. Especially if the incentive is to exploit the relation - which is almost undeniable in this case.

So I'm not making a statement about the general case and I don't think things can be boiled down to "this type of game is evil". It's the specific connection between these two extremes that I find worrying, or saddening really. But I'm 100% not the type of person who would deny your grandmother her fun. I think these types of problems start with things like education and attempting to heal the human connection between the powerful and the weak (sorry for the pathos).


I appreciate the clarification, and I do think we're pretty much on the same page. There is definitely an overall responsibility to be a more-helpful-than-harmful member of society, and particularly when it comes to something like a casino, it's probably nearly impossible to prevent greed from manifesting as malicious intent. But as I detailed at length in another comment, it's a slippery slope trying to figure out where "the line" is.

In my grandmother's case, those trips to the casino actually greatly benefit her. At 94 years old, the casino provides a level of sociability and exercise that she doesn't get otherwise. That said, even the particular casino she visits most likely does more harm than good to the local community. Not by a lot, but by enough. Unfortunately, the perception of whether that crosses the line is going to vary wildly between individuals. Some will think it's not greedy enough, and some will think it's the equivalent of murder, and there's no legitimate way to decide where the exact appropriate middle ground is. That's something we all have to decide for ourselves, and it can't be forced upon others.


> there's no legitimate way to decide where the exact appropriate middle ground is. That's something we all have to decide for ourselves, and it can't be forced upon others.

This statement is false. Governments and judicial systems around the world routinely decide where the exact appropriate middle ground is, and make binding judgements forced upon people by threat of fines and imprisonment, ultimately acts of violence.

Some of it is even democratic (deciding together, not each for ourselves).


> because everything on the planet is designed to be addictive to some degree.

This statement is false. I can see in my immediate vicinity many human-made objects that were not designed with addiction in mind at all. For example, there is a rubber band on my desk. It effortlessly and immediately disproves your statement.

> And the OP's experience isn't a universal truth, no matter how much they want to insist it is.

Accounting for how many basic statements of yours I've now refuted, I'm inclined to think that my universal truths are better than yours.


> Should my grandmother not be allowed to have her fun because of someone else's addiction?

I isolated this argument because it's also used by the tobacco industry. Consider that correlation. Should we ban smoking because it's addictive and harmful? Yes, we should - and already have. In my country you can only buy the stuff by specifically asking for it - a good compromise between societal responsibility and personal freedom.


I'm not judging those who play it, but those who create those systems recklessly. The same way I don't judge a drug user per se, but I judge those who exploit drug addictions for their own gain.


That's a slippery slope argument, at best. Every single video game is designed to be addictive to some degree, much in the same way that every chef is attempting to create recipes that bring people back into their restaurant.

What you're saying is that there is a line somewhere that, once crossed, becomes reckless or harmful, and that's certainly true in a way. But where is that line, and who decides it? Is it you? Is it the OP? Is it some committee somewhere? The answer is rather simple. It's everyone, individually. The right decision for you is not necessarily the right decision for me, and our life experiences will always differ in significant ways, which means my line is not in the same location as yours.

Ultimately, the OP's mentality asserts that every business on the planet is acting maliciously and exploitatively, simply because they're trying to make money by retaining customers. And hey, I'm willing to admit that might actually be true -- but it's almost certainly not. Intent matters. It always has and always will.


Just because everyone has different thresholds for addiction doesn't mean that we should therefore not care about it. Some people can enjoy heroin responsibly, therefore opioid addiction is just a matter of personal responsibility? Of course, total prohibition is the other extreme, but surely there's a middle ground that makes more sense than total apathy. And I'm not really even talking about regulation here, I'm just saying that some people delineate ethical standards in their line of work - with game design, I personally draw the line at exploiting users through addictive design and leaving them worse off.


You're twisting my words. I never suggested that nobody should care about addiction, nor that severe opioid addiction (how'd we even get there?) is purely a matter of personal responsibility. In fact, the whole "line" commentary was about the potentially exploitative actions of developers crossing the line -- not the addictive thresholds of users.

And you're still missing the point. You say that you draw the line at "exploiting users through addictive design." And yet, if your designs had absolutely zero addictive qualities, very few people would play the game a second time. So that brings us right back to figuring out where that arbitrary line is. Is one addictive mechanism the right amount? Two? Ten? Given that we've already established people have different thresholds, how do you even quantify addiction at the design level? You could implement something completely innocuous by your own standards, but somebody out there gets addicted to it and you make money from them. Does that make you exploitative?

The whole point is that you can draw your own personal line, but you don't get to dictate that line to others, nor publicly declare that any developer on the other side of your personal line is automatically a malicious asshole trying to harm people. Because I guarantee there are developers whose maliciousness threshold is even lower than yours, which makes you the exploitative one in their eyes.


Everything is fuzzy and a lot of things are subjective, but that doesn't disqualify them from being a point of contention. I brought up opioids because the same logic applies. Although finer, there's still a fuzzy line between manageable recreational use and destructive addiction. Not knowing exactly where the line is doesn't turn it into a special case where nothing can be said. You talk about drawing lines, but I haven't mentioned anything about that. I don't have some personal line where everyone past it is bad. I'm just saying that exploiting addictive design is bad, and people taking advantage of that are being unethical in proportion to the extent that they are doing it.

Also, I was careful not to blame addictive design itself; it's not inherently evil, because like you said, many games have some degree of addictiveness. I said exploiting addiction is the bad part. You can use it to incentivize healthy behaviors that make for a better game, one that respects the player's time and ideally even enriches them in some way, by telling the narrative in a stronger way, or teaching competition or hand-eye coordination in a better way, or just being more fun. You can also use it to extract more money from players, or more ad views, or more social anxiety, or shallowly pad the experience to feel like you're getting more "value" out of the game. Again, all of this is fuzzy, not black-and-white evil; it's just that responsible creators should take the time to consider their own values and be conscious of the decisions they make, understanding that what they do has repercussions on others. We should create things in reference to our values - it's easy to just blindly autopilot to the default rubric of "maximize revenue".


> And yet, if your designs had absolutely zero addictive qualities, very few people would play the game a second time.

Now I see our problem. You've only played shit games.

Procedural content and random generation are a wave of the future of gaming, happening right now; new techniques that make games fun without exploitation of brain chemistry. By actually being good, by actually being replayable.


> Every single video game is designed to be addictive to some degree, much in the same way that every chef is attempting to create recipes that bring people back into their restaurant.

This statement is false. There exist significant subcultures of video game development, particularly in the independent sphere, where games are designed with no consideration for addictiveness; I'm part of one myself. You are beginning to offend entire professions.

> Ultimately, the OP's mentality asserts that every business on the planet is acting maliciously and exploitatively, simply because they're trying to make money by retaining customers.

This statement is also false. It is you who are making that assertion right here, right now. I only ever addressed the video games industry, and the specific practice of designing games that emphasize boring, repetitive, tedious tasks, exploiting brain chemistry to profit from lack of understanding of game design.


That's what they do though. They hold meetings to develop the most elaborate systems they can get away with to manipulate you into giving them as much money as possible.


Every single business in every single industry is based around developing products that make as much money as possible. They even have meetings about it.


This statement is false. There provably exist businesses that develop products based on criteria other than making as much money as possible. Do you really need me to do that research for you as well?

You seem to think that capitalism has reached some kind of total control over humans, and that every single human running a business on this Earth is obsessed with making money.

Do you also know a lot of businesspeople, all of whom think money is the end goal of all ventures?


Those make it clear what the game is upfront. The upset happens when you believe a game is one thing and it turns into a grind for incidental reasons.


I really wish y'all would stop with the absolutes and sweeping generalizations. It's truly exhausting.

None of the games or categories I listed are any more clear or deceptive about their mechanisms than any other game. I listed incrementals, but those often don't turn out the way you expect, because most of the mechanisms are hidden until you unlock them, so no, they are not "clear" about what they are upfront. I did not list FPSs, which definitely come with a mechanics expectation, and yet there are a mountain of different FPS mechanisms across many game styles.

If you make an assumption about a game, and that assumption turns out to be false -- or you just plain didn't enjoy the game -- it's a hell of a leap to automatically assume that the developer intentionally deceived you with malicious intent. Sometimes you just don't like a game, and that's okay.


You wrote

> There are huge swaths of the gaming industry which are based ENTIRELY on repetitive and boring tasks, including some of the most popular games in history, such as Bejeweled and all of its Candy Crush offspring. Or idle, incremental and clicker games. Or slot machines, for that matter. ALL of those gamers are perfectly happy to spend money on boring repetition -- sometimes for years on end.

I have highlighted the absolutes and sweeping generalizations for clarification, since you seem to be having a go at people for doing this. It is quite hard to not write in generalisations, because otherwise you have to caveat EVERYTHING.

I also did not write anything about intent, I said that the upset was finding a game becoming, for whatever reason, not what you thought it was or previously enjoyed.

> I listed incrementals, but those often don't turn out the way you expect, because most of the mechanisms are hidden until you unlock them

It is usually clear that a game is an incremental. I would be rather surprised if a new version of DOOM was released as a $60 boxed game that turned out to be an incremental.

(To a great extent genre is just this, the categorising of things by similar elements in the hope that people who like one will like another)


A sweeping generalization can't be made with one word -- there has to be some context which applies that generalization to the totality of the overall subject. Unfortunately, neither of the words you capitalized are absolutes or sweeping generalizations, they're individual words within long sentences representing a totality of a targeted subset.

There ARE MANY categories of games where the mechanics are PURELY REPETITIVE TASKS, such as Bejeweled. So when I say there are "huge swaths of the gaming industry which are based ENTIRELY on repetitive tasks," I'm not making a generalization, I'm talking about a specific subset of the gaming industry which is literally defined by repetitive tasks.

And when I say "all of those gamers" are happy playing games they enjoy, it's because they're literally enjoying the games, otherwise they wouldn't be playing them. That's not an "absolute" being applied to the entire population, it's a comment on how people like things that they like and will spend money on those things, as evidenced by the billions of dollars spent on those particular game categories.

Neither of my original comments were difficult to comprehend, and your trolling is unwelcome.


> And when I say "all of those gamers" are happy playing games they enjoy, it's because they're literally enjoying the games, otherwise they wouldn't be playing them.

Once again, your statement is false and easily provable by simple thought exercise. Imagine I am threatening you at gunpoint to play my game. There, you are now playing it for some other reason than because you're literally enjoying it.


I went through this when playing Spiderman on PS4. It became repetitive, but I couldn't stop myself from doing the side quests.


It depends on the kind, though. Nintendo is said to deliberately use repetition to ease learning, so you can play while still ignorant of the mechanics. They rotate around patterns, gradually removing the repetition.


Yes and repetition induces expectation, which, when broken, creates surprise and presumably enjoyment. Kind of like in music.


I believe the surprise comes from the reuse. Your brain feels at home yet gets a little stimulus of new.


This post is somewhat ahistorical as far as the Objective-C content is concerned. Perhaps Brad Cox had the author's conception of how one should write ObjC, but what he describes was definitely not how Apple thought about or encouraged folks to program, even well before the iPhone gold rush. Objective-C's clean interop with C is a major selling point, but dropping down to C has long been a technique to be employed only in the targeted areas where it's important.

(Source: joined the Cocoa team in 2005.)


Yeah, there's plenty of criticism of ObjC, but this post kinda makes no sense to me. No one wrote ObjC like that, and making a coherent argument as to why would require a historical non-fiction book. I'm not sure who the target demographic for this post is; its Java section makes sense, but the ObjC section doesn't seem focused. It kinda turned into an "OO developers are stupid" post. I'm a game dev (primarily iPhone/ObjC!) - we started that shit and I can't get on board here.


I don't think the Java section makes sense either. I'm no historian, but I don't think it was ease of serialisation that led to huge amounts of config in Java. I suspect that was more to do with its status as an "enterprise" language.

Having encountered a similar proliferation of config in "enterprise" C++ (where it's usually not implemented in terms of serialisation), it was mostly driven by a need to modify the behaviour of systems in production without having to go through software release processes that were often cumbersome and bureaucratic.


Quite right; before Java, enterprise was full of CORBA, DCOM, VB/Delphi, Smalltalk.

Also many are unaware that Java EE was born as an Objective-C framework for OpenSTEP, Distributed Objects Everywhere, which was then ported to Java.

https://en.wikipedia.org/wiki/Distributed_Objects_Everywhere

Which was anyway mostly inspired by the Objective-C way, not C++.

https://cs.gmu.edu/~sean/stuff/java-objc.html


My graduation project was to port an Objective-C framework developed on NeXTSTEP into Windows.

Additionally, going through the NeXT manuals, or the Brad Cox book, I don't get where the author got the idea that everyone should be writing mostly C.

Heck even the device drivers had a framework.


I find Ruby gives me multiple patterns to solve problems with, each of which feels easy.

Even when it becomes evident that doing it another way would have been better, usually it is still readable and concise, and quick to develop.

My biggest disappointment in this profession is that ruby isn't used much more outside of rails. My team use it for everything except low level driver stuff, and it feels like we have this huge unfair advantage over our competitors.


> and it feels like we have this huge unfair advantage over our competitors.

As a one-man company I feel you on this. Maybe it's subjective, but I've watched everything from people spending literally weeks combining buggy WordPress plugins to build functionality that is essentially just CRUD, to folks building multi-server, multi-service apps for just a few hundred monthly users. Meanwhile, we just run a few generators, add a few major gems, and can start focusing on the fun stuff.


Rails alone is enough to carry Ruby, though. I think there is likely just not enough reason to use Ruby over Python in most non-web use cases.


Is there particular reason to use it over Python even in web use cases? I'd say it's pretty arbitrary, choose what you know, or on the basis of the main library or two you'll need to use.


I've worked in a Ruby/Rails team within a mostly Django company.

Sure it's not fair to compare mostly Django to mostly Rails.

However, our team was tasked with most of the things that had higher complexity but needed to be done rather fast. I think Django slightly limited their flexibility for business logic that doesn't follow its standard patterns. This is all anecdotal, however. The reason I prefer Ruby is simply that I like it more.


I would consider this kind of thing the disadvantage of Ruby: if everything is obvious, nothing is.


This is why we have things like Rails. I can instantly tell if a project was done by someone who loves rails or someone who just does it. If someone understood what it wants, everyone after them will be able to instantly find and understand the code.

Pure Ruby on the other hand... I've read some scripts that are as hard to parse as perl for me, some others that could be a copy of the official tutorial.


What does that mean?


There is a continuum between a language that's very restrictive and forces you to do things one way (I'd think of Go as one pole here), and a language that provides countless redundant ways to do things without clear advantages over one another, and in my opinion Ruby goes too far in the direction of giving you too many ways to do things. The disadvantage of that is that it works against uniformity in a project and makes each file its own little puzzle.


> too many ways to do things

If there are many ways, then there is no obvious way, but then the "if" part of what you said does not apply.

I think your thought is much better captured by "if everything is possible, nothing is obvious".


You’re probably right. I was just thinking it is not “obvious” what code does after the fact.


What languages do you think your competitors are using that are slowing them down?


What kind of advantages?


I always get frustrated when I see Python code written by Java programmers. It's full of unnecessary classes and complicated abstractions, and doesn't make use of Pythonic features that could keep the overall design much simpler. Good programmers know the philosophy of how each language is supposed to be used.
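
For instance, here's a rough, hypothetical sketch of what I mean (the names are invented for illustration): the first version drags Java habits into Python, the second leans on the standard library.

    # Java-flavoured Python: a class wrapping state that never needed wrapping,
    # plus a hand-rolled counting loop.
    class WordCounter:
        def __init__(self, text):
            self._text = text

        def get_text(self):
            return self._text

        def count(self):
            counts = {}
            for word in self._text.split():
                if word in counts:
                    counts[word] += 1
                else:
                    counts[word] = 1
            return counts

    # Pythonic: a plain function plus the standard library.
    from collections import Counter

    def count_words(text):
        return Counter(text.split())

    print(count_words("the quick brown fox jumps over the lazy dog the"))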


On the other hand, one is often called upon to work with many technologies to accomplish small tasks, making it impractical to become an expert in each one.


I've recently moved to Python from a Java and Kotlin background. Is there any resource in particular you'd recommend for quickly getting over "Javaisms" for people in this situation? I'm keen to write Python as idiomatically as possible but it's often hard to escape your background.


These are a few videos which present Pythonic style. They do not form a proper curriculum, but they are example-heavy and give good insight into a very Pythonic mind.

1. Beyond PEP 8 https://www.youtube.com/watch?v=wf-BqAjZb8M

2. Transforming Code into Beautiful, Idiomatic Python https://www.youtube.com/watch?v=OSGv2VnC0go


Which in basically every language means “imitate the standard library” or “imitate the framework you’re running.”

Your code is gonna be full of standard library and framework code, your code might as well blend in and be roughly intuitive to people who already know them.


Kind of, yes, but I wouldn't say so about all of Python's standard library. Some of the old 2.x modules were pretty weird. Same thing with e.g. PHP, whose standard library was quite a mess of different styles and philosophies.


Oh, I'd say that matches most PHP projects pretty well.


Wait until you see JavaScript written by Java programmers.

It had "Java" in the name.... right??? right!!??


I wonder: is programming language-agnostic? Suppose I learn a particular language, L1; my programming style and thinking become biased by the conveniences of L1. Now, if I start programming in language L2, obviously for the next few months (or years) I continue to think in L1 and program in L2. It takes time to unlearn L1 and learn L2, because we are humans. Do we have a bug in our expectations?


Arguably this is too 1-dimensional a description to be able to say. Having learned Python before Java, spending time with the latter has only helped me understand Python's simplicity. That said, if it was the other way around this same might not apply, since Java will force a certain level of boilerplate, but Python can't force you to not over-complicate things.

Learning the fundamentals of interpreters, compilers, etc. can also give understanding transferable between almost all languages.


Most people and employers (other than the handful of places using e.g. APL) aren't willing to spend months learning a new way of doing things before they can do anything (and I dare say that's a reasonable position in most respects). So people demand - and the market provides - languages where you can at least write the basic lowest-common-denominator stuff the same way as in Algol '52. The amount of time it takes to learn to write idiomatic L2 might be the same as the amount of time it takes to learn to write idiomatic APL, but people would rather write L1-in-L2 than not be able to write anything.


How do we tell truths that might hurt? - Edsger W. Dijkstra, 18 June 1975

https://www.cs.virginia.edu/~evans/cs655/readings/ewd498.htm...

> It is practically impossible to teach good programming to students that have had a prior exposure to BASIC: as potential programmers they are mentally mutilated beyond hope of regeneration.

> The use of COBOL cripples the mind; its teaching should, therefore, be regarded as a criminal offence.

---

Long ago, I learned C in college. My AI class was taught in LISP and I had one of the emacs hackers in the lab look over my assignment before I handed it in - "You've written some beautiful C in LISP."

It took me many years to get back to a LISP (Clojure in this case) before I wrapped my head around functional programming and the idea of passing functions around to operate on data rather than data around that has been operated on (I recently blew the mind of one of my coworkers who's only written Java when I had a Map<String, Function>).
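
For the curious, a tiny hypothetical sketch of that same idea in Python: a plain dict playing the role of the Map<String, Function>, with functions passed around like any other value.

    # A dict mapping names to functions, i.e. functions treated as ordinary data.
    handlers = {
        "upper": str.upper,
        "lower": str.lower,
        "reverse": lambda s: s[::-1],
    }

    def apply_op(name, text):
        # Look up the function by name and call it on the text.
        return handlers[name](text)

    print(apply_op("reverse", "hello"))  # -> olleh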

It takes a while to go from "how do I do X in {language}?", thinking of that as a series of steps in Java or Python or whatever, to "how do I solve problem Y in {language}?" without being constrained by the language.

I've personally found that writing the same program in multiple languages is the best way to learn how to move out of "how do I do X in {language}?"

Write the Leibniz formula for π or e in Java, in Python, in Clojure (or Lisp), and in Forth. Don't try to write it the same way - use the expressiveness (or lack of) of the language to solve the problem. Does it have rational numbers? How are sequences defined and built? How does it handle recursion if you use that?
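
As a hedged sketch of what that exercise might look like in Python alone (function names made up for the example), here is the Leibniz series for pi written two ways: an explicit loop and a lazy, more functional version.

    from itertools import count, islice

    # Step-by-step, imperative style: pi/4 = 1 - 1/3 + 1/5 - 1/7 + ...
    def leibniz_pi_loop(terms):
        total = 0.0
        for k in range(terms):
            total += (-1) ** k / (2 * k + 1)
        return 4 * total

    # Same series, expressed over a lazy sequence of odd denominators.
    def leibniz_pi_lazy(terms):
        odds = islice(count(1, 2), terms)  # 1, 3, 5, ...
        return 4 * sum((-1) ** k / d for k, d in enumerate(odds))

    print(leibniz_pi_loop(100_000), leibniz_pi_lazy(100_000))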

---

Epigrams on Programming - Alan Perlis, September 1982

http://pu.inf.uni-tuebingen.de/users/klaeren/epigrams.html

> #19 A language that doesn't affect the way you think about programming, is not worth knowing.


I don't think you need to "unlearn" one; just become familiar with the philosophy and conventions of the new one. Your first language may always affect your approach a bit but you aren't going to be writing awkward code forever unless you have no interest in improving.


In my opinion it's like human languages. You can never apply the idioms and structures in quite the same way. But you can still be fluent in several languages without unlearning previous languages first.


As a .NET dev I feel 100% the same in regards to JS and TS. I'm proficient with all three but I have seen so so so many C# devs writing javascript/typescript with classes "bEcaUsE iT haS cLasSes tOO".


I know exactly what you mean, but I hate this in programs written in Java and C++ too.


I feel your pain, brother. With the encroachment of frameworks and TS into the frontend, things that could be done with 50 lines of straightforward code become 10 files with a build step.


In a similar vein, I feel like frontend got completely taken over by backend programmers. Modern React apps feel so much like enterprise C# or Java it's uncanny - complicated tooling, long build times, overengineered libraries. They've made themselves completely at home and managed to all but obscure the actual underlying tech.


You know, I've had the exact same idea floating around my head for a while: why is everyone talking about React when JS isn't even a thing at the majority of CS programmes? It's great that JS is everywhere, but frontend has become a grey, amorphous blob that no one really knows. And modern JS web frameworks really do feel like OOP frameworks, as if some companies had an excess of backend C#/Java/PHP devs and decided they should in fact do frontend. When they got there, they found no types, no classes, everything an object, and the realization that frontend tooling is simple but crazy difficult to get good at.

But to be fair, there is some pushback to the complexity - libs like Alpinejs and Svelte are a joy to work with because they try to keep it as simple and magicless as possible. So there is hope.

At least until I see someone build a landing page with npx create-react-app.


> ObjC was intended to be used in a different way though. ObjC OO is C with an OO layer for interop. You're supposed to write 95+% pure C.

Hmm, it seems then that the programmers at NeXT and Apple didn't understand ObjC either, because the high-level macOS/iOS APIs almost never use C structs for grouping simple data items, but instead expensive ObjC objects where even the most simple setter/getter operation involves an objc_msgSend call under the hood.


Developers will do things they like more often than things they do not like.

Hubris and laziness are the hallmarks of developers, so the easy path is the most often used path, which becomes familiar, which then leads to a choice between familiarity and FUD. You get a few mavericks, but unless they can prove a productivity change by some metric or miracle, they never make much headway.


> The user will look at the programming language and think that what is the easiest thing to do is the best thing to do

Hard disagree that this is what we should cater to. Programmers should also know the ecosystem they are working in. Just because you learned python with really long chains of method calls, doesn’t mean you should use the same style in Java… though it’s “easy” because that’s what you are used to doing.


If you know a way to attract only good programmers to your ecosystem, great. Most of us have to design systems to handle the way the average programmer will use them. It's easier to build a language that nudges mediocre programmers in the right direction than to make those people spend more time and effort learning the "right thing".


Also enough good programmers.

You could have the most perfect programming language in the world, but if not enough people know it, it's not going to get deployed and out into the world. JS is not around because it's some paragon of programming language design; it is just smashing everything else frontend-wise in terms of developer numbers.


"There are only two kinds of languages: the ones people complain about and the ones nobody uses"

Bjarne Stroustrup


> Just because you learned python with really long chains of method calls,

Long chains of method calls are no more idiomatic in Python than Java to start with (Ruby tends toward fluent style, but Python doesn't.)
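
A quick, hypothetical illustration (data invented for the example): the first version is the chained, fluent-ish style; the second is the comprehension style Python usually prefers.

    words = ["  Spam ", "eggs", "   ", "Ham"]

    # Chained functional style, closer to a fluent pipeline.
    cleaned = list(map(str.upper, filter(None, map(str.strip, words))))

    # Comprehension style, generally considered more idiomatic in Python.
    cleaned = [w.strip().upper() for w in words if w.strip()]

    print(cleaned)  # ['SPAM', 'EGGS', 'HAM']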


I have seen a weird emphasis lately in the dotnet world of trying to mimic the semantics of node or python by sweeping complexity under the rug with syntactic sugar, in the name of making the platform more minimal and welcoming to developers coming from those ecosystems.

I'm not much of a fan, because you still have to deal with how asp.net works once you build anything serious, and now that isn't something the quickstart tutorials expose you to.


asp.net was closed source; asp.net core is open source. If by sweeping under the rug you mean opening up all their source code for the community to contribute to? Sure, they are doing that; go read it and get educated! Or do you mean async/await? (Don't use it - I prefer observables.)

As someone who was a C programmer, then discovered Java, then worked full time in C# for 15 years... I feel like the snub-nosed C guys who think purity is the only way are likely hailing WebAssembly over JS and wouldn't accept anything less.

In reality, if you bring C near the web, you ARE the problem, not the solution. Different languages power different parts of the technology world. Accept your place in the ecosystem instead of wasting energy wagging your fingers with angst at the others.


What in the world are you talking about?


Excellent take on why programmers prefer one way of doing things over another. I am adding it to my ever-growing list of possible explanations.

Another common one is: that's what we were taught in school.

Python has mostly replaced Java in schools now, but Java used to be the introductory language of choice. One could even say it was a feedback loop, i.e., the industry dictating what students should learn...


Isn't that a universal law?

We live in valleys, we go where taxes are low, we cut corners in parks.

Balancing structure and ease is an interesting art.


On a related topic, I found Kathy Sierra's book Badass: Making Users Awesome to be very insightful and practical: https://www.amazon.com/Badass-Making-Awesome-Kathy-Sierra/dp...



I think Apple makes it too easy/fun to force-close apps on iOS by flicking them upwards from the app switching screen.


There is a different way of closing applications on iOS?


The different way (the recommended way) is to not close them.




