ataggart's comments

>at a certain point, it’s too expensive to keep fixing bugs because of the high opportunity cost of building new features.

While I may agree with this in the abstract, in practice most folks don't really know whether they're at that point. It also doesn't consider cumulative effects over time.

Bugs don't just affect application stability or user experience. A system that does not behave as designed/documented/expected is a system that will be more difficult to reason about and more difficult to safely change. This incidental complexity directly increases the cost of building new features in ways difficult to measure. Further, new features implemented by hacking around unfixed flaws will themselves be more difficult to reason about and more difficult to change, exacerbating the problem.
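To make the "hacking around unfixed flaws" point concrete, here is a minimal, purely hypothetical sketch (the billing names and the cents-vs-dollars quirk are invented for illustration, not taken from any real system): once a known flaw is left in place, every caller has to carry its own copy of the workaround, and every new feature inherits that hidden assumption.

    from dataclasses import dataclass

    @dataclass
    class Account:
        is_legacy: bool

    def monthly_total(charges):
        # Known, unfixed bug: legacy accounts store their charges in cents,
        # so for them this sum comes out 100x too large. Documented nowhere
        # except in the heads of people who have already been burned by it.
        return sum(charges)

    def build_invoice(account, charges):
        total = monthly_total(charges)
        # The workaround: every caller must know about the quirk and repeat
        # this correction, or silently produce wrong invoices. The next
        # feature built on invoices inherits the same hidden assumption.
        if account.is_legacy:
            total = total / 100
        return f"Amount due: {total:.2f}"

    # Both print "Amount due: 204.00", but only because the caller compensates.
    print(build_invoice(Account(is_legacy=True), [19900, 500]))
    print(build_invoice(Account(is_legacy=False), [199.00, 5.00]))

Multiply that workaround across a few dozen call sites and a few years of staff turnover, and the "cheap" decision not to fix the bug stops being cheap.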

The larger the system grows over time, and the more people work on it, the faster this incidental complexity compounds. At a certain point, it's too expensive not to fix the bugs because of the increasingly high cost of building new features. At that point, folks start clamouring for a rewrite, and the cycle begins anew.


If the only choice is between a rewrite and letting the mess grow gradually without being fixed, then I'll take the rewrite any time and let the cycle continue.

The problem is: is your rewrite really going to be a full rewrite, or some kind of hybrid monster? (At the architectural level, of course, there is no problem in reusing small independent pieces, if any exist.) Because you can easily fall into all the traps of both approaches if the technical side is not mastered well enough by the project management...


Or 3g of shaped explosives.

https://youtu.be/HipTO_7mUOw?t=67


>"shouting fire in a theater."

Relevant Christopher Hitchens: https://youtu.be/4Z2uzEM0ugY


I'm always amazed by his impeccable cocktail of eloquence and directness, but here I miss a bit of a detour, the "on the rocks" for this particular drink: how would society heal from a pathological state of ignorance/misinformation/xenophobia/bigotry? Control of free speech is probably not the answer, sure, but now I'm intrigued.



If their next best option to working at an Amazon warehouse is starvation, then it's a pretty good thing for them that Amazon has a warehouse there. That said, I strongly suspect they have better "next best" options.


Funnily enough, we have enough food and shelter in the U.K. that this shouldn’t be a choice anyone has to make; the issue is one of inefficient resource allocation. Instead of noticing this and solving the problem, our Government have decided that if you desire any sense of dignity, you deserve to die.

Have some fucking empathy.


An Amazon warehouse being there is the reason they don't have many other options.


I'm curious to know these workers' next best employment option, the one they're forgoing to work at an Amazon warehouse.


Knights of the Old Republic


I'm not clear on why I should be prohibited from accepting more pay in exchange for being more available.


http://legistar.council.nyc.gov/LegislationDetail.aspx?ID=34...

Here's the legislation. It excludes "overtime", so you would be able to do this. The law just creates a requirement that you opt in to extended availability; it doesn't force you to always opt out.


Busybodies gonna busybody.


172. First let us postulate that the computer scientists succeed in developing intelligent machines that can do all things better than human beings can do them. In that case presumably all work will be done by vast, highly organized systems of machines and no human effort will be necessary. Either of two cases might occur. The machines might be permitted to make all of their own decisions without human oversight, or else human control over the machines might be retained.

173. If the machines are permitted to make all their own decisions, we can’t make any conjectures as to the results, because it is impossible to guess how such machines might behave. We only point out that the fate of the human race would be at the mercy of the machines. It might be argued that the human race would never be foolish enough to hand over all power to the machines. But we are suggesting neither that the human race would voluntarily turn power over to the machines nor that the machines would willfully seize power. What we do suggest is that the human race might easily permit itself to drift into a position of such dependence on the machines that it would have no practical choice but to accept all of the machines’ decisions. As society and the problems that face it become more and more complex and as machines become more and more intelligent, people will let machines make more and more of their decisions for them, simply because machine-made decisions will bring better results than man-made ones. Eventually a stage may be reached at which the decisions necessary to keep the system running will be so complex that human beings will be incapable of making them intelligently. At that stage the machines will be in effective control. People won’t be able to just turn the machines off, because they will be so dependent on them that turning them off would amount to suicide.

174. On the other hand it is possible that human control over the machines may be retained. In that case the average man may have control over certain private machines of his own, such as his car or his personal computer, but control over large systems of machines will be in the hands of a tiny elite — just as it is today, but with two differences. Due to improved techniques the elite will have greater control over the masses; and because human work will no longer be necessary the masses will be superfluous, a useless burden on the system. If the elite is ruthless they may simply decide to exterminate the mass of humanity. If they are humane they may use propaganda or other psychological or biological techniques to reduce the birth rate until the mass of humanity becomes extinct, leaving the world to the elite. Or, if the elite consists of softhearted liberals, they may decide to play the role of good shepherds to the rest of the human race. They will see to it that everyone’s physical needs are satisfied, that all children are raised under psychologically hygienic conditions, that everyone has a wholesome hobby to keep him busy, and that anyone who may become dissatisfied undergoes “treatment” to cure his “problem.” Of course, life will be so purposeless that people will have to be biologically or psychologically engineered either to remove their need for the power process or to make them “sublimate” their drive for power into some harmless hobby. These engineered human beings may be happy in such a society, but they most certainly will not be free. They will have been reduced to the status of domestic animals.


I find the Unabomber a bit downbeat. I think we're more likely to merge with the AI to some extent than end up in the scenarios above.


I find it curious that when someone suggests leaving an area of human action unregulated (i.e., not _a priori_ restricting otherwise non-criminal acts), others rush to read that as allowing crimes such as fraud to go unprosecuted.

