Hacker News | 87zuhjkas's comments

I would pick nginx in the first place.


Nginx may need tuning for high numbers of server blocks, so it's not completely immune: http://nginx.org/en/docs/http/server_names.html#optimization
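For example, the linked page describes bumping the hash sizes when nginx fails to start with a large number of `server_name` entries. A minimal sketch (the values here are illustrative, not recommendations; tune per the docs):

```nginx
# In the http{} context; increase if nginx complains about the
# server names hash. Defaults are small (max_size defaults to 512).
http {
    server_names_hash_max_size    4096;
    server_names_hash_bucket_size 128;  # often raised to the next power of two
}
```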


It depends. In real life you always have a 3D chess board (+ time dimension), and some chess UIs give you the option to render a 3D board, like ChessBase does: https://de.chessbase.com/Portals/All/2017/_eng/products/frit...


But you actually only play the game in two dimensions, because the pieces cannot move up or down. Normal chess is a game happening on a 2D plane, whether the representation of the board and pieces is 2D or 3D.


No, because that again depends on the representation. You can store the information of moves and chess positions in 1D. For example, two players can play a game of chess just by exchanging moves via Morse code.
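To make the point concrete, here's a tiny sketch (the opening moves are just illustrative): a game is fully determined by its move sequence, which is a 1D string.

```python
# A chess game is fully determined by its move sequence, so it can be
# stored -- or transmitted, e.g. over Morse code -- as a 1D string.
moves = ["e4", "e5", "Nf3", "Nc6", "Bb5"]  # Ruy Lopez opening, in SAN

# Flatten to a single 1D string...
encoded = " ".join(moves)

# ...and recover the move list from it.
decoded = encoded.split(" ")
assert decoded == moves
print(encoded)  # "e4 e5 Nf3 Nc6 Bb5"
```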


Don't solve a problem that doesn't exist. I find this article argues against behavior, principles and thoughts that only very few developers have; it's against extremism by formulating anti-extremism statements. But the anti-extremism statements form a kind of extremism of their own.

> The wrong way: Always use a framework on top of PHP.

If a dev is very knowledgeable in, e.g., Laravel, then let him use that framework as he wishes.

> The wrong way: The religious following of rules and guidelines.

If a dev team commits to a set of rules and guidelines and enforces them throughout all their projects, then let them do so.

> The wrong way: Always use object-oriented programming.

If a developer only knows how to program in this paradigm (or chooses to), then this is fine too. Don't force him to learn other paradigms. There is no "wrong way" and no "right way", because programming is not black and white.


The problem is that it's fine if you're doing this on your own.

It's when you force that mentality on others because you don't know how to do it a different way that it's wrong.

E.g. if I only use Laravel, why should my decision of only using Laravel dictate that the next project has to be done in Laravel? It might be something that can be done in 50 lines of code.

"But I only use Laravel, therefore you have to use Laravel to solve the problem."


> It's when you force that mentality on others because you don't know how to do it a different way that it's wrong

All decisions you make in collaborative work might be forced on others. It's also a kind of force when you have to work on a project that does not use any framework.


In short, the law of the instrument: when you have a hammer, everything looks like a nail.


>If a developer only knows how to program with this kind of programming paradigm (or chooses so), then this is fine too. Don't force him to learn other paradigms.

Ok, I don't know if I would want a developer that won't learn or can't be forced to.


I think "rough edges" is an understatement. It has serious design flaws and inconsistencies which are probably never going to be fixed due to backwards compatibility. It's like C++ now, tons of language features are added over the years but non of them is able to repair the language, similar to a game of Jenga, the tower will collapse eventually.


Or live with it and list GitHub downtimes as one of your business risks.


Absolutely, but if downtimes go up, then it starts to make sense to distrust that provider and think about alternatives.


How much downtime could it get? They might redesign the homepage a few times and break it accidentally, then it's done. They're not gonna redesign it again and again every week forever.


You under-appreciate how enterprise software development tends to go... there are always new "features" to be had, KPIs to measure, and no time to tackle tech debt...


Are you sure about that?


> "Or live with it..."

Try working within a deadline; no engineer worth their salt would ever take this advice.

Self-hosting is definitely a viable solution.


Hosting your own does not guarantee you 100% uptime. It just means someone less expert than the GitHub folks will be responsible for bringing things back online. It also costs you time and effort, which is a problem if you're on a deadline.

Same goes for the question of where to host it physically. It seems unlikely your physical server will have better uptime than a virtual server in the cloud.


It's about control, not anything else.

If all I need is a git repo with some tools, why pay someone to mess it up when I can mess it up for free?

The moment you place it in someone else's work queue you are tied to them... and they might not care about your project's deadlines. Just like GitHub.


> Why pay someone to mess it up, when I can mess it up for free?

Because it isn't free. Your time is a huge cost.

A senior dev who spends even 10 hrs on standing up a git server has blown through years' worth of GitHub costs, and that's assuming you're even actually using the paid service.

Factor in the extreme security requirements of a code server, including needing to update dependencies daily, and you're spending far more time self-hosting with riskier results.
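A rough back-of-the-envelope version of that comparison (the hourly rate and per-user GitHub price below are illustrative assumptions, not actual figures):

```python
# Cost of self-hosting setup vs. a paid Git hosting plan.
# All numbers are assumed for illustration.
DEV_HOURLY_RATE = 100.0          # USD, assumed senior-dev cost
SETUP_HOURS = 10                 # the "even 10 hrs" from above
HOSTING_PER_USER_MONTHLY = 4.0   # USD, assumed paid-tier price

setup_cost = DEV_HOURLY_RATE * SETUP_HOURS
months_covered = setup_cost / HOSTING_PER_USER_MONTHLY

print(f"setup cost ${setup_cost:.0f} ~ {months_covered:.0f} user-months of hosting")
```

At these assumed rates, 10 hours of setup time equals roughly 250 user-months, i.e. about 20 years of a single paid seat, before counting any ongoing maintenance.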


To add to this, and at the risk of restating my earlier points: even if you get it working, it still probably won't be as reliable as a provider like GitHub.

Keeping it secure is no small thing, especially if you want to permit access from arbitrary IPs on the Internet (rather than using a VPN, say). GitHub does this, and presumably they have solutions in place for everything from intrusion-detection to DDoS protection.

GitHub employs people to take care of server failover and data backups. You could spend your own time building your own solutions here, but they're unlikely to be as good as GitHub's. Your solution is guaranteed to be less well tested.

And that's assuming you even have a server room in the first place. You could run your own Git in the cloud, of course, but you're not really 'running your own' if you do that. GitHub take care of the server question (apparently they use a physical-server provider called Carpathia [0]), and because git always needs to be available but is only used rarely, the amount they charge you is probably less than the cost of running a dedicated server for the purpose.

And all that is assuming that a self-hosted GitLab is just as good as GitHub from the developer's point of view. It may or may not really matter, but GitHub is probably the more polished and feature-rich service.

Building a competitor to GitHub is possible but not trivial; see SourceHut. (We've been talking about GitHub, but of course they're not the only Git provider.)

I can see only a few situations where it makes good sense to run your own Git/GitLab:

1. Your Internet connection is slow and/or unreliable

2. There are extraordinary safety/security concerns associated with your source-code (military avionics code, say) so you want to run Git in an isolated network (no Internet connectivity at all)

3. Related to point 2: You don't want your organisation's data to reside in the USA. (To my knowledge GitHub don't offer any choice about this, but I could be mistaken.)

For the average developer though, I don't see much upside. Having more control isn't a compelling advantage, it's another way of saying you have more obligations.

[0] https://github.com/holman/ama/issues/553


An engineer can surely plan enough leeway into their deadlines that a few hours of GitHub outage doesn't scupper their project.


Yes; the question is whether management will allow that.


What about the business risk of someone tripping on your git server power cable?


You can plug it back in yourself.


For anyone interested in this sort of thing, ReGoth (Gothic 1 & 2) comes to mind and is still in active development: https://github.com/REGoth-project/REGoth-bs


Wow.

Gothic 1 was one of my favorite RPGs of that era. The setting was intriguing (it was devised to limit the world, but it was an interesting way of doing so), the world felt "alive", with night-day cycles and NPCs going their way, working, etc. I still have the box lying somewhere, because I keep the boxes of my favorites games.

(Unfortunately, I don't think I have any way to read the original game discs anymore, so I have the game and at the same time I don't have access to its assets...)


The development has come to a (temporary?) halt right now, the bs::framework we were using encountered some issues right after we adopted it... Hopefully we'll be able to resume some time in the future! In the meantime, https://github.com/Try/OpenGothic is more feature-rich.

(Disclaimer: I worked on the project but I am not the maintainer nor am I any kind of leader figure in the dev team)


Please stop acting as if you know what literally all people are not saying.


There's also Cloudflare Analytics: no JS tracking or server-side logging required, in case you are behind CF.


I prefer https://www.cpubenchmark.net/cpu.php?cpu=AMD+Opteron+6272&id... vs https://www.cpubenchmark.net/cpu.php?cpu=AMD+EPYC+7351P&id=3...

AMD Opteron 6272: Multi-Core Score 4368, Single Thread Score 741

AMD EPYC 7351P: Multi-Core Score 15792, Single Thread Score 1691

So the AMD EPYC 7351P is more than twice as fast in single-thread usage and more than 3.5x as fast in multi-core computation.
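The ratios can be checked directly from the scores quoted above:

```python
# PassMark scores as quoted above.
opteron_6272 = {"multi": 4368, "single": 741}
epyc_7351p = {"multi": 15792, "single": 1691}

single_ratio = epyc_7351p["single"] / opteron_6272["single"]
multi_ratio = epyc_7351p["multi"] / opteron_6272["multi"]

print(f"single-thread: {single_ratio:.2f}x")  # 2.28x
print(f"multi-core:    {multi_ratio:.2f}x")   # 3.62x
```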


But it is more than 10x cheaper.


That depends on what you count as cheap, e.g. how much it would cost to run these chips 24/7 for 5 years given their TDP (performance per watt).
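A rough sketch of that running-cost calculation, assuming the chips draw their full TDP around the clock (the TDP values and electricity price below are illustrative assumptions, not measured figures):

```python
# Back-of-the-envelope electricity cost at full TDP, 24/7 for 5 years.
# TDP values and price per kWh are assumed for illustration.
HOURS_PER_YEAR = 24 * 365
PRICE_PER_KWH = 0.15  # USD, assumed

def five_year_power_cost(tdp_watts: float) -> float:
    """Cost of running a chip at tdp_watts continuously for 5 years."""
    kwh = tdp_watts / 1000 * HOURS_PER_YEAR * 5
    return kwh * PRICE_PER_KWH

# Assumed TDPs: Opteron 6272 ~115 W, EPYC 7351P ~170 W
for name, tdp in [("Opteron 6272", 115), ("EPYC 7351P", 170)]:
    print(f"{name}: ${five_year_power_cost(tdp):,.2f} over 5 years")
```

At these assumed rates the power bills differ by a few hundred dollars over 5 years, so a >10x difference in purchase price can still dominate, while the EPYC's much higher score per watt cuts the other way if you need the throughput.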


> Conway's Law is a physical law in the same sense as Murphy's Law. It's also obviously true.

It's like a tautology: "In logic, a tautology is a formula or assertion that is true in every possible interpretation."

