That's a little misleading. Git was created in 10 days in the same way that Apple was created in a garage in a month. Or the way that Facebook was created in a dorm in one night.
Git v0.0.1 might have been created in 10 days, and the foundational ideas were certainly excellent. And it's amazing that two of the biggest contributions to the world of software came from the same person.
Well, this quote from Linus Torvalds is in the article:
> "So I’d like to stress that while it really came together in just about ten days or so (at which point I did my first kernel commit using git), it wasn’t like it was some kind of mad dash of coding. The actual amount of that early code is actually fairly small, it all depended on getting the basic ideas right. And that I had been mulling over for a while before the whole project started. I’d seen the problems others had. I’d seen what I wanted to avoid doing."
Well, the fact that he and the other Linux maintainers had previously used (and were intimately familiar with) BitKeeper, and thus had a good idea of what their new FLOSS DVCS should look like, probably also helped.
Not saying that he copied BitKeeper 1:1, but if you look at BitKeeper usage examples (http://www.bitkeeper.org/), they do look familiar, don't they? BTW, BitKeeper is now open source - and development seems to have largely ceased...
Well, this too is pretty clearly highlighted in the article... I mean it doesn't really matter too much, but
> He aimed to create a tool that was:
> Distributed: [...]
> Compatible: The new tool would incorporate as many features and workflows from BitKeeper as possible.
> Secure: [...]
and there is another quote from Linus Torvalds contained in the article:
> 'Well, it was obviously designed for our workflow, so that is part of it. I’ve already mentioned the whole “distributed” part many times, but it bears repeating. But it was also designed to be efficient enough for a biggish project like Linux, and it was designed to do things that people considered “hard” before git – because those are the things I do every day.'
Honestly, I think that is the key. He didn't just create Git out of thin air... he spent a lot of time thinking about what he wanted, and using something that gave him a conceptual foundation.
10 days of writing code is impressive, but there would have been many, many days of thinking that preceded it.
> And it's amazing that two of the biggest contributions to the world of software came from the same person.
That's not that amazing at all, because the fame and weight of being the creator of the former led to the adoption of the latter once it became the official version control system on which Linux was developed. These are not two independent results.
If someone completely unknown had developed Git 0.01 in 10 days, it would probably have stayed relatively obscure, and the reason for its quality is that it became so famous through being tied to Linux, attracting many developers.
I certainly agree that marketing plays a huge role in software, but I was actively looking at different VCSes, trying even the obscure ones, and there were really no reasonable non-proprietary options.
Bazaar and Mercurial were created the same year as Git. Before that, you were happy if people were using Subversion instead of CVS.
Sure, if nobody knows about the project nobody will use it, but I think git stands on its own. If any medium project adopted it (and it seems likely that its author would be working on some other things too), I think it would spread.
Linux might have provided fertile ground for the adoption of Git but Git had to exceed existing standards to achieve its own merit-based acclaim. He created these two tools that both have achieved high degrees of utility and widespread adoption within their respective domains.
And many things that objectively exceeded existing standards somehow lost out to those standards, or to inferior standards that came later. The success of software in particular is about being at the right place at the right time, as well as interoperability. Plan 9 is generally considered a technical improvement over Unix, but where is it, and who uses it? Redox can improve on Unix all it wants at a theoretical level, but it will probably never outcompete it due to interoperability, and because of that it won't attract the developers needed to add the features to compete.
Linux in particular started because a student made a task scheduler on his home computer with no real goal of making a Unix-like kernel; it grew into one, and it so happened that at the time there was a big vacuum for a free Unix-like kernel. kFreeBSD and kMinix were not yet free at that time; had they been, Linux would no doubt have stayed a hobby project.
I'm fairly certain a lot of very influential software came from the hands of Bill Gates as well, such as the FAT filesystem (which is still in use), the original MS-DOS, and the BASIC interpreter, all of which made a huge impact.
Of course, that's mostly because much of it was bundled with the original MS-DOS.
Also, in terms of video games, a great deal of innovative techniques and games came from John Carmack, which of course is also a result of id Software pushing them.
Not for nothing, but after the first big contribution from a person, aren’t successive contributions from that person more likely to be big and successful?
i don't see that many people with more than one contribution that have the caliber of linux. so to me it looks like that when the first big contribution is wildly successful, then they will stick working on that and not start other projects.
possibly one of the reasons why git could be successful is that linus quickly handed over development to someone else.
off the top of my head i can only think of a few people that created multiple wildly successful projects: donald knuth with his books and with TeX (and i am not sure if the books are considered wildly successful), richard stallman with emacs and gcc (and the GNU licenses), anders hejlsberg with turbo pascal, c#, and maybe typescript.
if you look at everyone else, they usually have only one wildly successful project that they are known for.
here is a list of programmers: https://en.wikipedia.org/wiki/List_of_programmers
most there only have one project that is widely known. if they have multiple projects then most of those are smaller.
Not necessarily; there are plenty of examples where earlier success did not lead to future success. A recent example that comes to mind is CloudKitchens. I could be wrong about CK, so happy to learn differently.
I am not sure why you're stipulating that there needs to be a threshold of minimal examples to present a contradictory pov, or why you need those counterexamples to be greater in number. Most mammals don't have bills, but platypuses exist. Should we not describe a platypus as a mammal simply because there are not enough examples of this type of animal? Or maybe I misunderstand your point.
Yes, they said "more likely", so I think what I said still works: just because you were successful in the past doesn't mean you're more likely to succeed in the future in all circumstances, i.e., if you extrapolate across an infinite number of scenarios, your likelihood of success doesn't remain constant in each and every one of them. If that were the case, then we wouldn't have Lehman Brothers, or Polanski (the filmmaker), or SBF.
Indeed, if you look at the first Git commit, you can see that it's around 1k lines of code, and it's so basic that you can't compare it to today's Git. Much of it is also devoted to memory management in C, rather than actual business logic. You could achieve the same functionality in just a few hundred lines of code in Go or Rust today.
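To make the "it's basic" point concrete: the heart of that first version is content-addressed storage, and the object format has barely changed since. A blob's name is just the SHA-1 of a short header plus the file contents, which you can reproduce by hand (a sketch assuming `sha1sum` from coreutils is available; the "hello" content is just an example):

```shell
# git names a 6-byte file containing "hello\n" by hashing
# the header "blob 6\0" followed by the contents:
printf 'blob 6\0hello\n' | sha1sum

# git's own plumbing computes the same id:
echo 'hello' | git hash-object --stdin
```

Both lines print the same 40-hex-digit id, which is why identical content only ever gets stored once.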
“The Linux kernel, an enormous open-source project, used a proprietary VCS called BitKeeper. However, due to a conflict between the community and the company behind BitKeeper, the free-of-charge status was revoked.”
The details about that conflict are worthy of an article in and of themselves :)
Imagine having the most famous open source project using your product for the world to see and losing all that free publicity in a desperate attempt for control.
Yeah, BitKeeper was one hell of a story. They also had a clause that holders of commercial BitKeeper licenses could not develop competing products, and they even forced a company to ban one of its employees from contributing to Mercurial(?) or the whole company would have had its BitKeeper license revoked.
The sad part is that BitKeeper really was an amazing product and way ahead of its time, pioneering many of the features Git later made popular. Had they just been a bit smarter about the whole thing, they could be where Git (and GitHub) is today.
Ironically, the last thing they did before going bankrupt was release BitKeeper as open source.
Agreed -- and it's absurd that BitKeeper's enormous influence here has been reduced to two sentences in the article. Not to take anything away from Torvalds and Git, but McVoy and BK (and TeamWare and NSE/NSElite before it) are the real pioneers here. I went a bit deeper into this history in a USENIX talk in 2011[0] -- but I would welcome a more thorough treatment!
he has only himself to blame for that. the whole story shows the power of collaboration in the FOSS community. get in the way of that collaboration and you will get rolled over. i have seen that happen multiple times. you get to ignore the will of the community at your own peril.
Not mentioned in the article but I think what triggered BK to pull the license was Andrew Tridgell (SAMBA and rsync) reverse engineering BK's protocol to implement a third-party client.
Torvalds didn't create anything like the `git` we know today in 10 days.
It was heavily inspired by `monotone`, which popularized the concept of a Merkle tree in version control. Torvalds' comments about what he liked and didn't like about `monotone` are preserved on the Wikipedia page. In particular, he criticized its use of C++ and SQLite.
Torvalds' git, on day 10, was some basic routines which saved and restored the contents of a directory. But it did satisfy his desire for it to be fast and minimal.
Someone named Jacob Stopak helpfully found the original version of git and commented it heavily, here:
A bash script shows what the workflow would be like. You would be adding files one by one, saving trees, obtaining a SHA from that tree, then retrieving the SHA. There is a very basic notion of history.
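That early workflow can still be re-enacted with plumbing commands that ship with git today (a sketch: the 2005 originals were named update-cache, write-tree, commit-tree and cat-file, so treat this as a modern equivalent, not the historical tool):

```shell
mkdir demo && cd demo && git init -q
git config user.email 'you@example.com'   # set an identity so
git config user.name 'You'                # commit-tree can record an author

echo 'hello' > greeting.txt
git update-index --add greeting.txt                       # stage files one by one (was: update-cache)
tree=$(git write-tree)                                    # save the tree, obtain its SHA
commit=$(echo 'first commit' | git commit-tree "$tree")   # wrap the tree in a commit object
git cat-file -p "$commit"                                 # retrieve the object by that SHA
```

History enters via `git commit-tree "$tree" -p <parent>`, which is about as basic as the notion of history got back then.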
As is typical with Torvalds, he left it to others to write tooling around the basic concepts soon after this. Torvalds deserves credit for the core ideas and a barely usable prototype. But most importantly, he enforced its use on a major open source project, very early in its life, which ensured that it would be improved upon.
Sorry! I did a quick visual check and I thought I was replying to someone different from the one who posted the article, and she had additional anecdotes. Don't know why I missed that they were the same.
i have used cvs, subversion, and git, and to a much lesser extent, darcs and hg. i would have to agree with both the article that this (d)vcs is a breath of fresh air compared to cvs and svn and also with your post and this tweet https://twitter.com/markrussinovich/status/15784512452490526... .
i believe i've noticed some nice changes coming to git over the years, and i'm largely comfortable with the toolset it offers; occasionally i advance my knowledge of it. i mostly use: git diff, git status, git commit, git rebase, git pull, git log, occasionally git merge, or once in a rare while git reflog or whatever else comes up when i try to look up "how do i get back commits from a stash i popped and then checked out" (losing the changes within the stash). i actually did that again the other day and i think i tried git reflog and git fsck for the first time, on my Windows Subsystem for Linux (Ubuntu) machine (no idea if that is relevant), and i could not find my lost stash at all, and just ended up recreating the changes.
git is nice in the distributed sense, and it is fast for my use. git lfs has been quite a pain for me in the past (but maybe it has improved), but i do find the commands quite arcane, and it was very hard for me to get used to them. i am all in now and i appreciate the ux changes the project has made (and likely will make) over the years - the git command output often gives a short educational message about what command to execute in place of some arcane thing i've been using for years, and there are often good blog posts describing how to use new commands - but i spent so much time figuring out what the hell i was doing that it would be nice if i could have spent it another way.
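fwiw, on the lost-stash problem: a dropped stash is just a commit that nothing points to anymore, so something like this can dig it up (a sketch based on my understanding of git's fsck output format - verify on a scratch repo first):

```shell
# list objects that nothing (not even a reflog) reaches anymore;
# a dropped stash shows up as an "unreachable commit" line
git fsck --unreachable --no-reflogs |
  awk '/^unreachable commit/ { print $3 }'

# then inspect each candidate sha and restore the right one:
#   git show <sha>
#   git stash apply <sha>
```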
I always said git is very nice and understandable after you read through the Git book three or four times.
Which is to say, everything in it makes intuitive sense if you know exactly how it works - which is fine for developers, but not exactly great for a normal user who just wants to have some version control while writing his novel.
But I think tooling got to the point where that is no longer the problem.
The "porcelain" (as opposed to the plumbing) is terribly arcane.
I had a really tremendously bad experience with using git professionally and haven't touched it since, but apparently instead of something like "undo" you get "reflog." I don't know what to say about that. Everyone tells me that the internals are great, wonderful, magic, and I am willing to accept that on face value, but the selection of command names just baffles me. As a solo developer, I will just live without it until the names make more sense and I can find a use case.
(I am quite sure that someone will begin howling that you cannot program without git the same way projects cannot be done without Agile, and yet programming existed before either.)
It's not the internals that make Git special, it's the fact that GitHub uses it, and all the other developers know it.
I always say people should learn VCS at the same time as their first programming language, it's that important. And GitHub is as close to "the standard" VCS as it gets.
Git cola makes most tasks easy, and the stuff it can't do, can be googled.
Massive amounts of yak shaving. Two weeks ago I decided to build a new web app. Before I build it, I need a solid deploy strategy to save money since Heroku nuked their free tier. Currently writing a new programming language so I can reinvent Terraform with an AGPL license. I should deploy hello world by 2049.
yakshaver.dev is available. We should be able to implement a DNS server and brainfuck to web assembly transpiler to launch a landing page by 2035. Let's do it.
Hacker News is completely filled with people who have gotten crappy early versions of things out in that time frame. You don't see people claiming their first versions were great.
SourceSafe was nice with its built-in tools integration when everything else at the time was nonexistent. It was really meant for a 2-5 person team. Anything bigger and the thing would die on its own database updates. But if you are still using that thing today, you should move away from it; it is terrible for modern usage. I got to the point where I was doing backups every 6 hours and keeping rotating backups on hand for when it decided to corrupt itself. Then I was subjected to ClearCase, and SourceSafe was amazing compared to the complexity that thing brought. I knew it was bad when they had to send out a consultant just to show us how to get it installed correctly and make branches.
ClearCase?! An enormous pile of unnecessary complexity. Where I worked they hired a full-time person just to manage it!
I remember fondly, there was no way to back out a commit. The hired consultant told me "Just check out the previous version and check it back in over the bad change!"
So now our test group would have two(!) changes to test, instead of zero. Brilliant.
I also remember ClearCase had five (5!) views of the same database. Commands worked on one of the five views, and god help you if you got confused.
SourceSafe apparently had integration with the VBA editor in Office 2000 [1]. I've never used it, but it seems like it would have been handy when I had to work in VBA.
When you anger the wrong nerd, they will rip apart your entire business's niche in a 2-week-long caffeine- and anger-fueled hackathon #justbitkeeperthings
These pieces of ostensibly impressive trivia of the format "X was written in Y days" promote a fallacious impression that managers often suffer from: that "work" is writing code, and great developers "work" fast. Writing v1 is the easiest and fastest part of making a great product, by far. Understanding the details of the problems it solves and the ramifications of specific solutions - that's the hard work that takes years of experience and months of planning, and is the real measure of a developer. Unfortunately, the former can be tracked with metrics, while the latter cannot.
Linus's biggest contribution to software engineering is git but it took writing a kernel first to give him the idea. I'd argue that git has had a bigger impact on the world than Linux.
The Linux kernel is an amazing piece of software but especially in the beginning wasn't especially innovative. The Solaris kernel was a much better implementation of the Unix lineage. The Mach kernel was interesting from a research perspective, and lives on in MacOS and iOS. HURD was a worthy experiment, as was Plan 9. But git was really head and shoulders above anything that came before.
I wrote a curses-based cheque processing system in C over a weekend after our old cheque processing server literally went up in smoke. I'm guessing it is long dead and did not become a leader in its field, though.
I was using BitKeeper at the time and was just glad that there was (going to be) an alternative to all that drama! BitKeeper was a nice product but man did they misunderstand their customers.