GitHub major issues - repos have "lost" commits and site is erroring (support.github.com)
119 points by andrewljohnson on Jan 30, 2011 | 86 comments



Suggestion for creating a startup:

For private repositories gitolite is really awesome, but I think there should be a company built around it. What I mean is that GitHub or some other company should offer hosted gitolite instances, in the following form:

You pay something like price_of_small_linode_instance + price_of_service, to get:

A virtual machine with gitolite up and running, with a good web interface and online documentation. I think the web interface is the one piece this company should not open-source. With this web interface you can create repositories and users, and modify the gitolite settings in general.

A backup service, in some other different VM and network, so that if something goes wrong with your gitolite VM you can ask a free (and automated) restore, possibly with hour granularity for the latest 24 hours, and less granularity for bigger periods and so forth.

Note that this gives the user virtually unlimited repositories... as long as they don't upload movies to their instance, since even a small Linode instance has plenty of space for git repositories.

This way it is a "pure product": there is no need to manually install or repair virtual machines, and the company can focus on evolving the offering, developing better versions of gitolite, the web interface, and so forth.

p.s. I mentioned Linode, but it could just as well be EC2 or whatever is better suited for this stuff.


What do people really think of this idea? Would you buy it? I have written a web interface for managing git repos called GitHaven (my personal home install is here: http://git.meyerhome.net:8080/). It has the usual goodies: private/public repos, etc. Some friends' companies already run it in the cloud. Do you think it would actually make money, i.e., would you be willing to join me in a startup on it (in Boston)? Would it pass the Y Combinator test?


Probably not. Github is on a path to become the market leader, has a critical mass of customers, and has a business model that harnesses those customers to benefit from network effects. This idea is akin to building a business offering private white-label Stack Overflow clones. Key point: the actual software is not what makes Stack Overflow and Github valuable.

Simpler reaction: Github:FI is already priced at a rounding error for the kinds of companies from which most of the revenue in this space will originate.


From the research I have done there are three major types of git server buyers:

1) We are so small that we don't even think twice and buy some private repos on GitHub. (i.e. no market there)

2) We are so big that we can't possibly fathom having our code on someone else's server, so GitHub.com is out of the question. Sadly, GitHub:FI is priced per user, and because we are really big we have lots of devs, so it is too expensive in most cases. (i.e. enterprise sale, yuck)

3) Medium-sized companies that have more than a handful of devs (more than 5, say), a ton of repos, or large repos, and want an alternative to GitHub. Many are slapping up gitolite, which, while good, is a far cry from GitHub.

As I mentioned, I have already found companies that want this solution, and they are using GitHaven. The real question is whether this market is too small, or whether I should just move on to another project, open-source GitHaven, and let everyone who wants to slap it up as a frontend to their existing gitolite installs.


Most products targeted at developers are per-seat priced (or have pricing derived from per-seat pricing); it's how you do value pricing when value scales with the number of developers. Look at how Atlassian does it; that's still effectively per-seat licensing (it requires prospects to think about how many devs they'll have).

You can try to avoid these schemes, but you have to manage it without going out of business.


Yeah, the per-seat point is really moot when it comes to enterprise. ClearCase is $4,000 a license, and a lot (and I mean a lot) of large companies use it. I know of a company that is currently spending $2.5 million a year on ClearCase licenses; they really want to get off it, but ClearCase is so entrenched/integrated into their workflow that this is not a trivial task. GitHub is not going to solve this problem for them.

The problem with Github:Fi is the value proposition is extremely weak when it comes to meeting enterprise needs.

- They have an issue tracker that doesn't support custom fields. This is critical for companies with 500+ developers and/or very diverse product lines.

- They have a wiki, but you can get Confluence for $12,000 for the entire company, which is more feature-rich and has decent searching.

- They have blogging but this is of no value in the enterprise world. The majority of employees really don't have the time or care about blogging internally.

- They have Gist which offers no real value for the same reason why blogging isn't that big of a deal.

When it's all over, their only strong value proposition for the enterprise world is their intuitive push and pull interface. But the problem with this is they are competing directly with free solutions like:

- gitorious

- gitolite + internal web interface

- gerrit

I'm obviously speculating here, but based on how their pricing model changed:

old: $1,000 per user (initial purchase) + $200-something per user per year in subscription costs

new: $250 per user for fewer than 25 users, $200 per user for more than 25

I think they realized they overestimated their value proposition initially. If GitHub wants to seriously break into the enterprise world, they are either going to have to lower their per-seat cost and/or strengthen their value proposition.


"Github:FI is already priced at a rounding error for the kinds of companies from which most of the revenue in this space will originate."

How many of these companies do you think have developers using git in the first place (As opposed to subversion or [wait while I assume a look of sufficient disgust] visual source safe)?


Guess the real question is why do people keep upvoting the first comment?


Probably for many of the same reasons they tended to vote up the "I could clone Stack Overflow over a weekend" comments. I wouldn't confuse karma with market research.


Just posted to the support page.

"Sorry guys, there was a Javascript error displaying an error when there was none. We've deployed a fix for this and everything should be good to go."


This seems to be fixed now - for my repo at least.

The GitHub guy saying it was a "JavaScript display error" is incorrect though. I was pulling a repo and getting an old version.

EDIT: I'm not getting the JavaScript display error, but I'm still missing my commit, so I guess these errors are unrelated :/


Did you file a support ticket for this? It sounds like an unrelated problem. We can investigate if you let us know what repo is causing the problem. Visit http://support.github.com or email support@github.com.


I posted the name of the repo to the thread, and someone else on the thread is confirming the same problem.


It turns out that GitHub didn't lose any of my commits. All future readers should ignore my comments on this.


This was my fault. GitHub was never messed up.


For what it's worth, I was pulling & pushing just fine while the error was displayed on my repositories, but other people reported commits lost or mangled. Maybe there was something more to the Javascript display error?


Is it just me, or has GitHub's performance and reliability been getting progressively worse over the past month or two? Even putting this issue and the outages aside, pushing to a private repo takes _minutes_ now, and file lists in the web interface often don't load without a page refresh.


At the price and size of Linode instances, it's cheaper, and I feel better knowing I'm in control, when I run Redmine+Gitosis. When I had my GitHub account and something went wrong, I couldn't just ssh in and fix it. All I could do was wait.

Now, for some people, I can see how waiting is an easier solution. I personally just don't like it.

(Of course, this is referring to private repos. Public repos are still king on GitHub.)


It's amazing to me that you feel more confident in your own abilities to keep a server running than GitHub... you must be a really great sysadmin! The GitHub folks are pretty hardcore!

I, for one, am a positively mediocre sysadmin. And the amount of time I have to spend waiting for GitHub to fix bugs is almost certainly far, far, FAR less than the amount of time I'd spend scrambling (but not waiting!) to fix my own.

This is also how I feel about Heroku.

For whatever reason, I prefer the helplessness I feel while waiting for them to fix stuff to the helplessness I feel when I'm in "OH SHIT MY SERVER IS DOWN AND I DIDN'T EVEN DO ANYTHING AND WHY IS THE DATABASE NOT RESPONDING IT RESPONDS WHEN I QUERY IT MANUALLY AAAUUGGHHH!!!" mode.

Although I do try to keep myself sharp by doing things by hand in other areas of my engineering activities.


I don't feel like I could keep GitHub running as well as most, but for the entire time I've had my personal server running, it only went down once, due to a hardware failure on Linode's end. For simple stuff, a simple sysadmin will suffice. GitHub has thousands of users and cares about performance; I have maybe 5, and I couldn't care less if my server used 2000% more CPU than it's supposed to. Different levels of complexity, you see.


+1 to you for that. We tried GitHub for some of our private projects, but soon switched to Linode + Redmine + git/hg. Now, we are no Rails champs; we're all Python guys, but setting up Redmine (with some plugins) was a cakewalk, and the same goes for git/hg. We never needed to hire hardcore sysadmins for this stuff.

PS: We love GitHub very much. It's made it so easy for everyone to collaborate.


The beauty of a DVCS is that I have a full copy on my machine as well.

But seriously, if you use git over ssh, there is no sysadmin work other than running apt-get upgrade.


There's always sysadmin work.

1) You need to keep current on your security patches.

2) You need to upgrade your OS when it reaches end-of-life.

3) You need to make backups, and more importantly, verify that those backups will actually restore. (For git, this is most critical if you have a large team and dozens of repos.)

You can ignore a server for years, but eventually you'll get compromised or lose a hard drive. (If you use RAID, you'll eventually lose an entire RAID array. Not fun, and not cheap.)
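Point 3 can be sketched with plain git: a bare mirror clone captures every ref, and a quick comparison checks that the backup is actually usable. The paths below are temporary stand-ins for a real server-side repo and an off-site backup location.

```shell
# Sketch: back up a repo with a bare mirror clone, then verify it.
set -e
src=$(mktemp -d)/src.git
backup=$(mktemp -d)/backup.git

# Stand-in for the repo being backed up, with one commit in it.
git init -q "$src"
git -C "$src" -c user.email=a@b -c user.name=a commit -q --allow-empty -m "initial"

# Backup: --mirror copies all refs (branches, tags), not just HEAD.
git clone -q --mirror "$src" "$backup"

# Verification: an unverified backup is not a backup.
test "$(git -C "$src" rev-parse HEAD)" = "$(git -C "$backup" rev-parse HEAD)"
echo "backup verified"
```

In practice the `git clone --mirror` (or a `git remote update` on an existing mirror) would run from cron against the live server, with the verification step alerting on mismatch.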


I'd say "most developers" already have a box running some kind of website where they are doing this anyway, but even if you disagree, I think you have to be willing to cede "many", especially given how many people who use GitHub are currently doing Ruby on Rails or node.js work. I'd even go so far as to say that those who currently aren't /should/, as sysadmin experience is important for understanding how other sysadmins will react when they see how your software is deployed, which I guess is another topic of discussion that comes up often here (oft filed under the "Debian vs. Ruby" banner).

(Part of me is wondering if many of the more controversial discussions on this site are between people who have sysadmin experience (and considered it valuable) and people who don't.)


"Part of me is wondering if many of the more controversial discussions on this site are between people who have sysadmin experience (and considered it valuable) and people who don't."

I've done a fair bit of sysadmin work over the years, mostly in self-defense, because I want fewer crises when a critical development server eats itself.

Lots of people can figure out how to install Ubuntu, or rent a Linode. But if they don't master upgrades, patches and backups, they'll eventually end up paying a real sysadmin a lot of money at the worst possible moment.


If all you're doing on your server is git, then the first two points you make are covered by basic apt-get usage. It's super easy.

Point 3 isn't that necessary if all you're doing is git over ssh: all the people with a checked-out copy have a backup of your repo. Also, if you're only doing git over ssh, it's not hard to make the box relatively secure. ssh will be the only open port, and it won't be on the standard port.


Maybe for a few months. When you run a server for 2 or 3 years the distro goes out of support, software upgrades are getting behind, before you know it you are compiling patches from source, ...

I've done this several times over the last 5 years and am now finally moving everything to specialized hosting services. Just hosting a web site, photo repo, mail, and svn on a machine (VPS) is enough to make it a serious hassle. Moving all of those to specialized services costs the same, is less work, and is more reliable.


> Maybe for a few months. When you run a server for 2 or 3 years the distro goes out of support, software upgrades are getting behind, before you know it you are compiling patches from source, ...

You can get 5 years security updates support with Ubuntu Server LTS and you can set up safe unattended updates[1] with email notification if anything ever requires your attention. If you set your backups right as well, you can get away even with catastrophic hardware failures.

Remember, this is not some gigantic scale, it shouldn't be that difficult.

[1] https://help.ubuntu.com/10.04/serverguide/C/automatic-update...
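For reference, a minimal sketch of what that setup looks like on Ubuntu/Debian. The file names follow the conventional apt layout, the release pocket ("lucid-security" for 10.04 LTS) and the mail address are illustrative:

```
// /etc/apt/apt.conf.d/20auto-upgrades -- run the update machinery daily
APT::Periodic::Update-Package-Lists "1";
APT::Periodic::Unattended-Upgrade "1";

// /etc/apt/apt.conf.d/50unattended-upgrades -- security pocket only,
// with mail notification if anything needs attention
Unattended-Upgrade::Allowed-Origins {
        "Ubuntu lucid-security";
};
Unattended-Upgrade::Mail "root@example.com";
```

Restricting Allowed-Origins to the security pocket is what makes this "safe": only patched versions of already-installed packages come in, never new functionality.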


"...it shouldn't be that difficult"

Famous last words...

I'm not saying it can't be done, I'm just saying that my experience has been different. Another example: I used to have my own installation for our websites, bug tracking, and timesheet management software, on a locally-hosted machine with Apache and MySQL and a bunch of other software. The number of hours I've spent migrating data between versions, tweaking mod_rewrite rules, setting up backups for various databases and other data, generating usage statistics, etc. - I don't even want to think about it.

Recently I moved to a shared hosting server. The yearly cost is covered by a single hour at my hourly rate (and I'm not even expensive). I can set up 50 or so different applications with a few clicks in a web UI, and upgrades are handled by it too. Backups are taken care of, and I have a web UI for DNS, SSL, everything. It was like a breath of fresh air.

To each his own and each situation is different, of course. But there's something to be said for division of labor.


> The number of hours I've spent migrating data between versions,

It seems (from your previous post as well) like you were trying to stay on the cutting edge functionality-wise. OTOH, I was arguing for running a stable and maintained Linux distribution.

I'd certainly agree that the initial investment is pretty big (not only does it take some time, but you need non-trivial, specific knowledge of Linux administration), and for that reason alone I'd advocate using maintained hosting. My only beef with your comment was that, in theory (and in my experience), a properly configured VPS running stable software shouldn't require anywhere near the level of maintenance you seem to be suggesting.


Absolutely a matter of scale - it depends on the app, the audience, your budget, your workflow....

A one-man operation doing sysadmin, development, everything? I'd be looking for hosted solutions too for my clients - it makes my life easier.

But as a company, with a development and operations staff? I'd be wary.


Sure, but I figured we were talking about small organizations. One-man shop to maybe 10 people or so, or even more, anything too small to have a dedicated sysadmin.


Sorry, but there is no such thing as "safe unattended updates" when you have anything more complex than your home desktop box. There's way too much that can and will go wrong to be that naive.


Safe unattended means no configuration files are altered, and those updates are not adding new functionality in the first place anyway. It's only package X.Y with a security patch applied.

Actually, your home desktop box should be much more difficult to upgrade than a simple generic server box with generic virtualized hardware drivers.

There are valid reasons to use service providers or maintained hosting but properly configured (and backed up) VPS running stable (& maintained) software can get you a long way.


There is if you're only using your VPS for git over ssh, which was the point that started this whole thread.


Easy:

1) get a distro with 5 years support.

2) When it goes out of date, buy a new $20/month VPS, install git and ssh, move your repo, and turn off the old server. This might take 2 hours, every 5 years. Not a big deal.


It's not that hard to install a source control server, whether it's git, svn or mercurial.


Yes, I can confirm that it's surprisingly easy to run git with gitosis over ssh on your own server for a few dollars. Takes a couple of minutes to setup. And of course you can make it public and you can share it with other people with keys.
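A rough sketch of that "git over ssh" setup, with a local path standing in for the ssh remote so the commands are self-contained; in real use the remote URL would be something like user@host:project.git:

```shell
# Sketch: a minimal self-hosted git remote.
set -e
remote=$(mktemp -d)/project.git
git init -q --bare "$remote"     # on the server: git init --bare ~/project.git

work=$(mktemp -d)/project
git init -q "$work"
git -C "$work" -c user.email=a@b -c user.name=a commit -q --allow-empty -m "first"

# on a client this would be: git remote add origin user@host:project.git
git -C "$work" remote add origin "$remote"
git -C "$work" push -q origin HEAD
```

That really is the whole server side: a bare repository plus an ssh account (tools like gitosis/gitolite only add key and permission management on top).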


> The GitHub folks are pretty hardcore!

What gives you that idea? I run several cronjobs that deal with their service, and it's down a lot compared to other similar services. My jobs deal with that gracefully, but if I wanted to I could easily set up a more reliable service than GitHub if my only goal was to host *.git repositories, not have any of the other nice things they provide.


Well, this: https://github.com/blog/530-how-we-made-github-fast

And the fact that they have somewhere north of 5 terabytes of active data, and their job queue is handling many hundreds of thousands of jobs a day, and it pretty much all happens extremely fast with nary a hiccup. I know there are outages, but I'm pushing up changes to them something like 10 hours a day every day, and I've never noticed them.


They've struggled and dealt with a number of hard engineering problems, which they've blogged extensively about. In particular, their architectural (both hardware and software) changes since leaving Engine Yard demonstrate more than a simple "working knowledge" of how to build large-scale systems.


A major selling point of git is the distributed part. You are allowed (even encouraged) to have more than one "source of truth". For ~$7/mo (admin included, and cheaper than Linode), GitHub is just another place to have a hosted version of your repo, with a nice UI and social features.

Shit happens, servers go down, that's why you also have a remote repo hosted on Linode, and X, and Y too.


This is an interesting theory, but it isn't how git's client actually works. If the repository exported a list of "mirrors" to the client that it then stored and was willing to use, that would be awesome, but otherwise you have a million people out there who are now just getting error messages when they do "git pull" and the only fix is for them to go back to your web page and try to get information on what is happening and where else they can switch their origin. Meanwhile, if you do your own hosting you can just update your DNS to point to another box and no one is the wiser.
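For the record, the manual fix each downstream user has to apply when an origin dies looks something like this (hostnames are made up):

```shell
# Sketch: repointing an existing clone at a different origin URL.
set -e
repo=$(mktemp -d)/clone
git init -q "$repo"
git -C "$repo" remote add origin git://github.com/example/project.git

# The hosted mirror is down; switch to the project's own server.
git -C "$repo" remote set-url origin git://git.example.com/project.git

git -C "$repo" config remote.origin.url   # prints the new URL
```

Simple for anyone who knows the command exists, but every consumer of the repo has to find out the replacement URL and run it themselves, which is the point being made above.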

The key problem, frankly, is that GitHub conflates two entirely unrelated things: a nice UI and social features, and a hosted version of your repo. I love the idea of outsourcing a nice UI and having cool social features, and /maybe/ to make those features work they need to have a mirror of my repository (I'm not convinced), but when people go to pull it, the URL listed should be the actual upstream, the one where "I own the DNS on this and feel I can make this stable in the long term", not the GitHub mirror.


This is a straw man argument. The question isn't whether Git could be more intuitive/user-friendly (Hint: It should be, in fact I bet my company on it [see my profile]), or whether it is more secure/cheap to host your own repo.

If you have a million people `pull`ing from your repo, of course you should be hosting your own public access point. But in 80% of cases, people can't be bothered to figure out how to set up Gitosis, pay for slices, mess with DNS, etc., just to host a repo.

To put it another way, see: Heroku vs. EC2


This comment is totally unrelated and is itself a strawman. Yes, it is easier to use GitHub: I will not argue that fact. However, using GitHub will cause people to be pulling from GitHub, and GitHub may go down. This is a tradeoff, and one people make a lot: you use a shared platform and give up control of the URL to get easier outsourced hosting.

But to argue that git's decentralization solves that problem is disingenuous: it means that people could theoretically still pull your repository, but only after finding out what that fallback URL is and manually resetting their origin, which 90% of git users don't even know how to do.

Meanwhile, many people are willing to spend the five minutes it takes to learn how to run their own server, and want to avoid this tradeoff by hosting their own stuff on their own hostnames so they can publish stable URLs, but /can't/, because they like GitHub's social features, none of which (due to the aforementioned distributed features, humorously) actually require GitHub to be the canonical repository URL. If you use GitHub, you are going to have people cloning and pulling your GitHub mirror (or even worse: adding your GitHub mirror as a submodule), and when it goes down they are going to get errors, and you will have no control over it. That sucks.


Web services go down. It happens. You make the choice to use them anyway.

In this case, Github is very responsive about outages and clearly strives to eliminate or reduce them as much as possible.


And sometimes web services go down for good. Again, this is an understood tradeoff, and I'm not arguing against that. What I do argue with is the claim that "git is distributed" cancels out this particular tradeoff, which is the statement made by the person I am responding to. An actual solution used by many other services is "let me use my own hostnames with this service", which GitHub does not support for your repository; while their fundamental value comes from the social features and nifty git UI, they seem to be mentally stuck in a "we are the git hosting company" mindset.


I would venture to say it does.

You can't really fault Github for individual teams not opting to host their code in more than one spot online, even if Github doesn't offer the capability for users to use their own domain name for seamless switching of git hosts.

Does Github encourage keeping everything centered at Github? Perhaps implicitly. But they certainly don't lock anyone's data in, so blaming them for their customers opting to NOT put their code anywhere besides Github seems unfair.


You are conflating "hosting in multiple locations" with "claiming to be a canonical URL". If I choose to host my repository at git.saurik.com, but want to be able to use GitHub's repository browsing features, social timelines, etc., I may choose to /also/ host a copy at GitHub.

However, people are now going to copy/paste the GitHub repository URL and use that to clone my repository, and that URL is going to end up as a large number of peoples' origins. Even worse, that URL may end up in third party projects as a submodule (which is much more difficult to retroactively change).

Again: the problem here is not that GitHub is somehow encouraging people to keep things at GitHub "centrally": it isn't, and the goal is not to have your data in multiple places.

In fact, that's what you need to /avoid/: there should be a single URL for "this is the git repository that we consider to be the official, canonical source for our (distributed) contributions to this project".

That URL should be one that you feel comfortable you can maintain for a long time, as that URL can end up baked into a lot of things. Some of them are theoretically easy to change (the million users who are pulling from that URL, assuming they know how to do that without just re-cloning), and some of them aren't (usages of your project as submodules in other peoples' projects).

To quickly put this in another, maybe simpler, manner: the problem isn't that people aren't choosing to /also/ put their code in places other than GitHub, it is that putting your code /also/ in GitHub undermines your git repository URL.


But git-over-http is just a normal http client, and http supports redirects. So while nobody does this, it's possible to load-balance http clients to "valid" servers just like you would with any other http-based app.

ssh:// and git:// are more difficult, but project contributors with commit access can just ping you on IRC to see what's up with the repo and where to push to today.


That's great, but still requires doing your own setup on your own hostname: conceptually that is a single repository with one URL.


"GitHub is just another place to have a hosted version of your repo, with a nice UI and social features."

The main point of Github is the social aspect; git just makes that easier.

There were other public git hosts before, but they didn't get the traction of GitHub because they didn't offer the same magic as The Place to share code.

It's git hub for a reason.


I'm a student, and I make a new git repo for each assignment or class. Github would become extremely expensive if I was to do this.

And my model for working on git projects is not distributed. We use a centralized model (which is totally viable and one of the many uses) instead of pull requests and branches. On small projects, I find this faster and easier for people to use.

Yes, I have backups, but when my centralized repo goes down, it's annoying to tell people to start pushing and pulling to a USB drive. Workflow is the issue, not losing work.


You mentioned Github would be prohibitively expensive if you were to use it for all of your school assignments?

Github actually has a program where, if you tell them you are a student/professor, you can get free private repos.


I didn't know that, thanks! Free is always nice :)


I have two repos living in "Repository temporarily unavailable" status for several days now. I understand that shit happens, but what's really frustrating is inquiring about what can or will be done about it and getting no response.

Do I wait? Do I just move stuff to my own server? Go back to Gitorious?

It's not that I can't keep working, but the main reason I use Github is to make my OSS projects available. For other things I run my own server and my own git host. Having a project unavailable to people for days, when I'm trying to get bug fixes and such out, is not so good.


Two questions: 1) did you file a support ticket? that seems like the sort of thing they'd be serious about fixing, and 2) do you have a paid account? In some sense, it doesn't matter, OSS developers are important to their mission too, but... well, I'm curious because I have a paid account and if you do too then that makes me more afraid for myself.


Yes and yes ($7/mo). I just now got some mail from github, and need to take a look at some stuff to sort things out. So it's being looked into.


Responding is key.

I believe the easiest way to build a successful business (software and/or service business anyway) is to respond promptly. Even if you don't have an answer yet. If you do, your customers will stick with you through almost anything.


I haven't had any trouble, but I have seen more interruptions happening recently. I relate it to growth, because as you grow, these things sometimes happen more often; "growing pains" if you will. Overall I am happy with Github, and their downtime has not cost me any time or money.


It's just you.

My team uses dozens of private repos hundreds of times a day and everything's been smooth for the past few months, aside from the disruptions noted on their status blog.


It's not _just_ him. Right now, it's me too.


I'm still on the fence about that whole cloud thing, personally.


Well, here's the beauty of GitHub: if they lose all my repos, I just don't care. Since I've got a copy of my repos locally, there's absolutely nothing they can lose... Worst case, if they go completely down for a couple of days, I can still collaborate using ssh or "git send-email".

This is the only reason I decided to use Github after avoiding Sourceforge and their ilk for the past decade and a half.


"This is the only reason I decided to use Github after avoiding Sourceforge and their ilk for the past decade and a half."

But why github? Why not gitorious.org?

If github goes away, I still have my code, but I lose any issues people have filed on the site, and I lose being able to easily check on forks of my projects where people might be doing interesting stuff.

I had been using gitorious.org, but moved to github for all the things other than git that make one public git host different from another.


There is the Github API, so you could backup the non-repository data as well by writing a couple of scripts, if you were so inclined.


Right, and I should probably set up something to auto-snag that stuff where it counts, but having to do that is the sort of thing you have to do for any non-git-based site.

It'd be nice if all those related items were also in a repo, easy to pull.


That's the beauty of a DVCS (Git in this case) _not_ Github.


You mean this cloud? http://cloud.github.com/


Not sure how this is relevant. Github isn't hosted on any cloud.

http://www.anchor.com.au/blog/2009/09/github-designing-succe...


Well, every resource is locally on some physical machine somewhere. I think his point is that Github is often treated by its users as a "local" resource, in that it takes the place of what would otherwise be a local server running a Git front end or repository.

Now, Github probably wouldn't encourage total reliance on their service. Git is fundamentally distributed, so this doesn't have to be a problem. Still, the Github service is an integral part of many developers' workflow, and I do agree with the above commenter that I'm not sold on the idea of trusting remote servers with integral steps in my workflow.


This echoes my thoughts all along.

For open projects, it's a FANTASTIC platform... no doubt about it.

For a corporate closed project, I honestly would not use GitHub - not because of any feature or lack thereof, but because it's a hosted service. I want control over where my code sits, period - if only because, as with most hosted services, a subpoena or similar court order can be served against the provider with no notification to the owner of the data. This is the #1 reason to be careful about basing one's business on hosted services.


is reliability the price of convenience, or the other way around?

edit: that makes no sense


Well, luckily you have the entire repository wherever it exists on your dev machines, so I don't expect there would be any data loss. In addition, all the commits, tags, trees, and files are content-hashed; if there were a problem, you'd know about it.
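That content-hashing property is easy to see with plain git: every object's name is the SHA-1 of its content, so a flipped bit changes the name, and `git fsck` re-hashes the whole object database to detect exactly that.

```shell
# Sketch: git's content addressing and integrity checking.
set -e
repo=$(mktemp -d)
git init -q "$repo"

# The id of a blob is derived purely from its bytes:
printf 'test\n' | git -C "$repo" hash-object --stdin
# → 9daeafb9864cf43055ae93beb0afd6c7d144bfa4

# fsck re-hashes every object and reports any mismatch:
git -C "$repo" -c user.email=a@b -c user.name=a commit -q --allow-empty -m "x"
git -C "$repo" fsck --strict
```

A clean fsck (no output, exit 0) means every object in the database still hashes to its own name.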


It isn't just private repos. I'm seeing it on various public repos on GitHub.


Someone had shared a public repo with me on Monday. It started 404ing the next day and I had assumed he took it down. I realize now that it must have been this issue. I just checked for it and it is back.


For future reference, they keep a very up-to-date status page at

http://status.github.com/


There is nothing useful on that page. The error is in full effect right now.


Oops, my bad. I mistakenly thought the 'db error' notice was for today.

I retract the 'very' in my original parent comment, replacing it with 'usually'.


I hope this has nothing to do with the people who hacked SourceForge...


Git has full cryptographic history (and data) authentication, so any changes to the history would be easily detectable. And you can sign known-good commits, so that even if you've never pulled before, you can still verify the part of the history that's been signed.

SVN and CVS are missing this key feature, which is why the sf.net hacking is scary.


Git's hashing detects data corruption for an existing user, but for someone who hasn't downloaded the repository yet and is getting it for the first time, you really have to remember that SHA-1 is currently considered "broken". It is increasingly feasible for someone to generate a collision with an existing file, allowing them to forge a commit. This has been discussed on the git mailing list, for the record, and the response is generally "don't care, we don't claim this to be a security mechanism". This is even a problem if you use signed tags, as you are only signing the result of a broken hash function.


I'm a Git n00b, so thanks for pointing that out! No wonder git has killed off SVN


It hasn't yet, but it is making great headway, and I for one can't wait!


Not good.


Why can't we have a web interface for GitHub?

I should be able to push a new release right from my browser.



