devops sysadmin here. I started with gitlab about a year ago and can honestly say I wish I had taken gogs/gitea instead. The main problem for me is gitlab's utter dearth of somewhat counterproductive features. Git LFS support is almost a cruel joke in gitlab as git operations under the hood now take inexorably more ram to complete. In turn I'm rewarded with more traditional RCS programmers asking why git has problems with their newly requested LFS and a 40gb video file they decided to store.
gitlab is also so large it requires its own chef deployment to competently install it from omnibus, and has no HA roadmap in sight unless you want to break apart its rube-goldberg structure and attempt to HA the individual components of it.
Other things like Wiki, local runner CI, pages and integration with a corporate Jira ticket system seem like the excessive demands of programmers without enough skill to understand that there is tangible merit in ablating these systems into separate applications entirely and growing them as needed. gogs and gitea allow you to put the brakes on developers that just want to get the code out the door, in order to study and implement competent things like competent CI, blue/green deploys, immutable infrastructure and reproducible builds that target scalable platforms like Kubernetes instead of one-offs that just reward coders with a merciful standup and the chance to say "it's done."
if you use gitea for no other reason, it is also massively less resource hungry than a gitlab installation. I'm currently throwing 4 cores and 8 gigs of ram at my gitlab install with another 2.3tb of disk and frankly no end in sight. The gitlab rake task for backups has sent the OOM killer on a rampage 3 times so far and takes forever. At large installs of 1000 users or more you'll begin to see why gitlab might not be the best choice. with programmers, the thirst is real...nobody cares about features if the repo won't clone.
I concur with you on the resource usage of Gitlab.
It's a complex system, and takes up a lot of resources.
I haven't been able to get it to run in a stable manner with less than 8GB of RAM on a very moderately used setup.
BUT I very much disagree with your general rejection of its worth.
Yes, the feature set is ballooning, and some of them are very half baked.
But I find tremendous value in an integrated solution that offers code hosting, wikis, issue tracking and a tightly integrated CI without having to jump between 4 different systems.
There are a lot of use cases where Gitlab is overkill, but for others it is a really great solution that even comes for free.
Also, as long as the system has enough resources, I've found it to run very reliably and upgrades are painless.
(I also agree though that the backup task is a PITA. Really slow and resource hungry).
Well gitea/gogs also have code hosting, wikis, issue tracking, and while CI is not tightly integrated it is fairly easy to integrate. So the value has to come from something else :) GitLab has integrated chat and kanban boards so there is other cool stuff in Gitlab but it's not really any of your listed reasons.
Edit:
I would have to say if I were to choose GitLab the biggest reason would be the UI. The front-end just looks a lot better. If Gogs/gitea could get a sponsor to help clean it up it would be a lot more popular.
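To expand a bit on "fairly easy to integrate": a push webhook plus a few lines of glue is usually all it takes to kick off builds. A rough, untested Python sketch (the payload field names and the run-build.sh script are assumptions based on the Gogs-style webhook format, not something out of the gitea docs):

    # Minimal push-webhook receiver for triggering CI from gitea/gogs.
    # Sketch only: "ref" and "repository.clone_url" are assumed payload fields,
    # and ./run-build.sh is a placeholder for whatever actually runs the build.
    import json
    import subprocess
    from http.server import BaseHTTPRequestHandler, HTTPServer

    class WebhookHandler(BaseHTTPRequestHandler):
        def do_POST(self):
            length = int(self.headers.get("Content-Length", 0))
            payload = json.loads(self.rfile.read(length) or b"{}")
            ref = payload.get("ref", "")
            clone_url = payload.get("repository", {}).get("clone_url", "")
            if ref == "refs/heads/master" and clone_url:
                # Hand off to whatever actually runs the build.
                subprocess.Popen(["./run-build.sh", clone_url, ref])
            self.send_response(200)
            self.end_headers()

    if __name__ == "__main__":
        HTTPServer(("0.0.0.0", 8000), WebhookHandler).serve_forever()

Point a repository webhook at it and every push starts a build; anything fancier (status reporting back to the PR, per-branch pipelines) is where a real CI like drone or Jenkins earns its keep.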
What would you recommend as CI solution for gitea?
I'm looking for something that's self hosted, foss and does not require an admin to set it up for every project. GitLab offers this, but I haven't found anything that competes really.
We use Gitea for source control and Jenkins for CI. There's an existing jenkins-git plugin that makes connecting them very easy.
We switched from GitHub to Gitea last year and love it. We looked at Gitlab but it reminded me too much of phabricator - trying to be all things to all people, bloating all over. I prefer specialized tools that focus on a smaller core of functionality and interoperate easily with the other pieces.
I’ve been using drone (http://drone.io). I self host it alongside gitea. The documentation is quite good, loads of plugins are available and it works flawlessly with gitea.
I was confused by that, too. I have a problem with the word ‘dearth’ in that I can often not remember whether it means an abundance or scarcity. So I looked it up, and found that it means scarcity. Perhaps GP makes the same mistake I do and misused dearth to mean abundance?
As another systems engineer, I'd much rather have clearly delineated features and how to interconnect them, rather than a cornucopia of disparate and badly documented "features".
I'm thinking of the class of "Does: X, Y, Z" where Z requires an unstated dependency on A and C, and C requires a specific database organization type that was chosen at the initialization of the system. And there would be no hint that you had to choose the DB type, and how that would have ramifications 10 steps down the line.
Going back, what I'd rather have is more in line with the Unix philosophy. Do something and do it well. Let me connect somethings to other somethings. But Gitlab is a cathedral, and disorganized at that.
I think you've misunderstood who is responding to who. A "dearth of somewhat counterproductive features" is being described as a bad thing in the OP, using an incorrect/nonexistent definition of "dearth" as "there are too many somewhat counterproductive features".
> gitlab is also so large it requires its own chef deployment to competently install it from omnibus, and has no HA roadmap in sight unless you want to break apart its rube-goldberg structure and attempt to HA the individual components of it.
Use the gitlab-ce docker container and save yourself the hassle. Literally takes less than 10min to set up and a version upgrade is nothing more than docker stop, docker rm, docker run <new version> away.
Only thing I haven't solved with this is HA but you can use a central NFS mountpoint with an HA'd setup there, and simply start a new container on another machine with the NFS mount... will require manual intervention, yes, but it's more efficient than trying to break up the dozens of moving parts.
Thanks for posting. HA is a paid feature of GitLab and will likely stay a paid feature.
We're working hard on a cloud native alternative to the docker container that works well on Kubernetes. There is more information on https://gitlab.com/charts/gitlab/ and you can see the activity on https://gitlab.com/charts/gitlab/commits/master. Since it is cloud native I assume it will allow autoscaling, and it neatly orders every component in its own container.
Because the Rails application server is currently single threaded it forks multiple processes that each take up about half a gigabyte of RAM. Running fewer processes would cause increased latency for users.
In a multi-threaded program, the threads share memory. Currently the application server is running in multiple processes, each of which contains a full copy of the core application.
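A toy illustration in Python (nothing to do with our actual Ruby code, just the memory model being described): each forked worker ends up with its own copy of the application's state, while threads would all share one copy.

    # Toy sketch of process workers vs. thread workers and memory.
    # APP_STATE stands in for a loaded application (~80 MB per copy).
    import os
    import threading
    from multiprocessing import Process

    APP_STATE = bytearray(80 * 1024 * 1024)

    def handle_requests(worker_id):
        # Each worker "serves" using the application state.
        print("worker %d (pid %d) sees %d bytes" % (worker_id, os.getpid(), len(APP_STATE)))

    if __name__ == "__main__":
        # Process model: every child gets its own copy of APP_STATE
        # (copy-on-write at first, but the copies drift apart as the app runs),
        # so memory grows roughly linearly with the number of workers.
        workers = [Process(target=handle_requests, args=(i,)) for i in range(4)]
        for w in workers:
            w.start()
        for w in workers:
            w.join()

        # Thread model: all workers share the one APP_STATE in a single process,
        # so adding workers barely changes the memory footprint.
        workers = [threading.Thread(target=handle_requests, args=(i,)) for i in range(4)]
        for w in workers:
            w.start()
        for w in workers:
            w.join()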
We're working towards Gitaly 1.0 that would allow you to run GitLab without NFS. This isn't HA but it would prevent one NFS server from taking down the whole cluster.
> gitlab [...] also [...] has no HA roadmap in sight unless you want to break apart its rube-goldberg structure and attempt to HA the individual components of it.
We run a very large in-house GitLab installation and the HA is easy because:
1) GitLab stores data on the FS, and you can outsource this problem to e.g. a NetApp filer.
2) The data it doesn't store on NFS it stores in psql or MySQL, which have established HA solutions.
Once you have that set up you can load-balance the GitLab nodes and provision / destroy them at will.
That just splits up the task of application server & git executor into two; it doesn't get rid of the dependency on a monolithic storage backend.
I understand that this is something gitlab.com is working towards and excited about, presumably because it'll allow for provisioning web workers unrelated to the CPU you're spending on running git, but for us I don't see it being useful.
I hope GitLab eventually gets something like GitHub Spokes[1] which'll allow for provisioning the entire setup on a bunch of dumb boxes with RAID 0.
> [...] and implement competent things like competent CI [...]
I've got an OSX build box with a bunch of virtualbox VMs so I can build for Gnu/Linux (i386, x86_64, and arm), OSX, and Windows. Single slow box, so one runner at a time. It has to be running OSX because I haven't found any way to run an OSX VM on any other platform.
Take a look at Veertu; it uses macOS’s built-in hypervisor (like Docker for Mac now does) and integrates the VMs into a pretty decent Jenkins plugin. I’ve been using it since early alphas and it’s one of the best I’ve used in a self hosted solution.
> It has to be running OSX because I haven't found any way to run an OSX VM on any other platform.
It is fairly easy to get decent OS X virtualization using Debian as base OS and QEMU-KVM for virtualization. You won't have niceties such as GPU acceleration but it will work.
Search for "OSX KVM" on Google, it will take you where you want - unfortunately a direct link would expose me to legal liabilities. Also, be warned that using this method may violate Apple's EULA.
Generally Hackintoshes are fine for CI machines, since it doesn’t really matter if Wi-Fi or the camera or iMessage works correctly, as it might matter if you were using the machine for personal use. Updates are also generally fine if you wait a bit for other people to test before installing.
As far as I know, there isn’t anything that goes against using OSX in a VM in the EULA — so long as your host runs on Mac hardware. I don’t think you even need to use OSX as the primary OS in the machine.
That’s an admittedly big caveat, but also one of the reasons why virtualizing OSX works in the first place.
Have you figured out a way to bootstrap it without a Mac? I successfully got an image running under Vbox on Fedora, but I had to build the image first on a Mac and then export it as an appliance.
Also, performance was atrocious. It was so bad as to basically make it unusable. 5 second delay before processing mouse and key events.
It's possible to create an OSX VM under KVM. It's a little clunky, you get very bad graphics performance, and you can run into really weird issues (for example, the NIC not coming up properly), but it works, and if you are using it as a CI platform and/or only sshing into it, it's more than usable.
It's really handy when you are testing/adding support for projects (like C/C++ with cmake) on OSX.
I had a class last semester that submitted everything through Gitlab and when we tried to run unit tests to test for the correct code, it would hang for 7-8 minutes. Very frustrating. Happened on all the assignments so was not just a one time thing.
> [...] programmers asking why git has problems with their newly requested LFS and a 40gb video file they decided to store.
The thing is, git, as our industry uses it, is not really a good tool for keeping artifacts, and "video" sounds like an artifact, not a source file. I'd like to see more artifact repositories, especially ones that you can talk to from the command line or from a Python/Ruby/whatever script. I haven't found any such thing, so I wrote my own.
As severine found out, it's called GrailBag (https://github.com/dozzie/grailbag). Docs and README are lacking, to put it mildly, so I don't deem it ready for being published, but I don't exactly hide it either.
From the features, GrailBag stores files along with key:value pairs as metadata (not surprising), has a command line client to list/modify/upload/download/delete the artifacts, and has a Python module for doing the same from a more sophisticated script.
Additionally there is an interpreter of a simple language that describes the directory tree where artifacts will be deployed (which artifacts to download, how to name the files, what directories and symlinks to create). This way you can write a cron script that deploys whatever you have in the repository, e.g. in a cold-standby server scenario (two servers having the same configuration, but one shut down and only powered on when the first one fails; on the first run of the cron task, the spare server downloads all the missing files, including whatever got stored while the server was shut down).
Yeah I didn’t have much luck with LFS but we didn’t really need it so it wasn’t a problem to just disable it.
I love gitlab. I have it running for a small team on a Digital Ocean Droplet; it was trivial to get running. If we had gone for self-hosting we’d have gone the docker route.
Having CI built in was a massive boon when we switched over.
My only complaint really is the resource usage but it isn’t causing problems (8GB). Aside from that I love it.
Exactly my feeling, and I just tried it once on an AWS instance or similar. It was the only serious program running on the machine and it dragged down all the resources, worse than Jenkins ever could. All that doesn't say gitea is much better, though.
Activity timeline
SSH and HTTP/HTTPS protocols
SMTP/LDAP/Reverse proxy authentication
Reverse proxy with sub-path
Account/Organization/Repository management
Add/Remove repository collaborators
Repository/Organization webhooks (including Slack and Discord)
Repository Git hooks/deploy keys
Repository issues, pull requests, wiki and protected branches
Migrate and mirror repository and its wiki
Web editor for repository files and wiki
Jupyter Notebook
Two-factor authentication
Gravatar and Federated avatar with custom source
Mail service
Administration panel
Supports MySQL, PostgreSQL, SQLite3, MSSQL and TiDB (via MySQL protocol)
Multi-language support (29 languages)
If anything, Gogs seems like a more polished product just by going through the README.
Gitea with all its 2+ maintainers can't take care of that?
It's hard to do a comparison without that list. Are users expected to install and find out?
Gitea is a fork of Gogs, so it has everything on that list and more. Sadly Gitea suffers from the usual problem that open source projects have: poor documentation. If you really want to compare feature lists, here's the one listed on their website (https://docs.gitea.io/en-us/).
Why compare two products by their README and not their website? Heck, why not compare them by using their test instances that are there for you to try.
Gitea has every feature that Gogs had at the point when they forked it, plus a few that they've ported over.
But there are still features that have been in Gogs for over a year now that Gitea does not have.
One example that burned me earlier this year: Gitea's backup/restore feature is still very underdeveloped. Gogs' backup/restore feature has been capable of backing up and switching databases for over a year, while Gitea can still only back up and restore to a database of the same type (e.g. Postgres to Postgres).
Doubtful. IIRC it was somewhat of a hostile & opportunistic fork. The Gogs maintainer went on vacation or something and didn't reply to issues for a couple weeks so someone forked to Gitea and declared themselves the new mainline only for the Gogs maintainer to return and not appreciate the attempted usurpation.
I think it was more than a couple of weeks. Also before the fork the maintainer was asked multiple times to allow other contributors to become co-maintainer to reduce the bus factor.
As far as I understand, the reason for the fork was that Gogs has a single maintainer who wants to keep control over the project (he created it, so he has that right).
Gitea on the other hand was to be more of a community project with multiple maintainers and ways for active contributors to become one.
Though Gogs has more stars, and currently, I think, more related projects/integrations, looking at GH stats Gitea has more activity, PRs, commits, etc.
That made me choose Gitea over Gogs. Really content with it.. was planning on using Gitlab at first, but it would require AWS instances with double the oomph.
The documentation could be better, agreed, but a search in the issues or filing a "help wanted" issue goes a long way.
I'd say if you don't have time to compare them, don't offer a comparison. It's not doing anyone any favors. In fact that only serves to waste more of your precious time.
When I was looking into switching from Gogs to Gitea, that is exactly what they expected you to do. I recall there being an issue saying that they won't compare the two because it's hard to keep up-to-date.
Which is a stupid excuse, considering it's supposed to be a community-based fork instead of run by one person.
I'm wondering if Gitea is a little better than Gogs in term of security.
I remember that Debian considered Gogs to replace Alioth, but rejected it because of security concerns like injection (SQL, XSS, etc).
I've also seen Gogs REST API behave badly when we tried to migrate/create a few hundred repositories using said API. Some of the repositories had special characters in their names, and the REST API accepted them without warnings. And after that the instance was pretty much unusable.
It was a while ago, but at the time, it seems that Gogs didn't do much input sanitization.
Setting aside the features list, how good are stars and issue counts really as indicators of anything? Gogs has been around for longer and has received much more publicity, so it's bound to have more stars. Similarly, Gitea has more contributors whereas Gogs is mostly a one man job, so Gitea users might feel more likely to receive an answer than from Gogs. It really feels more like you're comparing the community there than the product.
That said, I agree that Gitea should publish a features list and list the differences between the versions.
Outside of Jupyter Notebook, Gitea supports all of these, plus it is managed by a community team, not a single maintainer with commit access (last I checked).
A good product is not defined by having a feature list.
Well, let's call it not a "feature list" but a "specified vision statement", and then it should have that. It should be quite clear from the main README and the main website what distinguishes this project from others. Otherwise it's quite likely that such a specification doesn't exist and goals are unclear, which usually means a bloated blob of 10 million unfinished "features".
There is a milestone system set up on github which tracks when features will be introduced into master and such, so I wouldn't call it a lack of a vision statement.
I don't think a vision statement is necessary either since I'm a fan of developing software primarily for dogfooding; fixing problems you have is probably a good way to improve the software for others.
It isn't though. People will make stupid demands, not because they are stupid, but because they don't incorporate previous decisions taken and they don't incorporate how their desired solution affects other users of the whole system. Last but not least, having a 1000-feature blob doesn't help anybody.
Gogs author didn't want to incorporate proposed changes into Gogs, so a bunch of other developers forked Gogs and went their own way, promoting Gitea as true open source in spirit and blaming the Gogs author for his inflexibility. I am using Gogs, because it has all I need (Gitea is too little added value for me)...
> Gogs author didn't want to incorporate proposed changes into Gogs, so a bunch of other developers forked Gogs
This sounds like a reasonable way of forking though. I mean, what is open source for if you can't go and implement your own features if you so desire. Sounds to me like both sides are at fault.
I don't know enough of the details to really tell you, but it was more hostile then just that. I heard the maintainer went on hiatus and that's when they forked it - like there was some drama going on, it wasn't just a "I'd like to incorporate my changes here" and more of a "HEY LOOK THE DEV LEFT, WE'RE THE NEW GOGS" kinda thing. Check out some of the other comments in this thread, they explain it better.
The problem was/is that on Gogs there is only one person with write access. When Unknwon goes AWOL or on vacation, it means nobody can merge urgent security fixes into master. In that case a community fork would be necessary every time the main dev is not available but PRs require merging.
I would gladly switch back once Gogs is no longer vulnerable to the bus-factor problem.
That's just not true (or rather not the only reason). The Gogs maintainer went away for months with no sign of life. He also was the only one with write access to the gogs repository on GitHub which often resulted in no progress for months because of his disappearances.
The project had already forked in the past, but it eventually was deleted and was merged into the upstream because of one very simple reason: Unknwon, the creator of Gogs, came back. The fact that he left again is the main reason why the project forked again.
It's not, as other people in the thread are suggesting, that "some contributions would not get added" - but rather the fact that he often has really long periods of absence: just take a look at the contributions on his profile https://github.com/Unknwon
And of course, I'm not putting the blame on him - all of us need breaks from time to time - but during these periods where he can't work on the project, the project is essentially brought to a halt, seeing as there is no one else in the community of contributors who is able to merge pull requests - even if they are critical.
Gitea is however the more active (and quite active) side of the fork, so I'd say it was quite successful. You can easily check this in the contribution statistics on Github for both projects.
I evaluated them both again a month ago and found that Gogs is as active or more active than Gitea by actual features developed, they just have less frequent releases.
Frankly, I'd have expected better from the creator of sway. As someone who is not involved in either project, I see two active projects, which offers more choice to users, makes it less likely for both projects to go away, offer competition and all around be in the spirit of open-source.
Often I hear, "if you don't like it, fork it" and then when it gets forked, you get called hostile? Doesn't make sense to me from a rational perspective - nobody forces you to adopt Gitea or Gogs if you don't want to.
I've been using Gogs for a team of 40 people for over two years. This is the second time I've heard about Gitea, and I know it is a popular fork of Gogs. Can anyone sell me on why it's better than Gogs and why I should switch?
The biggest difference is the community around them. Gogs is more or less the brainchild of Unknwon, and he used to have a tendency to just drop off the planet, so Gogs would sit still for weeks (or even months) at a time without anyone being able to do anything about it. Gitea was a community response to this; nearly all the major contributors switched over and I think it shows in terms of features and fixes. Gitea now has well over a thousand more commits than Gogs and a lot more manpower behind it. You can see this by looking at the contributor graph for both projects: Gitea has had a steady supply of contributors and commits in the last release while Gogs has a few spikes of activity.
Also, if you look at the release notes for both of them I think it's fairly easy to see that Gitea is developing at a much faster rate and has more features than Gogs already with a lot of nice features coming in 1.5.
However, I'm fairly biased, I used Gogs for a long time but switched as soon as Gitea popped up for the reasons above. Since then I've contributed a few fixes and features to Gitea as well.
Long story short: Gogs was slow in committing pull requests. The maintainer says it's because he has specific standards, the community sees it as just slowness. Anyway Gogs was forked into a new version that is supposed to be more active and well maintained.
Well the forking lit a fire under the ass of the Gogs maintainer and he has since become much more active than he was. For a month there there was reason to switch to Gitea as it was adding features that Gogs did not have at a rapid pace, but Gogs is now mostly caught up and there is reason to prefer it. I personally would go with Gogs. 1 passionate person > design by committee of people with shared low responsibility.
+1. About two years ago I switched to gitea but then switched back. gogs has (had?) slightly fewer features but seemed more polished. No issues since then; I'm glad to be running the original again.
A quick question, how simple is it to switch from one to the other? Do you need to ditch all of the metadata each time? (pull requests, issues, users, etc).
(I can only speak to my past situation, I don't know how it is in general and today)
From gogs to gitea all metadata was carried over (the db was identical then). -- For gitea to gogs, I read that the two projects have diverged. As I could afford to lose the metadata of my not too many projects, I went back to gogs, ditching and recreating everything (wrong switching decisions should be punished, shouldn't they ;).
I would prefer the original here too. This is a simple enough project that a single maintainer can easily manage the workload of accepting PRs and setting the project's direction. The community forked because they argued that the community can't make changes, however this seems to have changed. So problem solved.
Personally I've found Gitea to have a bit of a quicker pace in adopting new features, though lately there has been mostly a bugfix spree before the release.
If you're comfortable with Gogs you can stay, though I do recommend trying out Gitea. IIRC from the last time I compared, other than a few minor features it is mostly on par with Gogs.
"Never change a running system", right? Stay with Gogs, it's not worth the hassle. Here are some differences Gitea users colleced https://github.com/go-gitea/website/issues/40 but many of them were implemented in Gogs as well since then. We're using Gogs at work and didn't see a need to change.
I've not used either, but reading their forking announcement (https://blog.gitea.io/2016/12/welcome-to-gitea/), it seems the main difference is that Gogs is maintained by a single person and Gitea by a team of people.
The comments on that post suggest that Gitea is more actively developed, at least shortly after the fork, but I haven't checked to see if that's still the case.
That was a painless installation. Seriously, this is something to imitate if you mean for your self-hosted application to be widely used.
If only other self-hosted/distributed projects were this easy to install. I skipped discourse, mastodon, gitlab, etc. because they were all a pain to set up on a base linux system.
Not sure if you missed the official installation doc for Discourse, but it recommends you install via docker, and with that it's 10 minutes of work. Not sure how much faster it can be.
Sure, though I was talking about simplicity of the installation of app itself within any Linux distribution, not an entire pre-packaged Linux distribution.
I'm not a fan of Go, but it does lead people into some right decisions that Ruby and Python lead them away from.
Anyway, the problems start when you want to be sure it's running all the time your server is on, and when integrating it with your other web applications. Still, it's strictly better than interpreted environments.
What’s the issue with making sure it’s on when your server is up? I assumed it’d just be a case of setting up a simple systemd service that restarted it in case it died
Something that hasn't been mentioned at all: Gitea still has support for OAuth2, which was simply dropped by Gogs because "nobody needs that".
Besides that, Gitea has more really useful features compared to Gogs. But Gitea will always focus on Git hosting and won't bundle CI, chat and all that stuff that Gitlab does. There are enough tools you can easily connect to get the features Gitlab has, if you need them.
Is there any fundamental reason why no tool offers a federated alternative to GitHub? I mean, the distributed nature of git is ripe for federation, but why aren't issues, releases and pull requests federated?
Maybe even some deeper integration with git itself, such as seeing new remotes on federated servers and fetching them.
This may sound redundant to GitHub, which currently centralizes everything, but different projects and companies have their reasons to not use a single external tool for this. As an example, CERN's Open Hardware Repository (https://www.ohwr.org/) would be a perfect fit for this, and other laboratories and companies could integrate with them.
> Is there any fundamental reason why no tool offers a federated alternative to GitHub? I mean, the distributed nature of git is ripe for federation, but why aren't issues, releases and pull requests federated?
Thanks for mentioning this here, I've been trying to raise this whenever I can, but not a lot of people seemed to care enough about this, hope this gets more visibility.
Could you please point me to them? Mostly for curiosity, and maybe add my two cents, as someone who constantly feels this need in a specific community.
This is the way to go; nobody should be relying on one apathetic corporation to house source code of all things. The number of people with powerful 64-bit machines and $5 VPSes is just too damn high for everyone to be relying on something that could kick you out, ban you or just straight up go bankrupt.
> Is there any fundamental reason why no tool offers a federated alternative to GitHub?
There are tools like Gerrit and Phabricator. Also, projects like git and the linux kernel use mailing lists to handle things like issues, releases and pull requests/patch sets.
I've been using Gitea, previously using Gogs, for a team of 45 and have had a very pleasant experience. Getting it up and running with docker* was smooth and it works easily with Certbot.
Definitely a great option for a GitHub alternative where you aren't looking for a feature-packed, i.e. large, deployment like Gitlab. It's simple and clearly delineated in its functionality.
However, it does suffer from a lack of good documentation - but I could just look at Gogs' documentation if needed since they're practically identical.
I've been running Gitea for a long while now, basically since the fork occurred. It's mostly me with some other small users for course work.
It's a very pleasant and lightweight alternative to Github that I can only seriously recommend to anyone considering hosting their own Git server. A recent update even brought mentions and reactions over from Github.
So I keep a few GB of source online via CVS http://unix.superglobalmegacorp.com/ , and I have been shying away from git anything as I did a gitlab install for someone and was amazed at how much hardware I had to throw at it, unlike CVS+CVSWeb.
Although just seeing how trivial it was to deploy now I'm thinking about maybe running this in parallel at least. I still like being able to use CVS from ancient machines... But this does look nice. I guess time to look at cvs2git.
If you just need a simple way to host git repos and don't need much UI, look into gitolite (http://gitolite.com/gitolite/index.html), I'm using it to manage my own repos and it's been really nice to use. I especially like the possibility to automatically create a new repo when pushing to one that does not exist yet.
Then to visualize the contents you can try cgit (https://git.zx2c4.com/cgit/about/), I don't use it but it seems popular and is what kernel.org uses.
You can just host the git repo itself though if you know very specifically what repo(s) you want. A clear advantage of distributed version control is that each person has their own repo. You can give access via ssh or http I think (only tried ssh tho).
Because I like hosting the stuff myself, and I don't need anyone's approval or their 'community' bullshit, I'm free to do as I wish.
Just as I like having CVS, so I can take a stock Darwin 0.3 instance and sync code without having to spend a week plus trying to build the infrastructure to just talk to some external company who may either ban me or fold at any moment. If there is any kind of change I want, I can make it myself. It's called FREEDOM, and I'm always amazed that in the era of virtualization and cheap hosting so many would rather surrender the freedom to do their own thing.
Because self-hosting is fun, a little bit challenging (and rewarding), you get to show off your ability to maintain a setup, it gives you independence and control. Also RMS would disapprove of using a service running on non-free code.
Indeed it is. Plus I don't have to fight with other people if they don't like something about me or what I'm doing, I'm on my own, in my own little world.
I originally setup the CVSweb thing to let google crawl the pages, so instead of building a search service they would do it for me, which has worked surprisingly well. Although with all the other search engines around it does get a fair amount of traffic.
It strikes me as incredibly strange that we now have Linux and *BSD, this entire built up idea of taking power away from a few hands like AT&T or the various VARs, moving away from Microsoft, and yet everyone is expected to throw everything they have into the hands of some other corporation to house their source code of all things... Having worked with http://www.oldlinux.org/ trying to hunt down 20 year old 'open source' projects, many of which have been lost because of the same mistake of trusting 3rd parties who don't care, the amount of stuff that is going to be lost in the next 'I can't believe we are going to have another .com crash' is going to be incredible. And of course it'll seem like a trivial amount of space: something like the full 4.2BSD source fits into 20MB, and yet it could easily have been lost if it weren't for the SIMH community's interest.
I'm more impressed with sourceforge, as it's been around for quite some time, offers far more services than github, and survived their prior owner's stupidity of trying to inject spyware into downloads.
Oh well, I guess that is my old man rant, don't rely on one thing, spread the software, don't let it sit on an apathetic 3rd party company who isn't even close to being able to operate without angel funding.
I might be the only one here, but I wish Gitlab had stayed focused on providing the best git / issues solution instead of bloating further and further into CI/CD and more. After all, Gitlab started as an open source alternative to github and that's what I really wish for, and while I understand how Gitlab needs to differentiate itself from Github, the direction is causing some of the issues. My two cents...
I would like to see a full-fledged docker installer, something to the effect of the discourse installer, that I can put on something like digital ocean or my own machine, handles https in an opinionated way (let's encrypt), and integrates with popular email services like sendgrid or mailgun (with generous free tiers).
Check out projects like Traefik (https://github.com/containous/traefik) as well. Traefik is a reverse proxy that can automatically configure routes based on labels on your docker containers, and automatically setup Let's Encrypt certs for your subdomains.
E.g. I add a label to my Gitea container to say "Serve this from git.mydomain.com", and Traefik will start redirecting requests and automatically configure a Let's Encrypt cert.
Basically you have all of your containers join the Docker network that your Traefik container is running on. Each container needs to expose a port (but not bind to the parent). The Traefik container will bind to port 80 on the parent. That guide should explain everything.
Let me know if you have any other questions or get stuck, I'd be happy to try and help!
I have made a tool with which I can easily deploy and backup/restore dockerized web apps. Configuring and deploying gitea (or any other service I integrated so far) only takes a couple of minutes. Services are run behind an nginx reverse proxy container and Let’s Encrypt certificates are automatically issued if you choose so during config. The only requirement is docker.
I have been using it for about a year and recently put it on github, because I found it useful enough to share.
Maybe not exactly what you are looking for, since I am not familiar with the discourse installer, but maybe interesting for someone? I’m happy about some feedback, since I never actively shared this until now.
I find the discourse install approach to be outdated, to say the least. It reminds me of myself tinkering with docker for the first time. Just provide me a docker image and deployment examples for compose and kubernetes.
Currently for private repos I just have a dedicated account on my vps that uses git-shell with a few scripts for creating and listing repos. I've scaled it to include a few friends using Linux groups for shared access.
Might look at switching to this though as it looks quite a bit more featureful but still simple to maintain
I played around with the test instance of this on their website. It is very impressive. It seems to have the 90% of features that most people use github for, and it can be hosted easily. Certainly worth a look for my next startup when considering whether we want to self host our own codebase.
Gitea has code and issue search; the code search just needs to be enabled in the config, because the indexing process takes up more resources - that's why it's disabled by default.
OSGeo dot org is using Gitea, with superpowers by Sandro Santelli (strk). It's great! Popular and lightweight, does much and asks little. The graphic design is familiar and so are the functions, since it obviously originated as a clone of another G-site you may know.
Running gitbucket is simply "java -jar gitbucket.jar"
JVM performance is pretty much world-class (and has been for many years). Golang and the JVM compete neck and neck at the top of the performance charts. Remember that startup performance is not what Java is optimised for - it's long running processes like any web server or api.
And I'm far more excited by Graal than any programming language breakthrough over the past few years ( https://youtu.be/_7yIUkP5LiQ)
Does it manage issues, pull requests, etc. modeled in a distributed way? I was always dumbfounded as to why you'd take a great distributed collaboration tool like Git and then centralize it around issues, pull requests, comments, etc.
To me all of those things should be Git objects as well, either tracked inside the same repo on another branch or something, or tracked in a parallel Git repo.
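As a rough sketch of the idea (not an existing tool - the refs/issues/* layout is made up purely for illustration): an issue could simply be a git object under a dedicated ref in the same repository, so it's fetched, pushed and mirrored exactly like the code.

    # Sketch: store an issue as a JSON blob under refs/issues/<id> in the
    # current repository, so issues travel with ordinary git fetch/push.
    # The layout is hypothetical, purely to illustrate "issues as git objects".
    import json
    import subprocess
    import time
    import uuid

    def git(*args, data=None):
        result = subprocess.run(["git"] + list(args), input=data,
                                check=True, capture_output=True)
        return result.stdout.decode().strip()

    def open_issue(title, body):
        issue_id = uuid.uuid4().hex[:8]
        payload = json.dumps({"title": title, "body": body,
                              "opened": time.time(), "state": "open"}).encode()
        blob = git("hash-object", "-w", "--stdin", data=payload)  # write the object
        git("update-ref", "refs/issues/" + issue_id, blob)        # name it with a ref
        return issue_id

    if __name__ == "__main__":
        print("created issue", open_issue("Example issue",
                                          "Stored as a git object, shared via refs."))
        # Collaborators could then sync issues with:
        #   git fetch origin 'refs/issues/*:refs/issues/*'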
I'm not a user of LFS, but I use phabricator and I find that it meets all my needs even on slower hardware, and it's pretty simple to debug. It has all your typical LAMP stack debugging techniques and when it breaks, it's pretty straightforward what needs to be fixed.
I wouldn't say that directly. Maybe it gains more trust because it's hosting its own code, but maybe this could also decrease contributions because people have to authenticate on yet another system to contribute something.
We will see how well this works out; most required features are already there, the remaining ones will follow soonish.
Yes, I recommended gitea on HN about a year ago: lightweight and great for self-hosting, good enough for probably 80% of the use cases for most people.
I put it on a $5 VPS to host all my personal projects, and it works great. At the moment I use ssh port forwarding to access its UI as I'm not sure how secure it is if exposed to the public directly.
The only feature I want now is something like https://pages.github.com/, i.e. a static html site behind the gitea login, so I can host all my project-related documents for easy access. I'm not really a wiki fan, as wikis are painful to navigate sometimes.
There are enough people and companies that don't trust these American companies, or they are forced to keep code within their own network... Not everybody wants to give his stuff into the hands of others ;)
I'm with you. I've ended up using GitHub for anything public and Gitlab for anything private recently and it works well for me. I've considered running Gitea/Gogs on a machine at home or a VPS but could never justify the additional effort for it.
I like open source self-hosted alternatives to everything, though some headhunters, for those looking for work, may either ask for your github link or figure it out themselves in order to further size you up - one possible utility that's specific to github.
Don't count on your Github account for recruitment. I've a decent number of personal projects on Github, some of them actually used by other people. But I can count on the fingers of one hand the number of times I was contacted by someone saying "hey, I've looked at your Github account, you seem like a decent developer, would you want to work with us?"
However I prefer to put stuff on Github because it's the go to place for OSS projects right now. If you are publicly releasing a project, it's with the hope it will be actually useful to other people, putting it on Github increases the visibility of said project and the likelihood of it being useful to other people.
> But I can count on the finger of one hand the number of times I was contacted by someone
Different strokes for different folks. I usually fill the fingers on one hand counting the folks reaching out each week. Rarely are they interesting (like a lot of my stuff on GitHub) positions, but still tons of unsolicited offers.
Open source software tends to be plagued by UX, css, design, spacing/padding/margin & typeface issues which make it somewhat grating to use, which I mention after browsing around the gitea test instance for a bit.
At least for Gogs and Gitea nobody gets paid at all, same for lots of other open source projects. So of course it's hard to pay some frontend engineer if none contributes to the project :)
If I'm not mistaken, gogs is managed by one person. Twice in the past few years he stopped being active for a few months. The first time, a bunch of contributors started a fork to continue working on it and be community driven rather than managed by one person. When the gogs author started being active again, the features from the fork ended up being implemented in gogs (the wiki and a few other things) so the fork was killed. The second time it happened, the fork (gitea) was created again with no intent for it to be temporary this time.
The owner of the Gogs repo doesn't accept pull requests that won't fit his roadmap or quality standards, he is also not the most responsive during times this is why some people created Gitea. I am still using Gogs personally.
> The owner of the Gogs repo doesn't accept pull requests that won't fit his roadmap or quality standards, he is also not the most responsive during times this is why some people created Gitea. I am still using Gogs personally.
Let's break it down:
> The owner of the Gogs repo doesn't accept pull requests that won't fit his roadmap
That generally sounds bad. Like, in an ideal world, you would be open to changing your roadmap. Specifically the timing, for sure. Also, you would be open to accommodating features you hadn't thought of, etc.
So, this sounds like a negative, basically.
> or quality standards,
Starting with a negative, continuing with just an "or" makes it sound like you're about to list another negative.
I'd expect a "but" or a "but thankfully," or something.
> he is also not the most responsive during times this
This is absolutely a negative.
He doesn't bad thing, or xxx, he is also bad thing.
Reading that sentence structure makes "xxx" sound like another bad thing.
I feel like the vendor lock-in will prevent most from ever leaving github, specifically things like the deployments api and other api interactions via webhooks (into PRs for example)...
Someone needs to make a _simple_ git interface, have a post commit hook generate a partial and let a client side interface simply pull relevant partials.
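Roughly what I have in mind, as a sketch (the partials/ layout and file names are made up): a post-commit hook that renders one small HTML fragment per commit, and the "interface" is just a static page that fetches those fragments.

    #!/usr/bin/env python3
    # Sketch of a .git/hooks/post-commit hook: write an HTML partial for the
    # latest commit into partials/ so a static front-end can fetch it directly.
    # Layout and file names are hypothetical.
    import html
    import subprocess
    from pathlib import Path

    def main():
        sha, author, subject = subprocess.run(
            ["git", "log", "-1", "--pretty=format:%H%x00%an%x00%s"],
            check=True, capture_output=True, text=True,
        ).stdout.split("\x00")

        out_dir = Path("partials")
        out_dir.mkdir(exist_ok=True)
        fragment = (
            "<article class='commit' id='%s'>"
            "<h2>%s</h2><p>%s - %s</p></article>\n"
            % (sha, html.escape(subject), html.escape(author), sha[:8])
        )
        (out_dir / (sha + ".html")).write_text(fragment)
        (out_dir / "latest.html").write_text(fragment)  # what the page polls

    if __name__ == "__main__":
        main()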
By looking at the README and the original Gogs project, it looks like these projects are gaining some attention in China. It is quite interesting to me since these are written in Go, which is a Google project blocked by the Chinese government.
- uses fewer resources; a 512MB VM is enough for a small team. Meanwhile Gitlab requires at least 4GB IIRC.
- is easier to set up, just a single binary file.
For everything else, it can't compete with Gitlab.
I'm using it as my personal Git server; works perfectly for 1 user.