Ask HN: Deploying my project on multiple servers?
12 points by askingaq on July 28, 2023 | 7 comments
Currently I have five Linux servers set up; my code is based on Python and Celery. Every time I make a revision, I have to manually upload the code to each one. What would be the best solution for deploying automatically? I was thinking of setting up an NFS share, but I'm curious if there's a better way?


If you don't want to go down the NFS share route, then Capistrano is a useful tool if you're willing to write a little bit of Ruby. It comes with some built-in goodies like rollbacks. It's an oldie (pre-dockerize-everything), but still useful.

https://github.com/capistrano/capistrano

You can start by deploying from your own machine, which will push to all your servers simultaneously; later I'd consider having a CI/CD pipeline take over and run Capistrano for you.


If you are using GitHub, you can use GitHub Actions or just a basic webhook. If you don't want any of that and want something Python-based, you can use Fabric [1] to run remote shell commands.

[1] https://www.fabfile.org/
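
For a concrete picture, here's a minimal fabfile sketch (assuming Fabric 2.x and SSH key auth; the host names, project path, and celery-worker service name are placeholders, not anything from your actual setup):

    # fabfile.py -- minimal sketch, Fabric 2.x assumed
    from fabric import ThreadingGroup, task

    HOSTS = ["app1.example.com", "app2.example.com"]  # list all five servers here

    @task
    def deploy(c):
        servers = ThreadingGroup(*HOSTS)
        # Update the code on every host in parallel (here via git pull),
        # then restart the Celery workers so they pick up the new revision.
        # The sudo call assumes passwordless sudo on the hosts.
        servers.run("cd /srv/myproject && git pull --ff-only")
        servers.run("sudo systemctl restart celery-worker")

Run `fab deploy` from your machine and it hits all the hosts in parallel.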


NFS or SSHFS, or maybe rsync, or a simple script that creates a gzipped tarball, runs scp to transfer it, and then runs ssh to execute a script on each server that unpacks it and restarts the program.
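
A rough Python version of that script, just to illustrate (host names, paths, and the restart command are made up; assumes SSH keys and passwordless sudo on the servers):

    import subprocess

    hosts = ["app1", "app2", "app3", "app4", "app5"]  # placeholder host names

    # Build a gzipped tarball of the project directory.
    subprocess.run(["tar", "czf", "/tmp/release.tar.gz", "-C", ".", "myproject"], check=True)

    for host in hosts:
        # Copy the tarball over, then unpack it and restart the workers remotely.
        subprocess.run(["scp", "/tmp/release.tar.gz", f"{host}:/tmp/"], check=True)
        subprocess.run(
            ["ssh", host,
             "tar xzf /tmp/release.tar.gz -C /srv && sudo systemctl restart celery-worker"],
            check=True,
        )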

Or you could have a simple agent on each machine that polls your server for updates and then downloads and deploys the new code.
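
A bare-bones polling agent might look something like this (the version/release URLs, paths, and restart command are invented placeholders; assumes the agent has permission to restart the service):

    import subprocess
    import time
    import urllib.request

    VERSION_URL = "https://deploy.example.com/version"         # placeholder
    RELEASE_URL = "https://deploy.example.com/release.tar.gz"  # placeholder

    current = None
    while True:
        latest = urllib.request.urlopen(VERSION_URL).read().decode().strip()
        if latest != current:
            # New release published: download, unpack, restart the workers.
            urllib.request.urlretrieve(RELEASE_URL, "/tmp/release.tar.gz")
            subprocess.run(["tar", "xzf", "/tmp/release.tar.gz", "-C", "/srv"], check=True)
            subprocess.run(["sudo", "systemctl", "restart", "celery-worker"], check=True)
            current = latest
        time.sleep(60)  # poll once a minute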


One popular (and complicated) way is Kubernetes.

I enjoy using k3s with Argo CD. The containers are built using GitHub Actions, but you could use many other tools.


What is your project about? I'd set up an automated pipeline that deploys your code on each git push. You can check out Buddy, for example, for that.


You can use Ansible to deploy to multiple servers with a single command.


Look up CI/CD tools



