-Keep your software in a DVCS of some sort. I find Git and Mercurial to be great.
-Use virtualenv to abstract away from your current environment's Python distribution. Start clean and download the packages you need.
-Create a requirements file listing the packages that your program needs. Put it in the format that the 'pip freeze' command outputs, so that setting up a new environment is quick and easy.
-Set up a local configuration file (not managed by version control) and a base configuration file with all the settings that are immutable. Import the local config file within settings.py so as to avoid local setting conflicts (see the sketch after this list).
-In production, set up an Nginx frontend to serve static files, and route all the Django URLs to an Apache backend (or Tornado, or whichever app server you want to use).
-I haven't tried anything else, but WSGI is super intuitive and easy to use.
-If you need it, try using Django-South for versioning database schemas. Do take into account that it has a bit of a learning curve.
-You don't need to put your python files in /var/www, any directory will do.
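Here's a minimal sketch of the base/local split from the config point above. The file name and the values are illustrative, not from any particular project:

```python
# settings.py -- the base configuration, kept under version control.
# The values are placeholders; the import pattern at the bottom is the point.

DEBUG = False
TIME_ZONE = 'UTC'
# ... all the immutable settings live here ...

# Pull in machine-specific overrides from a file that is NOT under
# version control (list it in .gitignore / .hgignore).
try:
    from settings_local import *
except ImportError:
    pass  # no local overrides on this machine; the base settings stand
```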
These are some excellent points; the only things we do differently are:
- We keep a local settings file for each environment, and those /are/ versioned. We have a /settings_local directory which contains each of the variants (localdev/dev/staging/df/live). The appropriate one is symlinked to /settings_local.py, which in turn is imported into settings.py (see the sketch after this list).
- We bypass Apache entirely and just plug Nginx FCGI into Django directly.
- We have a separate pip requirements file for each environment (also kept in source control).
- We use Puppet to configure our systems. Personally though, I have found Puppet to have an exceptionally steep learning curve, so you may want to shop around.
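For what it's worth, the symlink variant described above might look roughly like this (a layout sketch; the directory and file names are illustrative):

```python
# Project layout (sketch):
#
#   settings.py            # versioned; imports settings_local
#   settings_local.py      # symlink -> settings_local/staging.py (set per machine)
#   settings_local/        # versioned directory, one module per environment
#       localdev.py
#       dev.py
#       staging.py
#       live.py
#
# settings.py then picks up whichever variant the symlink points at:
from settings_local import *
```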
If you haven't used fabric, give it a try. It's an extremely simple API for performing remote management, deployment, etc.
I've tried numerous tools but nothing compares to fabric. It is just so easy and it always works. I can't imagine not seeing a fabfile.py in my project's deploy/ dir anymore, it just wouldn't be right.
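For anyone who hasn't seen one, a fabfile can be tiny. A hedged sketch against the Fabric 1.x API; the host, path, and commands are made-up placeholders, not a real deploy process:

```python
# fabfile.py -- sketch of a minimal deploy task (Fabric 1.x).
from fabric.api import cd, env, run

env.hosts = ['deploy@example.com']  # hypothetical target machine

def deploy():
    """Pull the latest code, update dependencies, and trigger a reload."""
    with cd('/srv/myproject'):  # hypothetical project path
        run('git pull')
        run('pip install -r requirements.txt')
        run('touch deploy/wsgi.py')  # e.g. make mod_wsgi reload the app
```

Running `fab deploy` then executes those commands over SSH on each host in env.hosts.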
> - We keep a local settings file for each environment, and those /are/ versioned. We have a /settings_local directory which contains each of the variants (localdev/dev/staging/df/live). The appropriate one is symlinked to /settings_local.py, which in turn is imported into settings.py.
I didn't read your comment thoroughly before posting mine above, so I was parroting your point about organising project settings; however, there are a few differences between our approaches, and I'd be interested to hear what you think about them. I don't work that actively with Django at the moment, and I never really had the opportunity to use my system above in a commercial project, so maybe (probably) there are some gotchas I haven't thought about!
> -Set up a local configuration file (not managed by version control) and a base configuration file with all the settings that are immutable. Import the local config file within settings.py so as to avoid local setting conflicts.
Have you tried keeping your settings in a package rather than in a module? Most introductions to Django use settings.py to keep things simple (it works out of the box with manage.py), but there is another (better) way!
You can store your settings in a package instead: you have a settings directory, and inside it you have modules, each representing a configuration. So you could have settings/development.py rather than settings.py, and you then change your DJANGO_SETTINGS_MODULE environment variable to point to the correct configuration on each machine.
There are a few perks to managing settings this way. First, you can extend existing settings: for example, if you wanted some default settings that all configurations use, you could have a settings/shared.py and then do a `from .shared import *` in each of your configurations. This means you have access to all the existing settings, so you don't have to repeat yourself if you, say, want to add some middleware or an application (think debug toolbar).
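Concretely, something like this (a sketch; the module and app names are illustrative, and you'd also add an empty settings/__init__.py to make it a package):

```python
# settings/shared.py -- defaults that every configuration uses
DEBUG = False
INSTALLED_APPS = [
    'django.contrib.auth',
    'django.contrib.contenttypes',
]

# settings/development.py -- one configuration, extending the shared defaults
from .shared import *

DEBUG = True
INSTALLED_APPS = INSTALLED_APPS + ['debug_toolbar']  # per-environment extras
```

You'd then point Django at it with DJANGO_SETTINGS_MODULE=myproject.settings.development (where `myproject` is whatever your package path happens to be).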
Another benefit is that you can manage your settings through your version control system, which I understand is not always ideal; however, my guess is that for most private projects this will be the best way of doing things! It also just seems like a much more Pythonic way of organising your settings. (You can also do this for urls, though there's less benefit there, given that urls probably won't change from machine to machine very often.)