"Too often when coming to a new language, it's hard to know what is old and busted and what the future is."
This is too true. As a Python/Django newcomer, I've found exactly that difficult.
Any perspectives on buildout and where it fits into the toolbox of the future?
I started using it after watching Jacob Kaplan-Moss' Django Deployment Workshop ([1] -- highly recommended) and I like it, but I read about virtualenv and pip much more often.
One thing I don't really get about Python deployment tools in general is the idea of building eggs for everything, especially the parts of my own applications that I will never redistribute. This seems to be emphasized by all the tools.
I have never personally used buildout, so I can't speak about it pro or con. It always seemed a bit heavy-handed to me, but that's an uninformed opinion.
I also think the idea of packaging everything into egg files is somewhat unneeded. When I use pip, I use the cheeseshop for the more established frameworks I depend on (Django, South, etc.), but I also use pip to pull directly from various git repos.
When I package my own projects for use, I just drop a setup.py script in the root and then install from GitHub using pip. Looking at my requirements.txt for my main project, I have about 10 dependencies from the cheeseshop and about 10 pulling from various git or bitbucket repos.
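For anyone who hasn't seen it, a requirements.txt mixing the two looks something like this (the project names and repo URLs below are made up, just to show the shape; `-e` gives you an editable checkout from version control):

```
Django==1.3
South==0.7.3
# pulled straight from version control instead of the cheeseshop:
-e git+git://github.com/example/some-app.git#egg=some-app
-e hg+https://bitbucket.org/example/other-app#egg=other-app
```

Then `pip install -r requirements.txt` handles both kinds in one pass.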
This is an interesting take on using virtualenv: "In the last couple months I have adopted the mindset that I will be mostly avoiding installing python packages into the global 'site-packages', as I've found that it pays to manage dependencies on more of a per-project basis, rather than globally installing, and hoping all goes well."
This appeals to me, but are there good reasons for avoiding installing packages to site-packages?
> This appeals to me, but are there good reasons for avoiding installing packages to site-packages?
There's a (pardon the language, but I don't know a better term) shitfight between OS packages and Python packages for control of the global site packages. Each will override the other.
My current setup, which works very well: use pip and virtualenv for everything, leave the global site packages alone.
On Ubuntu, everything you need to work this way amounts to installing pip and virtualenv themselves; everything else stays out of the global site-packages.
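In practice the bootstrap is just two system packages and then per-project environments (package names here assume Ubuntu/Debian; check your release):

```shell
# the only global installs, done via the OS package manager:
sudo apt-get install python-pip python-virtualenv

# everything else goes per-project:
virtualenv env
. env/bin/activate
pip install -r requirements.txt
```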
I find the main benefit of project-specific site-package directories is simply that it's easy to see what my project depends on and what it doesn't. If everything's mixed together in the global dir then it's not always obvious. This becomes important when you want to deploy your project somewhere other than your personal PC.
It's also very useful for managing different versions of packages - e.g. I can upgrade to the latest Django for one project without risking breaking all of the others.
Having said that, I install the stuff that I consider to be "core" globally - for me that's primarily numpy, but also a handful of others.
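To make the "see what my project depends on" point concrete: with a project's virtualenv activated, `pip freeze` lists only what that environment has installed, not whatever happens to be in the global site-packages (redirecting it into requirements.txt is just the common convention):

```shell
# inside an activated virtualenv, this lists only that
# environment's installed packages, pinned to exact versions:
pip freeze > requirements.txt
cat requirements.txt
```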
I don't get why I'd want to use virtualenv if my production servers allow global package installation. Ideally my Python deployment should be the only thing running on the server, right? In that case a virtualenv seems like overkill. And for local development, it's a hassle to have to activate the virtualenv every time I want to run the code, when I could just have the needed packages installed globally.
http://guide.python-distribute.org/
Specifically, this chart, which shows the packaging roadmap:
http://guide.python-distribute.org/introduction.html#current...
Too often when coming to a new language, it's hard to know what is old and busted and what the future is.