I kind of like this... but then again I'm kind of wary.
Isn't the standard library the place where packages go to die?
Isn't the reason pip is actually useful that it has a nice, healthy release cycle (http://www.pip-installer.org/en/latest/news.html) and isn't frozen into the standard-library ice age of the past?
Won't this make it even harder to build a compliant version of python that runs on mobile devices where you can't install packages (>_> iOS)?
I get that it's convenient; I'm not 100% convinced it's a good idea.
Edit: Reading the PEP in detail, it's now clear that this is not bundling pip with Python (see 'Including pip directly in the standard library'). This is bundling a pip installer as part of the standard library. Much better.
It looks like they're taking a new strategy with this move. Pip isn't actually going to be moved to the standard library to live; rather, a new "ensurepip" command that installs "pip" will be a part of Python proper.
However, a distributable version of pip will be included with each release so that ensurepip does not have to contact the PyPI servers during an install. And with each maintenance release of Python, this pre-included version will be updated to match the then-current pip release.
But, the bottom line is that pip still lives outside of the standard library. Python 3.4+ is just guaranteeing that you have a version installed.
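For the curious, the PEP also specifies a programmatic hook. A minimal sketch of what the bootstrap looks like, assuming the module and option names from the PEP survive unchanged into the 3.4 release:

    # Minimal sketch based on PEP 453; names come from the PEP and could still
    # change before 3.4 ships. CLI form: python -m ensurepip --upgrade
    import ensurepip
    ensurepip.bootstrap(upgrade=True)  # unpacks the bundled pip wheel, no network access needed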
Is it the "norm" that, when accepted into the standard library, packages "die"? Or is it that, because this has happened to some packages, the popular perception is that the standard library is where packages go to die?
I'm just not understanding how it's OK that acceptance of a Python package into the standard library really means it should go to the graveyard.
The standard library only gets new features when a minor or major Python release happens. That doesn't happen very often, which slows down development of anything in the stdlib significantly, to the point that packages in the stdlib effectively "die".
To some extent this is intentional, because anything that requires more frequent changes is probably not stable enough for inclusion in the stdlib. In practice, though, it does cause some unintended issues, for example the rather lacking timezone support.
Vaguely related, this wonderful article by Paul Tagliamonte on how pip coexists with distro package managers (in this case APT): http://notes.pault.ag/debian-python/
Nice reminder that pip is a dev tool and should be used as such. It makes sense for it to be included in Python.
Glorious day! As a person who went to PyCon and experienced first-hand the state of Python packaging, this is excellent news. Good luck to the Python devs in the days to come.
This isn't very big news, in that virtualenv already provides pip in each new env, and you should not be using pip outside virtualenv -- unless it's to install virtualenv!
We're not all building web apps in python - virtualenv is not universally useful.
(Edit: Not that I don't like virtualenv when it's appropriate, but it really rubs me the wrong way when people just throw out generalizations like that.)
I'm not a web developer, but I do data analysis with Python, and I use virtualenv heavily. For one thing, it increases reproducibility to have a record (pip freeze) of which versions of each package I used.
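Concretely, the snapshot step is trivial to script. A minimal sketch, assuming the environment's pip is on the PATH:

    import subprocess

    # Equivalent to `pip freeze > requirements.txt`.
    frozen = subprocess.check_output(["pip", "freeze"]).decode("utf-8")
    with open("requirements.txt", "w") as f:
        f.write(frozen)
    # Later, `pip install -r requirements.txt` recreates the same package versions.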
Having tried various ways of distributing desktop apps / non-web daemons, I've found virtualenvs useful there too, with the one fairly large caveat that installing wxWidgets in a virtualenv is a world of pain :( But in those cases I use the OS-provided packages with the OS package manager, so I still never use virtualenv and pip independently.
Exactly! I am currently working on a machine learning task. Python is great for this, but most of the scripts that I am writing are run-once scripts. Virtualenv would be overkill for this use case.
I build web apps, but almost anything else I do, including a full Python based backend for $WORK, is all done in a virtualenv. Virtualenvs are not just for web apps.
If you're doing data analysis, where most libraries are serious about backwards compatibility and you don't necessarily care whether your code still runs two years from now... virtualenvs are sometimes just not worth the bother. (Though I'd still recommend them.)
With libraries that are very build-finicky (like various image processing libs), I've had bad luck with virtualenv or the env's pip. I usually end up building by hand.
fully agreed. the numpy, scipy, matplotlib, PyQt4 stack in particular is very indifferent to virtualenvs: you're going to need to install a whole ton of dependencies system-wide regardless.
i recommend it to everyone i work with, but it is significantly more useful when all your dependencies are pure python.
I agree with paulgb and others -- most of my past work with python could be called scientific programming in one way or another and pip has been enormously useful.
Didn't come with it, or didn't install it by default?
AFAIK the default virtualenv behaviour has been "create an environment and then automatically and immediately install pip from the internet", so for the last several years they've been practically tied together, even if they were distributed individually.
I was talking about virtualenv, not pyvenv - grandparent saying "virtualenv was released with python 3.3" made it sound like virtualenv was released with python 3.3...
I'm not really a pythonist so I'm not 100% aware of the consequences of this, but as someone who deploys Python-based software every now and then, this just seems to make sense to me.
As indicated in the Tagliamonte piece linked above, be careful deploying with pip. OS or distro tools such as apt-get are usually more appropriate for deployment. Distro maintainers have made commitments that PyPI uploaders have not made. Of course sometimes users need things that are not available from a distro repo, but in that case they're not so much "users" as "testers".
I really, really despise what distro maintainers do with packages. Pinning some stuff way in the past, especially deps used by lots of projects, forces consumers of those deps to patch things just to get their code to work, causing a schism between the public version and the distro version of the library. A huge, pointless, wasted mess.
I almost always use pip and my own installed code (kinda homebrew like) to put dependencies onto a box. It takes more work but I can skip the bullshit.
Yeah I get that. If it's a frequently updated package that you use frequently and care about, or on an instance you're both developing and maintaining, it's totally worth it to do the system integration for yourself. I got the impression that GGP was deploying for other people. In that situation, not having a distro package maintainer for a package might mean the sysadmin becomes a de facto package maintainer, which sucks for the sysadmin and the users both.
For my own education: Python with a good standard packaging system and solid, standard async capabilities would be going squarely after the same areas Node.js has done so well in? If not, why not?
I doubt it. All Node.js libraries are built around the non-blocking model which is why it works so well. Python libraries will have to decide if they're a normal blocking library or an async one. Possibly maintain both versions of the API?
Personally I'm not convinced that the Node.js callback hell is a good way to get high IO performance. The Golang way of doing it with normal, blocking libraries and goroutines seems ideal, as it doesn't burden the programmer as much.
Callback hell isn't, but promises do help (to a point)
In 0.12, node will ship with a version of v8 that has generators (hidden behind the --harmony flag); these are shallow coroutines, which allow for the same code style as goroutines.
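Python is heading the same way: the incoming asyncio module (PEP 3156, slated for 3.4) uses generator-based coroutines, so the code reads like the blocking version. A rough sketch, assuming the API doesn't shift before release:

    import asyncio

    @asyncio.coroutine
    def compute(delay):
        yield from asyncio.sleep(delay)  # stands in for a non-blocking IO call
        return delay * 2

    loop = asyncio.get_event_loop()
    print(loop.run_until_complete(compute(1)))  # prints 2 after ~1s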
Good to hear. The situation with python packaging has seemed kind of chaotic for awhile. The setuptools/distribute merger will hopefully standardize things from here on out.
I like pip, but it's too bad that it can only install from source. It's quite a hassle sometimes to round up dependencies and build them all on windows (not to mention not everyone has a compiler installed on windows).
easy_install can install from binary installers or eggs. I'd like to see that added to pip.
i am using wheel (with pip) as a replacement for .exe installers to install some packages (numpy, scipy, py2exe, etc) on windows. i find it useful because you can automate the installation of a wheel archive using pip (with the typical .exe windows installers you need to click next several times...). the wheel command line utility also knows how to convert existing `.exe` installers to wheel archives.
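for example, something along these lines (the .exe filename and package name are placeholders; this assumes the wheel package is installed and pip is new enough for --use-wheel, i.e. 1.4+):

    import subprocess

    # convert a bdist_wininst .exe installer into a wheel archive
    subprocess.check_call(["wheel", "convert", "SomePackage-1.0.win32-py2.7.exe"])
    # install the resulting .whl from the current directory, no clicking required
    subprocess.check_call(["pip", "install", "--use-wheel", "--no-index",
                           "--find-links=.", "SomePackage"])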
That feeling of having to keep GBs of Visual Studio 2008 installed merely to use the most recommended Python XML library (lxml). Alternatively you can just add a 2MB egg to the repo.
What is involved in getting a wheel file to pypi so that `pip install lxml` happens out of the box in windows?
Not being able to install from binary installers is so annoying; it's a real shame pip doesn't support it. It's the only reason I keep easy_install around.
As someone maintaining packages on pypi, I can tell you that demand for Python 3 support is growing and people are beginning to port packages for their own needs.
I expect we'll see python 3 overtake python 2 in new projects within 3 years. I realize that's still pretty far off, but these things take time. You have to give the PSF credit for great support of the 2.x series.
As a recent convert to Python (but only as an enthusiast... still doing C++ in my dayjob), I chose to go all in and start out with Python 3. I'm glad I did, but I do need to keep version 2 around.
Nice... although I don't get why pip lags so far behind RubyGems and NPM in terms of package management. pip should merge with virtualenv and virtualenvwrapper as well.
Bundling the subpar pip actually solidifies the status quo and is in no way improving the situation. I think what made both Ruby and Node.js great is the flexible package management.
Hopefully it will also bring about some improvements to pip. It's a pretty great tool, but with a couple of major caveats.

The first is that although it supports many forms of package specification, including VCS repositories, it does not report the package spec the way it was installed, but rather by the package name and version from its setup.py. Say you install a package from a git commit that fixed a bug in the PyPI 1.0.0 package, whose version is still reported as 1.0.0 in setup.py at that commit. Then you freeze the environment to requirements.txt to distribute. It's still reported as package==1.0.0 instead of the git spec, so the next person to install will pull down the broken version from PyPI instead.

The other headache is that installing from a requirements file just installs dependencies in the order they're listed, so oftentimes you need to re-organize the output of `pip freeze` to make sure dependencies are installed in the right order; otherwise you can encounter things like unexpected package versions due to other packages making ambiguous dependency specs for dependencies of your own app.
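To make the first caveat concrete, here's roughly what ends up in a requirements file (the package name and URL are made up for illustration):

    # what was actually installed:
    #   pip install git+https://github.com/example/somepkg.git@bugfix#egg=somepkg
    # what `pip freeze` writes to requirements.txt afterwards:
    somepkg==1.0.0
    # so a fresh `pip install -r requirements.txt` pulls the broken 1.0.0 from PyPI
    # unless someone hand-edits the line back to the VCS spec.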
I have to say, these are kind of obscure corner cases. Should be fixed, to be sure, but I don't think they are an accurate representation of pip to people who are not familiar.
What do you expect me to use as an alternative? easy_install?
You're right, maybe I've mis-categorized them as major caveats, but they're not so obscure: both I and others have had to wrestle with these properties in production. I'm not saying use easy_install by any stretch (pip is great); these issues already have open tickets, and I hope to see them fixed before pip is rolled into Python proper so they don't become defining problems within pip.
"If you are writing a package manager for a new non-js language, do whatever it takes to get that new language to load modules the way node does. The right way."
Python has had a good installer forever. The only difference is that it wasn't centrally mandated. If you need something to be centrally mandated in order to use it, I feel for you.