
What are the benefits of pip and virtualenv?

So everyone is telling me to use pip and virtualenv, but no one is able to explain to me how it is better than my current approach. The main reason for people to use pip and virtualenv seems to be that everyone else is using it...

I'm sure there are very good reasons to use pip and virtualenv, but I haven't been able to find them with Google. I'm hoping that someone from the Stack Overflow community will be able to explain them to me.

Here is how I currently organize my Django projects:

site/src/ : contains all python-only dependencies of my project

site/lib/ : contains symlinks to the python packages

site/[projectname]/ : contains all my project specific code

The entire site folder is checked into my repository (yes, including all python-only dependencies such as Django itself).

All non-python-only dependencies (PIL, psycopg2, ...) are documented in a README and installed at the system level (apt-get install ....)

For example, let's say I have a project named "projectfoo" that depends on django-1.2.3, pygeoip-0.1.3 and psycopg2; I will have:

/usr/lib/python2.5/site-packages/psycopg2

~/projects/foo/site : checkout of my repository
~/projects/foo/site/src/django-1.2.3
~/projects/foo/site/src/pygeoip-0.1.3
~/projects/foo/site/lib/django -> symlink to ../src/django-1.2.3/django
~/projects/foo/site/lib/pygeoip -> symlink to ../src/pygeoip-0.1.3/pygeoip
~/projects/foo/site/projectfoo/
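The layout above can be set up with a few shell commands; here is a minimal sketch using throwaway directories (unpacking the actual release tarballs into src/ is elided):

```shell
# Recreate the example layout in a scratch directory
# (in practice ~/projects/foo/site would be an svn checkout).
mkdir -p site/src/django-1.2.3/django \
         site/src/pygeoip-0.1.3/pygeoip \
         site/lib site/projectfoo

# lib/ holds version-free symlinks into the versioned src/ trees,
# so PYTHONPATH can point at lib/ and never mention version numbers.
ln -s ../src/django-1.2.3/django   site/lib/django
ln -s ../src/pygeoip-0.1.3/pygeoip site/lib/pygeoip

readlink site/lib/django
```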

Now how does this compare with PIP/virtualenv in practice?

Deploying the project with my current approach:

svn checkout https://myserver.com/svn/projectfoo/tags/1.0.0STABLE/site

Deploying with PIP/virtualenv:

wget https://myserver.com/svn/projectfoo/tags/1.0.0STABLE/projectfoo-requirements.txt
pip install -U -E projectfoo-venv -r projectfoo-requirements.txt
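For reference, the requirements file for this example would pin only the python-only dependencies (psycopg2 stays at the system level, per the setup above); the exact PyPI package names are assumed:

```
Django==1.2.3
pygeoip==0.1.3
```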

Working on a project with my current approach:

cd ~/projects/foo/site/projectfoo
export PYTHONPATH=.:..:../lib
./manage.py runserver 0:8000

Working on a project with PIP/virtualenv:

workon projectfoo
cd path/to/project
./manage.py runserver 0:8000

Dealing with non-python-only dependencies:

Non-python-only dependencies would be handled the same way: there is no way I'm going to use the --no-site-packages option of virtualenv and install a compiler and all the build dependencies on my servers. I don't think anyone is actually doing that anyway.

Upgrading a python-only dependency with my current approach:

I check out/unzip the new version in src, remove the old one from src, update the symlink in lib, and commit. Now everyone else working on the project will get the update at their next svn up or git pull. One thing that is not nice is that the old folder in src will remain if it contains some .pyc files; this can easily be avoided by removing all .pyc files before updating your local copy.
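The upgrade steps just described can be sketched in shell, using throwaway directories in place of a real checkout (version numbers are illustrative):

```shell
# Stand-in for the existing checkout, with a stale .pyc in the old tree.
mkdir -p site/lib site/src/django-1.2.3/django
touch site/src/django-1.2.3/django/stale.pyc
ln -s ../src/django-1.2.3/django site/lib/django

# 1. remove .pyc files first so the old folder can be deleted cleanly
find site/src -name '*.pyc' -delete
# 2. drop the old versioned tree
rm -r site/src/django-1.2.3
# 3. unpack the new release (a mkdir stands in for the tarball here)
mkdir -p site/src/django-1.2.4/django
# 4. repoint the symlink (-n replaces the link itself, not its target)
ln -sfn ../src/django-1.2.4/django site/lib/django

readlink site/lib/django
```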

Upgrading a python-only dependency with PIP/virtualenv:

You commit a new version of the requirements file; hopefully everyone working on the project notices the new version when they update their local copy, and they then run pip install -E projectfoo-venv -r requirements.txt.
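In other words, the upgrade is a one-line change to the pinned version followed by a reinstall; a sketch (version numbers hypothetical, with the install command kept as in the question):

```shell
# Stand-in for the checked-in requirements file.
echo 'pygeoip==0.1.3' > projectfoo-requirements.txt

# Bump the pin and commit; everyone else reinstalls after updating.
sed -i 's/pygeoip==0.1.3/pygeoip==0.1.4/' projectfoo-requirements.txt
cat projectfoo-requirements.txt

# then: pip install -E projectfoo-venv -r projectfoo-requirements.txt
```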

Edit: I removed the use of -U with pip; it is not needed with pip 0.8.2.

Edit: The only advantage of pip/virtualenv over my current approach seems to be when you work on different projects requiring different versions of Python. But in my experience, when you need different versions of Python you also need different versions of other system libraries (PIL, psycopg2, ...), and virtualenv doesn't help with that (except if you're crazy enough to use the --no-site-packages option, and even then it's incomplete). The only solution I can think of for that situation is using different virtual machines.

So what am I missing? Can someone point me to a use case where PIP or virtualenv would be better than what I'm doing?


virtualenv really shines when you have a number of projects and don't want them all to share the same Python installation. For example, you could have two projects with conflicting requirements.


"All non-python-only dependencies (PIL, psycopg2, ...) are documented in a README and installed at the system level (apt-get install ....)"

Then you can't have different dependencies for different projects, nor different versions of those dependencies for different projects.

One effect of that is that your local installs don't accurately reflect the production machines, so you may have trouble reproducing production bugs, for example.

And if you install system tools that need one version, you are forced to use that same version everywhere, which may break your projects.

Also, uninstalling modules otherwise needs to be done at the system Python level. Virtualenv means you can set up a Python install to test things without polluting the system install.

I'd definitely recommend having a separate python for your projects, and using something that isolates even that Python from the projects, like virtualenv or zc.buildout.

PIP is only an easier way to install modules, that also helps you uninstall them.

"there is no way I'm going to use the --no-site-packages option of virtualenv and install a compiler and all the build dependencies on my servers, I don't think anyone is actually doing it anyway."

No, I use zc.buildout, which amounts to much the same thing, but with less work. ;)


I follow the method you have suggested, without pip/virtualenv, for my usual projects. It allows me to put the packages in a specific directory:

+ext
  |
  |------python packages
+source
  |
  |------ project code and packages

And usually in the startup script I update the PYTHONPATH

export PYTHONPATH="";
export PYTHONPATH="${PYTHONPATH}:${workingdir}/path/to/ext";

This has the advantage of keeping the project and dependencies self-contained. I echo your thoughts here.

However, I find virtualenv useful when:

  1. I have to experiment with something new.
  2. Even better, when I want to use two different versions of a package and compare them.
  3. I want to keep related packages that can be used across projects.

Ex: Documentation: some key packages I have installed include sphinx, pygraphviz, networkX and some more visualization packages. I use them across projects and also keep them out of the system-level installation to keep it unpolluted.

I would also like you to check out Yolk. I find it nice in combination with pip/virtualenv.

You can list packages:

yolk -l
Jinja2          - 2.5.5        - active 
Markdown        - 2.0.3        - active 
Pycco           - 0.1.2        - active 
......

And check for PyPI updates:

yolk --show-updates
Pycco 0.1.2 (0.2.0)
Sphinx 1.0.4 (1.0.5)
pip 0.8.1 (0.8.2)


Deploying with PIP/virtualenv:

According to you:

wget https://myserver.com/svn/projectfoo/tags/1.0.0STABLE/projectfoo-requirements.txt
pip install -U -E projectfoo-venv -r projectfoo-requirements.txt

What I do: I also "freeze" packages, but I do it with pip and virtualenv, and I check in the entire project, including the Python packages. So my deployment is exactly like yours:

svn checkout https://myserver.com/svn/projectfoo/tags/1.0.0STABLE/site
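The freezing step itself is just the following sketch (the file name is assumed); the frozen file and the packages are then checked in alongside the project:

```shell
# Inside the activated virtualenv: record the exact installed versions.
pip freeze > requirements.txt
# The requirements file (and, in this setup, the packages themselves)
# then go into the repository, e.g. svn add requirements.txt && svn commit
```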

Working on a project with PIP/virtualenv:

According to you:

workon projectfoo
cd path/to/project
./manage.py runserver 0:8000

What I do: Add a postactivate hook like this:

$ cat bin/postactivate
cd $VIRTUAL_ENV
./manage.py runserver 0:8000
$

And now, to change to the project:

workon projectfoo
