
Using PyPI from Blender
Confirmed, Normal · Public · DESIGN



Improve the workflow for installing Python modules from the Python Package Index (PyPI).

This is a design task to evaluate options.



The goal is for Python developers to be able to install 3rd party modules into Blender. While this is currently possible, it's quite involved.

This task is to evaluate possible solutions:

  • Use pip with Blender's built-in Python.
  • Only support pip for system-wide Python installations.

Package Compatibility

  • Windows Python MSVC compatibility. (EDIT: this is no longer an issue).

    In the past, Blender's Python was built with a different MSVC version, making binary modules incompatible.
  • File System Permissions

Blender's Python directory may not be writable.

    This could be solved using the --user argument.

e.g. on Linux, pip install bintrees --user installs to ~/.local/lib/python3.8/site-packages/bintrees

    Possible Solution: Set PYTHONUSERBASE to a directory under Blender's own configuration.

    This has the advantage that it will work with both a system wide & a bundled Python installation.
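A minimal sketch of this PYTHONUSERBASE approach (the configuration path and the helper name `user_pip_command` are illustrative assumptions, not Blender's actual layout or API):

```python
import os
import sys

def user_pip_command(package, user_base):
    """Build the environment and argv for installing `package` into a
    custom user base, instead of the (possibly read-only) bundled
    site-packages. Nothing is executed here; the caller runs it with
    subprocess.run(argv, env=env, check=True).
    """
    env = os.environ.copy()
    # pip's --user install location follows PYTHONUSERBASE.
    env["PYTHONUSERBASE"] = user_base
    argv = [sys.executable, "-m", "pip", "install", "--user", package]
    return env, argv

# Hypothetical directory under Blender's own configuration:
env, argv = user_pip_command("bintrees", os.path.expanduser("~/.config/blender/python"))
```

This works the same whether Blender uses a bundled or a system-wide Python, since only the environment variable changes.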


Event Timeline

I support bundling pip with Blender. It would have helped me in quite a few cases. I have not had any binary compatibility issues after installing it myself so far, though that doesn't mean there are none, of course.

For my vscode extension for Blender development, I had to use pip to install some packages in Blender. It works in most cases, but I remember that it did not work in every case. I would have to dig deeper to figure out the failure cases more precisely.

The code responsible for installing the packages is here:

This solution does not require any user interaction. Blender only has to be started from within vscode and everything else should happen automatically.

I would also appreciate having pip in Blender. I'm struggling with using numpy or scipy inside of Blender. It would make it so much easier to use scientific data for visualization in Blender.

At Airbus Defence & Space, we are using ensurepip (and then pip install) to install 3rd party packages inside Blender, for addon and pipeline development, based on the built-in Python from Blender. We hope we can continue to do this even on PCs without a system install of Python.

@mudu: numpy is already bundled with Blender (scipy is not, but it can be installed with pip as described in the stackexchange link).

It would be great if the improved workflow would make use of venv. This would allow a nice replacement for how modules were handled through the Scripts path in the preferences with the Blender specific directory structure.

If we consider improving the workflow of how Python modules are installed in Blender, I'd like to make the following proposal:

Instead of installing modules directly into Blender's bundled Python, let the user create, reference and use venvs in Blender's preferences. Blender would create the venv based on the bundled Python, so modules installed through pip in that venv would be compatible with Blender's bundled Python. The user could select one of the venvs to use; however, Blender doesn't actually load the venv, since it already has its own interpreter running. Instead it checks whether the referenced venv's interpreter matches the bundled Python version and, if so, loads the modules. There could also be a section in the preferences that allows installing modules through pip into the venv from within Blender.

The benefits of this approach:

  • The bundled Python remains in a clean state
  • If you want to install modules that have conflicting dependencies you can simply create separate venvs and switch between them
  • If the Python version of a new Blender version remains the same, you can re-use the venv. If it doesn't match, this can be detected and reported to the user, instead of modules silently failing to work properly.
  • Different venvs could be created for different needs
  • Unlike when using the Scripts path you don't have to manually copy modules from a manually created venv into a modules folder
  • Provides a single consistent workflow for working with Python modules

This could possibly replace the Scripts path in the preferences altogether, since custom startup files can already be created through application templates and add-ons would be installed in the add-ons section of the preferences. I know it's more work than simply adding pip, however it may be a cleaner solution in the long run.
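The version check in this proposal could be sketched by reading the venv's pyvenv.cfg, which records the Python version the venv was created with (the function name and arguments are hypothetical):

```python
import os

def venv_matches_blender(venv_dir, blender_py_version):
    """Return True if the venv at `venv_dir` was created with the same
    Python major.minor as Blender's bundled interpreter.

    `blender_py_version` is a (major, minor) tuple. The check reads the
    standard pyvenv.cfg file that `python -m venv` writes.
    """
    cfg_path = os.path.join(venv_dir, "pyvenv.cfg")
    if not os.path.isfile(cfg_path):
        return False
    with open(cfg_path) as f:
        for line in f:
            key, _, value = line.partition("=")
            if key.strip() == "version":
                major, minor = value.strip().split(".")[:2]
                return (int(major), int(minor)) == blender_py_version
    return False
```

If the versions match, Blender could then add the venv's site-packages directory to sys.path instead of activating the venv.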

BTW a better link for this discussion would be:

With ensurepip it's actually not really complicated.

Package Compatibility
Windows Python MSVC compatibility.
In the past, Blender's Python was built with a different MSVC version, making binary modules incompatible.
Is this still an issue?

Good news: No longer an issue, pip works and you can use it to install things !

Bad news: I think by default pip wants to install into the scripts folder, however if blender is installed into c:\program files\ writing there may be problematic and require administrator rights, which is guaranteed to be troublesome in most corporate and educational environments.

@Ray molenkamp (LazyDodo) You're right pip is already shipped with 2.81. So this task is basically solved?

Bad news: I think by default pip wants to install into the scripts folder, however if blender is installed into c:\program files\ writing there may be problematic and require administrator rights, which is guaranteed to be troublesome in most corporate and educational environments.

It looks like we can work around this problem using PYTHONUSERBASE and pip's --user argument to install packages in a path we define (noted in the task).

@Robert Guetzkow (rjg), I'm not sure if we need a venv if we can define the user-site directory.

While I haven't used venv much, and never dug into its internals, a user-site directory looks to have most of the advantages noted in your post about venvs.

This comment was removed by Robert Guetzkow (rjg).

@Campbell Barton (campbellbarton) If setting PYTHONUSERBASE allows switching between multiple directories in order to separate the modules per directory from each other and from the system-wide installed modules, then it would be a good solution. It would also have less overhead than a venv.

Imagine how useful it would be if pip could be accessed through the API.

Here is a project which attempts to do that:

The ability to use venvs would actually be really great.


There's not much blender can do to help. But a community tool can bring us there.


Setting PYTHONUSERBASE and using --user does not quite match:

  1. You would need to track which python/blender it has been created by (whereas this is included in venvs). This is important because the purpose of a venv is isolation for dependency resolution, and the python version is part of the dependencies. As would be the blender version in our case.
  2. pip show and pip list don't have a --user flag. Once installed, you can't inspect your packages with pip. This is just annoying, but it may have repercussions on dependency resolution (does pip use USER_BASE to resolve dependencies? what happens when you switch USER_BASE?)
  3. Installing in edit mode (install -e) kind of tries to work, but the result is a mess (no dependency installation, many errors if a venv is active, ...). We could consider this one a bug though and let python fix it.
  4. This would prevent using USER_BASE for its actual purpose: letting the user add stuff.

What works though:

  • The "console_scripts" entry points correctly end up inside the USER_BASE.
  • Entry points of packages from the USER_BASE are correctly seen by pkg_resources.iter_entry_points()

What we can do:

  1. Could be fixed by adding a .bat or .sh file pointing to the blender that created the USER_BASE.
  2. Feels like whining but it actually scares me. More tests should be done.
  3. After a couple hours I couldn't figure this one out :/
  4. There is no fix for that.

In summary: while similar in certain features, there are solid reasons for the rise of virtualenv, and then venv, in addition to the USER_BASE feature.
It wouldn't feel right to try to use USER_BASE instead of venvs.
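Point 1 above (tracking which python/blender created the USER_BASE) could be prototyped with a small marker file; the function names and the `created_by.json` filename below are hypothetical:

```python
import json
import os

def write_owner_marker(user_base, blender_version, python_version):
    """Record which Blender/Python created this USER_BASE, so a later
    run can detect a mismatch. Versions are tuples like (2, 83, 0)."""
    os.makedirs(user_base, exist_ok=True)
    marker = {"blender": list(blender_version), "python": list(python_version)}
    with open(os.path.join(user_base, "created_by.json"), "w") as f:
        json.dump(marker, f)

def check_owner_marker(user_base, blender_version, python_version):
    """Return True only if the marker exists and matches both versions."""
    path = os.path.join(user_base, "created_by.json")
    if not os.path.isfile(path):
        return False
    with open(path) as f:
        marker = json.load(f)
    return (marker.get("blender") == list(blender_version)
            and marker.get("python") == list(python_version))
```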


If we could create a venv with the blender executable (not the bundled python), the venv would contain blender itself, which fixes 1). We could also use all the PyPA tools, which fixes 2) and 3), and 4) wouldn't be a thing anymore.

Let's see what prevents us from doing so:

  • venv is not a PyPI package, it's a standard-library package. You can't install it even with pip :/

    => This is easily fixed: venv should be distributed with Blender's internal python (like pip is now) (and like ensurepip should still be, please ! ^^)
  • While creating a venv, the current executable is assumed to accept python's command-line arguments: it tries to run blender -Im ensurepip --upgrade --default-pip and fails because blender's command line does not behave like this.

    => This one is a hard one. We don't want to support python's command line arguments in blender's command line !

    I did manage to work around it, but there's nothing solid there :/ (If you create the venv with the flag "--without-pip" and finish it by recreating it with blender's bundled python, the resulting venv works like a charm.)

    But then, running blender from the venv fails because it won't find its stuff. I tried the command line flags (--env-system-datafiles, etc...) with no luck. But using the env-vars (BLENDER_SYSTEM_DATAFILES, ...) did work -___-". So I added a 'this_blender' script in venv/Scripts that sets those env-vars before calling ./blender

    And now I have a blender inside a venv \o/ ( sys.executable == 'path/to/ENV_NAME.venv/Scripts/blender')

    Once activated, I can:
    • run blender by typing "this_blender".
    • run pip install -U XXX, pip list and pip show at will.
    • run pip install -e XXX and it works.
    • Enjoy console_scripts and entry points working, even with packages installed in edit-mode.
    • Addons installed using the gui end up in the same place as with no venv (I think, tbc...)
    • deactivate and activate another venv for another project without having to know which blender version I should use, nor deal with package version clashes \o/
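The workaround described above could be scripted roughly as follows. This is a POSIX-only sketch with hypothetical helper names and placeholder paths; it glosses over the "--without-pip then recreate" dance, which was platform- and version-specific:

```python
import os
import subprocess

def create_blender_venv(bundled_python, venv_dir):
    """Create a venv from Blender's bundled Python binary, which (unlike
    the blender executable itself) accepts normal interpreter arguments."""
    subprocess.run([bundled_python, "-m", "venv", venv_dir], check=True)

def write_launcher(venv_dir, blender_exe, datafiles_dir):
    """Write a 'this_blender' script, as described above, that exports
    BLENDER_SYSTEM_DATAFILES so Blender finds its files when started
    from inside the venv. A .bat equivalent would be needed on Windows."""
    bin_dir = os.path.join(venv_dir, "bin")
    os.makedirs(bin_dir, exist_ok=True)
    launcher = os.path.join(bin_dir, "this_blender")
    with open(launcher, "w") as f:
        f.write("#!/bin/sh\n")
        f.write('export BLENDER_SYSTEM_DATAFILES="%s"\n' % datafiles_dir)
        f.write('exec "%s" "$@"\n' % blender_exe)
    os.chmod(launcher, 0o755)
    return launcher
```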

My Conclusion

Using blender inside a VENV is really appealing from a pipeline/workflow/dependency-management point of view.
On projects spanning 6 to 18 months, using venvs would help keep up to date with blender versions.

But there's not much blender can do besides providing the venv package.
It can totally be a community effort. Something like pipenv or tox but specialized in blender...

Out of Topic

That being said, if there was an official way to use blender in a venv (not just a community thing), we would be able to elaborate an official way to publish/install addons to/from PyPI, which would give us free addon dependencies \o/.

Full disclosure: this is something I'm working on in my (non-existent) spare time. You can expect to see some tools for this soon.


This is my first post here, and probably my last this year, so Yeah !
Happy New Year Everyone ! \o/

One issue with pip is updating add-ons and their dependencies when the add-on is already loaded. Especially if the add-on dependency is a CPython extension module: in my experience it is not possible to pip-update such a dependency while it is loaded.

If an update mechanism could be provided for this, that would be nice. One that would uncheck the add-on, record somewhere that the update needs to be made, then on the next Blender start perform the update before re-enabling the add-on.
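Such a deferred-update mechanism could be sketched like this (the file location, filename and function names are hypothetical assumptions):

```python
import json
import os

# Assumed location for the pending-update record, under the user's config.
PENDING_FILE = os.path.expanduser("~/.config/blender/pending_updates.json")

def schedule_update(addon_name, packages, pending_file=PENDING_FILE):
    """Record that `packages` must be pip-updated for `addon_name`
    before that add-on is re-enabled on the next start."""
    pending = {}
    if os.path.isfile(pending_file):
        with open(pending_file) as f:
            pending = json.load(f)
    pending[addon_name] = packages
    with open(pending_file, "w") as f:
        json.dump(pending, f)

def pop_pending(pending_file=PENDING_FILE):
    """At startup, before enabling add-ons: return the recorded updates
    and clear the record."""
    if not os.path.isfile(pending_file):
        return {}
    with open(pending_file) as f:
        pending = json.load(f)
    os.remove(pending_file)
    return pending
```

At startup, the entries returned by pop_pending() would be pip-installed while no extension module is loaded, and the affected add-ons re-enabled afterwards.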

I think it would be great if the addon system of Blender could handle pip dependencies of addons in a pythonic way, from a requirements.txt file or file (would be great to support both). When an addon is loaded by the system, Blender could detect if one of these files exists and:

  • Check if dependencies are already installed
  • If not, notify the user that some dependencies need to be installed, tell them where they will be installed, and ask for confirmation
  • Call pip to install the dependencies, and report errors with advice if any occur

From what I saw when digging into the source code, most of the addon system is implemented in python so I think such a system could be prototyped quickly.
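The detection step could be prototyped along these lines. A sketch only: `missing_requirements` and the pip-name to import-name mapping are assumptions, and version specifiers in requirement lines are ignored here:

```python
import importlib.util
import os

def missing_requirements(addon_dir, import_names):
    """If the add-on ships a requirements.txt, report which of its
    packages are not importable yet.

    `import_names` maps pip names to import names, since the two can
    differ (e.g. Pillow is imported as PIL).
    """
    req_path = os.path.join(addon_dir, "requirements.txt")
    if not os.path.isfile(req_path):
        return []
    missing = []
    with open(req_path) as f:
        for line in f:
            pip_name = line.strip()
            if not pip_name or pip_name.startswith("#"):
                continue  # skip blanks and comments
            import_name = import_names.get(pip_name, pip_name)
            if importlib.util.find_spec(import_name) is None:
                missing.append(pip_name)
    return missing
```

The returned list would feed the confirmation dialog and, after the user agrees, the pip install call.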

The advantages of this system are:

  • Provide a pythonic, documented and supported way of having pip dependencies installed for an addon
  • Addon writers no longer have to call pip themselves with a subprocess
  • Easy to know the dependencies of an addon, they are centralised in one file
  • Blender users are now notified when an addon tries to install new things into their installation; they can refuse, and if the installation fails they know it, and why it failed
  • Doesn't need changes to Blender's bundled Python, only tooling
  • Compatible with a possible future integration of a virtual env workflow in Blender

One point that seems hard to solve: if an addon requires a specific version of a pip package but another version is already installed, what should happen? Report an error? Ask the user if they want to upgrade/downgrade?
import bpy
import sys

def ensure_site_packages(packages):
    """ `packages`: list of tuples (<import name>, <pip name>) """
    if not packages:
        return

    import site
    import importlib
    import importlib.util

    # Make sure the user site-packages (where --user installs land) is importable.
    user_site = site.getusersitepackages()
    if user_site not in sys.path:
        sys.path.append(user_site)

    modules_to_install = [module[1] for module in packages if not importlib.util.find_spec(module[0])]

    if modules_to_install:
        import subprocess

        # Before 2.91, sys.executable was blender itself; the bundled
        # Python binary was exposed as bpy.app.binary_path_python.
        if bpy.app.version < (2, 91, 0):
            python_binary = bpy.app.binary_path_python
        else:
            python_binary = sys.executable

        subprocess.run([python_binary, '-m', 'ensurepip'], check=True)
        subprocess.run([python_binary, '-m', 'pip', 'install', *modules_to_install, "--user"], check=True)

ensure_site_packages([
    ("PIL", "Pillow"),
])

Or maybe use the modules folder: *.py#path-layout