Published: 13. 11. 2018   Category: Programming

Different solutions for Python virtual environments

Virtual environments for Python are very handy whenever you need to isolate a project's dependencies from the rest of the system or work with several Python versions side by side. Below are the different tools I use for that.

Different versions of Python: pyenv

The source code of pyenv is available here: https://github.com/pyenv/pyenv

pyenv is a little bit different: it lets you switch the Python interpreter itself. I found it handy on a few servers running Ubuntu 14.04, where I needed to upgrade salt-stack to a newer version that supports only Python 2.7.15, while the newest Python in the official Ubuntu 14.04 repositories is 2.7.6. The shell recipe in this section covers the installation of such an interpreter; with the new version available, you can create additional virtual environments with the tools (pipenv, venv, virtualenv) described in the next sections.

Pyenv is primarily for installing several Python versions side by side and switching between them.

Here is an example of installing a specific version (2.7.15):

git clone https://github.com/pyenv/pyenv.git ~/.pyenv
echo 'export PYENV_ROOT="$HOME/.pyenv"' >> ~/.bashrc
echo 'export PATH="$PYENV_ROOT/bin:$PATH"' >> ~/.bashrc
echo 'eval "$(pyenv init -)"' >> ~/.bashrc
source ~/.bashrc
apt-get install make # pyenv compiles Python from source, so make sure make (and a C compiler) is installed
pyenv install 2.7.15
pyenv global 2.7.15 # writes the selected version into ~/.pyenv/version
source ~/.bashrc
python --version # check that version is enabled
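
To tie this to the tools described below, you can point virtualenv (or any of the other tools) at the pyenv-provided interpreter. A minimal sketch, assuming virtualenv is already installed and the default pyenv layout under ~/.pyenv is used:

pyenv versions # list the interpreters managed by pyenv
pyenv which python # full path of the currently selected interpreter
virtualenv -p ~/.pyenv/versions/2.7.15/bin/python directory # virtual environment built on the pyenv Python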

Virtual environment for Python 3.x with venv

# venv is part of the Python 3 standard library (3.3+); on Debian/Ubuntu
# the module is packaged separately:
sudo apt-get install python3-venv

# Create a new virtual environment.
# This will copy the current Python interpreter on your system into that directory:

python3 -m venv directory

# Activate virtual environment:
source directory/bin/activate

# Your shell environment is now using that directory for all Python
# related operations. You may notice that your PS1 prompt now shows the
# name of that directory in parentheses.

# A few commands to check where you are:
pip --version # will show a new path for package installation
python3 -c 'import sys; print(sys.path)' # system paths are also updated

# To exit the virtual environment use this command, which is actually a shell
# function defined by the 'activate' script you sourced before:
deactivate

# These commands will show default operating system values after deactivation:
pip --version
python3 -c 'import sys; print(sys.path)'
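
The whole environment lives in that single directory, so getting rid of it is just a matter of deleting the directory; a short sketch:

rm -rf directory # removes the virtual environment completely
python3 -m venv directory # creates a fresh, empty one again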

Virtual environment for Python 2.x/3.x with virtualenv

This is very similar to the previous section:

sudo pip3 install virtualenv # install virtualenv system-wide

# Create a new virtual environment.
# This will copy the current Python interpreter on your system into that directory:

virtualenv directory # does the same as 'python3 -m virtualenv directory' shown below

# To select an explicit interpreter use -p or --python=

virtualenv -p python2.7 directory

python3 -m virtualenv directory # use python2 or python3 to select
                                # the desired version

# Activate virtual environment:
source directory/bin/activate

# Your shell environment is now using that directory for all Python
# related operations. You may notice that your PS1 prompt now shows the
# name of that directory.

# A few commands to check where you are:
pip --version # will show a new path for package installation
python -c 'import sys; print(sys.path)' # system paths are also updated

# To exit the virtual environment use:
deactivate

Best for packaging: pipenv

This is the solution officially recommended by python.org for installing Python dependencies. A pipenv session may look like this:

sudo pip3 install pipenv # install it system wide

mkdir project # create a directory for a new project
cd project
pipenv install

# Your project directory now contains the following files:
Pipfile Pipfile.lock

# Install packages (package1 and package2 stand for real package names):
pipenv install package1 package2

# Check the [packages] section in Pipfile to see what was installed:
less Pipfile
pipenv graph # will show a graph of dependencies

# You can execute commands in the virtual environment without activating:
pipenv run pip --version
pipenv run python3 -c 'import sys; print(sys.path)'
pipenv --venv # shows where the virtual environment of the project is located

# Activate virtual environment
pipenv shell

# To exit the virtual environment simply use
exit
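
pipenv also distinguishes between runtime and development dependencies. A short sketch, where pytest is only an example package name:

pipenv install --dev pytest # lands in the [dev-packages] section of Pipfile
pipenv lock # regenerates Pipfile.lock with pinned versions
pipenv install --dev # on another machine, installs the dev packages as well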

Generate requirements.txt

To be able to reproduce the virtual environment of your project anywhere, you need to save the state of the installed modules. The command pip freeze does exactly that. The conventional file for the list is requirements.txt, and it also records the exact versions of all pip packages.

# Save current installed modules
pip freeze > requirements.txt

# Install packages from requirements.txt
pip install -r requirements.txt

# When pipenv is used
pipenv install -r requirements.txt
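
The opposite direction, exporting the state of a pipenv environment into requirements.txt, works too; one way that does not depend on a particular pipenv version is to run pip freeze inside the environment:

pipenv run pip freeze > requirements.txt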

Keeping requirements up to date with pip-tools

Handy utilities for dealing with packages are part of the pip-tools package. You need to create an input file, usually requirements.in, and manually specify which packages are necessary. The example below shows different ways to constrain their versions:

dnspython                          # no version constraint (the latest will be picked)
elasticsearch>=5.0.0,<6.0.0        # any version within this range
errbot==5.1.2                      # install exactly this version
salt-pepper>=0.5.0,!=0.5.3,!=0.5.4 # 0.5.0 or newer, but skip 0.5.3 and 0.5.4

Once you have requirements.in present in your project directory, use:

pip install pip-tools       # install pip-tools

# Generate requirements.txt with all dependencies
pip-compile requirements.in

# Install exactly the packages specified in requirements.txt
# (pip-sync also removes anything that is not listed there)
pip-sync
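
When new releases of your dependencies appear, the pinned requirements.txt can be refreshed with pip-compile as well; errbot below is just the package from the example above:

pip-compile --upgrade requirements.in # re-resolve everything to the newest allowed versions
pip-compile --upgrade-package errbot requirements.in # bump a single package only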

How to build a Docker image?

The last option, which I am currently using, is running Python in a Docker container. In this case you build a Docker image and either use it locally or deploy it somewhere. The requirements files prepared above can sit in the same directory as the Dockerfile. The image is created from a Dockerfile which may look like this:

# Base image from a public repository
FROM python:3.7.1

# Copy file from the local directory to the image
COPY requirements.txt /build/requirements.txt

# Just example of an environmental variable set inside the image
ENV AWS_REGION="us-east-1"

# Install dependencies from requirements.txt
RUN pip3 install --upgrade -r /build/requirements.txt

# Copy local files to the image
COPY your_script.py /opt/your_script.py
COPY ./dep/*.py /opt/dep/

# Be sure that the script is executable (it needs a '#!/usr/bin/env python3'
# shebang line to be run directly)
RUN chmod +x /opt/your_script.py

# What will be executed
ENTRYPOINT ["/opt/your_script.py", "-c", "parameter" ]

The stencil of the Dockerfile is ready, now build and deploy:

# Build from Dockerfile located in the work directory
docker build . -t image_name

# Build from specific Dockerfile
docker build -f ../other/Dockerfile -t image_name ../other # the last argument is the build context

# Check new image in your local repository
docker images | head

# Deploy image to some server
# pv is optional, it is for monitoring of the progress of data through a pipe
docker save image_name | bzip2 | pv | ssh user@server 'bunzip2 | docker load'

# Run the image and name the running container 'application'
docker run --name="application" image_name
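
For local testing it is often handy to remove the container automatically after it exits and to override variables baked into the image; the flags below are standard docker run options, and the region value is only an example:

docker run --rm -it -e AWS_REGION="eu-west-1" image_name # throwaway interactive run with an overridden variable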