
On this page

  • Create a virtual environment with venv
    • Activate the environment
    • Install packages
    • Deactivate the virtual environment
    • Move your venv virtual environment
  • Submit a Slurm job with venv
  • Other virtual environment software options

Virtual environments in Python

The venv module in Python creates a lightweight virtual environment that lets you isolate a specific version of the Python interpreter and its software libraries for each project. The venv module is part of Python's standard library and does not require a separate installation. Venv is ideal for project-based workflows.

There are a few advantages to working with virtual environments:

  • They encapsulate project-specific dependencies, so updating packages for one project doesn't break the code of another
  • They prevent dependency conflicts
  • They make projects easier to reproduce
warning
The venv module attaches the currently active Python interpreter to the environment at creation time. If you need a specific Python version for your virtual environment, first load it with module load python/VERSION and then create your virtual environment with venv.
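To confirm which Python interpreter is currently active before creating an environment, a quick check such as the following can help (module list shows the modules currently loaded in your session):

module list
which python
python --version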

Create a virtual environment with venv

To create a virtual environment with venv, enter the following command in your command line:

python -m venv /PATH/TO/NEW/ENVIRONMENT

The virtual environment is created in the /PATH/TO/NEW/ENVIRONMENT directory, so be sure to replace /PATH/TO/NEW/ENVIRONMENT with the path to the directory where you want your environment to be created. This command also creates any intermediate directories that do not already exist.

lightbulb
To check other options for venv, run python -m venv -h.
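As a concrete sketch, the commands below load a specific Python module and create an environment in a hypothetical ~/envs/my-project directory (substitute your own version and path):

module load python/3.11.7
python -m venv ~/envs/my-project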

Activate the environment

To activate the new environment, use the source command:

source /PATH/TO/MY/ENVIRONMENT/bin/activate

Be sure to replace /PATH/TO/MY/ENVIRONMENT with the actual path to your virtual environment's directory. The terminal prompt displays the name of your environment in parentheses, which indicates that the environment is active.
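For example, if the environment was created in a directory named my-project (a hypothetical name), the prompt changes to something like:

(my-project) $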

Install packages

Within your virtual environment, you can use common installation tools such as pip to install packages.

To install the packages that you need, use pip install followed by the package(s) you want. The following example installs the NumPy and pandas packages:

pip install numpy pandas

These packages are installed into the active venv environment.

lightbulb
pip is automatically included in the venv environment. You can disable this behavior by using the --without-pip option.
lightbulb
You can create a requirements.txt file from the venv environment for portability with the command pip freeze > requirements.txt.
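To quickly check that the packages landed in the active environment, something like the following can be used (a minimal verification, assuming the NumPy and pandas example above):

pip list
python -c "import numpy, pandas; print(numpy.__version__, pandas.__version__)"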

Deactivate the virtual environment

To deactivate the virtual environment, use the deactivate command:

deactivate

Move your venv virtual environment

A venv environment itself is not portable, but it can easily be recreated in another location from a requirements.txt file. The following steps show how to create a new environment called MY_NEW_ENV and install the packages listed in the requirements.txt file.

  1. If you haven't already, create a requirements.txt file from the original environment using the following command:

    pip freeze > requirements.txt

  2. To load the correct Python version, use the following command:

    module load python/3.11.7

  3. To create the new venv environment, use the following command:

    python -m venv MY_NEW_ENV

  4. To activate the new environment, use the following command:

    source MY_NEW_ENV/bin/activate

  5. To install the packages from the requirements.txt file into the activated environment, use the following command:

    pip install -r requirements.txt

warning
The requirements.txt file does not record the Python version used. It is up to you to keep track of it and recreate the environment with the correct Python interpreter.
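One simple way to keep track of the interpreter is to record it alongside the requirements file (python-version.txt is just a suggested file name, not something venv or pip reads):

python --version > python-version.txt
pip freeze > requirements.txt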

Submit a Slurm job with venv

To submit a batch job that uses a venv environment, source the environment near the top of your sbatch script.

The following example assumes there is a script called testscript.py that needs to be submitted. The script prints the versions of the packages installed into the MY_NEW_ENV environment from the Move your venv virtual environment example.

testscript.py

import numpy as np
import pandas as pd
import sys

print(f"Python version = {sys.version}")
print(f"Numpy version = {np.version.version}")
print(f"Pandas version = {pd.__version__}")

To submit this script using a venv environment, create the following script and save it as main.sh.

main.sh

#!/usr/bin/bash
#SBATCH --partition=cpu-preempt      # Partition (queue) name
#SBATCH --ntasks=1                   # Number of tasks
#SBATCH --nodes=1                    # Number of nodes
#SBATCH --mem=1gb                    # Job memory request
#SBATCH --time=00:01:00              # Time limit hrs:min:sec
#SBATCH --output=pyenv_test_%j.log   # Standard output and error log

# Load venv env
source /PATH/TO/MY/ENVIRONMENT/bin/activate

# Run script
python testscript.py

Submit the main.sh script to Slurm using the command sbatch main.sh. The results are captured in the file specified by --output.
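A minimal sketch of the submit-and-check workflow might look like the following (the job ID in the log file name is assigned by Slurm; replace 12345 with the ID that sbatch prints):

sbatch main.sh
squeue -u $USER
cat pyenv_test_12345.log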

Other virtual environment software options

The following are some additional software options for creating and/or managing virtual environments.

  1. Virtualenv is the original and still widely used tool for creating virtual environments in Python; venv started as a subset of virtualenv. Its biggest advantage is that it also works with older Python versions.
  2. Virtualenvwrapper is a set of extension tools for managing multiple virtualenv environments.
  3. Pipreqs is a helpful tool for creating requirements.txt files. While pip freeze lists every package installed in the environment, pipreqs scans your project's imports and includes only the packages your code actually uses. An example is sketched below.
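For example, a minimal pipreqs workflow might look like the following (pipreqs is installed with pip; /PATH/TO/MY/PROJECT is a placeholder for your project directory):

pip install pipreqs
pipreqs /PATH/TO/MY/PROJECT

This writes a requirements.txt file into the project directory based on the imports found in its Python files.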
Last modified: Friday, March 14, 2025 at 2:20 PM. See the commit on GitLab.