


On this page

  • Create a workspace
    • Send an email reminder before workspace expiration
    • Create a shared workspace
  • Manage workspaces
    • List workspaces
    • Release a workspace
    • Extend a workspace
    • Share a workspace with another user

High performance scratch space: HPC Workspace

Unity provides a tool called HPC Workspace that allows you to create and manage high performance scratch space in a sustainable fashion. The following instructions describe how to create workspaces and how to manage them with the tool's key features.

Create a workspace

Use the ws_allocate command to create a workspace. The command accepts several options; to view them all, run ws_allocate -h.

The following code sample shows how to create a workspace for a single user for a given number of days. The maximum duration for a workspace is 30 days.

Note: The number of extensions available for scratch spaces has been increased from 3 to 5. (The sample outputs on this page, which show 3 remaining extensions, predate this change.)
username@login2:~$ ws_allocate simple 14
Info: creating workspace.
/scratch/workspace/username-simple
remaining extensions  : 3
remaining time in days: 14

After running the ws_allocate command, you can confirm that the workspace directory was created:

username@login2:~$ ls -ld /scratch/workspace/username-simple
drwx------ 2 username username 4096 Mar  2 17:48 /scratch/workspace/username-simple
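
Rather than hard-coding the workspace path in your scripts, you can look it up by name. The hpc-workspace suite that provides ws_allocate also includes a ws_find helper that prints a workspace's directory; the following is a minimal sketch, assuming ws_find is available on Unity:

username@login2:~$ ws_find simple
/scratch/workspace/username-simple

Inside a batch script, the same lookup avoids hard-coded paths (illustrative):

WORKSPACE=$(ws_find simple)   # resolve the workspace path by name
cd "$WORKSPACE"               # run the job from scratch space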

Send an email reminder before workspace expiration

Use the -m option to specify an email address to notify before the workspace expires. This option requires the -r option, which sets how many days before expiration the reminder is sent.

For example:

username@login2:~$ ws_allocate -m username@umass.edu -r 1 workspacename 2
Info: creating workspace.
/scratch/workspace/username-workspacename
remaining extensions  : 3
remaining time in days: 2

Create a shared workspace

The HPC Workspace utility allows you to create shared workspaces, which are useful for collaborating on a project. To create a shared workspace, all collaborators must belong to a common group, such as a particular PI group.
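
If you are unsure which groups you belong to, the standard groups command lists them; a quick check (the group names in the output are illustrative):

username@login2:~$ groups
username pi_pi-username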

The following code sample shows how to create a workspace that can be shared with a particular PI group.

username@login2:~$ ws_allocate -G pi_pi-username shared
Info: creating workspace.
/scratch/workspace/username-shared
remaining extensions  : 3
remaining time in days: 0

After creating a shared workspace, you can confirm that the directory was created with group sharing permissions:

username@login2:~$ ls -ld /scratch/workspace/username-shared
drwxrws--- 2 username pi_pi-username 4096 Mar  3 19:08 /scratch/workspace/username-shared

Manage workspaces

HPC Workspace provides many useful features for managing your workspaces after they are created. The following instructions show you how to list your workspaces, release a workspace, extend a workspace, and share a workspace with another user.

List workspaces

Use the ws_list command to view a list of your workspaces along with key information about them.

For example:

username@login2:~$ ws_list -v
id: username-workspacename
     workspace directory  : /scratch/workspace/username-workspacename
     remaining time       : 1 days 23 hours
     creation time        : Fri Mar  3 16:56:51 2023
     expiration date      : Sun Mar  5 16:56:50 2023
     filesystem name      : workspace
     available extensions : 3
     acctcode             : username
     reminder             : Sat Mar  4 16:56:50 2023
     mailaddress          : username@umass.edu
id: username-simple
     workspace directory  : /scratch/workspace/username-simple
     remaining time       : 0 days 23 hours
     creation time        : Fri Mar  3 16:56:44 2023
     expiration date      : Sat Mar  4 16:56:44 2023
     filesystem name      : workspace
     available extensions : 3
     acctcode             : username
     reminder             : Sat Mar  4 16:56:44 2023
     mailaddress          : None

Release a workspace

Use the ws_release command to release a workspace once you are done with it. Releasing a workspace means that the ID can be reused, and the directory is no longer accessible.

For example:

username@login2:~$ ws_release workspacename
username@login2:~$ ws_release simple
username@login2:~$ ws_list
username@login2:~$
Tip: Please release workspaces. Releasing workspaces when they are no longer needed helps optimize scratch space for all Unity users.

Warning: Releasing a workspace does not delete the data immediately; deletion can take place at any time once the workspace is released.
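
Because a released workspace's directory becomes inaccessible, copy anything you still need out of the workspace before releasing it. A minimal sketch using rsync; the destination path is only an illustration:

username@login2:~$ rsync -a /scratch/workspace/username-simple/ ~/simple-results/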

Extend a workspace

HPC Workspace allows a limited number of extensions for each workspace. Use ws_list to see how many extensions a workspace has remaining.
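
For a quick check, you can filter the verbose listing with grep; a minimal sketch (the output is illustrative, following the ws_list -v format shown above):

username@login2:~$ ws_list -v | grep -E 'id:|available extensions'
id: username-simple
     available extensions : 3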

The following code sample shows how to extend a workspace for 30 days.

username@login2:~$ ws_extend simple 30
Info: extending workspace.
/scratch/workspace/username-simple
remaining extensions  : 2
remaining time in days: 30

Share a workspace with another user

HPC Workspace also allows you to share a workspace with another user whether or not they are in your PI group. Use the ws_share command to share a workspace with a user.

For example:

username@login2:~$ ws_share share simple username2_umass_edu
Tip:

  • Replace share with unshare to remove the share.
  • Replace share with list to see which users the workspace is currently shared with.
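
For example, to check the current shares and then remove one (an illustrative session; the list output is an assumption based on the tip above):

username@login2:~$ ws_share list simple
username2_umass_edu
username@login2:~$ ws_share unshare simple username2_umass_edu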