About

Unity is a collaborative research computing platform serving a variety of member institutions. If you have questions about Unity services at your specific institution, please see our point of contact list.

People

Unity is supported by an interdisciplinary and multi-institutional team of systems administrators, research computing facilitators, and institutional representatives.

The systems administration team manages the hardware, networking, software, and services that form Unity. Our sysadmins are responsible for hundreds of compute nodes, petabytes of storage, and services such as our Open OnDemand deployment and the Unity portal, and for making it all work seamlessly together for a great user experience.

The facilitation team helps Unity users deploy their workloads onto Unity and advises on workflow optimization, software tuning, and technical issues. In addition, the facilitation team provides HPC workshops and manages an active Slack workspace for the Unity community. For groups needing mid- to long-term project support, facilitator time is available on an hourly basis. For help, contact us.

Compute resources

As of late 2024, Unity is a cluster of over 25,000 cores based on Ubuntu 24.04 LTS and Slurm, with a heterogeneous network of Ethernet- and InfiniBand-linked compute nodes. Unity hosts a variety of compute node architectures: the majority of compute nodes are Intel or AMD x86-64 nodes, supplemented by a small number of ARM and Power9 nodes. Additionally, Unity contains approximately 1,300 Nvidia GPUs, including over 100 A100, L40S, and V100 GPUs. Visit the node list for a complete description of Unity’s compute nodes.
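
Because the node mix is heterogeneous, it can help to ask Slurm what each node actually offers before targeting a job at it. The Python sketch below is a minimal, hedged example that shells out to the standard Slurm sinfo command to list per-node CPU counts, memory, feature tags, and GPU resources; the sinfo format fields are standard Slurm, but the feature names and GRES strings you will see depend on how Unity is configured, so treat this as a sketch rather than Unity-specific guidance.

```python
import subprocess

# Minimal sketch: list what Slurm reports for each node.
# Standard sinfo format specifiers:
#   %n = node hostname, %c = CPUs, %m = memory (MiB),
#   %f = feature tags (often include the CPU architecture),
#   %G = generic resources (GRES), e.g. GPU type and count.
FORMAT = "%n|%c|%m|%f|%G"

def list_nodes():
    out = subprocess.run(
        ["sinfo", "--Node", "--noheader", "--format", FORMAT],
        capture_output=True, text=True, check=True,
    ).stdout
    nodes = []
    for line in out.splitlines():
        name, cpus, mem, features, gres = line.strip().split("|")
        nodes.append({
            "node": name,
            "cpus": int(cpus.rstrip("+")),
            "mem_gb": round(int(mem.rstrip("+")) / 1024),
            "features": features.split(","),
            "gres": gres,  # "(null)" on CPU-only nodes
        })
    return nodes

if __name__ == "__main__":
    for node in list_nodes():
        print(node)
```

The GRES string reported for a node is the same kind of identifier you would reference in a job script's --gres request; consult the node list for the exact names in use on Unity.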

Storage resources

Unity contains 1.5 PB of high-performance VAST storage for home, work, and scratch directories, plus 2 PB of storage from the New England Storage Exchange (NESE), a regionally managed Ceph cluster housed in the same data center as Unity, the Massachusetts Green High Performance Computing Center (MGHPCC). We offer a variety of storage options to suit different performance and size needs.
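
As a small, hedged illustration of how the home, work, and scratch tiers look from a user session, the Python sketch below reports how full each filesystem is. The mount points used here (/home, /work, /scratch) are placeholders rather than confirmed Unity paths, and this call shows filesystem-wide totals, not per-user or per-group quotas.

```python
import shutil
from pathlib import Path

# Placeholder mount points for the storage tiers described above;
# the actual paths on Unity may differ.
TIERS = {
    "home": Path("/home"),
    "work": Path("/work"),
    "scratch": Path("/scratch"),
}

def report_usage():
    for tier, path in TIERS.items():
        if not path.exists():
            print(f"{tier:8s} {path}  (not mounted on this system)")
            continue
        usage = shutil.disk_usage(path)  # filesystem-wide totals, not quotas
        used_tb = usage.used / 1e12
        total_tb = usage.total / 1e12
        print(f"{tier:8s} {path}  {used_tb:,.1f} / {total_tb:,.1f} TB used")

if __name__ == "__main__":
    report_usage()
```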

Purchasing

Research groups with compute or storage needs beyond the basic Unity access provided by their institution can contact their institution’s point of contact, our help desk at hpc@umass.edu, or, for institution-level inquiries, Tom Bernardin. Hardware purchases must be approved by the Unity team before they are made.

Grant information

Unity resources and purchases can be written into grants. We provide a Unity NSF boilerplate document to simplify the grant writing process. Contact hpc@umass.edu to discuss your grant requirements.

Terms of service

Unity’s Terms of Service provide the terms under which Unity may be accessed and used. All use of Unity must adhere to these terms to help ensure fair and equitable access. If you have any questions, reach out to hpc@umass.edu.

Security statement

Unity’s Security Statement provides information on the security of the Unity platform. For any further questions or concerns, reach out to hpc@umass.edu.

Member institutions: University of Massachusetts Amherst, University of Rhode Island, University of Massachusetts Dartmouth, University of Massachusetts Lowell, University of Massachusetts Boston, Mount Holyoke College, and Smith College.