FNAL - LQCD Documentation

USQCD Clusters at Fermilab

pi0 cluster (pi0 & pi0g nodes)

  pi0:  2-socket, 8-core (16 cores total)
        2.6 GHz Intel E5-2650v2 "Ivy Bridge"
        128 GB memory per node (8 GB/core)
        QDR (40 Gbit/s) InfiniBand
        Total nodes: 314

  pi0g: 2-socket, 8-core (16 cores total)
        2.6 GHz Intel E5-2650v2 "Ivy Bridge"
        128 GB memory per node (8 GB/core)
        QDR (40 Gbit/s) InfiniBand
        4 NVIDIA Tesla K40m GPUs per node
        Total nodes: 32
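
The GPU count listed above can be verified from any CUDA program with a
standard device query. The sketch below is illustrative only and uses just
the CUDA runtime API, nothing FNAL-specific; on a pi0g node it should
report four Tesla K40m devices. Compile with nvcc.

    #include <cstdio>
    #include <cuda_runtime.h>

    int main() {
        int count = 0;
        cudaError_t err = cudaGetDeviceCount(&count);
        if (err != cudaSuccess) {
            std::fprintf(stderr, "cudaGetDeviceCount: %s\n",
                         cudaGetErrorString(err));
            return 1;
        }
        // Expected: 4 devices on a pi0g node (2 on a Dsg node, below).
        std::printf("%d CUDA device(s)\n", count);
        for (int i = 0; i < count; ++i) {
            cudaDeviceProp prop;
            cudaGetDeviceProperties(&prop, i);
            std::printf("  device %d: %s, %.1f GB global memory\n",
                        i, prop.name,
                        prop.totalGlobalMem / (1024.0 * 1024.0 * 1024.0));
        }
        return 0;
    }

See the Compilers and Using GPUs pages for the site-specific toolchain.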

Bc cluster (Bc nodes)

  Bc:   4-socket, 8-core (32 cores total)
        2.8 GHz AMD Opteron 6320
        64 GB memory per node (2 GB/core)
        QDR (40 Gbit/s) InfiniBand
        Total nodes: 224
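
Production jobs on these InfiniBand clusters are typically MPI programs
submitted through the TORQUE batch system (covered separately under
"Submitting jobs to the TORQUE Batch System"). As an illustration only,
not site documentation, here is a minimal MPI program that reports where
each rank runs:

    #include <cstdio>
    #include <mpi.h>

    int main(int argc, char **argv) {
        MPI_Init(&argc, &argv);

        int rank = 0, size = 0;
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &size);

        char host[MPI_MAX_PROCESSOR_NAME];
        int len = 0;
        MPI_Get_processor_name(host, &len);

        // With one rank per core, a fully packed Bc node carries 32 ranks.
        std::printf("rank %d of %d on %s\n", rank, size, host);

        MPI_Finalize();
        return 0;
    }

Launched as, for example, mpirun -np 64 ./hello, this would span two Bc
nodes at one rank per core; actual queue names and launch syntax are
described in the TORQUE pages.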

Ds cluster (Ds & Dsg nodes)

  Ds:   4-socket, 8-core (32 cores total)
        2.0 GHz AMD Opteron 6128
        64 GB memory per node (2 GB/core)
        QDR (40 Gbit/s) InfiniBand
        Total nodes: 196

  Dsg:  2-socket, 4-core (8 cores total)
        2.53 GHz Intel Xeon E5630
        48 GB memory per node (6 GB/core)
        QDR (40 Gbit/s) InfiniBand
        2 NVIDIA Tesla M2050 GPUs per node
        Total nodes: 20
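
On the multi-GPU nodes (four K40m on pi0g, two M2050 on Dsg), each MPI
rank typically drives its own device. A common round-robin mapping is
sketched below; the OMPI_COMM_WORLD_LOCAL_RANK environment variable is an
Open MPI convention, assumed here purely for illustration, and other MPI
implementations expose the node-local rank differently.

    #include <cstdio>
    #include <cstdlib>
    #include <cuda_runtime.h>

    int main() {
        int ndev = 0;
        cudaGetDeviceCount(&ndev);  // 4 on pi0g, 2 on Dsg

        // Node-local rank; OMPI_COMM_WORLD_LOCAL_RANK is an Open MPI
        // convention (an assumption here, not an FNAL guarantee).
        const char *env = std::getenv("OMPI_COMM_WORLD_LOCAL_RANK");
        int local_rank = env ? std::atoi(env) : 0;

        // Round-robin: node-local rank i drives GPU (i mod ndev).
        if (ndev > 0)
            cudaSetDevice(local_rank % ndev);

        int dev = -1;
        cudaGetDevice(&dev);
        std::printf("local rank %d -> GPU %d of %d\n", local_rank, dev, ndev);
        return 0;
    }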