HPC Resources

A summary of current resources on Yale's HPC clusters (not including the 100 TB loan).

Compute

Grace
GROUP          NODES  CORES
gerstein1tb        1     32
gerstein          32    640

Farnam
GROUP          NODES  CORES
gerstein          34    384
gerstein_gpu       3     60

Storage

Grace
GROUP     TB
gerstein  400

Farnam
GROUP     TB
gerstein  366

Farnam partition

The absolute path is "/ysm-gpfs/pi/gerstein/". Please put most of your data here.

After creating your folder (don't forget to set permissions with chmod as you like), you can create a soft link from that folder back to your home directory for quick access:

ln -s [your folder under pi] [the soft link under your home directory, e.g. ~/xx]
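For example (the folder name "mynetid" and the link name "~/pi" below are placeholders; substitute your own):

mkdir /ysm-gpfs/pi/gerstein/mynetid
chmod 750 /ysm-gpfs/pi/gerstein/mynetid   # owner: full access; group: read/execute; others: none
ln -s /ysm-gpfs/pi/gerstein/mynetid ~/pi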

Instructions on setting up SSH and desktop clients for Farnam

1) It is possible to set up SSH so that you only have to authenticate once; subsequent SSH sessions reuse that connection. Please see our detailed instructions here:

http://research.computing.yale.edu/support/hpc/user-guide/ssh-sample-configuration
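As a minimal sketch, connection sharing can be enabled in ~/.ssh/config roughly as follows (the host alias, hostname, and username below are assumptions; use the exact values from the linked guide):

Host farnam
    # hostname and username are placeholders; confirm against the linked guide
    HostName farnam.hpc.yale.edu
    User YOUR_NETID
    # authenticate once, then reuse the open connection for later sessions
    ControlMaster auto
    ControlPath ~/.ssh/sockets/%r@%h-%p
    ControlPersist 2h

Create the socket directory first with: mkdir -p ~/.ssh/sockets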

2) We have detailed instructions on our website on how to use Cyberduck and WinSCP to connect using Duo. Please see:
http://research.computing.yale.edu/support/hpc/user-guide/transfer-files-or-cluster

I believe it is also possible to use FileZilla after a similar configuration to avoid creating new connections for every transfer, but I don't see the need, given that Cyberduck and WinSCP both work.