(information for group members)
Our group’s project files are stored on the CCR computers under
Please do not store project files in your home directory. Instead, after you first log in to one of the cluster login nodes, create a directory for yourself in /projects/academic/jochena/ as follows:
where mydir is your UB IT name or the name of a guest account. Then run
cd $HOME; ln -s /projects/academic/jochena/mydir .
From then on you can change to the project directory directly from your home directory.
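Put together, the first-time setup looks like the sketch below. It uses throwaway temp directories created with mktemp so it can run anywhere; the comments note what the real values would be on the cluster.

```shell
# Illustration of the first-time setup with stand-in directories;
# on the cluster, $PROJ is /projects/academic/jochena, $HOME_DIR is
# your $HOME, and "mydir" is your UB IT name or guest account name.
PROJ=$(mktemp -d)       # stand-in for /projects/academic/jochena
HOME_DIR=$(mktemp -d)   # stand-in for $HOME
ME=mydir                # stand-in for your UB IT name

mkdir "$PROJ/$ME"                   # create your project directory
ln -s "$PROJ/$ME" "$HOME_DIR/$ME"   # symlink it into your home directory
```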
Further, add to your .bashrc
in order to access a collection of scripts and other useful software.
Various other directories and off-line storage space are available for keeping larger files for an extended period of time. Please don’t use the precious projects file space to store large files that aren’t currently needed; ask one of the more senior group members about file storage.
All jobs are submitted using the SLURM queuing system:
See the CCR documentation for information about SLURM.
Our cluster partition is accessed by adding the following lines to your sbatch scripts:
#SBATCH --clusters=faculty
#SBATCH --partition=jochena
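These two lines go into a job script like the minimal sketch below. The job name, resource requests, and program are placeholders, not group conventions; adjust them to your calculation.

```shell
#!/bin/bash
# Minimal sketch of a job script for our partition; everything except
# the --clusters and --partition lines is a placeholder to adapt.
#SBATCH --clusters=faculty
#SBATCH --partition=jochena
#SBATCH --job-name=myjob
#SBATCH --nodes=1
#SBATCH --ntasks-per-node=16     # matches the 16-core cpn-f15 nodes
#SBATCH --time=24:00:00
#SBATCH --output=slurm-%j.out

srun ./my_program                # placeholder for the actual calculation
```

Submit with sbatch jobscript.sh and monitor with squeue -M faculty -u $USER.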
Use the commands
sinfo -l --Node -p jochena -M faculty
snodes all faculty/jochena
to see cluster usage and hardware information. If you need nodes with fast network interconnects (InfiniBand, IB), these are the cpn-f15 nodes. Currently, the cluster nodes in our partition are as follows (memory is the total per node, in GB):
Node            CPUs   Mem(GB)
------------------------------
cpn-f15-03       16     128
cpn-f15-05       16     128
cpn-f15-06       16     128
cpn-f15-07       16     128
cpn-f15-08       16     128
cpn-f15-09       16     128
cpn-f15-10       16     128
cpn-f15-11       16     128
cpn-f15-13       16     128
cpn-f15-14       16     128
cpn-f15-15       16     128
cpn-f15-17       16     128
cpn-f15-18       16     128
cpn-f15-23       16     128
cpn-f15-24       16     128
cpn-f15-25       16     128
cpn-p26-19       12     128
cpn-p26-20       12     128
cpn-p26-21       12     128
cpn-p26-23       12     128
cpn-p26-24       12     128
cpn-p26-25       12     128
cpn-u28-23       24     187
cpn-u28-24       24     187
cpn-u28-25       24     187
cpn-u28-26       24     187
cpn-u28-27       24     187
cpn-v05-03       56     512
cpn-v05-04       56     512
cpn-v05-05       56     512
cpn-v05-06       56     512
cpn-v09-17       56     512
cpn-v09-18       56     512
cpn-v09-19       56     512
cpn-v09-20       56     512
cpn-v09-24       56     512
cpn-v09-25       56     512
cpn-v09-26       56     512
cpn-v09-27       56     512
cpn-v09-28       56    1000
cpn-v09-29       56    1000
cpn-v09-30       56    1000
cpn-v09-31       56    1000
cpn-v09-32       56    1000
cpn-v11-16       40     187
cpn-v11-17       40     187
cpn-v11-14-01    40     187
cpn-v11-14-02    40     187
cpn-v11-15-01    40     187
cpn-v11-15-02    40     187
Every so often a clean-up is needed to recover wasted disk space: over time you will accumulate large files that are no longer needed. Clean up your directories frequently. Use a command such as
find . -size +100M
to locate files larger than approximately 100 MB in the working directory and all subdirectories. A size specification of +1G will locate files of 1 GB or larger. Start with that, or with +10G, check whether you really need to keep those files, and then continue with smaller size specifications.
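As a sketch of combining this into one command (GNU find assumed, as on the cluster), the following prints every file over 1 GB with its size in bytes, largest first:

```shell
# List files over 1 GB under the current directory, largest first;
# -printf is a GNU find extension.
find . -type f -size +1G -printf '%s\t%p\n' | sort -rn
```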
Do not delete input and output files. Very large output files should be compressed (gzip file); for extremely large outputs you may instead delete the parts you will never use.
Use gzip to compress large files, delete unused TAPE files from ADF, etc. You can view compressed output files with zless. The command du -sh * lists the files and directories under the current working directory together with how much space each uses. This is an alternative way to find out which directories contain large files. Finally, there is a graphical tool named gdmap (in /software/bin) that lets you locate large files in your directories easily.
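The compression workflow can be illustrated as follows; the file here is a throwaway stand-in for a real large output:

```shell
# Illustration with a throwaway file standing in for a large output.
tmp=$(mktemp -d)
echo "demo output" > "$tmp/job.out"   # pretend this is a multi-GB output
gzip "$tmp/job.out"                   # compresses in place -> job.out.gz
zcat "$tmp/job.out.gz"                # use zless interactively to page
du -sh "$tmp"                         # total space used by this directory
```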
We have a large and growing number of useful scripts for everyday tasks. There is a project ’scripts’ in our Git repository on the cluster, and various group members have put useful software in directories under /projects/academic/jochena/shared/