(information for group members)
Our group’s project files are stored on the CCR computers under
/projects/academic/jochena/
Please do not store project files in your home directory. Instead, after you first login to one of the cluster login nodes, create a directory for yourself in /projects/academic/jochena/ as follows:
mkdir /projects/academic/jochena/mydir
where mydir is your UB IT name or the name of a guest account. Then
cd $HOME; ln -s /projects/academic/jochena/mydir projects
From then on you can easily change to the projects directory from your home directory via cd projects.
There are various directories and off-line storage space available to store larger files for an extended period of time. Please don’t use the precious projects file space for storing large files that aren’t currently needed. Ask one of the more senior group members about file storage.
All jobs are submitted using the SLURM queuing system:
See the CCR documentation for information about SLURM.
Our cluster partition is accessed by adding the following lines to your sbatch scripts:
#SBATCH --clusters=faculty
#SBATCH --partition=jochena
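A complete job script using these lines might look as follows. This is only a sketch: the job name, resource requests, module name, and program call are placeholders, not our actual setup — adjust them for your calculation.

```shell
#!/bin/bash
#SBATCH --clusters=faculty
#SBATCH --partition=jochena
#SBATCH --nodes=1
#SBATCH --ntasks-per-node=56
#SBATCH --time=24:00:00
#SBATCH --job-name=myjob

# placeholder commands: load your software module and run your program
module load mysoftware
srun myprogram input.inp > output.out
```

Save the script (e.g. as myjob.sh) and submit it with sbatch myjob.sh.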
Use the commands
sinfo -l --Node -p jochena -M faculty
or
snodes all faculty/jochena
to see cluster usage and hardware information. Currently, the cluster nodes in our partition are as follows (the memory is total per node in GB):
Node           CPUs   Mem(GB)
------------------------------
cpn-f12-17      64     512
cpn-p28-20      64     512
cpn-p28-21      64     512
cpn-p28-22      64     512
cpn-p28-34      64    1000
cpn-p28-35      64    1000
cpn-v05-03      56     512
cpn-v05-04      56     512
cpn-v05-05      56     512
cpn-v05-06      56     512
cpn-v09-17      56     512
cpn-v09-18      56     512
cpn-v09-19      56     512
cpn-v09-20      56     512
cpn-v09-24      56     512
cpn-v09-25      56     512
cpn-v09-26      56     512
cpn-v09-27      56     512
cpn-v09-28      56    1000
cpn-v09-29      56    1000
cpn-v09-30      56    1000
cpn-v09-31      56    1000
cpn-v09-32      56    1000
cpn-v11-16      40     187
cpn-v11-17      40     187
cpn-v11-14-01   40     187
cpn-v11-14-02   40     187
cpn-v11-15-01   40     187
cpn-v11-15-02   40     187
Every so often a clean-up of files is needed in order to recover wasted disk space. Over time you will accumulate large files that are not needed anymore and take up space. Clean up your directories frequently. Use a command such as
find . -size +100M
to locate files larger than approximately 100 MB in the working directory and all subdirectories. A size specification of +1G will locate files 1 GB or larger. Start with that, or +10G, and see if you really need to keep those files, then continue with a smaller file size specification.
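To also see how large the offending files are, the find output can be combined with sort and awk. This is a sketch assuming GNU find (standard on Linux clusters such as CCR's):

```shell
# list files over 1 GB under the current directory, largest first
find . -type f -size +1G -printf '%s\t%p\n' \
  | sort -rn \
  | awk -F'\t' '{printf "%6.1f GB  %s\n", $1/2^30, $2}'
```

Change +1G to +100M or +10G to adjust the size threshold, as described above.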
Do not delete input and output files. Very large output files should be compressed (with gzip), or, preferably, trimmed: delete the parts of extremely large outputs that you will never use and keep the remainder uncompressed.
Use gzip if you need to compress large files. You can view compressed output files with zless. Delete unnecessary scratch files, ADF rkf files, Gaussian checkpoint files, etc. Delete large cube files if they can be easily regenerated.
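For example (the file name and its contents are placeholders standing in for a real calculation output):

```shell
echo 'Total Energy: -76.4' > job1.out   # stand-in for a real output file
gzip job1.out                           # replaces job1.out with job1.out.gz
zgrep -i 'energy' job1.out.gz           # search the compressed file directly
gunzip job1.out.gz                      # restore the original when needed
rm job1.out                             # clean up the demo file
```

zless job1.out.gz pages through the compressed file interactively without uncompressing it on disk.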
The command du -sh * lists the files and directories in the current working directory and shows how much space each one uses (du -sh alone prints only a single total for the current directory). This is an alternative way to find out which directories contain large files.
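A sorted variant shows the largest entries first (GNU sort -h is assumed, which understands human-readable sizes). The sketch below sets up a throwaway directory purely for demonstration; in real use, run the du pipeline where your data lives:

```shell
tmp=$(mktemp -d)                                     # demo directory only
mkdir "$tmp/small" "$tmp/large"
dd if=/dev/zero of="$tmp/small/a.dat" bs=1M count=1  2>/dev/null
dd if=/dev/zero of="$tmp/large/b.dat" bs=1M count=50 2>/dev/null
(cd "$tmp" && du -sh -- * | sort -rh)                # largest entries first
rm -rf "$tmp"
```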
We have a large and growing number of useful scripts for everyday tasks. There is a project ’scripts’ in our Git repository on the cluster (see /projects/academic/jochena/git/), and various group members have put useful software in directories under /projects/academic/jochena/shared/
______________
© 2011– 2025 J. Autschbach.