
Computing cluster information


Projects directory

Our group’s project files are stored on the CCR computers under

/ifs/projects/jochena/shared/

Please do not store project files in your home directory. Instead, after you first log in to one of the cluster login nodes, create a directory for yourself in /ifs/projects/jochena/shared/ as follows:

mkdir /ifs/projects/jochena/shared/mydir

where mydir is your UBIT name or the name of a guest account. Then create a symbolic link to it in your home directory:

cd $HOME; ln -s /ifs/projects/jochena/shared/mydir .

From then on you can easily change to the projects directory from your home directory.
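
To check that the link points where it should, list it with ls -ld (mydir is the placeholder from above); the output should show the link target:

ls -ld $HOME/mydir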

There are various directories and off-line storage spaces available for keeping larger files over extended periods of time. Please don’t use the precious projects file space for storing large files that aren’t currently needed. Ask one of the more senior group members about file storage.

Queues, hardware

All jobs are submitted using the SLURM queuing system:

See the CCR documentation for information about SLURM.

Our cluster partition is accessed by adding the following lines to your sbatch scripts:

#SBATCH --partition=jochena  
#SBATCH --account=pi-jochena
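
For reference, a complete job script might look like the following minimal sketch; the job name, resource requests, module name, and run command are placeholders that you should adapt to your calculation:

#!/bin/bash
#SBATCH --partition=jochena
#SBATCH --account=pi-jochena
#SBATCH --job-name=myjob
#SBATCH --nodes=1
#SBATCH --ntasks-per-node=16
#SBATCH --time=24:00:00

# load the software environment (the module name is just an example)
module load adf

# run from the directory the job was submitted in
cd $SLURM_SUBMIT_DIR

# placeholder for the actual calculation
./run-calculation.sh

Submit the script with sbatch myjob.slurm and monitor it with squeue -u $USER.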

Use sview to see cluster usage and hardware information. Currently, the cluster nodes in our partition are as follows (memory is the total per node):

f15n03  
f15n04  
f15n05  
f15n06  
f15n07  
f15n08  
f15n09  
f15n10  
f15n11  
f15n12  
f15n13  
f15n14  
f15n15  
f15n16  
f15n17  
f15n18  : 16-core 128 GB Nehalem (year-2013 nodes, for any project)  
 
f15n23  
f15n24  
f15n25  : 16-core 128 GB Nehalem (year-2013 nodes, for NSF project)  
 
d12n14  :  8-core  24 GB  
d12n15  :  8-core  24 GB  
d11n13  :  8-core  48 GB  
d11n19  : 12-core  48 GB  
d11n20  : 12-core  48 GB Nehalem (years 2009-2012, for DOE project)
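
If you prefer the command line over sview, the standard SLURM tools give the same information; for example:

sinfo --partition=jochena --Node --long    # node states, cores, memory
squeue --partition=jochena                 # jobs currently in the partition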

File clean-up

Every so often a clean-up of files is needed to recover wasted disk space. Over time you will accumulate large files that are no longer needed but still take up space. Clean up your directories frequently. Use a command such as

find . -size +10000k

to locate files larger than approximately 10 MB in the working directory and all subdirectories.
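
A variant that may be convenient, using only standard find, ls, and GNU sort options, prints the matches with human-readable sizes, largest last:

find . -size +10000k -exec ls -lh {} + | sort -k5 -h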

Do not delete input and output files. Output files should be compressed with gzip instead.

Use gzip to compress large files, delete unused TAPE files from ADF, and so on. You can view compressed output files with zless. The command du -sh * gives a list of the files and directories in the current working directory along with how much space each takes up; this is an alternative way to find out which directories contain large files. Finally, there is a graphical tool named gdmap (in /software/bin) that lets you locate large files in your directories easily.
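
For example (the file name is a placeholder):

gzip calculation.out        # compress; creates calculation.out.gz
zless calculation.out.gz    # page through the compressed output
du -sh * | sort -h          # disk usage per entry, smallest to largest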

Useful scripts

We have a large and growing number of useful scripts for everyday tasks. Check the group Wiki. There is a project ‘scripts’ in our SVN repository, and various group members have put useful software in directories under /ifs/projects/jochena/shared/.
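
Checking out the scripts project works along these lines; the repository URL below is only a placeholder, since the actual URL is documented on the group Wiki:

svn checkout https://svn.example.edu/jochena/scripts scripts    # placeholder URL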
