To properly manage the CPU-hours budget allocated to you on the C2PAP cluster, you need exactly one LRZ project per C2PAP project. To avoid confusion: having an LRZ project does not by itself imply any C2PAP computing budget, but if you have been granted such a budget, the C2PAP admin needs an LRZ project to assign it to. Because this assignment is one-to-one, you cannot use a single LRZ project for multiple C2PAP projects.
If you do not have such an LRZ project yet, apply for one using this form https://www.lrz.de/wir/kennung/antrag_auf_ein_lrz-projekt.pdf:
- most likely you are Nutzerklasse 1 (user class 1)
- indicate C2PAP under "Gewünschte Dienste - andere Dienste" (requested services - other services)
- assign a master user, usually one of the investigators
- you need a signature from the head of your institute, e.g. a director of a Max Planck institute or the head of a chair at a university; a group leader is not sufficient. The head need not be part of your project, but he/she should of course know you.
Send the form to the LRZ address indicated at the bottom of the form.
Note that these accounts are also eligible to use the LRZ "Linux cluster".
The login nodes are c2paplogin.lrz.de, c2papdata1.lrz.de, and c2papdata2.lrz.de. LRZ policy requires that each user log in only from a known IP address. Please send your account id (for example dr92zda), your fixed IP address, and your LRZ project id to the C2PAP administrator Aliaksei Krukau (aliaksei.krukau _at_ lrz.de), who will then register the IP. Every time your IP changes, you have to notify Aliaksei again; otherwise an attempt to log in will fail silently:
$ ssh firstname.lastname@example.org
ssh: connect to host c2paplogin.lrz.de port 22: Connection timed out
- For more information, see the SuperMUC website at the LRZ.
On the login node and the two data nodes, outgoing connections are generally allowed: you can download a file via http, connect to a git or svn repository, scp files elsewhere, etc. If you enable agent forwarding when logging in, you can ssh to other remote machines (not at C2PAP) using the ssh keys on your local machine (say, your laptop). On the compute nodes, outgoing connections are generally disallowed; if your job needs to pull data over the web, ask the administrator for a special configuration.
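Agent forwarding can be enabled ad hoc with `ssh -A c2paplogin.lrz.de`, or persistently via your local ssh client configuration. A minimal sketch of the latter (the host alias `c2pap` and the account id `xy12abc` are placeholders; substitute your own):

```
# ~/.ssh/config on your laptop
Host c2pap
    HostName c2paplogin.lrz.de
    User xy12abc          # placeholder account id
    ForwardAgent yes      # forward your local ssh-agent to the login node
```

With this entry, `ssh c2pap` logs you in with forwarding enabled, and an `ssh` from the login node to another machine authenticates against the keys held by the agent on your laptop. Only forward your agent to hosts you trust, since a root user there can use (though not copy) your keys while you are connected.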
Access to general TUM Network-attached storage (NAS)
Details about access to the TUM NAS in general are described at Network_Storage. To make it work on the C2PAP login node, use
$ dbus-launch bash
$ gvfs-mount $location
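As a sketch, assuming the share is reachable via SMB (the server name and share below are placeholders for your actual $location; they cannot be run outside the C2PAP/TUM environment):

```
$ dbus-launch bash                           # start a shell with a D-Bus session, which gvfs needs
$ gvfs-mount smb://nas.example.tum.de/share  # placeholder SMB URL; use your real $location
$ ls /run/user/$UID/gvfs/                    # mounted shares appear under the GVFS FUSE directory
```

The `dbus-launch bash` step matters on a headless login node: gvfs daemons communicate over a D-Bus session bus, which a plain ssh login shell does not provide.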