# Our resources

**CPU and memory**

| Machine                 | CPUs/node                                                      | Memory (GB)/node         | No. of nodes | Note                                                               |
| ----------------------- | -------------------------------------------------------------- | ------------------------ | ------------ | ------------------------------------------------------------------ |
| **Frontend**            |                                                                |                          |              |                                                                    |
| Lenovo System X 3550 M5 | 20 (Intel Xeon CPU E5-2640 v4 2.40GHz) with HT on (40 threads) | 32                       | 1            | `escience1.sc.chula.ac.th`                                         |
| Lenovo System X 3550 M5 | 16 (Intel Xeon CPU E5-2620 v4 2.10GHz)                         | 64                       | 1            | `escience2.sc.chula.ac.th`                                         |
| Lenovo SR630            | 32 (Intel Xeon Gold 5218 2.3GHz)                               | 256 (8 × 32 GB TruDDR4 2933 MHz) | 1            | <p>1x Tesla T4 GPU</p><p><code>escience3.sc.chula.ac.th</code></p> |
| **Worker: Slurm**       |                                                                |                          |              |                                                                    |
| Lenovo SR630            | 32 (Intel Xeon Gold 5218 2.3GHz)                               | 256 (8 × 32 GB TruDDR4 2933 MHz) | 7            | <p>1x Tesla T4 GPU/node</p><p>HPC, HTC</p>                         |
| Lenovo x3850 X6         | 80 (Intel Xeon E7-8870 v4 2.1 GHz)                             | 512                      | 1            | HPC, HTC                                                           |
| Lenovo SR850            | 88 (Intel Xeon Gold 6152 2.10GHz)                              | 324                      | 1            | `escience4.sc.chula.ac.th`                                         |
| IBM BladeCenter HS22    | 16 (Intel Xeon CPU E5-2650 2.00GHz)                            | 32                       | 5            | HTC                                                                |
| IBM iDataPlex DX360M4   | 16                                                             | 128                      | 2            | gridMathematica                                                    |
| Lenovo SR635            | 16 (AMD EPYC 7313P 3.0 GHz)                                    | 256                      | 2            | <p>1 machine with Nvidia T4, <br>1 machine with Nvidia A2</p>      |
| NVIDIA DGX Station      |                                                                |                          | 1            |                                                                    |
| **Worker: Kubernetes**  |                                                                |                          |              |                                                                    |
| Dell PowerEdge R740     | 24 (Intel Xeon Platinum 8268 2.9 GHz)                          | 384 (6 × 64 GB DDR4 2933 MHz)    | 3            |                                                                    |
| Lenovo SR630            | 32 (Intel Xeon Gold 5218 2.3GHz)                               | 256 (8 × 32 GB TruDDR4 2933 MHz) | 2            | 1x Tesla T4 GPU/node                                               |
| **TOTAL**               | **676 CPUs**                                                   |                          |              |                                                                    |
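
Several of the Slurm workers above carry a Tesla T4 each. As a minimal sketch, a batch script requesting one GPU might look like the following; the partition name is an assumption, not a confirmed queue on this cluster — check `sinfo` for the real names:

```bash
#!/bin/bash
#SBATCH --job-name=t4-test
#SBATCH --gres=gpu:1          # request one GPU (Tesla T4 on most worker nodes)
#SBATCH --cpus-per-task=4
#SBATCH --mem=16G
#SBATCH --time=01:00:00
##SBATCH --partition=gpu      # assumed partition name; uncomment and adjust

nvidia-smi                    # confirm the allocated GPU is visible to the job
```

Submit with `sbatch script.sh` and monitor with `squeue -u $USER`.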

**Storage**

Current total usable capacity (after RAID6 + spare):

1. IBM Storwize V3700: 160 TB
2. Lenovo ThinkSystem DE2000H: 160 TB

{% hint style="info" %}
*Backup is the user's responsibility; RAID is not a backup!*
{% endhint %}

| **File system**        | Disk space limit | Note                              |
| ---------------------- | ---------------- | --------------------------------- |
| `$HOME`                | 100 GB/user      |                                   |
| /work/project/quantum  | 20 TB            |                                   |
| /work/project/cms      | 50 TB            |                                   |
| /work/project/physics  | 20 TB            | For Physics CU staff and students |
| /work/project/escience | 30 TB            | For all users                     |

To use the group space, please see the [disk space](https://esciencecu-twiki.sc.chula.ac.th/introduction-to-our-cluster/disk-space) section.
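
To check your current usage against the limits above, standard Linux tools suffice as a quick sketch (the project paths are those from the table; per-user quota reporting on this cluster may differ):

```shell
# Report home-directory usage against the 100 GB/user limit.
du -sh "$HOME"

# Report free space on each project area, skipping any not mounted here.
for p in /work/project/quantum /work/project/cms /work/project/physics /work/project/escience; do
  if [ -d "$p" ]; then
    df -h "$p"
  fi
done
```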
