Computing Resources
The research group's computational infrastructure is built around Inspur NF5280M4 and NF5280M5 servers. The NF5280M4 is based on the Intel Xeon E5-2600 v3/v4 platform (six cores/twelve threads, up to 3.2 GHz) and provides 24 DDR4 memory slots (up to 1.5 TB) and six PCIe 3.0 expansion slots. It supports 12–24 hot-swappable SAS/SATA drives (with optional SSD acceleration) and RAID 5 redundancy. With high-temperature tolerance (continuous operation at 45 °C) and a modular, tool-free maintenance design, this model is well suited to traditional numerical simulations (e.g., crustal deformation analysis) and mid-scale GNSS data processing.
The NF5280M5 features upgraded Intel Xeon Scalable processors (up to 28 cores/56 threads), higher memory bandwidth with DDR4-2666 (up to 3 TB), and all-flash NVMe storage configurations delivering up to 15 million IOPS. It supports four dual-width GPU accelerator cards, integrates liquid cooling, and achieves microsecond-level I/O latency. These capabilities enable efficient computation for intelligent workloads such as deep-learning-based seismic-wave inversion, multi-source satellite data fusion, and real-time positioning algorithm optimization. Its power-aware management and remote management features (IPMI 2.0/KVM) further reduce the operational complexity of large-scale clusters.
In addition, the research group operates a high-performance deep learning server, the Inspur NF5280M6, equipped with eight NVIDIA RTX 6000 Ada GPUs. Each GPU features 18,176 CUDA cores and 48 GB of memory, delivering up to 91.1 TFLOPS of single-precision performance and 1,458 TOPS of tensor computing capability, enabling efficient training and inference of large-scale deep learning models. The server is further configured with two Intel Xeon 4314 processors (16 cores at 2.4 GHz), 768 GB of DDR4 memory, and 10 TB of high-speed storage, providing strong support for large-volume data processing and storage. With leading-edge performance in computation, memory, and storage, this server provides a robust hardware foundation for the group's deep learning research.
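As a rough illustration of the node's aggregate capability, the per-GPU figures quoted above can be scaled by the eight installed cards. This is a back-of-envelope sketch using only the numbers stated in this section; real multi-GPU training throughput depends on interconnect, batch size, and framework overheads.

```python
# Back-of-envelope aggregate capability of the NF5280M6 node,
# scaling the per-GPU specs quoted above by the 8 installed
# NVIDIA RTX 6000 Ada cards. Illustrative only.
NUM_GPUS = 8
FP32_TFLOPS_PER_GPU = 91.1   # single-precision throughput per GPU
TENSOR_TOPS_PER_GPU = 1458   # tensor throughput per GPU
VRAM_GB_PER_GPU = 48         # on-board memory per GPU

total_fp32 = NUM_GPUS * FP32_TFLOPS_PER_GPU   # ~728.8 TFLOPS
total_tops = NUM_GPUS * TENSOR_TOPS_PER_GPU   # 11,664 TOPS
total_vram = NUM_GPUS * VRAM_GB_PER_GPU       # 384 GB

print(f"Aggregate FP32: {total_fp32:.1f} TFLOPS")
print(f"Aggregate tensor: {total_tops} TOPS")
print(f"Aggregate GPU memory: {total_vram} GB")
```

The 384 GB of combined GPU memory is the figure most relevant in practice, since it bounds the model and batch sizes that fit across the cards under data- or model-parallel training.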
Deep Learning Server (External)
Deep Learning Server (Internal)
Date: 2025-11-17