1. Cisco UCS C480 ML M5 Server for Deep Learning Datasheet.
Reference: In the "Product overview" section, the datasheet states, "The server supports eight NVIDIA Tesla V100-SXM2-32GB GPUs interconnected with NVIDIA NVLink." This directly confirms support for the required SXM2 GPU form factor.
Source: Cisco. (2021). Cisco UCS C480 ML M5 Server for Deep Learning Datasheet. Retrieved from the official Cisco website.
2. FlexPod Datacenter with NVIDIA AI Enterprise and VMware vSphere Design Guide (Cisco Validated Design - CVD).
Reference: The "Compute Hardware" section under "Solution Design" details the bill of materials, specifying the "Cisco UCS C480 ML M5" for AI workloads. The section "Cisco UCS C480 ML M5 Server for Deep Learning" further describes its architecture with "eight NVIDIA V100 32GB SXM2 GPUs with NVLink."
Source: Cisco & NetApp. (2022). FlexPod Datacenter with NVIDIA AI Enterprise and VMware vSphere Design Guide.
3. Cisco UCS C240 M5 Rack Server Datasheet.
Reference: The "GPUs" section lists supported graphics cards, including the NVIDIA Tesla V100, but specifies the "PCIe" form factor. There is no mention of SXM2 support.
Source: Cisco. (2022). Cisco UCS C240 M5 Rack Server Datasheet. Retrieved from the official Cisco website.