Overview
The demand for more computing power, efficiency, and scalability is constantly accelerating in the HPC, Cloud, Web 2.0, Machine Learning, Data Analytics, and Storage markets. To address these demands, Dell Technologies partners with NVIDIA Networking to provide complete end-to-end solutions supporting both InfiniBand and Ethernet networking technologies.

Ethernet
NVIDIA Networking offers complete 10/25/40/50/100/200Gb/s Ethernet solutions for your customer’s data center, giving you a competitive advantage. These end-to-end solutions deliver industry-leading performance, scalability, reliability, and value across a wide range of applications, including AI, Machine Learning, cloud, enterprise, financial services, storage, Big Data, telco, and more.

InfiniBand
NVIDIA Networking continues its leadership in providing the highest-performing interconnect solutions for Enterprise Data Centers, Web 2.0, Cloud Computing, High-Performance Computing, and embedded environments. Mellanox's line of InfiniBand products delivers the highest productivity, enabling compute clusters and converged data centers to operate at any scale while reducing operational costs and infrastructure complexity.
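Applications typically reach InfiniBand (and RoCE-capable Ethernet) adapters through the standard RDMA verbs interface exposed by libibverbs. The short C sketch below simply enumerates the RDMA-capable devices visible to a host; it is a generic illustration of the open verbs API, not a Mellanox- or Dell-specific tool, and the reported device names (for example, mlx5_0) depend on the installed adapters and drivers.

```c
/* Minimal sketch: list RDMA-capable devices via the standard verbs API.
 * Assumes the rdma-core / libibverbs package is installed.
 * Build: gcc list_rdma_devices.c -o list_rdma_devices -libverbs
 */
#include <stdio.h>
#include <infiniband/verbs.h>

int main(void)
{
    int num_devices = 0;
    struct ibv_device **dev_list = ibv_get_device_list(&num_devices);

    if (!dev_list) {
        perror("ibv_get_device_list");
        return 1;
    }

    printf("Found %d RDMA device(s)\n", num_devices);
    for (int i = 0; i < num_devices; i++) {
        /* Device names such as "mlx5_0" identify individual RDMA adapters */
        printf("  %s\n", ibv_get_device_name(dev_list[i]));
    }

    ibv_free_device_list(dev_list);
    return 0;
}
```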

NVIDIA BlueField DPU
The NVIDIA® BlueField® data processing unit (DPU) ignites unprecedented innovation for modern data centers, delivering a broad range of advanced networking, storage, and security services for complex compute and AI workloads. By combining the industry-leading ConnectX® network adapter with an array of Arm cores, BlueField offers purpose-built hardware acceleration engines with full data-center-infrastructure-on-chip programmability.

Cables & Modules
Data centers, high-scale storage systems, and cloud computing require I/O services such as bandwidth, consolidation and unification, and flexibility. Mellanox's LinkX™ interconnects are a cost-effective solution for connecting high-bandwidth fabrics, extending the benefits of Mellanox’s high-performance InfiniBand and 10/40/56/100 Gigabit Ethernet adapters throughout the network. In addition to meeting or exceeding IBTA and IEEE standards, Mellanox Certified cables are 100% tested on Mellanox equipment to ensure optimal signal integrity and the best end-to-end performance. With Mellanox interconnects you will experience trouble-free installation and long-term operation. Mellanox InfiniBand adapters provide advanced levels of data center IT performance, efficiency, and scalability. In addition, Mellanox's InfiniBand adapters support traffic consolidation and provide hardware acceleration for server virtualization.
Our Solutions
- ConnectX-6 HDR100 Single Port Adapter Card
- ConnectX-5 EN Dual-port 10/25 Ethernet Adapter Card
- vSAN Product Brief
- 25G Ethernet with Dell EMC, Nutanix and Mellanox
- ConnectX-5 Ex Dual Port 100GbE Network Adapter
- ConnectX-4 Lx Dual Port 25 GbE KR Mezzanine Card
- Single/Dual-Port Low Profile Adapters with Virtual Protocol Interconnect for Dell Servers
- Dual-Port Adapter with Virtual Protocol Interconnect for Dell PowerEdge C6100-series Rack Servers
- M4001 - 16-port 40Gb/s and 56Gb/s InfiniBand Blade Switch
- M1601P - 10GbE Pass-Through Module (KR-based)
- M1601P - 10GbE Pass-Through Module (XAUI-based)
- SX6012 - 12-port Non-blocking Managed 56Gb/s InfiniBand/VPI SDN Switch System
- SX6025 - 36-port Non-blocking Unmanaged 56Gb/s InfiniBand SDN Switch System
- SX6036 - 36-port Non-blocking Managed 56Gb/s InfiniBand/VPI SDN Switch System
- Ethernet Switch Systems Brochure
- InfiniBand Switch Systems Brochure
- Ethernet Adapter Brochure
- 25Gb Ethernet Networking
- The Easiest Way to Deploy RDMA on Windows Storage Spaces Direct
- Dell Fluid Cache for SAN Test Preliminary Summary
- Boost Oracle RAC Performance with Dell Fluid Cache for SAN
- Improving Database Performance with Dell Fluid Cache for SAN
- RoCE (RDMA over Converged Ethernet)