ConnectX®-3 EN Single/Dual-Port 10/40/56GbE Adapters w/ PCI Express 3.0
Mellanox ConnectX-3 EN 10/40/56GbE Network Interface Cards (NICs) with PCI Express 3.0 deliver high bandwidth and industry-leading Ethernet connectivity for performance-driven server and storage applications in Enterprise Data Center, High-Performance Computing, and Embedded environments. Clustered databases, web infrastructure, and high-frequency trading are just a few of the applications that achieve significant throughput and latency improvements, resulting in faster access, real-time response, and more users per server. ConnectX-3 EN improves network performance by increasing available bandwidth while decreasing the associated transport load on the CPU, especially in virtualized server environments.
The Open Compute Project's (OCP) mission is to develop and specify the most cost-efficient, energy-efficient, and scalable enterprise and Web 2.0 data centers. The Mellanox ConnectX-3 EN 10GbE Open Compute Mezzanine adapter card delivers leading Ethernet connectivity for performance-driven server and storage applications in Web 2.0, Enterprise Data Center, and Cloud environments. The OCP Mezzanine adapter form factor is designed to mate into OCP servers adhering to revision 2.0 of the Intel server platform.
- 10/40/56Gb/s connectivity for servers and storage
- Industry-leading throughput and latency performance
- I/O consolidation
- Virtualization acceleration
- Software compatible with standard TCP/UDP/IP and iSCSI stacks
- Single or Dual 10/40/56GbE ports
- PCI Express 3.0 (up to 8 GT/s)
- Low-latency RDMA over Ethernet (RoCE)
- Data Center Bridging support
- T11.3 FC-BB-5 FCoE
- TCP/IP stateless offload in hardware
- Traffic steering across multiple cores
- Hardware-based I/O virtualization
- Intelligent interrupt coalescence
- Advanced Quality of Service
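To illustrate what the stateless offloads listed above remove from the CPU, the following is a minimal Python sketch of the Internet checksum (RFC 1071) that TCP/IP checksum offload computes in NIC hardware for every packet; the function name and sample bytes are illustrative, not part of any Mellanox API.

```python
def internet_checksum(data: bytes) -> int:
    """RFC 1071 ones'-complement checksum over 16-bit big-endian words.

    This per-packet computation is exactly the kind of work that
    hardware checksum offload moves off the host CPU.
    """
    if len(data) % 2:            # pad odd-length input with a zero byte
        data += b"\x00"
    total = 0
    for i in range(0, len(data), 2):
        total += (data[i] << 8) | data[i + 1]
        total = (total & 0xFFFF) + (total >> 16)   # fold carry back in
    return ~total & 0xFFFF
```

A packet whose checksum field is filled in correctly sums to zero when the whole packet is checksummed again, which is how receive-side offload validates incoming frames without CPU involvement.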