
Media Contact: media@mellanox.com

Mon, Nov 16, 2009

Mellanox ConnectX-2 40Gb/s InfiniBand Adapters with Application Communication Offloading Technology Available Through HP

ConnectX®-2 InfiniBand Adapters with New CORE-Direct™ (Collectives Offload Resource Engine) Capabilities Provide Clusters with Application Acceleration and Enhanced Scaling

SC09, PORTLAND, OR. – Nov. 16, 2009 – Mellanox® Technologies, Ltd. (NASDAQ: MLNX; TASE: MLNX), a leading supplier of end-to-end connectivity solutions for data center servers and storage systems, today announced that its industry-leading ConnectX-2 40Gb/s InfiniBand adapter cards are now available for HP ProLiant BL, DL and SL series servers, as well as HP BladeSystem c-Class enclosures. Mellanox’s ConnectX-2 40Gb/s InfiniBand adapter cards with CORE-Direct enable significant improvements in application performance and scalability and increase the productivity of HP-based compute clusters.

Mellanox ConnectX-2 is the first adapter card on the market to process in hardware the application communications frequently used by scientific simulations for data broadcast, global synchronization and data collection. By offloading these collective communications, ConnectX-2 adapters help reduce simulation completion time by accelerating the synchronization process and freeing up CPU cycles to work on the simulation, thus enabling greater scalability by eliminating system jitter and noise.
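The benefit described above can be illustrated with a toy Python analogy (not Mellanox's implementation, and no real MPI or network calls): the "NIC" is modeled as a worker thread, and the compute and communication times are invented placeholders. When the collective runs on the host CPU, compute and communication serialize; when it is offloaded, they overlap and the per-step wall time shrinks.

```python
import time
from concurrent.futures import ThreadPoolExecutor

SIM_STEP = 0.05    # hypothetical compute time per iteration (seconds)
COLLECTIVE = 0.05  # hypothetical collective-communication time (seconds)

def compute():
    # stand-in for one step of the simulation's number crunching
    time.sleep(SIM_STEP)

def collective():
    # stand-in for a broadcast / barrier / gather on the fabric
    time.sleep(COLLECTIVE)

def run_blocking(iters):
    # host CPU performs the collective itself: compute and communication serialize
    start = time.perf_counter()
    for _ in range(iters):
        compute()
        collective()
    return time.perf_counter() - start

def run_offloaded(iters):
    # the "NIC" (a worker thread here) runs the collective while the CPU keeps computing
    start = time.perf_counter()
    with ThreadPoolExecutor(max_workers=1) as nic:
        for _ in range(iters):
            pending = nic.submit(collective)
            compute()
            pending.result()  # synchronize before the next iteration
    return time.perf_counter() - start

blocking = run_blocking(10)
offloaded = run_offloaded(10)
print(f"blocking: {blocking:.2f}s, offloaded: {offloaded:.2f}s")
```

With equal compute and communication times, the offloaded run takes roughly half as long, which mirrors the press release's point: the CPU spends its cycles on the simulation while the adapter handles the collective.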

“As supercomputers increase in size from mere thousands to hundreds-of-thousands of processor cores, new performance and scalability challenges have emerged,” said John Monson, vice president of marketing at Mellanox Technologies. “The combination of Mellanox ConnectX-2 40Gb/s InfiniBand adapters with CORE-Direct application interprocessor communication offload and HP ProLiant servers provides end customers with the highest performance, efficiency and scalability for their next-generation, high node-count supercomputers.”

“Customers with high-performance computing clusters seek solutions to improve system efficiency and resource utilization, while still reducing overhead,” said Steve Cumings, director of marketing, Scalable Computing and Infrastructure organization, HP. “The combination of the HP Unified Cluster Portfolio and Mellanox’s ConnectX-2 InfiniBand adapters can significantly enhance customers’ compute cluster application performance and scalability, while simplifying and speeding deployment.”

The advanced feature set, combined with the highest I/O performance of any standard interconnect, makes ConnectX-2 the leading interconnect adapter solution for high-performance server and storage computing infrastructures. Together with Mellanox’s world-leading InfiniBand switch systems, comprehensive FabricIT™ switch management software and cables, Mellanox and HP provide their customers with the richest, most advanced and highest-performing end-to-end networking solutions for the world’s most compute-demanding applications.

Availability
ConnectX-2 adapter cards are available now through HP at www.hp.com.

About Mellanox
Mellanox Technologies is a leading supplier of end-to-end connectivity solutions for servers and storage that optimize data center performance. Mellanox products deliver market-leading bandwidth, performance, scalability, power conservation and cost-effectiveness while converging multiple legacy network technologies into one future-proof solution. For the best in performance and scalability, Mellanox is the choice for Fortune 500 data centers and the world’s most powerful supercomputers. Founded in 1999, Mellanox Technologies is headquartered in Sunnyvale, California, and Yokneam, Israel. For more information, visit Mellanox at www.mellanox.com.

Mellanox, ConnectX, CORE-Direct, InfiniBlast, InfiniBridge, InfiniHost, InfiniRISC, InfiniScale, and InfiniPCI are registered trademarks of Mellanox Technologies, Ltd. Virtual Protocol Interconnect is a trademark of Mellanox Technologies, Ltd. All other trademarks are property of their respective owners.
###