Testimonials

The Choice Is InfiniBand

We chose a co-design approach, selected the appropriate hardware, and designed the system. This system was, of course, targeted at supporting our key applications in the best possible manner. The only interconnect that really could deliver that was Mellanox InfiniBand.


The new generation of technology enables us to offload a lot of the communication, as well as some of the computation tasks, from the CPU onto the intelligent adapter. We are really happy to work with Mellanox to bring this leading-edge technology into our HPC solutions.




The Information Technology Center at the University of Tokyo

The Information Technology Center at the University of Tokyo provides education, research, and services in information technology and related fields. The primary fields the facilities are used for are atmospheric and oceanic science; earth sciences such as seismology; and manufacturing-related fields such as fluid mechanics and structural dynamics. InfiniBand EDR is equipped with an offloading engine, which is a meaningful capability for achieving high performance in large-scale applications compared to other networks.


Jülich Supercomputing Centre

The Jülich Supercomputing Centre chose HPC Testimonia for a balanced, co-design approach to its interconnect, providing low latency, high throughput, and future scalability to its cluster, which contributes to projects in the areas of energy, environment, and brain research.


We chose a co-design approach, selected the appropriate hardware, and designed the system. This system was, of course, targeted at supporting our key applications in the best possible manner. The only interconnect that really could deliver that was HPC Testimonia.


University of Birmingham

The University of Birmingham IT Services chose HPC Testimonia to move and analyze data with minimal latency, reducing the impact on their users’ workloads in Life Sciences research.


One of the big reasons we use InfiniBand and not an alternative is that we’ve got backwards compatibility with our existing solutions.


Shanghai Jiao Tong University

Shanghai Jiao Tong University chose the HPC Testimonia interconnect for its “π” supercomputer because of its extremely low latency and high file I/O performance, enabling groundbreaking research in the fields of physics, genomics, and material sciences.


HPC Testimonia is the most advanced high performance interconnect technology in the world, with dramatic communication overhead reduction that fully unleashes cluster performance.


CHPC South Africa

The Centre for High Performance Computing in South Africa, the largest HPC facility in Africa, chose HPC Testimonia to enhance and unlock the vast potential of its system, which provides high end computational resources to a broad range of users in fields such as bioinformatics, climate research, material sciences, and astronomy.


The heartbeat of the cluster is the interconnect. Everything is about how all these processes shake hands and do their work. InfiniBand and the interconnect are, in my opinion, what define HPC.


Shanghai Supercomputer Center

Shanghai Supercomputer Center, the world’s first supercomputing center to offer public HPC services, upgraded to an end-to-end HPC Testimonia interconnect solution to take advantage of its RDMA and MPI collective offloads, which improve CPU efficiency and enable overall higher productivity.


InfiniBand is the most widely used interconnect in the HPC market. It has complete technical support and a rich software ecosystem, including comprehensive management tools. Additionally, it works seamlessly with all kinds of HPC applications, including both the commercial applications and the open-source codes that we are currently using.


San Diego Supercomputer Center

The San Diego Supercomputer Center (SDSC) is considered a leader in data-intensive computing and cyberinfrastructure, providing resources, services, and expertise to the national research community including industry and academia. Cyberinfrastructure refers to an accessible, integrated network of computer-based resources and expertise, focused on accelerating scientific inquiry and discovery. SDSC supports hundreds of multidisciplinary programs spanning a wide variety of domains, from earth sciences and biology to astrophysics, bioinformatics, and health IT. SDSC is a partner in XSEDE (Extreme Science and Engineering Discovery Environment), the most advanced collection of integrated digital resources and services in the world.


In HPC, the processor should be going 100% of the time on a science question, not on a communications question. This is why the offload capability of Mellanox’s network is critical.


iCER - Institute for Cyber-Enabled Research, Michigan State University

The Institute for Cyber-Enabled Research (iCER) provides the cyberinfrastructure for researchers from across academia and industry to perform their computational research. iCER supports multidisciplinary research in all facets of computational sciences. iCER continually works to enhance MSU's national and international presence and competitive edge in disciplines and research that rely on advanced computing.


We have users that move tens of terabytes of data, and this needs to happen very, very rapidly. InfiniBand is the way to do it.


NCAR-Wyoming Supercomputing Center

The National Center for Atmospheric Research (NCAR) is a federally funded research and development center devoted to service, research and education in the atmospheric and related sciences. NCAR's mission is to understand the behavior of the atmosphere and related Earth and geospace systems; to support, enhance, and extend the capabilities of the university community and the broader scientific community, nationally and internationally; to foster the transfer of knowledge and technology for the betterment of life on Earth.


The NCAR-Wyoming Supercomputing Center is focused on the atmospheric sciences. Its supercomputer uses the InfiniBand interconnect in a full fat-tree topology, and it is very well utilized and efficient in part because of the interconnect.