Mellanox Doubles InfiniBand Performance With 20Gbps, 60Gbps Technology
- Category: APPLICATIONS
Mellanox Technologies will demonstrate new double data rate (DDR) 20Gbps and 60Gbps InfiniBand technology at the Supercomputing 2004 conference in Pittsburgh. InfiniBand DDR technology provides twice the bandwidth of current InfiniBand products and up to eight times the performance of competing interconnects. The demonstration is welcome news for data center and supercomputer architects looking to deploy bandwidth-hungry applications on 64-bit platforms, and it highlights the return on investment of deploying InfiniBand: the new 20Gbps and 60Gbps solutions are backward compatible with existing software and hardware.

"InfiniBand has clearly enabled a Moore's Law effect for interconnects," said Michael Kagan, vice president of architecture at Mellanox. "While competing technologies are struggling to introduce 10Gbps technologies, InfiniBand is doubling the speed of its existing architecture without requiring new software. This positions InfiniBand as the highest performance and best technology choice for 64-bit and dual-core platforms in business and technical computing."

The technology demonstration consists of a complete eight-node dual-processor Intel Xeon EM64T cluster with InfiniBand PCI Express adapters and an InfiniBand switch with both 20Gbps and 60Gbps ports. The adapters use the Mellanox InfiniHost III Ex PCI Express device, and the switch is based on the 24-port InfiniScale III device. The demonstration runs a computational fluid dynamics application from Fluent Inc. with the latest Linux IBGold software package from Mellanox.

20Gbps technology is only the first in a series of speed upgrades on the InfiniBand roadmap. In September, the InfiniBand Trade Association (IBTA) released a final version of the Annex that specifies DDR and QDR (quad data rate) modes of operation. These modes define increased signaling rates over existing 1X, 4X, and 12X InfiniBand links, effectively doubling or quadrupling bandwidth over the same infrastructure. For example, DDR operation over 4X and 12X links provides 20Gbps and 60Gbps respectively, as will be demonstrated in the Mellanox booth at SC2004.

Mellanox will begin sampling switch and HCA silicon in Q4 2004. DDR switches and PCI Express HCA adapter cards are scheduled for availability in the first half of 2005.
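The link-rate arithmetic behind those figures is simple enough to check directly. The short Python sketch below is an illustration, not part of the announcement: it multiplies the 2.5Gbps SDR lane rate by the link width (1X, 4X, 12X) and the data-rate multiplier (SDR, DDR, QDR). Note that the Gbps figures quoted in the article are raw signaling rates; with InfiniBand's standard 8b/10b line coding (a detail not stated in the press release), usable data bandwidth is 80% of the signaling rate.

```python
# Sketch of the InfiniBand rate arithmetic described above.
# Quoted figures (10/20/60 Gbps) are raw signaling rates; 8b/10b
# encoding leaves 80% of that as usable data bandwidth.

SDR_LANE_GBPS = 2.5  # single data rate, per lane
RATE_MULTIPLIER = {"SDR": 1, "DDR": 2, "QDR": 4}
LINK_WIDTH = {"1X": 1, "4X": 4, "12X": 12}

def signaling_rate_gbps(width: str, rate: str) -> float:
    """Raw signaling rate for a given link width and data-rate mode."""
    return SDR_LANE_GBPS * LINK_WIDTH[width] * RATE_MULTIPLIER[rate]

for rate in ("SDR", "DDR", "QDR"):
    for width in ("1X", "4X", "12X"):
        raw = signaling_rate_gbps(width, rate)
        data = raw * 0.8  # 8b/10b line-coding overhead
        print(f"{width} {rate}: {raw:5.1f} Gbps signaling, {data:5.1f} Gbps data")
```

Running this confirms the article's examples: 4X DDR yields 20Gbps and 12X DDR yields 60Gbps of signaling bandwidth, with QDR doubling each again.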