QLogic’s 9xx0 family of InfiniBand-based 10Gbps and 20Gbps Multi-protocol Fabric Directors forms the cornerstone of QLogic’s high-performance “one wire” cluster computing interconnect solutions. The 9xx0 directors:
- Eliminate the need for separate server connections for storage, interprocessor communications, and LAN/WAN networks
- Improve application performance, lower latency, reduce CPU load, and consume less physical space and power than competing solutions
- Scale network I/O and servers independently
- Virtually pool and share I/O between servers
- Simplify network cabling and reduce power and cooling requirements
- Dramatically reduce the total cost of cluster and grid computing
The 9xx0 series platform is an extremely flexible, slot-independent, chassis-based system. Chassis configurations come in 24-, 12-, 8-, 4-, and 2-slot versions ranging in size from 14U to 1U, and can be populated with a variety of InfiniBand SDR (10 Gbps) and DDR (20 Gbps) switching and Virtual I/O Controller (VIC) modules. VIC modules enable hosts on InfiniBand fabrics to transparently access Fibre Channel networks, Ethernet networks, or both. All option modules are interchangeable in any slot of any chassis.
QLogic 9xx0 Fabric Director system configurations can scale from as few as 12 ports to as many as 432 ports in a single chassis. Multi-chassis fabrics can be constructed to support thousands of nodes. The 9xx0 platform is the industry’s most scalable, highest-port-density InfiniBand-based solution available today.
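The scaling arithmetic above can be sketched in a few lines. This is an illustrative sketch, not vendor data: the 18-ports-per-slot figure is simply inferred from 432 ports ÷ 24 slots, and the `max_ports` helper is hypothetical.

```python
# Hypothetical sketch of 9xx0 chassis port scaling.
# Assumption: each slot hosts a switch module with up to 18 ports,
# inferred from the stated 432-port maximum in a 24-slot chassis.
PORTS_PER_SLOT = 18  # assumption, not a documented module spec

# Chassis sizes stated in the document, largest to smallest.
CHASSIS_SLOTS = [24, 12, 8, 4, 2]

def max_ports(slots: int, ports_per_slot: int = PORTS_PER_SLOT) -> int:
    """Upper bound on switch ports for a fully populated chassis."""
    return slots * ports_per_slot

for slots in CHASSIS_SLOTS:
    print(f"{slots}-slot chassis: up to {max_ports(slots)} ports")
```

Under this assumption, the 24-slot chassis tops out at the document’s stated 432 ports, and smaller chassis scale down proportionally.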