Argent Court Data Centre is a purpose-built facility that securely houses the University's server, storage and network equipment. It is designed with expansion space for up to 10 years, to cope with the anticipated growth of both the University and IT Services.
Pictures of the data centre, both under construction and completed, can be found here.
The facility is separated into 3 distinct areas:
Hall 1 (phase 1)
Hall 1 contains 40 x 19" 46U 1200 mm deep server cabinets. Each cabinet has 4 x 32 amp power feeds, two of which feed a PDU within the cabinet. Each PDU has multiple 10 amp and 16 amp IEC sockets.
The 40 cabinets are split into two pods of 20 cabinets, and each pod can accommodate 150 kW of power load. This works out at 7.5 kW per cabinet, although more power can be accommodated in particular racks should it be required.
Each PDU is monitored to ensure overloading cannot occur. The PDUs are also fed from different power supplies to provide resilience and minimise downtime during maintenance.
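The power figures above can be sanity-checked with a short calculation. This sketch assumes a 230 V single-phase supply per feed (a typical UK figure, not stated in the source):

```python
# Rough per-cabinet power budget check for a hall 1 pod.
# SUPPLY_VOLTAGE_V is an assumption (typical UK single-phase supply);
# the other figures come from the facility description.

POD_CAPACITY_KW = 150      # each pod can accommodate 150 kW
CABINETS_PER_POD = 20      # two pods of 20 cabinets
FEED_CURRENT_A = 32        # each cabinet has 4 x 32 amp feeds
SUPPLY_VOLTAGE_V = 230     # assumed nominal supply voltage

# Nominal budget if the pod load is spread evenly across its cabinets.
nominal_per_cabinet_kw = POD_CAPACITY_KW / CABINETS_PER_POD

# Maximum power a single 32 A feed can carry at the assumed voltage.
per_feed_kw = FEED_CURRENT_A * SUPPLY_VOLTAGE_V / 1000

print(f"Nominal budget per cabinet: {nominal_per_cabinet_kw:.1f} kW")   # 7.5 kW
print(f"Capacity of one 32 A feed:  {per_feed_kw:.2f} kW")              # 7.36 kW
```

Under these assumptions a single 32 A feed almost covers the nominal 7.5 kW cabinet budget on its own, which is consistent with each cabinet having multiple independently supplied feeds for resilience.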
Each cabinet is provided with a minimum of 48 x CAT6 RJ45 outlets, supporting transmission speeds of up to 10 Gb/s. These CAT6 outlets are connected back to a central network provision within the facility. Both OM3 and OS2 fibre connections are also provided to each cabinet, giving optical links out to the other data centres and to other locations both on and off campus.
Cooling is provided to both pods via In Row Cooling (IRC) units, designed as a contained "hot aisle" system that delivers air to the front of the servers and other equipment at 24°C. The IRCs are arranged in an N+1 configuration, allowing faults and maintenance on the cooling system to be dealt with while having little effect on the running of the data centre.
A false floor has been installed with networking and water services routed within the floor void. Power is routed at high level above the cabinets. Flood detection is installed within the floor void.
Both a VESDA (Very Early Smoke Detection Apparatus) system and a gas fire suppression system have been installed to minimise any damage to the installed equipment from fire.
Hall 2 (phase 2)
Hall 2 is currently empty but has the capacity to house another pod of 20 cabinets, to the same specification and capacity as the pods in hall 1.
A false floor has been installed in hall 2, as in hall 1, but no network or water services have been routed within it.
The area is adjacent to hall 1 and separated from it by a wall. This wall could be removed to create one large hall or, should circumstances dictate, retained to keep the area separate from hall 1, with its own external entrance installed if required to make it fully standalone.
As and when this hall is required, additional external cooling plant, UPS and generator services will be needed to support it. Although this equipment is not yet installed, footprints for it have been allocated.
Hall 3 (HPC Area)
Hall 3 is also currently empty and, like hall 2, could house another pod of 20 cabinets. The area could instead be used as a standalone High Performance Computing area should the need arise, with up to 150 amps of power available once additional cooling plant, UPSs and generators are installed. As with hall 2, footprints for this equipment have been allocated.
No services, including a false floor, have been installed in this room.
This hall does have its own entrance, separate from halls 1 and 2.
UPS and Generator Provision
UPS and generator cover has been provided to hall 1 only, as follows:
A UPS has been installed to provide continuous power to all critical loads in the event of a power outage. The UPS also "smooths" the incoming power, compensating for any spikes on the supply that could damage the IT equipment installed within the data centre.
The UPS is designed to provide continuous power to the critical loads for up to 30 minutes should the incoming or generator supply fail. It consists of 3 separate units running in parallel in an N+1 configuration, which allows one unit to be taken offline at any time for faults or maintenance without reducing cover on the critical load.
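The N+1 arrangement implies each UPS unit must be sized so that the remaining units can carry the full critical load with one unit offline. This sketch uses the 150 kW per-pod figure from hall 1 purely as an illustrative load; the actual critical load and unit ratings are not stated in the source:

```python
# Illustrative N+1 UPS sizing (the 150 kW load is an assumption,
# borrowed from the per-pod capacity figure; real figures may differ).

CRITICAL_LOAD_KW = 150   # assumed critical load for illustration
TOTAL_UNITS = 3          # 3 separate UPS units running in parallel
REDUNDANT_UNITS = 1      # N+1: one unit may be offline at any time

# With one unit offline, N units must still carry the full load.
units_carrying_load = TOTAL_UNITS - REDUNDANT_UNITS
min_unit_rating_kw = CRITICAL_LOAD_KW / units_carrying_load

print(f"Minimum rating per UPS unit: {min_unit_rating_kw:.0f} kW")  # 75 kW
```

In other words, each of the 3 units would need to be rated for at least half the critical load, so that any two of them can cover it in full.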
An 850 kVA standby generator set has been installed to provide power to the whole facility in the event of a loss of the utility supply. In conjunction with the UPS, this ensures continuous power to the data centre, without loss of service, during a power outage.
The generator has its own fuel tank, giving a run time of around 72 hours at full load.
Should you require any more information on the data centre, or would like a tour of the facility, please contact the service owner for data centres and infrastructure, Steve Silver, at steve dot silver at warwick dot ac dot uk