Designed specifically to meet the high-capacity demands of cloud computing customers, the 6,400 square meter (68,889 square foot) datacenter opened in October last year. Demand is so high that the first of its two floors will soon be entirely sold out, and the other is about to be kitted out ready for the next wave of customers.
In a conventional datacenter, it's typical to find an assortment of general-purpose servers each dedicated to specific functions or applications, running at around 4-5 percent average utilization. A fully loaded 42U rack (about two meters or six feet in height) will consume around 4-5 kW of electricity on average. Add at least the same again in overhead for power transformation, cooling and other datacenter operations to arrive at a typical power usage effectiveness (PUE, the ratio of total facility power to IT equipment power) of 2.0 or more. So much for the traditional datacenter.
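The PUE arithmetic behind those figures can be sketched in a few lines (a rough illustration only; the 4.5 kW rack load is taken from the range quoted above, and the helper function is hypothetical):

```python
def pue(it_power_kw: float, overhead_kw: float) -> float:
    """Power usage effectiveness: total facility draw divided by useful IT draw."""
    return (it_power_kw + overhead_kw) / it_power_kw

# A 42U rack drawing ~4.5 kW, with at least the same again consumed in
# power transformation, cooling and other operations:
print(pue(4.5, 4.5))  # → 2.0
```

Every watt of overhead that cooling and power conversion consume pushes this ratio further above the ideal of 1.0.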
In cloud computing, cost efficiency demands a completely different set of metrics around power consumption and cooling. Instead of lying idle or underused much of the time, servers are designed to run at 60 percent or higher average utilization, with virtualization and IT automation working to distribute workloads to maximize effective resource usage. Server designs share power supplies so that they can run at better than 90 percent efficiency. Racks are densely packed with these custom-built servers all running at optimum utilization.
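A back-of-envelope sketch shows what that utilization gap means for server count (this assumes, simplistically, that useful capacity scales linearly with average utilization, which ignores headroom and failover requirements):

```python
CONVENTIONAL_UTIL = 0.05  # ~4-5 percent average utilization
CLOUD_UTIL = 0.60         # 60 percent or higher with virtualization

def servers_needed(work_units: float, utilization: float) -> float:
    """Servers of equal capacity needed to deliver the same useful work."""
    return work_units / utilization

ratio = servers_needed(100, CONVENTIONAL_UTIL) / servers_needed(100, CLOUD_UTIL)
print(f"consolidation ratio: {ratio:.0f}:1")  # → 12:1
```

On that crude model, one well-utilized cloud server displaces roughly a dozen idling general-purpose ones, which is the economics driving the denser rack designs described below.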
AM3 demonstrates the kind of datacenter metrics that this much denser profile of cloud computing demands (in the picture below, you can pick out features such as the cold aisle containment cabinets and color-coded overhead cable racks):
- 50U racks to increase the volume of air circulating between racks for more effective cooling
- Up to 29 kW power consumption per rack using cold aisle containment — a cabinet system that encloses cool air in the aisles between racks, increasing the amount of heat that can be safely dispersed as the air is drawn through the racks to the room beyond.
- Up to 33 kW per rack using an optional cold water cooling system
- Eliminating chillers by drawing cool water from a natural underground aquifer (excess warm water is recycled to heat neighboring University of Amsterdam buildings)
- Built to achieve a PUE rating of less than 1.2 (the award-winning facility hasn't yet been operating long enough to provide the actual PUE, but a figure not far from 1.1 is expected).
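The difference those PUE figures make can be shown with some back-of-envelope arithmetic (illustrative, not measured figures; at a given IT load, facility overhead works out to IT load multiplied by PUE minus one):

```python
# IT load for a fully loaded AM3 rack with cold aisle containment,
# per the figures quoted above.
IT_LOAD_KW = 29.0

for pue in (2.0, 1.2, 1.1):
    overhead_kw = IT_LOAD_KW * (pue - 1)
    print(f"PUE {pue}: {overhead_kw:.1f} kW of overhead per rack")
```

Moving from a traditional PUE of 2.0 to AM3's target of under 1.2 cuts the cooling and transformation overhead for such a rack from 29 kW to under 6 kW, which is where the aquifer cooling and containment design pay off.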
The similarity is deliberate. Equinix aims to attract the leading cloud players to AM3 (and in some cases has already done so). One feature that it emphasizes is its cross-connection capabilities, with a commitment to install connections in 2 days or less and an SLA-backed uptime for connections of 99.95 percent. Copper, fiber and carrier cables run in separate, color-coded overhead racks to speed installation and fault-finding.
With its emphasis on high-speed, reliable connections between major cloud players, Equinix wants to draw parallels with the Internet Exchange points of old, which quickly became hubs where serious Internet players wanted to have a presence so they could have the fastest connections. It's no accident that AM3 (and the adjacent site for AM4, which will be ready to take new customers once AM3 is full) is located in the Amsterdam Science Park, where the Amsterdam Internet Exchange first got started two decades ago.
Equinix sees the rise of cloud computing as a new opportunity for its global network of datacenters to become a network of exchange points where all the cloud players want to have a presence. It can then pitch its colocation facilities as the ideal place for enterprises to site their private infrastructure for more rapid, reliable connections to those cloud providers. Equinix is not alone in that ambition, of course, but its existing relationships with Amazon, Facebook, Salesforce.com and many others give it a market advantage that it is doing its utmost to leverage.
Disclosure: I was a speaker at an event hosted at the Science Park by Dimension Data, who paid my travel costs and a speaker fee as part of a consulting engagement.
Photo credit: all photos courtesy of Equinix.