Hyperscale computing is reinventing the data center


Hyperscale computing companies are driving demand for data center capacity and availability. Learn about the innovative solutions that benefit IT organizations.

6 December 2019 · 7 mins

Iron Mountain is a major provider of data centre and co-location services, with 3.5 million square feet across 15 locations in five countries. Its facilities can support thousands of servers and millions of virtual machines with flexible scalability and world-class security. Iron Mountain also achieved a 100% renewable energy utilization rate for its data centre operations worldwide in 2019.

The innovations developed by hyperscale providers and the data centre operators that support them will ensure that data centres of all sizes become more efficient, cleaner and better able to meet customers' capacity and performance demands.

Cloud computing has done more than just change the way organizations compute. It's also revolutionizing the way data centres are built.

Web giants like Google, Facebook, Yahoo and eBay pioneered the concept of "hyperscale" computing — a form of processing defined by massive scale and optimized to manage and store data for thousands of business customers or millions of consumers. These facilities demand entirely new approaches to space planning, power provisioning, cooling and expandability, among other factors, according to a new report by Data Center Frontier.

The Scale of Hyperscale

To get an idea of the scope of these mega data centres, consider that traditional wholesale computing providers used to provision between 10,000 and 12,000 square feet of floor space for their commercial customers. In contrast, today's hyperscale facilities typically run between 30,000 and 60,000 square feet, with some ranging as high as 140,000 square feet, often to support just a single company.

These data centres may also be daisy-chained together in campus configurations of up to 500,000 square feet, or eight times the size of a football field. Power requirements may run as high as 350 megawatts, which is enough to provide electricity to all the residents of Honolulu.

Such computing concentration would have been unthinkable just a few years ago, but a variety of cloud-related phenomena are driving demand for new data centre designs, including IT infrastructure and software delivered as services, social networks and streaming media. This has spawned considerable innovation in both the way data centres are built and how their services are provisioned.

Hyperscale data centres make up just 10% of all data centres tracked by 451 Research, but their share of investment in infrastructure and servers is much larger. Capital expenditures by hyperscale providers grew 43% in 2018 to almost $120 billion, according to Synergy Research.

One of the many new wrinkles these operators have introduced is the need to plan for constant and often significant growth. Specialized data centre real estate investment trusts have emerged to accommodate long-term road maps that see demand doubling or tripling in just a few years. Many data centres are designed to expand through modular data halls that can be added as the tenant requires more capacity.
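To make the capacity-planning arithmetic behind such road maps concrete, here is a minimal sketch. The numbers are hypothetical (a 40 MW tenant, 10 MW modular halls, demand doubling every two years) and are not taken from the article; the point is only how modular expansion tracks compounding demand.

```python
import math

def halls_needed(initial_mw: float, hall_mw: float, years: int,
                 doubling_period: int = 2) -> int:
    """Modular data halls required after `years` of demand that
    doubles every `doubling_period` years (hypothetical model)."""
    demand = initial_mw * 2 ** (years / doubling_period)
    return math.ceil(demand / hall_mw)

# Hypothetical tenant: 40 MW of demand today, 10 MW per modular hall
for year in (0, 2, 4, 6):
    print(f"year {year}: {halls_needed(40, 10, year)} halls")
```

Under these assumed figures, a tenant that fills four halls today would need 32 halls six years out — the kind of doubling-or-tripling trajectory that modular designs and specialized real estate investment trusts exist to absorb.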

Innovating Around Challenges

Hyperscalers' voracious power demands (data centres consumed about 3% of all U.S. electricity in 2017, according to Forbes) have also spawned innovative approaches to sourcing. Some facilities now operate with 100% renewable energy, and providers employ teams that specialize in navigating the complexities of energy markets.

The industry has also created innovative approaches to resiliency through such vehicles as availability zones, which are clusters of data centres within a region that enable customers to run applications in several different locations to avoid single points of failure.

Another area of innovation is in cooling. Heat is the enemy of computing efficiency and reliability, so providers have developed ways to minimize air-conditioning costs by locating data centres in cold climates to take advantage of fresh air cooling, adopting membrane-based evaporative cooling and creating in-server chilling units. Microsoft has even experimented with locating servers underwater for cooling, placing them near seaside customers.

Data Center Evolution

Data centre siting is also undergoing a rapid evolution. Early facilities were intended to serve a broad geographic distribution, but as more companies have moved to the cloud, providers have responded by locating their server farms closer to the point of use in order to minimize latency. The U.S. still houses 40% of the world's data centres, but strong growth overseas is quickly eroding that share.