AI Datacenters: The Rapid Evolution Of The Datacenter Industry

By Justin Heyes. April 15th, 2024

How will AI drive the future growth of the datacenter market?

In a rapidly evolving, tech-driven world, where businesses are digitising their services to meet the demands of their online customer base, the datacenter industry is at the forefront of innovation, constantly evolving to provide the backbone for an ever-increasing need for connectivity, processing and storage. However, this constant drive for greater efficiency doesn’t come without its challenges, especially given the problems the industry faces today.

For context, as of January 2024 there were 5.35 billion internet users worldwide, amounting to 66.2% of the global population (Statista, 2024); by comparison, in 2005 the world was celebrating the milestone of one billion people online. Looking back further and tracking patterns, we can see a rapid transformation, with each leap in technology use and user numbers arriving over a noticeably shorter span of time than the last.

Understanding how and why the datacenter industry has driven or responded to these progressions can give us an insight into where the industry will trend over the coming years, as we march towards a fully connected world.

The Mainframe Era

The origins of the datacenter were a far cry from what we recognise today. In the 1950s and 60s, the introduction of powerful mainframe computers enabled companies to conduct large-scale data processing and storage in a way that had previously only been theorised. These machines were housed onsite in “computer rooms”, meticulously designed to control temperature and humidity and prevent the sensitive equipment from overheating or failing.

Despite their bulky size and temperamental nature, these mainframes served as the nerve centre for business-critical operations such as payroll systems, customer and transaction databases, and enterprise resource planning.

By the 1970s and 80s, Moore’s Law was in full effect, and these “computer rooms”, now filled with sleeker, more efficient mainframes, were commonplace in businesses. Banks managed vast databases, tracking millions of daily transactions in real time. Airlines ran reservation systems, ensuring seamless ticket bookings and customer service. Insurance companies harnessed the processing power of mainframes to manage complex actuarial computations and customer records. These are but a few examples of how computer processing had begun to shape how the world would operate, as computers became commonplace not just in offices, but in homes too.

The Industry-Standard Server

In the 1990s, everything changed again. The world witnessed the microprocessor boom, the birth of the Internet and the development of client-server computing models. Old mainframe rooms filled up with microprocessor-based computers acting as servers, laying the foundation for the first in-house datacenters. Slowly, this infrastructure became standardised, and the modular racks we know today were born. Large computers were replaced by personal desktops, and connectivity became fast; some readers may remember the strange noises of a dial-up modem, and their disappearance as dial-up was replaced. This in turn drove the development of datacenters catering to the growing number of computer users. Enterprises began building server rooms and facilities encompassing thousands of servers, and a number of businesses emerged to operate datacenters for public clients.

Virtualisation

Datacenters finally took centre stage with the advent of the new millennium. With the internet reaching maturity, IT investment skyrocketed and new facilities shot up around the world as everyone looked to cash in on the dotcom boom. However, when the investment finally dried up, the datacenter industry was dealt a massive blow: operating that much hardware simply wasn’t viable without a growing economy to support it. Hardware utilisation, power, cooling and cost-efficiency became the order of the day, as organisations looked to reduce their datacenter footprint and lower both CapEx and OpEx.

This focus sparked a quieter revolution in the industry: virtualisation. Software that mimics the functions of physical hardware allows multiple virtual machines to run simultaneously on a single physical machine, and the impact was dramatic, ultimately reducing datacenter power, space and cooling requirements by around 80%. With the 2008 financial crisis, the drive to reduce IT spending, outsource requirements and harness the potential of economies of scale strengthened its grip. The colocation market saw runaway success, which continues to this day.
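
To put that 80% figure in perspective, here is a back-of-the-envelope sketch in Python; the fleet size and utilisation levels are illustrative assumptions, not measurements from any real facility.

    import math

    # Back-of-the-envelope consolidation maths. All figures below are
    # illustrative assumptions, not data from any real datacenter.

    def hosts_after_virtualisation(physical_servers: int,
                                   avg_utilisation: float,
                                   target_utilisation: float) -> int:
        """Estimate the physical hosts needed once every workload
        runs as a virtual machine on shared hardware."""
        total_load = physical_servers * avg_utilisation    # aggregate demand
        return math.ceil(total_load / target_utilisation)  # whole hosts only

    before = 100   # hypothetical fleet of underutilised one-app-per-box servers
    after = hosts_after_virtualisation(before, avg_utilisation=0.15,
                                       target_utilisation=0.70)
    print(f"{before} servers -> {after} hosts "
          f"({1 - after / before:.0%} smaller footprint)")
    # 100 servers -> 22 hosts (78% smaller footprint)

The intuition is simple: a server dedicated to one application often idles at low utilisation, so packing those workloads onto fewer, busier hosts shrinks the power, space and cooling bill in roughly the same proportion.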

The Current Era

Today’s datacenters are unsurprisingly grappling with escalating data volumes, and this has seen a rise in hyperscale projects. With clients commonly demanding ever more storage capacity, facilities are now built with two key focuses: scalability and redundancy. Redundancy ensures constant uptime for a world with a growing reliance on cloud computing. Scalability ensures that clients can access more processing power during surges, and seamlessly add or remove servers, adjust bandwidth, or modify their service packages, so that their IT infrastructure can evolve alongside their business.
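
As a rough illustration of how those two focuses interact, here is a minimal sketch; the Cluster class and every figure in it are hypothetical, not any provider’s real API.

    import math
    from dataclasses import dataclass

    # A minimal sketch of threshold-based scaling. The Cluster class and
    # all figures here are hypothetical, not any provider's real API.

    @dataclass
    class Cluster:
        servers: int
        capacity_per_server: int = 1000   # assumed requests/sec per server

        def scale_for(self, demand: int, headroom: float = 0.25,
                      redundancy: int = 1) -> int:
            """Resize for `demand`, keeping spare headroom for surges
            (scalability) plus extra servers so a single failure
            doesn't cause downtime (redundancy)."""
            needed = math.ceil(demand * (1 + headroom) / self.capacity_per_server)
            self.servers = needed + redundancy
            return self.servers

    cluster = Cluster(servers=4)
    print(cluster.scale_for(demand=5200))   # surge: 7 for load + 1 spare = 8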

AI Datacenters: A New Dawn

Looking to the future, then, there are two key advances in the industry that have the potential to revolutionise it once again. The first is edge datacenters: these smaller facilities bring processing workloads and data storage closer to end users, helping to reduce latency while saving bandwidth. The second is the introduction of AI racks: these racks are considerably more powerful than their predecessors, capable of processing data in ways not seen before and performing extremely complex tasks. However, hosting these racks is a monumental task, as they require an incredible amount of energy and cooling to remain operational.

One commonly proposed solution is to combine these technologies, and thereby shift the datacenter landscape further. With edge datacenters acting as the first point of contact for businesses, these smaller sites can handle lighter tasks while relaying more complex processes to hyperscale datacenters housing AI racks capable of managing the workloads. This natural evolution would see a rise in large central datacenter parks positioned away from population centres, with access to ample resources, using a chain of edge datacenters as a nervous system covering the country.
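
To make that split concrete, here is a minimal sketch of the routing idea in Python; the Task shape, the EDGE_LIMIT threshold and the cost figures are illustrative assumptions rather than any real system’s design.

    from dataclasses import dataclass

    # A minimal sketch of the edge/hyperscale split. The Task shape, the
    # EDGE_LIMIT threshold and the cost figures are illustrative assumptions.

    @dataclass
    class Task:
        name: str
        compute_units: float   # rough estimate of the work involved

    EDGE_LIMIT = 10.0          # assumed ceiling on what an edge site handles

    def route(task: Task) -> str:
        """Serve light work at the nearby edge site for low latency;
        relay heavy work to a central hyperscale facility's AI racks."""
        if task.compute_units <= EDGE_LIMIT:
            return f"{task.name}: handled at the edge"
        return f"{task.name}: relayed to hyperscale AI racks"

    for t in (Task("cache lookup", 0.2), Task("LLM inference", 450.0)):
        print(route(t))

The design choice mirrors the nervous-system analogy above: quick reflexes happen close to the user, while the heavy thinking is sent back to the central brain.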
