Artificial Intelligence, Integration Key as Data Moves to the Edge
By Martin Olsen, Global Vice President, Edge Strategy, Vertiv
Today’s networks are becoming more complex and distributed to support the real-time computing and decision-making essential to smart cities, autonomous vehicles, the augmented and virtual reality demands of the metaverse, and much more. These hybrid architectures include core data centers and public and private clouds, but they are increasingly focused on edge delivery options that push storage and computing closer to users to reduce latency.
Organizations are building new data centers specifically to support this new reality. Globally, about 2.9 gigawatts of new data center construction is in development compared with 1.6 gigawatts in 2020, according to a Cushman & Wakefield report. Much of that is happening at the edge.
- Edge sites will grow by 226% between 2019 and 2025, according to an STL Partners report.
- Today, edge deployments account for about 5% of workloads, but over the next five years that share will increase to 30%, with another 30% in enterprise data centers and 40% in the public cloud, according to a VMware report.
- By 2025, 85% of infrastructure strategies will integrate on-premises, colocation, cloud and edge delivery options, compared with just 20% in 2020, according to a recent Gartner report.
For technology leaders, the key issue is how to manage the increasing complexity of hybrid network architectures. Solutions include micro data centers, Artificial Intelligence (AI) and integration.
Sharpening your edge with micro data centers
When planning for edge computing, it is essential to deliver substantial computing capacity in a small footprint. Micro data centers include all the essential components of a typical data center in a compact deployment, typically supporting critical loads of up to 100-150 kilowatts (kW). Along with servers, components within the rack usually include:
- Uninterruptible power supply (UPS)
- Rack power distribution unit (rPDU)
- Rack cooling unit and environmental controls with integrated heat rejection
- Remote monitoring sensors and software
- IT management systems
Off-the-shelf solutions lack the specificity needed for most sites, so it is important to select a provider that can customize the micro data center to your needs.
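To make the sizing exercise concrete, here is a minimal sketch in Python of a micro data center bill of materials checked against a critical-load envelope. The component names and power figures are illustrative placeholders, not Vertiv specifications.

```python
# A minimal sketch of a micro data center load check. Component names and
# power figures are hypothetical, for illustration only.
from dataclasses import dataclass

@dataclass
class Component:
    name: str
    power_draw_kw: float  # assumed steady-state electrical load

# Illustrative rack contents mirroring the list above (values are made up)
rack = [
    Component("servers", 92.0),
    Component("rack PDU", 0.2),
    Component("rack cooling unit", 9.5),
    Component("monitoring sensors and software", 0.1),
]

CRITICAL_LOAD_LIMIT_KW = 150  # upper end of a typical micro data center

total_kw = sum(c.power_draw_kw for c in rack)
print(f"Total load: {total_kw:.1f} kW of {CRITICAL_LOAD_LIMIT_KW} kW budget")
if total_kw > CRITICAL_LOAD_LIMIT_KW:
    print("Load exceeds a typical micro data center envelope; re-plan.")
```

A provider customizing a deployment would work from a far more detailed model, but even a rough budget like this helps flag undersized power and cooling before equipment ships.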
Optimizing performance with AI
AI and Machine Learning are essential to optimizing performance in networks that combine edge computing with enterprise data centers, public and private clouds, and colocation facilities. Full-time manual management is not feasible given this complexity.
Fortunately, AI has become easier to use. Simplified programming tools let data scientists direct computing resources at a problem without consulting programming or hardware experts, and AI hardware and cloud options are readily available from established vendors, making AI viable even for smaller companies. Together with a simplified toolchain and a growing educational focus on data science, this puts AI within reach of more companies than ever.
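As a small illustration of how low the barrier has become, the sketch below flags anomalous edge-site telemetry in a few lines of Python. The data is synthetic and the choice of scikit-learn's IsolationForest is ours, not a tool named in the article; in practice you would feed real sensor readings.

```python
# A minimal sketch of accessible AI tooling: flagging anomalous telemetry
# from an edge site. Data is synthetic; thresholds and columns are assumed.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(seed=0)
# Synthetic telemetry: columns are inlet temperature (C) and UPS load (%)
normal = rng.normal(loc=[24.0, 60.0], scale=[1.5, 8.0], size=(500, 2))
spikes = np.array([[38.0, 95.0], [41.0, 97.0]])  # injected fault conditions
readings = np.vstack([normal, spikes])

model = IsolationForest(contamination=0.01, random_state=0).fit(readings)
flags = model.predict(readings)  # -1 marks an outlier, 1 marks normal

for row in readings[flags == -1]:
    print(f"Anomaly: inlet {row[0]:.1f} C, UPS load {row[1]:.0f}%")
```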
Of course, it still takes time and resources to get it right: collecting the right data, building the right models, and training the platform to make the right decisions. As adoption of AI and other applications requiring high-performance edge computing increases, it will be important to plan infrastructure for rising computing densities and the accompanying increases in heat, which will accelerate the adoption of liquid cooling. In addition, the lower barrier to entry makes it more important than ever to select the right vendors, platforms, and systems to keep these critical systems running optimally and support anticipated growth.
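A back-of-the-envelope calculation shows why density pushes cooling decisions. The sketch below uses the standard sensible-heat relationship for air; the rack loads and the temperature rise across the rack are assumed values for illustration.

```python
# A rough sketch of the airflow needed to air-cool a rack, using the
# standard sensible-heat formula. Rack loads and delta-T are assumptions.
AIR_DENSITY = 1.2          # kg/m^3, roughly sea-level air
AIR_SPECIFIC_HEAT = 1.005  # kJ/(kg.K)
DELTA_T = 12.0             # K temperature rise across the rack (assumed)

def required_airflow_m3s(heat_kw: float) -> float:
    """Airflow needed to remove heat_kw of sensible heat at DELTA_T rise."""
    return heat_kw / (AIR_DENSITY * AIR_SPECIFIC_HEAT * DELTA_T)

for rack_kw in (10, 30, 60):
    flow = required_airflow_m3s(rack_kw)
    cfm = flow * 2118.88  # 1 m^3/s is roughly 2,119 cubic feet per minute
    print(f"{rack_kw:>3} kW rack -> {flow:.2f} m^3/s (~{cfm:,.0f} CFM)")
```

Airflow requirements scale linearly with load, so a 60 kW rack needs roughly six times the air of a 10 kW rack. At some point moving that much air stops being practical, which is why liquid cooling becomes attractive at high densities.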
Integration: Key to speed and flexible capacity management
Physical infrastructure, including power, cooling, and enclosures, is a key consideration in developing an edge computing strategy. This infrastructure provides the foundation on which applications run and supports a wide range of use cases.
Integrated systems that allow for modular capacity additions have been an important solution for data centers, and we have seen many innovations over the past 10 years. The next step for data centers involves working with providers to improve the integration of larger systems, such as multi-megawatt components of the power infrastructure, toward the goal of seamless interoperability.
Integration is already well established as a way to reduce construction and deployment costs while delivering flexible capacity management. When applied to larger systems, integration delivers the speed needed to reduce latency, while managing large data flows. Rack-based power solutions are early accelerators of integration momentum.
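One way to see the flexible-capacity benefit is a simple planning sketch like the one below. The module size, utilization target, and growth rate are hypothetical figures chosen for illustration, not values from the article.

```python
# A minimal sketch of modular capacity planning: how many integrated power
# modules are needed as load grows? All figures below are assumptions.
import math

MODULE_KW = 250           # capacity of one integrated power module (assumed)
TARGET_UTILIZATION = 0.8  # keep 20% headroom per module (assumed)

def modules_needed(load_kw: float) -> int:
    """Modules required to carry load_kw while preserving headroom."""
    return math.ceil(load_kw / (MODULE_KW * TARGET_UTILIZATION))

current_kw = 380
growth_per_quarter = 1.15  # assumed 15% quarterly growth in IT load
for quarter in range(1, 5):
    current_kw *= growth_per_quarter
    print(f"Q{quarter}: {current_kw:,.0f} kW -> {modules_needed(current_kw)} modules")
```

Because capacity arrives in pre-integrated increments, the build-out can track demand quarter by quarter instead of being committed up front.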
It is also important to remember that deployment is just the beginning; life cycle services that ensure reliable, efficient operation of computing equipment and infrastructure are critical to success at the edge.
Finding your edge
These solutions offer data center leaders the flexibility to create hybrid models that meet the needs of our post-pandemic society in which remote/hybrid work models, video streaming, telehealth and other data-intensive applications are here to stay.
About the author
Martin Olsen is the Vice President of Global Edge and Integrated Solutions at Vertiv.
DISCLAIMER: Guest posts are submitted content. The views expressed in this blog are that of the author, and don’t necessarily reflect the views of Edge Industry Review (EdgeIR.com).