Livin' on the Edge, Part 2: The Brains and Nervous System of Edge Computing

This is part two of a five-part series on edge computing, with a focus on computing and memory.
Edge computing will require a shift of processing and memory resources to decentralized locations. Many of the qualities that have fueled cloud computing adoption will facilitate this new method of delivering computing resources, enabling new applications. There are key differences between edge computing and the cloud, however, and investors need to be aware of them as this market grows over the next decade to rival cloud computing.
While communications infrastructure will require the largest architectural changes, MPUs and memory players will need to position themselves to capitalize on the burgeoning incremental edge market. As with enterprise and hyperscale data centers, digital semiconductors will serve as the central processing components of servers in this new form factor. This presents unique challenges and priorities that will need to be addressed.
In the mid-2000s, cloud computing showed promise, but there was still plenty of pushback. Those in the technology and investment communities questioned the feasibility and value of a centralized/shared model versus an on-prem, private computing model. We see edge computing following a similar, potentially exponential adoption curve.
CPUs will continue to be the brains of all servers, with accelerators attached to them; edge cloud shouldn't change that. However, CPUs at the edge will need to prioritize power efficiency, cooling, and ruggedness, as they will generally be installed in less controlled environments. Beyond a greater focus on I/O, edge server CPUs will largely remain driven by the same factor as in the cloud: performance per watt per dollar (perf/watt/$).
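To make the perf/watt/$ comparison concrete, here is a minimal sketch of how one might score a data-center-class CPU against an edge-optimized part. The SKU names and all figures below are hypothetical placeholders, not real product data; the point is only to illustrate the metric itself.

```python
# Minimal sketch of the perf/watt/$ metric discussed above.
# All SKU names and numbers are hypothetical, for illustration only.

from dataclasses import dataclass


@dataclass
class CpuSku:
    name: str
    perf_score: float   # relative benchmark performance (arbitrary units)
    tdp_watts: float    # thermal design power in watts
    price_usd: float    # list price in US dollars

    def perf_per_watt_per_dollar(self) -> float:
        """Higher is better: performance normalized by both power and cost."""
        return self.perf_score / (self.tdp_watts * self.price_usd)


# Hypothetical SKUs: a high-core-count data-center part versus a
# lower-power part targeted at less controlled edge environments.
datacenter_cpu = CpuSku("DC-64core", perf_score=100.0, tdp_watts=280.0, price_usd=8000.0)
edge_cpu = CpuSku("Edge-16core", perf_score=30.0, tdp_watts=85.0, price_usd=1500.0)

for sku in (datacenter_cpu, edge_cpu):
    print(f"{sku.name}: {sku.perf_per_watt_per_dollar():.2e} perf/watt/$")
```

Under these made-up numbers the edge part scores better on the normalized metric despite its lower absolute performance, which is the kind of trade-off an edge deployment optimizes for.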
Memory will also be affected by the redistribution of compute resources to the edge. Broadly speaking, edge workloads will demand DRAM in different flavors, quantities, and locations compared to centralized compute resources.
The first decade of cloud computing's ramp saw virtually the entire MPU TAM go to CPU players. That will not repeat in the edge computing market: accelerated computing architectures are entering this initial buildout phase with an already proven model.
Part 1: Evolving Tomorrow’s Internet focuses on communications services and cloud/internet.
Part 2: The Brains and Nervous System of Edge Computing focuses on computing and memory.
Part 3: Storage & Networking focuses on storage and networking.
Part 4: Bringing Analytics & Business Logic to Edge Compute focuses on software and security.
Part 5: Enabling & Empowering Locally focuses on end-devices & services enabled by edge computing.
If you’re already a member of our Research portal, log in.
If not, reach out to us directly for more information.