Livin' on the Edge, Part 2: The Brains and Nervous System of Edge Computing

This is part two of a five-part series on edge computing, with a focus on computing and memory.


Edge computing will require a shift in processing and memory resources to decentralized locations. Many of the qualities that have fueled cloud computing adoption will facilitate this new method of delivering computing resources, enabling new applications. However, there are key differences between edge and cloud computing that investors need to understand as this market grows over the next decade to rival cloud.

While communications infrastructure will require the largest architectural changes, MPUs and memory players will need to position themselves to capitalize on the burgeoning incremental edge market. As with enterprise and hyperscale data centers, digital semiconductors will serve as the central processing components of servers in this new form factor. This presents unique challenges and priorities that will need to be addressed.

We Expect Edge Cloud Computing to Follow a Similar Adoption Curve As Cloud

In the mid-2000s, cloud showed promise, but there was still plenty of pushback. Those in the technology and investment communities questioned the feasibility and value of a centralized/shared model versus an on-prem, private computing model. We see edge computing following a similar, potentially exponential adoption curve.

x86 Continues to Dominate, But Competitors Have the Potential to Edge Their Way In

CPUs will continue to be the brains of all servers, with accelerators attached to them; edge cloud shouldn't change that. However, CPUs at the edge will need to prioritize power efficiency, cooling, and ruggedness, as they will generally be installed in less controlled environments. Beyond a greater focus on I/O, server CPU selection at the edge will largely remain driven by the same factors as in the cloud: performance per watt per dollar.

Fragmented Memory Resource Allocation

Memory will indeed be impacted by the redistribution of compute resources to edge computing. Broadly speaking, edge workload requirements will demand DRAM in different flavors, quantities, and locations compared to centralized compute resources.

Accelerators – Taking a Larger Share of the TAM

The first decade of cloud computing's ramp saw virtually the entire MPU TAM go to CPU players. This will not repeat in the edge computing market: accelerated computing architectures are entering the initial edge buildout phase with a model already proven in the cloud.


Part 1: Evolving Tomorrow’s Internet focuses on communications services and cloud/internet.

Part 2: The Brains and Nervous System of Edge Computing focuses on computing and memory.

Part 3: Storage & Networking focuses on storage and networking.

Part 4: Bringing Analytics & Business Logic to Edge Compute focuses on software and security.

Part 5: Enabling & Empowering Locally focuses on end-devices & services enabled by edge computing.
