
Managing risks and performance at the Edge

Despite being a relatively new technology term, edge computing has already established itself. It has not taken long for businesses to understand the benefits of locating their compute services right where applications are running.

Speed is the obvious one. Software works faster and so can do more when it does not have to reach back to a datacentre located hundreds, even thousands, of miles away.

But not everyone is an advocate. For years, businesses have been taught that IT security is about centralising operations. Anything at the edge feels intuitively risky. More devices mean a larger attack surface; and though each may be a mini environment, none is isolated.

At some point those devices will need to send data and information back to the datacentre, and vice versa. That means nodes and connections, and that creates openings to exploit.

So how do businesses reconcile these risks of edge computing with the irrefutable benefits? The answer is to approach security as part of a holistic edge strategy, and not in opposition to it.


In other words, bake security into your architecture from the start, and the edge is merely an extension of your environment, as secure and resilient as the centre. Security that enables, rather than compromises.

Two things matter: the systems you run at the edge, and the network that connects them to each other and to your core systems. Consistency is key to both. Standard security protocols and processes make everything easier to manage, and so safer.


But the best edge devices tend to be built with a very specific task in mind, and so often come from multiple vendors. Deploying them is eclecticism by design, the opposite of standardisation.

Step forward the hybrid cloud, serving as the common platform on which to build your edge stack as an extension of your core infrastructure. It is here that security standards are set: OS security, identity and access controls, vulnerability management and data encryption, to name a few. And all of it consistent with the container and Kubernetes toolkits used to keep innovating at the edge.
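By way of illustration (a minimal sketch, not part of the original piece), consistency across that hybrid estate can be as concrete as pushing one security baseline to every cluster, core and edge alike. The sketch below assumes a fleet of Kubernetes clusters reachable as kubeconfig contexts and uses the open-source Python client; the context names and the default-deny network policy are hypothetical examples of a standard, not recommendations.

    # Illustrative sketch: apply the same baseline NetworkPolicy to every cluster,
    # datacentre and edge alike, so one security standard holds across the estate.
    # Cluster context names are hypothetical placeholders.
    from kubernetes import client, config

    CLUSTERS = ["core-dc", "edge-site-1", "edge-site-2"]  # kubeconfig contexts (assumed)

    # Default-deny ingress: pods accept only traffic another policy explicitly allows.
    baseline_policy = client.V1NetworkPolicy(
        metadata=client.V1ObjectMeta(name="default-deny-ingress"),
        spec=client.V1NetworkPolicySpec(
            pod_selector=client.V1LabelSelector(),  # empty selector matches all pods
            policy_types=["Ingress"],
        ),
    )

    for ctx in CLUSTERS:
        config.load_kube_config(context=ctx)  # point the client at this cluster
        client.NetworkingV1Api().create_namespaced_network_policy(
            namespace="default", body=baseline_policy
        )
        print(f"Baseline policy applied to {ctx}")

The point is less the specific policy than the pattern: the standard is defined once, and the edge inherits it rather than reinventing it.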


Next comes securing the network. Increasingly, businesses are using third-party SD-WAN technology to manage their expanding networks. They should offload their network security to these Managed Security Service Provider (MSSP) experts, who are building increasingly sophisticated Secure Access Service Edge (SASE) solutions.

SASE tackles edge security at scale by integrating SD-WAN and security into a cloud service, in partnership with various security vendors. It debunks the misconception that edge computing means relinquishing central control, instead allowing security teams to look across their entire network from a single pane of glass.

From this central console every aspect of security policy, threat prevention and attack remediation can be defined, monitored and executed. Consistency is coupled with automation to strengthen the security posture further. It transforms edge security from a dislocated, perimeter activity into a core tenet of the overall strategy.
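As a hedged sketch of that "define once, enforce everywhere" model: because each SASE provider exposes its own management interface, the endpoint, credential and payload fields below are hypothetical placeholders. The shape of the interaction is what matters, a single policy definition pushed through one console and then monitored across every edge site from the same place.

    # Hypothetical sketch: push one security policy to a SASE management console
    # and check its enforcement status across the estate. The URL, token and
    # payload schema are illustrative placeholders, not a real vendor API.
    import requests

    CONSOLE = "https://sase.example.com/api/v1"    # hypothetical management endpoint
    HEADERS = {"Authorization": "Bearer <token>"}  # placeholder credential

    policy = {
        "name": "block-known-malware-domains",
        "action": "deny",
        "category": "malware",
        "applies_to": "all-edge-sites",  # defined once, enforced everywhere
    }

    # Define the policy at the central console...
    resp = requests.post(f"{CONSOLE}/policies", json=policy, headers=HEADERS, timeout=10)
    resp.raise_for_status()

    # ...then monitor enforcement across all sites from the same pane of glass.
    status = requests.get(f"{CONSOLE}/policies/{resp.json()['id']}/status",
                          headers=HEADERS, timeout=10)
    print(status.json())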


Businesses should see these two aspects—the secure hybrid cloud and the secure network—as parallel priorities.



Lucy Kerner, Director, Security Global Strategy and Evangelism, Red Hat.