Edge computing moves towards full autonomy
- 14 April, 2022 10:30
Edge computing is rapidly shedding its reputation as a fringe concept, and both adopters and vendors are setting their sights on the technology's next goal: fully autonomous deployment and operation.
The edge deployment experience is drawing closer to the simplicity of unboxing a new mobile phone, said Teresa Tung, cloud first chief technologist at IT advisory and consulting firm Accenture.
"We're seeing automated technology that simplifies handling the edge’s unique complexity for application, network, and security deployments," Tung said.
The ability to create and manage containerised applications enables seamless development and deployment in the cloud, with the edge simply becoming a specialised location with more stringent resource constraints, Tung said.
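Tung's framing, that the edge is just a placement target with tighter resource constraints, can be illustrated with a toy scheduler. This is a minimal sketch; the `Node` fields and the numbers are invented for illustration and do not correspond to any real orchestrator's API.

```python
from dataclasses import dataclass

@dataclass
class Node:
    """A deployment target; edge nodes are just locations with tighter limits."""
    name: str
    location: str        # "cloud" or "edge"
    cpu_millicores: int  # available CPU
    memory_mb: int       # available memory

def schedulable_nodes(nodes, cpu_request, mem_request, prefer="edge"):
    """Return nodes that satisfy the container's resource request,
    preferred location first -- mirroring how an orchestrator treats
    the edge as a specialised, resource-constrained placement target."""
    fits = [n for n in nodes if n.cpu_millicores >= cpu_request
            and n.memory_mb >= mem_request]
    return sorted(fits, key=lambda n: n.location != prefer)

nodes = [
    Node("cloud-a", "cloud", 16000, 65536),
    Node("edge-cam-1", "edge", 1000, 512),
    Node("edge-cam-2", "edge", 2000, 2048),
]

# A lightweight analytics container fits on one edge node and in the cloud;
# the smallest edge device is filtered out by its CPU constraint.
candidates = schedulable_nodes(nodes, cpu_request=1500, mem_request=1024)
print([n.name for n in candidates])  # ['edge-cam-2', 'cloud-a']
```

The same container spec is deployable anywhere; only the constraint check differs, which is the sense in which the edge becomes "a specialised location".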
"Self-organising and self-healing wireless mesh communications protocols, such as Zigbee, Z-Wave, ISA100.11a, or WirelessHART can create networks where devices can be deployed ad hoc and self-configure," Tung added.
The decentralisation of IT environments to encompass edge systems comes with specific challenges, said Matteo Gallina, principal consultant with global technology research and advisory firm ISG.
"Management of devices and services has to be done outside the traditional management sphere, including managing physically inaccessible devices, a high variance of solutions and operating systems, different security requirements, and more," he said. "The larger and more dispersed the systems get, the more significant the role automation plays to ensure effectiveness and reliability."
Automation technology innovation led by open source communities
The trend toward automating edge deployments is not unlike the journey into AI, where innovations are led by open source groups, infrastructure manufacturers, and cloud service providers, Tung said. She noted that open source communities — such as LF Edge — are leading innovations and building critical standards definitions in areas such as communication, security, and resource management.
"Infrastructure providers are creating solutions that allow compute to be run anywhere and embedded in anything," Tung said. "It includes new hardware capabilities that are ultra-low power, ultra-fast, connected anywhere, and ultra-secure and private.
"5G opens new opportunities for network equipment providers and telecom operators to innovate with both private and public networks with embedded edge compute capabilities."
At the same time, cloud provider innovations are making it easier to extend centralised cloud DevOps and management practices to the edge.
"Just like [the] central cloud makes it easy for any developer to access services, we are now seeing the same thing happening for technologies like 5G, robotics, digital twin, and Internet of Things (IoT)," Tung said.
Software-defined integration of multiple network services has emerged as the most important technology approach to automating edge deployments, said Ron Howell, managing network architect at Capgemini Americas.
Network security, equipped with zero trust deployment methods incorporating SASE edge features, can significantly enhance automation and simplify what it takes to deploy and monitor an edge compute solution, Howell said. Additionally, once deployed, full-stack observability tools and methods that incorporate AIOps help to proactively keep data and edge compute resources available and reliable.
"AI applied to the network edge is now widely viewed as the leading way forward in network edge availability," Howell said. "AIOps, when used in the form of full-stack observability, is one key enhancement."
A variety of options are already available to help organisations looking to move toward edge autonomy.
"These begin with physical and functional asset onboarding and management, and include automated software and security updates, and automated device testing," Gallina explained.
If a device relies on some form of ML or AI functionality, AIOps will be needed both at the device level, to keep the local ML model up to date and ensure correct decisions are made in any situation, and within any backbone ML/AI that might be located on premises or in centralised edge systems.
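The device-level half of that requirement, keeping the local ML model in step with what the backbone publishes, can be sketched in a few lines. The agent class, the registry shape, and the version scheme below are assumptions for illustration, not a real product API.

```python
# Hypothetical sketch: a device-level agent checks a backbone registry
# and pulls the published model when it is newer than the local copy.
def parse_version(v: str):
    """Turn '1.2.0' into (1, 2, 0) so versions compare numerically."""
    return tuple(int(part) for part in v.split("."))

def needs_update(local: str, published: str) -> bool:
    return parse_version(published) > parse_version(local)

class ModelAgent:
    def __init__(self, version: str = "1.0.0"):
        self.version = version

    def sync(self, registry: dict) -> bool:
        """Adopt the published model if it is newer; return True on update."""
        published = registry["model_version"]
        if needs_update(self.version, published):
            self.version = published  # a real agent would download and hot-swap here
            return True
        return False

agent = ModelAgent("1.2.0")
print(agent.sync({"model_version": "1.3.1"}))  # True: 1.3.1 is newer
print(agent.version)                           # 1.3.1
```

In practice the download-and-swap step is where the hard problems live (validation, rollback); the point here is only that staleness detection itself is mechanical and automatable.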
Physical and digital experiences come together at the edge
Tung uses the term "phygital" to describe the result when digital practices are applied to physical experiences, such as in the case of autonomous management of edge data centres.
"We see creating highly personalised and adaptive phygital experiences as the ultimate goal," she noted. "In a phygital world, anyone can imagine an experience, build it and scale it."
In an edge computing environment that integrates digital processes and physical devices, hands-on network management is significantly reduced or eliminated: network failures and downtime are automatically detected and resolved, and configurations are applied consistently across the infrastructure, making scaling simpler and faster.
Automatic data quality control is another potential benefit.
"This involves a combination of sensor data, edge analytics, or natural language processing (NLP) to control the system and to deliver data on-site," Gallina said. Yet another way an autonomous edge environment can benefit enterprises is with "zero touch" hardware provisioning at scale, with the OS and system software downloaded automatically from the cloud.
Gallina noted that a growing number of edge devices are now packaged with dedicated operating systems and various other types of support tools.
"Off-the-shelf edge applications and marketplaces are starting to become available, as well as an increasing number of open-source projects," he said.
Providers are working on solutions to seamlessly manage edge assets of almost any type and with any underlying technology. Edge-oriented, open-source software projects, for example, such as those hosted by the Linux Foundation, can further drive scaled adoption, Gallina said.
AI-optimised hardware is an up-and-coming edge computing technology, Gallina said, with many products offering interoperability and resilience.
"Solutions and services for edge data collection—quality control, management, and analytics—are likely to expand enormously in the next few years, just as cloud-native applications have done," he added.
AI on edge automation leaders include IBM, ClearBlade, Verizon, hyperscalers
Numerous technologies are already available for enterprises considering edge automation, including offerings from hyperscaler developers and other specialised providers. One example is KubeEdge, an open source project that extends Kubernetes, the system for automating the deployment, scaling, and management of containerised applications, from the cloud to hosts at the edge.
Gallina notes that in 2021 ISG ranked system integrators Atos, Capgemini, Cognizant, Harman, IBM, and Siemens as global leaders in AI on edge technology. Among the leading edge computing vendors are the hyperscalers (AWS, Azure, Google), as well as edge platform providers ClearBlade and IBM. In the telco market, Verizon stands out.
Edge-specific features deliver autonomy and reliability
Vendors are building both digital and physical availability features into their offerings in an effort to make edge technology more autonomous and reliable. Providers generally use two methods to provide autonomy and reliability: internal sensors and redundant hardware components, Gallina said.
Built-in sensors, for example, can monitor the environment on location, detecting and reporting anomalies, and can be combined with fail-over components to provide the required level of redundancy.
Tung lists several other approaches:
- Physical tamper-resistant features designed to protect devices from unauthorised access.
- Secure identifiers built into chipsets allowing the devices to be easily and reliably authenticated.
- Self-configuring network protocols, based on ad hoc and mesh networks, to ensure connectivity whenever possible.
- Partitioned boot configurations so that updates can be applied without the risk of bricking devices if the installation goes wrong.
- Hardware watchdog capabilities to ensure that devices will automatically restart if they become unresponsive.
- Boot time integrity checking from a secure root of trust, protecting devices against malicious hardware installation.
- Trusted compute and secure execution environments to ensure approved compute runs on protected and private data.
- Firewalls with anomaly detection that pick up unusual behaviours, indicative of emerging faults or unauthorised access.
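The watchdog behaviour in the list above can be sketched in a few lines. This is a software analogue with an injectable clock so the logic is testable; on Linux devices the role is typically played by the kernel's /dev/watchdog interface, and the restart here is simulated with a counter.

```python
import time

class Watchdog:
    """Software analogue of a hardware watchdog: if the main loop stops
    'petting' the timer within the timeout, the device is restarted."""
    def __init__(self, timeout_s=5.0, now=time.monotonic):
        self.timeout, self.now = timeout_s, now
        self.last_pet = self.now()
        self.restarts = 0

    def pet(self):
        """Called by the application's main loop to prove it is alive."""
        self.last_pet = self.now()

    def check(self):
        """Called by the supervisor; returns True if a restart was triggered."""
        if self.now() - self.last_pet > self.timeout:
            self.restarts += 1       # a real device would reboot here
            self.last_pet = self.now()
            return True
        return False

# Simulated clock: the application goes silent and the watchdog fires.
t = [0.0]
wd = Watchdog(timeout_s=5.0, now=lambda: t[0])
t[0] = 3.0
print(wd.check())  # False: petted recently enough
t[0] = 9.0
print(wd.check())  # True: unresponsive for more than 5 s
```

The key property, visible even in this toy, is that recovery needs no human in the loop: an unresponsive application is detected and restarted by a mechanism outside the application itself.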
Self-optimisation and AI
Networks require an almost endless number of configuration settings and fine tuning in order to function efficiently.
"Wi-Fi networks need to be adjusted for signal strength, firewalls need to be constantly updated with support for new threat vectors, and edge routers need constantly changing configurations to enforce service level agreements (SLAs)," said Patrick MeLampy, a fellow at Juniper Networks. "Nearly all of this can be automated, saving human labour, and human mistakes."
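MeLampy's examples all reduce to reconciling intended configuration with actual device state, which is exactly what lends itself to automation. A toy sketch, with invented setting names:

```python
def config_drift(intended: dict, actual: dict) -> dict:
    """Return the settings that must change to bring a device back to
    its intended configuration; remediation is then just applying them."""
    return {k: v for k, v in intended.items() if actual.get(k) != v}

# Hypothetical edge-router settings: two have drifted from intent.
intended = {"tx_power_dbm": 17, "channel": 36, "sla_latency_ms": 20}
actual   = {"tx_power_dbm": 20, "channel": 36, "sla_latency_ms": 50}
print(config_drift(intended, actual))
# {'tx_power_dbm': 17, 'sla_latency_ms': 20}
```

An automated controller runs this comparison continuously and pushes only the drifted settings, which is how "constantly changing configurations" stop requiring human labour.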
Self-optimisation and AI are needed to operate at the edge and determine how to handle change, Tung said. What, for instance, should happen if the network goes down, power goes out, or a camera is misaligned? And what should happen when the problem is fixed?
"The edge will not scale if these situations require manual interventions every time," she warns. Many such issues can be resolved by implementing rules that detect failure conditions and prioritise application deployment accordingly.
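The rule-based approach can be sketched as a simple condition-to-action table; the conditions and remediations below are invented examples matching the scenarios Tung mentions, not a real product's rule set.

```python
# Hypothetical remediation rules: each pairs a detected condition with an
# action, so common edge failures do not need manual intervention.
RULES = [
    ("network_down",      "buffer data locally and retry uplink"),
    ("power_restored",    "reboot node and resync state"),
    ("camera_misaligned", "redeploy calibration app at high priority"),
]

def remediate(event: str) -> str:
    for condition, action in RULES:
        if condition == event:
            return action
    return "escalate to operator"  # anything unmatched still needs a human

print(remediate("network_down"))  # buffer data locally and retry uplink
print(remediate("disk_full"))     # escalate to operator
```

The escalation default matters: autonomy at the edge is about handling the common cases automatically, not about removing operators entirely.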
The edge is not a single technology, but a collection of technologies working together to support an entirely new topology that can effortlessly connect data, AI, and actions, Tung said. "The biggest innovations are yet to come," she added.
Meanwhile, the pendulum is swinging toward more numerous but smaller network edge centres located closer to customer needs, complemented by larger cloud services that can handle additional workloads that are less time sensitive, less mission-critical, and less latency-sensitive, Howell said.
He noted that the one factor that remains immutable is that information must be highly available at all times: "This first rule of data centres has not changed — high quality services that are always available."