Edge Computing in IoT

Edge computing is a distributed information technology (IT) architecture in which client data is processed at the periphery of the network, as close to the originating source as possible. The move toward edge computing is driven by mobile computing, the decreasing cost of computer components and the sheer number of networked devices in the internet of things (IoT). Depending on the implementation, time-sensitive data in an edge computing architecture may be processed at the point of origin by an intelligent device or sent to an intermediary server located in close geographical proximity to the client. Data that is less time-sensitive is sent to the cloud for historical analysis, big data analytics and long-term storage.

Edge computing allows data produced by IoT devices to be processed closer to where it is created instead of being sent across long routes to data centers or clouds.

Doing this computing closer to the edge of the network lets organizations analyze important data in near real-time – a need of organizations across many industries, including manufacturing, healthcare, telecommunications and finance.

“In most scenarios, the presumption that everything will be in the cloud with a strong and stable fat pipe between the cloud and the edge device – that’s just not realistic,” says Helder Antunes, senior director of corporate strategic innovation at Cisco.

This on-device approach helps reduce latency for critical applications, lower dependence on the cloud, and better manage the massive deluge of data being generated by the IoT. An example of this trend is the recently announced Nest Cam IQ indoor security camera, which uses on-device vision processing to watch for motion, distinguish family members, and send alerts only if someone is not recognized or doesn’t fit pre-defined parameters. By performing computer vision tasks within the camera, Nest reduces the amount of bandwidth, cloud processing, and cloud storage used versus the alternative of sending raw streams of video over the network. In addition, on-device processing improves the speed of alerts while reducing chances of annoying, recurrent false alarms.

The ability to do advanced on-device processing and analytics is referred to as “edge computing.” Think of the “edge” as the universe of internet-connected devices and gateways sitting in the field — the counterpart to the “cloud.” Edge computing provides new possibilities in IoT applications, particularly for those relying on machine learning for tasks such as object detection, face recognition, language processing, and obstacle avoidance.

The rise of edge computing is an iteration of a well-known technology cycle that begins with centralized processing and then evolves into more distributed architectures. The internet itself started with a limited number of connected mainframes in government facilities and universities — it didn’t reach mass scale and affordability until “dumb” terminals that interfaced with mainframes were replaced by more capable PCs, which were able to render the graphics-rich pages of an emerging world wide web. Likewise, the mobile revolution largely accelerated when smartphones substituted feature phones at the edge of the cellular network. Edge computing will have a similar effect on the IoT, fueling strong ecosystem growth as end devices become more powerful and capable of running sophisticated applications.

Edge computing delivers tangible value in both consumer and industrial IoT use cases. It can help reduce connectivity costs by sending only the information that matters instead of raw streams of sensor data, which is particularly valuable on devices that connect via LTE/cellular such as smart meters or asset trackers. Also, when dealing with a massive amount of data produced by sensors in an industrial facility or a mining operation for instance, having the ability to analyze and filter the data before sending it can lead to huge savings in network and computing resources.

Security and privacy can also be improved with edge computing by keeping sensitive data within the device. For example, new retail advertising systems and digital signage are designed to deliver targeted ads and information based on key parameters set on field devices, such as demographic information. Edge computing in these solutions helps protect user privacy by anonymizing, analyzing, and keeping the data at the source rather than sending identifiable information to the cloud.

Processing at the edge also reduces latency and makes connected applications more responsive and robust. Avoiding device-to-cloud data round trips is critical for applications using computer vision or machine learning — for instance, an enterprise identity verification system or a drone tracking and filming its owner or an object. On-device machine learning can enhance natural language interfaces as well, allowing smart speakers to react more quickly by interpreting voice instructions locally and to run basic commands — such as turning lights on and off or adjusting thermostat settings — even if internet connectivity fails. Moreover, edge computing brings “future proofing” to these systems by allowing over-the-air updates for the device software and the list of local commands it can run.

The proliferation of machine learning for IoT applications is a powerful driver for increased edge compute capabilities. Devices not only need to run complex deep learning networks quickly, they need to do so while consuming very little power since many IoT devices run on battery. This is prompting adoption of heterogeneous compute architectures — integrating diverse engines such as CPUs, GPUs and DSPs — in IoT devices so that different workloads are assigned to the most efficient compute engine, thus improving performance and power efficiency. In fact, DSPs have shown a 25X improvement in energy efficiency and an 8X improvement in performance versus running the same workloads on a CPU.

With edge computing, the opportunity for system architects is to learn how to harness the benefits of the available distributed computing power from end to end — tapping into the capabilities of field devices, gateways, and cloud altogether. Edge devices are being created with increasingly sophisticated compute capabilities. Couple that with not-so-far-off advanced connectivity technologies such as 5G, which will deliver faster, more robust, and massive connectivity, and it becomes obvious that we are about to witness the emergence of a new breed of smart devices and applications. It’s truly a fascinating time to watch and participate in this space.

The Edge Computing Consortium identifies the following use cases and benefits for edge computing:

  • Predictive maintenance
    • Reducing costs
    • Security assurance
    • Product-to-service extension (new revenue streams)
  • Energy efficiency management
    • Lower energy consumption
    • Lower maintenance costs
    • Higher reliability
  • Smart manufacturing
    • Increasing customer demands are dramatically shortening product service life
      • Customization of production modes
      • Small-quantity and multi-batch modes are beginning to replace high-volume manufacturing
    • Flexible device replacement
      • Flexible adjustments to production plan
      • Rapid deployment of new processes and models

Transmitting massive amounts of raw data over a network puts tremendous load on network resources. In some cases, it is much more efficient to process data near its source and send only the data that has value over the network to a remote data center. Instead of continually broadcasting data about the oil level in a car’s engine, for example, an automotive sensor might simply send summary data to a remote server on a periodic basis. Or a smart thermostat might only transmit data if the temperature rises or falls outside acceptable limits. Or an intelligent Wi-Fi security camera aimed at an elevator door might use edge analytics and only transmit data when a certain percentage of pixels significantly change between two consecutive images, indicating motion.
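The thermostat and camera examples above boil down to simple edge-side filters. The sketch below illustrates both in Python; the thresholds (an acceptable temperature band, a per-pixel delta, a changed-pixel fraction) are illustrative assumptions, not values from any real product.

```python
def should_transmit_temperature(reading, low=18.0, high=26.0):
    """Transmit only when the temperature leaves the acceptable band.

    The band limits are illustrative; a real thermostat would load them
    from its configuration.
    """
    return reading < low or reading > high


def frame_changed(prev, curr, pixel_delta=30, changed_fraction=0.05):
    """Return True when enough pixels differ between two grayscale frames
    (flat lists of 0-255 intensities) to indicate motion worth uploading.
    """
    changed = sum(1 for p, c in zip(prev, curr) if abs(p - c) > pixel_delta)
    return changed / len(prev) > changed_fraction
```

Run at the edge, these checks let a device discard the steady-state readings that dominate raw sensor streams and send only the exceptions upstream.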

Edge computing can also benefit remote office/branch office (ROBO) environments and organizations that have a geographically dispersed user base. In such a scenario, intermediary micro data centers or high-performance servers can be installed at remote locations to replicate cloud services locally, improving performance and the ability for a device to act upon perishable data in fractions of a second. Depending upon the vendor and technical implementation, the intermediary may be referred to by one of several names including edge gateway, base station, hub, cloudlet or aggregator.

A major benefit of edge computing is that it improves time to action and reduces response time to milliseconds, while also conserving network resources. Edge computing is not expected to replace cloud computing, however. Despite its ability to reduce latency and network bottlenecks, edge computing can pose significant security, licensing and configuration challenges.

Security challenges: Edge computing’s distributed architecture increases the number of attack vectors. The more intelligence an edge client has, the more vulnerable it becomes to malware infections and security exploits.

Licensing challenges: Smart clients can have hidden licensing costs. While the base version of an edge client might initially have a low ticket price, additional functionalities may be licensed separately and drive the price up.

Configuration challenges: Unless device management is centralized and robust, administrators may inadvertently create security holes by failing to change the default password on each edge device or neglecting to update firmware in a consistent manner, causing configuration drift.

The name “edge” in edge computing is derived from network diagrams; typically, the edge in a network diagram signifies the point at which traffic enters or exits the network. The edge is also the point at which the underlying protocol for transporting data may change. For example, a smart sensor might use a lightweight messaging protocol such as MQTT to transmit data to a message broker located on the network edge, and the broker would use the hypertext transfer protocol (HTTP) to transmit valuable data from the sensor to a remote server over the internet.
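The protocol translation at the edge can be sketched as a small bridging function. This is a minimal illustration, not a real broker: the topic name, payload fields and threshold are hypothetical, and in a real deployment the function would sit inside an MQTT `on_message` callback with the returned body posted by an HTTP client.

```python
import json


def bridge_message(topic, payload, min_value=50.0):
    """Decide whether an MQTT sensor message is worth forwarding over HTTP.

    Takes the MQTT topic and a JSON payload from the sensor and returns
    the JSON body for an HTTP POST to the remote server, or None when the
    reading is below the (illustrative) threshold and should stay at the
    edge.
    """
    reading = json.loads(payload)
    if reading.get("value", 0.0) < min_value:
        return None  # below threshold: keep the data at the edge
    return json.dumps({"source": topic, "value": reading["value"]})
```

The broker thus acts as both a protocol gateway (MQTT in, HTTP out) and a filter that keeps low-value traffic off the internet link.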

The OpenFog consortium uses the term fog computing to describe edge computing. The word “fog” is meant to convey the idea that the advantages of cloud computing should be brought closer to the data source. (In meteorology, fog is simply a cloud that is close to the ground.) Consortium members include Cisco, ARM, Microsoft, Dell, Intel and Princeton University.

Edge Computing Software:

As part of GE’s own digital transformation journey, it created Predix—an applications and services platform developed for industrial companies by an industrial company, spanning cloud and edge devices to improve productivity, deliver higher uptime, and drive down costs.

GE Digital’s Predix Machine software unlocks new value from machine-based insights that drive increased operational agility while reducing risk and preserving machine investments. It is designed to connect, run, and manage applications in close proximity to the source of the data—the physical industrial machines and assets.

Predix Machine allows industrial companies to track, manage, and communicate with all network edge devices anytime, anywhere. In concert with Predix Cloud, Predix Machine allows for advanced monitoring and diagnostics, machine performance optimization, proactive maintenance, and operational intelligence. This gives industrials the flexibility to manage and process machine data wherever it makes the most sense for optimal operation—at the edge, in the cloud or a combination of the two.

Predix Machine can process and route data to Predix Cloud with reliable and secure cloud connectivity, enabling cloud applications to process, analyze, and act on data generated from connected devices without having to manage any infrastructure. Predix Machine meets unique security, privacy, and data governance regulations and policies for companies around the world.

Edge computing is a “mesh network of micro data centers that process or store critical data locally and push all received data to a central data center or cloud storage repository, in a footprint of less than 100 square feet,” according to research firm IDC.

Edge computing typically comes up in IoT use cases, where edge devices collect data – sometimes massive amounts of it – and would otherwise send it all to a data center or cloud for processing. Edge computing triages the data locally, processing some of it on-site and reducing the backhaul traffic to the central repository.

Typically, this is done by the IoT devices transferring the data to a local device that includes compute, storage and network connectivity in a small form factor. Data is processed at the edge, and all or a portion of it is sent to the central processing or storage repository in a corporate data center, co-location facility or IaaS cloud.

Edge computing security

There are two sides to the edge computing security coin. Some argue that security is theoretically better in an edge computing environment because data does not travel over a network and stays closer to where it was created. The less data in a corporate data center or cloud environment, the less data there is to be vulnerable if one of those environments is compromised.

The flip side is that some believe edge computing is inherently less secure because the edge devices themselves can be more vulnerable. In designing any edge or fog computing deployment, therefore, security must be paramount. Data encryption, access control and use of virtual private network tunneling are important elements in protecting edge computing systems.

Edge computing terms and definitions

Like most technology areas, edge computing has its own lexicon. Here are brief definitions of some of the more commonly used terms:

  • Edge devices: Any device that produces or collects data, such as a sensor or an industrial machine.
  • Edge: What the edge is depends on the use case. In telecommunications, the edge might be a cell phone or a cell tower. In an automotive scenario, the edge of the network could be a car. In manufacturing, it could be a machine on a shop floor; in enterprise IT, the edge could be a laptop.
  • Edge gateway: A gateway is the buffer between where edge computing processing is done and the broader fog network. The gateway is the window into the larger environment beyond the edge of the network.
  • Fat client: Software that can do some data processing in edge devices. This is opposed to a thin client, which would merely transfer data.
  • Edge computing equipment: Edge computing uses a range of existing and new equipment. Many devices, sensors and machines can be outfitted to work in an edge computing environment by simply making them internet-accessible. Cisco and other hardware vendors offer lines of ruggedized network equipment with hardened exteriors meant for field environments. A range of compute servers, converged systems and even storage-based hardware systems like Amazon Web Services’ Snowball can be used in edge computing deployments.
  • Mobile edge computing: This refers to the buildout of edge computing systems in telecommunications systems, particularly 5G scenarios.

Three motivating factors for using Edge Computing:

We have pinpointed three main motivating factors for using Edge Computing:

  1. Preserve privacy

Data captured by IoT devices can contain sensitive or private information, e.g., GPS data, streams from cameras, or microphones. While an application might want to use this information to run complex analytics in the Cloud, it is important that, whenever data leaves the premises where it is generated, the privacy of sensitive content is preserved. With Edge Computing, an application can make sure that sensitive data is pre-processed on-site, and only data that is privacy compliant is sent to the Cloud for further analysis, after having passed through a first layer of anonymizing aggregation.
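One common form of the anonymizing aggregation described above is coarsening location data before it leaves the premises. The sketch below is an illustrative Python example, not a standard algorithm: device identifiers are dropped and GPS fixes are snapped to a coarse grid, so only per-cell counts are shipped to the Cloud.

```python
from collections import Counter


def anonymize_gps(fixes, cell_size=0.01):
    """Aggregate raw (device_id, lat, lon) fixes into per-cell counts.

    Device identifiers are discarded and coordinates are snapped to a
    grid of `cell_size` degrees (an illustrative resolution), so only
    aggregate occupancy counts leave the edge.
    """
    cells = Counter()
    for _device_id, lat, lon in fixes:
        cell = (round(round(lat / cell_size) * cell_size, 6),
                round(round(lon / cell_size) * cell_size, 6))
        cells[cell] += 1
    return dict(cells)
```

The Cloud side still gets data it can analyze (how many devices were where), while the privacy-sensitive specifics — who, and exactly where — never cross the network.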

  2. Reduce latency

The power and flexibility of Cloud computing has enabled many scenarios that were impossible before. Think about how the accuracy of image or voice recognition algorithms has improved in recent years. However, this accuracy has a price: the time needed to get an image or a piece of audio recognized is significantly affected by the non-negligible yet unavoidable network delays due to data being shipped to the Cloud and results computed and sent back to the edge. When low-latency results are needed, Edge Computing applications can implement machine-learning algorithms that run directly on IoT devices, and only interact with the Cloud off the critical path, for example, to continuously train machine learning models using captured data.
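The "interact with the Cloud off the critical path" pattern can be sketched as follows. This is a toy illustration under stated assumptions: the threshold rule stands in for a compact on-device model, and the queue stands in for samples uploaded later (for example, in a nightly batch) to retrain that model in the Cloud; none of the names come from a real framework.

```python
class EdgeClassifier:
    """Toy local model: answers immediately, queues samples for the Cloud."""

    def __init__(self, threshold=0.5):
        self.threshold = threshold
        self.pending_training_data = []

    def predict(self, features):
        # Low-latency local decision: no network round trip.
        score = sum(features) / len(features)
        # Retain the raw sample for later Cloud-side training.
        self.pending_training_data.append(features)
        return "positive" if score > self.threshold else "negative"

    def drain_for_cloud(self):
        # Called off the critical path to upload accumulated samples.
        batch, self.pending_training_data = self.pending_training_data, []
        return batch
```

The user-facing `predict` call never waits on the network; only the background `drain_for_cloud` step does, which is exactly the separation the paragraph above describes.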

  3. Be robust to connectivity issues

Designing applications to run part of the computation directly on the Edge not only reduces latency, but potentially ensures that applications are not disrupted in case of limited or intermittent network connectivity. This can be very useful when applications are deployed in remote locations where network coverage is poor, or to reduce costs coming from expensive connectivity technologies such as cellular.
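A common way to achieve this robustness is a local store-and-forward buffer. The sketch below is a minimal, hypothetical illustration: `delivered` stands in for whatever uplink the deployment actually uses (cellular, Wi-Fi), and here it simply records what would have been sent.

```python
class ResilientSender:
    """Buffer readings locally while the link is down; flush on reconnect."""

    def __init__(self):
        self.buffer = []      # readings awaiting delivery
        self.delivered = []   # stand-in for the remote endpoint
        self.online = False

    def submit(self, reading):
        self.buffer.append(reading)
        if self.online:
            self.flush()

    def flush(self):
        # Drain the buffer in arrival order while connectivity holds.
        while self.buffer:
            self.delivered.append(self.buffer.pop(0))

    def set_online(self, online):
        self.online = online
        if online:
            self.flush()
```

While the link is down the application keeps running and accumulating readings; when connectivity returns, the backlog is forwarded in order, so no data is lost to the outage.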

Figure 1: An example of Edge Computing architecture

MeenaG Staff

Internet of Things Enthusiast
