EDGE COMPUTING
Cloud computing involves accessing hosted services like data storage, servers, databases,
networking, and software through the internet. These services are managed on physical servers
maintained by a cloud provider. It enables users to utilize computing resources, particularly
storage and processing power, on demand without needing to oversee the underlying
infrastructure.
Limitations of Cloud Computing:
Latency: In the traditional cloud computing model, applications send data to the data center and
wait for a response, which increases system latency. For example, high-speed autonomous
vehicles require response times of a few milliseconds.
Bandwidth: Transmitting the large amounts of data generated by edge devices to the cloud in real
time places great pressure on network bandwidth.
Availability: As more and more Internet services are deployed on the cloud, the availability of
these services has become an integral part of daily life. It is therefore a major challenge for cloud
service providers to keep the 24/7 availability promise.
Energy: With the increasing amount of computation and transmission, energy consumption will
become a bottleneck restricting the development of cloud computing centers.
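The bandwidth limitation above can be made concrete with a back-of-envelope calculation. All figures below are illustrative assumptions, not measurements:

```python
# Rough illustration of the bandwidth limitation of pure cloud computing.
# All numbers are illustrative assumptions, not measurements.

CAMERA_BITRATE_MBPS = 4        # one HD video stream
NUM_CAMERAS = 1_000            # cameras in a hypothetical city deployment
UPLINK_CAPACITY_MBPS = 1_000   # assumed uplink to the cloud data center

total_demand_mbps = CAMERA_BITRATE_MBPS * NUM_CAMERAS
print(f"Aggregate upstream demand: {total_demand_mbps} Mbps")
print(f"Uplink utilization: {total_demand_mbps / UPLINK_CAPACITY_MBPS:.0%}")
# The link is oversubscribed 4x, which is why raw streams cannot all be
# shipped to the cloud in real time and some processing must move to the edge.
```

Even with these modest assumed numbers, the aggregate demand exceeds the uplink several times over, motivating the edge computing approach described next.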
Edge Computing
Edge computing is a distributed computing paradigm in which data processing happens closer to
its source, such as sensors or IoT devices, rather than relying solely on a centralized cloud
server. By minimizing data transmission distances, it enables faster response times and lower
latency.
Edge computing architecture moves computing resources from clouds and data centers as close
as possible to the originating source. The main goals of edge computing are to reduce latency
during data processing and to save network costs.
At its simplest, edge computing brings computing resources, data storage, and enterprise
applications closer to where people actually consume the information. This reduces latency,
enhances efficiency, and improves data security by minimizing the need for long-distance data
transmission. Unlike traditional cloud computing, edge architecture decentralizes processing
tasks. It is particularly beneficial in scenarios requiring real-time data analysis.
Edge computing is driven by several key factors that make it essential for modern
digital infrastructure:
1. Low Latency & Real-Time Processing
Applications like autonomous vehicles, industrial automation, gaming, and augmented
reality (AR) require real-time decision-making. Edge computing reduces the delay by
processing data closer to the source instead of sending it to distant cloud data centers.
2. Bandwidth Optimization
With the explosion of IoT devices and high-definition video streaming, transmitting all
data to the cloud is costly and inefficient. Edge computing filters and processes only the
most relevant data locally, reducing bandwidth usage.
3. Reliability & Resilience
Edge computing ensures uninterrupted operations even in cases of poor or intermittent
connectivity. This is crucial in remote locations, factories, and mission-critical
applications where cloud access might not always be available.
4. Data Privacy & Security
Processing sensitive data locally reduces exposure to cyber threats and minimizes
regulatory compliance risks, as it avoids transmitting personal or confidential information
over networks.
5. Scalability & Cost Efficiency
Instead of continuously upgrading central cloud servers, organizations can distribute
processing power across edge devices, making scaling more efficient and cost-effective.
6. Growth of IoT & AI Workloads
The rise of smart devices, industrial IoT, and AI applications requires edge computing for
faster insights and automated actions without relying on centralized data centers.
7. 5G Deployment
5G networks complement edge computing by enabling ultra-fast data transfer, making
real-time processing at the edge more viable for applications like smart cities, connected
healthcare, and remote monitoring.
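The bandwidth-optimization idea in point 2 (process locally, forward only what matters) can be sketched as follows; the threshold and sensor values are hypothetical:

```python
# Minimal sketch of edge-side filtering: process sensor readings locally and
# forward only the relevant (anomalous) ones to the cloud.
# The alert threshold and the sample values are hypothetical.

def filter_at_edge(readings, threshold=75.0):
    """Return only the readings that exceed the alert threshold."""
    return [r for r in readings if r > threshold]

# Simulated temperature samples captured by an edge sensor.
samples = [68.2, 69.1, 70.4, 82.7, 69.8, 91.3, 70.0]

to_cloud = filter_at_edge(samples)
print(f"Captured {len(samples)} samples, uplinked {len(to_cloud)}")
print("Forwarded to cloud:", to_cloud)  # only the anomalous readings
```

Here most samples never leave the edge device, so bandwidth usage scales with the number of anomalies rather than the raw sampling rate.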
Key Scenarios for Edge Computing
Edge computing is revolutionizing industries by enabling real-time data processing closer to the
source. Here are some key scenarios where it plays a critical role:
1. Smart Manufacturing & Industrial IoT (IIoT)
Predictive Maintenance: Sensors analyze equipment performance in real time to prevent
failures.
Quality Control: AI-driven vision systems detect defects on production lines.
Process Automation: Edge computing optimizes factory operations for efficiency and
cost savings.
2. Autonomous Vehicles & Transportation
Real-Time Navigation & Safety: Edge processing enables instant decision-making in
self-driving cars.
3. Smart Cities & Infrastructure
Traffic Management: AI at the edge optimizes signal timing based on real-time
congestion data.
Public Safety & Surveillance: Edge-powered cameras detect security threats instantly.
Environmental Monitoring: IoT sensors analyze air quality and noise pollution for
urban planning.
4. Healthcare & Remote Patient Monitoring
Wearable Devices & Telemedicine: Edge AI enables real-time health tracking and
emergency alerts.
Smart Hospitals: Edge computing enhances medical imaging and patient data processing
for faster diagnosis.
5. Retail & Customer Experience
Smart Checkout & Inventory Management: Edge devices enable cashier-less shopping
and automated restocking.
Personalized Shopping Experiences: AI-driven edge analytics tailor promotions to
customers in real time.
6. Energy & Utilities
Smart Grid Optimization: Edge analytics balance energy supply and demand
efficiently.
Oil & Gas Monitoring: Edge sensors detect leaks, optimizing maintenance in remote
areas.
7. Agriculture & Precision Farming
Smart Irrigation: Edge sensors optimize water usage based on real-time soil and
weather data.
Livestock Monitoring: Wearable sensors track animal health and behavior.
8. Gaming & Augmented Reality (AR) / Virtual Reality (VR)
Cloud Gaming: Reduces lag by processing data on edge servers near players.
Immersive AR/VR Experiences: Enables seamless, real-time interaction in gaming and
training applications.
EDGE ARCHITECTURE
Edge computing architecture is typically structured into three layers. The cloud layer is
responsible for extensive data processing, storage, and management. The edge layer
processes data closer to the source, enabling near real-time responses and reducing
latency. Lastly, the device layer consists of sensors and devices that collect data and
handle basic processing tasks.
Edge architecture consists of multiple components, each essential for efficiently managing and
processing data at a local level.
Edge Devices refer to IoT devices, sensors, and other hardware that generate data at the
network’s edge.
Edge Nodes serve as local processing units, performing initial data processing and analysis
before transmitting relevant information to central servers.
Edge Servers are strategically positioned near edge devices to locally store and process data,
minimizing the need for long-distance data transmission to centralized data centers.
Edge Gateways act as intermediaries, regulating data traffic between edge devices and edge
servers to optimize data flow and processing.
Communication Networks provide the necessary connectivity between edge devices, nodes,
and servers, enabling smooth data transfer and system integration.
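As a rough illustration of how these components cooperate, the sketch below passes a reading from an edge device through a gateway to an edge server, which forwards only significant data to the cloud. The class and function names are illustrative assumptions, not a standard API:

```python
# Hedged sketch of the component pipeline: device payload -> gateway
# (protocol/schema translation) -> edge server (local processing, selective
# forwarding to the cloud). Names and the threshold are illustrative.

from dataclasses import dataclass

@dataclass
class Reading:
    device_id: str
    value: float

def gateway_normalize(raw: dict) -> Reading:
    # An edge gateway translates a device's native payload into a common schema.
    return Reading(device_id=raw["id"], value=float(raw["val"]))

def edge_server_handle(reading: Reading, cloud_queue: list, threshold: float = 100.0):
    # The edge server processes data locally; only significant readings
    # incur long-distance transmission to the cloud layer.
    if reading.value > threshold:
        cloud_queue.append(reading)
    return reading

cloud = []
for raw in [{"id": "sensor-1", "val": "42.0"}, {"id": "sensor-2", "val": "120.5"}]:
    edge_server_handle(gateway_normalize(raw), cloud)

print(f"{len(cloud)} of 2 readings sent to the cloud")  # only the outlier
```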
CLOUD LAYER:
While edge computing was developed to tackle network congestion and latency issues often
associated with cloud computing, the cloud still plays a crucial role within the overall edge
computing architecture. Rather than replacing each other, cloud and edge computing work
together as complementary technologies. When necessary, edge servers transfer data to the cloud
for more complex computations. Additionally, critical data is also sent to the cloud for storage
and in-depth analysis, highlighting the seamless integration between the cloud and edge layers.
EDGE LAYER:
The edge layer primarily consists of edge servers, which are more numerous and widely
distributed compared to those in the cloud layer. This broad deployment allows the edge layer to
process data closer to its source, effectively reducing latency issues commonly associated with
cloud computing. This layer plays a central role in handling data from the device layer before
transmitting it to the cloud for further processing and analysis. In cases where the edge layer
cannot fully process certain data, it is sent to the cloud to ensure comprehensive analysis and
data integrity.
DEVICE LAYER:
Among the three layers, the device layer consists of the largest number of devices. These range
in size from small personal devices like mobile phones and computers to large-scale systems
such as buses and industrial machinery. All these devices play a key role in the device layer by
using sensors to collect and capture data that help products fulfill their intended functions. For
example, hospital equipment monitors patients' vital signs, while autonomous vehicles gather
data on surrounding traffic. While the cloud and edge layers have greater computing power,
devices in the device layer can still handle basic data analysis, processing, and storage tasks that
require minimal computing resources. Additionally, they process data closest to the source,
enabling near real-time responses.
Each layer of the architecture manages data flow and computing demands in a unique way.
Device Layer (Edge Devices)
This tier includes the end-user and IoT devices that generate data. The sensors, cameras,
and embedded processors on these devices allow for simple data collection and analysis.
Examples include wearable technology, smart cameras, industrial robots, IoT sensors, and
driverless cars.
Key functions: data generation, basic processing, and transmission to edge nodes.
Edge Layer (Edge Nodes & Gateways)
This layer acts as a bridge between cloud infrastructure and edge devices, processing, filtering,
and aggregating data in real time before forwarding pertinent information to the cloud.
Edge Nodes: local computing units that process and analyze data at the edge of the network;
they are frequently embedded systems or small-scale servers.
Edge Gateways: These manage protocol translation, data routing, and security between edge
servers and edge devices.
Examples of hardware include ruggedized IoT gateways, industrial PCs, and single-board
computers such as the Raspberry Pi and NVIDIA Jetson.
Edge Servers (Mini Data Centers)
Positioned nearer to the data source, edge servers offer high-performance processing power. They
minimize the need for frequent cloud communication by processing and storing important data locally.
Key functions: workload distribution, caching, real-time analytics, and AI inference.
Examples of hardware include dedicated AI accelerators (Google Coral, Intel Movidius), edge-
optimized GPUs (NVIDIA Edge AI), and mini data centers.
Cloud/Data Center Layer
Even though edge computing decreases the need for centralized cloud servers, the cloud remains
useful for complex computing jobs, long-term storage, and large-scale analytics.
Key functions: training deep learning models, data backup, and regulatory compliance.
Examples of hardware include distributed storage systems, high-performance GPUs, and
hyperscale cloud data centers.
The four layers and their primary responsibilities can be summarized as follows:
Cloud/Data Center Layer: long-term storage, deep analytics, AI model training, large-scale processing.
Edge Server Layer: local storage, AI inference, real-time analytics, caching, workload distribution.
Edge Nodes & Gateway Layer: initial processing, filtering, aggregation, protocol translation, security management.
Device Layer: IoT sensors, cameras, autonomous vehicles, industrial robots, wearables, edge AI chips.
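The layer responsibilities summarized above can be sketched as a simple placement rule: route each workload to the lowest layer capable of handling it. The task categories and capability sets are illustrative assumptions:

```python
# Sketch of per-layer workload placement: each task runs at the lowest layer
# (device < edge < cloud) that supports it. The capability sets below are
# illustrative assumptions drawn from the layer summary, not a real scheduler.

LAYER_CAPABILITIES = {
    "device": {"data-generation", "basic-filtering"},
    "edge":   {"ai-inference", "real-time-analytics", "caching"},
    "cloud":  {"model-training", "long-term-storage", "deep-analytics"},
}

def place_workload(task: str) -> str:
    """Return the lowest layer that supports the given task."""
    for layer in ("device", "edge", "cloud"):
        if task in LAYER_CAPABILITIES[layer]:
            return layer
    raise ValueError(f"unknown task: {task}")

print(place_workload("basic-filtering"))  # device
print(place_workload("ai-inference"))     # edge
print(place_workload("model-training"))   # cloud
```

Real deployments also weigh latency budgets, device load, and connectivity, but the lowest-capable-layer rule captures the intuition behind the tiered architecture.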
Key Factors for Edge Computing
Computing Performance: Utilizes AI accelerators, GPUs, and CPUs designed for fast,
low-latency processing.
Network Connectivity: Supports multiple communication protocols such as Wi-Fi, 5G,
Ethernet, and LPWAN to ensure reliable data transmission.
Power Efficiency: Features energy-efficient processors and advanced cooling systems to
optimize performance in edge environments.
Security Measures: Incorporates hardware-based encryption, Trusted Platform Modules
(TPM), and secure boot mechanisms to protect data and devices.