Cloud computing has two meanings. The most common refers to running workloads remotely over the internet in a third-party provider’s data center, also known as the “public cloud” model. The second definition describes how it works: a virtualized pool of resources, from raw processing power to application features, available on demand. Cloud computing depends on resource sharing to achieve coherence and cost-effectiveness. Providers typically charge on a “pay as you go” basis, which can lead to unexpected operating costs if administrators are not familiar with cloud pricing models. The development of high-capacity networks, low-cost computers and storage devices, and the widespread adoption of hardware virtualization, service-oriented architecture, and autonomic and utility computing have driven the expansion of cloud computing.
All of these can be considered the roots of cloud computing. The Cloud Service Provider (CSP) screens, stores, and collects firewall data, intrusion detection or prevention alerts, and information streams within the network. Although service-oriented architecture advocates “everything as a service,” cloud providers offer their “services” according to different models. Cloud architecture, the system architecture of the software systems used in cloud computing delivery, usually includes several cloud components that communicate with each other through a loose-coupling mechanism, such as a messaging queue. Elastic provisioning implies intelligence in the choice of tight or loose coupling as applied to mechanisms such as these.
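The loose coupling described above can be sketched in a few lines. In this hypothetical example, two cloud components (the service names are illustrative) communicate only through a shared message queue, so either side can be replaced or scaled without the other knowing:

```python
import queue
import threading

# A minimal sketch of loose coupling between two cloud components:
# the producer and consumer share only the queue, never each other's internals.
message_queue = queue.Queue()

def frontend_service():
    # Publishes work items without knowing who will process them.
    for order_id in range(3):
        message_queue.put({"order_id": order_id})
    message_queue.put(None)  # sentinel: no more messages

def billing_service(results):
    # Consumes messages at its own pace; the queue absorbs bursts.
    while True:
        msg = message_queue.get()
        if msg is None:
            break
        results.append(f"billed order {msg['order_id']}")

results = []
consumer = threading.Thread(target=billing_service, args=(results,))
consumer.start()
frontend_service()
consumer.join()
print(results)  # → ['billed order 0', 'billed order 1', 'billed order 2']
```

In a real deployment the in-process queue would be a managed messaging service, but the decoupling principle is the same.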
Everything changed when cloud computing entered the picture. Amazon began experimenting with cloud computing technologies as early as 1999, later launching Amazon Web Services (AWS) and its Elastic Compute Cloud (EC2). Around 2005, companies realized that purchasing pooled computing services from the cloud made more economic sense: it required fewer man-hours and less effort while delivering more power at lower cost. Cloud computing has been around for quite a long time now, and the way companies use it has evolved dramatically. It began as a simple hosting service and grew into IaaS, PaaS, and SaaS. Today, broad-scale cloud computing is open to everyone.
A. EDGE COMPUTING
Edge computing is a networking concept that seeks to bring computation as close to the data source as possible to reduce latency and bandwidth usage. Put simply, edge computing means running fewer processes in the cloud and moving those processes to local locations, such as a user’s computer, an IoT device, or an edge server. Bringing computation to the edge of the network minimizes the amount of long-distance communication between a client and a server. Cloud computing operates on big data, while edge computing operates on “instant data” created by sensors or users in real time. Speed and latency, stability, cost savings, greater reliability, and scalability are some of the major benefits of edge computing.
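The bandwidth saving above can be sketched concretely. In this hypothetical example (the sensor values and threshold are illustrative), an edge node aggregates raw “instant data” locally and ships only a compact summary to the cloud instead of every reading:

```python
# A minimal sketch of edge-side preprocessing: aggregate locally,
# transmit only a summary and the anomalous readings.
raw_readings = [21.0, 21.2, 35.9, 21.1, 21.3, 36.2]  # e.g. temperatures in C

def process_at_edge(readings, threshold=30.0):
    # Keep only anomalous values; summarize everything else.
    anomalies = [r for r in readings if r > threshold]
    return {
        "count": len(readings),
        "mean": round(sum(readings) / len(readings), 2),
        "anomalies": anomalies,
    }

payload = process_at_edge(raw_readings)
# The cloud receives one small record instead of six raw data points.
print(payload)  # → {'count': 6, 'mean': 26.12, 'anomalies': [35.9, 36.2]}
```

The same pattern scales down the long-distance traffic whether the edge node is a user’s computer, an IoT gateway, or an edge server.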
B. FOG COMPUTING
Fog computing or fog networking, also termed fogging, is an architecture that uses edge devices to carry out a substantial amount of computation, storage, and communication both locally and over the internet backbone. Fog computing can be viewed in relation to both large cloud systems and big data structures, and it highlights the growing difficulty of accessing information objectively, which can degrade the quality of the obtained content. The effects of fog computing on cloud computing and big data systems can therefore vary considerably.
The architecture of fog and edge computing reflects their properties, differences, and hardware requirements. The two paradigms do not have widely accepted definitions, although there are several scattered examples. This confusion likely stems from the fact that both architectures share the same goal: bringing computation closer to the data source. The need for low latency forced a shift away from the classic cloud architecture, in which data are sent to geographically distant locations for processing, thereby raising network delay.
In the edge computing architecture, data is generated and processed inside the same device, or in an external device at the edge of the same network. In the fog computing model, by contrast, data is processed outside the network, but at a geographical position very close to the origin of the data.
Edge/Fog Hardware. Edge computing involves three types of devices: devices producing raw data, devices receiving computed data, and devices providing computational power or other services. The fog paradigm, by contrast, includes only servers, as all data are generated and consumed in the levels below:
1) Edge Devices: These devices can be categorized into three main classes: constrained devices, single-board computers, and mobiles. The constrained-device class covers any IoT device used for different purposes, such as domotics, surveillance, automotive, and many more. Such devices do not support virtualization technologies, because of production costs and physical constraints such as size, weight, available power, and energy. Single-board computers provide complete support for the virtualization of both hardware and applications. Mobile devices, although their architecture offers complete virtualization support, do not yet provide any implementation that exploits it.
2) Edge/Fog Servers: The systems that can serve as fog and edge servers fall into three major groups: general-purpose servers, modern solutions specifically designed to Edge/Fog specifications, and solutions designed for other applications. Current Edge/Fog servers, known as cloudlets, are miniaturized versions of cloud servers, primarily hierarchical CPU-based systems with one or more GPU coprocessors.
| Sl. No. | Parameter | Cloud Computing | Fog Computing |
| --- | --- | --- | --- |
| 1 | Awareness of location | No | Yes |
| 2 | Security issues | Not defined | Defined |
| 4 | Server nodes | Very few | Large |
- FOG COMPUTING: Cloud computing is vulnerable to many security threats, mainly because of its computing framework and centralized data storage; security has become a crucial issue that restricts its development. Since fog computing inherits many features of the cloud, it inherits its risks as well, and cannot be considered entirely secure.
There are many threats in fog computing that may be exploited by attackers:
- FORGERY – network attackers forge identities and generate fake information, which may harm the system through falsified data packets.
- SPAM – attackers create and spread unwanted data, including redundant information and fake data, which may lead to misleading contacts.
- EAVESDROPPING – attackers gain control over transmission channels to read or listen to content without the user’s consent.
- DENIAL OF SERVICE – a network attacker sends fake data to fog nodes and floods them with a large number of fake requests to make them unavailable to genuine users. These intrusions consume network resources such as battery and bandwidth, degrading performance.
- JAMMING – network attackers flood the transmission channels with a large volume of data packets to jam or consume them and thereby restrict legitimate users.
- IMPERSONATION – a network attacker behaves like a genuine server and provides fake or malicious services to users by posing as a genuine fog node or server.
- EDGE COMPUTING: Like fog computing, edge computing also has security problems such as eavesdropping and denial of service (discussed above under fog computing), as well as data-tampering attacks, in which an attacker tampers with data sent during communication or kept in storage. Other concerns include:
- Weak protection credentials render devices vulnerable to unauthorized users.
- Unsafe interaction among clients.
- Restoration and backup of data during device shutdown.
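One common first line of defense against the flooding attacks listed above (denial of service and jamming) is per-client rate limiting at the fog node. The sketch below is illustrative rather than a production defense; the limit and window values are assumptions:

```python
import time

# A minimal sketch of per-client throttling at a fog node: each client
# gets a fixed budget of requests per time window; excess requests are
# rejected, so a flood from one client cannot exhaust the node.
class RequestThrottle:
    def __init__(self, limit, window_seconds):
        self.limit = limit
        self.window = window_seconds
        self.seen = {}  # client_id -> (window_start, count)

    def allow(self, client_id, now=None):
        now = time.monotonic() if now is None else now
        start, count = self.seen.get(client_id, (now, 0))
        if now - start >= self.window:   # window expired: reset the budget
            start, count = now, 0
        if count >= self.limit:          # budget exhausted: likely a flood
            return False
        self.seen[client_id] = (start, count + 1)
        return True

throttle = RequestThrottle(limit=3, window_seconds=1.0)
decisions = [throttle.allow("attacker", now=0.0) for _ in range(5)]
print(decisions)  # → [True, True, True, False, False]
```

Because budgets are tracked per client, legitimate users keep their own allowance even while an attacker is being throttled; real deployments would combine this with authentication to stop impersonation of client identities.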
PRIVACY IN FOG COMPUTING
Privacy is a serious issue in fog computing because users’ data is collected, processed, transmitted, and shared over fog nodes. A user’s privacy includes:
- Identity Privacy – a user’s identity includes basic attributes such as name, address, mobile number, visa ID, etc., and these identities are vulnerable to exposure because users must provide this information to authenticate with fog nodes.
- Data Privacy – various kinds of vital user information, e.g., address and preferences, may be exposed to an untrusted party while communicating over fog nodes.
- Usage Privacy – the pattern in which a customer uses fog services. For example, a user’s sleeping time, or the times when the user is not at home, can be revealed by smart-meter readings, which violates the user’s privacy.
- Location Privacy – most applications use users’ current and saved locations, and users sometimes have to sacrifice location privacy to enjoy the best services.
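A common mitigation for the identity-privacy risk above is pseudonymization: the device replaces the user’s real identifier with a salted hash before anything leaves the device, so fog nodes can correlate a user’s requests without learning who the user is. The identifier and salt below are purely illustrative:

```python
import hashlib

# A minimal sketch of device-side pseudonymization: the real identity
# never leaves the device; fog nodes see only a stable pseudonym.
def pseudonymize(identity: str, salt: str) -> str:
    digest = hashlib.sha256((salt + identity).encode("utf-8")).hexdigest()
    return digest[:16]  # a short prefix is enough for correlation

record = {
    "user": pseudonymize("alice@example.com", salt="device-local-secret"),
    "reading_kwh": 1.7,  # usage data still flows; the identity does not
}
print(record)
```

Note that pseudonymization protects identity privacy but not usage privacy: the smart-meter pattern itself can still reveal habits, which calls for separate defenses such as aggregation or noise addition.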
It is difficult to manage everything with the cloud: latency, mobility, geographic distribution, network bandwidth, reliability, security, and privacy are all at stake. Nor can everything run at the edge on smart endpoints, because of energy, space, capacity, environmental, reliability, modularity, and security challenges.
Fog computing addresses these gaps by bridging the cloud-to-thing continuum. It distributes computing, communication, control, storage, and decision making closer to where data originates, enabling dramatically faster processing and lowering network costs. Monitoring services are also used in the cloud platform to keep those costs under control.
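The cloud-to-thing continuum can be sketched as a simple placement decision: run each task at the farthest (most powerful) tier whose typical round-trip latency still fits the task’s budget. The tier latencies below are illustrative assumptions, not measurements:

```python
# A minimal sketch of tiered workload placement across the
# edge -> fog -> cloud continuum. Latencies are illustrative (ms).
TIERS = [
    ("edge", 5),     # on-device / local network
    ("fog", 50),     # nearby servers, one or two hops away
    ("cloud", 500),  # distant data center over the internet
]

def place(latency_budget_ms):
    # Pick the farthest tier that still meets the task's latency budget;
    # fall back to the edge for the tightest real-time constraints.
    chosen = "edge"
    for tier, typical_latency in TIERS:
        if typical_latency <= latency_budget_ms:
            chosen = tier
    return chosen

print(place(10))    # real-time control loop        → edge
print(place(100))   # interactive analytics         → fog
print(place(1000))  # batch / big-data processing   → cloud
```

This mirrors the division of labor described above: latency-tolerant big-data workloads flow to the cloud, while “instant data” stays at the edge or in the fog.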