Edge Computing Platform - What is Edge Computing? - CDNetworks

Edge Computing Platform

Deliver an optimized, high-speed user experience

Edge computing guarantees extremely low latency and high-bandwidth, high-performance computing.

CDNetworks' Edge Computing Platform (ECP) lets customers easily deploy and scale container-based applications to meet growing business requirements, delivering computing with extremely low latency and high bandwidth and performance. ECP places high-performance computing, storage and network resources as close to end users as possible, which reduces data transport costs, cuts latency and increases locality. ECP is a container orchestration system built on Kubernetes and Docker, so customers can build a container-based application once and deploy it anywhere.

ECP Free Tier Program

Sign up and receive $500 in credits

ECP Resources
Read more on the blog

Key Features

1500+ PoPs

Global network presence

CDNetworks provides outstanding scalability, allowing container-based applications to be scaled out rapidly

50+ Tbps

Extensive bandwidth

Aggregate bandwidth that guarantees high performance and availability even during traffic surges

< 50ms

Ultra-low latency

High-speed application processing and communication between the edge and endpoints

Distributed PoP coverage to guarantee ultra-low latency

Compatible with the TCP protocol

Automated deployment, self-healing, autoscaling, and application monitoring and reporting

Comprehensive technical support

Edge Computing Platform 솔루션

ECP is an Infrastructure-as-a-Service (IaaS) offering that provides compute, networking and storage resources for container instances and Kubernetes (K8s) container management at the edge.

Compute

CPU
Memory

Network

Public IPv4 and IPv6 network interfaces
Static IPs
Load balancing

Storage

High-performance local SSD persistent storage

Features

Automated application deployment

Developers specify the resources each container needs when defining a Pod. Kubernetes runs a scheduler that automatically decides which node to place each Pod on, based on resource requests and predefined scheduling policies and affinities, so there is no need to plan application placement manually.
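As a sketch, the per-container resource requests described above are declared in the Pod manifest; the scheduler uses the requests to pick a node with enough capacity. The names, image and values here are illustrative, not ECP-specific:

```yaml
apiVersion: v1
kind: Pod
metadata:
  name: demo-app            # hypothetical name
spec:
  containers:
  - name: web
    image: nginx:1.25
    resources:
      requests:             # what the scheduler uses for placement
        cpu: "250m"
        memory: "128Mi"
      limits:               # hard ceiling enforced at runtime
        cpu: "500m"
        memory: "256Mi"
```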

Self-healing

Kubernetes restarts containers that fail, replaces and reschedules containers when a node dies, and kills containers that do not respond to health checks.
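The health checks mentioned above are usually expressed as a liveness probe; the kubelet restarts the container whenever the probe fails. The image name and endpoint below are hypothetical:

```yaml
apiVersion: v1
kind: Pod
metadata:
  name: probed-app            # hypothetical name
spec:
  containers:
  - name: api
    image: example/api:1.0    # placeholder image
    livenessProbe:            # container is restarted if this check fails
      httpGet:
        path: /healthz
        port: 8080
      initialDelaySeconds: 5
      periodSeconds: 10
```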

Automated rolling updates

The Deployment controller makes it easy for developers to roll out and roll back applications.
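A Deployment's rolling-update strategy might be declared roughly as follows (names and image are illustrative); running `kubectl rollout undo deployment/rolling-demo` afterwards reverts to the previous revision:

```yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: rolling-demo          # hypothetical name
spec:
  replicas: 3
  selector:
    matchLabels: { app: rolling-demo }
  strategy:
    type: RollingUpdate
    rollingUpdate:
      maxUnavailable: 1       # at most one Pod down during the rollout
      maxSurge: 1             # at most one extra Pod above the replica count
  template:
    metadata:
      labels: { app: rolling-demo }
    spec:
      containers:
      - name: web
        image: nginx:1.25
```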

Horizontal Pod Autoscaling

Automatically scales applications up or down based on resource usage such as CPU and memory.
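A Horizontal Pod Autoscaler that scales on CPU utilization can be declared along these lines; the target Deployment name, replica bounds and threshold are hypothetical:

```yaml
apiVersion: autoscaling/v2
kind: HorizontalPodAutoscaler
metadata:
  name: web-hpa
spec:
  scaleTargetRef:
    apiVersion: apps/v1
    kind: Deployment
    name: web-app             # hypothetical target Deployment
  minReplicas: 2
  maxReplicas: 10
  metrics:
  - type: Resource
    resource:
      name: cpu
      target:
        type: Utilization
        averageUtilization: 70   # scale out when average CPU exceeds 70%
```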

What is Edge Computing?

Edge computing is a network philosophy that aims to bring computing power, memory and storage as close to the end users as possible. The “edge” refers to the edge of the network, the location where the network’s servers can deliver computing functionalities to customers most expediently.

Instead of relying on a server at a centralized location like a data center, edge computing moves processing physically closer to the end user. The computation is done locally, like on a user’s computer, an IoT device or an edge server.

Edge computing minimizes the amount of long-distance communication that has to happen between a client and a centralized cloud or server. This results in lower latency, faster response times and reduced bandwidth usage.

How Edge Computing Works

Edge computing works by allowing data from local devices to be analyzed at the edge of the network they are in, before being sent to a centralized cloud or an edge cloud ecosystem. A network of data centers, servers, routers and network switches distributed across the globe processes and stores data locally, and each location can replicate its data to the others. These individual locations are called Points of Presence (PoPs). Edge PoPs are physically closer to the device, unlike cloud servers, which can be far away.
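The routing idea behind PoPs can be sketched in a few lines: given measured round-trip times to each PoP, a client (or a DNS-based traffic director) simply picks the closest one. The PoP names and latency figures below are made up for illustration.

```python
def nearest_pop(rtts_ms):
    """Pick the PoP with the lowest measured round-trip time (ms)."""
    return min(rtts_ms, key=rtts_ms.get)

# Hypothetical RTT measurements from one client, in milliseconds.
measurements = {"frankfurt": 18.2, "singapore": 142.7, "seoul": 9.4}
print(nearest_pop(measurements))  # → seoul
```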

Traditionally, organizations ran multiple applications on physical servers. There was no easy way to allocate resources to all applications to ensure they all performed equally well. Then came virtual machines (VMs), which allowed applications to be isolated for better utilization of a server's resources on the same hardware infrastructure.

Containers are similar to VMs, except that they can share the operating system (OS) among the applications. This makes containers portable across clouds and OS distributions. Developers can bundle and run applications effectively and in an agile manner, with no downtime.

In fact, the open-source platform Kubernetes helps developers automate much of the management of container applications. For example, it allows you to distribute network traffic in case one container is receiving high traffic, automate rollouts and rollbacks, restart containers that fail, check on their health and more.

Developers can deploy applications on the edge by building pods – small units of computing that group together one or more containers with shared storage and network resources. Kubernetes, or K8s as it is called, can be deployed on every edge PoP, allowing developers to build these pods on the edge themselves.
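A minimal sketch of such a pod: two containers sharing a scratch volume, one writing and one reading. The container names, images and commands are purely illustrative:

```yaml
apiVersion: v1
kind: Pod
metadata:
  name: shared-pod            # hypothetical name
spec:
  volumes:
  - name: shared-data
    emptyDir: {}              # scratch volume shared by both containers
  containers:
  - name: producer
    image: busybox:1.36
    command: ["sh", "-c", "while true; do date >> /data/log; sleep 5; done"]
    volumeMounts:
    - { name: shared-data, mountPath: /data }
  - name: consumer
    image: busybox:1.36
    command: ["sh", "-c", "tail -f /data/log"]
    volumeMounts:
    - { name: shared-data, mountPath: /data }
```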

Consider a cloud gaming company whose users across the world stream graphics-intensive content to their devices from a centralized cloud. The game has to respond to users' keystrokes and mouse actions, and the data must travel to and from the cloud in milliseconds or less. This continual interactivity requires immense amounts of data to be stored, fetched and processed by the company's servers. Additionally, modern cloud gaming requires 5G networks because of the stable ultra-low latency they promise.

The greater the distance to the servers, the more the data has to travel and the higher the chances of latency and jitter. This could lead to delays and a poor gaming experience for users.

By moving the computing closer to the edge and the users, data travels the minimum possible distance and players get a near latency-free experience. This makes the actual user device, whether a console or a personal computer, largely irrelevant. Running the data workloads at the edge makes it possible to render graphics-intensive video, creating a better gaming experience overall, and also helps the company do away with the costs of running a centralized infrastructure.

Why is Edge Computing Important for Privacy & Security?

Edge computing does come with some security concerns. Since the edge nodes are closer to the end users, edge computing often deals with large volumes of highly sensitive data. If this data leaks, there can be serious concerns about privacy violations.

As more IoT and connected devices join the edge network, the potential attack surface also expands. The devices and users in the edge computing environment could also be moving. This makes it difficult to design security rules to thwart attacks.

One approach to ensure security with edge computing is to minimize the processing done on the devices themselves. The data can be collected from the device, packaged and routed to an edge node for processing. This may not always be possible though, such as when sensors on self-driving cars or building-automation systems need to process data and make decisions in real-time.

Encryption of data at rest and in transit can help address some of the security concerns with edge computing. This way, even if data from the devices is leaked, attackers will not be able to decipher any personal information.

The edge devices can also differ in their requirements for power and network connectivity. This raises concerns about their availability and what happens when one of the nodes goes down. Edge computing addresses this using Global Server Load Balancing (GSLB), a technology which distributes traffic among the several different edge nodes. So when one node is overwhelmed and about to go down, others can take over and continue to fulfil user requests.
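The failover behavior GSLB provides can be sketched as a health-aware round-robin: requests rotate across the nodes that are currently healthy, and an unhealthy node is simply skipped. The PoP names and health states below are hypothetical.

```python
def route(request_id, nodes, healthy):
    """Round-robin across nodes, skipping any that are marked unhealthy."""
    candidates = [n for n in nodes if healthy.get(n, False)]
    if not candidates:
        raise RuntimeError("no healthy edge nodes available")
    return candidates[request_id % len(candidates)]

nodes = ["pop-eu", "pop-us", "pop-ap"]                       # hypothetical PoPs
health = {"pop-eu": True, "pop-us": False, "pop-ap": True}   # pop-us is down
print([route(i, nodes, health) for i in range(4)])
# → ['pop-eu', 'pop-ap', 'pop-eu', 'pop-ap']
```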

How Does Edge Computing Differ From Cloud Computing?

Cloud computing is a technology that allows for the delivery of storage, applications and processing power on an on-demand service basis over the internet. In the early days of computing, businesses had to set up data centers, hardware and other computing infrastructure to run their applications. This meant upfront costs, managing complexity and spending manpower on maintaining the infrastructure, all of which multiplied with scale.

Cloud computing essentially lets businesses “rent” access to data storage and applications from cloud service providers. The providers are responsible for owning and managing the centralized applications in their data centers, while businesses pay only for their usage of these resources. Edge computing is different in that applications and computation are moved closer to users.

Stateless vs. Stateful

Another crucial difference between cloud computing and edge computing lies in how they handle stateful and stateless applications.

Stateful applications are those that store information on previous transactions. Online banking or email are examples, where new transactions are performed in the context of what has happened before. Since these applications need to store more data about their state, they are better suited to the conventional cloud.

Stateless applications are those that don’t store any information in reference to past transactions. For example, entering a query in a search engine is a stateless transaction. If the search is interrupted or closed, you will start a new one from scratch. The applications that run on the edge are often stateless, as they need to be moved around and require less storage and computation.

 

Bandwidth requirements

Cloud computing and edge computing also differ in the bandwidth requirements of the applications they handle. Bandwidth refers to the amount of data that can travel between the user and the servers across the internet per unit of time. The more bandwidth an application consumes, the greater the impact on its performance and the resulting costs.

Because data has to travel a much greater distance to reach a centralized cloud, serving high-bandwidth applications from the cloud is slower and more expensive. When you have applications that require high bandwidth for their performance, edge computing is the way to go.
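The tradeoff can be put in rough numbers: delivery time is roughly the serialization time (payload size over throughput) plus the round-trip time, so a distant cloud hop adds latency even when the link bandwidth is identical. The figures below are illustrative only.

```python
def transfer_ms(payload_mb, throughput_mbps, rtt_ms):
    """Approximate delivery time: serialization delay plus one round trip."""
    serialization_ms = payload_mb * 8 * 1000 / throughput_mbps  # MB → Mb → ms
    return serialization_ms + rtt_ms

# Same 5 MB payload, same 100 Mbps link; only the round-trip time differs.
cloud = transfer_ms(5, 100, 120)   # distant centralized cloud
edge = transfer_ms(5, 100, 10)     # nearby edge PoP
print(round(cloud), round(edge))   # → 520 410
```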

While edge computing and cloud computing may differ in many aspects, utilizing one does not preclude the use of the other. For example, to address the latency issues in a public cloud model, you can move processing for mission-critical applications closer to the source of the data.

Latency

One of the main differences between cloud computing and edge computing pertains to latency. Cloud computing can introduce latency because of the distance between users and the cloud. Edge infrastructure moves computing power closer to end users to minimize the distance that data has to travel, while still retaining the on-demand character of cloud computing. Thus edge computing is better for latency-sensitive applications, while cloud computing works for those where latency is not a major concern.

Benefits of Edge Computing

Edge computing helps businesses provide seamless user experiences to their users. Developers can use edge computing platforms to specify the right resources needed for applications and deploy and scale them up as necessary. Here are four ways in which edge computing benefits businesses.

1.   It helps save costs by optimizing bandwidth and cloud resources

As more offices get equipped with smart cameras, printers and other IoT devices, the bandwidth and cloud resources required also increase, driving up costs. Statista predicts that by 2025 there will be over 75 billion Internet of Things devices installed worldwide. To support all those devices, significant amounts of computation will have to be moved to the edge.

2.   It improves performance by reducing latency

When web applications run processes that communicate with an external server, users will encounter delays. The duration of these delays can vary based on the available bandwidth and server location. But if more processes are moved to the network edge through edge computing, such delays can be minimized or avoided altogether. In some cases, the deployment of low latency applications can also be automated based on the resources available.

3.   It helps businesses offer new functionalities

Since edge computing moves computation closer to the source, it allows businesses to deliver new functionalities. Think of a business deploying a graphics-heavy web page with elements of augmented reality, or an autonomous vehicle manufacturer exploring applications that need to run many artificial intelligence algorithms and machine learning workloads. Relying on sending the data to a centralized source far away is not practical, while edge computing allows the business to run these processes in real time.

4.   It ensures high availability for applications

Businesses that provide services over the internet need to ensure continuous availability for their applications. Disruptions can affect customer experience and satisfaction, such as in the case of e-commerce stores. In more critical scenarios, such as a refinery gas leakage detection system, the impact of disruptions could be the difference between life and death. Edge computing ensures that any disruptions are localized to specific nodes rather than the entire network, keeping applications available and running.

In practice, high availability means that when one PoP goes down, GSLB redirects traffic to other nodes, so the service does not stop.

Edge Computing Use Cases

Unified Streaming, an Amsterdam-based streaming services technology provider, was looking for a way to deal with the rising costs of large-scale content distribution. They were seeing a rise in CDN cache and cloud storage costs with more video formats, protocols and encryption schemes.

Using CDNetworks Edge Computing Platform, they were able to generate different alternative formats and encodings for streaming in real-time. The result was a 50 percent reduction in cloud egress and CDN caching footprint. Rufael Mekuria, Head of Research and Standardization at Unified Streaming says of the CDNetworks experience, “We were impressed with the ease of use of the ECP and the consistent performance results.”

Edge Computing Platform Free Tier Program

Get $500 in credit and begin optimizing your applications to deliver rapid-speed user experiences.
Get Started

Our Global Network

No matter what industry you’re in, we provide you with a tailored CDN solution to ensure efficient web performance for your global audience.

Trusted by global leading companies across key industries

CDNetworks is a leading Global Content Delivery Network. We provide innovative and custom-tailored solutions to businesses across key industries.