Solving the Edge Puzzle

A breakdown of the expected and unexpected building blocks required to build optimized distributed systems.

Introduction: The Edge Puzzle

The Complexities of Replicating the Cloud Developer Experience at the Edge

How to Approach Application Selection, Deployment, and Management for the Edge

During the cloud computing era, many application creators turned to complementary CDN technology to boost performance, security, and scalability. Today, they are turning to the edge for the next logical iteration. Placing parts of an application at the edge has obvious performance benefits, but it also introduces an additional, discrete delivery layer that complicates the simplicity of the cloud.

With the demand for faster user experiences being driven by emerging and evolving use cases, application creators are increasingly looking to offload more services to the edge. At the same time, application operations teams are looking to simplify their delivery stacks. Bringing more of the application delivery cycle into a single cohesive edge delivery solution can achieve both of these goals concurrently.

While cloud providers have the flexibility to support a diverse range of workloads, developers working in the cloud are limited to a single provider’s network, or are left managing workload orchestration across multiple providers. CDNs, meanwhile, may have expansive global networks of infrastructure, but they are typically unable to support general-purpose workloads beyond basic content delivery.

Content Delivery Networks (CDNs) are often thought of as the first evolution of edge computing. However, content delivery encompasses only a small subset of all edge workloads. As the diversity of edge workloads has expanded beyond content delivery, existing solutions fall short of what modern applications require. Many CDNs were built around open source technologies, such as Varnish Cache and ModSecurity, but they have typically customized the code base so much over the years that developers using them are locked into “black box” proprietary solutions that don’t offer the flexibility and control necessary to fit the unique requirements of each application.

Furthermore, growth in adoption of container technology and serverless functions has completely changed the game, leaving many legacy CDNs unequipped to support modern applications. With Kubernetes becoming the preferred container orchestration platform, edge solutions built on Kubernetes are significantly better positioned to support the needs of modern developers.

The Challenge: The Complexities in Moving Diverse Workloads to the Edge

Now, let’s take a deeper dive into some of the complexities involved in moving more diverse workloads to the edge, including selection, deployment and ongoing management.

Web Application Firewalls (WAFs) & Bot Management Tooling

DevOps teams are increasingly choosing to deploy WAFs and bot mitigation tools across a distributed architecture, with the goal of detecting and mitigating threats faster. Managing a WAF or bot mitigation deployment across a multi-cloud/edge network is no simple feat, however.
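
To make this concrete, here is a minimal sketch, in Go, of the kind of request-inspection middleware a WAF runs at each edge location. The rule set and handler names are illustrative assumptions, not any vendor’s actual engine; production WAFs rely on far richer, continuously updated signatures and behavioral analysis.

```go
package main

import (
	"log"
	"net/http"
	"regexp"
)

// naivePatterns is a toy rule set; real WAFs use far more sophisticated,
// regularly updated signatures and behavioral analysis.
var naivePatterns = []*regexp.Regexp{
	regexp.MustCompile(`(?i)union\s+select`), // crude SQL-injection probe
	regexp.MustCompile(`(?i)<script`),        // crude XSS probe
}

// wafMiddleware inspects the query string and rejects requests that match
// a rule, letting clean traffic through to the next handler.
func wafMiddleware(next http.Handler) http.Handler {
	return http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		for _, p := range naivePatterns {
			if p.MatchString(r.URL.RawQuery) {
				http.Error(w, "request blocked", http.StatusForbidden)
				return
			}
		}
		next.ServeHTTP(w, r)
	})
}

func main() {
	app := http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		w.Write([]byte("hello from the origin\n"))
	})
	log.Fatal(http.ListenAndServe(":8080", wafMiddleware(app)))
}
```

Multiply even this trivial filter by every edge location, plus rule updates and monitoring, and the management burden described above becomes clear.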

While many best-in-class WAF and bot management technologies have emerged – providers such as Wallarm, Snapt (WAF), ThreatX, Signal Sciences, Radware Bot Manager, and PerimeterX – most legacy CDNs still don’t give developers the option of deploying third-party solutions. Fastly, for example, recently acquired Signal Sciences, recognizing the need for more advanced WAF technology beyond their own proprietary solution.

We often speak with developers who are frustrated with the “black box”, built-in solutions of legacy CDNs, and demand more choice and flexibility.

Image Optimization

Beyond the simple caching of images, developers, especially in the e-commerce sector, are increasingly seeking out image optimization solutions, such as Optidash, that optimize and transform images on the fly.

Image optimization benefits include smaller payloads and lower bandwidth costs, faster page loads, and the ability to resize or convert images to modern formats on demand.

Just as with security solutions, most legacy CDNs don’t support third-party software that specializes in point solutions. What’s more, if you’re operating a multi-cloud/edge environment, you will have to install and manage this type of image optimization tooling across the entire network.
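
As a rough illustration of on-the-fly transformation, the sketch below resizes a JPEG per request using Go’s golang.org/x/image/draw package. The source file, target width, and route are hypothetical placeholders; a real service would fetch from the origin, cache results, and negotiate modern formats such as WebP.

```go
package main

import (
	"image"
	"image/jpeg"
	"log"
	"net/http"
	"os"

	"golang.org/x/image/draw"
)

// resizeHandler decodes a source JPEG, scales it to a target width while
// preserving the aspect ratio, and re-encodes it on the fly.
func resizeHandler(w http.ResponseWriter, r *http.Request) {
	f, err := os.Open("source.jpg") // stand-in for a cached origin fetch
	if err != nil {
		http.Error(w, "image not found", http.StatusNotFound)
		return
	}
	defer f.Close()

	src, err := jpeg.Decode(f)
	if err != nil {
		http.Error(w, "bad image", http.StatusInternalServerError)
		return
	}

	const targetWidth = 480
	b := src.Bounds()
	targetHeight := b.Dy() * targetWidth / b.Dx()
	dst := image.NewRGBA(image.Rect(0, 0, targetWidth, targetHeight))
	draw.CatmullRom.Scale(dst, dst.Bounds(), src, b, draw.Over, nil)

	w.Header().Set("Content-Type", "image/jpeg")
	jpeg.Encode(w, dst, &jpeg.Options{Quality: 80})
}

func main() {
	http.HandleFunc("/img", resizeHandler)
	log.Fatal(http.ListenAndServe(":8080", nil))
}
```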

Modern Testing & Experience Optimization

Marketers, product managers, developers, and others need the ability to effectively test and optimize applications across the client side, server side, single-page applications (SPAs), mobile, redirects, and so on. Conventional A/B testing solutions use JavaScript tags to manipulate content on applications, which degrades site performance through flicker and added latency.

Modern tools like SiteSpect, however, rethink this model by sitting in the flow of HTTP traffic. This allows them to support multiple user experience optimization techniques, including client-side, server-side, redirect, and SPA optimization.
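
Here is a minimal sketch of this in-the-flow model, assuming two hypothetical origin URLs: the proxy assigns each new visitor a variant with a cookie and routes the request entirely server-side, so no client-side JavaScript (and no flicker) is involved. This illustrates the architectural idea only, not SiteSpect’s implementation.

```go
package main

import (
	"log"
	"math/rand"
	"net/http"
	"net/http/httputil"
	"net/url"
)

func main() {
	// Hypothetical origins serving the control ("a") and the experiment ("b").
	proxies := map[string]*httputil.ReverseProxy{
		"a": newProxy("http://origin-a.internal"),
		"b": newProxy("http://origin-b.internal"),
	}

	handler := func(w http.ResponseWriter, r *http.Request) {
		variant := "a"
		if c, err := r.Cookie("variant"); err == nil && proxies[c.Value] != nil {
			// Returning visitor: keep them in the same variant.
			variant = c.Value
		} else {
			// New visitor: assign a variant and pin it with a cookie.
			if rand.Intn(2) == 1 {
				variant = "b"
			}
			http.SetCookie(w, &http.Cookie{Name: "variant", Value: variant, Path: "/"})
		}
		proxies[variant].ServeHTTP(w, r)
	}

	log.Fatal(http.ListenAndServe(":8080", http.HandlerFunc(handler)))
}

func newProxy(raw string) *httputil.ReverseProxy {
	u, err := url.Parse(raw)
	if err != nil {
		log.Fatal(err)
	}
	return httputil.NewSingleHostReverseProxy(u)
}
```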

Legacy CDNs can’t support this new architectural model and therefore require extra hops in the HTTP delivery chain, ironically negating many of the performance benefits they aim to deliver.

Load Balancing Solutions

While most hyperscalers and edge providers offer load balancing, these solutions are often restricted to their own environments. Therefore, if you migrate your application to a different cloud or data center, the hyperscaler or edge provider’s proprietary load balancer won’t be able to follow.

To deploy a traditional load balancer in the cloud, you need to use a virtual appliance. If you then decide to use a load balancer in a second cloud, a separate appliance must be configured from scratch, and so on for every cloud or data center you operate in. There is no communication between these appliances. In effect, you are operating two (or more) separate clouds that your teams will need to manage separately.

Organizations that use multi-cloud/edge networks are then faced with having to separately configure, monitor and manage delivery and security for each distinct environment. Similarly, for any application that changes hosting location, adjustments must be made on an individual basis. This not only increases complexity, but takes up valuable resources and limits much of the flexibility that is supposedly a key benefit of a multi-cloud/edge model.
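
For contrast with appliance-bound options, here is a minimal sketch of a provider-agnostic software load balancer in Go. Because it is ordinary code, the same binary can run unchanged in any cloud or edge location; the backend addresses are placeholders, and a production balancer would add health checks, TLS termination, and session affinity.

```go
package main

import (
	"log"
	"net/http"
	"net/http/httputil"
	"net/url"
	"sync/atomic"
)

func main() {
	// Placeholder backends; in practice these could live in different
	// clouds or data centers.
	backendURLs := []string{
		"http://10.0.0.10:8080",
		"http://10.0.1.10:8080",
	}

	var proxies []*httputil.ReverseProxy
	for _, raw := range backendURLs {
		u, err := url.Parse(raw)
		if err != nil {
			log.Fatal(err)
		}
		proxies = append(proxies, httputil.NewSingleHostReverseProxy(u))
	}

	// Round-robin across backends; the atomic counter keeps the
	// rotation safe under concurrent requests.
	var next uint64
	lb := func(w http.ResponseWriter, r *http.Request) {
		i := atomic.AddUint64(&next, 1) % uint64(len(proxies))
		proxies[i].ServeHTTP(w, r)
	}

	log.Fatal(http.ListenAndServe(":80", http.HandlerFunc(lb)))
}
```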

Containers: Challenges of Orchestration

In a small environment with only a handful of systems, managing and automating orchestration is fairly straightforward, but when an enterprise has thousands of individual systems that interact with each other on some level, orchestration automation is both powerful and essential.

Containers are lightweight by definition, with a low footprint, making them perfect candidates for running on edge devices. Machine learning workloads, for example, are often containerized so that even legacy devices can perform fast computation in place while still interacting with cloud AI/ML services.

Containers can be deployed to the device of your choosing and can be built using the architecture of your choice so long as it can run the container runtime. Updating containers in-place is simple, particularly when orchestration solutions like Kubernetes are used.
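
As a concrete illustration of such an in-place update (assuming the official k8s.io/client-go library and hypothetical deployment and image names), bumping a Deployment’s container image is enough to trigger a Kubernetes rolling update:

```go
package main

import (
	"context"
	"log"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	// Load the local kubeconfig (~/.kube/config).
	config, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
	if err != nil {
		log.Fatal(err)
	}
	clientset, err := kubernetes.NewForConfig(config)
	if err != nil {
		log.Fatal(err)
	}

	ctx := context.Background()
	deployments := clientset.AppsV1().Deployments("default")

	// Hypothetical deployment and image names; changing the image in the
	// pod template is what triggers the rolling update.
	dep, err := deployments.Get(ctx, "edge-app", metav1.GetOptions{})
	if err != nil {
		log.Fatal(err)
	}
	dep.Spec.Template.Spec.Containers[0].Image = "registry.example.com/edge-app:v2"
	if _, err := deployments.Update(ctx, dep, metav1.UpdateOptions{}); err != nil {
		log.Fatal(err)
	}
	log.Println("rolling update triggered")
}
```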

Consider SaaS providers who traditionally offered on-premises or single point-of-presence installations. As customers increasingly demand distributed deployment models, these providers face the build vs. buy dilemma.

The management of these complex clusters of devices, services, and networks can get highly complicated very quickly.

Serverless Computing for Edge Computing

Serverless computing, also called function as a service (FaaS), enables the execution of event-driven logic without the burden of managing the underlying infrastructure. The name ‘serverless’ reflects the freedom it gives developers to focus on building their applications without having to think about provisioning, managing, and scaling servers.

The concept of serverless was originally designed for cloud environments, eliminating the ‘always-on’ model to save on resource consumption, among other benefits. In recent years, advances in edge computing technology have led more developers to migrate serverless workloads to the edge. The benefits of serverless at the edge, when compared to alternatives like containers and VMs, include lighter resource consumption, improved cost efficiencies, code portability, and speed of deployment.

However, not all workloads are suitable for serverless models and it’s important to understand the requirements of a given workload when determining the most appropriate deployment model. Considerations such as code dependencies, cold starts and their effect on performance, security, and resource requirements are critical when designing edge architectures.
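
For a concrete flavor of the model, here is a minimal event-driven function using AWS Lambda’s Go runtime (github.com/aws/aws-lambda-go) as one example; FaaS platforms differ in their APIs, and the event shape here is a hypothetical illustration. Note that the code contains no server setup at all: provisioning and scaling are entirely the platform’s responsibility.

```go
package main

import (
	"context"
	"fmt"

	"github.com/aws/aws-lambda-go/lambda"
)

// ResizeEvent is a hypothetical event payload; the platform invokes the
// handler once per event.
type ResizeEvent struct {
	ImageURL string `json:"image_url"`
	Width    int    `json:"width"`
}

// handler holds the event-driven logic; real work (fetch, transform,
// store) would go here.
func handler(ctx context.Context, evt ResizeEvent) (string, error) {
	return fmt.Sprintf("would resize %s to width %d", evt.ImageURL, evt.Width), nil
}

func main() {
	lambda.Start(handler)
}
```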

The Solution: Partner with an Edge as a Service Provider

An Edge as a Service (EaaS) provider can help overcome many of the complexities involved in application deployment and management at the edge.

The right EaaS provider, for instance, won’t lock developers into specific software choices for security or image optimization tooling, allowing for “best of breed” selection. EaaS can also support distributed deployment of more advanced workloads, making it easier to integrate testing and experience optimization solutions like SiteSpect into your edge stack.

One of the most important areas that Edge as a Service can make more straightforward is load balancing across multi-cloud/edge networks. EaaS providers may offer automated load balancing options that migrate traffic and workloads automatically, so developers don’t have to manage each environment individually.

Similarly, Edge as a Service providers can help streamline containerization and serverless deployments by (i) containerizing applications and accelerating the developer path to the edge and (ii) offering flexible language support that allows developers to simply ship code and offload the responsibilities of deployment, management, and scaling of the underlying infrastructure to the edge compute platform.

The Complexities of Building and Operating Edge Networks and Infrastructure

Conclusion: Overcoming the Complexities

Webscale CloudFlow is a scalable and secure solution for deploying and managing containerized applications on public, private, or multi-cloud infrastructure. Designed as a Platform as a Service (PaaS), it runs high-performing applications at the lowest cost by proactively monitoring compute resource utilization and automatically rightsizing infrastructure, and it is purpose-built to enable businesses to optimize cloud-agnostic application delivery in dense containerized environments.

Webscale CloudFlow combines three core patented technologies: Kubernetes Edge Interface (KEI) for simplified container management, Adaptive Edge Engine (AEE) for AI-enabled workload placement, and an available Composable Edge Cloud (CEC) for scalable compute capacity across the cloud vendors of your choice.