Akamai Enables Microservices Deployment at the Edge

13 Oct - by aiuniverse - In Microservices

Source: devops.com

Akamai today announced it is adding the ability to deploy microservices on its edge computing platform to enable developers to run latency-sensitive applications faster.

David Theobald, principal product manager at Akamai, said developers can now write dynamic content assembly logic at the edge to create microservices on the Akamai Intelligent Edge platform, including issuing external requests and manipulating response bodies. That approach provides developers with the primitives required to run edge computing applications on the equivalent of a serverless computing framework, he said.
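The dynamic content assembly pattern described above can be sketched in plain JavaScript. This is an illustration of the pattern only, not the EdgeWorkers API: the function names (`assemblePage`, `fetchFragment`), the `{{name}}` placeholder syntax, and the injected sub-request function are all assumptions made for this sketch.

```javascript
// Sketch of edge-side dynamic content assembly: fetch HTML fragments
// from origin microservices in parallel, then stitch them into the
// response body before it is returned to the client.

// Stand-in for a sub-request to an origin microservice; the actual
// request mechanism is injected so the sketch stays self-contained.
async function fetchFragment(fetchFn, url) {
  const res = await fetchFn(url);
  return res.body;
}

// Assemble a page by replacing placeholder tokens such as {{header}}
// with fragments fetched at the edge.
async function assemblePage(template, fragmentUrls, fetchFn) {
  // Issue all external requests concurrently.
  const entries = await Promise.all(
    Object.entries(fragmentUrls).map(async ([name, url]) => [
      name,
      await fetchFragment(fetchFn, url),
    ])
  );
  // Manipulate the response body: substitute each fragment in place.
  let body = template;
  for (const [name, fragment] of entries) {
    body = body.split(`{{${name}}}`).join(fragment);
  }
  return body;
}
```

In practice the injected `fetchFn` would be whatever sub-request primitive the edge runtime exposes; injecting it also makes the assembly logic easy to unit-test with a mock.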

In addition, Akamai is adding EdgeWorker reporting and debugging tools for JavaScript applications deployed on the edge of its content delivery network (CDN) and updating its application programming interfaces (APIs) to make it easier and faster to deploy “cloudlet” applications.

Finally, Akamai has updated its Akamai Image and Video Manager software-as-a-service (SaaS) offering by adding a video status optimization API through which IT teams can track the status of videos as they are being optimized by the Akamai CDN.
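A status-tracking API like the one described is typically consumed with a polling loop. The sketch below illustrates that usage pattern only; the endpoint path, the response field names, and the `complete` status value are hypothetical, not Akamai's actual API, and the JSON-fetching function is injected to keep the example self-contained.

```javascript
// Illustrative polling loop against a video-optimization status API.
// fetchJson is an injected function that GETs a URL and returns the
// parsed JSON body.
async function waitForOptimization(
  fetchJson,
  videoId,
  { intervalMs = 5000, maxAttempts = 12 } = {}
) {
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    // Hypothetical endpoint and field names for this sketch.
    const status = await fetchJson(`/videos/${videoId}/status`);
    if (status.state === 'complete') {
      return status; // optimization finished
    }
    // Wait before the next check.
    await new Promise((resolve) => setTimeout(resolve, intervalMs));
  }
  throw new Error(
    `video ${videoId} not optimized after ${maxAttempts} checks`
  );
}
```

Bounding the number of attempts and spacing the checks keeps a fleet of IT automation scripts from hammering the status endpoint while long videos are still being processed.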

While CDNs have been widely employed for decades, they are now evolving into platforms for deploying edge computing applications that are developed and deployed using best DevOps practices. Rather than having to replicate the IT infrastructure already put in place by a CDN provider, many organizations are opting to essentially consume IT infrastructure as a service as they deploy edge computing applications alongside existing web applications.

Competition is already fierce among CDN providers seeking to leverage points of presence around the globe to enable IT teams to deploy applications closer to the point where data is generated and consumed. Many of these edge computing applications are at the core of digital business transformation initiatives that require data to be processed in near real-time. As such, shipping that data back to a centralized data center for processing creates too much latency.

Of course, cloud service providers have also extended the range of services they provide to include CDN services. It’s not clear to what degree IT organizations will prefer to leverage those cloud services versus the CDN capabilities many of them already rely on to deploy web applications.

Regardless of the approach, the number of IT teams building, deploying and maintaining their own edge computing infrastructure is likely to be reduced sharply in the wake of the COVID-19 pandemic. IT teams are trying to reduce the potential risk to their IT staff by limiting travel as much as possible, which makes putting IT personnel on a plane to install infrastructure an option of last resort.

Developers, in the meantime, are building more applications faster than ever, thanks mainly to DevOps processes that enable them to work from home. As the deployment backlog for edge computing applications builds, the need to rely more on external services becomes that much more pressing. It may be a while before most DevOps teams routinely view CDNs as just another target platform for deploying applications, but in many cases, that day is already here.
