How LinkedIn is using Azure Front Door for edge build-out
LinkedIn recently described how it uses the edge to support its popular professional networking website. The engineering team published a blog post covering its edge deployments and its migration from an in-house solution to Microsoft's Azure Front Door.
LinkedIn is one of several large web properties owned by a hyperscaler cloud provider (Microsoft, in this case) that doesn't actually run on the parent's cloud. YouTube is another example: it has yet to be migrated to GCP.
LinkedIn had built a fairly extensive in-house CDN of its own, with nineteen PoPs (points of presence) in ten countries. This is common practice for major content owners, as it defrays transport costs and improves performance: the edge nodes can cache content and perform tasks such as SSL offloading. But as LinkedIn noted, Microsoft's CDN, Azure Front Door, gives access to 300 PoPs around the globe, a big step up in reach.
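To make the edge node's role concrete, here is a minimal sketch of the two tasks mentioned above, TLS termination and response caching, written as a toy Python reverse proxy. The origin hostname, port and certificate paths are placeholders, and this illustrates the general technique rather than LinkedIn's or Azure's actual implementation.

```python
# Toy edge PoP: terminate TLS close to the user and serve cached copies
# of origin responses. All hosts, ports and file names are hypothetical.
import ssl
import time
import urllib.request
from http.server import BaseHTTPRequestHandler, ThreadingHTTPServer

ORIGIN = "https://origin.example.com"   # assumed origin data center
CACHE_TTL = 60                          # seconds to keep a cached object
_cache: dict[str, tuple[float, bytes]] = {}

class EdgeHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        entry = _cache.get(self.path)
        if entry and time.time() - entry[0] < CACHE_TTL:
            body = entry[1]                       # cache hit: no origin trip
        else:
            with urllib.request.urlopen(ORIGIN + self.path) as resp:
                body = resp.read()                # cache miss: fetch once
            _cache[self.path] = (time.time(), body)
        self.send_response(200)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    server = ThreadingHTTPServer(("0.0.0.0", 8443), EdgeHandler)
    # SSL offloading: the user's TLS handshake ends here at the edge,
    # not at the distant origin.
    ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
    ctx.load_cert_chain("edge-cert.pem", "edge-key.pem")  # assumed cert files
    server.socket = ctx.wrap_socket(server.socket, server_side=True)
    server.serve_forever()
```

Because both the handshake and cache hits complete at the nearby PoP, only cache misses pay the full round trip to the origin, which is where the transport-cost and performance benefits come from.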
As well as moving to the Azure CDN, LinkedIn made network changes such as using Microsoft's backbone to carry end-user traffic to LinkedIn's data centers once it has reached the nearest Azure Front Door (AFD) node. A key draw for LinkedIn was being able to piggyback on Azure Front Door's expansion into Africa, a key target market for the company, and it has seen median round-trip times at least halved. Microsoft has opened twenty new locations on the continent, in places such as Kenya, Egypt and Angola.
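Round-trip figures like this are straightforward to sanity-check from the client side. The sketch below, using a placeholder hostname, times TCP handshakes and reports the median; this is one plausible way such improvements could be measured, not how LinkedIn says it gathered its numbers.

```python
# Rough client-side RTT probe: time the TCP handshake to an edge
# hostname and take the median over several samples.
import socket
import statistics
import time

def median_connect_rtt(host: str, port: int = 443, samples: int = 9) -> float:
    """Return the median TCP connect time to host:port in milliseconds."""
    times = []
    for _ in range(samples):
        start = time.perf_counter()
        with socket.create_connection((host, port), timeout=5):
            pass                       # connect() completing is roughly one RTT
        times.append((time.perf_counter() - start) * 1000)
    return statistics.median(times)

# Placeholder hostname; compare results before and after a CDN change.
print(f"median RTT: {median_connect_rtt('www.example.com'):.1f} ms")
```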
LinkedIn’s migration wasn’t entirely trouble-free. Some of the company’s third-party API clients were incompatible with AFD because of variations in how the Content-Length header was handled. The most affected partners were a handful of cloud services, and to work around the problem LinkedIn used DNS to direct their traffic to its legacy PoPs.
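The DNS workaround amounts to answering the affected partners' lookups with legacy PoP addresses while everyone else gets the AFD edge. The sketch below shows that steering logic in miniature; the partner names and IPs are invented for illustration, since LinkedIn did not publish its actual steering rules.

```python
# Conceptual DNS steering: strict clients stay on the legacy PoPs,
# everyone else resolves to the AFD edge. All identifiers are hypothetical.
AFD_ANYCAST = "203.0.113.10"                 # placeholder AFD edge IP
LEGACY_POP = "198.51.100.20"                 # placeholder legacy PoP IP
INCOMPATIBLE_PARTNERS = {"partner-cloud-a", "partner-cloud-b"}

def resolve_api_endpoint(client_id: str) -> str:
    """Pick the address to return for the API hostname."""
    if client_id in INCOMPATIBLE_PARTNERS:
        return LEGACY_POP                    # keep affected clients on the old path
    return AFD_ANYCAST                       # default: nearest Front Door node

assert resolve_api_endpoint("partner-cloud-a") == LEGACY_POP
assert resolve_api_endpoint("everyone-else") == AFD_ANYCAST
```

The appeal of fixing this at the DNS layer is that neither the partners' clients nor the new edge needed code changes; the incompatible traffic simply never reaches AFD.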
Once the migration was complete, LinkedIn realized considerable cost savings by decommissioning its edge PoPs and instead investing in new origin PoPs that terminate AFD traffic inside LinkedIn data centers. Another benefit was a reduction in the time its engineers spent maintaining certificates and logs.