E-Commerce Holiday Showdown

Distributed Cache Accelerates Time to Largest Contentful Paint


One of the world's largest e-commerce companies has a catalog too extensive for leading CDN providers to adequately service. Seeking a long-tail cache solution for website acceleration, the retailer deployed HarperDB as a secondary cache layer to deliver pages evicted from their CDN.


Preparing for the upcoming 2023 holiday season, a Fortune 100 retailer embarked on a mission to have its product pages outrank industry titan Amazon for coveted top spots on Google Search. However, there was a catch—their pages had to load faster than Amazon's, which wielded the unparalleled might of AWS's global infrastructure. 

A formidable challenge lay ahead, considering the retailer had already pushed their primary Content Delivery Network (CDN) solution to its limits. Undeterred, the retailer's SEO team sought more advanced solutions to improve page load times. Due to the retailer's extensive product catalog, over 40% of page lookups resulted in a CDN cache miss. These misses hampered search engine rankings and revenue potential.

This case study dives into how this major retailer implemented a HarperDB cache to secure the winning position for their product pages. 

Highlighted Performance Metrics


In August 2023, Akamai approached HarperDB to help accelerate page load times for one of their largest retail customers. The target metric was the average time to Largest Contentful Paint (LCP); improving it required low-latency delivery of hero image metadata so that browsers could preload those images ahead of HTML parsing. Akamai CDN was already delivering these image hints for approximately 58% of product page loads. However, the remaining 42% of page loads were not accelerated due to the limitations of Akamai’s shared CDN infrastructure.

Delivering image hint metadata for the outstanding 42% of page requests required holding tens of millions of keys in a long-tail cache so that even the most infrequently visited pages would perform well when search engine crawlers came upon them.  
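The acceleration hinges on getting hero image metadata to the browser early enough to trigger a preload before the HTML is parsed. As a minimal sketch of that idea (assuming the hint is delivered as a standard HTTP `Link` preload header; the retailer's exact delivery mechanism is not detailed here, and the URL is illustrative):

```python
def preload_header(image_url: str) -> str:
    """Format an HTTP Link header that instructs the browser to
    preload the page's hero image, pulling LCP forward."""
    return f"<{image_url}>; rel=preload; as=image"

# Hypothetical hero image URL for illustration only.
print(preload_header("https://cdn.example.com/hero/sku-123.jpg"))
# → <https://cdn.example.com/hero/sku-123.jpg>; rel=preload; as=image
```

Because the header is tiny compared to the page itself, a long-tail cache only needs to hold the image URL per product key, which is what makes keeping tens of millions of entries in memory feasible.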

HarperDB was the ideal technology to meet this need, given its built-in application engine, in-memory caching capabilities, and ease of distribution. Ultimately, the retail team utilized HarperDB as a secondary layer behind Akamai's Ion CDN, which was already delivering a lightning-fast response for most requests. 
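The tiered lookup described above can be sketched as follows. This is an illustrative model, not HarperDB's or Akamai's actual API: the CDN serves hot keys, the secondary in-memory cache covers the long tail, and only a miss in both tiers falls through to the origin.

```python
class TieredCache:
    """Toy model of a CDN fronted by a long-tail secondary cache."""

    def __init__(self, origin_fetch):
        self.cdn = {}           # hot, capacity-limited edge cache
        self.secondary = {}     # long-tail cache (the HarperDB role)
        self.origin_fetch = origin_fetch
        self.origin_hits = 0    # counts requests that reach the origin

    def get(self, key):
        if key in self.cdn:                 # CDN hit: fastest path
            return self.cdn[key]
        if key in self.secondary:           # CDN miss, long-tail hit
            return self.secondary[key]
        value = self.origin_fetch(key)      # double miss: go to origin
        self.origin_hits += 1
        self.secondary[key] = value         # populate the long tail
        return value
```

For example, with `cache = TieredCache(lambda k: f"meta:{k}")`, two calls to `cache.get("sku-1")` reach the origin only once; repeated long-tail lookups are absorbed by the secondary layer, which is the effect behind the origin-request reduction reported below.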

Diagram showing how the HarperDB cache layer is positioned in the retailer's infrastructure.


In September 2023, the solution was deployed to eight Akamai Connected Cloud locations across North America. Within days, the HarperDB cache layer held over 85 million keys in memory while delivering a P50 lookup time of 0.36 ms and a P95 of 1.3 ms. Most importantly, the retailer saw a 50 ms average improvement in time to LCP, representing a notable leap for their already impressive infrastructure.

Further, the retailer saw a 30% decrease in origin requests, with over 550 million page loads accelerated by HarperDB, helping to increase revenue while reducing origin server load even during the busiest time of year. 

What’s Next

Impressed by the rapid time to value and high performance delivered by HarperDB, the retailer is exploring broader implementations of HarperDB to help streamline their systems, reduce costs, and improve performance.  

