Edge Caching

Edge caching is a mechanism content delivery networks (CDNs) use to cache Internet content in different locations around the world. Examples include website data, cloud storage, and streaming media. By storing copies of files in multiple "edge" locations, a CDN can deliver content to users more quickly than a single server can.

CDNs may have tens or hundreds of global data centers. Each data center contains edge servers that intelligently serve data to nearby users. In most cases, edge servers "pull" data from an origin server when users request content for the first time. Once an edge server pulls an image, video, or other object, it caches the file, typically for a few days or weeks. The CDN then serves subsequent requests from the edge server rather than the origin.
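
This pull-and-cache behavior can be illustrated with a short sketch. The Python below is a simplified assumption, not any particular CDN's implementation: origin_fetch is a hypothetical stand-in for a request to the origin server, and the one-week TTL is an arbitrary choice.

```python
import time

CACHE_TTL_SECONDS = 7 * 24 * 60 * 60   # keep cached objects for about a week

_edge_cache = {}                        # path -> (content, expiry timestamp)

def origin_fetch(path):
    """Hypothetical stand-in for a request to the origin server."""
    return f"<contents of {path} from origin>"

def edge_get(path):
    """Serve a cached copy if one exists and has not expired;
    otherwise pull the object from the origin and cache it."""
    now = time.time()
    entry = _edge_cache.get(path)
    if entry and entry[1] > now:        # cache hit: serve from the edge
        return entry[0]
    content = origin_fetch(path)        # cache miss: pull from the origin
    _edge_cache[path] = (content, now + CACHE_TTL_SECONDS)
    return content
```

The first request for a given path reaches the origin; later requests within the TTL are answered directly from the edge cache.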

Tiered Caching

A CDN may automatically propagate newly pulled content to all servers or wait for each server to request the data. Automatic propagation reduces trips to the origin server but may result in unnecessary duplication of rarely-accessed files. Waiting for local requests avoids that duplication but increases trips to the origin server. Modern CDNs use tiered caching to balance the two methods. The first time a user accesses a file, the CDN caches it on the local edge server and on several "primary data centers." Other edge servers that later need the file can pull it from a primary data center instead of the origin, which reduces unnecessary propagation of cached files while still limiting trips to the origin server.
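
A rough sketch of that lookup order, continuing the same assumptions (a hypothetical origin_fetch and plain dictionaries standing in for caches): each edge server checks its own cache, then a shared primary data center, and only contacts the origin if both miss.

```python
def origin_fetch(path):
    """Hypothetical stand-in for a request to the origin server."""
    return f"<contents of {path} from origin>"

primary_cache = {}   # shared "primary data center" tier

def tiered_get(edge_cache, path):
    if path in edge_cache:                 # 1. local edge cache
        return edge_cache[path]
    if path in primary_cache:              # 2. primary data center tier
        content = primary_cache[path]
    else:                                  # 3. origin server, as a last resort
        content = origin_fetch(path)
        primary_cache[path] = content      # populate the upper tier
    edge_cache[path] = content             # populate the local edge cache
    return content
```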

CDNs provide customizable cache settings, such as how frequently to check for updated files or when to let cached files expire. Most have a "purge" feature, which allows webmasters to remove old content from all edge servers at once. Purging files is useful when updating static assets, such as CSS documents and image files.
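
Conceptually, a purge removes a path from every edge cache so the next request pulls the updated file from the origin. The sketch below is only an illustration; real CDNs expose purging through their own APIs or dashboards.

```python
def purge(edge_caches, path):
    """Remove a stale object from every edge cache at once so the
    next request pulls the updated version from the origin."""
    for cache in edge_caches:
        cache.pop(path, None)   # ignore servers that never cached the file

# Example: after updating styles.css at the origin, purge the old copies.
# purge(all_edge_caches, "/assets/styles.css")
```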

Edge Caching Benefits

Edge caching reduces latency, providing faster and more consistent delivery of Internet content to users around the world. For example, a user in Sydney, Australia, may experience a two-second delay when accessing a server in Houston, Texas. If the data is cached in Australia, the delay may be less than one-tenth of a second.

While the primary purpose of edge caching is to improve content delivery speed, it also provides two other significant benefits: bandwidth reduction and redundancy.

Edge caching reduces Internet bandwidth usage by serving most requests from caches near the user, shortening the distance data needs to travel. Local edge servers reduce Internet congestion and limit traffic bottlenecks. Replicating data across multiple global data centers also provides redundancy. While CDNs typically pull data from an origin server, individual data centers can serve cached copies as backups if the origin server fails or becomes inaccessible.
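
The redundancy benefit can be pictured as a "serve stale" fallback: if the origin is unreachable, an edge server keeps answering requests from its cached copy. The ConnectionError handling and origin_fetch stub below are illustrative assumptions.

```python
def origin_fetch(path):
    """Hypothetical origin request; assumed to raise ConnectionError
    when the origin server fails or becomes unreachable."""
    raise ConnectionError("origin unavailable")

def get_with_fallback(edge_cache, path):
    try:
        content = origin_fetch(path)    # try to refresh from the origin
        edge_cache[path] = content
        return content
    except ConnectionError:
        if path in edge_cache:
            return edge_cache[path]     # serve the previously cached copy
        raise                           # nothing cached: the failure surfaces
```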

Updated April 23, 2022 by Per C.
