Least frequently used
Least Frequently Used (LFU) is a type of cache algorithm used to manage memory within a computer. The standard characteristics of this method involve the system keeping track of the number of times a block is referenced in memory. When the cache is full and requires more room, the system will purge the item with the lowest reference frequency.
LFU is sometimes combined with a Least Recently Used algorithm and called LRFU.[1]
Implementation
The simplest method to employ an LFU algorithm is to assign a counter to every block that is loaded into the cache. Each time a reference is made to that block, the counter is increased by one. When the cache reaches capacity and has a new block waiting to be inserted, the system searches for the block with the lowest counter and removes it from the cache. In case of a tie (i.e., two or more keys with the same frequency), the least recently used key is invalidated.[2] Two variants are distinguished (a code sketch follows the list below):
- Ideal LFU: there is a counter for each item in the catalogue
- Practical LFU: there is a counter for the items stored in cache. The counter is forgotten if the item is evicted.
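The counter-and-eviction scheme described above can be illustrated with a short sketch. The following Python class is an illustrative assumption rather than part of the article or any particular system: the name LFUCache, its methods and the use of an OrderedDict are chosen only to show practical LFU with LRU tie-breaking.

```python
from collections import OrderedDict

class LFUCache:
    """Practical LFU sketch: counters exist only for items currently cached.
    Ties on frequency are broken by evicting the least recently used key."""

    def __init__(self, capacity):
        self.capacity = capacity
        # key -> (frequency, value); iteration order tracks recency of use,
        # with the most recently used key at the end
        self.entries = OrderedDict()

    def get(self, key):
        if key not in self.entries:
            return None
        freq, value = self.entries.pop(key)
        # re-insert so the key becomes most recently used, and bump its counter
        self.entries[key] = (freq + 1, value)
        return value

    def put(self, key, value):
        if key in self.entries:
            freq, _ = self.entries.pop(key)
            self.entries[key] = (freq + 1, value)
            return
        if len(self.entries) >= self.capacity:
            # Evict the key with the lowest counter. min() keeps the first
            # minimum it meets while scanning in recency order, so a tie is
            # resolved against the least recently used key.
            victim = min(self.entries, key=lambda k: self.entries[k][0])
            del self.entries[victim]
        self.entries[key] = (1, value)
```

The eviction scan in put is linear in the number of cached items; the paper listed under External links describes a frequency-list structure that performs all LFU operations in O(1) time.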
Problems
While the LFU method may seem like an intuitive approach to memory management, it is not without faults. Consider an item in memory which is referenced repeatedly for a short period of time and is then not accessed again for an extended period. Because it was accessed so rapidly, its counter has increased drastically even though it will not be used again for a considerable time. This leaves other blocks, which may actually be used more frequently, susceptible to purging simply because their references arrive in a different access pattern.[3]
Moreover, new items that have just entered the cache are subject to being removed again very soon, because they start with a low counter even though they might be used very frequently afterwards. Because of major issues like these, an explicit LFU system is fairly uncommon; instead, there are hybrids that utilize LFU concepts.[4]
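Both effects can be reproduced with the LFUCache sketch from the Implementation section (again a hypothetical example, with illustrative key names and an assumed capacity of two):

```python
cache = LFUCache(2)

# A short burst of references inflates the counter for "burst".
cache.put("burst", "old data")
for _ in range(100):
    cache.get("burst")            # frequency of "burst" climbs to 101

# "burst" is never referenced again, yet it is hard to dislodge:
# every newcomer starts with a counter of 1 and loses the comparison.
cache.put("new1", "fresh data")   # fills the second slot
cache.put("new2", "fresh data")   # evicts "new1" (counter 1), not "burst"

print(cache.get("new1"))   # None: the freshly added item was purged first
print(cache.get("burst"))  # "old data": the stale burst item is still cached
```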
References
1. Donghee Lee; Jongmoo Choi; Jong-Hun Kim; S. H. Noh; Sang Lyul Min; Yookun Cho; Chong Sang Kim. "LRFU: a spectrum of policies that subsumes the least recently used and least frequently used policies". IEEE Transactions on Computers.
2. Silvano Maffeis. "Cache Management Algorithms for Flexible Filesystems". ACM SIGMETRICS Performance Evaluation Review, Vol. 21, No. 3.
3. William Stallings. Operating Systems: Internals and Design Principles, 7th edition. 2012.
4. B. T. Zivkov and A. J. Smith. "Disk Caching in Large Database and Timeshared Systems". IEEE MASCOTS, 1997.
External links
- An O(1) algorithm for implementing the LFU cache eviction scheme, 16 August 2010, by Ketan Shah, Anirban Mitra and Dhruv Matani