What Is The Full Form Of LRU?
The full form of LRU is Least Recently Used.
LRU replaces the line in the cache that has gone unreferenced the longest. It follows the idea that recently used blocks are more likely to be referenced again. LRU produces fewer page faults than most other algorithms and is therefore the most widely used.
An LRU cache organizes items by order of use, so you can identify the ones that have gone unused the longest. A key observation behind the algorithm is that pages used heavily in the last few instructions will most probably be used again in the next few. Conversely, pages that have stayed unused for a long time will probably remain unused in the next few instructions.
Characteristics Of LRU
- The frequency of use of pages in the previous instructions defines their probable use in the next instructions. It forms the basis of LRU algorithms.
- The frequently used pages are more likely to be used again in the next few instructions and vice versa.
- When a program requests a page that is not present in RAM, a page fault occurs. If all page frames are full, the page that has gone unused for the longest is removed.
- LRU can be implemented feasibly in a 2-way set-associative mapping by including a USE bit in each line.
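As a minimal sketch of the idea above (the reference string and frame count below are made up for illustration), LRU page replacement can be simulated by keeping pages in order of use and evicting from the least-recently-used end on a fault:

```python
def lru_page_faults(references, num_frames):
    """Count page faults for LRU replacement over a page reference string."""
    frames = []          # pages in memory, least recently used at the front
    faults = 0
    for page in references:
        if page in frames:
            frames.remove(page)      # hit: page will move to the recent end
        else:
            faults += 1              # page fault
            if len(frames) == num_frames:
                frames.pop(0)        # frames full: evict least recently used
        frames.append(page)          # page is now the most recently used
    return faults

# Illustrative reference string with 3 frames.
print(lru_page_faults([7, 0, 1, 2, 0, 3, 0, 4, 2, 3], 3))  # 8 faults
```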
LRU Cache Data Structure
One can use these two data structures for implementing an LRU Cache:
Queue – This data structure is implemented using a doubly-linked list. The maximum size of the queue equals the total number of available frames (the size of the cache). Recently used pages stay near the front, while the least recently used ones are at the rear.
Hash – It maps each page number (key) to the address of its corresponding queue node (value).
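A sketch of this design in Python (the class and method names are illustrative, not from any particular library): the hash map gives O(1) lookup of a page's queue node, and the doubly-linked list lets a node be moved to the front or evicted from the rear in O(1):

```python
class Node:
    """Doubly-linked-list node holding a page number (key) and its value."""
    def __init__(self, key, value):
        self.key, self.value = key, value
        self.prev = self.next = None

class LRUCache:
    """Hash map (page number -> node) plus a doubly-linked list ordered
    from most recently used (front) to least recently used (rear)."""
    def __init__(self, capacity):
        self.capacity = capacity
        self.map = {}
        # Sentinel head and tail nodes simplify insertion and removal.
        self.head, self.tail = Node(None, None), Node(None, None)
        self.head.next, self.tail.prev = self.tail, self.head

    def _remove(self, node):
        node.prev.next, node.next.prev = node.next, node.prev

    def _push_front(self, node):
        node.next, node.prev = self.head.next, self.head
        self.head.next.prev = node
        self.head.next = node

    def get(self, key):
        if key not in self.map:
            return -1                 # miss (analogous to a page fault)
        node = self.map[key]
        self._remove(node)            # move to front: most recently used
        self._push_front(node)
        return node.value

    def put(self, key, value):
        if key in self.map:
            self._remove(self.map[key])
        elif len(self.map) == self.capacity:
            lru = self.tail.prev      # least recently used sits at the rear
            self._remove(lru)
            del self.map[lru.key]
        node = Node(key, value)
        self.map[key] = node
        self._push_front(node)
```

For example, with a capacity of 2, inserting pages 1 and 2 and then a page 3 evicts page 1 unless it was accessed in between.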
Benefits Of LRU
- LRU does not suffer from Belady’s Anomaly, unlike FIFO, because it always replaces the least recently used page.
- The total number of page faults LRU produces is much lower than that of any algorithm other than the optimal one. Since the optimal algorithm cannot be implemented in practice (it requires knowledge of future references), LRU is the most frequently used.
- The algorithm is also efficient and fast.
- LRU updates the cache quickly every time an item is accessed.
- It keeps the stored items ordered from most recently used to least recently used.
- LRU’s behavior is open to full analysis.
- When a fault occurs, LRU makes it easy to identify pages that have not been used for a long time.
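These benefits are why LRU caching ships in many standard libraries. For instance, Python exposes an LRU cache as the `functools.lru_cache` memoization decorator (the `square` function below is only for illustration):

```python
from functools import lru_cache

# maxsize caps how many distinct call results are kept; the least
# recently used result is evicted when the cache is full.
@lru_cache(maxsize=2)
def square(n):
    return n * n

square(2); square(3); square(2)   # the second square(2) is a cache hit
print(square.cache_info())        # 1 hit, 2 misses
```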
Limitations Of LRU
- The overhead is higher, since the system must keep track of which pages have been referenced.
- LRU requires the implementation of an additional data structure.
- It requires substantial hardware assistance.
- The implementation is difficult and can be lengthy.
- LRU is space-heavy: a cache that tracks n items requires a linked list of length n along with a hash map holding n items.