Data in primary memory can be accessed faster than in secondary memory, but primary-memory access times are still typically a few microseconds, whereas the CPU performs operations in nanoseconds. This time lag between requesting data and acting on it degrades system performance: the CPU is under-utilized and may sit idle while it waits. To narrow this gap, a new layer of memory was introduced, known as Cache Memory.
Cache memory is based on the principle of locality of reference: the observation that a program tends to access a relatively small portion of its address space at any given time, and to access some portions of memory repeatedly. For example, in your college's fees department, recent transactions are accessed frequently to check on dues.
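The two forms of locality behind this principle can be sketched in a few lines of code; the variable names and sizes below are purely illustrative:

```python
# Minimal sketch of locality of reference (all names/sizes are illustrative).
data = list(range(1000))  # stand-in for a large address space

# Temporal locality: the same location is accessed repeatedly over time.
total = 0
for _ in range(5):
    total += data[0]          # data[0] is reused on every iteration

# Spatial locality: neighbouring locations are accessed in sequence.
window_sum = sum(data[0:8])   # a small contiguous region of the array

print(total, window_sum)
```

A hardware cache exploits exactly these patterns: recently used data (temporal locality) and its neighbours (spatial locality, via cache lines) are kept close to the CPU.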
Key Features of Cache Memory
- Speed: Faster than the main memory (RAM), which helps the CPU retrieve data more quickly.
- Proximity: Located very close to the CPU, often on the CPU chip itself, reducing data access time.
- Function: Temporarily holds data and instructions that the CPU is likely to use again soon, minimizing the need to access the slower main memory.
Role of Cache Memory
The role of cache memory is explained below.
- Cache memory plays a crucial role in computer systems.
- It provides faster access to frequently used data.
- It acts as a buffer between the CPU and main memory (RAM).
- Its primary role is to reduce the average time taken to access data, thereby improving overall system performance.
Benefits of Cache Memory
The main benefits of cache memory are:
- Faster access: Cache is faster than main memory because it resides closer to the CPU, typically on the same chip or in close proximity, and stores a small subset of the data and instructions currently in use.
- Reducing memory latency: Memory access latency is the time taken for the processor to retrieve data from memory. Caches are designed to exploit the principle of locality, so most accesses are served quickly from the cache rather than incurring the full latency of main memory.
- Lowering bus traffic: Accessing data in main memory involves transferring it over the system bus, a shared resource where excessive traffic leads to congestion and slower transfers. By serving requests from the cache, the processor accesses main memory less often, reducing bus traffic and improving system efficiency.
- Increasing effective CPU utilization: With a cache, the CPU spends more time executing instructions and less time waiting on memory accesses, so it operates at a higher effective speed and delivers better overall performance.
- Enhancing system scalability: By reducing the impact of memory latency on overall performance, cache memory helps systems scale.
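The latency benefit above is commonly quantified with the standard average memory access time (AMAT) formula: AMAT = hit time + miss rate × miss penalty. A small sketch, using illustrative timings not taken from this article:

```python
def amat(hit_time_ns, miss_rate, miss_penalty_ns):
    """Average memory access time: hit time plus the expected miss cost."""
    return hit_time_ns + miss_rate * miss_penalty_ns

# Illustrative figures: a 1 ns cache hit, a 5% miss rate,
# and a 100 ns penalty to fetch from main memory on a miss.
print(amat(1, 0.05, 100))   # 6.0 ns on average, versus ~100 ns with no cache
```

Even a modest hit rate cuts the average access time dramatically, which is why the CPU spends far less time stalled.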
Working of Cache Memory
To understand how a cache works, keep two points in mind:
- Cache memory is fast: it can be accessed much more quickly than main memory.
- Cache memory is small: only a limited amount of data can be stored in it.
Whenever the CPU needs data, it first searches the cache (a fast operation). If the data is found (a cache hit), the CPU processes it according to its instructions. If the data is not found (a cache miss), the CPU fetches it from primary memory (a slower operation) and loads it into the cache. This way, frequently accessed data tends to be found in the cache, minimizing the average time required to access it.