What is cache memory?
Cache memory is a small, fast, volatile computer memory that provides high-speed data access to a processor and stores frequently used programs, applications and data. Cache memory, also called CPU memory, is placed between random access memory (RAM) and the computer's microprocessor. It can be accessed by the microprocessor more quickly than regular RAM.
In the 1980s, the idea took hold that a small amount of more expensive, faster SRAM could be used to improve the performance of the less expensive, slower main memory. Initially, the memory cache was separate from the system processor and not always included in the chipset. Early PCs typically had from 16 KB to 128 KB of cache memory.
It is designed to speed up the transfer of data and instructions. When the CPU uses data or instructions for the first time, they are retrieved from RAM and a copy is stored in the cache. The next time the CPU needs that data or those instructions, it looks in the cache first. If the required data is found there, it is retrieved from cache memory instead of main memory, which speeds up the work of the CPU.
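The lookup flow above can be sketched in a few lines of Python. This is an illustrative stand-in, not a hardware model: the `main_memory` and `cache` dictionaries and the `read` function are hypothetical names invented for the example.

```python
main_memory = {addr: addr * 10 for addr in range(100)}  # stand-in for RAM contents
cache = {}  # address -> cached copy of the data

def read(addr):
    """Return the data at addr, filling the cache on a miss."""
    if addr in cache:           # cache hit: fast path
        return cache[addr], "hit"
    data = main_memory[addr]    # cache miss: fetch from slower main memory
    cache[addr] = data          # keep a copy for next time
    return data, "miss"

print(read(5))   # first access is a miss, served from main memory
print(read(5))   # second access is a hit, served from the cache
```

The first read of an address pays the main-memory cost; every later read of the same address is served from the cache, which is exactly the speed-up the paragraph describes.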
The purpose of cache memory is to store program instructions and data that are used repeatedly in the operation of programs or information that the CPU is likely to need next. The computer processor can access this information quickly from the cache rather than having to get it from the computer's main memory. Fast access to these instructions increases the overall speed of the program.
A computer can have several different levels of cache memory. The level numbers refer to distance from the CPU, where Level 1 is the closest. All levels of cache memory are faster than RAM. The cache closest to the CPU is always fastest, but it generally costs more and stores less data than the other levels of cache.
The cache memory works according to various algorithms, which decide what information it has to store. These algorithms work out the probability to decide which data would be most frequently needed. This probability is worked out on the basis of past observations.
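One widely used replacement algorithm that decides what to keep based on past accesses is least-recently-used (LRU): when the cache is full, the entry that has gone longest without being used is evicted. A minimal sketch, using an illustrative capacity of 3 (the `LRUCache` class here is a hypothetical example, not a real hardware policy implementation):

```python
from collections import OrderedDict

class LRUCache:
    """Tiny least-recently-used cache: when full, evict the entry
    that has gone longest without being accessed."""
    def __init__(self, capacity):
        self.capacity = capacity
        self.data = OrderedDict()   # least recently used entry first

    def get(self, key):
        if key not in self.data:
            return None             # miss
        self.data.move_to_end(key)  # mark as most recently used
        return self.data[key]

    def put(self, key, value):
        if key in self.data:
            self.data.move_to_end(key)
        self.data[key] = value
        if len(self.data) > self.capacity:
            self.data.popitem(last=False)  # evict the least recently used entry

lru = LRUCache(3)
for k in "abc":
    lru.put(k, k.upper())
lru.get("a")          # touching "a" makes it recently used
lru.put("d", "D")     # cache is full, so "b" (least recently used) is evicted
print(lru.get("b"))   # None: "b" was evicted
```

LRU is a simple approximation of the probability idea in the paragraph above: data used recently is assumed likely to be used again soon.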
In addition to hardware-based cache, cache memory also can be a disk cache, where a reserved portion on a disk stores and provides access to frequently accessed data from the disk.
Cache memory generally tends to operate in a number of different configurations: direct mapping, fully associative mapping and set associative mapping.
Direct mapping features blocks of memory mapped to specific locations within the cache, while fully associative mapping lets any cache location be used to map a block, rather than requiring the location to be pre-set. Set associative mapping acts as a halfway house between the two: every block is mapped to a smaller subset of locations within the cache.
Types of Cache
Primary Cache (L1) - A primary cache is always located on the processor chip. This cache is small and its access time is comparable to that of processor registers.
Secondary Cache (L2) - Secondary cache is placed between the primary cache and the rest of the memory. It is referred to as the level 2 (L2) cache. Often, the Level 2 cache is also housed on the processor chip.
Level 3 Cache (L3) - The L3 cache is larger but slower than L1 and L2; its size typically ranges from 1 MB to 8 MB. In multicore processors, each core may have its own L1 and L2 caches, but all cores share a common L3 cache. The L3 cache is still roughly double the speed of RAM.
Graphics processing chips often have a cache memory separate from the CPU's, which ensures that the GPU can still speedily complete complex rendering operations without relying on the relatively high-latency system RAM.