

A cache is a very fast memory with relatively small storage capacity that is arranged as a buffer between the central processing unit (CPU) and main memory. Its purpose is to speed up access to frequently used program parts and data. Storing and reading are fully automatic: the access frequency of the individual memory areas is monitored, and the least frequently used areas are overwritten first. The cache is managed by a special program or directly by the operating system.

Static RAMs (SRAM) are usually used in cache memories because their access times are shorter than those of dynamic RAMs (DRAM).

Hierarchies of cache memories

Cache memories are divided into hierarchies and designated with levels depending on their position in the data stream. For example, a level-0 cache is a small memory of a few bytes that decouples the data stream within the central processing unit. A level 1 cache (L1), or first-level cache, is also an internal cache, as is the level 2 cache (L2). The level 3 cache (L3) is a separate memory on the motherboard.
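The fall-through from one cache level to the next can be sketched in a few lines. This is a minimal, purely illustrative model: the class names, capacities, and latencies are assumptions, not real hardware figures.

```python
# Minimal sketch of a multi-level cache lookup (hypothetical sizes and
# latencies, for illustration only): each level is checked in turn, and
# a miss falls through to the next, slower level.

class CacheLevel:
    def __init__(self, name, capacity, latency_ns):
        self.name = name
        self.capacity = capacity      # number of cache lines (unused here)
        self.latency_ns = latency_ns  # assumed access latency
        self.lines = {}               # address -> data

    def lookup(self, address):
        return self.lines.get(address)

def read(hierarchy, memory, address):
    """Search L1, L2, L3 in order; fall back to main memory on a full miss."""
    total_latency = 0
    for level in hierarchy:
        total_latency += level.latency_ns
        data = level.lookup(address)
        if data is not None:
            return data, total_latency
    # Full miss: fetch from main memory and fill every cache level.
    data = memory[address]
    total_latency += 100  # assumed DRAM latency
    for level in hierarchy:
        level.lines[address] = data
    return data, total_latency

hierarchy = [CacheLevel("L1", 64, 1), CacheLevel("L2", 512, 4), CacheLevel("L3", 4096, 12)]
memory = {0x1000: "data"}
_, cold = read(hierarchy, memory, 0x1000)   # misses in every level
_, warm = read(hierarchy, memory, 0x1000)   # hits in L1
```

After the first access fills the caches, the second access is served by the fastest level, which is exactly the decoupling effect the hierarchy is built for.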

Embedding the caches

In addition to memory caches, there are also caches for floppy, CD, and DVD drives. These caches do not use SRAMs but conventional memory technologies in which the data is temporarily stored.

The mode of operation of cache memories

According to their internal mode of operation, cache memories are divided into asynchronous caches (A cache), synchronous caches (S cache), burst caches (B cache), and pipeline burst caches (PB cache). Asynchronous caches operate asynchronously to the CPU clock and are organized in memory banks like DRAMs. Synchronous caches are SRAMs clocked by the CPU clock signal; they are faster than A caches and have access times of less than 20 ns. After the start address has been transferred, B caches can generate the following addresses themselves, which eliminates overhead and increases the data rate. PBSRAMs (pipelined burst SRAMs) are used as PB caches and are characterized by a particularly short read cycle.

Cache algorithms for memory management

Cache algorithms determine the internal organization of newly read and read-out memory blocks, i.e. how older data blocks are replaced by newer ones. There are several displacement strategies, such as Least Frequently Used (LFU), Least Recently Used (LRU), Most Recently Used (MRU), Not Recently Used (NRU), and Adaptive Replacement Cache (ARC).
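One of these strategies, Least Recently Used, can be sketched with an ordered dictionary: when the cache overflows, the block that has gone unused for the longest time is displaced. The class and its tiny capacity are illustrative assumptions.

```python
from collections import OrderedDict

# Minimal sketch of the Least Recently Used (LRU) displacement strategy:
# on overflow, the block not accessed for the longest time is evicted.
class LRUCache:
    def __init__(self, capacity):
        self.capacity = capacity
        self.blocks = OrderedDict()   # least recently used entry first

    def get(self, key):
        if key not in self.blocks:
            return None               # miss
        self.blocks.move_to_end(key)  # mark as most recently used
        return self.blocks[key]

    def put(self, key, value):
        if key in self.blocks:
            self.blocks.move_to_end(key)
        self.blocks[key] = value
        if len(self.blocks) > self.capacity:
            self.blocks.popitem(last=False)  # evict least recently used

cache = LRUCache(2)
cache.put("a", 1)
cache.put("b", 2)
cache.get("a")        # "a" is now the most recently used block
cache.put("c", 3)     # displaces "b", the least recently used block
```

LFU would instead track an access counter per block and evict the block with the lowest count; the bookkeeping differs, but the displacement idea is the same.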

An important parameter of cache memories is the hit rate, which indicates how often requested data is found in the cache. Cache memories can be part of the microprocessor architecture and have a memory capacity of 8 KB or 16 KB, for example, as in the Pentium processor.
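The hit rate is simply the fraction of accesses served from the cache. A minimal sketch, with a made-up access pattern and no particular displacement policy:

```python
# Minimal sketch of measuring a cache's hit rate: the ratio of accesses
# served from the cache to all accesses. The capacity and the access
# pattern below are illustrative assumptions.
def hit_rate(accesses, capacity):
    cache = set()
    hits = 0
    for address in accesses:
        if address in cache:
            hits += 1
        else:
            if len(cache) >= capacity:
                cache.pop()   # evict an arbitrary block (policy not modeled)
            cache.add(address)
    return hits / len(accesses)

# A loop that keeps reusing the same few addresses yields a high hit rate:
# only the first access to each address misses.
rate = hit_rate([0, 1, 0, 1, 0, 1, 0, 1], capacity=4)
```

An access pattern with good locality, like the loop above, is exactly what caches exploit: after two cold misses, every further access is a hit.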

English: cache
Updated at: 12.01.2017

All rights reserved DATACOM Buchverlag GmbH © 2022