Introduction to Cache Memory
Cache memory is random access memory (RAM) that a computer's microprocessor can access more quickly than regular RAM. As the microprocessor processes data, it looks first in the cache; if it finds the data there from a previous read, it avoids the more time-consuming read from larger, slower memory.

Cache memory is often described in levels of closeness and accessibility to the microprocessor. An L1 cache is on the same chip as the microprocessor; for example, the PowerPC 601 processor has a 32-kilobyte level-1 cache built into its chip. An L2 cache is usually a separate static RAM (SRAM) chip, while main memory is usually dynamic RAM (DRAM).

Beyond cache memory proper, RAM itself can be thought of as a cache for hard disk storage: all of RAM's contents come from the hard disk, initially when you turn your computer on and load the operating system, and later as you start new applications and access new data. RAM can also contain a special area called a disk cache, which holds the data most recently read from the hard disk.

Characteristics of Cache Memory
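The "look in the cache first, fall back to slower memory" behavior described above can be sketched in a few lines of Java. This is only an illustrative model, not how hardware caches work: the `ReadThroughCache` class, its `slowRead` backing store, and the `data@` values are all hypothetical stand-ins for main memory, and real caches operate on fixed-size lines rather than map entries.

```java
import java.util.HashMap;
import java.util.Map;

// A minimal sketch of a read-through cache: check the fast cache first,
// and only on a miss fall back to the slow backing store.
public class ReadThroughCache {
    private final Map<Integer, String> cache = new HashMap<>();

    // Hypothetical slow backing store (stands in for main RAM or disk).
    private String slowRead(int address) {
        return "data@" + address;
    }

    public String read(int address) {
        String value = cache.get(address);
        if (value != null) {
            return value;              // cache hit: fast path
        }
        value = slowRead(address);     // cache miss: do the slow read,
        cache.put(address, value);     // then keep a copy for next time
        return value;
    }

    public static void main(String[] args) {
        ReadThroughCache c = new ReadThroughCache();
        System.out.println(c.read(7)); // first access: fetched slowly
        System.out.println(c.read(7)); // second access: served from cache
    }
}
```

The second `read(7)` returns the same value without touching the slow store, which is exactly the time saving the paragraph above describes.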
Cache memory is a component that improves performance by transparently storing data so that future requests for that data can be served faster. A cache is usually built into the CPU, or onto a chip on the board, to speed up access to frequently used instructions. The most frequently used code and data in memory are also kept in the cache, invisibly to software: when a program accesses that code or data, it comes from the high-speed cache rather than from slower main memory. The data stored in a cache may be values that were computed earlier or duplicates of original values stored elsewhere.
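The last point, caching "values that have been computed earlier", is the idea behind memoization in software. A minimal Java sketch, assuming a hypothetical `expensiveSquare` computation as the stand-in for costly work:

```java
import java.util.HashMap;
import java.util.Map;

// Caching a computed value: the first call runs the computation,
// later calls for the same input reuse the stored result.
public class ComputedValueCache {
    private static final Map<Integer, Long> memo = new HashMap<>();

    static long expensiveSquare(int n) {
        // computeIfAbsent returns the cached result when present;
        // otherwise it runs the computation once and stores the result.
        return memo.computeIfAbsent(n, k -> (long) k * k);
    }

    public static void main(String[] args) {
        System.out.println(expensiveSquare(12)); // computed: 144
        System.out.println(expensiveSquare(12)); // served from cache: 144
    }
}
```

The trade-off is the same as for hardware caches: extra memory for the stored copies in exchange for skipping repeated slow work.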
Cacheonix is a caching API that helps Java applications handle increasing volumes of critical data. Caching is a way of improving performance, concurrency, and scalability in...