In the age of the internet, people are used to being able to surf the web almost instantaneously, but that is not always the case. As processor speeds have increased, the gap between the speed of the central processing unit (CPU) and the speed of main memory has widened, and performance suffers as a result. To combat this issue, cache memory, also known as CPU memory or CPU cache, was created. Cache memory stores frequently used data and instructions so the CPU can access them more quickly than it could from main memory. It is the fastest type of memory in a computer, but it has a much smaller capacity than other types of memory.
Most modern server central processing units have three independent caches: an instruction cache that speeds up executable instruction fetches; a data cache that speeds up data fetches and stores; and a translation lookaside buffer (TLB) that speeds up virtual-to-physical address translation for both executable instructions and data. Strictly speaking, the TLB is not one of the CPU caches; it is part of the memory management unit (MMU).
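The TLB's role can be pictured with a toy model: a small map from virtual page numbers to physical frame numbers that lets the MMU skip a slow page-table walk on a hit. The page size, page-table contents, and function name below are invented for illustration and do not reflect any real MMU's interface:

```python
PAGE_SIZE = 4096  # assumed 4 KiB pages

# Hypothetical page table: virtual page number -> physical frame number
page_table = {0: 7, 1: 3, 2: 9}

tlb = {}  # starts empty; translations are cached here on demand

def translate(virtual_addr):
    """Translate a virtual address, using the TLB to avoid page-table walks."""
    vpn, offset = divmod(virtual_addr, PAGE_SIZE)
    if vpn in tlb:                 # TLB hit: fast path
        frame = tlb[vpn]
    else:                          # TLB miss: slow page-table walk
        frame = page_table[vpn]
        tlb[vpn] = frame           # remember the translation for next time
    return frame * PAGE_SIZE + offset

print(translate(4100))  # vpn 1, offset 4 -> frame 3 -> 3*4096 + 4 = 12292
print(translate(4200))  # same page, so this lookup is now a TLB hit -> 12392
```

The second access to the same page hits in the TLB and skips the table walk entirely, which is exactly the speedup the hardware TLB provides.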
Computer cache memory is typically divided into three levels. Level 1 (L1) cache, or primary cache, is the smallest and fastest and is the first one the CPU searches. If the data is not found in L1, the Level 2 (L2) cache is searched next. L2 cache, or secondary cache, has more space than L1 but is slower. Level 3 (L3) cache is larger and slower still, and is often shared among all the CPU's cores, but it is still considerably faster than main memory (RAM).
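This search order can be sketched as a toy model in which each level is checked in turn and every miss falls through to a slower level. The cycle counts and cached addresses below are rough illustrative assumptions, not measurements from any particular CPU:

```python
# Illustrative latencies in CPU cycles (assumed, order-of-magnitude only)
LATENCY = {"L1": 4, "L2": 12, "L3": 40, "RAM": 200}

# Toy contents of each level: which addresses currently reside there
levels = {
    "L1": {0x10},
    "L2": {0x10, 0x20},
    "L3": {0x10, 0x20, 0x30},
}

def access_cost(addr):
    """Return (level where found, total cycles), searching L1 -> L2 -> L3 -> RAM."""
    cycles = 0
    for name in ("L1", "L2", "L3"):
        cycles += LATENCY[name]       # pay the cost of checking this level
        if addr in levels[name]:
            return name, cycles
    return "RAM", cycles + LATENCY["RAM"]

print(access_cost(0x10))  # ('L1', 4)    found immediately
print(access_cost(0x30))  # ('L3', 56)   missed L1 and L2 first: 4 + 12 + 40
print(access_cost(0x40))  # ('RAM', 256) missed every cache level
```

The model makes the key point visible: the cost of a lookup grows with every level that misses, which is why keeping hot data in L1 matters so much.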
If the cache already holds the data the CPU needs, the access is called a cache hit. If the data is not found in the cache, it is called a cache miss; the CPU must then access main memory, which takes much longer. There are three types of cache misses: instruction read misses, data read misses, and data write misses.
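Hits and misses can be made concrete with a minimal direct-mapped cache simulator. The line size, cache size, and access trace below are invented for the example; real caches are larger and usually set-associative:

```python
LINE_SIZE = 64   # bytes per cache line (assumed)
NUM_LINES = 8    # a tiny 512-byte direct-mapped cache (assumed)

# Each slot holds the tag of the line currently stored there (or None)
slots = [None] * NUM_LINES

def access(addr):
    """Return 'hit' or 'miss' for a read of byte address addr."""
    line_number = addr // LINE_SIZE
    index = line_number % NUM_LINES   # which slot this line maps to
    tag = line_number // NUM_LINES    # distinguishes lines sharing a slot
    if slots[index] == tag:
        return "hit"
    slots[index] = tag                # miss: load the line from memory
    return "miss"

trace = [0, 8, 64, 0, 512, 0]         # byte addresses accessed in order
print([access(a) for a in trace])
# -> ['miss', 'hit', 'miss', 'hit', 'miss', 'miss']
```

The first touch of each line misses; address 8 hits because it shares line 0 with address 0; and address 512 maps to the same slot as address 0, evicting it, so the final access to 0 misses again. That last case is a conflict miss, one of the reasons a miss can occur even when the cache is not full.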
In addition to cache memory, there are other ways to increase the memory available to a computer or system: for example, adding RAM and/or ROM, using one-time programmable (OTP) memory, or attaching external memory to the system.