Date: Around 1999
A question regarding Cache memory.
Can you please explain to me how data structures are used in cache memory?
i.e. how they are stored, and how they are processed.
The kinds of data structures I'm interested in are Stacks, Queues, and Lists.
Cache memory doesn't know anything about data structures at all, so the
short, not very useful answer to your question is that data structures are
treated by cache memory the same as everything else.
So, perhaps some slightly different but related questions might be more useful:
1. What is cache memory and what is its purpose?
2. How does cache memory work?
3. What does this mean for different types of data structures?
1. First of all, cache memory is very fast memory that the processor can
access more quickly than main memory, or RAM. Currently, processors can
operate at speeds much greater than those at which affordable, reliable
memory can supply the necessary data.
Cache memory is much like other memory, except that it can operate much
faster and is more expensive.
Cache memory attempts to bridge the gap between fast, expensive memory that
can only be made in limited quantities, and the large amounts of RAM needed
for modern applications. By giving the processor a small amount of fast
memory to hold the data it is currently using, and then having that memory
read in from and write back to main memory in "spare" time, the processor
can operate at full speed much of the time.
2. The details of how cache memory works vary among different memory systems
and processors, so I won't describe the exact details. In general, though, a
cache works by attempting to predict which memory the processor is going to
need next, loading that memory before the processor needs it, and saving the
results after the processor is done with it.
Whenever the byte at a given memory address needs to be read, the processor
attempts to get the data from the cache memory. If the cache doesn't have
that data, the processor is halted while it is loaded from main memory into
the cache. At that time, the memory around the required data is also loaded
into the cache.
When data is loaded from main memory into the cache, it will usually have to
replace data that is already in the cache. When this happens, the cache
determines whether the memory that is about to be replaced has changed. If
it has, the cache first saves the changes to main memory and then loads the
new data.
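The behavior described above can be sketched with a toy direct-mapped,
write-back cache model. Everything here (the line size, the number of lines,
the replacement policy) is an illustrative assumption, not a description of
any particular processor:

```python
# Toy direct-mapped, write-back cache model (illustrative assumptions only).
LINE_SIZE = 16   # bytes per cache line (assumed; real sizes vary)
NUM_LINES = 4    # tiny cache, so evictions happen quickly (assumed)

class ToyCache:
    def __init__(self, main_memory):
        self.mem = main_memory            # backing "RAM" (a bytearray)
        # Each slot holds (tag, line data, dirty flag), or None when empty.
        self.lines = [None] * NUM_LINES
        self.hits = self.misses = self.writebacks = 0

    def _locate(self, addr):
        line_addr = addr // LINE_SIZE     # which memory line addr falls in
        return line_addr % NUM_LINES, line_addr   # (slot, tag)

    def read(self, addr):
        slot, tag = self._locate(addr)
        entry = self.lines[slot]
        if entry is None or entry[0] != tag:
            self.misses += 1              # miss: fetch the whole line, so
            self._fill(slot, tag)         # nearby bytes come in too
        else:
            self.hits += 1
        return self.lines[slot][1][addr % LINE_SIZE]

    def write(self, addr, value):
        slot, tag = self._locate(addr)
        entry = self.lines[slot]
        if entry is None or entry[0] != tag:
            self.misses += 1
            self._fill(slot, tag)
        else:
            self.hits += 1
        self.lines[slot][1][addr % LINE_SIZE] = value
        self.lines[slot] = (tag, self.lines[slot][1], True)  # mark changed

    def _fill(self, slot, tag):
        old = self.lines[slot]
        if old is not None and old[2]:    # evicting changed data: save the
            self.writebacks += 1          # changes to main memory first
            start = old[0] * LINE_SIZE
            self.mem[start:start + LINE_SIZE] = old[1]
        start = tag * LINE_SIZE
        self.lines[slot] = (tag, bytearray(self.mem[start:start + LINE_SIZE]), False)

cache = ToyCache(bytearray(range(256)) * 4)
cache.read(0)   # miss: loads bytes 0-15 into the cache
cache.read(1)   # hit: byte 1 came in along with byte 0
```

Note how the second read is a hit even though only address 0 was asked for
earlier: loading the memory around the required data is what makes nearby
accesses cheap.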
The cache system doesn't worry about data structures at all, but only about
whether a given address in main memory is in the cache or not. In fact, if
you are familiar with virtual memory, where the hard drive is used to make
it appear that a computer has more RAM than it really does, cache memory is
similar.
3. So, cache memory doesn't know anything about data structures, and instead
only knows whether a given memory address is present in the cache or not.
For most programs this doesn't matter at all. Programmers don't have to do
anything, everything works just fine, and the actual details of how the
cache works don't matter much.
However, if a given process is absolutely time critical, knowing a little
about the cache can help you improve the performance of the program. Most of
the time, however, this is wasted effort, since modern operating systems are
multi-tasking, and any time the OS takes over, the cache is pretty much
blown away.
But, in general, to optimize for the cache:
1. Keep loops small; don't let them take up large amounts of memory for
   their code, so that all the code for the loop will fit into the cache.
   Generally the amount that will fit into the cache is small, around 2-4K.
2. Try not to call functions that are outside the current area of code.
3. Try to access memory in the same areas. For example, classes whose fields
   are accessed together are good. Arrays that are small and accessed
   sequentially are also good.
This implies some types of data structures aren't great for caches:
1. Large arrays that are iterated from start to finish are bad. If you go
   through the entire array several times and the array doesn't fit in the
   cache, this is not good. It would be better to process portions of the
   array completely before moving on to the next portion.
2. Linked lists where only one piece of data in each node is accessed on
   each pass are not good. It would be better to completely process each
   node, and be completely done with a node, before moving on.
3. Stacks are generally fairly good, as they access the most recently loaded
   data.
4. Queues can be really bad if they are a first-in-first-out type of queue.
   This is because many cache systems, when they have to discard memory,
   discard the oldest memory, and this is exactly the memory the queue needs
   to access next, since the first-in node is the oldest.
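Point 1 above is usually called blocking (or tiling): instead of making
several full passes over a large array, make all the passes over one
cache-sized portion before moving to the next. A minimal sketch, where the
chunk size and the two "passes" are made-up examples:

```python
# Blocked processing: apply every pass to one small chunk before moving on,
# so the chunk can stay in cache across all the passes. CHUNK is an assumed
# stand-in for a cache-sized portion of the array.
CHUNK = 4

def process_blocked(data, passes):
    """Apply each function in `passes` to one chunk at a time, in place."""
    for start in range(0, len(data), CHUNK):
        chunk = data[start:start + CHUNK]   # this portion stays "hot"
        for f in passes:                    # all passes while it is cached
            chunk = [f(x) for x in chunk]
        data[start:start + CHUNK] = chunk   # done with this portion for good

values = list(range(8))
process_blocked(values, [lambda x: x + 1, lambda x: x * 2])
print(values)   # [2, 4, 6, 8, 10, 12, 14, 16]
```

The result is the same as running each pass over the whole array, but each
portion of the array is loaded into the cache once instead of once per pass.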
But, as I said, it is generally good to know how the cache works and be
aware of it. It is really bad to spend extra time programming for the cache
unless you are sure that optimizing for the cache is the best way to get
your results. Personally, I have been writing time-critical code for seven
years, and have found that a better algorithm will gain you much more than
tweaking for the cache.
I hope this helps.
Update: June 2012