A memory cache is a small but fast memory where data


To speed up memory access, caching is typically used. A memory cache is a small but fast memory where recently accessed data is kept in anticipation of future references.

When an access is made, if the data is in the cache, it is returned quickly; this is called a cache hit. Otherwise, main memory is accessed, and the access is said to be a cache miss.

For the purposes of this problem, assume that the latency of main memory is ten times the latency of the cache (i.e., if an access to the cache takes one unit of time, then an access to main memory takes ten units of time).

Now consider two possible optimizations for a memory system. The first would cut the latency of main memory by 50%, whereas the second would cut the latency of the cache by 20%. [Use Amdahl's law.]
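As a reminder, Amdahl's law relates the overall speedup to the fraction of time an enhancement actually affects. A sketch, using f for the fraction of the baseline average access time spent in the optimized component and s for the factor by which that component is sped up:

```latex
\[
\text{Speedup} = \frac{1}{(1 - f) + \dfrac{f}{s}}
\]
```

Here s = 2 for the main-memory optimization (latency halved) and s = 1/0.8 = 1.25 for the cache optimization (latency cut by 20%); f must be computed from the hit rate and the relative latencies, since it is a fraction of time, not a fraction of accesses.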

a) If the cache hit rate is 95%, what speedup is achieved under each of the two optimizations under consideration (separately)?

b) Under what condition on the cache hit rate would you select each of the two optimizations under consideration (separately)?

c) What speedup is achieved if both optimizations are adopted? Your answer should be a function of the hit rate, which you should take as a variable h.
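The arithmetic behind all three parts can be sketched in a few lines. This is a minimal model, assuming a hit costs exactly one cache access (1 time unit), a miss costs exactly one main-memory access (10 units), and the two never overlap; the function names `amat` and `speedup` are illustrative, not from the problem statement.

```python
def amat(h, t_cache=1.0, t_mem=10.0):
    """Average memory access time for hit rate h: hits pay the cache
    latency, misses pay the main-memory latency."""
    return h * t_cache + (1 - h) * t_mem

def speedup(h, t_cache=1.0, t_mem=10.0):
    """Speedup of a system with the given latencies over the
    baseline (cache = 1, memory = 10) system at the same hit rate."""
    return amat(h) / amat(h, t_cache, t_mem)

h = 0.95
s1 = speedup(h, t_mem=5.0)                    # optimization 1: memory latency -50%
s2 = speedup(h, t_cache=0.8)                  # optimization 2: cache latency -20%
s_both = speedup(h, t_cache=0.8, t_mem=5.0)   # part (c) with h = 0.95
print(round(s1, 3), round(s2, 3), round(s_both, 3))  # → 1.208 1.151 1.436
```

For part (b), comparing the two speedup expressions symbolically shows which optimization wins as h varies: a high hit rate shifts most of the access time into the cache, favoring the cache optimization, while a low hit rate favors the memory optimization.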
