Miss Rate, Hit Time, Miss Penalty, Hazards, Bandwidth
Pipelined Caches
Increases bandwidth
A larger page size can allow larger caches with faster ___
Hit times
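One way to see why: in a virtually indexed, physically tagged (VIPT) L1, the index and block offset must fit within the page offset so the cache can be indexed in parallel with the TLB lookup. A minimal sketch in C, assuming an 8-way L1 and a few example page sizes (these numbers are illustrative, not from the cards):

#include <stdio.h>

/* VIPT constraint sketch: if the cache index + block offset stay within the
 * page offset, no address translation is needed to index the cache, so
 *   max L1 size = page size * associativity.
 * Associativity and page sizes below are assumed example values. */
int main(void) {
    unsigned associativity = 8;
    unsigned page_sizes[] = { 4096, 16384, 65536 };
    for (int i = 0; i < 3; i++) {
        unsigned max_l1 = page_sizes[i] * associativity;
        printf("page %6u B -> max VIPT L1 = %7u B (%u KiB)\n",
               page_sizes[i], max_l1, max_l1 / 1024);
    }
    return 0;
}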
Do larger page sizes increase or decrease process startup time?
Increase
Multibanked Caches
Increases bandwidth
Nonblocking caches
Increases bandwidth
(T/F) Memory can be saved by making pages bigger
T (a larger page size shrinks the page table)
A larger page size reduces ____ misses
TLB Misses
Compiler prefetching
Reduces miss penalty or miss rate
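A minimal software-prefetch sketch in C using the GCC/Clang __builtin_prefetch intrinsic; the prefetch distance is an assumed tuning value:

#include <stddef.h>

/* Prefetch a later array element while summing the current one, so the
 * memory latency overlaps with useful work (hiding part of the miss
 * penalty). PREFETCH_AHEAD is an assumed tuning parameter. */
#define PREFETCH_AHEAD 16

long sum_with_prefetch(const long *a, size_t n) {
    long sum = 0;
    for (size_t i = 0; i < n; i++) {
        if (i + PREFETCH_AHEAD < n)
            __builtin_prefetch(&a[i + PREFETCH_AHEAD], 0, 1); /* read, low temporal locality */
        sum += a[i];
    }
    return sum;
}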
Hardware prefetching
Reduces miss penalty or miss rate
Do smaller pages waste storage or save storage?
Save (smaller pages waste less storage to internal fragmentation)
Which type(s) of miss does a larger cache size decrease?
Capacity
Direct-mapped and set-associative caches are susceptible to which misses?
Compulsory, Capacity, Conflict
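A sketch of how conflict misses arise in a direct-mapped cache: two addresses that differ by a multiple of the cache size share an index and evict each other. The 32 KiB cache size is an assumed example value:

#include <stdlib.h>

/* a and b sit exactly ASSUMED_CACHE_SIZE bytes apart, so in a direct-mapped
 * cache of that size they map to the same line; alternating between them
 * causes a conflict miss on nearly every access even though only two
 * blocks are in use. */
#define ASSUMED_CACHE_SIZE (32 * 1024)   /* illustrative, not from the cards */

long thrash(size_t iters) {
    char *buf = malloc(2 * ASSUMED_CACHE_SIZE);
    if (!buf) return -1;
    volatile long *a = (volatile long *)buf;
    volatile long *b = (volatile long *)(buf + ASSUMED_CACHE_SIZE);
    long sum = 0;
    for (size_t it = 0; it < iters; it++) {
        sum += *a;   /* load the line holding a */
        sum += *b;   /* same index: evicts a's line in a direct-mapped cache */
    }
    free(buf);
    return sum;
}

A set-associative cache of the same size could keep both lines resident, which is why higher associativity reduces conflict misses.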
Which type(s) of miss does higher associativity decrease?
Conflict misses
(T/F) Transferring smaller pages to or from secondary storage is more efficient than transferring larger pages
F
Compiler optimizations
Reduces miss rate
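A classic example of such an optimization is loop interchange, sketched below in C with an assumed matrix size; walking a row-major array along its rows exploits spatial locality and lowers the miss rate:

#define N 1024   /* assumed dimension, for illustration only */

/* Column-order traversal of a row-major array: successive accesses are
 * N*sizeof(double) bytes apart, so almost every access misses. */
void sum_column_order(double m[N][N], double *out) {
    double s = 0.0;
    for (int j = 0; j < N; j++)
        for (int i = 0; i < N; i++)
            s += m[i][j];
    *out = s;
}

/* After loop interchange the inner loop walks consecutive addresses, so
 * several accesses share each cache block and the miss rate drops. */
void sum_row_order(double m[N][N], double *out) {
    double s = 0.0;
    for (int i = 0; i < N; i++)
        for (int j = 0; j < N; j++)
            s += m[i][j];
    *out = s;
}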
Multilevel Caches
Reduces miss penalty
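The effect shows up directly in the average memory access time (AMAT); a small worked example in C, where every latency and miss rate is an assumed number in cycles:

#include <stdio.h>

/* AMAT = HitTimeL1 + MissRateL1 * (HitTimeL2 + MissRateL2 * MemPenalty)
 * With an L2, most L1 misses are serviced in a few cycles instead of a
 * full trip to memory, which is the sense in which a multilevel cache
 * reduces the miss penalty. All values below are assumed examples. */
int main(void) {
    double hit_l1 = 1.0, miss_rate_l1 = 0.05;
    double hit_l2 = 10.0, miss_rate_l2 = 0.20;   /* local L2 miss rate */
    double mem_penalty = 100.0;

    double amat_l1_only   = hit_l1 + miss_rate_l1 * mem_penalty;
    double amat_two_level = hit_l1 + miss_rate_l1 * (hit_l2 + miss_rate_l2 * mem_penalty);

    printf("AMAT, L1 only: %.2f cycles\n", amat_l1_only);    /* 6.00 */
    printf("AMAT, L1 + L2: %.2f cycles\n", amat_two_level);  /* 2.50 */
    return 0;
}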
Higher associativity
Reduces miss rate
Larger Cache Size
Reduces miss rate. Longer hit time. Higher cost and power
Larger Block Size
Reduces miss rate, mainly compulsory misses. Increases miss penalty
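The same AMAT arithmetic shows the block-size trade-off; the miss rates and penalties below are assumed example numbers:

#include <stdio.h>

/* Larger blocks exploit spatial locality (lower miss rate, fewer compulsory
 * misses) but take longer to fill, so the miss penalty grows. Whether AMAT
 * improves depends on which effect wins. All values are assumed examples. */
int main(void) {
    double hit = 1.0;
    double mr_32b = 0.040, penalty_32b = 40.0;    /* assumed: 32-byte blocks  */
    double mr_128b = 0.028, penalty_128b = 64.0;  /* assumed: 128-byte blocks */

    printf("AMAT, 32 B blocks:  %.2f cycles\n", hit + mr_32b * penalty_32b);    /* 2.60 */
    printf("AMAT, 128 B blocks: %.2f cycles\n", hit + mr_128b * penalty_128b);  /* 2.79 */
    return 0;
}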
Avoiding Address Translation when indexing the cache
Reduces hit time
Small and simple first-level caches
Reduces hit time. Decreases power consumption
Way Prediction
Reduces hit time. Decreases power consumption
Critical word first
Reduces miss penalty
Giving reads priority over writes
Reduces miss penalty
Merging write buffers
Reduces miss penalty