For current information about me, see my LinkedIn profile: Jim Bennett.
I was a graduate student at Stanford University from 1991 to 1999, when I received my Ph.D. in Computer Science. I was a member of the Stanford Architecture and Arithmetic Group, headed by Michael Flynn.
Faculty advisors: Michael Flynn, Mendel Rosenblum, and Monica Lam.
Latency Tolerant Architectures
Processor cycle times are currently much shorter than memory cycle times, and the trend has been for this gap to widen over time. The problem of increasing memory latency, relative to processor speed, has been addressed by adding high-speed cache memory. However, depending on the miss rate, memory latency can still have a significant performance impact. Since the trend of increasing memory latency is expected to continue, the performance impact will become even more significant with time.
Researchers have proposed a variety of techniques for dealing with memory latency, many of which have been implemented. These techniques fall into the categories of dynamic scheduling, hardware prefetching, software prefetching, and multiple-context support. Various combinations of these latency-tolerance techniques are possible as well. I would like to investigate the performance of these techniques in the context of modern uniprocessor design.
A more detailed description (my thesis proposal) is available.
Performance Factors for Superscalar Processors. Technical report CSL-TR-95-661, Stanford University, Computer Systems Laboratory, February 1995.
Reducing Cache Miss Rates Using Prediction Caches. Technical report CSL-TR-96-707, Stanford University, Computer Systems Laboratory, October 1996.
Prediction Caches for Superscalar Processors, MICRO-30 proceedings, December 1997. Some listeners requested additional information on the performance of prediction caches, so here are some tables of IPC numbers for prediction caches, on a variety of machine models and memory configurations: Pred. Cache Performance.
The owner of this page can be contacted at: email@example.com