Ring prediction for non-uniform cache architectures

Sayaka Akioka, Feihui Li, Mahmut Kandemir, Padma Raghavan, Mary Jane Irwin

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

2 Scopus citations

Abstract

Increasing wire delays and memory capacities motivate new ways of designing L2 and L3 caches. NUCA (Non-Uniform Cache Architecture) has received considerable attention in the last few years. While most prior NUCA-based efforts have focused on data placement, data replacement, and migration-related issues, this paper studies the problem of data search. Specifically, it proposes and experimentally evaluates several data search schemes for NUCA L2 caches that exhibit different performance-power trade-offs. These schemes are based on predicting the next ring (set of banks) to be accessed in a NUCA L2, and checking the banks in that ring first. In this work, we present the details of these prediction schemes, and compare them to two alternate approaches: searching all rings in parallel, and searching rings sequentially, starting with the one that is closest to the CPU.
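
The abstract does not spell out the internals of the prediction schemes, but the basic idea of predicting a ring and probing its banks before the rest can be illustrated with a minimal sketch. The sketch below assumes a simple address-indexed "last ring" predictor with hypothetical parameters (4 rings, a 1024-entry table, 64-byte blocks); the schemes evaluated in the paper may differ, and ring_has_block is only a stand-in for probing a ring's banks.

/*
 * Minimal sketch (not the paper's scheme): an address-indexed "last ring"
 * predictor for a NUCA L2. All parameters below are assumptions.
 */
#include <stdint.h>
#include <stdio.h>

#define NUM_RINGS        4      /* rings of banks, ring 0 closest to the CPU */
#define PRED_TABLE_SIZE  1024   /* predictor entries (assumed)               */
#define BLOCK_SHIFT      6      /* 64-byte cache blocks (assumed)            */

static uint8_t ring_pred[PRED_TABLE_SIZE];  /* last ring seen per index */

static unsigned pred_index(uint64_t addr)
{
    return (unsigned)((addr >> BLOCK_SHIFT) % PRED_TABLE_SIZE);
}

static unsigned predict_ring(uint64_t addr)
{
    return ring_pred[pred_index(addr)];
}

static void update_predictor(uint64_t addr, unsigned ring)
{
    ring_pred[pred_index(addr)] = (uint8_t)ring;
}

/* Stand-in for probing one ring's banks; a real model would check tags. */
static int ring_has_block(unsigned ring, uint64_t addr, unsigned home_ring)
{
    (void)addr;
    return ring == home_ring;
}

/*
 * Ring-prediction lookup: probe the predicted ring first, then fall back to
 * the remaining rings nearest-first. Returns the ring that hit, or -1 on an
 * L2 miss; *probes counts how many rings were searched.
 */
static int nuca_lookup(uint64_t addr, unsigned home_ring, unsigned *probes)
{
    unsigned guess = predict_ring(addr);
    *probes = 1;
    if (ring_has_block(guess, addr, home_ring)) {
        update_predictor(addr, guess);
        return (int)guess;
    }
    for (unsigned r = 0; r < NUM_RINGS; r++) {
        if (r == guess)
            continue;
        (*probes)++;
        if (ring_has_block(r, addr, home_ring)) {
            update_predictor(addr, r);  /* learn where the block really was */
            return (int)r;
        }
    }
    return -1;  /* not found in any ring: L2 miss */
}

int main(void)
{
    unsigned probes;
    /* Two accesses to the same block: the second should need only one probe. */
    nuca_lookup(0x1f40, 2, &probes);
    printf("first access:  %u ring probe(s)\n", probes);
    nuca_lookup(0x1f40, 2, &probes);
    printf("second access: %u ring probe(s)\n", probes);
    return 0;
}

In this sketch a correct prediction touches a single ring (saving the energy of probing all rings in parallel and the latency of a nearest-first sequential search), while a misprediction degrades to the sequential fallback, which is the performance-power trade-off the abstract alludes to.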

Original language: English (US)
Title of host publication: 16th International Conference on Parallel Architecture and Compilation Techniques, PACT 2007
Pages: 401
Number of pages: 1
DOIs
State: Published - 2007
Event: 16th International Conference on Parallel Architecture and Compilation Techniques, PACT 2007 - Brasov, Romania
Duration: Sep 15 2007 - Sep 19 2007

Publication series

Name: Parallel Architectures and Compilation Techniques - Conference Proceedings, PACT
ISSN (Print): 1089-795X

Conference

Conference: 16th International Conference on Parallel Architecture and Compilation Techniques, PACT 2007
Country/Territory: Romania
City: Brasov
Period: 9/15/07 - 9/19/07

All Science Journal Classification (ASJC) codes

  • Software
  • Theoretical Computer Science
  • Hardware and Architecture
