

  1. A Look at Computer Architecture Methodologies
     Mario Badr and Natalie Enright Jerger

  2. Why evaluation methodologies?
     1. Is computer architecture an art or a science?
        • Experimental data
        • Reproducibility
     2. How have evaluated metrics changed over the years?

  3. Scope of the Survey
     • 44 ISCA proceedings, 1973-2017
     • Too many papers to read them all (over 1600)
     • Select papers from each proceeding across topics
       • Bias selection to impactful papers
       • 4-7 papers per proceeding
     • 222 papers total

  4. Paper Topics

     Axis #1                    Description
     Single Core                A conventional general-purpose processor with one core
     Multiple Cores             More than one conventional processor
     Specialized Architecture   An unconventional processor (e.g., accelerator, GPU)

     Axis #2                    Description or Examples
     Microarchitecture          e.g., branch prediction, simultaneous multithreading
     Memory                     e.g., cache replacement, phase-change memory, cache coherence, memory consistency
     Networks                   e.g., bus, crossbar, network-on-chip, network interface
     Organization               The overall design of multiple components
     Coordination               The management of multiple components to achieve a goal

  5. Surveyed Papers Along Both Axes
     [Bar chart: paper count (0-140) per Axis #2 topic (Memory, Microarchitecture, Networks, Organization, Coordination), broken down by Axis #1 (Single Core, Multiple Cores, Specialized Architecture)]

  6. Types of Evaluations
     • None
     • Qualitative
       • Theoretical
     • Quantitative
       • Experimental data

  7. We Focus on Quantitative Evaluations
     • Quantitative evaluations produce experimental data from:
       • Analytical Model
       • Prototype
       • Simulation
         • Architectural
         • Circuit-level
       • Other

  8. The 1970s – 27 papers
     • Quantitative evaluations: 40%
     • Evaluated metrics
       • Performance
       • Proxies for area
     • Analytical models
       • e.g., assume ideal parallelism
       • e.g., performance projections
     [Bar chart: paper count (0-8) per tool (Analytical Model, Architectural Simulation, Other), broken down by topic (Memory, Microarchitecture, Networks, Organization)]
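The "assume ideal parallelism" style of analytical model from this era can be illustrated with a short sketch. The `ideal_speedup` function below is a hypothetical example in the Amdahl's-law form, not a model taken from any surveyed paper:

```python
def ideal_speedup(parallel_fraction: float, processors: int) -> float:
    """Projected speedup assuming the parallel fraction of a program
    scales ideally across all processors (no communication overhead)."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / processors)

# A program that is 90% parallelizable, projected onto 10 processors:
# 1 / (0.1 + 0.9/10) = ~5.26x, far below the ideal 10x.
print(round(ideal_speedup(0.9, 10), 2))  # → 5.26
```

Such paper-and-pencil projections were cheap to produce, which helps explain their prevalence before detailed simulators became widely available.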

  9. The 1980s – 46 papers
     • Quantitative evaluations: 60%
     • Reduced costs of memory and CPU
     • Single-core processors
     • Prototyping
     • Trace-driven simulation
     [Bar chart: paper count (0-18) per tool (Analytical Model, Architectural Simulation, Prototyping), broken down by topic (Memory, Microarchitecture, Networks, Organization, Coordination)]
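Trace-driven simulation, noted above, replays a recorded stream of memory addresses through a cache model instead of executing the program. A minimal sketch, assuming a direct-mapped cache; the function name and parameters are illustrative, not drawn from any specific 1980s simulator:

```python
def simulate_cache(trace, num_sets=64, block_bytes=32):
    """Trace-driven simulation of a direct-mapped cache:
    replay recorded addresses and count hits and misses."""
    tags = [None] * num_sets          # one tag per set (direct-mapped)
    hits = misses = 0
    for addr in trace:
        block = addr // block_bytes   # cache block the address falls in
        index = block % num_sets      # which set the block maps to
        tag = block // num_sets       # remaining high-order bits
        if tags[index] == tag:
            hits += 1
        else:
            misses += 1
            tags[index] = tag         # fill the set on a miss
    return hits, misses

# The second 0x1000 access misses: 0x2000 evicted it from set 0.
trace = [0x1000, 0x1004, 0x1008, 0x2000, 0x1000]
print(simulate_cache(trace))  # → (2, 3)
```

Because the trace is captured once and replayed many times, the same workload could be run against many candidate cache configurations cheaply.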

  10. The 1990s – 47 papers
     • Quantitative evaluations: 85%
     • Introduction of many simulators
       • SimpleScalar
     • Introduction of CACTI
       • Catches on in the next decade
     • Power/energy is considered
     [Bar chart: paper count (0-35) per tool (Analytical Model, Architectural Simulation, Prototyping), broken down by topic (Memory, Microarchitecture, Networks, Organization)]

  11. A Brief Interlude: Evaluated Metrics
     [Two bar charts: percentage of papers (0-100%) evaluating Performance, Power, Energy, and Area, for 1973-1995 and for 1996-2017]

  12. The 2000s – 50 papers
     • Quantitative evaluations: 98%
     • Models for power, energy, thermal
       • Wattch, HotSpot, Orion, McPAT
       • CACTI gains popularity
     • More simulator options
       • Pin, Simics
     • Tools to reduce simulation time
       • SimPoint, PinPoint, SMARTS
     [Bar chart: paper count (0-50) per tool (Analytical Model, Architectural Simulation, Prototyping), broken down by topic (Memory, Microarchitecture, Networks, Organization, Coordination)]
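The sampling idea behind simulation-time reducers like SMARTS can be conveyed with a toy sketch: simulate only short periodic samples of the instruction stream in detail and extrapolate the metric from them. This is an illustrative simplification, not the actual SMARTS methodology, and all names below are invented:

```python
import random

def sampled_mean_cpi(per_instruction_cpi, sample_len=100, period=1000):
    """Estimate mean CPI by detailed-simulating only short, periodic
    samples of the instruction stream (systematic sampling)."""
    samples = []
    for start in range(0, len(per_instruction_cpi), period):
        samples.extend(per_instruction_cpi[start:start + sample_len])
    return sum(samples) / len(samples)

# Synthetic workload: only ~10% of instructions are "simulated" in
# detail, yet the estimate tracks the true mean closely.
random.seed(0)
workload = [random.uniform(0.5, 2.0) for _ in range(100_000)]
true_mean = sum(workload) / len(workload)
estimate = sampled_mean_cpi(workload)
print(abs(true_mean - estimate) < 0.05)
```

The real tools are more sophisticated (SimPoint, for example, picks representative program phases rather than uniform samples), but the payoff is the same: near-full-run accuracy at a fraction of the simulation cost.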

  13. The 2010s – 52 papers
     • Models and prototypes used more
     • More tools
       • Raised levels of abstraction
       • Design space exploration
     [Bar chart: paper count (0-50) per tool (Analytical Model, Architectural Simulation, Prototyping), broken down by topic (Memory, Microarchitecture, Networks, Organization, Coordination)]
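Design space exploration at a raised level of abstraction often amounts to sweeping configuration parameters through a fast analytical model. The sketch below is entirely hypothetical: both the cost model and the parameter ranges are invented for illustration.

```python
from itertools import product

def toy_cost(cache_kb, ways):
    """Hypothetical analytical model: larger, more associative caches
    miss less but cost more energy per access."""
    miss_rate = 0.10 / (cache_kb * ways) ** 0.5
    energy = 0.5 + 0.01 * cache_kb + 0.05 * ways
    return miss_rate * 100 + energy   # weighted objective (lower is better)

# Exhaustively sweep the design space and pick the best configuration.
space = product([16, 32, 64, 128], [1, 2, 4, 8])
best = min(space, key=lambda cfg: toy_cost(*cfg))
print(best)  # → (32, 8)
```

Because the model evaluates in microseconds rather than simulator-hours, the whole space can be swept exhaustively; detailed simulation is then reserved for the few configurations the sweep identifies as promising.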

  14. Summarizing Tool Use – 1973-2017
     [Bar chart: paper count (0-50) per decade (1970s-2010s) for Analytical Model, Architectural Simulation, and Prototyping]

  15. Computer Architecture: Art or Science?
     • Strong push toward quantitative evaluations
     • Designs are evaluated with more metrics
     • Many tools developed to generate data
     • Reproducibility?

  16. The Increasingly Complex "Methodology"
     • Methodology sections become prominent in the mid-to-late 90s
     • Methodologies grow very complex
       • More tools are used
     • Page real estate
       • Less is used for methodology
       • More is used for experimental data
     • Methodologies do not provide enough information

  17. Conclusion: Towards a Scientific Method
     Architects:
     • Better methodology sections
       • Caution against limitations
       • Relevant experimental data
     • Release your evaluation
       • Docker, GitHub, other technologies
     Tool Developers:
     • Output 'artifacts' that
       • Can be redistributed
       • Can be re-used as inputs
       • Can be analyzed

  18. Our Data is Open Source
     https://github.com/mariobadr/survey-wp3
     License: Apache 2.0
     Mario Badr and Natalie Enright Jerger
