  1. IT STRATEGY BOARD September 27, 2016

  2. AGENDA > Call to Order — Welcome and introductions — Proposed IT Strategy Board 2016-2017 agenda > Research High Performance Computing Strategy > Major Projects Update — UW Medicine EPIC Migration — HR/Payroll Modernization — Integrated Service Center Update > IT Project Portfolio Executive Review > Wrap up

  3. Proposed Strategy Board Agenda 2016-2017
     September 27, 2016: Proposed Strategy Board Meeting Agenda; Research High Performance Computing Strategy; Major Projects Update (EPIC Migration, HR/Payroll Modernization, Integrated Service Center Update); Administrative Systems Modernization Strategy
     February 6, 2017: HR/P Modernization Deep Dive; Finance Business Transformation; Enterprise Data Warehouse; Research Data Science Strategy; eScience and UW-IT Partnership; Sean Mooney, UW Med, Data Analysis; HR/P Modernization Update
     May 8, 2017: Student Analytics

  4. Major Projects Update

  5. Research Strategy: Kelli Trosvig, Vice President, UW-IT and Chief Information Officer; Martin Savage, Chair, Hyak Governance Board and Professor, Physics

  6. High Performance Computing and Data Ecosystem for the UW Community. Martin Savage – Chair, Hyak Governance Board. Educating the next generation of leaders: UW-IT's Hyak and Lolo

  7. Seattle – A Computing and Data Capital > Having chosen Seattle and Washington State, our UW community deserves competitive simulation, scientific computing, and data research capabilities and educational opportunities, and ideally considerably more > A viable and significant growth strategy is required for UW-IT's research and education High Performance Computing and Data Ecosystem (HPCDE)

  8. World is Entering the Exascale Era (Wired, June 21, 2016) > 125 Petaflops > 10.6 million compute cores > 1.3 PB RAM > 5x any US machine
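To put those headline figures in perspective, here is a back-of-the-envelope calculation in Python using only the numbers quoted above (illustrative arithmetic, not from the slide):

    # Per-core figures and the gap to exascale, from the slide's numbers.
    peak_flops = 125e15   # 125 petaflops peak
    cores = 10.6e6        # 10.6 million compute cores
    ram_bytes = 1.3e15    # 1.3 PB of RAM

    print(f"{peak_flops / cores / 1e9:.1f} GFLOPS per core")   # ~11.8
    print(f"{ram_bytes / cores / 1e6:.0f} MB RAM per core")    # ~123
    print(f"{1e18 / peak_flops:.0f}x short of exascale")       # 8x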

  9. US is Designing an Exascale Ecosystem

  10. Scientific Computing and Data: Addressing present and future challenges

  11. Exascale Will Have Broad Impacts
     [Flyer: The Innovators Lecture Series – "Supercomputing, the Cancer Moonshot and Beyond: Designing the next generation of supercomputers for biomedical research." Guest lecture by Dr. Dimitri Kusnezov, Chief Scientist, U.S. Department of Energy National Nuclear Security Administration and advisor to the Secretary of Energy. How can the next generation of supercomputers unlock biomedical mysteries that will shape the future practice of medicine? Scientists behind the National Strategic Computing Initiative, a federal strategy for investing in high-performance computing, are exploring this question. The lecture explores how exascale computing (systems capable of a billion billion calculations per second) can open new doors in biomedical research, and how supercomputing could support federal research, technology, and policy initiatives, including cancer research and the quest to revolutionize understanding of the human brain. September 20, 2016, 4:00 p.m., Allen Institute, 615 Westlake Ave. North, Seattle, WA. Hosted reception to follow. Register at innovators.wsu.edu; reservations required by September 15.]

  12. UW is Deploying a Petascale Ecosystem > 2008: Physics/Astro Athena cluster, ~10 users, a few Tflops > Phase 1: 20x growth in compute capability between 2010 and 2016 (185 Tflops) — kept pace with the #500 position — mainly CPU, but with an IBM GPU testbed > NSF MRI: Intel Xeon Phi components to be added, ~$800K > 2016: Delivers >7 million core-hrs per month to the UW community > Phase 2: >10 Pflops in 2020, with accompanying data and communication infrastructure
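The growth rates implied by these two phases follow from simple arithmetic (a sketch based only on the capacities stated above):

    # Implied annual growth rates from the slide's capacity figures.
    phase1 = 20 ** (1 / 6)              # 20x over 2010-2016 -> ~1.65x/year
    phase2 = (10_000 / 185) ** (1 / 4)  # 185 Tflops -> 10 Pflops over 2016-2020

    print(f"Phase 1: ~{phase1:.2f}x per year")  # ~1.65x, tracking the Top500 pace
    print(f"Phase 2: ~{phase2:.2f}x per year")  # ~2.71x, a much steeper ramp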

  13. UW HPCDE: Hyak + Lolo
     2016 Hyak support: Stephen Fralich, Pramad Gupta, Chance Reschke (FTE fractions: 33%, 100%, 50%); 2017: … + 50% FTE
     > UW HPC expertise imparted to faculty and students alike, taking them from laptops or iPads to the modern computing tools used to solve large-scale, complex problems across disciplines
     > Prepares researchers to compete nationally in teams for DOE and NSF leadership-class computing awards (INCITE, ALCC, Exascale Computing Project)

  14. Hyak Governance Board http://www.int.washington.edu/users/mjs5/HYAK/governance/

  15. HPC Club – the Student Organization http://students.washington.edu/hpcc/

  16. The Hyak HPCDE and its Role > SoS – The Speed of Science – No proposal writing and hoping for resources – Immediate access to resources to try out new ideas, algorithms, etc. > P4E – Preparing for Exascale – Trying out new algorithms and codes on exascale-type architectures > BDP – Big Data Pipelines – Rapid access to large data sets and fast distribution – Local experiments, sensors, real-time processing

  17. The Origins of Our Present HPCDE > Conversation with Researchers in 2009 > IT and Data Management Expertise > Data Management Infrastructure > Storage Infrastructure > Computing Power > Communication and Collaboration > Computational Leaders Meeting – 2007/08 — Mary Lidstrom/OR > Kaplan Letter – 2007 > Chance Reschke is hired to deploy Hyak > Rob Fatland is hired to facilitate research use of cloud computing (2015)

  18. What was Accomplished – A success story for Hyak and Lolo > 2009 – 30 full racks > 2010 – Hyak drives consolidation: 10 racks → 1 Hyak rack > 2013 – Biochem funds a Hyak expansion (+1 Hyak rack) > Hardware footprint dropped from 30 racks to 12 racks > Dozens of research results published

  19. What was Accomplished – UWB STEM Hyak % Utilization
     In 2014, UW Bothell STEM faculty demand for access to on-site HPC became critical. The initial response was to build a new data center costing as much as $20M. Because the STEM dean had the option of investing in Hyak instead, the $20M went to start a new ME program, UWB STEM spent $50k of central funds, and world-class research results were being published within a year.
     Hyak is an ELASTIC platform: in periods of low use, others borrow from UWB STEM; in periods of high use, others return the favor. UWB STEM has averaged 80% utilization over their two years using Hyak.
     [Chart: UWB STEM Hyak % utilization by month over 23 months, 0% to 300%, with the 100% level marked]
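The months above 100% in that chart are elasticity at work: a group bursts past its own allocation by borrowing idle nodes, then lends capacity back later. A minimal sketch of that accounting, with made-up monthly numbers (the real utilization series is not reproduced in this deck):

    # Utilization relative to a group's OWN allocation (hypothetical data).
    # Values above 100% mean borrowing idle capacity from other Hyak owners.
    allocation = 100_000                      # core-hrs owned per month
    used = [40_000, 90_000, 250_000, 60_000]  # core-hrs actually consumed

    for month, u in enumerate(used, start=1):
        pct = u / allocation
        note = "borrowing" if pct > 1.0 else "lending spare capacity"
        print(f"month {month}: {pct:.0%} ({note})")
    print(f"average: {sum(used) / len(used) / allocation:.0%}")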

  20. What was Accomplished – Data from early 2015

  21. What Has Not Yet Been Accomplished > Large colo deployments outside Hyak see low utilization and fast growth, threatening a repeat of the past > Bringing these groups into the Hyak fold is essential; some are already in hand > Others remain challenging
     [Chart: Provisioned Hyak equivalents, 0-25 scale, for IHME, HHMI, Biostat, Statistics, Chemistry]

  22. Hyak Use
     Campuses: 2
     Colleges: 6
     Departments: 89
     Users: 939
     Core Hrs: 236,617,278
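Those totals imply a substantial per-user footprint; the slide does not state the accumulation period, but the figures cross-check roughly against the >7M core-hrs/month rate from slide 12 (illustrative arithmetic only):

    # Averages implied by the slide's totals.
    total_core_hrs = 236_617_278
    users = 939
    print(f"~{total_core_hrs / users:,.0f} core-hrs per user")         # ~252,000
    print(f"~{total_core_hrs / 7e6:.0f} months at 7M core-hrs/month")  # ~34 months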

  23. HGB Evaluates, Decides, and RECOMMENDS to the IT Strategy Board > HGB evaluated the Hyak HPCDE — Unanimously declared it a great success and vital to UW research — Unanimously voted to embark on Next-Gen Hyak (Phase 2) — Unanimously voted not to terminate UW's HPCDE

  24. Hyak System Support is Mission Critical … and is one-deep and overburdened, putting the service at great risk
     [Chart: Cores per FTE at US HPC centers – Hyak: 4,696; Extra Large: 2,800; Medium: 2,100; Small: 867]
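The staffing gap behind that claim is easy to quantify (a sketch using the chart values as recovered above):

    # Support load relative to peer HPC centers.
    cores_per_fte = {"Hyak": 4696, "Extra Large": 2800, "Medium": 2100, "Small": 867}
    hyak = cores_per_fte["Hyak"]
    for center, ratio in cores_per_fte.items():
        if center != "Hyak":
            print(f"Each Hyak FTE carries {hyak / ratio:.1f}x the load of a {center}-center FTE")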

  25. Budget Breakdown – Pun Intended

  26. Summary > 2009-2016: Successfully deployed UW's HPCDE to address UW needs identified in 2006-2008 – Hyak HPCDE is embedded in UW-IT infrastructure, delivering >7M core-hrs/month – Saves UW significant money through economies of scale in power and people > e.g., 20 racks at 20% usage → 4 racks at 100% usage (see the sketch below) – Brings in grant money and diversifies UW research capabilities – Attracts and retains scientists at all career stages > The current central financial contribution is small and dropping – Cost is carried by PIs (grants), students (STF), colleges, departments, and UW-IT
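The consolidation example is simple to make concrete (a sketch; the 20%/100% utilization figures come from the slide, while the power-per-rack number is a hypothetical placeholder):

    # Consolidation arithmetic from the slide's example.
    scattered_racks, scattered_util = 20, 0.20  # dispersed colo deployments
    work = scattered_racks * scattered_util     # rack-equivalents of real work
    consolidated_racks = work / 1.00            # shared Hyak racks run at ~100%
    print(f"{scattered_racks} racks @ 20% -> {consolidated_racks:.0f} racks @ 100%")

    kw_per_rack = 15  # hypothetical placeholder, NOT from the slide
    print(f"Power: {scattered_racks * kw_per_rack} kW -> {consolidated_racks * kw_per_rack:.0f} kW")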

  27. Summary > 2016-2020: Next-generation Hyak being deployed – UW-IT is adding more staff > Mitigates system support risk – Additional staff needed > Support, educate, troubleshoot, optimize > The HPCDE is important to education and research, and needs a sustainable funding stream to keep UW competitive > Discussion

  28. QUESTIONS
