RES updates, resources and Access European HPC ecosystem - Sergi Girona (PowerPoint PPT Presentation)


SLIDE 1

RES updates, resources and Access European HPC ecosystem

Sergi Girona, RES Coordinator

SLIDE 2

RES: HPC Services for Spain

  • The RES was created in 2006.
  • It is coordinated by the Barcelona Supercomputing Center (BSC-CNS).
  • It forms part of the Spanish “Map of Unique Scientific and Technical Infrastructures” (ICTS).

SLIDE 3

RES: HPC Services for Spain

RES is made up of 12 institutions and 13 supercomputers.

[Bar chart: peak performance of the RES supercomputers in TFlop/s, logarithmic scale (1-1000): UAM Cibeles, SCAYLE Caléndula, UZ CaesarAugusta, IAC La Palma, UMA Picasso, CénitS Lusitania & SandyBridge, UPM Magerit, UC Altamira, UV Tirant, BSC MinoTauro, CSUC Pirineus & Canigo, CESGA FinisTerrae, BSC MareNostrum]

SLIDE 4

RES: HPC Services for Spain

  • Objective: coordinate and manage high-performance computing services to promote the progress of excellent science and innovation in Spain.

  • It offers HPC services for non-profit, open R&D purposes.
  • Since 2006, it has granted more than 1,000 million CPU hours to 2,473 research activities.
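For a sense of scale, the average grant per activity follows directly from these two figures; a back-of-envelope computation, not an official RES statistic:

```python
total_cpu_hours = 1_000_000_000  # "more than 1,000 million CPU hours" since 2006
activities = 2_473               # research activities granted time

avg = total_cpu_hours / activities
print(f"~{avg:,.0f} CPU hours per activity on average")
```

So a typical activity has received on the order of 400,000 CPU hours.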

Research areas

[Pie chart, hours granted per area: Astronomy, space and earth sciences; Mathematics, physics and engineering; Chemistry and materials sciences; Life and health sciences (shares of 23%, 19%, 28% and 30%; area codes AECT, BCV, FI, QCM)]

SLIDE 5

  • BSC (MareNostrum 4): 165,888 cores, 11,400 Tflops. Main processors: Intel Xeon Platinum 8160. Memory: 390 TB. Disk: 14 PB.
  • UPM (Magerit II): 3,920 cores, 103 Tflops. Main processors: IBM Power7 3.3 GHz. Memory: 7,840 GB. Disk: 1,728 TB.
  • UMA (Picasso): 4,016 cores, 84 Tflops. Main processors: Intel SandyBridge-EP E5-2670. Memory: 22,400 GB. Disk: 720 TB.
  • UV (Tirant 3): 5,376 cores, 111.8 Tflops. Main processors: Intel SandyBridge-EP E5-2670. Memory: 10,752 GB. Disk: 14 + 10 TB.
  • CSUC (Pirineus): 2,784 cores, 283.66 Tflops. Main processors: Intel Xeon Platinum 8160. Memory: 12,000 GB. Disk: 200 TB.
  • CSUC (Canigo): 384 cores, 33.2 Tflops. Main processors: Intel Xeon Platinum 8160. Memory: 9,000 GB. Disk: 200 TB.

RES supercomputers

SLIDE 6

  • CénitS (Lusitania 2): 800 cores, 33.2 Tflops. Main processors: Intel Xeon E5-2660v3, 2.6 GHz. Memory: 10 GB. Disk: 328 TB.
  • CénitS (SandyBridge): 2,688 cores, 56 Tflops. Main processors: Intel SandyBridge Xeon. Memory: 5,376 GB. Disk: 328 TB.
  • BSC (MinoTauro): 624 cores, 251 Tflops. Main processors: 39 x 2 Intel Xeon E5-2630 v3. Memory: 20 TB. Disk: 14 PB (shared with MN4).
  • CESGA (FinisTerrae 2): 7,712 cores, 328.3 Tflops. Main processor: Intel Xeon E5-2680v3. Memory: 40 TB. Disk: 960 TB.
  • UC (Altamira 2+): 5,120 cores, 105 Tflops. Main processor: Intel SandyBridge. Memory: 15.4 TB. Disk: 2 PB.
  • UZ (CaesarAugusta): 2,014 cores, 80.5 Tflops. Main processor: Intel E5-2680v3, 2.5 GHz. Memory: 5,400 GB. Disk: 219 TB.

RES supercomputers

SLIDE 7

  • SCAYLE (Caléndula): 2,432 cores, 50.6 Tflops. Main processor: Intel SandyBridge Xeon. Memory: 4,864 GB. Disk: 600 TB.
  • UAM (Cibeles): 368 cores, 14.1 Tflops. Main processor: Intel Xeon E5-2630 v3, 2.40 GHz. Memory: 896 GB. Disk: 80 TB.
  • UAM (SandyBridge, coming soon): 2,688 cores, 56 Tflops. Main processor: Intel SandyBridge Xeon, 2.60 GHz. Memory: 5,376 GB. Disk: 80 TB.
  • IAC (LaPalma): 4,032 cores, 83.85 Tflops. Main processor: Intel SandyBridge. Memory: 8,064 GB. Disk: 60 TB.

RES supercomputers

SLIDE 8

Resources granted: CPU hours

[Bar chart: requested hours vs. awarded hours (A+B) per year, 2006-2017, in thousands of hours (y-axis: 50,000-400,000)]

140 million hours available

SLIDE 9

How to apply?

  • RES resources are open to researchers and spin-offs:
  • Computing resources: CPU hours and storage
  • Technical support: application analysis, porting of applications, search for the best algorithm… to improve performance and ensure the most effective use of HPC resources.
  • Free of cost at the point of usage
  • Three open competitive calls per year.

Period | Deadline for applications | Starting date
-------|---------------------------|--------------
P1     | January                   | 1st March
P2     | May                       | 1st July
P3     | September                 | 1st November

Next deadline: January 2019
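As a quick sanity check on the call calendar, a sketch that maps a date to the next open call; the month-to-period mapping is our reading of the table, and real deadlines fall on specific days within those months:

```python
from datetime import date

# Application deadlines from the call calendar: January (P1), May (P2), September (P3)
DEADLINES = [(1, "P1"), (5, "P2"), (9, "P3")]

def next_call(today: date) -> str:
    """Return the next call period whose deadline month has not yet passed."""
    for month, period in DEADLINES:
        if today.month <= month:
            return period
    return "P1"  # after September, the next deadline is January of next year

print(next_call(date(2018, 10, 15)))  # past the P3 deadline, so the next call is P1
```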

SLIDE 10

How to apply?

  • Researchers present a proposal which includes the research project description, technical requirements and research group experience.

  • Accepted proposals have access to RES supercomputers for 4 months.
  • Granted time can be hours with priority (hours A) or hours without priority (hours B).

RES intranet: https://www.bsc.es/res-intranet

SLIDE 11

Proposal evaluation

Submit → Formal evaluation → Technical evaluation → Scientific experts panel → Access committee → Final report of accepted activities

SLIDE 12

Activity length

  • Accepted proposals have access to RES supercomputers for 4 months. If your activity needs more time to be properly developed, you can ask for a continuation activity:

New activity (4 months) → report dissemination information → Continuation activity (4 months) → report dissemination information

Continuation activities:
  • The application form is simplified
  • They are preferably allocated to the same machine
  • In the evaluation, one reviewer is kept from the previous activity and the second reviewer changes

SLIDE 13

RES Users’ Committee

  • CURES aims to provide advice and feedback to RES coordinators:
  • It promotes optimal use of high-performance computing facilities
  • It shares information about users’ experiences
  • It voices user concerns
  • You can contact CURES through the RES intranet: https://www.bsc.es/res-intranet
SLIDE 14

Which is/could be the main impediment to applying for RES resources?

  • I don't know how to write a strong application
  • I'm not sure if I can apply
  • Lack of HPC expertise in my research group
  • Too much paperwork
  • Not enough resources for my project

SLIDE 15

Tips to write a strong proposal

  • Project description section: highlight the importance of your project, not only from the scientific point of view but also in terms of its return to society.

→ Why does your project deserve the resources requested?

  • Read carefully all the protocols, guides and FAQs at: https://www.res.es/en/access-to-res

SLIDE 16

Tips to write a strong proposal

  • Activity description section: specify clearly why you need supercomputing resources. Describe the flowchart of the simulations as accurately as possible. Indicate that your group has the human resources to run the simulations you propose and to process their output.

→ Why do you need to carry out the simulations on the selected machine?
→ Is the amount of computing resources requested adjusted to your needs and properly justified?

  • Doubts about software or HPC resources? Ask the support team!

→ Are your jobs adequate for parallel computing? support@bsc.es
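One way to gauge the parallel-suitability question before emailing support is a quick Amdahl's-law estimate; a minimal sketch, where the 5% serial fraction is a hypothetical figure you would replace with a measurement of your own code:

```python
def amdahl_speedup(serial_fraction: float, n_cores: int) -> float:
    """Upper bound on parallel speedup when `serial_fraction` of the
    runtime cannot be parallelized (Amdahl's law)."""
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / n_cores)

# Even a small serial fraction caps the benefit of adding cores:
for cores in (16, 256, 4096):
    print(f"{cores:>5} cores -> at most {amdahl_speedup(0.05, cores):.1f}x speedup")
```

A code that is 5% serial can never run more than 20x faster, no matter how many cores it is granted, which is exactly the kind of justification reviewers look for in a resource request.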

SLIDE 17

Which is/could be the main impediment to applying for RES resources?

  • I don't know how to write a strong application
  • I'm not sure if I can apply
  • Lack of HPC expertise in my research group
  • Too much paperwork
  • Not enough resources for my project

SLIDE 18

Who can apply?

  • RES resources are aimed at open R+D+I activities:
  • Researchers from academia and public R&D institutions
  • Spin-offs during their first 3 years after creation
  • Collaboration projects between private companies and research groups from academia or public institutions
  • Open to international applicants, but we recommend collaborating with researchers from Spanish institutions

SLIDE 19

Which is/could be the main impediment to applying for RES resources?

  • I don't know how to write a strong application
  • I'm not sure if I can apply
  • Lack of HPC expertise in my research group
  • Too much paperwork
  • Not enough resources for my project

SLIDE 20

RES events: technical training

These workshops are organized by the RES nodes and aim to provide the knowledge and skills needed to use and manage the supercomputing facilities.

  • Check the agenda in RES website:

https://www.res.es/en/events?event_type=technical_training

  • PATC courses in BSC (PRACE Advanced Training Center):

https://www.bsc.es/education/training/patc-courses

SLIDE 21

RES events: networking opportunities

Scientific seminars

The RES promotes scientific seminars which address supercomputing technology applications in specific scientific areas. These events are mainly organized by RES users and are open to the entire research community.

In 2017:
  • 5 scientific seminars
  • More than 300 attendees

Agenda 2018: www.res.es/en/events

Upcoming: “Next Generation Sequencing and Supercomputing: life as a couple”, CBMSO-UAM (Madrid), Sep 27

SLIDE 22

RES events: networking opportunities

The agenda includes:

  • Information about RES and the European HPC ecosystem
  • Plenary session: Research Open Data
  • Parallel scientific sessions
  • Poster session
  • Networking opportunities
  • Evening social event

RES Users’ Meeting: 20 September 2018 - Valencia

www.res.es/users-conference-2018

SLIDE 23

  • Mobility grants for researchers using HPC resources
  • Short stays to visit scientific hosts (3 weeks to 3 months)
  • Funds for travel and living allowance
  • Access to European HPC facilities

http://www.hpc-europa.eu/
Next deadline: 20 September
Funded by the EC: 2017-2021

SLIDE 24

Which is/could be the main impediment to applying for RES resources?

  • I don't know how to write a strong application
  • I'm not sure if I can apply
  • Lack of HPC expertise in my research group
  • Too much paperwork
  • Not enough resources for my project

SLIDE 25

RES forms

In the RES we try to keep the administrative procedures short and simple for researchers:

  • New activity application form: 10 pages on average
  • Continuation activity application form: simplified
  • Dissemination form: 3 pages on average
    • Brief description of results (1-2 paragraphs)
    • Publications
    • Figures / pictures
    • Optional: patents, PhD students…
  • Intermediate reports: 1-2 sentences (“Everything is ok”)
  • Resubmission of non-accepted activities: one click

SLIDE 26

Which is/could be the main impediment to applying for RES resources?

  • I don't know how to write a strong application
  • I'm not sure if I can apply
  • Lack of HPC expertise in my research group
  • Too much paperwork
  • Not enough resources for my project

SLIDE 27

PRACE HPC Access

  • Call for Proposals for Project Access:
    • 12-, 24- or 36-month projects
    • Minimum request: 30 million core hours
  • Call for Proposals for PRACE Preparatory Access:
    • 2- to 12-month projects

http://www.prace-ri.eu/

Next deadline: 30 October

SLIDE 28

EuroHPC

SLIDE 29

Distributed Supercomputing Infrastructure

  • 24 members, including 5 Hosting Members (Switzerland, France, Germany, Italy and Spain)
  • 524 scientific projects enabled
  • 70 PFlop/s of peak performance on 7 world-class systems: MareNostrum, Marconi, Piz Daint, Curie, JUQUEEN, Hazel Hen and SuperMUC
  • More than 10,000 people trained by 6 PRACE Advanced Training Centers and other events

Access: prace-ri.eu/hpc_acces

SLIDE 30

Top500 (June 2018), European ranking

EU rank | World rank | Name | Computer | Site | Country | Cores (accel.) | Rmax [PFlop/s] | Rpeak [PFlop/s]
--------|------------|------|----------|------|---------|----------------|----------------|----------------
1 | 6 | Piz Daint | Cray XC50, Xeon E5-2690v3 12C 2.6GHz, Aries interconnect, NVIDIA Tesla P100 | CSCS | Switzerland | 361,760 (297,920) | 19.59 | 25.33
2 | 13 | HPC4 | ProLiant DL380 Gen10, Xeon Platinum 8160 24C 2.1GHz, Mellanox InfiniBand EDR, NVIDIA Tesla P100 | ENI | Italy | 253,600 (177,520) | 12.21 | 18.62
3 | 14 | Tera-1000-2 | Bull Sequana X1000, Intel Xeon Phi 7250 68C 1.4GHz, Bull BXI 1.2 | CEA | France | 561,408 | 11.97 | 23.40
4 | 18 | Marconi Intel Xeon Phi | CINECA Cluster, Lenovo SD530/S720AP, Intel Xeon Phi 7250 68C 1.4GHz/Platinum 8160, Intel Omni-Path | CINECA | Italy | 312,936 | 8.41 | 16.21
5 | 20 | | Cray XC40, Xeon E5-2695v4 18C 2.1GHz, Aries interconnect | UKMET | United Kingdom | 241,920 | 7.04 | 8.13
6 | 22 | MareNostrum | Lenovo SD530, Xeon Platinum 8160 24C 2.1GHz, Intel Omni-Path | BSC | Spain | 153,216 | 6.47 | 10.30
7 | 23 | JUWELS Module 1 | Bull Sequana X1000, Xeon Platinum 8168 24C 2.7GHz, Mellanox EDR InfiniBand/ParTec ParaStation ClusterSuite | Juelich | Germany | 114,480 | 6.18 | 9.89
8 | 27 | Hazel Hen | Cray XC40, Xeon E5-2680v3 12C 2.5GHz, Aries interconnect | HLRS | Germany | 185,088 | 5.64 | 7.40
9 | 28 | COBRA | Intel Compute Module HNS2600BP, Xeon Gold 6148 20C 2.4GHz, Intel Omni-Path | Max-Planck-Gesellschaft MPI/IPP | Germany | 127,520 | 5.61 | 9.79
10 | 30 | Pangea | SGI ICE X, Xeon E5-2670/E5-2680v3 12C 2.5GHz, Infiniband FDR | Total | France | 220,800 | 5.28 | 6.71
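The Rmax/Rpeak ratio in the ranking is each system's HPL (LINPACK) efficiency, i.e. how much of the theoretical peak the benchmark actually sustains; a small sketch computing it for three of the listed machines:

```python
# (name, Rmax, Rpeak) in PFlop/s, copied from the Top500 rows above
systems = [
    ("Piz Daint",   19.59, 25.33),
    ("MareNostrum",  6.47, 10.30),
    ("Hazel Hen",    5.64,  7.40),
]

for name, rmax, rpeak in systems:
    print(f"{name}: {rmax / rpeak:.0%} of peak")
```

Accelerator-heavy systems like Piz Daint still sustain over three quarters of their peak on HPL.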

SLIDE 31

Top10 industrial machines (June 2018)

World rank | Name | Computer | Site | Country | Cores (acc.) | Rmax [PFlop/s] | Rpeak [PFlop/s]
-----------|------|----------|------|---------|--------------|----------------|----------------
13 | HPC4 | ProLiant DL380 Gen10, Xeon Platinum 8160 24C 2.1GHz, Mellanox InfiniBand EDR, NVIDIA Tesla P100 | Eni S.p.A. | Italy | 253,600 (177,520) | 12.21 | 18.62
30 | Pangea | SGI ICE X, Xeon E5-2670/E5-2680v3 12C 2.5GHz, Infiniband FDR | Total Exploration Production | France | 220,800 | 5.28 | 6.71
35 | Abel | Cray XC30, Xeon E5-2698v3 16C 2.3GHz, Aries interconnect | Petroleum Geo-Services | United States | 145,920 | 4.04 | 5.37
45 | | NVIDIA DGX-1/Relion 2904GT, Xeon E5-2698v4 20C 2.2GHz/E5-2650v4, Infiniband EDR, NVIDIA Tesla P100/Quadro GP100 | Facebook | United States | 60,512 (55,552) | 3.31 | 4.90
46 | DGX Saturn V | NVIDIA DGX-1, Xeon E5-2698v4 20C 2.2GHz, Infiniband EDR, NVIDIA Tesla P100 | NVIDIA Corporation | United States | 60,512 (55,552) | 3.31 | 4.90
49 | HPC2 | iDataPlex DX360M4, Intel Xeon E5-2680v2 10C 2.8GHz, Infiniband FDR, NVIDIA K20x | Eni S.p.A. | Italy | 72,000 (42,000) | 3.19 | 4.61
65 | HPC3 | Lenovo NeXtScale nx360M5, Xeon E5-2697v4 18C 2.3GHz, Infiniband EDR, NVIDIA Tesla K80 | Energy Company (A) | Italy | 66,000 (39,000) | 2.59 | 3.80
78 | Makman-3 | PowerEdge R440/C6320p, Intel Xeon Phi 7250/Xeon Gold 6130 16C 2.1GHz, Intel Omni-Path | Saudi Aramco | Saudi Arabia | 53,300 | 2.32 | 3.58
79 | | Inspur TS10000, Xeon Gold 6130 16C 2.1GHz, NVIDIA Tesla V100, 25G Ethernet | Internet Service P | China | 55,104 (52,480) | 2.29 | 4.89
80 | Makman-2 | Dell PowerEdge R630, Xeon E5-2680v3 12C 2.5GHz, Infiniband QDR | Saudi Aramco | Saudi Arabia | 76,032 | 2.25 | 3.04

SLIDE 32

Signatory European Countries

Status of Signatory Countries

SLIDE 33

EuroHPC mission and objectives

  • To develop, deploy, extend and maintain in the Union an integrated world-class supercomputing and data infrastructure, and to develop and support a highly competitive and innovative High-Performance Computing ecosystem
  • To provide European scientists, industry and the public sector with the latest HPC and data infrastructure, and to support the development of its technologies and applications across a wide range of fields
  • To support an ambitious research and innovation agenda to develop and maintain in the Union a world-class High-Performance Computing ecosystem, exascale and beyond, covering all scientific and industrial value chain segments, including low-power processor and middleware technologies, algorithms and code design, applications and systems, services and engineering, interconnections, know-how and skills, for the next-generation supercomputing era

[Pillars: HPC Ecosystem; Infrastructure & Operations; R&I, Applications & Skills]

SLIDE 34

Infrastructure and Operation

  • Two exascale systems
    • Possibly one in 2022 and a second one in 2023
    • At least one built with competitive European technologies
    • Expected budget (TCO): 500 M€ each
  • Two pre-exascale systems
    • In operation in January 2021
    • Expected budget (TCO): 240 M€ each
  • At least 2 petascale systems
    • In operation in January 2021
    • Budget: TBD
  • Next steps
    • Identify the hosting sites
    • Gather user requirements to prepare the RFP
SLIDE 35

EuroHPC Roadmap

[Roadmap diagram, 2018-2025, coordinated by the EuroHPC JU: R&I and co-design across HPC and accelerator chips & systems, the software stack and applications; procurement and integration of 2 petascale and 2 pre-exascale systems, followed by 2 exascale systems (1 based on EU technology); applications and software from EsD; exascale technologies integration in two generations (Gen 1, Gen 2)]

SLIDE 36

Contact us!

  • Follow us on Twitter: @RES_HPC
  • Subscribe to our newsletter
  • applications@res.es / dissemination@res.es
  • Visit our website: www.res.es

SLIDE 37

THANK YOU!