  1. GRID Deployment Status and Plan at KEK
     ISGC2007, Takashi Sasaki, KEK Computing Research Center

  2. Who we are?
     - KEK (High Energy Accelerator Research Organization)
       - An Inter-University Research Institute Corporation since 2004; a national institute previously
       - Still 100% funded by the government
     - Mission
       - Carry out research related to accelerator science: particle physics, material science, life science and so on
       - Support collaborating universities
         - Universities are free in their choices and can be independent of us
     Takashi.Sasaki@kek.jp ISGC2007 27/MAR/2007

  3. High Energy Physics in Japan
     - Major high energy activities in Japan
       - Terminated in JFY2005: K2K Experiment at Kamiokande and KEK
       - Active experiments: Belle Experiment at KEK; KamLAND at Kamioka; CDF at FermiLab (USA)
       - Under construction: J-PARC (JAEA, Tokai); T2K Experiment at Tokai and Kamioka; ATLAS and ALICE at LHC
       - Future plan: SuperB Factory; International Linear Collider (ILC)
     - HEPnet-J: the High Energy Physics Network in Japan
       - KEK provides the network facility, HEPnet-J, on the SINET/SuperSINET (NII)
     [Map: SuperSINET/SINET nodes at the collaborating institutes, including U. Tokyo, ICRR/Tokyo, Tohoku U., Tsukuba U., Nagoya U., Kyoto U., Osaka U., Kobe U., Hiroshima U., KEK and others]

  4. High Energy Accelerator Research Organization
     [Map of KEK sites: J-PARC at Tokai; the B-Factory, LC Test Facility and Photon Factory at Tsukuba; Narita Airport shown for reference]

  5. KEKB e+e- Collider
     - 8 GeV e- (HER, SCC RF) and 3.5 GeV e+ (LER, ARES RF cavity)
     - Belle Experiment: 13 countries, 57 institutes, ~400 collaborators
     - B0 → J/ψ KS: observation of CP violation in the B meson system

  6. Strategy on GRID deployment
     - Highest priority is support for Belle and ILC
       - LCG deployment
       - Help universities get started
     - R&D
       - Preparing for ILC: Japan has a strong will to host the ILC
       - NAREGI: the Japanese flagship project for developing GRID middleware
         - May substitute for LCG/gLite in the future
         - GRID interoperability is the key

  7. LCG at KEK
     - Related computer systems
       - Central Information System, since Feb. 20, 2006: Central Computer System, GRID System, Mail System, etc.
       - B Factory Computer System, since Mar. 23, 2006: CPU 45,662 SI2K; disk 1 PB; tape 3.5 PB
     - 1st phase
       - KEK Grid CA: approved by the APGrid PMA and in operation since Jan. 2006
         - The third official Grid CA in Japan
         - NAREGI CA software was modified for use at KEK
         - KEK employees and their collaborators are eligible for this service
       - LCG and SRB for production usage are available on the GRID System
         - Not for public usage, but for supporting projects
         - WN: 36 nodes x 2 = 72 CPUs
         - Storage: disk (2 TB) + HPSS (~200 TB)
         - Supported VOs: Belle, APDG, Atlas_J
         - In operation since May 2006
     - 2nd phase
       - Since Jan. 2007, LCG is being deployed for Belle production
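Work enters these LCG resources through gLite job submission. A minimal JDL (Job Description Language) sketch of such a job is shown below; the script and sandbox file names are hypothetical, and only the "belle" VO name comes from the slides:

```
// Minimal gLite JDL sketch; belle_sim.sh and the log file names are hypothetical
Executable          = "belle_sim.sh";
StdOutput           = "stdout.log";
StdError            = "stderr.log";
InputSandbox        = {"belle_sim.sh"};
OutputSandbox       = {"stdout.log", "stderr.log"};
VirtualOrganisation = "belle";
```

A description like this would be handed to the Resource Broker listed among the site components, which matches it to a CE and returns the sandbox files on completion.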

  8. LCG Sites at KEK
     - JP-KEK-CRC-01
       - Since Nov. 2005
       - Site role: experimental
       - Resources and components: SL-3.0.5 w/ gLite-3.0 or later; CPU: 14, storage: ~1.5 TB
       - Supported VOs: belle, apdg, g4med, ppj, dteam, ops and ail
     - JP-KEK-CRC-02
       - Since early 2006
       - Site role: production
       - Resources and components: SL or SLC w/ gLite-3.0 or later; CPU: 48, storage: ~1 TB (w/o HPSS); full components: FTS, FTA, RB, MON, BDII, LFC, CE, SE
       - Supported VOs: belle, apdg, g4med, atlasj, ppj, ilc, dteam, ops and ail
     - We depend on services by APROC members at ASGC, Taiwan

  9. Belle GRID
     - LCG services are provided by the Central Computer System
       - Only the storage system of the B Factory Computer System is used
         - Almost the same size as the LHC Tier-1 centers
         - CPUs will be served in the future
     - JP-KEK-CRC-02 is the production site
     - Belle data is exported by SRB-DSI
       - Very useful for exporting existing data
       - An SRB-SRM interface is still desirable
       - iRODS?

  10. LCG at KEK (2): Data Transfer in the Belle Grid
     [Diagram: local Belle files on disk/HSM (HPSS) in the B Factory system are served by file servers to an SRB server (with MCAT); SRB-DSI exposes them as SRB files, which the Grid system (CRC-01/CRC-02, LCG/gLite CPUs, WS) sees as grid files, and which flow to remote sites at Krakow, ASGC (Taiwan), Melbourne and Nagoya]

  11. B Factory Computer System
     - New B Factory Computer System since March 23, 2006
     - History of the B Factory Computer System
     - Moore's Law: 1.5y = x2.0, so 4y = x~6.3 and 5y = x~10
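The growth factors quoted above all follow from one doubling time of 1.5 years; a short Python check (a sketch, not from the slides):

```python
# Performance growth under Moore's law as cited on the slide:
# capacity doubles every 1.5 years, so the growth over t years is 2**(t/1.5).
def moore_factor(years, doubling_time=1.5):
    """Return the expected performance multiple after `years`."""
    return 2 ** (years / doubling_time)

print(round(moore_factor(1.5), 1))  # 2.0, one doubling period
print(round(moore_factor(4), 1))    # 6.3, the slide's "4y = x~6.3"
print(round(moore_factor(5), 1))    # 10.1, the slide's "5y = x~10"
```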

  12. Total Number of Jobs at KEK in 2006
     [Bar charts of monthly job counts for JP-KEK-CRC-01 (axis to ~1,400) and JP-KEK-CRC-02 (axis to ~1,000); both dominated by the BELLE VO]

  13. Total CPU Time at KEK in 2006 (normalized by 1 kSI2K)
     [Bar charts of monthly CPU time in hrs kSI2K for JP-KEK-CRC-01 (axis to ~12,000) and JP-KEK-CRC-02 (axis to ~4,000); both dominated by the BELLE VO]
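The accounting above is expressed in kSI2K-hours, i.e. raw CPU hours scaled by the node's SPECint2000 rating. As an illustration of that normalization (the node rating below is hypothetical, not a KEK hardware spec):

```python
# Normalizing CPU time to kSI2K units, as in the accounting plots above.
# The SI2K rating used here is illustrative only.
def normalized_hours(cpu_hours, si2k_rating):
    """Convert raw CPU hours on a node of the given SI2K rating to kSI2K-hours."""
    return cpu_hours * si2k_rating / 1000.0

# e.g. 500 CPU hours on a hypothetical 1,400 SI2K worker node:
print(normalized_hours(500, 1400))  # 700.0 kSI2K-hours
```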

  14. JLCG
     - Sponsored by the e-Science promotion program of the National Institute of Informatics
     - An LCG federation among Japanese universities and KEK
       - Nagoya U.: ATLAS, Belle
       - Tohoku U.: ILC, KamLAND
       - U. of Tsukuba: ATLAS, ILC
       - Kobe U.: ATLAS, ILC
       - Hiroshima Institute of Tech.: ATLAS, information science
       - KEK: Belle, ILC, ATLAS, ...
     - KEK acts as the GOC (Grid Operations Center)

  15. NAREGI at KEK
     - NAREGI: National Research Grid Initiative
     - What do we expect?
       - Better quality
       - Better functionality: job submission with NAREGI is currently almost 3 times faster than with gLite 3
       - Better support
     - We may substitute NAREGI for LCG/gLite if NAREGI works as advertised
       - GRID interoperability is the key; we are working together with NAREGI on this issue
       - Support for operation and application adoption is necessary; LCG is excellent at supporting deployment and operations

  16. NAREGI at KEK (2)
     - NAREGI beta test bed: 6 nodes + 3 nodes (DG)
     - Installation
       - NAREGI beta 1.0, Nov. 2006: Information Service, PSE, WFT, GVS
       - NAREGI-DG, Feb. 2007
     - Tests by application
       - P152 (heavy ion) experiment data analysis: data processing, then the results stored to SRB through SRB-DSI
       - Belle event simulation
     - We found several inconvenient differences between gLite and NAREGI, and asked the NAREGI group for improvements
     [Diagram: test-bed layout with nodes nrg00-nrg11 hosting the portal, the Super Scheduler (SS), the Information Server (IS), VOMS user management, the GridVM server (GVM) and GridVM engine nodes, DG MCS (metadata), DG RMS (resource management), a gateway, and the Gfarm FS]

  17. GRID at KEK
     - GRID environment at KEK (as of Feb. 9, 2007)
     [Diagram: gLite/EGEE CPUs (CRC-01, CRC-02) and the NAREGI test bed (naregi-kek) both reach the SRB server (MCAT, SRB-DSI); SRB files, local files and grid files are held on HPSS]

  18. Plan
     - More production use of LCG
       - Belle: collaboration with ASGC, National Central University (Taiwan), U. of Melbourne, Krakow and so on
       - ILC GRID: collaboration with CC-IN2P3, starting with CALICE
     - NAREGI-LCG (gLite) interoperability
       - More local tests and discussion with NAREGI
     - Seeking a possible scenario for hosting a Tier-1 center in Japan in the future (2012?)
       - Good practice for hosting the Tier-0 center of the ILC
