

1. Presentation at the STI Cell BE Workshop
"Geoscience and Aerospace Applications on a Potential Heterogeneous Cell BE Cluster at UMBC"
Profs. M. Halem and Y. Yesha
Georgia Tech University, June 18-19, 2007
halem@umbc.edu

2. Overview
- UMBC Background
- iCLASS Configuration
- Cell Research Application Areas: Aerospace, Geoscience, Other
- Summary

3. The University of Maryland, Baltimore County (UMBC)
- One of the three research campuses in the University System of Maryland
- UMBC is a tier 1 doctoral research-extensive university (Carnegie Foundation)
- 500 full-time and 335 part-time faculty
- 9,500 undergraduate and 2,500 graduate students
- Located in suburban Baltimore County, between Baltimore and Washington, DC
- Focus on science, engineering, information technology, and public policy, with ~$85M in external research funding
- NY Times, May 25, 2006, and Science journal, March 25: "UMBC Science Program Remaking Science Education"
- UMBC's Meyerhoff Scholarship Program produces approximately 50 undergraduates per year; 90% matriculate to advanced degrees in science

4. CSEE @ UMBC: Computer Science & Electrical Engineering
- UMBC's largest department, with 45 faculty, ~1,000 undergraduate, and ~200 graduate students
- Degree programs (graduate and undergraduate): Computer Science, Computer Engineering, Electrical Engineering
- UMBC is #1 in BS CS production and #18 in PhD production among research universities (2004 NSF data)
- Many institutes, centers, and labs: Institute for Language and Information Technology, Center for Information Security and Assurance, Center for Photonics, Lab for Advanced Information Technology, interdisciplinary Computational Lab for Autonomous Systems and Services (iCLASS), VLSI Lab, ...
- Breadth and focus in research areas: ~$5M/year in sponsored research from government and industry; pervasive computing, AI, security, information retrieval, computational science, graphics, databases, VLSI, ...

5. UMBC/CSEE iCLASS Computing Configuration
- The CSEE department provides an interdisciplinary Computational Laboratory for Autonomic Systems and Services (iCLASS) within the UMBC School of Engineering that supports high-productivity computational research across campus
- The mission of iCLASS is to prepare UMBC for large-scale team-science research in partnership with universities, industry, and government by enabling an optical network cyberinfrastructure for prototyping large compute- and data-intensive computational systems in aerospace, geosciences, defense, and medical imaging
- Research applications have been drawn from environmental and life-science modeling, web-based service-oriented science and engineering, pervasive computing, communicating devices, visualization, and data preservation
- The computational lab currently consists of an IBM JS20/JS21 cluster, "bluegrit", and an Intel-based IBM cluster of PCs, "matisse", driving a Hyperwall for visualization. In addition, bluegrit is part of the University System of Maryland Grid and a member of SURAgrid.

6. iCLASS Compute Servers: bluegrit

Head Node
- IBM xSeries 346
- Red Hat Enterprise Linux 4.0
- 2.8 GHz Intel Xeon with Hyper-Threading
- 256 MB DDR SDRAM

Compute Nodes
- IBM BladeCenter with 34 JS20 blades and 12 JS21 blades
- Red Hat Enterprise Linux 4.0
- 2 x 2.2 GHz PowerPC 970 processors per JS20 blade and 4 x 2.5 GHz PowerPC 970 processors per JS21 blade
- 116 processors total
- 512 MB DDR SDRAM per JS20 blade and 2 GB per JS21 blade
- 40 GB hard disk per JS20 blade; no hard disk on the JS21 blades

Proposing to acquire a few Cell blades to upgrade bluegrit to a heterogeneous multicore processing system for exploratory research.

7. iCLASS Compute Servers: bluegrit Storage
- IBM Storage System DS4300
- Red Hat Enterprise Linux 4.0
- 21 x 250 GB hard disks
- 5.25 TB total disk storage

Internal Connectivity
- 4 Gbps connection between blades

External Connectivity
- 1 Gbps burst fiber optic through Quest
- 10 Gbps fiber optic with Dynamic Dragon Circuit Service from UMBC to UMCP/UMIACS, NASA/GSFC, NSA/LTS, NGIT (McLean, VA), and NLR

8. iCLASS Visualization

Tiled Display
- 6 x ThinkVision L200P monitors
- Each with 1600 x 1200 resolution
- Driven by ATI All-In-Wonder X600 cards
- Total resolution of ~11.5 megapixels

Matisse
- 7 x IBM ThinkCentre desktop computers
- Red Hat Enterprise Linux 4.0
- 3.3 GHz Intel Xeon with Hyper-Threading
- 1 GB DDR SDRAM
- 80 GB hard disk
- OptIPuter software

Planning to acquire a small cluster of PS3s (~4-6) to drive an upgraded tiled display and explore other uses.

9. Exploratory Cell Research Applications
- Aerospace: Temperature & Ozone Radiance Gridding
- Geosciences: Global Circulation & Hurricane Predictions
- Ray Tracing Animations
- Hierarchical Image Segmentation for Medical Scanning

Other Academic Research Areas
- Digital Media
- Chemical Modeling
- Bioinformatics
- Sensor Web Simulations for Dynamic Congestion Pricing

10. Aerospace: Cell Sounding Data Processing (M. Halem & Y. Yesha)
- NASA, NOAA, and DOD have very large satellite data holdings (petabytes) from many temperature and moisture sounding instruments flown over several decades. Current instruments have thousands of spectral channels.
- Atmospheric radiance data products are archived at different spatial and spectral resolutions and temporal frequencies, and in different formats.
- [Figure: one month of gridded AIRS data for a single channel]
- Data transformations such as gridding, statistical sampling, subsetting, and convolving are needed to produce long climate data records to study global warming. These transformations lend themselves well to the data-parallel model, which is ideal for the Cell BE processor (see the gridding sketch below).
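The gridding step referred to above can be pictured as a simple binning operation. The following is a minimal illustrative sketch in C, not the actual UMBC service code: it assumes each level-1 footprint carries a latitude, longitude, and one channel's radiance (simplified, assumed field names) and accumulates footprints onto a 1-degree global grid.

```c
#include <stddef.h>

/* Illustrative sketch only: bin per-footprint radiances for one spectral
 * channel onto a 1-degree global grid by averaging.  grid[][] and count[][]
 * must be zero-initialized by the caller. */

#define NLAT 180
#define NLON 360

typedef struct {
    float lat;       /* degrees, -90 .. 90    */
    float lon;       /* degrees, -180 .. 180  */
    float radiance;  /* radiance, one channel */
} Footprint;

void grid_radiances(const Footprint *obs, size_t n,
                    float grid[NLAT][NLON], int count[NLAT][NLON])
{
    for (size_t i = 0; i < n; i++) {
        int row = (int)(obs[i].lat + 90.0f);
        int col = (int)(obs[i].lon + 180.0f);
        if (row < 0 || row >= NLAT || col < 0 || col >= NLON)
            continue;                       /* skip out-of-range footprints */
        grid[row][col] += obs[i].radiance;  /* accumulate sum ...           */
        count[row][col]++;                  /* ... and sample count         */
    }
    for (int r = 0; r < NLAT; r++)          /* convert sums to cell means   */
        for (int c = 0; c < NLON; c++)
            if (count[r][c] > 0)
                grid[r][c] /= (float)count[r][c];
}
```

Because each footprint updates only its own grid cell, granules can be binned independently and the partial grids summed afterwards, which is the data-parallel structure the slide refers to.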

11. Fields of View for AIRS (2382 spectral channels) / AMSU (15 spectral channels)
- 2520 km (6 min.) per granule
- 135 scan lines per granule
- 240 granules per day

12. Aerospace: Cell Sounding Data Processing (cont.)
- Web service tools developed at UMBC provide complex gridding services on demand for atmospheric radiance data sets using the IBM JS20/21 cluster. However, reprocessing will consume the entire current system.
- The service algorithms for gridding different instruments are compute- and data-intensive to implement on serial processors:
  - On-demand workflows for spatial/temporal/spectral subsetting of gridded fields
  - Concatenation of radiances from high-resolution spectral instruments using FFT convolutions to match the spectral radiances of lower-resolution instruments
  - Generating gridded AIRS and MODIS radiance data sets on demand from satellite observations (L1) for user-specified spatial-temporal regions with selected statistical aggregations
  - Overlaying multi-instrument gridded animations over long time periods
- Motivation for Cell processing: the services can be implemented as embarrassingly parallel operations; e.g., each SPU can project a separate granule in parallel with all other granules (see the sketch below). Convolutions can be performed with 1-D FFTs in single precision to give performance factors of 10x to 15x over leading superscalar and vector CPUs (Williams et al., CF'06, May 2006, Ischia, Italy).
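As a rough illustration of the "one granule per processing element" idea, the sketch below uses POSIX threads as a stand-in for SPE threads; on a Cell blade the same pattern would use libspe contexts with each granule transferred into an SPE's local store. GranuleJob and granule_grid() are hypothetical names, not part of the UMBC services.

```c
#include <pthread.h>
#include <stdlib.h>

/* Hypothetical per-granule work item; the per-granule partial grids would be
 * merged after all workers finish. */
typedef struct {
    const void *granule;   /* raw L1 radiances for one 6-minute granule */
    void       *grid_out;  /* per-granule partial grid                  */
} GranuleJob;

extern void granule_grid(const void *granule, void *grid_out); /* assumed */

static void *worker(void *arg)
{
    GranuleJob *job = arg;
    granule_grid(job->granule, job->grid_out);  /* jobs share no state */
    return NULL;
}

/* Launch one worker per granule: with 240 granules per day, a day of data
 * decomposes naturally into 240 independent tasks. */
void process_day(GranuleJob *jobs, size_t njobs)
{
    pthread_t *tid = malloc(njobs * sizeof *tid);
    for (size_t i = 0; i < njobs; i++)
        pthread_create(&tid[i], NULL, worker, &jobs[i]);
    for (size_t i = 0; i < njobs; i++)
        pthread_join(tid[i], NULL);
    free(tid);
}
```

In practice the granules would be queued to a fixed pool of SPE threads rather than spawning one thread each, but the essential property is the same: there are no data dependencies between granules.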

13. [Figure: AIRS ch. 2333 (blues/purple), MODIS ch. 32; MODIS cloud-free June-Sept. composite true-color Earth]

14. Ozone Cell Processing Application
- The Ozone Monitoring Instrument (OMI) was launched on the NASA Aura spacecraft on July 15, 2004.
- OMI is a nadir-viewing UV-VIS spectrometer with a 2600 km wide swath and a 13 x 24 km² footprint, guaranteeing daily global coverage.
- OMI measures the back-scattered solar radiance in the wavelength range of 270 to 500 nm.
- The OMI Ozone Profile Algorithm is based on the optimal estimation technique (C.D. Rodgers, 2000) that has become standard in the field (see the equation below).
- It takes advantage of the hyperspectral capabilities of the OMI instrument to improve the vertical resolution of the ozone profile below 20 km compared to profiles from the SBUV instruments that have flown on NASA and NOAA satellites since 1970.
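For reference, the standard optimal-estimation (maximum a posteriori) retrieval step in Rodgers (2000), of which the OMI profile algorithm is an application, can be written as

```latex
\hat{x} = x_a + \left( K^{T} S_{\epsilon}^{-1} K + S_{a}^{-1} \right)^{-1}
          K^{T} S_{\epsilon}^{-1} \left( y - F(x_a) \right)
```

where y is the measured radiance vector, F the forward model, K its Jacobian, x_a and S_a the a priori profile and its covariance, and S_epsilon the measurement-error covariance. The repeated forward-model, Jacobian, and matrix-inversion evaluations are what make the retrieval expensive per pixel; this is the standard textbook form, not necessarily the exact OMI implementation.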

15. OMI Cell Processing (cont.)
- OMI processing uses new approaches to calculate the required radiances and Jacobians efficiently and to correct for rotational Raman scattering.
- Planning to produce global daily ozone profiles at 36 x 48 km, an unprecedented spatial resolution.
- The current implementation of the algorithm takes approximately 10 minutes per pixel on off-the-shelf commodity Intel processors. There are 1.3 million pixels per day, which would take about 8,000 processors to keep up with real-time processing of every pixel (see the back-of-the-envelope check below).
- Profiles are computed largely independently, so the code should parallelize well across the independent SPUs of roughly 100 Cell blades. It should be possible to use the SPU SIMD instructions to compute multiple profiles simultaneously within each SPU, at 5x over Intel processors.
- Averaging over 5 x 5 pixel regions would enable feasibility testing on the proposed modest-sized Cell blade cluster.
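The processor-count estimate above can be sanity-checked with simple arithmetic; the small program below is illustrative only and uses just the figures quoted on the slide, not new measurements.

```c
#include <stdio.h>

/* Back-of-the-envelope check of the real-time processor count, using the
 * per-pixel cost and daily pixel count quoted on the slide. */
int main(void)
{
    const double minutes_per_pixel = 10.0;    /* current Intel implementation */
    const double pixels_per_day    = 1.3e6;   /* OMI daily coverage           */
    const double minutes_per_day   = 24.0 * 60.0;

    double work_per_day = minutes_per_pixel * pixels_per_day;  /* processor-minutes */
    double processors   = work_per_day / minutes_per_day;      /* for real time     */

    printf("~%.0f processors for real-time processing of every pixel\n", processors);
    /* Prints roughly 9000, the same order of magnitude as the ~8000 quoted
     * above; a 5x per-profile Cell speedup would shrink this proportionally. */
    return 0;
}
```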
