Presented at CERN - Geneva, Switzerland, March 27, 2009, by Dean Nelson



SLIDE 1

Presented at CERN - Geneva, Switzerland

March 27, 2009 Dean Nelson

  • Sr. Director, Global Lab & Datacenter Design Services (GDS)
SLIDE 2

Unique Challenge

SLIDE 3
  • Demand
  • Users
  • Services
  • Access
  • Power
  • Costs
  • Space
  • Heat

Watts per Square Foot (Meter):

  • 2003: 40 W/ft2 (430 W/m2)
  • 2005: 120 W/ft2 (1,300 W/m2)
  • Next Generation: 800 W/ft2 (8,600 W/m2)

Demand and Capacity Are Colliding...

...And Data Centers Are Right in The MIDDLE!
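The square-foot and square-meter densities on this slide are related by a plain unit conversion (1 m2 = 10.7639 ft2). A minimal sketch of that conversion, which reproduces the slide's metric values to within the slide's own rounding:

```python
# Convert the slide's power densities from W/ft^2 to W/m^2.
# Conversion factor: 1 m^2 = 10.7639 ft^2.
SQFT_PER_SQM = 10.7639

def w_per_sqm(w_per_sqft: float) -> float:
    """Convert a power density from W/ft^2 to W/m^2."""
    return w_per_sqft * SQFT_PER_SQM

for era, density in [("2003", 40), ("2005", 120), ("Next Generation", 800)]:
    # Prints 431, 1,292 and 8,611 W/m^2 -- the slide rounds these
    # to 430, 1,300 and 8,600.
    print(f"{era}: {density} W/ft^2 = {w_per_sqm(density):,.0f} W/m^2")
```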

SLIDE 4

Moore’s Law in Action

(Systems at peak utilization)

  • E10K (1997): 64 threads; ~150k tpm; 1.5x footprint; 9,620 watts; 1,800 lbs.
  • 5140: 128 threads, 16 cores; ~300k tpm (2x per 5140); 720W each
  • 6048: 768 cores, 28.5kW; full rack; 30 kWatts; ~2,200 lbs.

SLIDE 5

Industry average is between 4-6 kW per cabinet; >20 kW "skyscrapers" will be integrated; data centers must deal with a mixed-load environment

Reality: Heterogeneous Data Centers

SLIDE 6

Why is this topic important?

SLIDE 7

Unprecedented Activity

  • Sun Datacenter Briefings over 17 months (07/07-2/09)

> >675 briefings, an average of ~8 per week
> California: >4,000 people representing >400 customer companies have engaged in briefings and toured Santa Clara, CA, India and UK datacenters in 15 months
> Colorado: almost 1,000 people in less than two months
> Challenges: power, cooling, space, connectivity and utility costs
> Interest: investment protection, future-proofing, efficiency

  • Investments

> 21 of these companies are spending $19B on datacenter projects in the US alone
> Does not include Microsoft, Google, Facebook or DRT

SLIDE 8

A different perspective

Server: 440-watt server; 3,942 kWh/year; 5.3 tonnes CO2

Auto Travel: Toyota Camry, 15,000 mi/year (24,000 km/year); 5.3 tonnes CO2

Air Travel: commercial airliner, Vancouver-Toronto (7 trips); 5.2 tonnes CO2

A single server, usually on 24x7, is responsible for about the same amount of CO2 as a typical automobile driven for a year.
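A quick sanity check on the slide's arithmetic (my calculation, not Sun's): 3,942 kWh/year corresponds to a constant draw of 450 W over 8,760 hours, close to the 440 W nameplate quoted, and the 5.3-tonne figure implies a grid emission factor of roughly 1.34 kg CO2 per kWh, an assumption derived here only from the slide's own numbers:

```python
# Verify the slide's annual-energy figure for a server running 24x7.
HOURS_PER_YEAR = 24 * 365           # 8,760 hours

draw_watts = 450                    # average draw implied by the slide
annual_kwh = draw_watts * HOURS_PER_YEAR / 1000
print(annual_kwh)                   # 3942.0 kWh, matching the slide

# Emission factor implied by 5.3 tonnes CO2 over 3,942 kWh.
factor_kg_per_kwh = 5300 / annual_kwh
print(round(factor_kg_per_kwh, 2))  # ~1.34 kg CO2 per kWh
```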

SLIDE 9

A different perspective

A single server is responsible for about the same amount of CO2 as a typical automobile driven for a year

Usually on, 24x7.

BUT Moore's Law mandates efficiency gains: the automotive equivalent of the efficiency gained over a 10-year period = 163 MPG!

SLIDE 10

Changing Priorities & Drivers

  • At the end of a dock instead of the end of a street
  • $15B investment ($1.2B solar project)
  • First Carbon Neutral, Waste Free, Car-Free City

Investment in solar innovation will change the industry

SLIDE 11

Floating Data Centers

  • Tier1-Tier3 ECO datacenters at US and international ports
  • Capacity: 4000 racks and over 350 SunMDs
  • 75MW of power, free cooling from ocean water
  • Six months' time to market, up to 40% less cost than a traditional build
  • At the end of a dock instead of the end of a street
SLIDE 12

Strategy

SLIDE 13

A New Age

Information Age: All Things Connected; Data Storm Building

Industrial Age: Global Production; Global Consumption

Participation Age: Unprecedented Contribution; Unprecedented Consumption

SLIDE 14

Top 20 Social Networks

Source: http://en.wikipedia.org/wiki/List_of_social_networking_websites (as of 11/10/2008)

1.3 Billion Users and growing

SLIDE 15

The Shift

[Chart, 1990-2010, log scale 0.1 to 1M: Internet Infrastructure, High Performance Computing, Software as Services, Global Consumer Services and Core Enterprise Apps plotted against Moore's Law]

UNDER SERVED: IT = Weapon. OVER SERVED: IT = Cost.

SLIDE 16

Innovate

SLIDE 17

A moment of silence...

  • Raised Floors Are Dead

> No longer required
> Go against physics
> Increasingly cumbersome
> Expensive

  • Next Generation equipment requires a new way of thinking...

SLIDE 18

Pod Architecture

Modular Data Center Building Blocks Container and/or Brick & Mortar

SLIDE 19

Modular Pod Components

Physical Design

  • Influenced by cooling, brick and mortar and/or container

Cooling – Closely Coupled

  • In-row or overhead, hot-aisle containment, and passive

Power Distribution

  • Overhead or under-floor busway

Cabling

  • Localize switching in each pod
SLIDE 20

Cooling in Sun Modular Datacenter

  • Integrated cooling modules
  • Circular airflow, 5 cooling zones per module
  • Variable-speed fans on a per-fan basis
  • Handles densities up to 25 kW/rack
SLIDE 21

Sun Pod Architecture

SLIDE 22

Sun Pod Architecture

SLIDE 23

Closeup: Power Distribution

Modular overhead, hot-pluggable busway with conductors to handle multiple voltages and phases

  • Requires no floor space or cooling

> Transformers moved outside the datacenter

  • Snap-in cans with short whips

> Non-disruptive
> Reduced copper consumption
> No in-place abandonment
> Significant time reduction: from months to minutes
SLIDE 24

Closeup: Power Distribution

Modular overhead, hot-pluggable busway with conductors to handle multiple voltages and phases

  • Supports multiple Tier levels

> Use multiple busways

  • Scalable by removing jumpers

SLIDE 25

Act

SLIDE 26

Design Services – Holistic Solution

Global Lab and Datacenter Design Services: bridging the gap between Facilities and IT/Eng

GDS Facilities IT/Eng

  • Central design competency center
  • Understanding IT/Eng requirements, speak facilities’ language
  • Design services provide holistic solution
  • Aligned Business Organizations

> http://www.sun.com/aboutsun/environment/docs/aligning_business_organizations.pdf

SLIDE 27

History: Sun’s Internal Challenge

  • Facilities is Sun’s second largest expense

> Real estate, utility, tax, and support costs

  • 20+ years of organic growth

> New products, reorgs, acquisitions
> Lack of design standards and control of implementations for global technical infrastructure
> Duplication and inefficiencies

  • Multi-billion dollar IT/R&D technical infrastructure portfolio

> 860k ft2 (80k m2) of Eng and IT space globally (reduced from 1.4M ft2 / 138k m2)
> 1,068 individual rooms (reduced from 1,685)
> IT space = 17% of the portfolio (143k ft2 / 13k m2, 275 rooms)
> Engineering/Services = 83% of the portfolio (718k ft2 / 67k m2, 793 rooms)

SLIDE 28

Corporate Drive to Reduce Costs

  • Spotlight: Santa Clara, CA - 2007

> Shed 1.8M ft2 (167k m2) of real estate
> Compress 202k ft2 (18.8k m2) of datacenter space into <80k ft2 (7,400 m2) of new datacenter space in Santa Clara
> Project included every major business unit in Sun
> 12-month project duration (complete by 06/30/2007)

  • Approach

> Phase I: Move 84k ft2 (7,800 m2) of datacenters into existing space
> Phase II: Compress and build <80k ft2 (7,400 m2) of next generation datacenter space in 12 months

SLIDE 29

ROI Sweet Spot = >4 to 1 Replacement Ratio

Spotlight: Hardware Replacement

  • 88% space compression
  • 61% utility reduction
  • $9M cost avoidance
  • 2.2MW to 500 kW
  • 450% compute increase
  • 550 racks to 65
  • 3,227 tons CO2 reduced
  • 312 cars off the road
  • Minimal downtime
  • Completed in 3 months
  • 2:1 Server, 3:1 Storage
  • 2,915 devices replaced

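The compression figures above can be checked with simple arithmetic (my calculation, not Sun's published methodology). Note that the slide's 61% figure is the measured utility reduction, a different metric from the connected-power cut computed here:

```python
# Verify the space-compression figure and compute the connected-power
# reduction from the slide's before/after numbers.
racks_before, racks_after = 550, 65
power_before_kw, power_after_kw = 2200, 500   # 2.2 MW -> 500 kW

space_compression = 1 - racks_after / racks_before
power_reduction = 1 - power_after_kw / power_before_kw

print(f"rack compression: {space_compression:.0%}")    # 88%, as on the slide
print(f"connected-power cut: {power_reduction:.0%}")   # 77%
```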

SLIDE 30

Bring Out Your Dead...

SLIDE 31

China, India, UK, Czech Republic, Norway

Global Consolidation

  • $250M investment
  • 41% global datacenter space compression

> 1.44M ft2 to 858k ft2

  • Scalable/Future Proof

> 9MW to 21MW (CA)
> 7MW to 10MW (CO)

  • Largest Liebert/APC installs
  • 15 buildings to 2
  • 152 datacenters to 14
  • $1.2M utility rebates, $250k innovation award
  • Enabled company pace
  • Reduced opex 30% (CA)
SLIDE 32

China, India, UK, Czech Republic, Norway

Colorado DC Consolidation

  • 66% space compression: 496k ft2 to 126k ft2
  • Scalable/Future Proof: 7MW to 10MW
  • First & largest Liebert XD dynamic cooling installs
  • Largest, most complex & costly consolidation in Sun's history


  • Water treatment saves 600k gallons/year, eliminates chemicals
  • Waterside economizer: free cooling >1/3 of the year
  • Compressed 165k ft2 of raised floor to <700 ft2 ($4M cost avoidance)
  • Flywheel UPS eliminates batteries
  • Chillers 32% more efficient at average load than the ASHRAE standard
  • 2 ACE Awards
  • Removed 1M kWh per month
  • Removed 5% of global carbon
SLIDE 33

Share

SLIDE 34

Typical Data Center
  • 573 kW less support power compared to the industry PUE target (2)*
  • 36% more efficient than the industry PUE target and almost 50% better than the industry PUE average (2.5)*
  • $400,000 annual opex savings compared to a typical data center ($0.08/kWh)

SCA11-1500 Software Datacenter PUE:

Load                   kW      % of Total
IT Load                798     78.02%
Chiller Plant          126     12.28%
RC/CRAC Loads           39      3.84%
UPS/Transformer Loss    39      3.86%
Lighting                20      2.00%
Total Load           1,023
Total Support Loads    225
PUE = 1.28 (DCiE = 78%)

Target Datacenter PUE:

Load                   kW      % of Total
IT Load                798     50.00%
Chiller Plant          399     25.00%
RC/CRAC Loads          192     12.00%
UPS/Transformer Loss   160     10.00%
Lighting                48      3.00%
Total Load           1,596
Total Support Loads    798
PUE = 2.00 (DCiE = 50%)

* Industry average & target from the Uptime Institute: http://www.datacenterknowledge.com/archives/2008/Jan/22/case_study_ups_green_data_center.html

Power Usage Effectiveness (PUE)

SCA11-1500 Data Center Efficiency Benchmark

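The slide's PUE figures follow directly from the standard definition, PUE = total facility load / IT load, with DCiE as its reciprocal. A minimal sketch using the slide's load numbers:

```python
# Compute PUE (Power Usage Effectiveness) from IT and support loads.
def pue(it_kw: float, support_kw: float) -> float:
    """PUE = total facility load / IT load."""
    return (it_kw + support_kw) / it_kw

sca_it, sca_support = 798, 225   # SCA11-1500 software datacenter (kW)
tgt_it, tgt_support = 798, 798   # industry target from the slide (kW)

print(round(pue(sca_it, sca_support), 2))   # 1.28, as on the slide
print(round(pue(tgt_it, tgt_support), 2))   # 2.0

dcie = 1 / pue(sca_it, sca_support)         # DCiE is the reciprocal of PUE
print(f"{dcie:.0%}")                        # 78%
```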
SLIDE 35

Best Practices = Competitive Weapon

  • Align Facilities, IT & Engineering

> Partnering nets significant short-term & long-term savings
> http://www.sun.com/aboutsun/environment/docs/aligning_business_organizations.pdf

  • Hardware Replacement

> Apply new hardware solutions and extend the life of your DC
> http://www.sun.com/aboutsun/environment/docs/creating_energy_efficient_dchw_consolidation.pdf

  • Simplify Datacenter design with the POD concept

> Power: Modular, Scalable, Smart: http://www.sun.com/aboutsun/environment/docs/powering_energy_efficientdc.pdf
> Cooling: Adaptable, Scalable, Smart: http://www.sun.com/aboutsun/environment/docs/cooling_energy_effiicientdc.pdf
> Cabling: Distributed vs Centralized: http://www.sun.com/aboutsun/environment/docs/connecting_energy_efficientdc.pdf
> Measurement: Power to control: http://www.sun.com/aboutsun/environment/docs/accurately_measure_dcpower.pdf

  • Data Center Tour Videos

> California: http://www.sun.com/aboutsun/environment/media/datacenter_tour.xml
> Colorado: http://www.sun.com/featured-articles/2009-0126/feature/index.jsp

SLIDE 36

Sun Blueprints

  • First Chapter - Modularity: released June 10, 2008
  • Second Chapter - Electrical: released March 10, 2009
  • A total of nine chapters to be released over the next 12 months
  • Download: http://sun.com/blueprints

SLIDE 37

Participate & Contribute

SLIDE 38

Internet Archive - WayBackMachine

  • Hosted @ Sun Santa Clara

> Captured over 12 years (2PB compressed)
> Entire internet in a box
> 4.5PB of x4500 storage (thumper)

SLIDE 39

Hosted Chill Off

  • Load: up to 320 Sun V20z 1U servers

> Selected legacy servers (currently EOL) to create a hostile environment
> Max load 10kW per rack

  • APC InRow RC: in-row architecture with thermal containment; water based
  • Liebert XDV+: overhead; refrigerant based
  • Rittal LCP+: contained rack; water based
  • IBM/Vette RDHX: passive heat-exchanger door; water based
  • Spraycool: chip-level cooling; Fluorinert based
SLIDE 40

Chill-Off Results

[Chart: results ranked from more efficient to less efficient]
SLIDE 41

Chill-Off 2

[Chart: performance results ranked from more efficient to less efficient]
SLIDE 42

Clustered Systems

[Logos: test participants, testing support, sponsors, reviewers]

SLIDE 43

Chill Off 2 Close Up...

Ultra-efficient, no-fan, high-density servers that plug into the pod infrastructure

  • Breaking our own warranty: we are experimenting with this type of configuration in the chill-off now...

SLIDE 44
  • 798 members, 481 companies, 39 countries, 59 industries*
  • http://datacenterpulse.org

Data Center End User Community

An exclusive group of global datacenter owners, operators and users influencing the datacenter industry through the end user lens.

* Membership stats as of 9:05pm 03/18/2009.

SLIDE 45

Open, Global, Focused

  • Formed Sept/2008
  • First Global Summit held in CA, February 2009

> Un-conference: topics defined and driven by members
> In-person & on-line
> Selected topics: Top 10, Metrics, Certification, Cloud, Industry Alignment, Fanless Servers, Power

  • Access

> Website: http://www.datacenterpulse.org
> YouTube: http://www.youtube.com/user/datacenterpulse

  • Join the group through LinkedIn

> Owner/Operators: http://www.linkedin.com/groups?gid=841187
> Industry: http://www.linkedin.com/groups?gid=1315947

SLIDE 46

The Top 10 (Feb/2009)

1) Align Industry Organizations
2) Data Center Certification Standard
3) Standard Data Center Stack
4) Update or Dump Tier Levels
5) More Products with Modularity
6) Simple Top Level Efficiency Metric
7) End-to-End IT/Facilities Measurement
8) Standard Conductive Cooling Interface
9) 480V/277V Power Supplies
10) Independent Data Center Repository

SLIDE 47

Draft: Standard Data Center Stack

  • Industry Alignment
SLIDE 48

Dean Nelson

  • Sr. Director of Global Lab & Datacenter Design Services (GDS)

dean.nelson@sun.com http://blogs.sun.com/geekism

Thank You