SLIDE 1
Analyzing Architectures

Introduction to Software Architecture Jay Urbain, PhD

Credits: Software Architecture in Practice, 2nd. Ed., Len Bass, Paul Clements, Rick Kazman

SLIDE 2

ATAM

  • Architecture Tradeoff Analysis Method
  • A thorough and comprehensive way to evaluate a software architecture
  • This is hard:

– Large systems are very complicated.
– Evaluation needs to compare business goals and technical decisions.
– Careful management of the evaluation process is needed to consider multiple stakeholders in a review format.

  • Focus on RISK!!!

SLIDE 3

Participants in the ATAM

  • Three basic groups/perspectives:

– The evaluation team: ~3-5 people
– Project decision makers: project manager, customer, architect
– Architecture stakeholders: varies

SLIDE 4

Outputs

  • Concise presentation of the architecture
  • Clearly defined/refined business goals
  • Quality requirements via a collection of scenarios
  • Mapping of architectural decisions to quality requirements

– Tactics and patterns used to satisfy qualities

  • A set of risks and non-risks
  • A set of risk themes

SLIDE 5

Phases

  • Phase 0: partnership and preparation
  • Phases 1 and 2: evaluation

– Phase 1 has 6 steps
– Phase 2 has 3 steps, with all stakeholders

  • Phase 3: follow-up via written report

SLIDE 6

Phase 0: Partnership and Preparation

Participants:

  • Evaluation team
  • Project decision makers
  • Architecture stakeholders

Preparation:

  • Team leadership and key project decision makers meet informally to work out the details of the exercise.
  • Project representatives brief evaluators about the project so the team can be supplemented by people who possess the necessary expertise.
  • Work out logistics.
  • Identify stakeholders.
  • Define what's expected for Phase 1.

SLIDE 7

Phase 1: Evaluation

Activity:

  • Evaluation

Participants:

  • Evaluation team and key project decision makers

Duration:

  • Varies; ~1 day for large projects, followed by a hiatus of ~1 to 2 weeks

SLIDE 8

Phase 1

1. Present the ATAM process
2. Present the business drivers
3. Present the architecture
4. Identify architectural approaches
5. Generate quality attribute tree (table)
6. Analyze the architectural approaches

SLIDE 9

Present the ATAM process

  • Evaluation leader presents the ATAM process to project representatives.

– Explain the process
– Answer questions
– Set context and expectations
– Modify as necessary

SLIDE 10

Present the business drivers

  • Everyone involved needs to understand the context of the system and the primary business drivers motivating development.
  • Should describe:

– The system's most important functions
– Relevant technical, managerial, economic, and political constraints
– Business goals as they relate to the project
– Major stakeholders
– Architectural drivers

SLIDE 11

Present the architecture

  • Presented by the lead architect
  • "Appropriate" level of detail
  • Cover architectural constraints:

– OS, hardware, middleware, language, framework, etc.

  • Architectural approaches:

– Tactics (and patterns) used to meet architectural requirements

SLIDE 12

Architecture Presentation

  • Description: elevator pitch
  • System context
  • UC's: list, trace through 1 or 2 most important UC's
  • Key qualities (business drivers) that affect the structure of the system
  • Key quality attribute scenarios with quantitative response
  • Tactics
  • High-level design
  • Layers, common application framework
  • Key design patterns for each layer
  • Key class diagrams
  • Sequence diagrams (one for each use case)
  • Skeletal system: thread through system, focus on risk and validating qualities

SLIDE 13

Identify architectural approaches

  • Analyze the architecture by analyzing its approaches
  • Common application architectures and patterns
  • Known ways in which each pattern/approach affects particular quality attributes
  • Example:

– The layered pattern tends to bring portability, maintainability, and distributability to a system, possibly at the expense of performance.
– A data-centric architecture gains its benefits at the cost of distributability.
– What are the consequences of a service layer?

  • The evaluation team should catalog patterns and approaches.

SLIDE 14

Generate Quality Attribute Utility Tree/Table

  • Assess the architecture's suitability for delivering the required quality attributes to the system.
  • A high-performance system may be totally wrong for a system in which performance is not nearly as important as security.
  • Articulate quality attribute goals: identify, prioritize, and refine the system's most important quality attribute goals, expressed as scenarios.

SLIDE 15

QA Utility Tree/Table

Quality Attribute | Attribute Refinement | Scenario (Importance, Difficulty)
------------------|----------------------|----------------------------------
Performance       | Throughput           | At peak load, the system is able to complete 150 normalized transactions per second. (M,M)
Usability         | Proficiency training | A user in a particular context asks for help, and the system provides help for that context. (H,L)
Availability      |                      | The system supports 24/7 Web-based account access by patients. (L,L)
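The utility tree above can also be held as data for ranking scenarios. A minimal Python sketch; the nested-dict layout and the `rank` ordering are illustrative choices for this sketch, not something ATAM prescribes:

```python
# Illustrative representation of the utility tree above:
# attribute -> refinement -> list of (scenario, importance, difficulty).
# The structure is an assumption for this sketch, not part of ATAM.
utility_tree = {
    "Performance": {
        "Throughput": [
            ("At peak load, the system is able to complete 150 "
             "normalized transactions per second.", "M", "M"),
        ],
    },
    "Usability": {
        "Proficiency training": [
            ("A user in a particular context asks for help, and the "
             "system provides help for that context.", "H", "L"),
        ],
    },
    "Availability": {
        None: [  # no refinement given in the table
            ("The system supports 24/7 Web-based account access by "
             "patients.", "L", "L"),
        ],
    },
}

# Flatten and rank scenarios: most important first, difficulty as tiebreaker.
rank = {"H": 3, "M": 2, "L": 1}
scenarios = [
    (attr, text, imp, diff)
    for attr, refinements in utility_tree.items()
    for scenario_list in refinements.values()
    for text, imp, diff in scenario_list
]
scenarios.sort(key=lambda s: (rank[s[2]], rank[s[3]]), reverse=True)
# The Usability scenario (H,L) ranks first here.
```

The (importance, difficulty) pairs drive which scenarios get analyzed first in the next step.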

SLIDE 16

Analyze Architectural Approaches

  • Rank scenarios and take them one at a time
  • Probe the architectural approaches used to carry out each scenario
  • Document relevant architectural decisions and identify risks, non-risks, sensitivity points, and tradeoffs

– Example: the number of simultaneous database clients will affect the number of transactions that can be processed per second.

SLIDE 17

Phase 2

1. Brainstorm and prioritize scenarios
2. Analyze the architectural approaches
3. Present results

SLIDE 18

Brainstormed Scenarios

Number | Scenario
-------|---------
1      | Previously public data is made private, and access is adjusted accordingly.
6      | Decide to support German.
9      | A user requests a new field for asynchronous queries.
21     | Introduce a new workflow process for patient check-in and check-out.
29     | A fire in the data center forces the information hub to be moved to a new location.
33     | George quits.

SLIDE 19

Analyze Architectural Approaches

  • The evaluation team guides the architect in carrying out the highest-ranked scenarios identified during brainstorming.
  • The architect explains how each scenario is addressed.

SLIDE 20

Present Results

  • Summarize and present the ATAM results:

– Architectural approaches documented
– Set of scenarios and their prioritization from the brainstorming
– Utility table or similar
– Risks discovered
– Non-risks documented
– Sensitivity points and tradeoff points found

SLIDE 21

ATAM Summary

  • Not an evaluation of requirements
  • Not a code evaluation
  • Does not include actual system testing
  • Not precise, but identifies possible risk areas within the architecture
  • Actual practice: amazement that so many risks can be found in such a short time

SLIDE 22

The CBAM (Cost Benefit Analysis Method)

  • The biggest tradeoffs in large, complex systems usually have to do with economics.
  • How should an organization invest its resources to maximize gain and minimize risk?
  • Economics includes the cost to build, but also the benefits that an architecture delivers.

SLIDE 23

Decision-Making Context

  • Begins with the ATAM results
  • Adds in the costs and benefits associated with the architectural decisions
  • The stakeholders decide whether they should:

– Use redundant hardware, failover, or load balancing
– Save cost on this project and invest in another

  • Provides a framework for making decisions
  • Helps clarify ROI, the ratio of benefit to cost

SLIDE 24

Utility

  • Definition: the benefit gained by system stakeholders
  • Variation: vary the response values of the ATAM scenarios, which gives a utility-response curve

SLIDE 25

Utility Response Curves

  • Start with the collection of scenarios.
  • For each scenario, see how they differ in their quality attribute responses.
  • Assign utility based on the importance of these values:

– Best case (a 0.1 s response time is instantaneous to a person, so 0.03 s doesn't matter) = 100
– Worst case (minimum requirement) = 0
– Current (relative to best and worst) = x%
– Desired (relative to best and worst) = y%

  • Generate curves for all scenarios across architectural strategies.
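Querying a utility-response curve at a given response value then comes down to interpolating between the plotted points. A sketch under the assumption of piecewise-linear interpolation; the curve numbers are made up for illustration:

```python
def utility(response, curve):
    """Interpolate utility at `response` from a piecewise-linear curve.

    `curve` is a list of (response_value, utility) points; responses
    outside the plotted range clamp to the end-point utilities.
    """
    pts = sorted(curve)
    if response <= pts[0][0]:
        return pts[0][1]
    if response >= pts[-1][0]:
        return pts[-1][1]
    for (x0, u0), (x1, u1) in zip(pts, pts[1:]):
        if x0 <= response <= x1:
            return u0 + (u1 - u0) * (response - x0) / (x1 - x0)

# Illustrative response-time scenario (lower is better):
# best case 0.1 s -> utility 100, worst case 2.0 s -> utility 0.
curve = [(0.1, 100), (0.5, 80), (1.0, 40), (2.0, 0)]
```

For example, `utility(0.75, curve)` returns 60.0 on this curve; a strategy's expected response becomes a fifth data point evaluated the same way.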

SLIDE 26

Utility Response Curves

  • Determine quality attribute levels for the best-case and worst-case situations.
  • Best case: no value is added above the best case.
  • Worst case: the minimum acceptable threshold at which the system must perform.
  • The current and desired levels must also be determined.

SLIDE 27

Architectural Strategies

  • Determine architectural strategies to move from the current quality attribute response to the desired or best-case level.
  • Calculate the utility of the expected value by interpolating from the four values already plotted.
  • Consider the effect of each architectural strategy, and its cost.

SLIDE 28

Utility Response Curves

  • Determine benefit and normalize.
  • Calculate the overall utility of an architectural strategy across scenarios by summing the utility from each utility-response curve:

– B_i = Σ_j (b_ij × w_j), where the weights w_j are normalized
– b_ij = U_expected - U_current
– ROI: R_i = B_i / C_i
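The benefit and ROI formulas above compute directly. A minimal sketch for one strategy; all the numbers (per-scenario utilities, weights, cost) are made up for illustration:

```python
# B_i = sum_j(b_ij * w_j), b_ij = U_expected - U_current, R_i = B_i / C_i.
# Per-scenario (U_expected, U_current) pairs, read off the utility-response
# curves; the values below are illustrative only.
utilities = [(90, 60), (70, 50), (40, 40)]
weights = [0.5, 0.3, 0.2]  # normalized scenario weights (sum to 1)
cost = 100.0               # estimated cost C_i of the strategy

# Weighted sum of per-scenario utility gains, then ratio of benefit to cost.
benefit = sum((u_exp - u_cur) * w
              for (u_exp, u_cur), w in zip(utilities, weights))
roi = benefit / cost
```

With these numbers the benefit works out to 21.0 and the ROI to 0.21; comparing R_i across candidate strategies is what drives the CBAM investment decision.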

SLIDE 29

Architectural Strategies

  • How do you move from the current quality attribute response level to the desired or best-case level?
  • What would be a strategy for:

– improving system response time?
– increasing capacity?

  • Fifth data point: derive the expected value of the new response; its utility is interpolated from the original four values
  • Watch out for side effects

SLIDE 30

Exercise for lab: ATAM Introduction

  • Easy architecture review!

– Evaluation team: 4 class teams (1 customer, rest stakeholders); XX is PM and architect
– Documents from a real system (architectural document)
– Outputs: next slide
– Process: skim through the documentation, looking for information that provides the expected outputs

SLIDE 31

Exercise: Outputs

  • A concise presentation of the architecture
  • Defined business goals
  • Quality requirements via a collection of scenarios
  • Mapping of architectural decisions to quality requirements
  • A set of risks and non-risks

SLIDE 32

Design Review

Design Review: 20 min., 10 min. Q/A

  • Description: elevator pitch
  • System context
  • UC’s: list
  • Key qualities (business drivers) that affect the structure of the system

  • Key quality attribute scenarios with quantitative response
  • High-level design
  • Layers, common application framework
  • Key design patterns for each layer
  • Key class diagrams
  • Sequence diagrams (one for each use case)
  • Skeletal system: thread through system, focus on risk and validating qualities