  1. Analyzing Architectures Introduction to Software Architecture Jay Urbain, PhD Credits: Software Architecture in Practice, 2nd ed., Len Bass, Paul Clements, Rick Kazman

  2. ATAM • Architecture Tradeoff Analysis Method • A thorough and comprehensive way to evaluate a software architecture • This is hard: – Large systems are very complicated. – Evaluation must weigh technical decisions against business goals. – Careful management of the evaluation process is needed to consider multiple stakeholders in a review format. Focus on RISK!!!

  3. Participants in the ATAM • Three basic groups/perspectives: – The evaluation team ~ 3-5 people – Project decision makers ~ project manager, customer, architect – Architecture stakeholders ~ varies

  4. Outputs • Concise presentation of the architecture • Clearly defined/refined business goals • Quality requirements via a collection of scenarios • Mapping of architectural decisions to quality requirements – Tactics and patterns used to satisfy qualities • A set of risks and non-risks • A set of risk themes

  5. Phases • Phase 0: partnership and preparation • Phases 1 and 2: evaluation – Phase 1 has 6 steps – Phase 2 has 3 steps with all stakeholders • Phase 3: follow-up via written report

  6. Phase 0: Partnership and Preparation Participants: • Evaluation team • Project decision makers • Architecture stakeholders Preparation: • Team leadership and key project decision makers meet informally to work out the details of the exercise. • Project representatives brief evaluators about the project so the team can be supplemented by people with the necessary expertise. • Work out logistics. • Identify stakeholders. • Define what's expected for Phase 1.

  7. Phase 1: Evaluation Activity: • Evaluation Participants: • Evaluation team and key project decision makers Duration: • Varies; ~1 day for large projects, followed by a hiatus of ~1 to 2 weeks

  8. Phase 1 1. Present the ATAM process 2. Present the business drivers 3. Present the architecture 4. Identify architectural approaches 5. Generate the quality attribute utility tree (table) 6. Analyze the architectural approaches

  9. Present the ATAM process • Evaluation leader presents the ATAM process to project representatives. – Explain the process – Answer questions – Set context and expectations – Modify as necessary

  10. Present the business drivers • Everyone involved needs to understand the context of the system, and the primary business drivers motivating development. • Should describe: – System's most important functions – Relevant technical, managerial, economic, political constraints – Business goals as they relate to the project – Major stakeholders – Architectural drivers

  11. Present the architecture • Presented by the lead architect • "Appropriate" level of detail • Cover architectural constraints: – OS, HW, middleware, language, framework, etc. • Architectural approaches: – Tactics (and patterns) used to meet architectural requirements

  12. Architecture Presentation • Description: elevator pitch • System context • Use cases: list them; trace through the 1 or 2 most important • Key qualities (business drivers) that affect the structure of the system • Key quality attribute scenarios with quantitative responses • Tactics • High-level design • Layers, common application framework • Key design patterns for each layer • Key class diagrams • Sequence diagrams (one for each use case) • Skeletal system: thread through the system, focusing on risk and validating qualities

  13. Identify architectural approaches • Analyze the architecture by analyzing its approaches • Common application architectures and patterns • Known ways in which each pattern/approach affects particular quality attributes • Example: – The layered pattern tends to bring portability, maintainability, and distributability to a system, possibly at the expense of performance. – A data-centered architecture tends to improve data integrability, possibly at the cost of distributability. – What are the consequences of a service layer? • The evaluation team should catalog the patterns and approaches, as sketched below.
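Such a catalog can be kept as simple structured data. Below is a hypothetical Python encoding (the pattern names and effects are illustrative, not taken from the book) that an evaluation team could use to flag where to probe for tradeoffs in a given set of approaches.

    # Hypothetical catalog: how common patterns tend to affect quality
    # attributes ("+" promotes, "-" inhibits). Entries are illustrative.
    PATTERN_EFFECTS = {
        "layered": {"portability": "+", "maintainability": "+",
                    "distributability": "+", "performance": "-"},
        "data-centered": {"integrability": "+", "distributability": "-"},
        "service-layer": {"modifiability": "+", "performance": "-"},
    }

    def candidate_tradeoffs(patterns):
        """Yield the quality attributes each pattern may help or hurt,
        telling the evaluation team where to probe for risks."""
        for pattern in patterns:
            for attribute, effect in PATTERN_EFFECTS.get(pattern, {}).items():
                yield pattern, attribute, effect

    for pattern, attribute, effect in candidate_tradeoffs(["layered", "data-centered"]):
        print(f"{pattern}: {attribute} ({effect})")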

  14. Generate Quality Attribute Utility Tree/Table • The architecture's suitability to deliver quality attributes to the system. • A high-performance system may be totally wrong for a system in which performance is not nearly as important as security. • Articulate quality attribute goals: identify, prioritize, and refine the system's most important quality attribute goals, expressed as scenarios.

  15. QA Utility Tree/Table

  Quality Attribute | Attribute Refinement  | Scenario (Importance, Difficulty)
  Performance       | Throughput            | At peak load, the system is able to complete 150 normalized transactions per second. (M, M)
  Usability         | Proficiency training  | A user in a particular context asks for help, and the system provides help for that context. (H, L)
  Availability      |                       | The system supports 24/7 Web-based account access by patients. (L, L)
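The utility tree is just a prioritized hierarchy, so it is easy to represent directly. A minimal Python sketch (names and structure are illustrative) that stores each leaf with its (importance, difficulty) rating and sorts the high-importance, high-difficulty scenarios to the front:

    # Each leaf of the utility tree: quality attribute, refinement,
    # scenario text, and an (importance, difficulty) rating of H/M/L.
    utility_tree = [
        ("Performance", "Throughput",
         "At peak load, the system completes 150 normalized "
         "transactions per second.", ("M", "M")),
        ("Usability", "Proficiency training",
         "A user asks for help in context; the system provides "
         "help for that context.", ("H", "L")),
        ("Availability", "",
         "The system supports 24/7 Web-based account access "
         "by patients.", ("L", "L")),
    ]

    RANK = {"H": 2, "M": 1, "L": 0}

    # Sort so high-importance, high-difficulty scenarios come first:
    # those are the ones the evaluation should analyze first.
    prioritized = sorted(
        utility_tree,
        key=lambda leaf: (RANK[leaf[3][0]], RANK[leaf[3][1]]),
        reverse=True,
    )
    for qa, refinement, scenario, rating in prioritized:
        print(rating, qa, "-", scenario)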

  16. Analyze Architectural Approaches • Examine the highest-ranked scenarios one at a time • Probe the architectural approaches used to carry out each scenario • Document relevant architectural decisions and identify risks, non-risks, sensitivity points, and tradeoffs – Example: the number of simultaneous database clients will affect the number of transactions that can be processed per second.
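One way to keep this step's bookkeeping honest is a small record per architectural decision. A hypothetical Python sketch (the field names and example values are illustrative, not prescribed by the method):

    from dataclasses import dataclass, field

    @dataclass
    class DecisionAnalysis:
        """Record produced while probing one architectural decision
        against one scenario."""
        decision: str
        scenario: str
        sensitivity_points: list = field(default_factory=list)
        tradeoffs: list = field(default_factory=list)
        risks: list = field(default_factory=list)
        nonrisks: list = field(default_factory=list)

    # The slide's example: client count is a sensitivity point for
    # throughput, and a risk if the expected load was never validated.
    analysis = DecisionAnalysis(
        decision="Single shared database",
        scenario="150 normalized transactions/second at peak load",
        sensitivity_points=["number of simultaneous database clients "
                            "affects transactions per second"],
        risks=["expected peak client count has not been measured"],
    )
    print(analysis)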

  17. Phase 2 1. Brainstorm and prioritize scenarios 2. Analyze the architectural approaches 3. Present results

  18. Brainstormed Scenarios

  Number | Scenario
  1      | Previously public data is made private, and access is adjusted accordingly.
  6      | Decide to support German.
  9      | A user requests a new field for asynchronous queries.
  21     | Introduce a new workflow process for patient check-in and check-out.
  29     | A fire in the data center forces the information hub to be moved to a new location.
  33     | George quits.
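Prioritization of the brainstormed scenarios is typically done by stakeholder voting. A minimal sketch, assuming a simple tally; the ballots and the vote-allocation rule here are illustrative assumptions, not part of the method's definition:

    from collections import Counter

    # Hypothetical ballots: each stakeholder spends a fixed number of
    # votes on scenario numbers from the brainstormed list above.
    ballots = {
        "maintainer": [1, 9, 9, 21],
        "operator":   [29, 29, 33, 21],
        "customer":   [6, 21, 29, 1],
    }

    tally = Counter(vote for votes in ballots.values() for vote in votes)

    # The highest-voted scenarios are carried into the analysis step.
    for scenario, votes in tally.most_common():
        print(f"scenario {scenario}: {votes} vote(s)")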

  19. Analyze Architectural Approaches • The evaluation team guides the architect in carrying out the highest-ranked scenarios identified during brainstorming. • The architect explains how each scenario is addressed.

  20. Present Results • Summarize and present ATAM results – Architectural approaches documented – Set of scenarios and their prioritization from the brainstorming – Utility table or similar – Risks discovered – Non-risks documented – Sensitivity points and tradeoff points found

  21. ATAM Summary • Not an evaluation of requirements • Not a code evaluation • Does not include actual system testing • Not precise, but identifies possible risk areas within the architecture • In actual practice: amazement that so many risks can be found in such a short time.

  22. The CBAM (Cost Benefit Analysis Method) • The biggest tradeoffs in large, complex systems usually have to do with economics. • How should an organization invest its resources to maximize gain and minimize risk? • Economics includes the cost to build, but also the benefits that an architecture delivers.

  23. Decision-Making Context • Begins with the ATAM results • Add in costs and benefits associated with the architectural decisions • The stakeholders decide if they should: – Use redundant hardware; failover; load balance – Save cost on this project, invest in another • Provides a framework for making decisions • Helps clarify ROI – the ratio of benefit to cost

  24. Utility • Definition: the benefit gained by system stakeholders • Variation: take a set of ATAM scenarios and vary the value of their responses, which gives a utility-response curve

  25. Utility Response Curves • Start with a collection of scenarios • For each scenario, see how they differ in their quality attribute responses • Assign utility based on the importance of these values: – Best case (a 0.1 s response time is instantaneous to a person, so 0.03 s doesn't matter) = 100 – Worst case (minimum requirement) = 0 – Current (relative to best and worst) = x% – Desired (relative to best and worst) = y% • Generate curves for all scenarios across architectural strategies, as sketched below.
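A utility-response curve can be approximated by interpolating between the plotted points. A minimal Python sketch, assuming a response-time scenario where lower is better and utility is piecewise linear between neighboring points (the curve shape and values are illustrative):

    # Plotted points for one hypothetical response-time scenario:
    # (response in seconds, utility). Best case 0.1 s -> 100,
    # worst acceptable case 2.0 s -> 0, with two points between.
    curve = [(0.1, 100.0), (0.5, 80.0), (1.0, 40.0), (2.0, 0.0)]

    def utility(response, points):
        """Piecewise-linear interpolation over (response, utility) pairs."""
        points = sorted(points)
        if response <= points[0][0]:
            return points[0][1]   # at/better than best case: no extra value
        if response >= points[-1][0]:
            return points[-1][1]  # at/worse than worst case
        for (x0, y0), (x1, y1) in zip(points, points[1:]):
            if x0 <= response <= x1:
                return y0 + (y1 - y0) * (response - x0) / (x1 - x0)

    print(utility(0.75, curve))  # utility of a 0.75 s response -> 60.0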

  26. Utility Response Curves • Determine quality attribute levels for best-case and worst-case situations • Best case: no value is added above the best case • Worst case: the minimum acceptable threshold at which the system must perform • Must determine current and desired levels

  27. Architectural Strategies • Determine architectural strategies to move from the current quality attribute response to the desired or best-case level • Calculate the utility of the expected value by interpolating from the four values already plotted • Consider the effect of each architectural strategy, and its cost

  28. Utility Response Curves • Determine benefit and normalize • Calculate the overall utility of an architectural strategy across scenarios by summing the utility from each utility-response curve: – B_i = Sum_j (b_ij * w_j), where the weights w_j are normalized – b_ij = U_expected - U_current – ROI: R_i = B_i / C_i
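Worked in code, the benefit and ROI computation is only a few lines. A minimal sketch, assuming illustrative utilities, weights, and cost for one strategy i:

    # Per-scenario utilities for one architectural strategy i:
    # expected utility after the change, current utility, and the
    # scenario's normalized weight w_j (weights sum to 1). Values assumed.
    scenarios = [
        # (U_expected, U_current, w_j)
        (90.0, 60.0, 0.5),
        (70.0, 65.0, 0.3),
        (50.0, 50.0, 0.2),
    ]
    cost = 120.0  # C_i, in whatever cost units the organization uses

    # b_ij = U_expected - U_current; B_i = Sum_j (b_ij * w_j)
    benefit = sum((u_exp - u_cur) * w for u_exp, u_cur, w in scenarios)
    roi = benefit / cost  # R_i = B_i / C_i

    print(f"B_i = {benefit:.1f}, R_i = {roi:.3f}")  # B_i = 16.5, R_i = 0.138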

  29. Architectural Strategies • How do you move from the current quality attribute response level to the desired or best-case level? • What would be a strategy for – improving system response time? – increasing capacity? • Fifth data point: derive the expected value of the new response; its utility is interpolated from the original four values (see the sketch below) • Watch out for side effects
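The "fifth data point" is just one more interpolation query against the four plotted values. A minimal sketch with assumed values, using the same piecewise-linear approach as the earlier curve example:

    # Four plotted (response seconds, utility) points for one scenario:
    # best, desired, current, worst (values assumed).
    best, desired, current, worst = (0.1, 100.0), (0.5, 80.0), (1.0, 40.0), (2.0, 0.0)

    # A strategy is predicted to move the response from 1.0 s to 0.6 s.
    # The expected utility (the "fifth point") is interpolated from the
    # two neighboring plotted points, here desired (0.5 s) and current (1.0 s).
    new_response = 0.6
    (x0, y0), (x1, y1) = desired, current
    expected = y0 + (y1 - y0) * (new_response - x0) / (x1 - x0)
    print(expected)  # 72.0

A side effect to watch for: the same strategy may also move the response levels of other scenarios, so its expected utility must be recomputed on every affected curve, not just the one it targets.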
