

SLIDE 1

TPM Workshop

Introduction to Multi-Objective Investment Decision-Making

New York City, New York

December 3-4, 2019

SLIDE 2

Debra Nelson, NYMTC

Welcome and Summary of TPM Activities to Date

SLIDE 3

Introduction and Overview

SLIDE 4

Agenda – Day 1 AM

10:00 Welcome, TPM Summary
10:20 FHWA Introduction, Overview
10:35 Overview TPM Toolbox & Guidebook, MODA
10:55 Discussion
11:20 Practice Example: DVRPC
11:45 Lunch

SLIDE 5

Agenda – Day 1 PM

1:00 Steps to Implementing MODA
2:00 Introduce Exercises
2:10 Break
2:25 Exercise 1: Determining Goals, Objectives, Measures
3:55 Discussion, Report Out
4:25 Day 1 Wrap Up

SLIDE 6

Agenda – Day 2 AM

9:00 Recap and Discussion
9:10 Exercise 2: Weighing Objectives and Measures
10:10 Discussion, Report Out
10:30 Break
10:35 Software Tour

SLIDE 7

Agenda – Day 2 AM (continued)

11:05 Exercise 3: Investment Prioritization Using MODAT
12:05 Discussion, Report Out
12:25 Workshop Wrap Up and Feedback

SLIDE 8

Why TPM?

SLIDE 9

TPM Elements

SLIDE 10

Rulemakings

TPM-Related Rules

Rule | Effective Date | Regulatory Chapter
Safety Performance Measures (PM1) | April 14, 2016 | 23 CFR 490 (Subpart A & B)
Highway Safety Improvement Program (HSIP) | April 14, 2016 | 23 CFR 924
Statewide and Non-Metropolitan Planning; Metropolitan Planning | June 27, 2016 | 23 CFR 450
Highway Asset Management Plans for NHS | October 2, 2017 | 23 CFR 515 & 667
Pavement and Bridge Condition Measures (PM2) | May 20, 2017 | 23 CFR 490 (Subpart A, C & D)
Performance of the NHS, Freight, and CMAQ Measures (PM3)* | May 20, 2017 | 23 CFR 490 (Sub. A, E, F, G, H)

* Except for the GHG measure (the percent change in tailpipe CO2 emissions on the NHS compared to the 2017 level)

SLIDE 11

Required Plans

Multimodal Plans: State/MPO Long Range Transportation Plans; State/MPO Transportation Improvement Programs
Safety: Strategic Highway Safety Plan (SHSP); Highway Safety Improvement Program (HSIP)
Infrastructure Condition: Transportation Asset Management Plan (TAMP)
Congestion/Air Quality: CMAQ Performance Plan
Freight: State Freight Plan
Transit: Transit Safety Plan; Transit Asset Management Plan

SLIDE 12

TPM Roles and Responsibilities

• USDOT – Performance Measure Rules include:
  • Establish measures; identify data sources; define metrics
  • Report to Congress
  • Stewardship and oversight
• States and MPOs
  • Establish targets
  • Support national goals in the planning process and consider measures and targets in long range plans and programs
  • Report progress to USDOT (States)

SLIDE 13

State DOT and MPO Roles

• Identify available and needed data
• Coordinate with other agencies
• Establish coordinated targets
• Collect and submit required data
• Report progress

SLIDE 14

FHWA Role

• FHWA is committed to your success!
• Headquarters provides guidance and develops policies and tools
• Divisions are responsible for program delivery
• The Resource Center provides technical assistance and training

SLIDE 15

Overview of TPM Toolbox, Capability Maturity Model, Guidebook and MODA

SLIDE 16

TPM Toolbox Elements

SLIDE 17

TPM Framework

SLIDE 18

TPM Framework Component 4

Performance-Based Programming

SLIDE 19

TPM Capability Maturity Model (CMM)

• Assess the current state of your agency with respect to TPM
• Identify a logical set of improvements
• Show the benefit of moving to higher maturity levels
• Can assess each TPM component separately

SLIDE 20

Assessment for Performance-Based Programming

CMM Level Definitions

1. Initial – Programming decisions not linked to goals or planning documents; lack of transparency. Based on formulas and history, not on analysis of performance.
2. Developing – Developing performance-based programming that reflects agency goals, priorities, funding constraints, risk factors, and relative needs across performance areas.
3. Defined – Methods and processes defined and incorporated into long-range plans.
4. Functioning – Established and documented methodology and process within and across performance areas to maximize achievement of multiple goals. Clear linkage between investments and expected performance outputs and outcomes.
5. Sustained – Performance-based programming applied for multiple cycles. Feedback loop between monitoring and programming. Process is periodically refined.

SLIDE 21

Emerging Practice in Programming Projects

Legacy Practice
• Allocation based on formulas, historical trends
• Programming at region level, validated by DOT HQ
• Performance-based principles, but focused on reporting

Emerging Practice
• Links planning and programming
• Considers cross-asset impacts
• Flexible criteria
• Aligns with target-setting

SLIDE 22

Challenges in Programming Across Areas

• Many investment needs
• Tightly constrained resources
• Multiple competing objectives that are difficult to compare:
  • Mobility
  • Safety
  • Preservation
  • Environment
SLIDE 23

Multi-Objective Decision Analysis (MODA)

• Structured approach for choosing between a set of alternatives considering multiple objectives
• Approach for supporting TPM Framework Area 4.2 – Programming Across Program Areas
• Requires quantifying one's objectives and the value obtained with respect to each objective by each alternative under consideration
• Also referred to as Multi-Criteria Decision-Making (MCDM), among other acronyms

SLIDE 24

Benefits of Using MODA to Prioritize

• Better alignment of resource allocation decisions with agency goals and objectives
• Clear definition of goals and objectives
• Performance-based
• Data-driven
• More efficient use of scarce resources
• Repeatability
• Transparency

SLIDE 25

Considerations in Implementing MODA

Scope
• What types of investments will be prioritized?
• How many funding periods are considered?
• How will the results be used?

Methodology
• Generating candidate investments
• Relating investment objectives to performance measures
• Scaling performance measures to approximate the benefit of performing work
• Weighing competing objectives

Data
• Need predicted performance for each measure representing the full range of objectives

SLIDE 26

Relevant Research

NCHRP Report 806 (2015)
• Cross-asset resource allocation approaches for asset management
• Spreadsheet tool and guidance

NCHRP Report 921 (2019)
• Updated the NCHRP Report 806 spreadsheet tool and created a new web tool
• Case studies illustrating MODA applications

SLIDE 27

Discussion

SLIDE 28

Practice Example

Delaware Valley Regional Planning Commission

SLIDE 29

Lunch (75 min)

SLIDE 30

Steps in Implementing MODA for Investment Prioritization

SLIDE 31

Programming Across Performance Areas

1. Establish Roles & Responsibilities – identify and assign internal roles and responsibilities
2. Clarify Purpose – clarify the purpose of cross-performance-area prioritization
3. Develop Methodology – develop a methodology that reflects agency priorities and external stakeholder interests
4. Document Process – document the process

SLIDE 32

Step 1
Identify and Assign Internal Roles and Responsibilities

• Project curator – who will facilitate project submissions?
• Criteria selection team – who will develop the criteria by which projects will be evaluated for inclusion in the STIP or TIP?
• Data reporters – who is responsible for reporting the data that will be used to assess projects?
• Analysts – who will evaluate the potential projects based on the criteria, and determine what methodology will be used?
• Decision maker – who will finalize and approve the selection of projects?
• Liaison – who will communicate progress to the agency and gather feedback from those not involved in process development?

SLIDE 33

Step 2
Clarify the Purpose of Cross-Performance-Area Prioritization

High-Level Goals
• Transparency and Accountability
• Maximize Efficiency (Funds, ROI)
• Maximize Effectiveness (System Performance)
• Standardize Performance Criteria
• Improve Repeatability
• Mitigate Risks
SLIDE 34

Detailing the Scope and Purpose

What will be prioritized?
• Which modes?
• Which types of investments?
• What is the timeframe for analysis?

What should the process yield?
• A prioritized list of investments?
• A recommended resource allocation?

How will results be used?
• As an initial list of investments to fund?
• As an approach for narrowing the candidate pool of investments?
• To support setting targets?

SLIDE 35

Step 3
Develop the Methodology

3.1 Define goals and objectives
3.2 Select performance measures and evaluation criteria
3.3 Assess data and analytical capabilities
3.4 Prototype the approach
3.5 Set weights on goals and objectives
3.6 Apply the model

SLIDE 36

Step 3.1
Define Goals and Objectives

What it takes
• Review the agency's overall goals and the objectives that support those goals
• Determine which of these are relevant given the scope of the process
• Attempt to resolve any gaps or overlaps between objectives
SLIDE 37

Step 3.1
Define Goals and Objectives

Example Goals
• Mobility
• Preservation
• Safety
• Security
• Resilience
• Environment
• Community
• Economic Development
• Accessibility
• Environmental Justice
SLIDE 38

Step 3.2
Select Performance Measures and Evaluation Criteria

What it takes
• Identify the performance measures that will be used to quantify each goal and objective
• Determine how to scale each measure
  • Typically measures are scaled such that the value for a goal is expressed on a scale from 0% (lowest value) to 100% (highest value)
• Determine how to combine measures where multiple measures contribute to a goal
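The 0%-100% scaling described above is essentially a min-max normalization. A minimal sketch follows; the travel-time savings values are hypothetical, purely for illustration:

```python
def scale_measures(raw_values):
    """Min-max scale raw measure values so the worst observed value
    maps to 0.0 (0%) and the best maps to 1.0 (100%)."""
    lo, hi = min(raw_values), max(raw_values)
    if hi == lo:  # all candidates identical on this measure
        return [0.0 for _ in raw_values]
    return [(v - lo) / (hi - lo) for v in raw_values]

# Hypothetical travel-time savings (hours/year) for four candidates
savings = [1200.0, 400.0, 2500.0, 400.0]
scaled = scale_measures(savings)  # best candidate scores 1.0, worst 0.0
```

Scaled in this way, every measure speaks the same 0-1 language, so measures from different performance areas can later be weighted and combined.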

SLIDE 39

Step 3.2
Select Performance Measures and Evaluation Criteria

Considerations in Scaling Measures
• Measures should be structured such that the measure is proportional to the overall value yielded by the investment
• May need to multiply the measure by some proxy value if the measure does not scale appropriately
• Can utilize parameters from benefit/cost analyses to help facilitate scaling

SLIDE 40

Example: MDOT Accessibility Measures

Measure 1: Job Accessibility
• Used a software tool to determine the increased number of jobs accessible within a 45-minute commute from disadvantaged communities
• Divided the result by a maximum value to put it on a 0-1 scale

Measure 2: Economic Development Impact on Low-Income Communities
• Completed a checklist of factors related to this topic to determine a score on a scale from 0-5
• Multiplied the result by the estimated development area
• Divided the result by a maximum value to put it on a 0-1 scale

MDOT decided to weigh Measures 1 and 2 equally and combine them to obtain an overall value.
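The divide-by-maximum scaling and equal weighting of the two accessibility measures can be sketched as below. All project values (job counts, checklist scores, development areas) are hypothetical placeholders, not MDOT data:

```python
# Hypothetical raw results for three candidate projects
jobs_added = [12000, 30000, 6000]   # Measure 1: new jobs within 45 min
dev_score = [2.5, 1.0, 4.0]         # Measure 2: checklist score (0-5)
dev_area = [10.0, 40.0, 5.0]        # estimated development area

# Scale each measure by dividing by its maximum (0-1 scale)
m1 = [v / max(jobs_added) for v in jobs_added]
weighted_dev = [s * a for s, a in zip(dev_score, dev_area)]
m2 = [v / max(weighted_dev) for v in weighted_dev]

# Equal weights on the two measures give the overall value
overall = [0.5 * a + 0.5 * b for a, b in zip(m1, m2)]
```

Note that Measure 2 is first multiplied by the development area so that the checklist score scales with the size of the impact, matching the proportionality consideration on the previous slide.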

SLIDE 41

Example: Caltrans Air Quality and Health Measures

Measure 1: Emissions Reduction
• Calculated the annual reduction in fuel consumption
• Multiplied the result by the environmental cost of a gallon of fuel, determined using the Cal-B/C benefit-cost model

Measure 2: Health Activity Benefit
• Estimated the number of new cyclists and pedestrians
• Multiplied that number by the annual health benefit in dollars per new cyclist/pedestrian, determined in a separate NCHRP study

Caltrans summed Measures 1 and 2 to obtain the total benefit, then divided the total by a maximum value to obtain results on a 0-1 scale.
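In outline, the Caltrans approach monetizes both measures so they can be summed before scaling. Every dollar figure and quantity below is an assumed placeholder for illustration, not a Cal-B/C or NCHRP value:

```python
# Hypothetical inputs for one candidate project
fuel_saved_gallons = 50000.0       # annual reduction in fuel consumption
env_cost_per_gallon = 1.10         # $/gallon environmental cost (assumed)
new_active_travelers = 300         # estimated new cyclists/pedestrians
health_benefit_per_person = 400.0  # $/year per new traveler (assumed)

# Monetize each measure, then sum into a total annual benefit
emissions_benefit = fuel_saved_gallons * env_cost_per_gallon
health_benefit = new_active_travelers * health_benefit_per_person
total_benefit = emissions_benefit + health_benefit

# Divide by the largest total benefit among candidates (assumed)
max_benefit = 250000.0
score = total_benefit / max_benefit  # result on a 0-1 scale
```

Because both measures are already in dollars, no extra weighting is needed before summing, which is exactly the "develop monetized measures" shortcut discussed later under setting weights.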

SLIDE 42

Step 3.3
Assess Data and Analytical Capabilities

What it takes
• Obtain data – several approaches:
  • Direct measurement
  • Predictive models
  • Representative defaults
  • Subjective judgment
• To overcome data challenges:
  • Revisit scope and measures
  • Collect more data

SLIDE 43

Example: MDOT Assessing Data Capabilities

What they did
• Held a series of workshops to select performance measures and assess data and analytical capabilities

How they did it
• Included relevant stakeholders and discussed each potential measure to determine the feasibility of gathering data and performing the analysis

Why they did it
• A time-effective way to reduce the number of iterations

SLIDE 44

Example: DVRPC Assessing Data Capabilities

What they did
• Implemented simpler performance measures initially and will build data capabilities over time

How they did it
• Selected measures for which data was readily available

Why they did it
• Saw the challenges of collecting data and the need to select performance measures with data requirements that could be met by small MPOs with limited capabilities

SLIDE 45

Step 3.4
Prototype the Approach

What it takes
• Collect data for a sample set of projects
• Calculate a score or utility for each project
• Prioritize the sample set
• Review and assess the results

SLIDE 46

Example: VDOT Using a Spreadsheet to Prototype

Source: Adapted from the VDOT Smart Scale Project Scores spreadsheet, which shows basic project information, performance measure scores, the project benefit score, the score divided by cost, and the resulting rank.
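The score-divided-by-cost ranking in the spreadsheet can be sketched in a few lines; the project names, benefit scores, and costs below are hypothetical:

```python
# Hypothetical rows mimicking the spreadsheet columns:
# (project, benefit score, cost in $M)
projects = [
    ("A", 45.0, 10.0),
    ("B", 30.0, 3.0),
    ("C", 60.0, 25.0),
]

# Divide each benefit score by cost, then rank highest ratio first
ranked = sorted(projects, key=lambda p: p[1] / p[2], reverse=True)
order = [name for name, _, _ in ranked]
```

Dividing by cost rewards inexpensive projects that deliver a large share of their potential benefit, so a modest project can outrank a high-scoring but costly one.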

SLIDE 47

Step 3.5
Set Weights on Goals and Objectives

What it takes
• Typically need to establish weights on each goal or objective to determine how to compare the benefit of achieving one versus another
• May also need to establish weights on individual measures if they are combined to determine the score of a given objective

SLIDE 48

Step 3.5
Set Weights on Goals and Objectives

Common Approaches
• Delphi Approach
• Analytical Hierarchy Process
• Swing Weighting

How to simplify or sidestep this
• Develop monetized measures – these can be summed without additional weights if structured appropriately
• Use Data Envelopment Analysis to select efficient investments using scores for each objective

SLIDE 49

Delphi Approach
• Assemble an expert panel
• Review goals and measures, illustrating calculations with example projects
• Conduct a first round of voting: each participant specifies weights on each goal
• Review the initial results through group discussion, showing results for example projects
• Conduct successive rounds of voting until consensus is reached
• Use the consensus or average results as initial weights
SLIDE 50

Analytical Hierarchy Process
• Described in NCHRP Report 806
• Popularized in commercial software products
• Requires evaluating the relative importance of each goal/objective/measure pair
• Practical when the number of pairs is small:
  • 4 goals = 6 pairs
  • 7 goals = 21 pairs
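The pair count grows as n(n-1)/2, which is why the method becomes impractical beyond a handful of goals. One common way to turn the pairwise judgments into weights (not necessarily the exact method in the NCHRP 806 tool) is the row geometric mean of the comparison matrix; the 1-9 scale judgments below are hypothetical:

```python
import math

def pair_count(n_goals):
    """Number of pairwise comparisons needed for n goals."""
    return n_goals * (n_goals - 1) // 2

def ahp_weights(matrix):
    """Approximate AHP priority weights via the row geometric mean."""
    n = len(matrix)
    gm = [math.prod(row) ** (1.0 / n) for row in matrix]
    total = sum(gm)
    return [g / total for g in gm]

# Hypothetical comparisons for three goals: safety judged 3x as
# important as mobility and 5x as important as preservation;
# mobility judged 2x as important as preservation
M = [
    [1.0,   3.0, 5.0],
    [1 / 3, 1.0, 2.0],
    [1 / 5, 1 / 2, 1.0],
]
weights = ahp_weights(M)  # safety gets the largest weight
```

Each cell below the diagonal is the reciprocal of its mirror, so only the pairs above the diagonal actually have to be elicited from the panel.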
SLIDE 51

Swing Weighting
• Determine the best and worst score observed for each objective across a set of candidate investments
• Create a set of fictional investments:
  • One worst-case investment with the lowest observed score on every objective
  • One for each objective, where the score for that objective is the highest observed and the scores for the other objectives equal the worst case
• Rank-order the fictional alternatives
• Rate the fictional alternatives from 0 (worst) to 100 (best)
• Normalize the ratings so they sum to 1 (divide each by the sum)
• The weight for each objective is the normalized rating of the corresponding fictional investment
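The procedure above can be sketched directly. The observed scores and the panel's 0-100 ratings below are hypothetical:

```python
def build_swing_alternatives(scores_by_objective):
    """scores_by_objective: {objective: list of observed 0-1 scores}.
    Returns the worst-case alternative plus one 'swing' alternative
    per objective (that objective at its best, the rest at worst)."""
    worst = {obj: min(s) for obj, s in scores_by_objective.items()}
    best = {obj: max(s) for obj, s in scores_by_objective.items()}
    alts = {"worst-case": dict(worst)}
    for obj in scores_by_objective:
        alt = dict(worst)
        alt[obj] = best[obj]
        alts[f"swing-{obj}"] = alt
    return alts

def swing_weights(ratings):
    """ratings: {objective: 0-100 rating of its swing alternative}.
    Normalize so the weights sum to 1."""
    total = sum(ratings.values())
    return {obj: r / total for obj, r in ratings.items()}

# Hypothetical observed scores for three objectives
scores = {"safety": [0.2, 0.9], "mobility": [0.1, 0.7], "env": [0.0, 0.5]}
alts = build_swing_alternatives(scores)

# After the panel rank-orders and rates the swing alternatives:
weights = swing_weights({"safety": 100, "mobility": 60, "env": 40})
```

The panel only ever compares concrete (if fictional) investments, which is why swing weighting tends to produce weights that reflect the actual range of outcomes on the table rather than abstract importance.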

SLIDE 52

Step 3.6
Apply the Model

What it takes
• Identify candidate investments
• Calculate measures for each candidate
• Prioritize using weights
• Consider other factors
• Optional: perform additional tradeoff analysis
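Once weights and scaled scores exist, prioritization reduces to a weighted-sum utility per candidate. A minimal sketch, with hypothetical candidate investments and scores:

```python
def project_score(measure_scores, weights):
    """Weighted-sum utility: both arguments are dicts keyed by
    objective; scores are assumed already scaled to 0-1."""
    return sum(weights[obj] * measure_scores[obj] for obj in weights)

weights = {"safety": 0.5, "mobility": 0.3, "env": 0.2}  # from Step 3.5

# Hypothetical scaled scores for three candidate investments
candidates = {
    "Intersection upgrade": {"safety": 0.9, "mobility": 0.4, "env": 0.1},
    "Transit expansion":    {"safety": 0.2, "mobility": 0.8, "env": 0.7},
    "Trail project":        {"safety": 0.1, "mobility": 0.2, "env": 0.9},
}

# Rank candidates from highest to lowest weighted score
ranked = sorted(candidates,
                key=lambda name: project_score(candidates[name], weights),
                reverse=True)
```

The "consider other factors" and tradeoff-analysis bullets matter because this ranking is only as good as the weights; rerunning the sort with perturbed weights is a simple form of the sensitivity analysis used later in Exercise 3.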

SLIDE 53

Step 4
Document the Process

• How the process was established and conducted
• Project selection criteria
• Formulas for project evaluation and justification
• Why goal areas were prioritized
• Impacts on performance from tradeoff analyses
• What alternatives were not chosen and why
• Roles and responsibilities
• Project eligibility
• Project submission process
• Timeline for submission, evaluation, and publication of final results
• Input received from stakeholders
• Risk factors
• Output targets that can be used to track anticipated effects

SLIDE 54

Introduce Small Group Exercises

SLIDE 55

Exercise Objectives
• Exercise 1: Determine how the agency's objectives relate to specific performance measures for one of the investment objectives
• Exercise 2: Set weights for all of the investment objectives
• Exercise 3: Use MODAT to walk through the prioritization process

SLIDE 56

Exercise Logistics
• Determine roles:
  • Data owner – will capture your information on the worksheet in part 1 of the exercise and run the software when we get to part 3
  • Facilitator – will lead the discussion and help the group come to a consensus
  • Reporter – will report out results to the group
  • Participants – will participate in the exercise; you will be working as a team
• Assumptions going into the exercise:
  • Dataset preloaded ahead of the session
  • There are predetermined performance areas
  • There is a defined approach for scaling

SLIDE 57

Workshop Exercise: Measure Weights
• Within assigned groups, review and discuss:
  • 1. Background
• Complete (in 2. Exercise Tasks):
  • Task 1. Identify the performance measures that contribute to your objective
  • Task 2. Set weights on the performance measures, either for all measures together or by sub-objective
  • Task 3. Discuss within your groups and capture how you weighted the measures for your objective
  • Task 4. Report out to the group on the results

SLIDE 58

Break (15 min)

SLIDE 59

Exercise 1: Determining Goals, Objectives and Measures

SLIDE 60

Workshop Exercise: Measure Weights
• Within assigned groups, review and discuss:
  • 1. Background
• Complete (in 2. Exercise Tasks):
  • Task 1. Identify the performance measures that contribute to your objective
  • Task 2. Set weights on the performance measures, either for all measures together or by sub-objective
  • Task 3. Discuss within your groups and capture how you weighted the measures for your objective
  • Task 4. Report out to the group on the results

SLIDE 61

Discussion, Report Out

SLIDE 62

Day 1 Wrap Up

SLIDE 63

Day 2 Recap and Discussion

SLIDE 64

Exercise 2: Weighing Objectives and Measures

SLIDE 65

Workshop Exercise: Objective Weights
• Review and discuss:
  • 1. Background
• Complete (in 2. Exercise Tasks):
  • Task 1. Review Exercise 1 Results
  • Task 2. Make Pairwise Comparisons
  • Task 3. Discuss

SLIDE 66

Discussion, Report Out

SLIDE 67

Break (5 min)

SLIDE 68

Software Tour

SLIDE 69

Exercise 3: Investment Prioritization Using MODAT

SLIDE 70

Workshop Exercise: Using MODAT
• Review and discuss:
  • 1. Background
• Complete (in 2. Exercise Tasks):
  • Task 1. Enter Measure Weights in MODAT
  • Task 2. Enter Objective Weights in MODAT
  • Task 3. Review modeling assumptions
  • Task 4. Prioritize
  • Task 5. Perform a Sensitivity Analysis
  • Task 6. Discuss your results within your group
  • Task 7. Report out to the group the highlights of the exercise, discussion, and results

SLIDE 71

Discussion, Report Out

SLIDE 72

Workshop Wrap Up

SLIDE 73

Summary
• Overview of multi-objective decision analysis (MODA) as a performance-based approach for programming across investment areas
• Steps in implementing MODA
• Practice examples
• Use of MODAT
• Exercises in weighing measures and stepping through the prioritization process using MODAT

SLIDE 74

Session Feedback

Before you leave today, please capture the following on the Post-Its at the back of the room:
• What did you learn in this workshop?
• What more would you like to learn?
• What will be most challenging for your agency to incorporate?
• What will be easiest for your agency to incorporate?
• How can we improve future sessions?