Introduction to Multi-Objective Investment Decision-Making (2019) - PowerPoint Presentation


SLIDE 1

TPM Workshop

Introduction to Multi-Objective Investment Decision-Making

2019 Conference on Performance and Data in Transportation Decision Making Atlanta, Georgia

September 18, 2019

SLIDE 2
  • Introduction and Overview
  • Practice Example: Maryland DOT
  • Tradeoff Decision-Making, MODA, and MODAT
  • Small Group Exercise: Tradeoff Analysis Using MODAT
  • Break
  • Small Group Exercise (continued)
  • Workshop Wrap Up

2

Workshop Overview

SLIDE 3

Introduction and Overview

SLIDE 4

4

Why TPM?

SLIDE 5

5

TPM Elements

SLIDE 6

6

Rulemakings

TPM-Related Rules (rule; effective date; regulatory chapter):
  • Safety Performance Measures (PM1); April 14, 2016; 23 CFR 490 (Subparts A & B)
  • Highway Safety Improvement Program (HSIP); April 14, 2016; 23 CFR 924
  • Statewide and Non-Metropolitan Planning; Metropolitan Planning; June 27, 2016; 23 CFR 450
  • Highway Asset Management Plans for NHS; October 2, 2017; 23 CFR 515 & 667
  • Pavement and Bridge Condition Measures (PM2); May 20, 2017; 23 CFR 490 (Subparts A, C & D)
  • Performance of the NHS, Freight, and CMAQ Measures (PM3)*; May 20, 2017; 23 CFR 490 (Subparts A, E, F, G, H)

* Except for the GHG measure (the percent change in tailpipe CO2 emissions on the NHS compared to the 2017 level)

SLIDE 7

7

Required Plans

  • Multimodal Plans: State/MPO Long Range Transportation Plans; State/MPO Transportation Improvement Programs
  • Safety: Strategic Highway Safety Plan (SHSP); Highway Safety Improvement Program (HSIP)
  • Infrastructure Condition: Transportation Asset Management Plan (TAMP)
  • Congestion/Air Quality: CMAQ Performance Plan
  • Freight: State Freight Plan
  • Transit: Transit Safety Plan; Transit Asset Management Plan

SLIDE 8
  • USDOT: Performance Measure Rules include:
  • Establish measures; identify data sources; define metrics
  • Report to Congress
  • Stewardship and oversight
  • States and MPOs:
  • Establish targets
  • Support national goals in the planning process and consider measures and targets in long range plans and programs
  • Report progress to USDOT (States)

8

TPM Roles and Responsibilities

SLIDE 9
  • Identify available and needed data
  • Coordinate with other agencies
  • Establish coordinated targets
  • Collect and submit required data
  • Report progress

State DOT and MPO Roles

9

SLIDE 10
  • FHWA is committed to your success!
  • Headquarters provides guidance and develops policies and tools
  • Divisions are responsible for program delivery
  • The Resource Center provides technical assistance and training

10

FHWA Roles

SLIDE 11

Overview of TPM Toolbox, Capability Maturity Model, Guidebook and MODA

SLIDE 12

12

Toolbox Elements

SLIDE 13

13

TPM Framework

SLIDE 14

14

Component 4

Performance-Based Programming

  • 1. Strategic Direction
  • 2. Target Setting
  • 3. Performance-Based Planning
  • 4. Performance-Based Programming
  • 5. Monitoring & Assessment
  • 6. Reporting & Communication
  • A. Organization & Culture
  • B. External Collaboration & Coordination
  • C. Data Management
  • D. Data Usability & Analysis


SLIDE 15

Purpose

  • Assess current state of your agency
  • Identify logical set of improvements
  • Show benefit of moving to higher maturity levels

TPM CMM

  • Assesses maturity on 1-5 scale
  • For each TPM Component

15

CMM: Capability Maturity Model

SLIDE 16

CMM Level Definitions
  • 1. Initial: Programming decisions not linked to goals or planning documents; lack transparency. Based on formulas and history, not on analysis of performance.
  • 2. Developing: Developing performance-based programming that reflects agency goals, priorities, funding constraints, risk factors, and relative needs across performance areas.
  • 3. Defined: Methods and processes defined and incorporated into long-range plans.
  • 4. Functioning: Established and documented methodology and process within and across performance areas to maximize achievement of multiple goals. Clear linkage between investments and expected performance outputs and outcomes.
  • 5. Sustained: Performance-based programming applied for multiple cycles. Feedback loop between monitoring and programming. Process is periodically refined.

16

CMM: Performance-Based Programming

SLIDE 17
Legacy Practice
  • Allocation based on formulas, historical trends
  • Programming at region level, validated by DOT HQ
  • Performance-based principles, but focused on reporting

Emerging Practice
  • Links planning and programming
  • Considers cross-asset impacts
  • Flexible criteria
  • Aligns with target-setting

17

Emerging Practice in Programming Projects

SLIDE 18

MODA: Multi-Objective Decision Analysis
  • Structured approach for choosing among a set of alternatives considering multiple objectives
  • Requires quantifying one's objectives and the value obtained with respect to each objective by each alternative under consideration
  • Also referred to as Multi-Criteria Decision-Making (MCDM) or by other acronyms

18

MODA Defined
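The weighted-sum aggregation at the heart of MODA can be sketched in a few lines of Python. The objectives, weights, project names, and scores below are hypothetical illustrations, not values from the workshop:

```python
# Hypothetical MODA weighted-sum: weights on objectives, 0-1 value scores
# per alternative, and a ranking. Names and numbers are illustrative only.
weights = {"mobility": 0.5, "safety": 0.3, "preservation": 0.2}

alternatives = {
    "Widen I-10": {"mobility": 0.9, "safety": 0.4, "preservation": 0.1},
    "Resurface SR-7": {"mobility": 0.2, "safety": 0.5, "preservation": 0.9},
}

def total_value(scores):
    """Weighted sum of an alternative's per-objective values."""
    return sum(weights[obj] * val for obj, val in scores.items())

# Rank alternatives from highest to lowest total value.
ranked = sorted(alternatives, key=lambda a: total_value(alternatives[a]), reverse=True)
```

The choice of weights is where most of the difficulty lies; Steps 3.2 and 3.5 of the methodology later in this workshop address scaling and weighting in detail.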

SLIDE 19
  • MODA helps agencies prioritize projects considering objectives such as:
  • Mobility
  • Safety
  • System Preservation
  • Environment and Sustainability
  • Potential approach for programming across performance areas (TPM Toolbox)

19

MODA Applications

SLIDE 20

Scope
  • What types of investments will be prioritized?
  • How many funding periods are considered?
  • How will the results be used?

Methodology
  • Generating candidate investments
  • Scaling performance measures to approximate the benefit of performing work
  • Weighing competing objectives

Data
  • Need predicted performance for each measure representing the full range of objectives

20

Challenges in Applying MODA

SLIDE 21

NCHRP Report 806 (2015)
  • Cross-asset resource allocation approaches for asset management
  • Spreadsheet tool and guidance

NCHRP Report 921 (2019)
  • Updated the NCHRP Report 806 spreadsheet tool and created a new web tool
  • Case studies illustrating MODA applications

21

Relevant Research

SLIDE 22

22

Using MODA to support TPM and PBPP

SLIDE 23
  • 1. Establish Roles & Responsibilities
  • 2. Clarify Purpose
  • 3. Develop Methodology
  • 4. Document Process

23

Programming Across Performance Areas

Identify and assign internal roles and responsibilities. Clarify the purpose of cross-performance-area prioritization. Develop a methodology that reflects agency priorities and external stakeholder interests. Document the process.

SLIDE 24
  • 1. Establish Roles & Responsibilities
  • 2. Clarify Purpose
  • 3. Develop Methodology
  • 4. Document Process

24

Programming Across Performance Areas

3.1 Define goals and objectives; 3.2 Select performance measures and evaluation criteria; 3.3 Assess data and analytical capabilities; 3.4 Prototype the approach; 3.5 Set weights on goals and objectives; 3.6 Apply the model

SLIDE 25

Practice Example

Maryland Department of Transportation

SLIDE 26
  • Implementing state legislation for prioritizing major expansion projects over $5 million for inclusion in the Consolidated Transportation Plan (CTP)
  • Evaluating projects across 9 goals and 23 measures established in legislation
  • Conducted a series of workshops to determine evaluation criteria for each measure based on available data and resources
  • Wherever possible, utilized quantitative methods
  • Qualitative evaluation criteria used in some cases
  • Implemented the resulting scoring approach in Citygate's iOpenDecision

Maryland DOT MODA Practice Example

26

SLIDE 27
  • An annual call is made for major capacity investments
  • State, county, and municipal governments can propose projects
  • Each county assigns its local priority to the nominated projects
  • MDOT reviews project applications and performs supplemental analysis on each project
  • MDOT then scores the projects and generates a prioritized list that informs CTP development

Maryland DOT Annual Process

27

SLIDE 28

Goals and Weights

Delphi method used to establish weights on each goal:
  • Stakeholders vote on weights for each goal
  • Discuss differences of opinion
  • Ultimately reach consensus

28

SLIDE 29

MDOT Measures and Weights (1/2)

[Chart: goal and measure weights: 69%, 47%, 11%, 53%, 31%, 26%, 64%, 27%, 27%, 25%, 20%]

Goals: Safety & Security; System Preservation; Congestion; Environment

Measures: Reduce fatalities and injuries; Implement Complete Streets; Increase lifespan; Increase functionality; Increase resilience; Job accessibility; Improve travel time reliability; Support connections between modes; Reduce harmful emissions; Impact on State resources; Advance State environmental goals

29

SLIDE 30

[Chart: goal and measure weights: 49%, 41%, 53%, 14%, 100%, 25%, 28%, 47%, 64%, 26%, 32%, 22%]

Goals: Community Vitality; Economic; Equitable Access; ROI; Local

Measures: Increase bike, ped, transit; Increase transportation alternatives; Support local priorities; Job accessibility (disadvantaged pop.); Economic development (low-income); Travel time savings divided by cost; Leverage other funding sources; Enhance community access; Further State revitalization plan; Job accessibility; Enhance access to intermodal locations; Economic development

MDOT Measures and Weights (2/2)

30

SLIDE 31

31

Example Goal 3

Reducing Congestion and Improving Commute Time

Measure 1
  • The expected change in cumulative job accessibility within an approximately 60-minute commute for highway projects or transit projects

Calculation
  • Predicted increase in number of jobs from the Multi-Modal Accessibility (MMA) Tool

SLIDE 32

32

Measure 2
  • The degree to which the project has a positive impact on travel time reliability and congestion

Calculation
  • Highway: annual travel time savings from the Statewide Travel Demand Model
  • Transit: annual travel time savings based on daily new transit passengers. Assumes 5.4 minutes of travel time savings per trip for every new transit passenger

Example Goal 3

Reducing Congestion and Improving Commute Time
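The transit calculation above is simple arithmetic; a minimal sketch follows. The 300-day annualization factor is an assumption for illustration only; the slide gives just the 5.4 minutes-per-trip figure:

```python
# Sketch of the transit travel-time-savings calculation described above.
MINUTES_SAVED_PER_TRIP = 5.4   # from the slide
ANNUALIZATION_DAYS = 300       # assumed factor, not stated on the slide

def annual_transit_time_savings_hours(daily_new_passengers):
    """Annual travel time savings (hours) from daily new transit passengers."""
    minutes_per_year = daily_new_passengers * MINUTES_SAVED_PER_TRIP * ANNUALIZATION_DAYS
    return minutes_per_year / 60.0
```

For example, 1,000 new daily passengers would yield 1,000 x 5.4 x 300 / 60 = 27,000 hours of annual savings under these assumptions.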

SLIDE 33

33

Measure 3
  • The degree to which the project supports connections between different modes of transportation and promotes multiple transportation choices

Calculation
  • Qualitative checklist with points assigned based on the connections the project facilitates
  • Points are multiplied by the project cost to scale their impact

Which of the following are included in the proposed project?
  • 1. Promotes Multiple Transportation Choices
  • 1A. Bus system improvements: 1 point
  • 1B. Rail system improvements: 1 point
  • 1C. Construction of bicycle facilities: 1 point
  • 1D. Construction of pedestrian facilities: 1 point
  • 2. Improves Connections Between Modes
  • 2A. Port facilities: direct connection 1 point; indirect connection 0.5 points
  • 2B. Freight facilities: direct connection 1 point; indirect connection 0.5 points
  • 2C. Airport facilities: direct connection 1 point; indirect connection 0.5 points
  • 2D. Transit facilities: direct connection 1 point; indirect connection 0.5 points
  • Total (sum of points): 0-8

Example Goal 3

Reducing Congestion and Improving Commute Time
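The checklist scoring described above can be sketched as follows; the function and key names are illustrative, not MDOT's:

```python
# Sketch of the Measure 3 checklist scoring (points per the slide's table;
# names are illustrative, not MDOT's).
CHOICE_POINTS = {"bus": 1.0, "rail": 1.0, "bicycle": 1.0, "pedestrian": 1.0}
CONNECTION_POINTS = {"direct": 1.0, "indirect": 0.5, "none": 0.0}

def checklist_score(choices, connections):
    """choices: items 1A-1D included in the project (e.g. ["bus", "bicycle"]).
    connections: facility -> "direct" / "indirect" / "none" for items 2A-2D.
    Returns total points on the 0-8 scale."""
    points = sum(CHOICE_POINTS[c] for c in choices)
    points += sum(CONNECTION_POINTS[level] for level in connections.values())
    return points

def scaled_impact(points, project_cost):
    # Per the slide, points are multiplied by project cost to scale impact.
    return points * project_cost
```

A project with bus and bicycle improvements, an indirect port connection, and a direct transit connection would score 1 + 1 + 0.5 + 1 = 3.5 points before cost scaling.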

SLIDE 34
  • MDOT was able to implement a process consistent with legislative requirements
  • The resulting process provides input into the determination of which major investments to fund
  • Overall, the process helps address the concern of the MD legislature that a more data-driven approach was needed for prioritizing major capital investments

34

Outcomes

SLIDE 35

Lessons Learned

  • Selecting performance criteria is a large part of setting up a cross-asset resource allocation approach
  • The number of measures can impact the practicality and sustainability of a cross-asset resource allocation approach
  • There are clear benefits to selecting quantitative measures over qualitative measures
  • Dollar benefits facilitate comparisons among measures
  • Important to test the process during development using an example set of projects
  • Using the Delphi process to set weights on goals was effective; other approaches (e.g., pairwise comparison) would have been cumbersome given the large number of goals involved

35

SLIDE 36

Tradeoff Decision-Making, MODA and MODAT

SLIDE 37
  • 1. Establish Roles & Responsibilities
  • 2. Clarify Purpose
  • 3. Develop Methodology
  • 4. Document Process

37

Programming Across Performance Areas

Identify and assign internal roles and responsibilities. Clarify the purpose of cross-performance-area prioritization. Develop a methodology that reflects agency priorities and external stakeholder interests. Document the process.

SLIDE 38

38

  • 1. Establish Roles & Responsibilities
  • 2. Clarify Purpose
  • 3. Develop Methodology
  • 4. Document Process

Step 1

Identify and Assign Internal Roles and Responsibilities
  • Project curator: who will facilitate project submissions?
  • Criteria selection team: who will develop criteria by which projects will be evaluated for inclusion in the STIP or TIP?
  • Data reporters: who is responsible for reporting data that will be used to assess projects?
  • Analysts: who will evaluate the potential projects based on the criteria? Who will determine what methodology will be used?
  • Decision maker: who will finalize and approve the selection of projects?
  • Liaison: who will communicate progress to the agency and gather feedback from those not involved in process development?

SLIDE 39

39

  • 1. Establish Roles & Responsibilities
  • 2. Clarify Purpose
  • 3. Develop Methodology
  • 4. Document Process

Step 2

Clarify Purpose of Cross-Performance-Area Prioritization

High Level Goals
  • Transparency and Accountability
  • Maximize Efficiency (Funds, ROI)
  • Maximize Effectiveness (System Performance)
  • Standardize Performance Criteria
  • Improve Repeatability
  • Mitigate Risks
SLIDE 40

What will be prioritized?

  • Which modes?
  • Which types of investments?
  • What is the timeframe for analysis?

What should the process yield?

  • Prioritized list of investments?
  • Recommended resource allocation?

How will results be used?

  • Initial list of investments to fund?
  • Approach for narrowing the candidate pool of investments?
  • Support setting targets?

40

Detailing the Scope and Purpose

SLIDE 41

41

  • 1. Establish Roles & Responsibilities
  • 2. Clarify Purpose
  • 3. Develop Methodology
  • 4. Document Process

3.1 Define goals and objectives; 3.2 Select performance measures and evaluation criteria; 3.3 Assess data and analytical capabilities; 3.4 Prototype the approach; 3.5 Set weights on goals and objectives; 3.6 Apply the model

Step 3

Develop the Methodology

SLIDE 42

42

What it takes
  • Review the agency's overall goals and the objectives that support those goals
  • Determine which of these are relevant given the scope of the process
  • Attempt to resolve any gaps or overlaps between objectives

Step 3.1

Define Goals and Objectives
SLIDE 43

43

Example Goals

  • Mobility
  • Preservation
  • Safety
  • Security
  • Resilience
  • Environment
  • Community
  • Economic Development
  • Accessibility
  • Environmental Justice

Step 3.1

Define Goals and Objectives
SLIDE 44

44

What it takes
  • Identify performance measures that will be used to quantify each goal and objective
  • Determine how to scale each measure
  • Typically, measures are scaled such that the value for a goal is expressed on a scale from 0% (lowest value) to 100% (highest value)
  • Determine how to combine measures where multiple measures contribute to a goal

Step 3.2

Select Performance Measures and Evaluation Criteria
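Scaling a measure onto a 0-100% range is typically a min-max normalization across the candidate projects; a minimal sketch follows. (The MDOT examples later in this deck use a simpler variant that just divides by a maximum value.)

```python
def scale_0_100(raw_values):
    """Min-max scale raw measure values so the worst-scoring project
    maps to 0% and the best-scoring project maps to 100%."""
    lo, hi = min(raw_values), max(raw_values)
    if hi == lo:  # all projects identical on this measure
        return [100.0] * len(raw_values)
    return [100.0 * (v - lo) / (hi - lo) for v in raw_values]
```

Note that min-max scaling is relative to the candidate pool: adding or removing a project can shift every other project's scaled score, which is one reason agencies sometimes prefer fixed reference values instead.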

SLIDE 45

45

Considerations in Scaling Measures
  • Measures should be structured such that the measure is proportional to the overall value yielded by the investment
  • May need to multiply the measure by some proxy value if the measure does not scale appropriately
  • Can utilize parameters from benefit/cost analyses to help facilitate scaling

Step 3.2

Select Performance Measures and Evaluation Criteria

SLIDE 46

Measure 1: Job Accessibility
  • Used a software tool to determine the increased number of jobs accessible within a 45-minute commute from disadvantaged communities
  • Divided the result by a maximum value to put it on a 0-1 scale

Measure 2: Economic Development Impact on Low-Income Communities
  • Completed a checklist of factors related to this topic to determine a score on a scale from 0-5
  • Multiplied the result by the estimated development area
  • Divided the result by a maximum value to put it on a 0-1 scale

Decided to weigh Measures 1 and 2 equally and combine them to obtain an overall value

46

Example

MDOT Accessibility Measures

SLIDE 47

Measure 1: Emissions Reduction
  • Calculated annual reduction in fuel consumption
  • Multiplied the result by the environmental cost of a gallon of fuel determined using the Cal B/C benefit-cost model

Measure 2: Health Activity Benefit
  • Estimated the number of new cyclists and pedestrians
  • Multiplied the number by an annual health benefit in dollars per new cyclist/pedestrian determined in a separate NCHRP study

  • Summed Measures 1 and 2 to obtain the total benefit
  • Divided total benefits by a maximum value to obtain results on a 0-1 scale

47

Example

Caltrans Air Quality and Health Measures
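The Caltrans approach combines monetized benefits before normalizing. A sketch with placeholder dollar rates follows; the real rates come from the Cal B/C model and the NCHRP health study, not these values:

```python
# Placeholder dollar rates for illustration only; actual values come
# from the Cal B/C model and an NCHRP health-benefit study.
ENV_COST_PER_GALLON = 2.00         # assumed $ environmental cost per gallon
HEALTH_BENEFIT_PER_PERSON = 500.0  # assumed $ annual benefit per new cyclist/pedestrian

def total_benefit(gallons_saved_per_year, new_cyclists_pedestrians):
    """Sum the two monetized measures, per the slide."""
    emissions_benefit = gallons_saved_per_year * ENV_COST_PER_GALLON
    health_benefit = new_cyclists_pedestrians * HEALTH_BENEFIT_PER_PERSON
    return emissions_benefit + health_benefit

def to_unit_scale(benefit, max_benefit):
    """Divide by a maximum value to obtain a 0-1 result, per the slide."""
    return benefit / max_benefit
```

Because both measures are already in dollars, they can be added directly without a weighting step, which is the "develop monetized measures" shortcut mentioned under Step 3.5 later in the deck.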

SLIDE 48

48

What it takes
  • Obtain data; several approaches:
  • Direct measurement
  • Predictive models
  • Representative defaults
  • Subjective judgment
  • To overcome data challenges:
  • Revisit scope and measures
  • Collect more data

Step 3.3

Assess Data and Analytical Capabilities

SLIDE 49

What they did

  • Series of workshops to select performance measures and

assess data and analytical capabilities

How they did it

  • Included relevant stakeholders and discussion over each

potential measure to determine feasibility of gathering data and performing analysis

Why they did it

  • Time-effective way to reduce number of iterations

49

Example

MDOT Assessing Data Capabilities

SLIDE 50

50

Example

DVRPC Assessing Data Capabilities

What they did

  • Implemented simpler performance measures initially and

will build data capabilities over time

How they did it

  • Selected measures for which data was readily available

Why they did it

  • Saw challenges of collecting data and the need to select

performance measures with data requirements that could be met by small MPOs with limited capabilities

SLIDE 51

51

What it takes
  • Collect data for a sample set of projects
  • Calculate a score or utility for each project
  • Prioritize the sample set
  • Review and assess the results

Step 3.4

Prototype the Approach

SLIDE 52

52

Source: Adapted from VDOT Smart Scale Project Scores spreadsheet. Columns shown: basic project information; performance measure scores; project benefit score; score divided by cost; rank.

Example

VDOT using a Spreadsheet to Prototype
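The spreadsheet logic described above (benefit score divided by cost, then ranked highest first) can be sketched as follows; the project data is hypothetical:

```python
# Hypothetical projects illustrating the spreadsheet's columns:
# benefit score, cost, score divided by cost, and rank (highest first).
projects = [
    {"name": "Project A", "benefit": 80.0, "cost": 20.0},
    {"name": "Project B", "benefit": 45.0, "cost": 5.0},
    {"name": "Project C", "benefit": 60.0, "cost": 30.0},
]

for p in projects:
    # Score divided by cost: benefit per dollar (or per $M) spent.
    p["score_per_cost"] = p["benefit"] / p["cost"]

# Rank column: order projects by benefit per unit cost, best first.
ranked = sorted(projects, key=lambda p: p["score_per_cost"], reverse=True)
```

Dividing by cost rewards cheaper projects that deliver most of the benefit, which is why a mid-benefit, low-cost project can outrank the highest-scoring one.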

SLIDE 53

53

What it takes
  • Typically need to establish weights on each goal or objective to determine how to compare the benefit of achieving one versus another
  • Also may need to establish weights on individual measures if they are combined to determine the score of a given objective

Step 3.5

Set Weights on Goals and Objectives

SLIDE 54

54

Approaches
  • Delphi Approach
  • Analytical Hierarchy Process

How to avoid this step
  • Develop monetized measures
  • Use Data Envelopment Analysis to select efficient investments using goal-level scores

Step 3.5

Set Weights on Goals and Objectives

SLIDE 55

55

Delphi Approach
  • Assemble an expert panel
  • Review goals and measures, illustrating calculations with example projects
  • Conduct a first round of voting: each participant specifies weights on each goal
  • Review initial results through group discussion, showing results for example projects
  • Conduct successive rounds of voting until consensus is reached
  • Use consensus or average results as initial weights
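The tally for one Delphi round (averaging votes and measuring disagreement) might look like this sketch; the goal names and the idea of a numeric consensus threshold are assumptions, since in practice consensus is a facilitated judgment:

```python
def delphi_round(votes):
    """votes: one dict of goal weights per participant (each summing to 1).
    Returns (mean weight per goal, largest disagreement on any goal).
    A facilitator might repeat rounds until the spread falls below an
    agreed threshold; the threshold itself is a panel choice."""
    goals = votes[0].keys()
    mean = {g: sum(v[g] for v in votes) / len(votes) for g in goals}
    spread = max(max(v[g] for v in votes) - min(v[g] for v in votes)
                 for g in goals)
    return mean, spread
```

Showing each round's mean alongside scored example projects, as the slide suggests, lets participants see the practical effect of their weights before voting again.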
SLIDE 56

56

Analytical Hierarchy Process
  • Described in NCHRP Report 806
  • Popularized in commercial software products
  • Requires evaluating relative importance between each goal/objective/measure pair
  • Practical when the number of pairs is small
  • 4 goals = 6 pairs
  • 7 goals = 21 pairs
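The pair counts quoted above follow from the combinations formula n(n-1)/2, which grows quadratically and is why AHP becomes cumbersome with many goals:

```python
def ahp_pair_count(n_goals):
    """Number of pairwise comparisons AHP requires among n goals:
    n * (n - 1) / 2."""
    return n_goals * (n_goals - 1) // 2
```

For MDOT's 9 legislated goals this would already mean 36 pairwise judgments, before measure-level comparisons, consistent with the earlier lesson that Delphi was the more practical choice there.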
SLIDE 57

57

What it takes
  • Identify candidate investments
  • Calculate measures for each candidate
  • Prioritize using weights
  • Consider other factors
  • Optional: perform additional tradeoff analysis

Step 3.6

Apply the Model

SLIDE 58

58

  • 1. Establish Roles & Responsibilities
  • 2. Clarify Purpose
  • 3. Develop Methodology
  • 4. Document Process

Step 4

Document the Process
  • How the process was established and conducted
  • Project selection criteria
  • Formulas for project evaluation and justification
  • Why goal areas were prioritized
  • Impacts on performance from tradeoff analyses
  • What alternatives were not chosen and why
  • Roles and responsibilities
  • Project eligibility
  • Project submission process
  • Timeline for submission, evaluation, and publication of final results
  • Input received from stakeholders
  • Risk factors
  • Output targets that can be used to track anticipated effects

SLIDE 59

Software Tour

SLIDE 60

Small Group Exercise Introduction

Tradeoff Analysis Using MODAT

SLIDE 61
  • Determine roles
  • Data owner (done)
  • Facilitator
  • Reporter
  • Participants
  • Data owners already accessed the software tool and preloaded the dataset ahead of the session
  • There are predetermined performance areas

61

Exercise Logistics

SLIDE 62
  • Review background on handout
  • Complete exercise tasks
  • Determine measure weights
  • Review modeling assumptions
  • Prioritize
  • Perform a sensitivity analysis
  • Discuss
  • Report out

62

Workshop Exercise

SLIDE 63

Break (10 min)

63

SLIDE 64
  • Share highlights of the exercise, discussion, and results
  • 5 min per group

64

Report Out

SLIDE 65

Workshop Wrap Up

SLIDE 66
  • Overview of multi-objective decision analysis (MODA) and tradeoff decision-making to support TPM and PBPP
  • TPM Toolbox and MODAT
  • Practice examples of MODA and tradeoff decision-making
  • Tool developed in NCHRP 08-103
  • Tradeoff decision-making practice

Summary

66

SLIDE 67

Session Feedback

Before you leave today, please capture the following on the Post-Its at the back of the room:
  • What did you learn?
  • What more would you like to learn?
  • What will be most challenging for your agency to incorporate?
  • What will be easiest for your agency to incorporate?
  • How can we improve future sessions?