Assessing System Efficiency in Providing Public Benefits
Supportive Parents Information Network (SPIN) Caring Council
February 9, 2012
Our Goals
Proposing a process, not a product.
To develop an authentic partnership among all key players: HHSA Management, Workers, Applicants/Recipients.
While we are critical, we don't wish to be adversarial.
To successfully address and remove the barriers to the efficient and compassionate delivery of public benefits
There is no systemic, objective means of monitoring the system's effectiveness.
It seems that all "outputs" are assumed to have a positive impact.
Data that are reported on the system’s functioning only provide a partial picture – an overly positive picture
The fundamental approach is flawed.
"Public assistance eligibility determination processes and … System" – Kim Forrester
The system is designed to serve the tool, rather than the tool being designed to serve the system.
The County consistently refers to its enrollment as its "participation rate" and points to the significant rise in cases. While enrollment has increased, there is no evidence that the participation rate has increased.
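The distinction above can be made concrete with arithmetic. This is an illustrative sketch with hypothetical numbers (not County data): enrollment is a raw count, while the participation rate divides that count by the estimated eligible population, so enrollment can rise substantially while the rate stays flat.

```python
# Illustrative only: hypothetical numbers, not actual County figures.
# Participation rate = enrolled individuals / estimated eligible population.

def participation_rate(enrolled: int, eligible: int) -> float:
    """Share of the eligible population actually receiving benefits."""
    return enrolled / eligible

# Year 1: 100,000 enrolled out of an estimated 200,000 eligible.
year1 = participation_rate(100_000, 200_000)

# Year 2: enrollment grows 25% to 125,000, but the eligible pool
# (demand, e.g. driven by unemployment) also grows, to 250,000.
year2 = participation_rate(125_000, 250_000)

# Enrollment rose 25%, yet the participation rate is unchanged at 50%.
assert year1 == year2 == 0.5
```

The point of the sketch: citing the numerator alone (enrollment) says nothing about the rate unless the denominator (eligible population) is also reported.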
The County consistently reports its Monthly Compliance Rate at above 90%, but fails to mention that this number is based on less than half the monthly cases. San Diego continues to have an unusually high pending rate in comparison to other large counties and to the state.
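The effect of excluding pending cases can be sketched with hypothetical case counts (not County figures): a compliance rate computed only on completed cases can exceed 90% even when more than half the caseload is still pending at month end.

```python
# Illustrative only: hypothetical case counts, not actual County figures.
# A "Monthly Compliance Rate" computed only on completed cases can look
# high while the majority of the caseload is still pending.

total_cases = 10_000
pending = 5_500                     # >50% of cases still pending at month end
completed = total_cases - pending   # 4,500 cases the reported rate is based on
compliant = 4_100                   # of those, processed on time

reported_rate = compliant / completed     # rate over completed cases only
caseload_rate = compliant / total_cases   # rate over the full caseload

assert reported_rate > 0.90   # looks like ">90% compliance"
assert caseload_rate < 0.45   # but under 45% of all cases were timely
```

Under these assumed numbers, the same month yields "over 90% compliance" or "under 45% of cases resolved on time," depending solely on whether pending cases are counted in the denominator.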
Estimates of Participation v Actual Participation
[Chart: Individual Enrollment by Month, Jan 2009 – Dec 2011. Y-axis: individuals enrolled, 25,000 to 250,000. Series: Participation, 3% change, and 9% change, each with a linear trend line.]
Using unemployment as an indicator of demand shows the County is keeping up with demand, not increasing its participation rate.
Pending Rate Across Largest Counties
[Chart: % of cases pending at end of month, Jan 2009 – Dec 2011, scale 5% to 65%. Series: San Diego, State, Los Angeles, Orange, Riverside, San Bernardino.]
The County has consistently had a pending rate of over 50% since August
This definition raises the questions:
Our hope is that we can create a process where the three major players [HHSA Mgt, Workers, Recipients] work together to define the ideal and create the real
[Diagram: Outputs → Outcomes → Impact, with a Feedback loop.]
The evaluation must be FORMATIVE & SUMMATIVE: it must examine the process (formative) and the outcomes (summative).
Quality feedback mechanisms need to be developed and institutionalized: the efficiency of any system is related to the quality of the feedback it gets and its ability to incorporate that feedback.
We must expand whose feedback is accepted & incorporated: Management, Workers, Clients.
The ACCESS Call Center Task-Based Operation
Is it achieving its intention? [Requires Management Perspective]
How well is it being implemented? [Requires Worker Perspective]
What is the experience of the recipient? [Requires Client Perspective]
Validates some of our concerns: "There is no evidence of a joint technical plan to support the current service delivery model, operational needs of ACCESS or the FRCs, or requirement of a future 'online' service delivery model."
Supports some of our assessment & recommendations:
Evaluation begins with operationalizing Outcomes
Operationalized outcomes provide measurable indicators of how well the system is functioning. Examples of Outcome Measures:
What are the conditions in the FRCs - how well are they functioning?
In particular:
What is the effect of Early Fraud Detection on the client experience, and what is its value?
submit to Project 100%
losing children
[See State Auditor's report on Anti-Fraud activities]
Evaluation of internal processes must include workers and the ability of clients to provide feedback. Formative evaluation needs to examine both the efficiency of the process and its impact on the recipient.
Examples of things that make the work harder:
processes on those outcomes
that includes regular reporting to the SSAB
The value of data is in its interpretation
Numbers are only indicators of a process or an outcome – they are not the process or the outcome.
No indicator is a perfect reflection of what it is measuring.
The best interpretation comes when all perspectives reach consensus.
The way outcomes are operationalized and measured will affect worker behavior:
If an ACCESS worker is assessed on length of call, s/he will focus on ending calls in the prescribed time.
If an ACCESS worker is assessed on the percent of successful resolutions, s/he will focus on finding resolutions to a caller's issues.