
IT2EC 2020 IT2EC Extended Abstract Template Presentation/Panel

Stronger threads: The critical requirement for enabling the Competency-Based Workforce

Brian Moon1

1Chief Technology Officer, Perigean Technologies, Fredericksburg, VA, USA

Abstract — Essential advancements toward enabling the competency-based workforce (CBW) have been made in the past few years. As interest grows and new endeavours are pursued, however, it is becoming increasingly clear that many of the key advancements have been made without sufficient connectivity to others. To realize the full potential, more focus must be placed on stitching together the essential elements that can enable the CBW. This presentation offers insights from the point of view of a practitioner who has been working in the seams to enable competency-based workforces in military and commercial contexts, and suggestions regarding the requirements of the threads necessary to stitch the elements more tightly together.

1 The Essential Elements of a Competency-Based Workforce

The competency-based workforce (CBW) movement has been well underway, as industries have come to realize that industrial-age models of workforce development are no longer satisfactory for survival in information-rich, cognitively complex domains. Previous approaches that supported hiring by qualifications, one-size-fits-all training, and periodic performance review are being replaced by techniques and tools that enable more exact definition of performance requirements, more adaptive and higher-fidelity learning experiences, and deeper assessment of performance across all venues. The proliferation of tools and techniques is serving to gain coverage over the essential elements of the CBW, which are:

  • Explicit delineation of all tasks and the associated competencies required for efficient and effective performance of a job (“Delineation”)

  • Experiences that enable the development of the competencies (“Development”)

  • Accurate and efficient assessment of the achievement of the competencies (“Assessment”)

These three elements comprise the core capabilities that organizations must develop to enable a CBW. The simple diagram in figure 1 demonstrates their interconnectedness to each other and to other organizational elements. Each element presents challenges to any organization, not the least of which are the time, costs, and, ironically, the competencies associated with committing to doing them properly. Delineation takes time, and the skills to conduct processes such as CTA are often not resident. Development, beyond the basic introductions to a job, is too often left to the performer to manage during operating conditions. And assessment, if conducted at all, typically relies on elementary testing and/or fleeting observations.

Fig. 1. The essential elements that enable CBW.

Thus, significant effort and resources have been expended in the past few decades to enable efficiencies and more refined capabilities in each of these areas. The pace of advancement in each has increased significantly along with broader technology advances. Techniques such as cognitive task analysis (CTA, [1]) have allowed for detailed descriptions of the cognitive elements of performance expected at increasing levels of proficiency. Modeling and simulation technologies have created ever-closer replications of working conditions that enable safe and efficient practicing of critical skills and the tracking of all manner of data. Access to such data has, in turn, spawned the development of new, more scrutinous methods for evaluating how workers are performing that are applicable even during operational conditions. Yet, as the dotted red lines in figure 1 suggest, the interconnectedness of the elements remains the critical challenge to the entire endeavour of CBW. As each of the elements has advanced, insufficient effort and resources have been committed to stitching them together in ways that enable a cohesive and efficient CBW.
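The dependency among the three elements can be illustrated with a minimal data model. The sketch below is purely hypothetical — the class and field names are the author of this edit's assumptions, not drawn from any tool cited in this abstract — but it shows the "stitching" at issue: development activities and assessments only cohere when they reference the same delineated competency objects.

```python
from dataclasses import dataclass

# Hypothetical sketch of the three essential CBW elements as a data model.
# All names (Competency, DevelopmentActivity, Assessment) are illustrative.

@dataclass
class Competency:
    """Delineation: an explicitly described requirement for job performance."""
    name: str
    proficiency_levels: list  # e.g., ["novice", "competent", "expert"]

@dataclass
class DevelopmentActivity:
    """Development: an experience that builds one or more competencies."""
    name: str
    targets: list  # the delineated competencies this activity develops

@dataclass
class Assessment:
    """Assessment: evidence that a competency level has been achieved."""
    competency: Competency
    level_achieved: str

# The "threads": development and assessment both point back to the same
# delineated competency, rather than to loosely worded duplicates of it.
nav = Competency("low-level navigation", ["novice", "competent", "expert"])
sim = DevelopmentActivity("simulator mission", targets=[nav])
result = Assessment(competency=nav, level_achieved="competent")

assert result.competency in sim.targets  # the stitching holds
```

In practice the hard part is not the data model but keeping these references valid as each element evolves independently, which is the subject of the next section.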


2 Challenges in stitching and ‘thread’ requirements

For two decades, the author has been involved in efforts in both military and commercial domains to address the individual elements and, more recently, the stitching between them. These have included: delineation projects in domains as varied as nuclear engineering, military deception, piloting, consumer goods, and pet food production [2]; development projects for defeating improvised explosive devices [3]; and assessment projects in domains including cyber security [4]. Each effort has brought into high relief the challenges noted above. But a few in particular have revealed the brittleness of the stitching between the essential elements. The following offers insights into the brittleness of the stitching and suggests what will be necessary for the threads to more securely tie together the elements. For the sake of privacy, the following examples do not identify the domain – the reader is assured that they are drawn from some of the domains noted above.

2.1 Delineation to Development and Assessment

Between position descriptions, essential task lists, and performance review criteria, most organizations operate under the belief that their enabling functions and roles are described to a level of specificity sufficient to get the job done. A closer inspection of organizations’ documentation, however, has often revealed that descriptions of the cognitive requirements for performance are missing. In one such example, significant resources had been invested to provide very high-fidelity development experiences – from simulators to instructor-coached equipment operation. But guided by a description of the domain that only defined performance to the mission-phase level, the instructional strategies were limited to low-pressure mission execution and assessing performance by use of gross-measure rubrics. Approaches such as mastery models [5] offer the potential for performance to be delineated down to the cognitive requirements expected at emergent levels of proficiency – i.e., novice to competent to expert. Such microscopic views of performance make obvious the sorts of development activities that will be necessary to enable the achievement of expertise, particularly when blending strategies [6]. And they specify the sorts of behavioural indicators that enable gathering the right sorts of data for deep assessment from any development activity. Rapidizing the process must be a goal for implementing such approaches [7].

2.2 Development informing Delineation and Assessment

A CBW is most necessary in domains that are undergoing significant change, are highly complex, and/or are emergent. Such tumultuous conditions aggravate the challenge of closing the loop between development and delineation and assessment. As new knowledge is developed and new skills and aptitudes become necessary, a reshaping of the nature of the work is required, and the traditional “waterfall” approach to delineation becomes problematic. This challenge was especially notable in one project where new adversarial tactics were being developed almost daily. By the time that competencies had been fully developed and implemented in a safe learning environment – i.e., a simulation – the operational environment had changed significantly. Lessons-learned approaches that focus on rapid refinement of the delineations will certainly be a necessary component thread. But ultimately, tools will be needed to support the just-in-time incorporation of the lessons learned. Changes to official documentation (e.g., task lists) and the development of valid and reliable assessments have historically required significant process work across teams of stakeholders using industry-standard formats and techniques. Adding flexibility to this thread may hold other advantages, too, as assessment techniques that allow for rapid authoring emerge [8].

2.3 Tracking

Tracking a performer’s levels of competency across the various requirements for their role(s) has traditionally been managed by both the performer and the organization – from collecting certificates and recommendations to updating resumes and online profiles. While tools to aid the collation of such data have been around for a while, they often fail to fully integrate and align the picture and/or do not capture data at the level of competency. Performance and achievements are presumed to be indicative of competencies held, resulting in surprise when the full range of requirements of the job becomes apparent. But knowing what data to track and having “systems” (which might include people) adequately instrumented (which might include manual reporting) is no trivial matter. One example of attempting to do this very thing demonstrated just how complex the task of connecting systems is, let alone generating, collating, disambiguating, analysing, and reporting such data in real time. Even when significant data are known about performers prior to engaging in development activities, tracking and visualizing performance across time, in ways that are meaningful to stakeholders and that inform adaptive experiences and accurate assessment, remains an elusive goal. Data-sharing technologies such as xAPI [9] and competency framework systems [10] are emerging to offer heretofore unavailable insight into worker performance. Such tools are the sine qua non of this thread and will require significant future work to reach the levels of deployment that organizations will need to enable a CBW.

Acknowledgements

The author thanks the organizations that provided opportunities to gain these insights, and session chair Dr. Robby Robson for the chance to share them.


References

[1] B. Crandall, et al., Working minds: A practitioner's guide to cognitive task analysis. MIT Press (2006)

[2] B. Moon, Capturing Cognitive Performance and Expertise. In The Oxford Handbook of Expertise. Oxford University Press (2019)

[3] J. Phillips, et al., Insurgent Mindset Training: A Perspective-Taking Approach to Improvised Explosive Device Defeat Training. Proceedings of the I/ITSEC (2009)

[4] S. Gallagher, et al., Total Learning Architecture development: A design-based research approach. Proceedings of the I/ITSEC (2017)

[5] K. Ross, J. Phillips, Developing Mastery Models to Support the Acquisition and Assessment of Expertise. In The Oxford Handbook of Expertise. Oxford University Press (2019)

[6] P. Ward, et al., Accelerated expertise: Training for high proficiency in a complex world. Psychology Press (2013)

[7] W. Zachary, et al., “Rapidized” cognitive task analysis. IEEE Intelligent Systems 27(2), 61-66 (2012)

[8] B. Moon, S. Rizvi, Sero!: A Learning Assessment Platform for Adult Learning Environments. In International Conference on Applied Human Factors and Ergonomics (2017)

[9] J. Kevan, P. Ryan, Experience API: Flexible, decentralized and activity-centric data collection. Technology, Knowledge and Learning 21(1), 143-149 (2016)

[10] R. Robson, J. Poltrack, Using competencies to map performance across multiple activities. Proceedings of the I/ITSEC (2017)

Brian Moon Biography

Mr. Moon is the Chief Technology Officer for Perigean Technologies and founder of Sero! Learning Assessments, Inc. His practice enables expertise management, design and evaluation of cognitive systems, and the assessment of knowledge. He is recognized for concept mapping, as highlighted in Applied Concept Mapping: Capturing, Analyzing, and Organizing Knowledge.