
VR-Based Toolsets for Instructors

H.K. Yaqub¹

¹Unity3D Developer, BMT, Bath, UK

Abstract — This paper outlines the development of a suite of VR tools (the ENGAGE Instructor Toolset) which allow instructors to monitor the activity of students in a virtual-training scenario. They also allow instructors to issue aid and introduce dynamic obstacles or changes in task. The toolset makes use of consumer VR technologies and incorporates gaze-aided interaction as well as design principles found in games using an omniscient viewpoint.

1 Introduction

BMT’s ENGAGE platform is a flexible, high-fidelity training platform used to power virtual training applications. The features of the platform include integration with LRS (Learning Record Store) technologies, tools for developers and users to curate scenarios, instructor-to-student interaction (giving hints and directions), and analytics visualisation. While the core functionality of ENGAGE has been in development for some time, we are continually seeking ways to improve the platform and make it more accessible for customer use. This includes a recent research project on making it more effective for instructors to assess and communicate with students during live virtual training. This paper outlines the research and development conducted to date and the proposed future development tasks to ensure effective instructor-student interaction.
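The details of ENGAGE’s LRS integration are outside the scope of this paper, but most Learning Record Stores consume xAPI statements over HTTP. The following Unity C# sketch illustrates the general shape of such a report; the endpoint, credentials and statement fields are illustrative assumptions, not ENGAGE’s actual API.

    using System.Collections;
    using UnityEngine;
    using UnityEngine.Networking;

    // Minimal sketch: posting an xAPI-style statement to a Learning Record Store.
    public class LrsReporter : MonoBehaviour
    {
        // Hypothetical LRS endpoint and credentials.
        const string LrsEndpoint = "https://example-lrs.invalid/xapi/statements";
        const string AuthHeader = "Basic REPLACE_WITH_CREDENTIALS";

        public IEnumerator ReportActivity(string studentId, string verb, string activityId)
        {
            // An xAPI statement is a JSON triple: actor, verb, object.
            string json =
                "{\"actor\":{\"account\":{\"homePage\":\"https://example.invalid\",\"name\":\"" + studentId + "\"}}," +
                "\"verb\":{\"id\":\"http://adlnet.gov/expapi/verbs/" + verb + "\"}," +
                "\"object\":{\"id\":\"" + activityId + "\"}}";

            using (UnityWebRequest request = UnityWebRequest.Put(LrsEndpoint, json))
            {
                request.method = "POST"; // Put() is used only to attach the JSON body
                request.SetRequestHeader("Content-Type", "application/json");
                request.SetRequestHeader("Authorization", AuthHeader);
                request.SetRequestHeader("X-Experience-API-Version", "1.0.3");
                yield return request.SendWebRequest();
            }
        }
    }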

2 Use Case

Consider a scenario of a training application on board a large ship, similar to our previous work on the QEC aircraft carrier visualisation project. It is a PC-based application which consists of an accurate model of the interior of the QEC aircraft carrier and allows users to familiarise themselves with layouts, routes, standard procedures and emergency procedures. It also offers the ability for an instructor to curate training exercises by letting them set waypoints, objects of interest and emergency scenarios such as power failures, flooding, casualties and fire. During a live scenario, the instructor is able to follow the student and manipulate the scenario by adding or removing tasks on the fly.

These virtual tasks are designed for multiple simultaneous students (VR or non-VR) working together in the same virtual environment. They may all be collocated in the same section and deck, or they could be in different parts of the ship. Though their instruction and progress are automated and tracked, a teaching and/or assessment scenario still requires real-time input from a human moderator.
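To make the shape of such a curated exercise concrete, the sketch below shows one plausible way of representing tasks and emergency events in Unity. The class names, fields and menu path are illustrative assumptions, not ENGAGE’s actual data model.

    using System.Collections.Generic;
    using UnityEngine;

    public enum EmergencyType { None, PowerFailure, Flooding, Casualty, Fire }

    [System.Serializable]
    public class ScenarioTask
    {
        public string description;      // e.g. "Proceed to Compartment A"
        public Vector3 waypoint;        // target location within the ship model
        public EmergencyType emergency; // optional emergency triggered with the task
    }

    // A scenario is an editable list of tasks the instructor can change mid-session.
    [CreateAssetMenu(menuName = "Training/Scenario")]
    public class TrainingScenario : ScriptableObject
    {
        public List<ScenarioTask> tasks = new List<ScenarioTask>();

        // Called from the instructor toolset to add or remove tasks on the fly.
        public void AddTask(ScenarioTask task) { tasks.Add(task); }
        public void RemoveTask(ScenarioTask task) { tasks.Remove(task); }
    }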

3 Toolset

When considering how to approach this interaction challenge, one option is for the moderator to curate and monitor the scenarios from a “God View” perspective (an omniscient or top-down third-person point of view). Examples of this interface can be found in real-time strategy games, simulation games (e.g. “The Sims”) and many roleplaying games. These genres of games require macro- and micro-management of a multitude of assets in real time. The application of this technique provides a range of benefits, including:

  • Comprehensive Map Interaction - showing all the layers in a quick-to-access view.

  • Identifying Student Avatars - markers or symbols hovering above the players identify them.

  • Selecting and Directing - aided by gaze-tracking hardware for greater item precision (a minimal sketch follows this list).

  • Curating Scenarios - quick to adapt and create new scenarios for changing applications and requirements.

  • Activity and Progress Tracking - the view allows regular reporting to be presented alongside movement tracking.
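As a concrete illustration of the “Selecting and Directing” benefit, the Unity C# sketch below selects a student avatar with a pointer ray and then directs it to a clicked location. The component and input names are hypothetical, not ENGAGE’s implementation.

    using UnityEngine;

    // Hypothetical component attached to each student avatar.
    public class StudentAvatar : MonoBehaviour
    {
        public void MoveTo(Vector3 target)
        {
            // In practice this would drive the avatar's navigation (see Section 6).
            Debug.Log(name + " directed to " + target);
        }
    }

    // Instructor-side pointer: first click selects a student, second click directs them.
    public class InstructorPointer : MonoBehaviour
    {
        StudentAvatar selected;

        void Update()
        {
            // Assume this transform is the instructor's pointer (controller or camera).
            if (!Input.GetButtonDown("Fire1")) return;

            if (Physics.Raycast(transform.position, transform.forward, out RaycastHit hit, 100f))
            {
                StudentAvatar avatar = hit.collider.GetComponent<StudentAvatar>();
                if (avatar != null)
                    selected = avatar;            // select the student under the ray
                else if (selected != null)
                    selected.MoveTo(hit.point);   // direct the selection to this point
            }
        }
    }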

4 VR vs PC

The toolset was originally envisioned as a PC application in the vein of the similar applications mentioned previously. The literature also suggests that PC can be preferable for the guide or moderator in asynchronous/asymmetric multiplayer tasks [1, 2], as a PC interface offers a high degree of control and familiarity.

However, improvements in VR interaction design and hardware allow COTS HMDs to be seriously considered as a preferable interface for instructors. Games such as “Final Assault” [3] demonstrate methods exclusive to VR which allow for naturalistic interaction and management of a large number of agents in a war-gaming (real-time-strategy) scenario. Games such as “Star Trek Bridge Crew” [4] demonstrate how a VR user with motion controllers can simulate fine-grained interactions. The “Spacetime” tool [5] demonstrates novel approaches to collaboration between multiple VR users: it allows users to design and arrange a virtual environment by providing a suite of tools which include diegetic version control, selecting and dynamically grouping multiple items, and functions for working on the same virtual asset. Many of these approaches can be extended to an instructor’s VR toolset.

The ENGAGE Instructor Toolset is being designed for COTS headsets. The solution can be built for a multitude of headsets, including both room-scale setups (e.g. HTC Vive, Valve Index, Oculus Rift S) and inside-out tracked headsets (e.g. Oculus Quest). Currently the default interaction method is distal pointing [6], a carry-over from PC interaction methods. While this is a tried-and-true method of interacting with virtual items, it is not the most naturalistic. Distal pointing can be augmented using gaze-aided interaction [7, 8]. Typically, gaze-based interactions involve detecting focal time and intensity or blinks, using gaze as a direct interaction method. Gaze-aided interaction, by contrast, uses the focal point to influence and augment other modalities of interaction, i.e. VR motion controllers and hand tracking. Examples of gaze-aided interaction can be found in hand-tracking research [9, 10], where gaze is used to allow the user to easily ‘grab’ or ‘pinch’ a moving virtual object using a combination of gesture recognition and distal pointing. This method has been implemented in ENGAGE to aid controller-based interactions, as it should allow for more precise interaction in a potentially busy environment. A minimal sketch of this approach follows.
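The sketch below shows one plausible form of gaze-aided distal pointing: the controller ray defines a coarse selection cone, and the gaze direction disambiguates between the candidates inside it. The gaze source is an assumption; real gaze data would come from the headset’s eye-tracking SDK.

    using UnityEngine;

    public class GazeAidedPointer : MonoBehaviour
    {
        public Transform controller;     // motion-controller pose
        public Transform gazeOrigin;     // head/eye pose; forward = gaze direction (assumed)
        public float coneRadius = 0.5f;  // how forgiving the controller ray is

        public Transform PickTarget()
        {
            // Coarse phase: gather everything the controller ray roughly points at.
            RaycastHit[] candidates = Physics.SphereCastAll(
                controller.position, coneRadius, controller.forward, 50f);

            Transform best = null;
            float bestAngle = float.MaxValue;

            // Fine phase: prefer the candidate closest to the gaze ray, i.e. the
            // object the instructor is actually looking at.
            foreach (RaycastHit hit in candidates)
            {
                Vector3 toTarget = hit.transform.position - gazeOrigin.position;
                float angle = Vector3.Angle(gazeOrigin.forward, toTarget);
                if (angle < bestAngle)
                {
                    bestAngle = angle;
                    best = hit.transform;
                }
            }
            return best;
        }
    }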

5 Experimental Conditions

Evaluation of the instructor toolset will be based on objective task performance and the participants’ self-assessment. The toolset is implemented to work as both a VR and a PC-based interface; conditions will therefore include a comparison of the PC version versus VR using established evaluation methods [11]. With the introduction of gaze-aided interaction, other experimental conditions will include standard VR interaction versus gaze-aided controller interaction. As hand tracking is now becoming more standard with headsets such as the Oculus and Vive series, there is also potential for utilising hand tracking (augmented by gaze-aided input [9, 10]). Scenario conditions will also be varied between basic path-finding AI and human actors.

6 Experimental Task Design

Each participant will be provided with a curated scenario consisting of a list of actions and commands. These commands will require the participant to utilise all of the features of the toolset. Their performance on the task (completion time, errors, etc.), their self-evaluation (questionnaire) and observations will be captured. The participant will be required to utilise the toolset in a live training scenario consisting of a simplified environment comprising multiple floors (decks) and interactive items. The participant (assuming the role of the instructor) will be required to direct virtual actors (students) to do certain tasks. These actions will include:

  • Selecting one or more users

  • Locating a particular student

  • Directing students to locations or interactive objects via waypointing or signposting

  • Introducing obstructions and interfering with scenarios (e.g. emergency situations)

Example command statements include “Direct P1 to Console B” and “Direct P1 and P2 to Compartment A”. Initial experimentation will be implemented using NPC agents with basic path-finding AI (see the sketch below). The experiments will then be replicated with human actors who are instructed to follow a strict set of rules, including only changing behaviour when directed by the participant. The ‘human’ students can be a mixture of PC and VR users. The participant will be scored on their response times and the number of errors made. Eye-tracking will be enabled in the VR conditions for analytical purposes, regardless of whether gaze-aided interaction is in use. Participants will be issued with questionnaires and encouraged to give open feedback on their experience with the interface.
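The following Unity C# sketch shows how a command such as “Direct P1 to Console B” might drive an NPC student using Unity’s built-in NavMesh path-finding. The object names and command format are illustrative; only the NavMesh calls are standard Unity API.

    using UnityEngine;
    using UnityEngine.AI;

    public class CommandDispatcher : MonoBehaviour
    {
        // Issue a move order: agentName is the student NPC, targetName the destination.
        public void Direct(string agentName, string targetName)
        {
            GameObject student = GameObject.Find(agentName);   // e.g. "P1"
            GameObject target  = GameObject.Find(targetName);  // e.g. "Console B"
            if (student == null || target == null) return;

            NavMeshAgent agent = student.GetComponent<NavMeshAgent>();
            if (agent != null)
            {
                // Basic path-finding: the agent plans a route over the baked NavMesh.
                agent.SetDestination(target.transform.position);
            }
        }
    }

    // Usage (e.g. from the instructor toolset):
    //   dispatcher.Direct("P1", "Console B");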

7 Benefit for Instructors

The ENGAGE Instructor Toolset is designed to allow for an intuitive, novel and naturalistic way of communicating with multiple virtual students. The toolset itself aims to make it simpler to facilitate remote learning, training and assessment. By mixing immersive technologies with well-established omniscient design principles (‘God-game’ style applications), the format of the instructor application should be familiar to use. This allows for enhanced instructor-student interaction through the use of naturalistic VR interaction. COTS VR HMDs have matured to a stage where naturalistic interactions can be implemented without the use of peripherals or large design challenges. VR provides an ideal space for visualising a large amount of statistical information while directly observing the actions and performance of multiple students. The interface is implemented in such a way that it should require little training and offer a low barrier to entry, so that the instructor is not burdened. The interaction methods outlined in this paper aim to provide the instructor with all of the required interactive functionality without overloading them with a variety of tools and functions. The ability to move around freely and use natural gestures and distal pointing to manipulate the environment (as opposed to mouse and keyboard) should provide a positive experience for instructor and student alike.


References

[1] R. Horst, R. Dorner, M. Peter, ‘VR-Guide: A Specific User Role for Asymmetric Virtual Reality Setups in Distributed Virtual Reality Applications’, Group VR/AR Conference, (2018)

[2] L. A. Thomsen, N. C. Nilsson, R. Nordahl, B. Lohmann, ‘Asymmetric collaboration in virtual reality’, (2019)

[3] Final Assault VR, https://www.finalassaultvr.com/

[4] Star Trek Bridge Crew, https://www.ubisoft.com/en-gb/game/star-trek-bridge-crew/

[5] H. Xia, S. Herscher, K. Perlin, D. Wigdor, ‘Spacetime: Enabling Fluid Individual and Collaborative Editing in Virtual Reality’, The 31st Annual ACM Symposium on User Interface Software and Technology (UIST), (2018)

[6] R. Kopper, D. A. Bowman, M. G. Silva, R. P. McMahan, ‘A human motor behavior model for distal pointing tasks’, International Journal of Human-Computer Studies, 68, 603-615 (2010)

[7] A. T. Duchowski, ‘Gaze-based Interaction: A 30 Year Retrospective’, Computers & Graphics, 73, 59-69 (2018)

[8] P. Mohan, W. B. Goh, C. W. Fu, S. K. Yeung, ‘DualGaze: Addressing the Midas Touch Problem in Gaze Mediated VR Interaction’, IEEE International Symposium on Mixed and Augmented Reality Adjunct, (2018)

[9] K. Pfeuffer, B. Mayer, D. Mardanbegi, H. Gellersen, ‘Gaze + pinch interaction in virtual reality’, The 5th ACM Symposium on Spatial User Interaction, (2017)

[10] I. Chatterjee, R. Xiao, C. Harrison, ‘Gaze+Gesture: Expressive, Precise and Targeted Free-Space Interactions’, (2015)

[11] S. C. Mallam, S. Nazir, S. K. Renganayagalu, J. Ernstsen, S. Veie, A. E. Edwinson, ‘Design of Experiment Comparing Users of Virtual Reality Head-Mounted Displays and Desktop Computers’, 20th Congress of the International Ergonomics Association (IEA 2018), Volume V: Human Simulation and Virtual Environments, Work With Computing Systems (WWCS), Process Control, (2018)

Author/Speaker Biographies

Dr. Hashim Khalid Yaqub has been developing VR and AR applications at BMT for six years. During this time he completed his EngD, where he researched and proposed methods for reducing simulator sickness in head-mounted VR. He specialises in using AR/VR for training as well as for designing and prototyping.