IT2EC 2020 1-6-19 DRAFT HoloRange AR Sandtable Prototype Lessons Learned Enhancing Training Through xR

HoloRange AR Sandtable Prototype Lessons Learned

Daniel J. Lacks, PhD1, Rodney Choi2

1Chief Scientist, Cole Engineering Services, Inc., Orlando, Florida, USA
2Program Manager, Cole Engineering Services, Inc., Orlando, Florida, USA

Abstract — The US Marine Corps is building a world-class wargaming facility located at Headquarters, Quantico, Virginia. Commercial innovation is driving the vision to bring useful, modern, world-class technologies into the wargaming environment. Included is a desire to build an affordable Holodeck-like experience that immerses wargamers in high-risk/high-security environments and combines live and table-top wargaming with MR/AR/VR solutions. Cole Engineering Services, Inc. (CESI) prototyped M&S and data visualization, collaboration, grease pen, and other features using Magic Leap One devices for this effort. This work shares CESI's lessons learned from the industry perspective: what was accomplished, and how M&S industry partners can build AR wargaming solutions. New and creative possibilities unique to an AR solution show promise for creating a new type of Image Generator (IG) system, while variations on the familiar challenges of terrain correlation and interoperability require more investigation.

1 Background

The US Marine Corps is building a world-class wargaming facility located aboard Marine Corps Base Quantico, Virginia. Commercial innovation is driving the vision to bring useful, modern, world-class technologies into the wargaming environment. This innovation exploration and prototyping includes data analytics, artificial intelligence, collaboration, integration, employing standards, and eXtended Reality (XR) technologies such as Mixed Reality (MR), Augmented Reality (AR), and Virtual Reality (VR). The vision includes building an affordable Holodeck-like experience to immerse wargamers in high-risk/high-security environments and to combine live and table-top wargaming with XR solutions.

Cole Engineering Services, Inc. (CESI) prototyped Modeling and Simulation (M&S) and data visualization, collaboration, grease pen, and other features using the Unity game engine and Magic Leap One devices for this effort, called the HoloRange. The HoloRange prototype is a modern implementation of a digital sand table – a digital twin of the battlefield. AR creates a 3D map which may be superimposed anywhere in the real world, including a tabletop, the floor, a wall, or at angles. Users collaborate by sharing a common view of the map, icons, events, and other visual artifacts. Users may also work offline in their own map, local to their AR device, and then rejoin the collaborative game at any time. Such a feature prevents crowding around the table and bending in awkward positions to see finer details.

The grease pen feature allows users to draw or erase freehand drawings on invisible planes parallel to the user. The pen may be extended, retracted, or snapped to the map. The data visualization feature included charts and graphs. The digital nature of the AR solution allows users to explore the map and visualize simulations at a resolution comfortable for their eyesight without knocking over pieces or messing up a physical sand table. These AR features show potential for planning, execution, and analysis wargaming activities.

While the focus of the prototype effort was on wargaming use cases, we anticipate the technology suits other use cases such as training, planning, and mission rehearsal.

2 Augmented Reality Research

In order to build a usable solution, it is important to understand the technology and the value it provides.

2.1 Prior Research

One example of similar prior research is the Augmented REality Sandtable (ARES) from the (former) Human Research and Engineering Directorate (now known as Combat Capabilities Development Command Soldier Center), US Army Research Labs [1]. The research focuses on creating a digital battlespace visualization at the point of need. The improvements claimed by the research include improved spatial awareness, situational awareness, distributed collaboration, 3D terrain and scenario authoring, and applicability to various US Army Warfighting Challenges (as of 2017). These improvements are further detailed to explain important findings:

  • Representation of 3D space is improved when projected in 3D rather than on a 2D monitor.
  • Methods to depict a User-Defined Operational Picture (UDOP [2]) are different in AR compared to traditional means and need more investigation.
  • Mobile AR devices bring digitized sandtables to the point of need, from classroom to battlefield.
  • Data visualization needs require more research, but the AR platform is unique and provides a mechanism that a traditional sandtable cannot.



2.2 The AR Platform

The state-of-the-art AR devices of 2019 are still in their infancy compared to their 2D and VR counterparts, with improvements to field of view and graphics anticipated in the coming years. It was important to our approach to develop software that may be ported not only between emerging AR platforms, but also to other XR devices, since an industry-prevalent and authorized secure solution does not yet exist. This allows for flexibility in the software if the AR platform does not pan out, and it also allows different XR solutions to interoperate with the AR solution. Thus, the particular platform was not as important to us at the time as the software design methodology to use it. We selected the Magic Leap One Creator Edition [3] device for this prototype because it was perceived at the time to have the best ability to network to other devices, the largest field of view, and the best hardware human factors considerations. We chose the Unity gaming platform to build the software so that it may be used on the Magic Leap One, Microsoft HoloLens, and other devices anticipated in the future. The software development experience is like that of an Android smartphone; the Magic Leap operating system, Lumin, is an Android variant. The Magic Leap One has the following features:

  • Lightwear (the headset) includes a digital camera; speech-to-text audio input; onboard speakers; and an optical relay. The computer vision maps the user's space to represent digital objects in the space. The spatial audio depicts sounds at correct distances and locations.
  • Lightpack (the wearable computer) includes a combination of six CPU cores and 256 GPU cores; 8 GB of memory; 128 GB of storage capacity; WiFi and Bluetooth networking; and a three-hour battery life.
  • Control (the handheld controller) includes six Degrees of Freedom (DoF) sensors; touch sensitivity; 7.5 hours of continuous use; and haptics.

2.3 Spatial Computing

Magic Leap's ability to map 3D space to assist with digital visualization is an impressive feature. When donning the device in a new physical space, it asks the user to scan the room so that it may build a digital representation of the room, including the floor, ceiling, walls, tables, chairs, etc., called a Guided Mesh (Fig. 1) [4].

Fig. 1. A sample guided mesh helps to digitize surfaces.

The technique used to create the guided mesh is used to combine real-world surfaces with digital animations, such as a digital whale jumping from a gym floor or a digital rocket ship crashing into a wall. The technique creates hard-coded waypoints representing the depth of the room based on the 3D origin where the player stands. The accuracy of the guided mesh is impressive; however, it has documented shortfalls such as gauging mirrors, windows, and dark surfaces. Leveraging the guided mesh also requires coding specific to the Magic Leap One, which conflicts with the portability design principle we are enforcing. Also, it takes time to scan the room, which in our opinion limits the ease of use, imposes constraints on collaboration since people move around, and constrains rapid usability at the point of need. The sandtable use case is unique among general AR problems because we can assume the surface to project the sandtable on is placed in a common area, clear of obstructions, and perceived as flat – even if it is not. Our solution was instead to scan a QR code to establish the same point of origin, but instead of placing that point relative to where the user is standing, it is placed in a corner of the sandtable. This technique allows us to orient the sandtable identically for each participant and only requires a few seconds to set up. It is also conducive to deploying at the point of need, since users can place their sandtables remotely in a suitable location.

Fig. 2. A QR code defines the sandtable origin in the corner.
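The QR-code origin technique amounts to a simple change of basis: each device detects the code's pose in its own private tracking frame, and all map content is then expressed relative to that pose, so every participant resolves the same physical point to the same sandtable coordinates. The following is a minimal sketch of that idea only; the function name, the y-up axis convention, and the yaw-only rotation are illustrative assumptions, not the actual HoloRange code (real AR SDKs supply the marker pose through their own APIs).

```python
import numpy as np

def world_to_sandtable(marker_position, marker_yaw_rad, point_world):
    """Express a point from a device's private tracking frame in the
    shared sandtable frame anchored at the QR code's corner.

    marker_position: (3,) position of the QR code in the device frame.
    marker_yaw_rad:  rotation of the QR code about the vertical (y) axis.
    point_world:     (3,) point in the device's tracking frame.
    (Names and axis conventions are illustrative assumptions.)
    """
    c, s = np.cos(-marker_yaw_rad), np.sin(-marker_yaw_rad)
    # Undo the marker's yaw after translating so the marker is the origin.
    rot = np.array([[c, 0.0, s],
                    [0.0, 1.0, 0.0],
                    [-s, 0.0, c]])
    return rot @ (np.asarray(point_world) - np.asarray(marker_position))
```

Because each headset applies this transform against the marker pose it observed, two devices that scan the same printed code agree on the map's origin and orientation without ever sharing their private tracking frames.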

3 Digital Twin Design Approach

Creating a digital twin of the real world in XR requires digitization of terrain, people, platforms, tactics, behaviors, weather effects, cyber effects, and more. As envisioned for the prototype, the digital twin is a representation of anticipated, historical, or hypothetical wargaming or training situations. We could not afford, nor did we want, to design these solutions directly into the HoloRange prototype. Thus, we needed to leverage crucial, existing, and interchangeable technologies; those most needed to achieve success are the terrain and the simulation. This led us to an important design principle decision: to write software in the style of an Image Generation (IG) device. The HoloRange does not simulate, predict, or compute tactical outcomes; it merely represents those depictions, similar to how a monitor shows graphics on a computer or how an IG device shows the airspace to a pilot in a flight simulator. Thus, the HoloRange is loosely coupled to the terrain via a conversion process and to the simulation via a network bus architecture. Another important design principle is to use international standards for building a solution that spans multiple use cases.
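The IG design principle above can be made concrete with a small sketch: a display-only client that mirrors entity states arriving on the network bus and contains no tactical logic of its own. The class and message names here are illustrative assumptions, not the actual HoloRange or MSSV interfaces.

```python
from dataclasses import dataclass

@dataclass
class EntityState:
    """One entity update as it might arrive over the bus (illustrative)."""
    entity_id: str
    symbol_code: str   # e.g., a MIL-STD-2525D symbol identification code
    lat: float
    lon: float
    heading_deg: float

class SandtableRenderer:
    """Image-generator-style client: it holds no simulation or tactical
    logic and simply mirrors whatever entity states the bus delivers."""

    def __init__(self):
        self.entities = {}

    def on_entity_state(self, msg: EntityState):
        # Render-side update only; outcomes are computed by the simulation.
        self.entities[msg.entity_id] = msg

    def on_entity_removed(self, entity_id: str):
        self.entities.pop(entity_id, None)
```

The payoff of this shape is the loose coupling described above: swapping OneSAF for another simulator changes what publishes to the bus, not the renderer.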



3.1 Terrain

Terrain for the AR prototype was generated using CESI's Key Terrain online service [5]. Key Terrain allows users to draw areas of interest on a web-enabled map, pull open-source terrain information from the Internet, and convert it to the Filmbox (FBX) format required by Unity. Key Terrain was also used to convert terrain into the Objective Terrain Format (OTF) used by the One Semi-Automated Forces (OneSAF) simulator and others. That approach provided cost-effective and correlated terrain for our prototype.

3.2 Simulation

The HoloRange prototype is driven by the US Army open-source OneSAF simulation framework feeding information over the CESI MSSV ("massive") infrastructure. The MSSV infrastructure provides the network bus transport and has the ability for a client simulator to specify commands to display on a menu, which is unique among existing standards. OneSAF provides the models and interactions to represent and animate the MIL-STD-2525D icons on the map. CESI prototyped a symbol web server to host Spatial Illusions' milsymbol and OneSAF SVG icons. The MSSV data model includes entity state and interactions but is unique because it allows simulators to specify how client menus and options appear to the user. This reduces the amount of customized interactions since each simulator can ask for and receive the information it needs to, for example, create a movement order. An interoperability challenge was that MSSV, also currently a prototype under development, has a limited data model. Workarounds were needed to interact with OneSAF's user data gateway. A lesson learned is that the wargaming use cases need to inform MSSV and other data model developments to include data collaboration, planning, runtime, data analytics, and tech control features.

3.3 Standards

One standard leveraged was the MIL-STD-2525D symbology standard [6]. The standard allows us to future-proof the HoloRange, since few currently developed solutions use the newer 2525D variant of the standard. The approach allows us to build board-game-styled tiles, which have a lower polygon count compared to high-resolution 3D platform and lifeform models; this improves the performance and responsiveness of the AR experience. The approach guided us to prototype a reusable Symbol Service to host MIL-STD-2525C and D symbols in addition to OneSAF and other symbols. By providing a URL with the symbol code, the user receives the symbol data to display, which may be in the form of a PNG, JPG, SVG, FBX, or another supported format, varying based on the symbol requested. The MIL-STD-2525C and D symbols are provided by the Spatial Illusions open-source milsymbol project [7].
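The Symbol Service pattern described in 3.3 amounts to a keyed lookup: the client supplies a symbol code in a URL and receives whatever rendition the service holds for it. A minimal sketch of the client side follows; the base URL, path, query parameters, and the example symbol code are illustrative assumptions, not the actual CESI interface.

```python
from urllib.parse import urlencode

def symbol_url(base, sidc, fmt="svg", size=64):
    """Build a request URL for a symbol by its MIL-STD-2525 code.
    The path and query parameters here are assumptions for illustration;
    per the paper, the returned format varies by the symbol requested."""
    return f"{base}/symbol/{sidc}?" + urlencode({"format": fmt, "size": size})

# A client would fetch the URL and dispatch on the returned content type,
# e.g. SVG/PNG for flat board-game tiles or FBX for 3D models.
url = symbol_url("http://symbols.local", "10031000141211000000")
```

Centralizing the lookup this way is what lets one service back both 2525C/D renderings (via milsymbol) and OneSAF or other symbol sets behind the same URL scheme.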

4 HoloRange Features

The HoloRange features explored usability, applicability, and ease of use. The default view is the collaborative sandtable map locked onto the table, floor, or wall, as oriented by scanning a QR code wherever it is placed. The size of the map can range from the size of a small table to a very large floor; the boundary was not calculated for the prototype but is anticipated to cover a large gymnasium floor. While the collaborative view allows multiple participants using the AR technology to look at the same map fixed in space at the same time, the user can also bring the map into a personal view for zooming and panning without affecting the collaborative game. Collaboration features include the ability to place virtual sticky notes to annotate the map, a grease pen to highlight or draw graphics in 3D space or snapped to the ground (Fig. 3), billboards to visualize data and other information, and depicting MIL-STD-2525D symbology, which is familiar to the anticipated audience. One user can present the sandtable to other non-AR-wearing participants by broadcasting their headset to a television. We also experimented with 3D symbols and explored approaches to lower the polygon count for performance. Users may order the simulated units to move and shoot.

Fig. 3. (Clockwise starting from the top-left) 3D animated models, MIL-STD-2525D symbols, artillery ambush, and grease pen with drawing wand.

Each of these features was tested at a local site and a remote site bridged by Cisco 892FSP Layer Two Tunneling Protocol Version 3 (L2TPv3) encryption devices. This allowed the remote participant to simply don the AR gear, scan the QR code, and then join the game without any complicated or onerous networking skills. Thus, the point-of-need user has the same AR setup experience as the local participant.

5 Lessons Learned

5.1 Technical Lessons

The HoloRange solution needs to improve upon terrain management. The device has memory limitations in holding terrain data and limitations in displaying large


areas. Compiling and loading data onto the Magic Leap One takes a long time, since the terrain needs to be packaged and uploaded to the device. A better approach may be to stream terrain data during runtime, which we plan to investigate in the future. This would allow us to show more terrain areas and levels of detail.

One thing not anticipated was the impact that the newness of AR technologies has on the usability and portability of existing common game engine features and capabilities. A developer may be spoiled by the breadth and depth of features built into Unity or available for Unity via plugins on the Internet for gaming applications. However, those features are not necessarily AR, XR, or Magic Leap compatible. We had to develop many typically taken-for-granted features from scratch, such as menus, terrain usability, and networking, as well as having to rethink approaches to performance in a traditional 3D game compared to AR. For example, is a high-resolution helicopter rotor animation with shadows and other visual effects necessary for AR? While it is effortless on a modern GPU, running these animations in AR is equivalent to running on an Android device with hardware limitations designed to extend battery life, which does not scale to the extent that M&S applications demand.

The Magic Leap One Control simplifies user interaction into the form factor of a 3D mouse. Since we did not want to use Magic Leap-specific features for the prototype, in order to retain the ability to port our software between devices, we were not able to use advanced features such as speech-to-text and finger motions. Human factors became an important and time-consuming challenge, though the trade-off in our opinion was worth it to devise new ways to intuitively control the HoloRange with such a minimal set of controls. One example of creating intuitive controls was not to reinvent the controller's use as established in other applications. The bumper button is commonly used to exit menus in other apps, so it is used similarly in the HoloRange. Likewise, navigating the custom-built menus used the control wheel in a familiar fashion.

Challenges with terrain correlation include mapping geocentric 64-bit coordinates to Unity's 32-bit coordinate system and the Magic Leap's screen coordinates. Icons in AR display slightly offset from OneSAF locations; solutions are actively being explored. While work was done to minimize the polygon count for performance, we also need the ability to scale the terrain in a manner that provides more coverage and detail on demand by streaming the terrain.

Battery life is problematic. Ideally, the devices would have a mechanism for "hot swapping" batteries.

5.2 Application Lessons

There is a cultural hurdle in incorporating XR technologies. Military personnel, particularly senior personnel, are reluctant to put themselves in situations where they may appear foolish. The vision restrictions of VR devices and the awkward gestures associated with many XR interfaces present an obstacle to adoption. Adoption of XR for training will require reduction of these obstacles and delivery of sufficient value to overcome these concerns.

In their present state, XR technologies are most useful for viewing the situation rather than controlling a scenario. The interfaces are limited and often cumbersome. The primary value of XR is the three-dimensional view; this is what must be exploited. The value of AR over VR is found primarily in the ability of the wearer to function in the environment rather than in direct collaboration (i.e., being able to physically point at things). The terrain placement is not sufficiently precise; a laser-pointer-like interface would be more useful. The devices need to accommodate users who wear glasses or provide some mechanism for adjusting focus. A mechanism is also needed for allowing others to see what a wearer is seeing; this is essential for troubleshooting and initial training of users. We implemented a multistep process for routing a display from the headset to a smartphone to a television.

The HoloRange prototype was developed primarily for use in wargaming. However, several military personnel who viewed the prototype noted applicability to training and operations. For training, the 3D view can accelerate understanding of complex battlefield geometry problems such as integration of supporting arms or phasing of amphibious operations. The technology supports planning, runtime, and after-action review activities. It may be used for Course of Action (COA) development, Rehearsal of Concept (ROC) drills, and a Common Operational Picture (COP) representation. For operations, the virtual sandtable provides an excellent mechanism for presenting operational COAs and other plans. As the technology becomes more portable and ruggedized, it could be used to deliver vastly improved orders briefings in the field.
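The 64-bit-to-32-bit coordinate challenge noted in 5.1 is commonly handled in game engines by origin rebasing: keep authoritative positions in double precision and hand the render engine only small offsets from a local origin chosen near the play area. This sketch shows the idea only; it is a common technique, not the HoloRange's actual (still-in-progress) solution, and the numbers are illustrative.

```python
def rebase(position_m, origin_m):
    """Split a double-precision geocentric position into a float32-safe
    offset from a local origin near the sandtable's area of interest.

    Offsets of a few kilometres preserve sub-millimetre precision in
    32-bit floats, whereas raw geocentric magnitudes (~6.4e6 m) leave
    only tens of centimetres of precision - enough to make icons
    visibly drift from their simulated locations.
    """
    return tuple(p - o for p, o in zip(position_m, origin_m))

# Illustrative geocentric values (metres), not real scenario data.
origin = (960_000.0, -5_601_000.0, 2_780_000.0)
entity = (960_123.456, -5_601_042.5, 2_780_007.25)
offset = rebase(entity, origin)  # small numbers, safe to hand to a 32-bit engine
```

The simulation side keeps working in full-precision geocentric coordinates; only the IG-side conversion changes, which fits the loose-coupling design principle of Section 3.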

6 Conclusion

Overall, the HoloRange prototype greatly assisted us in evaluating state-of-the-art AR technologies and orienting us toward further improvement in future iterations. Continuing the assumption that an approved AR device for military use will be produced in the future, our current results from experimenting with AR indicate that it and other XR platforms will provide a viable platform for creating a digital twin sandtable for wargaming, training, and other use cases.

The HoloRange brought forth the differences between, and the need to provide, both collaborative and individualized features. We intended to explore more UDOP features, but the timeframe of the prototype limited our ability to better explore integrating optional features or applying an XML standard like UDOP to assist in defining what those features are.

The MSSV infrastructure provided the ability to connect OneSAF to the HoloRange. However, it would have been more practical to interface the HoloRange with OneSAF using a standard such as the Distributed Interactive Simulation (DIS) or High Level Architecture (HLA) standard. Using DIS or HLA would be a turnkey


approach to providing legacy simulators with a modern AR IG. DIS specifically would have allowed us to effortlessly expand our simulation base from being only OneSAF-driven to including other simulators. The current DIS and HLA standards limit the control and display options for AR; for example, we would need to map DIS enumerations to something the Symbol Server can provide. In the case of HLA, however, we would be able to design a specific and reusable Base Object Model (BOM) data model which could be imported into any Federated Object Model (FOM), extending a federation to support unique AR features. That data model may later be incorporated into DIS, MSSV, or any other architecture.

XR provides a mechanism to share a common visualization in three dimensions. Our experience was that while the technology requires significant overall maturation, it is more than sufficient for providing an image generation system that offers an excellent blend of common and personalized viewing options.
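The enumeration-mapping gap mentioned above could be bridged on the IG side with a lookup table from DIS entity types to symbol codes the Symbol Server understands. The sketch below shows the shape of such a table only; the entity-type tuples and symbol codes are illustrative placeholders, not authoritative SISO-REF-010 or MIL-STD-2525D mappings.

```python
# Illustrative only: DIS entity-type tuples (kind, domain, country,
# category, subcategory, specific, extra) mapped to placeholder
# MIL-STD-2525D symbol identification codes.
DIS_TO_SIDC = {
    (1, 1, 225, 1, 1, 0, 0): "10031000001211020000",  # placeholder: ground platform
    (1, 2, 225, 1, 1, 0, 0): "10031000001101000000",  # placeholder: air platform
}

def sidc_for(entity_type, default="10011000000000000000"):
    """Resolve a DIS entity type to a symbol code, falling back to a
    generic 'unknown' placeholder when no mapping exists."""
    return DIS_TO_SIDC.get(tuple(entity_type), default)
```

Such a table would live entirely in the IG, keeping the DIS traffic itself unmodified and preserving the turnkey interoperability argued for above.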


References

[1] C.J. Garneau, M.W. Boyce, P.L. Shorter, N.L. Vey, C.R. Amburn, The Augmented Reality Sandtable (ARES) Research Strategy, ARL-TN-0875 (2018). https://apps.dtic.mil/dtic/tr/fulltext/u2/1048013.pdf

[2] S. Mulgund, S. Landsman, User Defined Operational Pictures for Tailored Situational Awareness (2007). http://www.dodccrp.org/events/12th_ICCRTS/CD/html/presentations/090.pdf

[3] Magic Leap Fact Sheet (2018). https://www.magicleap.com/static/magic-leap-fact-sheet.pdf

[4] J. Brodsky, Guided Meshing in Create (2019). https://www.magicleap.com/news/for-creators/guided-meshing-in-create

[5] Key Terrain. https://keyterrain.io

[6] Department of Defense Interface Standard – Joint Military Symbology, MIL-STD-2525D (2014). https://www.jcs.mil/Portals/36/Documents/Doctrine/Other_Pubs/ms_2525d.pdf

[7] Military Symbols (milsymbol). https://github.com/spatialillusions/milsymbol

Author/Speaker Biographies

Dr. Daniel Lacks is the Chief Scientist at CESI. He received a PhD in Computer Engineering in 2007 from the University of Central Florida. Dr. Lacks worked as a Subject Matter Expert on the USMC PM WGC prototype effort, focusing on systems engineering and scenario generation.

Rodney Choi is a retired U.S. Marine. He served as the CESI program manager for the wargaming prototype effort. He holds a B.S. in computer science from the U.S. Naval Academy and an M.S. in computer science from the U.S. Naval Postgraduate School.
