Relative Pose Data Augmentation Between Co-Located Tracked Devices in Virtual Environments

Technology #34272



Figure: Illustration of the cooperative motion capture (COMOCAP) concept. (a) Body-relative signaling in a multi-person setup: each person wears components that send/receive intra-user signals (red dashed) between a reference body point (the head or torso) and the moving limbs, while a centralized global system (green dashed) tracks each person's body relative to the environment. (b) The cooperative motion capture approach: transceivers (transmitter-receiver units) worn on each person's body send/receive signals (red dashed) to/from the transceivers of other people in a peer-to-peer fashion. As people move closer together, the inter-user signals increase in strength and robustness compared with conventional non-inertial approaches.
Researchers
Gregory Welch, Ph.D.
Gerd Bruder, Ph.D.
Patent Protection

US Patent Pending
Publications
A Novel Approach for Cooperative Motion Capture (COMOCAP)
International Conference on Artificial Reality and Telexistence, Eurographics Symposium on Virtual Environments (2018). https://doi.org/10.2312/egve.20181317

Key Points

  • System for augmenting relative pose data between co-located tracked devices in multi-user virtual reality (VR) or augmented reality (AR) environments
  • Provides dynamic cooperative relative tracking of devices such as head-mounted displays (HMDs) and handheld controllers
  • Ability to estimate the relative positions of multiple users/devices

Abstract

Researchers at the University of Central Florida have invented a better way to track objects such as head-mounted displays (HMDs) and handheld controllers of multiple users as they interact in a shared space. Virtual environment systems today encounter some amount of tracking error (static or dynamic) that can limit a system's usefulness, and when two or more users are tracked close together during a joint task, these errors compound. The UCF multi-participant tracking system combines and extends conventional global and body-relative approaches to "cooperatively" estimate the relative poses between all useful combinations of devices worn or held by two or more users. Example applications include hands-on training where medical professionals simulate surgical or trauma team activities, small military units in joint training exercises, or civilians in multi-user scenarios.

Technical Details

The UCF invention consists of systems and methods for tracking one user/object with respect to all others and the environment. Tracking technologies for interactive computer graphics (for example, virtual/augmented reality or related simulation, training, or practice) are used to estimate the posture and movement of humans and objects in a three-dimensional working volume. This is typically known as six-degree-of-freedom (6DOF) "pose" tracking (estimation of x, y, z positions and roll, pitch, yaw orientations).
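As a concrete illustration of 6DOF pose handling, the relative pose between two tracked devices can be derived from their poses in a shared global frame. This is a minimal sketch, not code from the patent: the homogeneous-matrix representation and the function names below are illustrative assumptions (yaw-only rotation keeps the example short; a full 6DOF pose would also include roll and pitch).

```python
import numpy as np

def pose_matrix(position, yaw):
    """Build a 4x4 homogeneous pose from a position and a yaw angle.
    (A full 6DOF pose would also carry roll and pitch.)"""
    c, s = np.cos(yaw), np.sin(yaw)
    T = np.eye(4)
    T[:3, :3] = [[c, -s, 0.0],
                 [s,  c, 0.0],
                 [0.0, 0.0, 1.0]]
    T[:3, 3] = position
    return T

def relative_pose(T_a, T_b):
    """Pose of device B expressed in device A's frame: T_A^-1 @ T_B."""
    return np.linalg.inv(T_a) @ T_b

# Example: device A faces 90 degrees left of the world x-axis,
# and device B sits one meter ahead of A along world x.
T_a = pose_matrix([0.0, 0.0, 0.0], np.pi / 2)
T_b = pose_matrix([1.0, 0.0, 0.0], 0.0)
T_ab = relative_pose(T_a, T_b)  # B lies one meter to A's right
```

In a cooperative setup, such matrix compositions let per-user estimates be cross-checked against directly measured inter-user signals rather than trusted in open loop.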

Compared with the UCF system, conventional technologies lack the following capabilities:

  • Feedback or cooperation between multiple user-worn systems. In conventional systems, the pose/posture of objects for an individual are estimated independently in an "open loop" fashion, so any virtual (VR or AR) entities that the users see are not necessarily aligned/registered with each other (not accurately collocated).
  • The ability to continue tracking blocked components. Most conventional systems require line-of-sight sensing. As multiple users move closer to each other, tracked components become occluded (blocked) by objects or by other users in the environment, causing tracking to fail.

The invention has the added benefit of allowing HMDs, handheld controllers, or other tracked devices to "ride out" periods of reduced or no observability by externally mounted tracking devices.
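The "ride out" idea can be sketched as follows: when a device's own external, line-of-sight tracking is blocked, its global pose can still be recovered by composing a still-visible peer's global pose with the most recent cooperatively measured relative pose. This is a minimal illustration under assumed 4x4 homogeneous-matrix poses; the function name is hypothetical.

```python
import numpy as np

def estimate_occluded_pose(T_peer_global, T_rel_peer_to_self):
    """Recover this device's global pose while its own external tracking
    is occluded, by chaining a visible peer's global pose with the last
    cooperatively measured relative pose (peer -> self).
    All poses are 4x4 homogeneous transforms."""
    return T_peer_global @ T_rel_peer_to_self

# Example: the peer is tracked at world position (1, 0, 0), and the last
# peer-to-self measurement placed this device one meter along the peer's
# y-axis; the occluded device is therefore estimated at (1, 1, 0).
T_peer = np.eye(4)
T_peer[:3, 3] = [1.0, 0.0, 0.0]
T_rel = np.eye(4)
T_rel[:3, 3] = [0.0, 1.0, 0.0]
T_self = estimate_occluded_pose(T_peer, T_rel)
```

In practice the stale relative measurement would be refreshed (or its uncertainty grown) as soon as inter-user signals return, but the composition above is the core of the fallback.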

Benefit

  • Leaves the underlying design or functions of head-mounted display systems unchanged
  • Enables discrete tracking of fine movements for training purposes
  • Can plug into conventional VR/AR systems

Market Application

  • VR/AR systems and simulations
  • Hybrid motion capture systems
  • Multi-user training (such as medical surgical or trauma teams, military units)