Avatar Embodiment Simulation for VR Research

At a glance

I built an experiment-ready VR application in Unity + C#, designed to investigate how environmental cues affect a user’s sense of embodiment and behavior. My role was translating a research idea into a robust, repeatable software system that could run multi-stage experiences, capture detailed behavioral data, and operate reliably with VR hardware constraints.

Role: Software Developer | Computer Graphics Independent Study
Tools: Unity 2022, C#, OpenXR, SteamVR, HTC Vive, CSV telemetry pipeline
Timeline: Jan 2024 – May 2024
Advisor: Dr. Tabitha Peck

Problem / Objective

Traditional user experience work focuses on screen flows and usability, but immersive systems introduce new UX engineering challenges: real-time spatial feedback, continuous user motion, and calibrated interaction with environments.

The goal was to build:

  1. A guided multi-stage experience with clear interaction cues and task progression, and
  2. A consistent measurement infrastructure to support downstream analysis of user behavior.

This meant designing software that handled VR hardware integration, interaction logic, telemetry, and repeatable session workflows — all while maintaining an experience that felt coherent and usable in VR.

Design & Development Approach

1. Defining the Flow & Environment Variants

  • Built a user experience with three logical stages: a mirror embodiment exercise, a bar counter interaction, and a dance floor free-form section — each with clearly defined transitions and interaction affordances.
  • Architected the code to support two environment variants with controlled visual differences (inclusive vs. neutral), enabling direct comparisons between conditions without altering the experience structure.
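The staged flow and variant swap can be sketched as a small state machine. Names below (StageController, EnvironmentVariant, the prop fields) are illustrative, not the project's actual identifiers:

```csharp
using UnityEngine;

// Illustrative sketch of the staged session flow (names are hypothetical).
public enum Stage { Bathroom, Bar, DanceFloor, Complete }
public enum EnvironmentVariant { Inclusive, Neutral }

public class StageController : MonoBehaviour
{
    public EnvironmentVariant variant;   // chosen at session start
    public GameObject inclusiveProps;    // variant-specific decor
    public GameObject neutralProps;

    private Stage current = Stage.Bathroom;

    void Start()
    {
        // Swap only the variant-specific content; the stage pipeline is identical.
        inclusiveProps.SetActive(variant == EnvironmentVariant.Inclusive);
        neutralProps.SetActive(variant == EnvironmentVariant.Neutral);
    }

    // Called by each stage's completion trigger.
    public void AdvanceStage()
    {
        current = current switch
        {
            Stage.Bathroom => Stage.Bar,
            Stage.Bar      => Stage.DanceFloor,
            _              => Stage.Complete,
        };
        Debug.Log($"stage_transition -> {current}");
    }
}
```

Keeping the variant difference to a single content switch is what lets both conditions share one stage pipeline.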

2. Interaction & Runtime Engineering

  • Implemented VR interactions using Unity’s C# APIs with both OpenXR and SteamVR support.
  • Faced and resolved hardware/runtime interoperability issues: full-body tracking required SteamVR rigs that were incompatible with OpenXR input paths, so I shipped a procedural walking solution that maintained consistent navigation while avoiding headset input conflicts.
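One way to sketch the procedural walking workaround, assuming a Unity CharacterController and a generic input axis (the actual input wiring depended on the rig in use):

```csharp
using UnityEngine;

// Illustrative sketch: thumbstick-driven locomotion oriented by the headset's
// yaw, so walking stays consistent without full-body trackers.
[RequireComponent(typeof(CharacterController))]
public class ProceduralWalk : MonoBehaviour
{
    public Transform hmd;            // headset transform (e.g. the XR camera)
    public float walkSpeed = 1.4f;   // comfortable walking pace in m/s

    private CharacterController _controller;

    void Awake() => _controller = GetComponent<CharacterController>();

    void Update()
    {
        // Input.GetAxis stands in for whatever axis the XR rig exposes.
        Vector2 stick = new Vector2(Input.GetAxis("Horizontal"), Input.GetAxis("Vertical"));
        if (stick.sqrMagnitude < 0.01f) return; // dead zone

        // Project the headset's facing direction onto the floor plane so that
        // "forward" always matches where the participant is looking.
        Vector3 forward = Vector3.ProjectOnPlane(hmd.forward, Vector3.up).normalized;
        Vector3 right = Vector3.Cross(Vector3.up, forward);
        Vector3 move = (forward * stick.y + right * stick.x) * walkSpeed;

        _controller.SimpleMove(move);
    }
}
```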

3. Telemetry & Data Instrumentation

  • Defined a telemetry schema for proximity, gaze targets, timing, and head-mounted display transforms.
  • Built real-time capture routines that logged these signals to structured CSV datasets for later analysis — ensuring each session could be replayed or processed for behavioral insights.
Telemetry output (CSV snippet)
Example rows showing session IDs, variant, stage markers, and sampled behavioral signals (RecordData → CSV export).

timestamp_ms,session_id,variant_id,stage_id,hmd_pos_x,hmd_pos_y,hmd_pos_z,hmd_yaw_deg,gaze_target_id,npc_proximity_m,zone_id,event_marker
124350,P014_A,GSD,1_bathroom,0.21,1.63,-0.88,172.4,mirror,3.42,bathroom,stage_start
124400,P014_A,GSD,2_bar,1.92,1.62,-2.14,96.1,npc_bartender,1.08,bar_counter,entered_zone
124450,P014_A,GSD,3_dance,3.14,1.62,-3.05,43.7,signage_exit,2.35,dance_floor,stage_transition
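A row format and writer along these lines would produce the snippet above; this is a hypothetical sketch, not the project's actual RecordData class (names and signatures are assumptions):

```csharp
using System;
using System.Globalization;
using System.IO;

// Sketch of the telemetry logging path (names are illustrative).
public sealed class TelemetryRow
{
    public long TimestampMs; public string SessionId; public string VariantId;
    public string StageId; public float X, Y, Z, YawDeg;
    public string GazeTargetId; public float NpcProximityM;
    public string ZoneId; public string EventMarker;

    public TelemetryRow(long t, string session, string variant, string stage,
                        float x, float y, float z, float yaw,
                        string gaze, float proximity, string zone, string marker)
    {
        TimestampMs = t; SessionId = session; VariantId = variant; StageId = stage;
        X = x; Y = y; Z = z; YawDeg = yaw; GazeTargetId = gaze;
        NpcProximityM = proximity; ZoneId = zone; EventMarker = marker;
    }

    // Invariant culture keeps decimal separators stable across machines.
    public string ToCsv() => string.Join(",",
        TimestampMs.ToString(CultureInfo.InvariantCulture), SessionId, VariantId, StageId,
        X.ToString(CultureInfo.InvariantCulture), Y.ToString(CultureInfo.InvariantCulture),
        Z.ToString(CultureInfo.InvariantCulture), YawDeg.ToString(CultureInfo.InvariantCulture),
        GazeTargetId, NpcProximityM.ToString(CultureInfo.InvariantCulture),
        ZoneId, EventMarker);
}

public sealed class TelemetryLogger : IDisposable
{
    private readonly StreamWriter _writer;

    public TelemetryLogger(string path)
    {
        _writer = new StreamWriter(path);
        _writer.WriteLine("timestamp_ms,session_id,variant_id,stage_id,hmd_pos_x,hmd_pos_y," +
                          "hmd_pos_z,hmd_yaw_deg,gaze_target_id,npc_proximity_m,zone_id,event_marker");
    }

    public void Log(TelemetryRow row) => _writer.WriteLine(row.ToCsv());
    public void Dispose() => _writer.Dispose();
}
```

A stable header written once per session file is what makes the logs independently analyzable later.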

4. Operational Runbook & Calibration Workflow

  • Produced a detailed session runbook covering hardware setup, environment selection, participant height calibration, user ID capture, and telemetry export steps.
  • Programmed responsive UI prompts for calibration inputs that inform avatar scaling and comfort levels.
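The height calibration behind that UI reduces to a uniform scale factor. The helper below is a hypothetical pure-function sketch; the rig height and clamp range are assumptions, not the project's actual values:

```csharp
using System;

// Hypothetical sketch: map a participant's measured height to a uniform
// avatar scale factor, clamped to avoid extreme proportions.
public static class HeightCalibration
{
    // Height the avatar rig was authored at (assumed value).
    public const float AvatarRigHeightM = 1.75f;

    public static float ScaleFor(float participantHeightM)
    {
        if (participantHeightM <= 0f)
            throw new ArgumentOutOfRangeException(nameof(participantHeightM));
        float scale = participantHeightM / AvatarRigHeightM;
        return Math.Clamp(scale, 0.8f, 1.2f); // keep proportions plausible
    }
}
```

At session start the calibration UI would feed the entered height into something like `avatarRoot.localScale = Vector3.one * HeightCalibration.ScaleFor(height)`.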

By pairing UX goals with software engineering rigor, the platform became operationally viable for controlled sessions.

Architecture

Flow & Environment Variants
Staged session runner plus a content switch for GSD-friendly vs. Neutral environments: a stage controller and variant loader handle scene routing across the bathroom, bar, and dance floor stages.

Interaction & Runtime Engineering
XR input, locomotion, and interaction logic, designed to stay consistent even under runtime constraints: OpenXR input, SteamVR tracking, procedural walking, grab/select interactions, NPC proximity, and look targets.

Telemetry & Data Instrumentation
Samples behavioral signals and writes structured session logs for analysis and replay: a RecordData schema with stage markers and session IDs, plus a CSV writer handling local save and export/upload.
Pillar 1: Flow
  • State machine per stage
  • Variant swap without changing pipeline
  • Consistent boundaries across runs
Pillar 2: Runtime
  • XR input abstraction
  • Locomotion compatibility fixes
  • Interaction rules & affordances
Pillar 3: Telemetry
  • Stable schema + stage markers
  • Sampling cadence (Hz)
  • Session IDs + file naming
Pillar 4: Ops
  • Calibration UI (height → avatar scale)
  • Runbook-driven setup
  • Repeatability & reduced operator error
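The sampling cadence noted under Pillar 3 can be sketched as a coroutine-driven loop. The rate and names here are assumptions (the writeup does not specify the actual Hz), and a real implementation would emit a full telemetry row rather than a log line:

```csharp
using System.Collections;
using UnityEngine;

// Illustrative sketch of a fixed-cadence sampler (rate and names assumed).
public class TelemetrySampler : MonoBehaviour
{
    public float sampleHz = 20f;   // sampling cadence in Hz
    public Transform hmd;          // headset transform to sample

    void Start() => StartCoroutine(SampleLoop());

    private IEnumerator SampleLoop()
    {
        var wait = new WaitForSeconds(1f / sampleHz);
        while (true)
        {
            // Placeholder for building and logging a full telemetry row.
            Vector3 p = hmd.position;
            Debug.Log($"{Time.time * 1000f:F0},{p.x},{p.y},{p.z}");
            yield return wait;
        }
    }
}
```

Decoupling the sampling rate from the frame rate keeps row spacing comparable across sessions even when rendering load varies.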

Implementation Highlights

  • Unity 2022 & XR Integration: Modular scene controllers and interaction handlers to support dynamic environment variants.
  • Telemetry Pipeline: Custom logging class that writes consistent CSV entries during runtime events.
  • Calibration UI: Lightweight in-scene UI for height/ID that updates avatar parameters at session start.
  • Procedural Motion: Algorithmic walking solution to maintain physical navigation without full body trackers.

Outcome & Next Steps

This project resulted in a complete, working VR application that:

  • Is ready for participant data collection without further engineering refactors.
  • Captures aligned behavioral telemetry streams that can be analyzed independently.
  • Has a repeatable session workflow that can be run on HTC Vive hardware with minimal setup.

The project is a complete interactive system, built to meet usability and engineering constraints and ready to plug into larger research pipelines.