NextGen Badge

VR Training Platform for Minnesota's Law Enforcement Trainees

NextGen Badge is a VR training platform that immerses criminal justice students in realistic, emotionally charged scenarios like traffic stops and domestic disputes. Built in Unreal Engine 5, it creates a safe, repeatable space to test both technical skills and emotional resilience.

As the Product Designer (UX-focused), I led experience design, supported production, and collaborated closely with developers, 3D artists, and the PM from concept through ongoing development.

5-7 min read | June 2024 → June 2025* | Product Designer & TL | VR, Unreal Engine 5, Unity


TL;DR

Traditional police training focuses on procedure, not empathy. The University of Minnesota and Minneapolis PD wanted a way to teach emotional awareness and real-time decision-making through VR.

Solution:

We built NextGen Badge in Unreal Engine 5 for Oculus Quest 3 — an immersive training platform where students navigate high-pressure scenarios like traffic stops and domestic calls. I led UX design, scripted branching dialogues, defined interaction systems, and coordinated production across design, art, and development.

Result:

Delivered two fully interactive, performance-optimized VR scenarios with realistic dialogue, spatial audio, and emotional pacing. Early testing showed stronger engagement and improved decision awareness, guiding expansion for broader rollout in 2025.


Gameplay Snippet: Raycast Observation & UI Interactions in Police Training | Turn on sound for dialogues.

Context

Traditional police training focuses heavily on procedures but often overlooks emotional intelligence. The University of Minnesota and Minneapolis PD partnered with us to explore how VR could help bridge that gap.

I worked closely with the PM, client, 3D artists, and Unreal developers to:

  • Design the end-to-end in-headset experience.
  • Define UI systems for gaze, triggers, and onboarding.
  • Write branching dialogue scripts and produce storyboards.
  • Maintain design documentation and asset lists.
  • Direct visual tone and audio direction with the art and sound teams.

Team: Me (Product Design) · 2 Unreal Developers · 2 3D Artists · 1 Sound Designer

Approach

Scenario Design

We designed two key simulations — Traffic Stop and Domestic Call — built around empathy and decision-making under pressure. Working with law-enforcement SMEs ensured realism, accuracy, and authentic pacing.

Planning of the Domestic Call scenario · view img↗
Ideation, User Flow, and Interactions
Ideation process, user flows, and interaction patterns · view img↗

Script & Storyboarding

I turned branching dialogue scripts into interactive storyboards and Unreal Blueprints. Dialogue was paced to create emotional tension and reflection moments for trainees.
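The branching structure behind those storyboards can be modeled as a small graph of dialogue nodes, each carrying an NPC line plus the trainee responses that lead onward. A minimal standalone sketch of that idea (the node ids and lines below are illustrative, not the actual script, and the real logic lives in Unreal Blueprints rather than C++):

```cpp
#include <cassert>
#include <map>
#include <string>
#include <utility>
#include <vector>

// One beat of dialogue: the NPC line plus the trainee's possible responses.
struct DialogueNode {
    std::string line;
    // choice text -> id of the node that choice leads to
    std::vector<std::pair<std::string, std::string>> choices;
};

// Follow one path through the graph, collecting the lines spoken.
std::vector<std::string> PlayPath(
        const std::map<std::string, DialogueNode>& graph,
        const std::string& start,
        const std::vector<int>& picks) {
    std::vector<std::string> spoken;
    std::string current = start;
    size_t pick = 0;
    while (true) {
        const DialogueNode& node = graph.at(current);
        spoken.push_back(node.line);
        if (node.choices.empty() || pick >= picks.size()) break;
        current = node.choices[picks[pick++]].second;
    }
    return spoken;
}
```

Keeping each beat as a node with named exits is what let the same script double as a storyboard: every branch point in the document maps one-to-one onto a Blueprint transition.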

Early prototypes used AI-generated voice acting (ElevenLabs) for iteration, later replaced by professional recordings integrated into the same Blueprint system.

Interaction & UI

The interface was intentionally minimal. Users interacted through subtle gaze and ray-cast cues rather than menus, keeping focus on body language and tone.

Haptics were tied to feedback moments like prompts, item pickups, and emotional beats.
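Gaze cues of this kind typically use a dwell timer: the prompt fires only after the ray has rested on a target for a moment, which filters out accidental glances. A simplified, engine-agnostic sketch of that pattern (the class name and 0.5 s threshold are illustrative assumptions, not values from the project):

```cpp
#include <cassert>

// Accumulates gaze time on a target and fires a cue once a dwell
// threshold is crossed; resets when the gaze moves away.
class GazeDwellTrigger {
public:
    explicit GazeDwellTrigger(float dwellSeconds)
        : threshold_(dwellSeconds) {}

    // Call every frame. Returns true only on the single frame the cue fires.
    bool Tick(bool rayOnTarget, float deltaSeconds) {
        if (!rayOnTarget) {           // looked away: start over
            elapsed_ = 0.0f;
            fired_ = false;
            return false;
        }
        if (fired_) return false;     // already fired for this dwell
        elapsed_ += deltaSeconds;
        if (elapsed_ >= threshold_) {
            fired_ = true;
            return true;
        }
        return false;
    }

private:
    float threshold_;
    float elapsed_ = 0.0f;
    bool fired_ = false;
};
```

Firing once per dwell (rather than every frame past the threshold) is what keeps subtle cues subtle: the prompt appears, then gets out of the way.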

Interaction Design
Planning of the Interaction System · view img↗
Early wireframes and Scenario Selector screens · view img↗

Visual Direction

Every detail referenced Minnesota: lighting, weather, and props grounded the experience in a familiar but neutral environment.

Planning of the Traffic Stop scenario · view img↗
Wireframes and Figma designs · view img↗

Spatial Audio & Haptics

Spatialized dialogue, ambient soundtracks, and tactile cues deepened immersion. For instance, a buzzing phone or a raised voice triggered matching vibrations and localized sound.

Dialogue and Spatial Audio implementation · view img↗

Core Systems

VR Pawn & Player Interaction

Custom VR Pawn using Meta XR plugin for Quest 3, with ray-cast visual interactions and alerts.

Custom VR Pawn structure using the Meta XR plugin for Quest 3 · view img↗

NPC System + Randomizer

A Blueprint-based MetaHuman randomizer that shuffles NPC bodies, outfits, and voices for scenario variety.
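The core of such a randomizer is just an independent draw from each asset pool. A minimal C++ sketch of the same logic (the pool contents and function name are illustrative; the shipped version is Blueprint-based and drives MetaHuman assets):

```cpp
#include <cassert>
#include <random>
#include <string>
#include <vector>

// One randomized NPC configuration.
struct NpcVariant {
    std::string body;
    std::string outfit;
    std::string voice;
};

// Picks an independent random body/outfit/voice combination,
// mirroring the Blueprint randomizer's shuffle.
NpcVariant RandomizeNpc(const std::vector<std::string>& bodies,
                        const std::vector<std::string>& outfits,
                        const std::vector<std::string>& voices,
                        std::mt19937& rng) {
    auto pick = [&rng](const std::vector<std::string>& pool) {
        std::uniform_int_distribution<size_t> d(0, pool.size() - 1);
        return pool[d(rng)];
    };
    return {pick(bodies), pick(outfits), pick(voices)};
}
```

Drawing each attribute independently multiplies the variety: even small pools yield dozens of distinct NPCs, so repeat playthroughs feel fresh.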

Dialogue System

Data Table–driven Blueprint logic linking each line to a voice clip, animation cue, and event trigger.
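Conceptually, each Data Table row bundles a line of text with the assets and events that fire alongside it. A simplified standalone model of that row-lookup pattern (field names and the `ts_01` id below are hypothetical placeholders, not the project's actual table schema):

```cpp
#include <cassert>
#include <map>
#include <string>

// One row of the dialogue table: a line keyed by id, bundled with
// the assets and events that fire alongside it.
struct DialogueRow {
    std::string text;
    std::string voiceClip;  // audio asset to play
    std::string animCue;    // facial/body animation to trigger
    std::string eventTag;   // gameplay event, e.g. "raise_tension"
};

using DialogueTable = std::map<std::string, DialogueRow>;

// Looks up a row by id; returns nullptr for unknown ids so callers
// can fail gracefully instead of crashing mid-scenario.
const DialogueRow* FindRow(const DialogueTable& table, const std::string& id) {
    auto it = table.find(id);
    return it == table.end() ? nullptr : &it->second;
}
```

Keeping voice, animation, and events in one row meant swapping the AI placeholder audio for professional recordings was a data change, not a logic change.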

Spatial Audio & Haptics

  • Implemented spatial audio in UE using attenuation volumes.
  • Scenario-specific soundtracks, dialogue pinned to NPC positions, and in-headset subtitles.
  • SFX cues, e.g., a rumble during NPC yelling, a buzzing phone call, a knock vibration.
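An attenuation volume essentially scales a sound's loudness by the listener's distance. A simplified model of linear falloff, roughly what UE computes between a sound's inner and outer radius (the function and radii here are an illustrative sketch, not engine code):

```cpp
#include <cassert>
#include <cmath>

// Simplified linear falloff: full volume inside the inner radius,
// silence beyond the outer radius, and a linear blend in between --
// roughly what a UE attenuation volume does with linear falloff.
float AttenuateLinear(float distance, float innerRadius, float outerRadius) {
    if (distance <= innerRadius) return 1.0f;  // up close: full volume
    if (distance >= outerRadius) return 0.0f;  // out of range: silent
    return 1.0f - (distance - innerRadius) / (outerRadius - innerRadius);
}
```

Tuning those two radii per sound is what lets a raised voice dominate a room while a buzzing phone stays localized to one corner of the scene.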

Lighting & Sequencer

Combined Lumen reflections with baked lighting for performance. Sequencer timelines handled cutscenes, ambient loops, and branching outcomes.

Built Sequencer timelines for cutscenes and ambient NPC animation loops. · view img↗
Ray-casting based Visual Cues and Alerts · view img↗

Production Systems

We used Notion for sprints, Figma for UX and visual design, and Sheets for asset tracking. Builds and retired assets were organized in Dropbox, while all client deliverables were managed in Google Drive.

Milestones were mapped across 2024–2025, coordinating with developers, testers, and voice actors. I worked closely with the PM on scope changes and client feedback, and ensured sensitive content aligned with cultural and ethical standards.

Regular syncs with the university and PD stakeholders kept feedback loops fast and transparent.

Outcome

By mid-2025, NextGen Badge evolved into a functioning VR prototype used by the University of Minnesota's criminal justice program for early testing. Trainees reported that the simulation felt "real enough to trigger instinct," highlighting how tone and timing directly affected NPC behavior. Faculty noted the platform's potential as both a recruitment and empathy-training tool.

Gameplay Screenshot · view img↗
Gameplay Screenshot · view img↗

Performance optimization on Quest 3 brought stable frame rates and reduced motion fatigue, allowing longer, more natural sessions.

Each scenario now runs with full voice acting, randomized NPCs, and tracked decision outcomes — setting the stage for scalable deployment to other departments.

Early prototyping: UI Interaction, MetaHuman Character Selection & Level Portal