ASCEND

Overview

Ascend started as a VR traversal prototype and grew into a full stealth-action traversal game for my Master's thesis at Hochschule Darmstadt. The central challenge was building a game that ran on both VR and PC from a single codebase, where the two platforms have fundamentally different interaction models, UI conventions, and locomotion requirements.

By the end it had a complete blockout level, a multi-state stealth AI, a wide variety of traversal mechanics, narrative dialogue, and platform-specific UI, all sharing the same underlying systems. The research behind the project looked at how different platforms affect player presence and experience in stealth-traversal games.

Project Details

Platform: PCVR (Meta Quest), PC

Engine: Unreal Engine

Tech: Blueprint Visual Scripting

Role: Solo Designer and Developer

Context: Master's Thesis, Hochschule Darmstadt

Status: Prototype available on itch.io, source code to be released open-source upon thesis completion

Narrative and Setting

You play as North, a trained operative navigating dystopian rooftops in a city where an AI built to support urban populations has turned hostile. Five mission objectives are spread across the level: injecting malware, retrieving a prototype, disabling communication towers, shutting down robot manufacturing, and hacking the main system.

The rooftop setting was a deliberate choice. Height creates tension naturally and raises the stakes of traversal. Getting caught by a robot forces a reload from the nearest checkpoint, so the elevated environment and the stealth system push against each other in a way that keeps the player alert throughout. Voice-overs and sound effects were produced using ElevenLabs, directed to match the narrative tone at each objective transition.

Traversal Mechanics

The problem was providing a wide enough variety of traversal types to make the research comparison meaningful, while keeping each mechanic implementable across both platforms without one version feeling worse. Walking, climbing ladders and ropes, crossing balance beams, riding elevators, riding ziplines, and traversing ramps between buildings at varying heights all made it into the final build.

Each mechanic works differently across platforms. Ladder climbing in VR means physically moving your arms. On PC you press a button and use directional keys to move. Ziplines carry you automatically once attached. Elevators require pulling a lever and pressing a button in sequence. The traversal system was built modularly to support iterative design throughout development.

Stealth AI

The design problem for the AI was creating tension without making failure feel unfair. The solution was a gradual detection model with a visible progress bar, giving the player time to react before things escalate.

At half fill the robot switches to investigation mode and moves to where it last saw you. At full fill it enters full alert, chasing and shooting the player, which forces a reload from the nearest checkpoint. Crouching reduces both your visual profile and the noise you generate. An EMP device temporarily deactivates nearby robots when things get too close.
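The detection model above can be sketched as a simple two-threshold meter. This is a hypothetical plain-C++ mirror of the Blueprint logic, not the shipped implementation; the fill and decay rates and the crouch modifier are assumptions.

```cpp
#include <algorithm>

// Hypothetical mirror of the Blueprint detection logic: a 0..1 meter with
// two thresholds. Rates below are assumed values, not the game's tuning.
enum class RobotState { Patrol, Investigate, Alert };

class DetectionMeter {
public:
    // dt: seconds elapsed; playerVisible: line-of-sight check result;
    // crouching halves the fill rate (assumed modifier).
    void Update(float dt, bool playerVisible, bool crouching) {
        const float fillRate  = crouching ? 0.15f : 0.30f; // per second
        const float decayRate = 0.20f;                     // per second
        fill += (playerVisible ? fillRate : -decayRate) * dt;
        fill = std::clamp(fill, 0.0f, 1.0f);
    }
    RobotState State() const {
        if (fill >= 1.0f) return RobotState::Alert;       // chase and shoot
        if (fill >= 0.5f) return RobotState::Investigate; // move to last seen spot
        return RobotState::Patrol;
    }
    float Fill() const { return fill; } // drives the visible progress bar
private:
    float fill = 0.0f;
};
```

Keeping the thresholds and rates as plain member values is what makes the difficulty tuning described below cheap: a single edit changes the feel for both platforms at once.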

The variables that control robot difficulty are exposed for quick editing, which made tuning fast during development. To keep the comparison between the two platforms fair, the robots were tuned toward the easier side so they would not frustrate the player, especially in VR. The trade-off is that the PC version can feel too easy, since it keeps the same difficulty settings as the VR version for the sake of a fair comparison.

Permanent robot deactivation is possible by approaching from behind and completing a deactivation sequence. In VR this means physically positioning your hand at the right distance. On PC it uses character collider proximity and a button press.
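The PC-side "approach from behind" check can be expressed as a range test plus a rear-cone test using a dot product. This is an illustrative sketch, not the game's code; the range and cone threshold are assumed values.

```cpp
#include <cmath>

// Hypothetical sketch of the PC deactivation check: the player must be
// within range and inside a cone behind the robot. Thresholds are assumptions.
struct Vec2 { float x, y; };

bool CanDeactivate(Vec2 robotPos, Vec2 robotForward, Vec2 playerPos,
                   float maxRange = 1.5f, float rearConeCos = 0.5f) {
    const float dx = playerPos.x - robotPos.x;
    const float dy = playerPos.y - robotPos.y;
    const float dist = std::sqrt(dx * dx + dy * dy);
    if (dist == 0.0f || dist > maxRange) return false;
    // Dot of the robot's forward vector with the normalized robot->player
    // direction: a strongly negative value means the player stands behind.
    const float dot = (robotForward.x * dx + robotForward.y * dy) / dist;
    return dot < -rearConeCos; // inside the rear cone
}
```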

The sonar scanning system adds a secondary awareness layer. Activating it highlights enemies in red and objectives in green for about five seconds before resetting. The temporary nature was a deliberate design decision. Permanent threat indicators remove the need for players to actively manage their awareness. The sonar nudges them to build and maintain a mental model of enemy positions instead.
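The timed highlight reduces to a small countdown: activating the scan arms it, and the highlight clears once the timer runs out. A minimal sketch, assuming a five-second duration (the write-up says "about five seconds"):

```cpp
// Hypothetical sonar timer: Activate() arms the highlight (enemies red,
// objectives green in the game), and it expires after the scan duration.
class SonarScan {
public:
    void Activate() { remaining = kDuration; }
    void Tick(float dt) {
        if (remaining > 0.0f) remaining -= dt;
    }
    bool HighlightActive() const { return remaining > 0.0f; }
private:
    static constexpr float kDuration = 5.0f; // seconds (assumed)
    float remaining = 0.0f;
};
```

The expiry is the whole point of the design: because `HighlightActive()` flips back to false, the player has to re-scan deliberately instead of leaning on a permanent overlay.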

Platform-Specific Implementation

The core problem was that VR and PC have completely different interaction conventions, but the research required both versions to offer the same experience as closely as possible. The solution was a single codebase with platform-specific branches only where interaction logic, input handling, or UI display genuinely differed.

The VR version uses an HMD with head-tracking and motion controllers. Movement is thumbstick-based with snap-rotation to reduce cybersickness. All interactions are physical. The UI is diegetic, a panel attached to your left hand that can be toggled whenever you need it.

The PC version supports keyboard/mouse and gamepad. Interactions trigger on proximity and a button press. The UI is a standard HUD following established first-person conventions.

Technical Notes

Tick costs were profiled carefully, and logic that did not need per-frame updates was moved onto timer handles to keep the frame budget as lean as possible. The codebase was written with open-source release in mind from the start, extensively commented and structured so future contributors can pick it up without needing to ask how anything works.

Switching between platforms requires only simple configuration changes rather than edits to the codebase. The Game Instance is one of the few Unreal Engine classes that persists across level transitions, so a platform flag stored there survives every level change and can be referenced anywhere platform-specific behavior is needed. A full platform switch takes only three settings changes in the project, which made iteration and bug fixing much faster.
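The pattern can be sketched in plain C++ as a single persistent flag read wherever behavior diverges. This is a hypothetical stand-in for the Game Instance, not the actual Blueprint setup; the names `GameConfig` and `TargetPlatform` are illustrative.

```cpp
// Hypothetical mirror of the Game Instance platform flag: set once at
// startup, read wherever platform-specific branches are needed.
enum class TargetPlatform { PC, VR };

class GameConfig {
public:
    // Stands in for the Game Instance, which persists across level loads.
    static GameConfig& Get() {
        static GameConfig instance;
        return instance;
    }
    void SetPlatform(TargetPlatform p) { platform = p; }
    TargetPlatform Platform() const { return platform; }
    bool IsVR() const { return platform == TargetPlatform::VR; }
private:
    TargetPlatform platform = TargetPlatform::PC;
};
```

Interaction, input, and UI code would then branch on `GameConfig::Get().IsVR()` instead of duplicating whole systems per platform.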

Visual Design and Navigation

A consistent color language solved the problem of communicating interactivity without text prompts cluttering both a VR and a flat screen simultaneously. Interactive elements are yellow throughout. Enemies appear red and objectives appear green during sonar scans. The first building block serves as a tutorial space with control instructions written on the walls, keeping the teaching method consistent and fair across both platforms.

What I Learned

Building the same game for two fundamentally different interaction models taught me more about input design and player presence than any single-platform project could have. The moments where the platforms diverged most (the ladder and rope climbs, the lever interaction) were where the design decisions became most interesting.

Gameplay Video (PC)

Gameplay Video (VR)