XRIDDLE
Overview
XRiddle started with a specific technical question: can AR Foundation and Vuforia coexist in a single Unity application without falling apart? Both frameworks are widely used in AR development, but combining them in one app is genuinely tricky because they compete for the same hardware.
The project grew out of The Labyrinth of Echoes, a team project at Hochschule Darmstadt inspired by Pink Floyd's work: an immersive experience concept in which visitors play archaeologists exploring a mysterious labyrinth, solving riddles, and uncovering the truth about a machine threatening the world. I served as the Creative Technologist on that team and was responsible for the prototype design. XRiddle was extracted from the broader concept as a focused portfolio piece, demonstrating the technical architecture and AR interaction design on their own terms.
The prototype was tested on Android. Both frameworks support iOS by design and the architecture is ready for it, but an Apple developer license was not available at the time to verify.
Project Details
Platform: Android (iOS architecture-ready)
Engine: Unity
Tech: C#, AR Foundation, Vuforia
Role: Creative Technologist, Solo Developer
Source Code: Available on GitHub
Status: Prototype available on itch.io
The Problem
AR Foundation and Vuforia do different things well. AR Foundation handles plane detection and world anchoring. Vuforia handles image recognition and target-based tracking. There are real scenarios where you want both in the same experience.
The problem is they conflict at the hardware level. Both try to own the camera, the gyroscope, and the XR subsystem initialization. Running them naively together causes instability.
The Solution
The key insight was that the ARSession must remain active at all times. Disabling it causes a black screen because both AR Foundation and Vuforia share the underlying camera pipeline. Toggling the session itself is not the right level of control.
Instead, SwitchARMode toggles VuforiaBehaviour and ARPlaneManager independently while leaving ARSession running. Switching to image tracking mode enables VuforiaBehaviour and the ObserverBehaviour components on each image target, then disables ARPlaneManager. Switching to placement mode does the reverse. The two frameworks never actively conflict because only one set of components is processing camera input at a time, while the shared session underneath stays stable throughout.
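A minimal sketch of that switching logic, based on the description above. The `ARMode` enum, field names, and component wiring are illustrative assumptions; `VuforiaBehaviour`, `ObserverBehaviour`, and `ARPlaneManager` are the real framework components.

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using Vuforia;

public enum ARMode { ImageTracking, Placement }

public class ARModeSwitcher : MonoBehaviour
{
    [SerializeField] ARPlaneManager planeManager;        // AR Foundation plane detection
    [SerializeField] VuforiaBehaviour vuforiaBehaviour;  // Vuforia runtime
    [SerializeField] ObserverBehaviour[] imageTargets;   // one observer per image target

    public void SwitchARMode(ARMode mode)
    {
        bool imageTracking = mode == ARMode.ImageTracking;

        // The ARSession itself is never touched: both frameworks share the
        // camera pipeline underneath, and disabling the session causes a
        // black screen. Only the per-framework components are toggled.
        vuforiaBehaviour.enabled = imageTracking;
        foreach (var target in imageTargets)
            target.enabled = imageTracking;

        planeManager.enabled = !imageTracking;
    }
}
```

Because only one set of components consumes camera input at a time, the shared session stays stable through any number of mode switches.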
UI elements are toggled using CanvasGroup alpha, interactable, and blocksRaycasts rather than SetActive. This avoids layout recalculation overhead and keeps the visual state consistent without destroying and rebuilding the canvas hierarchy.
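The CanvasGroup toggle described above can be sketched as a small helper; the method name is illustrative, the three properties are Unity's actual `CanvasGroup` API.

```csharp
using UnityEngine;

public static class CanvasVisibility
{
    // Hide or show a canvas without SetActive, so no layout rebuild occurs
    // and the canvas hierarchy is never destroyed and recreated.
    public static void SetVisible(CanvasGroup group, bool visible)
    {
        group.alpha = visible ? 1f : 0f;   // visual visibility
        group.interactable = visible;      // disable input when hidden
        group.blocksRaycasts = visible;    // let touches pass through when hidden
    }
}
```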
Canvas Reparenting
One non-obvious problem with Vuforia image tracking is that Vuforia's DefaultObserverEventHandler calls SetActive on child GameObjects when a target is lost. If a UI canvas is a child of the image target, it gets deactivated when tracking is lost, even if you set CanvasGroup alpha to 0 instead.
This was solved by reparenting the canvas to a persistent transform in the scene when a target is first tracked. The canvas still appears attached to the target visually, but it is no longer a child of the target hierarchy. When tracking is lost, Vuforia deactivates the image target's children as usual, but the canvas is no longer one of them.
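A sketch of that reparenting workaround, assuming a wiring like the following: the `persistentRoot` field and the one-time `detached` flag are assumptions, while `ObserverBehaviour.OnTargetStatusChanged` is Vuforia's actual tracking callback.

```csharp
using UnityEngine;
using Vuforia;

public class CanvasDetacher : MonoBehaviour
{
    [SerializeField] ObserverBehaviour imageTarget; // Vuforia image target
    [SerializeField] Transform canvasTransform;     // canvas starting as a child of the target
    [SerializeField] Transform persistentRoot;      // scene transform that survives tracking loss

    bool detached;

    void OnEnable()  => imageTarget.OnTargetStatusChanged += OnStatusChanged;
    void OnDisable() => imageTarget.OnTargetStatusChanged -= OnStatusChanged;

    void OnStatusChanged(ObserverBehaviour behaviour, TargetStatus status)
    {
        // On the first successful track, move the canvas out of the target
        // hierarchy so Vuforia's lost-tracking SetActive(false) can no
        // longer reach it. worldPositionStays keeps it visually in place.
        if (!detached && status.Status == Status.TRACKED)
        {
            canvasTransform.SetParent(persistentRoot, worldPositionStays: true);
            detached = true;
        }
    }
}
```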
What the Experience Does
In image tracking mode, pointing the camera at a target image triggers a puzzle or interactive element anchored to it. Multiple target images are supported, each with its own content.
In placement mode, scanning a real-world surface positions the 3D model at the detected plane location. A scale slider adjusts the XR Origin's inverse scale, making the placed model appear larger or smaller relative to the environment. Mode switching happens within the same session, which is the point.
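The inverse-scale trick can be sketched like this; the class and field names are illustrative, but the underlying idea is standard in AR Foundation: scaling the XR Origin up makes placed content appear smaller, so the origin receives the inverse of the desired content scale.

```csharp
using UnityEngine;

public class XROriginScaler : MonoBehaviour
{
    [SerializeField] Transform xrOrigin; // root transform of the XR Origin

    // Hooked to a UI Slider's OnValueChanged; contentScale must be > 0.
    public void SetContentScale(float contentScale)
    {
        // Inverse scaling: a larger desired content scale means a smaller
        // origin scale, so the model appears bigger relative to the world.
        xrOrigin.localScale = Vector3.one * (1f / contentScale);
    }
}
```

Scaling the origin rather than the model keeps physics, anchors, and plane geometry consistent, since everything is scaled uniformly through the session's root.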
What I Learned
The biggest lesson was to define system boundaries before writing a single line of code. The conflict between AR Foundation and Vuforia is not obvious until you try to run them together, at which point you have already built things the wrong way. Thinking through which system owns which resource, and when, before touching the editor would have saved significant debugging time. The specific lesson about keeping ARSession active was the kind of thing that only becomes clear after hitting the black screen for the first time. Mapping ownership and boundaries upfront is now a habit that precedes any implementation decision on projects involving multiple systems or frameworks.