
Exploring the mixed reality continuum

Have you ever found yourself wondering where augmented reality ends and virtual reality begins? The answer lies in understanding the mixed reality continuum – a concept that’s transforming how game developers create immersive experiences by blending physical and digital worlds.

What is the mixed reality continuum?

The mixed reality (MR) continuum represents the spectrum of experiences that bridge our physical reality and completely virtual environments. First proposed by Paul Milgram and Fumio Kishino in 1994 as the “Reality-Virtuality Continuum,” this theoretical framework positions different immersive technologies along a spectrum based on how much they blend or replace our physical world.

At one end of the spectrum is our physical reality, and at the other end is complete virtual reality. Between these endpoints lie various mixed reality experiences that combine elements of both worlds:

| Position | Technology | Interaction Type | Environment |
| --- | --- | --- | --- |
| Reality-dominant | AR (Augmented Reality) | Digital overlays on reality | Real world with digital additions |
| Middle ground | MR (Mixed Reality) | Bidirectional interaction | Merged physical-digital spaces |
| Virtuality-dominant | VR (Virtual Reality) | Full immersion in digital | Completely virtual environment |

What makes mixed reality unique is its bidirectional interaction capability – virtual objects can influence and be influenced by real-world elements in real-time. Imagine a virtual ball in an MR game that doesn’t just appear to bounce off your coffee table but actually responds to its exact dimensions and position, or a holographic character that intelligently navigates around your furniture, peeking from behind a bookshelf before retreating when you approach.
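
To make that bidirectional behavior concrete, here is a minimal TypeScript sketch of the bouncing-ball idea. It assumes the MR runtime has already handed us scanned room surfaces as planes; the types are hypothetical rather than any particular engine's API.

```typescript
type Vec3 = [number, number, number];

interface ScannedPlane { point: Vec3; normal: Vec3; }               // a surface from the room scan
interface Ball { position: Vec3; velocity: Vec3; radius: number; }  // the virtual object

const dot = (a: Vec3, b: Vec3): number => a[0] * b[0] + a[1] * b[1] + a[2] * b[2];

// Step the ball under gravity and bounce it off any scanned surface it hits,
// so the virtual object responds to the real table's actual position and size.
function stepBall(ball: Ball, planes: ScannedPlane[], dt: number): void {
  ball.velocity[1] -= 9.81 * dt;
  for (let i = 0; i < 3; i++) ball.position[i] += ball.velocity[i] * dt;

  for (const plane of planes) {
    const toBall: Vec3 = [
      ball.position[0] - plane.point[0],
      ball.position[1] - plane.point[1],
      ball.position[2] - plane.point[2],
    ];
    const dist = dot(toBall, plane.normal);          // signed distance to the surface
    const vDotN = dot(ball.velocity, plane.normal);  // approach speed along the normal
    if (dist < ball.radius && vDotN < 0) {
      for (let i = 0; i < 3; i++) {
        ball.position[i] += (ball.radius - dist) * plane.normal[i]; // push out of the surface
        ball.velocity[i] -= 1.8 * vDotN * plane.normal[i];          // reflect with ~0.8 restitution
      }
    }
  }
}
```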

The theoretical foundation of mixed reality

While VR, AR, MR, and XR are often discussed as separate technologies, the mixed reality continuum emphasizes their interconnected nature. This theoretical foundation is crucial for game developers to understand because it informs how different technologies can be leveraged to create specific types of experiences.

The continuum concept helps developers think beyond rigid categories and instead consider:

  • Level of immersion: How much of the user’s perception is replaced by virtual elements
  • Environmental awareness: How the technology incorporates or replaces the physical world
  • Interaction models: How users engage with digital and physical elements simultaneously

This theoretical understanding enables developers to make informed decisions about which technologies best serve their creative vision, rather than forcing concepts to fit within a specific technology’s limitations. For instance, a puzzle game might be more effective as an AR experience if the concept relies on players’ awareness of their surroundings, while a deep narrative adventure might benefit from VR’s complete immersion.
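
One way to keep these axes front of mind is to treat them as data. The hypothetical TypeScript descriptor below (not any platform's API) places an experience on the continuum using the three dimensions above; the numeric values are purely illustrative.

```typescript
// A hypothetical descriptor for where an experience sits on the continuum,
// built from the three design axes discussed above.
type InteractionModel = 'digital-overlay' | 'bidirectional' | 'fully-virtual';

interface ContinuumPosition {
  immersionLevel: number;          // 0 = unmediated reality, 1 = fully virtual
  environmentalAwareness: boolean; // does the experience sense the physical space?
  interactionModel: InteractionModel;
}

// Rough positions for the three broad categories, with illustrative values.
const ar: ContinuumPosition = { immersionLevel: 0.2, environmentalAwareness: true, interactionModel: 'digital-overlay' };
const mr: ContinuumPosition = { immersionLevel: 0.5, environmentalAwareness: true, interactionModel: 'bidirectional' };
const vr: ContinuumPosition = { immersionLevel: 1.0, environmentalAwareness: false, interactionModel: 'fully-virtual' };
```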

The 7 classes of mixed reality displays

Within the mixed reality continuum, researchers have identified seven distinct classes of displays that enable these experiences:

  1. Monitor-based (non-immersive): Desktop screens showing virtual objects in real environments, like viewing architectural models placed in photographic contexts.
  2. Projection-based: Digital content projected onto physical surfaces, such as the Microsoft RoomAlive concept that transforms entire rooms into interactive game spaces.
  3. Video see-through: Cameras capture reality and display it with digital overlays, as seen in most smartphone AR applications and VR headsets with passthrough features.
  4. Optical see-through: Transparent displays that overlay digital content on direct view of reality, exemplified by devices like Microsoft HoloLens or Magic Leap.
  5. Hybrid: Combinations of different display technologies, such as systems that use both projection and head-mounted displays for collaborative work.
  6. Immersive: Fully surrounding displays that block out reality, like traditional VR headsets that create complete virtual environments.
  7. Cave Automatic Virtual Environment (CAVE): Room-sized immersive projection environments where multiple users can experience the same virtual content simultaneously.

Each class offers different advantages for game development, from the accessibility of monitor-based systems to the full immersion of CAVE environments. Technical artists must consider these display classes when designing assets and interactions that will function effectively across the continuum.

Applications in game development

The mixed reality continuum opens up exciting possibilities for game developers looking to create innovative experiences:

Environmental integration

MR enables games to incorporate the player’s physical environment as a gameplay element. For example:

  • A strategy game where virtual units navigate around real furniture, using couches as mountains and coffee tables as plateaus
  • Horror games where creatures appear to emerge from the player’s actual walls, crawling along real surfaces before attacking
  • Puzzle games that use the dimensions and features of the player’s space, perhaps requiring players to physically move objects to solve challenges

This environmental awareness creates personalized experiences unique to each player’s physical surroundings. Microsoft’s Minecraft Earth demonstrated this concept by allowing players to build virtual structures anchored to real locations, visible to anyone with the app visiting the same space.
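
To illustrate the strategy-game example above, here is a hypothetical TypeScript sketch that maps classified room surfaces to gameplay terrain. It assumes the framework can label surfaces, which real toolkits expose in different ways.

```typescript
// Hypothetical mapping from classified room surfaces to gameplay terrain.
type Terrain = 'mountain' | 'plateau' | 'open-ground' | 'impassable';

interface ScannedSurface {
  label: 'couch' | 'table' | 'floor' | 'wall'; // assumes surface classification is available
  heightMeters: number;
}

function terrainFor(surface: ScannedSurface): Terrain {
  switch (surface.label) {
    case 'couch': return 'mountain';
    case 'table': return surface.heightMeters > 0.6 ? 'plateau' : 'open-ground';
    case 'floor': return 'open-ground';
    case 'wall':  return 'impassable';
  }
}
```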

Collaborative experiences

Mixed reality shines in multiplayer scenarios where players can:

  • Gather around a virtual game board placed on a real table, seeing the same game pieces from their respective positions
  • See and interact with other players’ avatars in their shared physical space, allowing for natural social dynamics
  • Collaborate on building virtual structures in real-world locations, combining their creative efforts in persistent spatial environments

These social experiences blend digital gameplay with physical presence, creating new forms of multiplayer interaction not possible in traditional gaming. The digital tabletop game Demeo exemplifies this approach, letting multiple players gather around a shared virtual board while maintaining awareness of each other’s physical presence.
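
A minimal sketch of the shared-board scenario, assuming a spatial anchor id distributed over your own networking layer and a stand-in resolver function (rotation is omitted for brevity):

```typescript
type Vec3 = [number, number, number];

interface BoardPlacement {
  anchorId: string; // id of a spatial anchor shared through your networking layer
  offset: Vec3;     // board position expressed relative to that anchor
}

// Each client resolves the shared anchor against its own spatial map, then
// positions the board relative to it, so every player sees the same pieces
// from their own physical viewpoint.
function placeBoard(
  msg: BoardPlacement,
  resolveAnchor: (id: string) => Vec3 | null
): Vec3 | null {
  const anchor = resolveAnchor(msg.anchorId);
  if (anchor === null) return null; // this device hasn't localized the anchor yet
  return [anchor[0] + msg.offset[0], anchor[1] + msg.offset[1], anchor[2] + msg.offset[2]];
}
```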

Prototyping and visualization

For game developers and technical artists, the mixed reality continuum offers powerful tools for the development process itself:

  • Visualizing 3D assets in real-world scale before finalizing designs, ensuring proportions feel natural when experienced in the game
  • Testing spatial relationships between game elements in physical space, spotting issues with level layouts or object placement
  • Collaborating on level design by placing virtual elements in shared spaces, allowing team members to walk through environments together

Using tools that let them design and prototype directly in VR, developers can rapidly iterate on concepts while maintaining a sense of how they’ll appear in the final experience. This approach is particularly valuable for indie teams working remotely, as it allows for spatial collaboration regardless of physical location.

Mixed reality vs. AR vs. VR: Key differences

While these technologies exist on a continuum, understanding their distinct characteristics helps developers choose the right approach for their projects:

| Aspect | AR | MR | VR |
| --- | --- | --- | --- |
| Environment | Real world + digital overlays | Merged physical-digital spaces | Fully virtual environment |
| Interaction | One-way (digital → real) | Bidirectional (digital ↔ real) | Immersive, isolated digital |
| Hardware | Smartphones, AR glasses | HoloLens, Magic Leap | Meta Quest, HTC Vive |
| Use cases | Pokémon Go, utility apps | Collaborative design, spatial gaming | Beat Saber, Half-Life: Alyx |

The key differentiator of mixed reality is how digital objects can be anchored to and interact with the physical world. Unlike AR, which simply overlays information, MR creates a hybrid environment where virtual objects behave as if they have physical presence – they can be occluded by real objects, cast shadows, and respond to real-world physics.

Consider an AR game like Pokémon Go, where creatures appear in the world but don’t interact with it meaningfully. In contrast, an MR version might show Pokémon hiding behind trees, running around obstacles, or responding to changes in the physical environment – creating a much more convincing illusion of these creatures existing in our world.
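
One practical ingredient of that illusion is an occlusion test against the scanned scene. The TypeScript sketch below assumes a `raycastRealWorldMesh` callback as a stand-in for whatever spatial query your MR framework actually provides.

```typescript
type Vec3 = [number, number, number];

// Returns true when scanned real-world geometry sits between the camera and
// the creature, in which case the game can hide the creature or have it "peek".
function isOccluded(
  camera: Vec3,
  creature: Vec3,
  raycastRealWorldMesh: (origin: Vec3, direction: Vec3, maxDistance: number) => number | null
): boolean {
  const dir: Vec3 = [creature[0] - camera[0], creature[1] - camera[1], creature[2] - camera[2]];
  const dist = Math.hypot(dir[0], dir[1], dir[2]);
  if (dist === 0) return false;
  const unit: Vec3 = [dir[0] / dist, dir[1] / dist, dir[2] / dist];
  const hitDistance = raycastRealWorldMesh(camera, unit, dist);
  return hitDistance !== null && hitDistance < dist; // something real is in front of the creature
}
```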

Technical considerations for developers

When developing for the mixed reality continuum, technical artists and developers face unique challenges:

Spatial mapping and environmental understanding

MR applications need to understand the physical environment to enable meaningful interactions. This requires:

  • Real-time scanning and mapping of physical spaces using technologies like SLAM (Simultaneous Localization and Mapping)
  • Object recognition to identify surfaces and items, distinguishing between floors, walls, furniture, and other elements
  • Physics simulations that account for both virtual and physical elements, ensuring believable interactions between worlds

These technical requirements demand specialized development approaches that differ from traditional game development. Frameworks like ARKit, ARCore, and Microsoft’s Mixed Reality Toolkit provide tools for environmental understanding, but developers must build systems that gracefully handle the unpredictability of real-world spaces.
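
Handling that unpredictability usually means assuming the scan may come back incomplete. Below is a hedged TypeScript sketch of a fallback loop; `planesAfterScan` stands in for the plane updates an ARKit/ARCore/MRTK-style framework would deliver, and the thresholds are illustrative.

```typescript
interface DetectedPlane {
  orientation: 'horizontal' | 'vertical';
  areaM2: number;
  heightAboveFloorM: number;
}

// Wait for a usable horizontal surface; if the room never yields one,
// fall back to a floating play area anchored in front of the user.
async function choosePlayArea(
  planesAfterScan: () => Promise<DetectedPlane[]>,
  minAreaM2 = 0.5,
  timeoutMs = 10_000
): Promise<'table-top' | 'floor' | 'floating-fallback'> {
  const deadline = Date.now() + timeoutMs;
  while (Date.now() < deadline) {
    const planes = await planesAfterScan();
    const usable = planes.filter(p => p.orientation === 'horizontal' && p.areaM2 >= minAreaM2);
    if (usable.length > 0) {
      // Prefer a raised surface (likely a table) over the floor.
      return usable.some(p => p.heightAboveFloorM > 0.5) ? 'table-top' : 'floor';
    }
    await new Promise(resolve => setTimeout(resolve, 500)); // let the scan keep refining
  }
  return 'floating-fallback';
}
```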

Asset optimization

Creating assets for mixed reality requires careful optimization:

  • Models must be lightweight enough for real-time rendering on mobile processors, often requiring significant polygon reduction compared to PC or console games
  • Textures need to look convincing when placed alongside real-world objects, with attention to lighting consistency and material properties
  • Lighting must adapt to real-world conditions for believable integration, potentially using environment mapping to match ambient illumination

Tools like Alpha3D’s 3D modelling studio can help developers quickly create optimized assets for mixed reality experiences without extensive manual modeling. This approach is particularly valuable for indie developers who may not have dedicated technical artists on their team.
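
One lightweight way to enforce these constraints is a per-device budget check in the asset pipeline. The numbers in this sketch are illustrative assumptions, not recommendations from any vendor.

```typescript
// Illustrative per-device asset budgets for a mixed reality pipeline.
type DeviceTier = 'phone-ar' | 'standalone-headset' | 'pc-vr';

interface AssetBudget { maxTriangles: number; maxTextureSize: number; }

const budgets: Record<DeviceTier, AssetBudget> = {
  'phone-ar':           { maxTriangles: 20_000,  maxTextureSize: 1024 },
  'standalone-headset': { maxTriangles: 50_000,  maxTextureSize: 2048 },
  'pc-vr':              { maxTriangles: 200_000, maxTextureSize: 4096 },
};

// Flag assets that exceed the budget for the target device tier.
function needsOptimization(triangles: number, textureSize: number, tier: DeviceTier): boolean {
  const budget = budgets[tier];
  return triangles > budget.maxTriangles || textureSize > budget.maxTextureSize;
}
```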

Cross-platform considerations

The mixed reality continuum spans multiple devices and platforms, requiring developers to consider:

  • How experiences translate between AR-capable phones and dedicated MR headsets, potentially scaling features based on available hardware
  • Content that can adapt to different display types and interaction models, from touch controls to spatial gestures
  • Asset pipelines that support multiple target platforms, balancing quality and performance across various devices

This often means creating flexible systems that can function across the continuum rather than building for a single device. Unity and Unreal Engine provide cross-platform support for XR development, allowing developers to target multiple points on the continuum with shared codebases.
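
In practice, that flexibility often starts with capability detection at startup. The capability flags in this sketch are hypothetical; they would be populated from whatever device queries your engine or framework provides.

```typescript
// Hypothetical capability flags; fill them from your engine's device queries.
interface XrCapabilities {
  passthrough: boolean;    // can blend rendering with the camera feed
  planeDetection: boolean; // can detect real-world surfaces
}

type ExperienceMode = 'vr' | 'mr' | 'screen-ar';

// Pick a point on the continuum that the current hardware can actually support.
function chooseMode(caps: XrCapabilities): ExperienceMode {
  if (caps.passthrough && caps.planeDetection) return 'mr'; // merged physical-digital play
  if (caps.planeDetection) return 'screen-ar';              // phone-style AR
  return 'vr';                                              // fall back to a fully virtual scene
}
```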

The future of the mixed reality continuum

As technology evolves, several trends are shaping the future of the mixed reality continuum:

Hardware convergence

Next-generation devices are blurring the lines between AR and VR:

  • Headsets with high-quality passthrough cameras enable both MR and VR experiences in a single device, as seen with the Meta Quest 3
  • Lightweight glasses with advanced projection are improving AR capabilities while approaching the immersion of traditional MR
  • Haptic technologies are enhancing physical feedback in virtual interactions, from controller vibration to full-body haptic suits

This convergence is making the continuum more fluid, allowing developers to create experiences that move seamlessly between different reality states. For example, a game might begin in AR mode, transition to full MR for interactive segments, and include VR sequences for story moments – all within a single session and device.

Spatial computing platforms

Major tech companies are building platforms that support development across the continuum:

  • Microsoft’s Mixed Reality tools and Omniverse AR from NVIDIA provide comprehensive development environments for creating spatial experiences
  • Apple’s Vision Pro introduces spatial computing to mainstream consumers with a focus on natural interaction and high-fidelity visuals
  • Meta’s Reality Labs continues to advance social MR experiences that connect people across physical distances

These platforms are making it easier for developers to create content that spans multiple points on the continuum. By providing integrated toolsets and standardized APIs, they reduce the technical barriers that previously made cross-reality development challenging.

Content conversion and adaptation

As the lines between technologies blur, tools for converting content between formats are becoming essential:

  • Tools that convert 2D video into 3D/VR formats help repurpose existing content for immersive displays, extending the lifespan of conventional media
  • VR to 2D converter tools enable sharing immersive experiences on conventional screens, improving accessibility for users without specialized hardware
  • AI-powered systems automatically adapt content for different points on the continuum, optimizing assets and interactions for various display types

These conversion capabilities allow developers to create once and deploy across multiple reality formats, maximizing the reach of their content while minimizing production costs.

Practical implementation for indie developers

For indie game developers with limited resources, the mixed reality continuum might seem daunting. However, several approaches make it more accessible:

Start with AR and expand

Begin with mobile AR experiences using frameworks like ARKit or ARCore, then gradually incorporate more MR elements as resources allow. This provides a manageable entry point with existing devices before investing in specialized hardware.

For example, a puzzle game could start as a phone-based AR experience where objects appear on surfaces, then evolve to include more environmental interactions as development progresses. This staged approach allows for testing core gameplay concepts before tackling the complexities of full mixed reality.

Leverage cross-platform tools

Engines like Unity and Unreal provide tools for developing across the continuum with a single codebase. This allows indie developers to create adaptable experiences that can function at multiple points on the spectrum.

Unity’s XR Interaction Toolkit and Unreal’s XR framework both support development across various devices, from smartphones to dedicated headsets. By designing with these flexible systems from the start, developers can future-proof their projects and reach users across the continuum.

Focus on core interactions

Rather than trying to create fully-featured MR experiences immediately, focus on perfecting one or two core interactions that leverage the unique capabilities of mixed reality. A well-executed simple concept often outperforms an ambitious but poorly implemented one.

The indie game “Cubism” demonstrates this principle by focusing on a single, polished mechanic – placing 3D puzzle pieces in a virtual space. This focused approach allowed the developer to create a high-quality experience with limited resources, while still taking advantage of spatial computing capabilities.

Conclusion

The mixed reality continuum represents more than just a collection of technologies – it’s a framework for understanding how digital and physical realities can merge to create new forms of interactive experiences. For game developers, technical artists, and indie creators, this continuum offers unprecedented creative possibilities by breaking down the barriers between virtual and physical worlds.

As hardware advances and development tools mature, we’re moving toward a future where experiences aren’t limited to being “AR” or “VR” but exist as fluid points along the mixed reality spectrum – adapting to users’ needs and environments. By understanding this continuum, developers can create more immersive, contextual, and meaningful experiences that blend the best aspects of both physical and digital realities.

The most exciting aspect isn’t just what’s possible today, but how this continuum will continue to evolve, opening new creative frontiers for those ready to explore the space between the real and the virtual.