In a recent conversation about XR, a colleague and I were talking about the various current and upcoming features available for VR and AR experiences. Specifically, we were discussing multi-feature experiences — experiences that want to leverage more than one XR capability at a time — and so my colleague posed a question: “Can there ever be an experience which needs both teleportation and anchors?”
This was an extremely interesting question — even more so than we realized at first. At the time, we understood teleportation to be a VR-only feature. We defined teleportation to be the instantaneous change of the user’s point-of-view (POV) position from one location to another; and since that kind of teleportation is impossible in the real world, it follows that teleportation must also be impossible in any experience which incorporates the real world — in other words, in any AR experience. Anchors, on the other hand, we considered to be an AR-only feature. We defined an anchor to be a point of correspondence between the real world and the virtual world; and since a pure VR experience gives the real world no bearing on the virtual one, it follows that anchors can only make sense in an AR experience. Thus, any experience which allowed teleportation had to be a VR experience, and any experience which used anchors had to be an AR experience; and, according to our understanding at the time, the two paradigms were fundamentally incompatible.
But we were wrong. After further conversation and investigation, we concluded that it actually is possible for an experience to need both teleportation and anchors. In fact, an example of such an experience was first prototyped and demoed more than five years ago. The problem, we realized, was that our definition for teleportation had been flawed: instead of defining teleportation in terms of changing the POV position, we should have defined it in terms of changing the real-to-virtual world correspondence. That simple redefinition completely reframed our understanding of the relationship between teleportation and anchors, allowing us to recognize that a single experience can, in fact, very reasonably require both features.
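To make that redefinition concrete, here is a minimal sketch in plain TypeScript (not the Babylon.js API; all names here are hypothetical) that models teleportation as a change in the real-to-virtual correspondence. Anchors are defined in real-world space, so changing the correspondence moves the whole scene, anchors included, together:

```typescript
// Hypothetical sketch: teleportation as a change of real-to-virtual
// correspondence rather than a change of POV position.

type Vec3 = { x: number; y: number; z: number };

const add = (a: Vec3, b: Vec3): Vec3 => ({ x: a.x + b.x, y: a.y + b.y, z: a.z + b.z });

// The correspondence maps real-world (office) coordinates into
// virtual-world (Mars) coordinates. Rotation and scale are omitted
// for brevity; a real implementation would use a full transform.
class Correspondence {
  constructor(public offset: Vec3) {}

  realToVirtual(realPos: Vec3): Vec3 {
    return add(realPos, this.offset);
  }

  // Teleportation: instantaneously replace the mapping. Nothing in the
  // real world (user positions, anchor positions) changes at all.
  teleport(newOffset: Vec3): void {
    this.offset = newOffset;
  }
}

// An anchor is defined in REAL space; its virtual position is derived.
const deskAnchor: Vec3 = { x: -5, y: 0, z: -5 };

const skiff = new Correspondence({ x: 100, y: 0, z: 200 }); // parked on Mars
const before = skiff.realToVirtual(deskAnchor);             // { x: 95, y: 0, z: 195 }

skiff.teleport({ x: 9000, y: 0, z: -3000 });                // jump to a distant site
const after = skiff.realToVirtual(deskAnchor);              // { x: 8995, y: 0, z: -3005 }

// The anchor never moved in the office, yet its Mars-space location jumped
// with the skiff: teleportation and anchors coexist naturally.
console.log(before, after);
```

Under this definition, a teleport never invalidates an anchor: the anchor’s real-world pose is untouched, and its virtual pose is simply recomputed through the new correspondence.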
This was an important realization for us. Because we had believed those two features were incompatible on a conceptual level, we had never considered what developers would need to do — and what they’d need from Babylon — in order to use those two features together as we thought such a usage could never occur. Happily (thanks in large part to Babylon’s commitment to simplicity), it does not appear that our misconception of incompatibility led to any API or architecture decisions that will constrain us going forward: none of our APIs prevent anchors and teleportation from being used together. However, we still didn’t know how a developer should go about creating such an experience in Babylon; and furthermore, in light of what we’ve realized, we now have an even bigger question to consider: “If teleportation and anchors can coexist, what other seemingly-incompatible XR features might someday need to work together in the same experience?”
To explore this question, we’ve concocted a hypothetical application designed to be the “kitchen sink” of XR experiences. Every XR feature and technique we’ve been able to think of, both currently-available and upcoming, is incorporated into this experience. The goal of this thought experiment is to give perspective on how all these features may interact, possibly conflict, and ultimately coexist with each other. Thus, the hypothetical application we’ve concocted must, as far as we’ve considered it, be able to support…
- Teleportation: instantaneously changing POV position in a virtual experience.
- Continuous virtual-space motion: changing POV position in a virtual experience by “animating” gradual changes over time.
- Real-world motion: changing POV in a virtual experience to match motions made by the user in the real world.
- AR: an experience involving virtual objects imposed over/into the real world.
- VR: an experience fully supplanting the real world with a virtual substitute.
- Fixed anchors: AR abstractions which represent an unmoving location in the real world.
- Movable anchors: AR abstractions which represent a moving location in the real world.
- Plane detection/spatial mapping: techniques for perceiving real world surfaces for use in virtual experiences.
- Local-presence multiplayer: multiple individuals in the same physical and virtual location participating in the same experience.
- Remote-presence multiplayer: multiple individuals in different physical locations but similar virtual locations participating in the same experience.
- Persistence: the property by which an experience maintains state across user sessions, allowing users to “leave” and “rejoin.”
…all in the same experience, all at the same time. The rest of this blog post will be dedicated to describing this experience and starting the conversation about what it might imply for XR. Welcome to the Martian Skiff scenario.
The Martian Skiff Scenario
Inspired by the 2015 prototype experience referenced above, we decided to set our hypothetical all-the-XR-features scenario on Mars. This hypothetical experience is used by a NASA team that wants to study Mars as a group. The team’s physical office in the real world (on Earth) has an open layout. In order to virtually explore Mars, this team decides to transform their shared physical office space into the deck of a virtual “skiff,” which they will be able to “sail” around the surface of Mars in order to explore the planet together.
(Note: this scenario is probably not a realistic representation of an efficient way for a NASA team to study Mars. This is not a problem because the realism/utility of the scenario is not the point. The point of the Martian Skiff scenario is not that anyone should create this experience; it’s that someone could create this experience, and platforms such as Babylon.js need to be able to support that.)
This experience, then, takes place in a specific, large physical space — the open-layout office. Participants within this space use devices with AR capabilities (HoloLens, Oculus Quest-like devices with passthrough AR, handhelds like iPhones with ARKit, etc.) while remote participants can engage with the virtual aspects of the experience using VR devices. The office space itself is large enough (and filled with enough objects) that it is useful to have multiple anchors spanning the space. Participants, both local and remote, can traverse the experience space at will, moving freely amongst virtual spaces described by various anchors and even transporting virtual objects along with them. Furthermore, because the space is functioning as the deck of the virtual skiff, the entire office can travel over the surface of Mars via either continuous traversal or teleportation.
To help this make more sense, let’s consider a narrative example of a sequence of events which could happen in this experience. NASA geologist Alice is at her desk in the real-world southwest corner of the office. She is wearing a HoloLens and is already participating in the experience. The Martian Skiff (upon which the office virtually rides) is currently parked somewhere on Mars, not moving. Looking over the edge of the skiff at the virtual Martian surface rendered beyond, Alice spies a noteworthy rock and, reaching out with hand tracking, picks it up and pulls it back to her real-world desk. She sets the virtual rock down on her desk and begins to study it. As she studies, the skiff begins to move (presumably because someone else was interested in studying something elsewhere), but Alice is unaffected by this; the rock she’s studying is on her desk and thus travels with the skiff. After making a discovery, Alice decides to pass the rock along to her colleague Bob, whose desk is on the northeast side of the office. She also wants to deliver some handwritten notes she made about her discovery. Rather than walk all the way to the other side of the office, Alice walks to a nearby (physical) cart and places both the virtual rock and the handwritten physical notes onto it for delivery to Bob.
This cart is owned by Charlie, who frequently pushes it around the office to help his colleagues deliver things to each other. Charlie does not use an HMD and is instead participating in the experience using his iPhone. Seeing the handwritten notes on the cart and noting the virtual rock he can see through his phone, Charlie begins to physically push the cart through the office toward Bob’s desk. The physical notes and the virtual rock both remain on the cart as it moves through the space. Note that, while all this is happening, the skiff with the office on it is still moving over the virtual surface of Mars. When the cart finally gets close to Bob’s desk, Charlie uses his phone to pick up the rock and his hand to pick up the notes, transferring both to the desk and leaving them there for Bob.
Bob is running late this morning and arrives a few minutes after the rock and notes have been delivered. Donning his own Quest-like HMD with pass-through AR and joining the experience, Bob sees the rock, reads Alice’s notes, and begins his own examination. Excited by Alice’s discovery, Bob thinks of another related region of Mars he wants to study and so requests that Dana, who is piloting the skiff, take them to the site. Because this destination is very far away, Dana (who is working remotely and whose presence in the office is as a virtual avatar articulated from her VR experience at home) decides not to travel there by continuous motion and instead causes the skiff to teleport immediately to the desired location on the surface of Mars.
Usage of Features
This narrative provides examples for all of the major features that the Martian Skiff scenario was conceived to explore. To unpack this, here is the feature list given earlier in this document with mention of how each feature appeared in the narrative described above.
- Teleportation: at the end of the narrative, Dana teleported the entire skiff (with the office on it) from one location on Mars to another.
- Continuous virtual-space motion: throughout most of the narrative, the skiff is continuously moving across the surface of Mars.
- Real-world motion: both Alice and Charlie walk through the physical office space while in the experience, and their perspectives on virtual objects change to match their moving perspectives on the real world.
- AR: Alice, Bob, and Charlie all participate in the experience using varying manifestations of AR on different devices.
- VR: Dana participates in the experience using pure VR. Furthermore, when Alice initially looked out to find the rock and could see only virtual objects (as the office was behind her), she arguably experienced VR as well.
- Fixed anchors: the office space in general is described using multiple anchors so that XR entities can be present throughout it. In particular, there are fixed anchors near both Alice’s and Bob’s desks so that virtual objects can be placed upon them.
- Movable anchors: there is a movable anchor on Charlie’s cart which allows virtual objects placed upon it to travel with it when it moves through the real world.
- Plane detection/spatial mapping: Alice’s and Charlie’s devices used spatial mapping and plane detection, respectively, when setting down the virtual rock on physical surfaces. (Note: this is not the only way “setting down” could have been implemented — the experience developer could instead have manually placed physics-enabled shapes in the correct places relative to the anchors — but, for completeness’s sake, we choose to assume plane detection/spatial mapping were used here.)
- Local-presence multiplayer: Alice, Bob, and Charlie all share the experience from the same physical locality.
- Remote-presence multiplayer: Dana shares the experience from a remote locality.
- Persistence: the experience is consistent and persistent across all users and across time, as demonstrated by how the virtual rock is left for Bob so that, when he arrives later, he finds it where it was placed on his desk.
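As a companion to the movable-anchors bullet above, here is a hedged sketch in plain TypeScript (not a real anchor API; all names here are hypothetical) of how a movable anchor could carry attached virtual objects: each object stores its pose relative to the anchor, so when device tracking reports that the anchor has moved, everything attached moves with it:

```typescript
// Hypothetical sketch: virtual objects attached to a movable anchor
// (like Charlie's cart) travel with it through the real world.

type Vec3 = { x: number; y: number; z: number };

const add = (a: Vec3, b: Vec3): Vec3 => ({ x: a.x + b.x, y: a.y + b.y, z: a.z + b.z });
const sub = (a: Vec3, b: Vec3): Vec3 => ({ x: a.x - b.x, y: a.y - b.y, z: a.z - b.z });

class MovableAnchor {
  // Attached objects are stored as offsets RELATIVE to the anchor.
  private attached = new Map<string, Vec3>();

  constructor(public position: Vec3) {}

  attach(name: string, worldPos: Vec3): void {
    this.attached.set(name, sub(worldPos, this.position));
  }

  // Called when tracking reports the anchor has moved, e.g. the cart
  // has been pushed to a new spot in the office.
  moveTo(newPosition: Vec3): void {
    this.position = newPosition;
  }

  worldPositionOf(name: string): Vec3 {
    const offset = this.attached.get(name);
    if (!offset) throw new Error(`nothing named "${name}" is attached`);
    return add(this.position, offset);
  }
}

const cart = new MovableAnchor({ x: -4, y: 0, z: -4 }); // near Alice's desk
cart.attach("virtualRock", { x: -4, y: 1, z: -4 });     // rock placed on the cart

cart.moveTo({ x: 5, y: 0, z: 5 });                      // Charlie pushes the cart

// The rock's world position followed the cart: { x: 5, y: 1, z: 5 }
console.log(cart.worldPositionOf("virtualRock"));
```

In this framing, a fixed anchor is just the degenerate case whose `moveTo` is never called, which is one hint at how the two anchor flavors might share an implementation.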
In sum, the Martian Skiff scenario illustrates an example where all of the mentioned XR features would be expected to function together, at the same time, within the same experience. As far as I know, no real experience has ever been created which uses all these features together in this way. Perhaps no single experience will ever attempt to use all of them. However, the existence of a hypothetical scenario such as the Martian Skiff suggests that experience developers eventually could want all or any subset of the features at the same time, and so development platforms such as Babylon must do what we can to make sure that such multi-feature scenarios are possible.
So What Does It All Mean?
And what does this imply for the development of Babylon? And why am I talking about it in a blog post?
As for what exactly the Martian Skiff scenario means, the short answer is that we don’t fully know yet. A core purpose of exercises like this — proposing hypothetical “extreme” use cases — is to stress-test our notions about the features, APIs, architecture, etc. to help ensure that we don’t make decisions which will end up artificially constraining developers. An example of such a decision would have been if we had acted on our earlier belief that teleportation and anchors could never exist in the same experience by making it impossible for them to coexist in that way. We didn’t do that, but if we had, we would have needed to walk that back, possibly breaking backwards compatibility, once we realized that our assumptions had been flawed.
By illustrating all known XR features functioning cohesively in a single experience, the Martian Skiff scenario tells us that it is not safe for us to implement two XR features in ways that are explicitly incompatible. It also reveals certain things about how different features relate to each other, thereby hinting at the implementations and/or best practices which will allow those inter-feature relationships to flourish. For example, the “whole office teleports together” feature of the skiff exemplified teleportation in AR and thus helped us codify our understanding of teleportation as real-to-virtual correspondence rather than POV position. Hints and revelations like these are the reason I say that we don’t yet fully know what the Martian Skiff scenario means. We’ll have to keep thinking about it and about other scenarios that we also want to support as we continue to learn more.
Because there’s so much uncertainty, we also can’t give a definite answer about what this implies for the development of Babylon: we don’t know yet. As mentioned above, hypothetical scenarios can give us hints about how features should work together and what APIs/implementations/architectures are likely to work well, but only by actually building and using the XR functionality will we know for sure. Hypotheticals are valuable as a quick way to check our assumptions and assess which way to go, but at the end of the day such things can only take us so far. We need tangible implementations. We need prototypes. We need real use cases. We need the Community.
And that leads to the one concrete answer I can give to a question today: that’s why I’m talking about this in a blog post. As I’ve mentioned repeatedly, we’ve only started thinking about the future of XR in this way very recently, and the few thoughts we have aren’t fully-formed. That is exactly why we feel it’s important to invite the Babylon.js Community into the conversation as soon as we possibly can. Right now the conversation is mainly hypothetical, but before too long we hope it’ll start to transform into prototypes, then APIs, then demos, then usages. We want the Community to join us on every step of that journey because ultimately it’s the Community that must determine what implementations/APIs/practices are good and useful and right. It’s the Community that will use these capabilities to produce the next and most exciting wave of XR experiences. I love hypotheticals like the Martian Skiff scenario, but they can only take us so far. They’re a start, but they’re imaginary; it’s the Community that makes it real.
— Justin Murray