WebXR, AR and e-commerce: A Guide for Beginners
A few years back, I was thrilled about a rug I found online. It would be perfect for my living room. Once I finally received it, I unrolled the rug on the floor and… Ta-da… No, not at all! It was actually horrible. The rug was nice by itself, but I just hadn’t pictured how badly its color would clash with the rest of the room. I had to go through the hassle of returning it. Does this sound familiar?
We are now in 2021, and 3D visualization technologies initially developed for gaming and visual effects have begun to transform retail experiences. This transformation has accelerated recently, of course, and I constantly see amazing Augmented Reality (AR) examples on Twitter that bring virtual objects into a living room or a den. For e-commerce, AR can considerably reduce the number of returns due to buyer’s remorse.
End-user adoption friction for AR should also be reduced significantly with the upcoming WebXR API from the W3C. On a supported mobile device, an AR experience becomes accessible through a simple URL, without the need to install a native app. It also becomes so easy to share!
Looking at our Babylon.js implementations of WebXR and of the latest glTF extensions released by the Khronos Group, I finally sat down to build a little demo. In this article, you’ll discover how to create a simple AR experience in the browser: visualizing and manipulating a virtual piece of furniture in your living room.
Background on the latest standards
Before getting into the details of the demo and the code, let’s start with a quick background and a few links for those who want to learn more.
WebXR Device API: The proposal is currently in the draft phase. It allows developers to easily create cross-browser AR/VR web experiences. Babylon.js 4.2 adds support for hand tracking, hit-testing, advanced teleportation, cross-device input management and controller support. To learn more:
New glTF extensions: The 3D Assets and 3D Commerce Working Groups from Khronos have recently released new glTF extensions and guidelines to help align the industry:
- PBR extensions for Clear Coat, Transmission and Sheen, for better visual realism of some 3D assets
- Material Variant Support to combine multiple asset variants that share the same geometry into a single glTF asset
- Real-time Asset Creation Guidelines to “assist artists create efficient, reliable models for retail and e-commerce”
- 3D Commerce Panel — Render Everything Everywhere (Webinar)
Demo scenario
In the demo of this article, the user goes through the following workflow when adding a 3D model in AR:
- Go to the demo URL on a WebXR AR-capable device (currently an Android device with Chrome, the “WebXR Incubation” flag enabled and ARCore installed)
- Enter AR mode (by tapping the headset icon in the bottom-right corner)
- Point the phone at the floor to make a pre-visualization of the model (a “ghost” version) appear, and move it to the desired location
- Tap anywhere on the screen to make the full model appear, and repeat until you find a position that fits well in the living room
- Change the variant to check which one matches best (“Variant” button)
- Finalize the position and walk around it in the room to see how it fits overall (“Set” button)
Clone and variant
Let’s start with some features that we can check on a regular machine with this Babylon.js Playground example, before going into the AR side of the demo. The advantage of doing this separately is that testing is easier and faster since we don’t have to use the AR mobile device.
Cloning the model to create a “ghost”: To position the model in the real world, we’ll use a “ghost” version of it: a black and white, semi-transparent version that we move around the room. This is done by cloning the mesh and replacing the material of its sub-meshes with a grey one whose alpha is set to 0.25 for the transparent look.
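Here is a minimal sketch of that cloning step, assuming the chair’s root mesh returned by the glTF loader is stored in a variable called chairRoot (the names are illustrative, not the demo’s exact code):

```javascript
// Minimal sketch, not the demo's exact code: clone the loaded chair
// and turn the clone into a semi-transparent "ghost".
// Assumption: chairRoot is the root mesh returned by the glTF loader.
const ghost = chairRoot.clone("chairGhost");

// One shared grey material with an alpha of 0.25 gives the black and white,
// see-through look.
const ghostMaterial = new BABYLON.StandardMaterial("ghostMat", scene);
ghostMaterial.diffuseColor = BABYLON.Color3.Gray();
ghostMaterial.alpha = 0.25;

// Apply it to every sub-mesh of the clone.
ghost.getChildMeshes().forEach((mesh) => {
    mesh.material = ghostMaterial;
});

// The real model stays hidden until the user taps to place it.
chairRoot.setEnabled(false);
```

Sharing a single material across all sub-meshes keeps the ghost cheap to render and easy to toggle on and off.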
Changing the variant of the model: Thanks to the new KHR_materials_variants glTF extension, only one model is loaded and we can switch between its variants (an orange “Mango Velvet” or a blue “Peacock Velvet” version of the Sheen Chair in our case), each using different textures.
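As a rough sketch of the variant switching, assuming the static helpers exposed by the Babylon.js glTF loader for this extension (the asset path below is a placeholder, and the variant names are the ones mentioned above):

```javascript
// Minimal sketch: load the Sheen Chair and switch between its material variants.
// Assumption: the URL is a placeholder for wherever the glTF asset is hosted.
const variantsExt = BABYLON.GLTF2.Loader.Extensions.KHR_materials_variants;

BABYLON.SceneLoader.ImportMeshAsync("", "path/to/assets/", "SheenChair.glb", scene)
    .then((result) => {
        const root = result.meshes[0];

        // List the variants declared in the file,
        // e.g. ["Mango Velvet", "Peacock Velvet", ...].
        console.log(variantsExt.GetAvailableVariants(root));

        // Switch the whole chair to the blue variant.
        variantsExt.SelectVariant(root, "Peacock Velvet");
    });
```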
Setting up the AR scene
Finally, the time to get into the AR part of the code has come! Babylon.js makes it really easy to use WebXR.
The implementation offers WebXR Experience Helpers, which initialize WebXR automatically for you and provide an environment in which you can develop your XR (AR in our case) experiences. The Helper does it all for you: it initializes the XR scene, creates an XR camera, initializes the features manager, creates an HTML UI button to enter XR, and so on.
It also offers the WebXR Features Manager, our XR plugin system, which allows you to pick the WebXR features you need (in our case, only hit-testing) and ensures backward compatibility as the API evolves.
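A minimal sketch of this setup could look like the following, assuming a scene variable already exists; the reference space type is an assumption that suits a room-scale demo:

```javascript
// Minimal sketch: create the default WebXR experience in AR mode
// and enable hit-testing through the features manager.
const xr = await scene.createDefaultXRExperienceAsync({
    uiOptions: {
        sessionMode: "immersive-ar",       // AR instead of the default VR
        referenceSpaceType: "local-floor"  // assumption: fine for a room-scale demo
    },
    optionalFeatures: true
});

// The features manager lets us pick only the features we need.
const featuresManager = xr.baseExperience.featuresManager;
const hitTest = featuresManager.enableFeature(BABYLON.WebXRHitTest, "latest");
```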
Positioning with hit-testing
The magic of WebXR and AR comes into play with the hit-testing feature. In the demo, it is used to position the model in the real world.
As explained in the documentation: “think about a ray that is broadcast from your phone’s screen towards the object you are looking for. If the device’s AR capabilities allow it, it will let you know the position and orientation of the point relative to you.”
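Here is a hedged sketch of how the hit-test results could drive the demo, reusing the ghost, chairRoot and hitTest variables from the sketches above; the tap handling is simplified compared to the real demo, which also wires up the “Variant” and “Set” buttons:

```javascript
// Minimal sketch: follow the hit-test results with the ghost,
// then place the real model on tap.
// Assumptions: ghost, chairRoot and hitTest come from the snippets above.
ghost.rotationQuaternion = ghost.rotationQuaternion || new BABYLON.Quaternion();

hitTest.onHitTestResultObservable.add((results) => {
    if (results.length) {
        // Move the ghost to the point where the ray hit the real world.
        results[0].transformationMatrix.decompose(
            ghost.scaling, ghost.rotationQuaternion, ghost.position
        );
        ghost.setEnabled(true);
    } else {
        ghost.setEnabled(false);
    }
});

// On a screen tap, copy the ghost's pose onto the real model and show it.
scene.onPointerDown = () => {
    chairRoot.rotationQuaternion = ghost.rotationQuaternion.clone();
    chairRoot.position.copyFrom(ghost.position);
    chairRoot.setEnabled(true);
};
```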
What’s next
Building this little demo was a lot of fun. How often is what we build so closely connected to our daily lives? It could be enhanced with shadows, which would further blur the lines between the two worlds: the 3D model would appear to cast a shadow on the real floor (this could be done using Plane Detection, like in this Playground example, or by simply adding an invisible plane and a dark ellipse under the chair at floor level). Another, much bigger, addition would be to take this JavaScript code and reuse it in native apps across devices. This is the promise of Babylon Native and, probably, one of my next demos!
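For the simpler of the two shadow ideas, a rough sketch could look like this; the disc radius and offsets are arbitrary values chosen for illustration, and the chair’s origin is assumed to sit at floor level once it is placed with hit-testing:

```javascript
// Minimal sketch of the "dark ellipse" idea: a flat, semi-transparent dark disc
// parented to the chair so it follows it around and fakes a contact shadow.
// Assumption: the radius and offsets are arbitrary illustration values.
const fakeShadow = BABYLON.MeshBuilder.CreateDisc(
    "fakeShadow", { radius: 0.45, tessellation: 64 }, scene
);
fakeShadow.rotation.x = Math.PI / 2;  // lay the disc flat on the floor
fakeShadow.position.y = 0.001;        // slightly above the floor to avoid z-fighting

const shadowMaterial = new BABYLON.StandardMaterial("fakeShadowMat", scene);
shadowMaterial.diffuseColor = BABYLON.Color3.Black();
shadowMaterial.specularColor = BABYLON.Color3.Black();
shadowMaterial.alpha = 0.4;
shadowMaterial.backFaceCulling = false; // visible from above regardless of winding
fakeShadow.material = shadowMaterial;

fakeShadow.parent = chairRoot;  // follows the chair wherever it is placed
```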
Thomas Lucchini — Babylon.js