Full disclosure: I am a technical artist working for Microsoft on the Babylon.js engine. In the past, however, I have worked full time in the Unity engine on projects for HoloLens, 3D for Everyone, and Real-Time 3D Scanning. So what is it like to move from a native engine like Unity to a WebGL engine like Babylon.js? It was not as jarring as I would have initially thought.
First of all, we need to acknowledge that comparing a native 3D engine to a WebGL engine is not an apples-to-apples comparison. WebGL has limitations that are not an issue for a native engine, but on the flip side a WebGL experience is accessible to a much broader audience than a native app. If you are making an experience targeted at a specific piece of hardware, a native engine is going to give you a lot of flexibility. But when you are trying to reach as many users as possible regardless of the device they are using, it can be much more powerful to lean into a WebGL solution.
In my work across several teams at Microsoft, my main focus was to produce prototypes of user interactions, UI layouts, visual effects, and more. The best way for me to do that was to create a Unity project of that specific interaction, or even mock up the whole experience so we could test the end-to-end user flow. It was much faster to let the engineers focus on the hard work of writing production-ready code while I worked fast and loose just to get an experience in front of design for testing. This was a good way to reduce code churn for the engineers when integrating features, since everything had already been user-tested. Better yet, if the prototype could be deployed on the target hardware, we could get a feel for the form factor while testing it.
For Unity, that meant creating and sideloading a unique package for each specific device. This was a tax that had to be paid with every iteration, and it could be very annoying if a small bug or typo needed to be corrected. It also resulted in a lot of passing around of devices, since a device needed to have the prototype sideloaded onto it before it could be used for testing. We also needed a hidden debug UI in the prototype so that we could confirm a device was running the latest version. From time to time we would find a device that had not been updated and was being tested on an old build, which was a waste of time for everyone.
Just from the standpoint of a WebGL prototype being up to date and usable on any device with a web browser, I could see that this would be a faster method of iteration while keeping all testing devices current. No creation of individual packages per device. No manual sideloading. No need to get a specific device into the hands of someone wanting to review the prototype. And with the option of creating a Progressive Web App, I can see how I could get a lot of flexibility out of any prototype I make.
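To make the "any device with a browser" point concrete, here is a minimal sketch of what such a browser-hosted prototype looks like. The scene contents (a single sphere, camera, and light) are illustrative, not from any particular project; the only assumption is loading Babylon.js from its public CDN so every tester who opens the URL gets the latest deployed build.

```html
<!DOCTYPE html>
<html>
<head>
  <meta charset="utf-8" />
  <title>Prototype</title>
  <!-- Babylon.js loaded straight from the CDN; no per-device package or sideloading -->
  <script src="https://cdn.babylonjs.com/babylon.js"></script>
  <style>html, body, #renderCanvas { width: 100%; height: 100%; margin: 0; }</style>
</head>
<body>
  <canvas id="renderCanvas"></canvas>
  <script>
    const canvas = document.getElementById("renderCanvas");
    const engine = new BABYLON.Engine(canvas, true);
    const scene = new BABYLON.Scene(engine);

    // An orbiting camera with touch/mouse controls, so the same page works on phone or desktop.
    const camera = new BABYLON.ArcRotateCamera(
      "camera", Math.PI / 2, Math.PI / 3, 5, BABYLON.Vector3.Zero(), scene);
    camera.attachControl(canvas, true);

    new BABYLON.HemisphericLight("light", new BABYLON.Vector3(0, 1, 0), scene);
    BABYLON.MeshBuilder.CreateSphere("sphere", { diameter: 1 }, scene);

    engine.runRenderLoop(() => scene.render());
    window.addEventListener("resize", () => engine.resize());
  </script>
</body>
</html>
```

Host this single file anywhere, and every tester is automatically on the latest version the moment they refresh the page.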
When I first started working in Babylon.js, I did miss many of Unity's tools while I found my footing doing the same tasks manually in code. Then we started talking as a team about what we could do to make Babylon.js a better environment for development. We always strive to make the engine better technically: faster, smaller, more accurate rendering. But what about making things faster for the developer? One example was when we started talking about the challenges new users face when writing shaders in GLSL, and saw there was a huge opportunity to create a tool that makes shader creation easier and faster while leaving the optimization of the generated code to someone who really knows how to write GLSL shaders. With that, the Node Material and its accompanying editor were born. We wanted this tool to feel familiar to our users, so we looked at the landscape of tools using node editors: Unity, Unreal, Substance Designer, Houdini, etc. We decided to lean into the interactions set by other tools in the industry so that our users could rely on workflows they are already familiar with while using Babylon.js.
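A sketch of how a Node Material built in the editor gets used in an app: a material can be constructed entirely in code, or a graph saved to the Babylon.js snippet server from the Node Material Editor can be loaded by its ID. The snippet ID below is a placeholder, and `scene` and `mesh` are assumed to exist from the usual engine setup.

```html
<script src="https://cdn.babylonjs.com/babylon.js"></script>
<script>
  // Option 1: build a Node Material entirely in code, starting from the default graph.
  const codeMaterial = new BABYLON.NodeMaterial("codeMaterial", scene);
  codeMaterial.setToDefault();
  codeMaterial.build();
  mesh.material = codeMaterial;

  // Option 2: load a graph authored visually in the Node Material Editor.
  // "SNIPPET_ID" is a placeholder for the ID shown when you save to the snippet server.
  BABYLON.NodeMaterial.ParseFromSnippetAsync("SNIPPET_ID", scene)
    .then((nodeMaterial) => {
      mesh.material = nodeMaterial;
    });
</script>
```

The appeal of the second path is that an artist can iterate on the graph in the editor and re-save, while the application code keeps referencing the same snippet.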
We knew that we could not stop with just the Node Material Editor, because there were so many other operations we could make faster with a targeted tool. To that end, the upcoming 4.2 release introduces a particle system editor, a sprite editor, a texture inspector, HDR prefiltering tools, and support for post processes, procedural textures, and particles in the Node Material Editor. And this is just the start of the tools we envision for the future of Babylon.js.
And for debugging assets, trying out materials and shaders, and quickly iterating on an asset's rendering, the sandbox is an invaluable tool. This is especially critical when authoring content for web consumption, so that you see exactly how your asset will render in real time. Relying on an offline renderer to author content meant for real-time rendering can lead to an undesirable outcome. Luckily, the sandbox offers simple drag-and-drop loading of several file types, which gets you up and running to test your asset and even generate renders right from the inspector.
If you are currently using a native engine and are thinking about a WebGL solution where you can write your code once and have it visible on a variety of devices, there’s no better time to give Babylon.js a try. Version 4.2 is almost upon us and we are already looking to the future of the next tools we want to add to the engine. I hope you will come along with us for the journey!
Senior Technical Artist, Microsoft