From Unity to Babylon.js — How is the journey?

Full disclosure: I am a technical artist working for Microsoft on the Babylon.js engine. In the past, however, I have worked full time in the Unity engine on projects for HoloLens, 3D for Everyone, and Real-Time 3D Scanning. So what is it like to move from a native engine like Unity to a WebGL engine like Babylon.js? It was not as jarring as I would have initially thought.

First of all, we need to acknowledge that comparing a native 3D engine to a WebGL engine is not an apples-to-apples comparison. There are limitations in WebGL that are not an issue for a native engine, but on the flip side, the reach of a WebGL experience is much broader than that of a native app. If you are making an experience targeted at a specific piece of hardware, leveraging a native engine is going to give you a lot of flexibility. But when you are trying to reach as many users as possible, regardless of the device they are using, it can be much more powerful to lean into a WebGL solution.

One of my prototypes for describing a point cloud with particles using depth of field

In my work across several teams at Microsoft, my main focus was to produce prototypes of user interactions, UI layouts, visual effects, and more. The best way for me to do that was to create a Unity project of that specific interaction, or even mock up the whole experience so we could test the end-to-end user flow. It was much faster to let the engineers focus on the hard work of writing production-ready code while I worked fast and loose just to get an experience in front of design for testing. This was a good way to reduce code churn for the engineers when integrating features, since everything had been user-tested already. Better yet, if the prototype could be deployed on the target hardware, we could get a feel for the form factor while testing it.

For Unity, that meant creating and side-loading a unique package for each specific device. This was a tax that had to be paid with every iteration, and it could be very annoying if a small bug or typo needed to be corrected. It also resulted in a lot of passing around of specific devices, since a device needed to have the prototype side-loaded onto it before it could be used for testing. The prototype also needed hidden debug UI so we could check whether a device was running the latest version. From time to time we would find a device that had not been updated and was being tested on an old build, which was a waste of time for everyone.

Photo by Halacious on Unsplash

Just from the standpoint of a WebGL prototype being up to date and usable on any device with a web browser, I could see that this would be a faster method of iteration while keeping all testing devices current. No creation of individual packages per device. No manual side-loading. No need to get a specific device into the hands of someone wanting to review the prototype. And with the option of creating a Progressive Web App, I could see getting a lot of flexibility out of any prototype that I made.

But I had been very comfortable working in Unity, so I was a little uneasy moving from that environment and C# to writing an entire prototype in JavaScript or TypeScript to be viewed in a browser. When I really looked at what I was doing in Unity, though, a lot of it was file organization and custom scripts attached to assets imported into the engine. That part wouldn’t change. But I also recognized that I had relied heavily on the environment’s tools like the Animation Controller state machine, the sprite editor, or even plug-ins like Shader Forge (and now the integrated Shader Graph).

A custom lighting shader made in the legacy Unity plugin Shader Forge

When I first started working in Babylon.js, I did miss many of Unity’s tools while I found my footing doing these same tasks manually in code. Then we started talking as a team about what we could do to make Babylon.js a better environment for development. We always strive to make the engine better technically — faster, smaller, more accurate rendering — but what about making things faster for the developer? For example, when we discussed the challenges new users face trying to write shaders in GLSL, we saw a huge opportunity to create a tool that makes shader creation easier and faster while leaving the optimization of the code to someone who really knows how to write GLSL shaders. With that, the Node Material and its accompanying editor were born. We wanted this tool to feel familiar to our users, so we looked at the landscape of tools using node editors: Unity, Unreal, Substance Designer, Houdini, etc. We decided to lean into the interactions set by other tools in the industry so that our users could rely on workflows they are already familiar with while using Babylon.js.

A custom lighting shader made in the Babylon.js Node Material Editor

We knew that we could not stop with just the Node Material Editor, because there were so many other operations we could make faster with a targeted tool. To that end, the upcoming 4.2 release introduces a particle system editor, a sprite editor, a texture inspector, HDR prefiltering tools, and the addition of post processes, procedural textures, and particles to the Node Material Editor. And this is just the start of the full list of tools we envision for the future of Babylon.js.

So what do I now think about switching from an environment like Unity to Babylon.js? There are good reasons to use each, and you should weigh the benefits of both options. When thinking about a web solution, however, I prefer the control I have over the experience by writing the JavaScript/TypeScript myself while still having access to these time- and sanity-saving tools. And one of my favorite parts of Babylon.js is the playground, which we use every day to quickly test out ideas, illustrate solutions for questions on the forum, or easily share code with the community.

The Babylon.js playground is a great place to try out, debug, or share code

And for debugging assets, trying out materials and shaders, and quickly iterating on an asset’s rendering, the sandbox is an invaluable tool. This is especially critical when authoring content for web consumption, so that you can see exactly how your asset will render in real time. Relying on an offline renderer to author content meant for real-time rendering can lead to undesirable results. Luckily, the sandbox offers simple drag-and-drop loading of several file types, which gets you up and running to test your asset and even generate renders right from the inspector.

The Babylon.js sandbox offers a scene explorer and property inspector for debugging and experimentation

If you are currently using a native engine and are thinking about a WebGL solution where you can write your code once and have it visible on a variety of devices, there’s no better time to give Babylon.js a try. Version 4.2 is almost upon us and we are already looking to the future of the next tools we want to add to the engine. I hope you will come along with us for the journey!

Patrick Ryan
Senior Technical Artist, Microsoft

https://twitter.com/PatrickCRyan

Babylon.js: Powerful, Beautiful, Simple, Open — Web-Based 3D At Its Best. https://www.babylonjs.com/
