Creating a Product Customization Experience with Babylon.js

Babylon.js · Jun 12, 2023


Most of us have likely seen an e-commerce experience that incorporates 3D representations of products to let the user explore them from different angles, or even renders a product in mixed reality to give a sense of scale in the user’s space. One aspect of the shopping experience that is unlocked by real-time rendering engines like Babylon.js is allowing the customer to design the look of the product through their choices of colors and materials and see the result of those choices immediately. An excellent example of this is the Nike By You Custom Shoes experience, powered by Babylon.js, which allows users to shape every detail of their shoes to their liking.

Nike By You allows users to customize every aspect of the shoe to their liking.

But what if the customization experience we want to create is more free-form? There is a very large market for custom-branded products, offering to put a company’s logo and information on a host of products to promote the business. Beyond that, there is also a sizeable customer base looking for singular or small-run custom products tailored to technologies like dye sublimation or direct-to-garment printing. When ordering these types of products online, we will likely see a 2D experience where we can upload an image and possibly place it within the imprint area. Sometimes there is a mock-up of the final product in another step, where the uploaded image is rendered on an image of the product. In many cases, it’s hard to get a true sense of the material properties of the product or how the finished product will look.

This is where we can rely on the physically based rendering (PBR) standards employed in e-commerce today, combined with dynamic texture creation, to blend PBR material rendering with user-generated textures. To demonstrate, I have created a small experience that allows the user to customize the imprint of a skateboard deck to their liking from a combination of colors and graphics.

A series of six renderings of a skateboard showing customization steps of the bottom of the deck. The user chooses a red rectangle, makes it a narrow stripe and rotates it to 45 degrees. Next they add an orange field behind the stripe before rotating it and moving it down vertically to align with the stripe.
From left to right, the user adds colored rectangles to the base skateboard deck and is able to scale, rotate, and translate each color to create a more complex layered graphic.

In the image above, we start on the left with a skateboard asset using a PBR material and lighting designed to highlight the material properties. The user then chooses a deep red color for the imprint, creating a rectangle of that color which fills the imprint area. Wanting to create a stripe, the user makes the rectangle narrow and then rotates it to the desired angle. To make a more complex design, the user adds a second rectangle in orange below the first before rotating it and adjusting its vertical position to line it up with the original rectangle.

Even with this very simple example using only rectangles, we can see the potential of creating a complex design by layering multiple elements. An important aspect to note here is that the material properties of the deck don’t change when we add the custom graphics to the product. As we can see in the close-up image below, the light reflecting off the surface of the skateboard material does not change when the graphics are added, because we are inserting these graphics into the material’s lighting calculations and mixing them with the base color texture. This preserves the metallic and roughness values of the original material so that the graphics feel like they are integrated into the material with a consistent surface. We could certainly also modify the metallic and roughness values of the custom graphics if needed, but for this example I wanted to make the graphics feel as though they are painted right on the surface with a varnish over them.

A close up of the skateboard with the red stripe and orange field printed over a wood grain material. The lighting reflecting on the surface is uniform from the wood grain across both colors showing the surface texture influenced by the wood.
A close view of the specular reflections on the skateboard deck shows a consistent feel across the wood and color graphics implying that the graphics from the dynamic texture are part of the material.

There are other methods for adding custom graphics to a mesh in real-time rendering, such as decals. A decal is a projection of an image onto a surface based on its intersection with a cube. The decal is often a second mesh rendered on top of the original mesh with its own material. This technique is often seen in video games for any trace of a character affecting the environment temporarily, such as wet footprints on the ground. Blending the material qualities of the original mesh with the decal, however, is more difficult with this approach.
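For reference, a classic mesh decal in Babylon.js is typically spawned from a pointer pick on the target mesh. This is only a rough sketch of that pattern, and the decal material here is assumed to be created elsewhere:

// Sketch of a classic mesh decal placed where the pointer hits the target mesh.
const pickInfo = scene.pick(scene.pointerX, scene.pointerY);
if (pickInfo.hit) {
    const decal = BABYLON.MeshBuilder.CreateDecal("decal", pickInfo.pickedMesh, {
        position: pickInfo.pickedPoint,
        normal: pickInfo.getNormal(true),
        size: new BABYLON.Vector3(1, 1, 1)
    });
    decal.material = decalMaterial; // a separate material from the mesh underneath
    decal.material.zOffset = -2;    // nudge the decal forward to avoid z-fighting
}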

Another decal method, which is available in Babylon.js, is a Texture Decal, where the decal projection is written onto the base color texture rather than being a separate mesh and material. Looking at the image below, we can see that the spray paint spots on the brick conform to the surface and are placed with a mouse click. These graphics are written directly into the base color texture, and while this retains the material properties of the original mesh, it also changes the original base color texture, so editing the decal afterward becomes difficult.

A sphere made of rough and aging bricks with four spots spray painted on the brick with paint running down the sphere. The paint colors are blue, red, white, and green and the paint conforms to the surface of the brick mesh.
Texture decals in Babylon.js will write an image directly into the base color texture of the material helping to maintain the same feel as the base material but are difficult to edit once written.

This is where the dynamic texture feature of Babylon.js shines. We can draw anything we want to the texture using standard HTML canvas features and then update the texture as often as we need. This allows us to draw paths, rectangles, or images in any order to the texture and then change, update, or reorder them as much as needed. Setting this up takes a little planning, but there is nothing that is too difficult, so let’s walk through the process.

Mesh UVs

The first place to start is planning where we want to allow for a custom graphic on the asset. This could be a live imprint area on a mug or shirt. In this case, we target the bottom of the skateboard deck but don’t want the dynamic texture to render anywhere else. The easiest way to do this is to create multiple UV sets for the mesh. There will be a base UV set that will define the default texture unwrap for the mesh as usual.

A screen capture of the Maya UI showing the model for the skateboard as well as the UV layout of all pieces of the mesh packed into the 0–1 space.
The base UV set containing all UV islands in one layout which maps the PBR textures to the mesh.

Then we want a second UV set that uses all of 0–1 UV space for the imprint area of the skateboard deck. This will maximize the texel density of the texture while isolating the texture to only the area we care about. The one thing to note here is that the imprint area has been stretched out of its original proportions to fill 0–1 space. This maximizes the use of texels in the dynamic texture, but we will create the texture in the proportions of the original mesh area so that there is no stretching of the texels.

The Maya UI showing the skateboard mesh as well as a second UV set that contains only the bottom faces of the deck stretched to fill 0–1 space
The second UV set mapping only the bottom of the skateboard deck stretched to fill 0–1 space

The other note here is that the rest of the UV islands, which we don’t want our dynamic texture to render on, are no longer within 0–1 UV space. Babylon.js needs to have a UV coordinate for every vertex in the mesh, and the Maya exporter will not export a UV set without every vertex mapped into UV space. That does not mean, however, that every vertex needs to be mapped into 0–1 space. To isolate these islands from our imprint area, we position them elsewhere in UV space, in this case in the 2–3 range. This allows us to make a simple comparison in the shader and mix the dynamic texture only with mesh UVs mapped to values less than or equal to 1.0.

The Maya UI showing the skateboard mesh and the entirety of the second UV set layout with the bottom of the skateboard deck stretched to fill 0–1 space and all other UV islands laid out in 2–3 space
The second UV set pushes all faces that should not render the dynamic texture outside of 0–1 space to ensure we can isolate those islands from drawing the texture.

The node material created for this asset is basically a standard PBR Metallic Roughness shader with a simple addition to blend the dynamic texture with the base color texture. The image below is the entire node material graph which is minimal in terms of complexity. Most of the heavy lifting is done by the PBRMetallicRoughness block which handles the lighting calculations for the shader.

Node Material Editor graph for the skateboard shader showing all of the nodes wired to create the final render.
The majority of this node material is just a simple PBRMetallicRoughness graph with a few extra nodes to mix the dynamic texture with the base color texture.

The one additional piece to this graph is the extra Texture block to which we wire the dynamic texture. There are a couple of assumptions made here to make this work. The first is that our dynamic texture has an alpha channel and that anything drawn to the texture — such as a path, rectangle, or image — also has an alpha value. This is the main way we combine the dynamic texture and the base color texture: if the alpha of a given texel is 0.0, the corresponding texel from the base color texture is used, and if the alpha of a texel is 1.0, that texel from the dynamic texture is used. Any value in between will mix the texels from the two textures according to the alpha value of the texel.

Note that the alpha value of an image drawn to the context can also be leveraged to enhance the design. In the image below the wavy lines have an overall alpha value of 0.25, with the area between the lines having an alpha value of 0.0. When that image is drawn over two rectangles of different colors, the waves blend with each color according to their alpha value. The color of the waves, which is a warm white, will blend 25 percent of that warm white with whatever color it is drawn over. This is a good method to make a graphic more flexible and usable over any color.

Skateboard deck with a half blue, half orange imprint on the bottom with wavy lines running across both colors rendering light blue on the blue half and light orange on the orange half
Taking advantage of alpha in graphics can make them more versatile as can be seen here. The wavy lines use alpha transparency to create a different shade when used on top of contrasting colors.

The second assumption we make with this asset is that all UV islands that shouldn’t display the dynamic texture are mapped to UV coordinates greater than 1.0. We can see in the image below that we take each UV coordinate and test whether it has a value less than or equal to 1.0. If true, we pass a value of 1.0, and if false, we pass a value of 0.0. This value is multiplied with the alpha value of the dynamic texture used above to mix the two textures. This means that any vertex mapped to a UV coordinate greater than 1.0 will always return the base color texture, even if the dynamic texture has an alpha value that would otherwise allow a texel from the dynamic texture to render.

Node material graph showing how the dynamic texture is mixed with the base color texture by using a lerp node based on the dynamic texture alpha multiplied by a test of whether a UV coordinate is less than or equal to 1.0.
The part of the graph that mixes the dynamic texture with the base color texture. We use both the alpha of the dynamic texture and the UV coordinate value to determine which texture to pass on to the lighting calculation.
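Expressed as plain JavaScript rather than nodes, the per-texel logic the graph implements looks roughly like this. The real blend happens inside the node material’s shader, so the function name and color object shape here are purely illustrative:

// Conceptual sketch of the per-texel blend the node graph performs.
function blendTexel(baseColor, dynamicColor, dynamicAlpha, u, v) {
    // UVs outside 0-1 space (the islands parked in the 2-3 range) never show the dynamic texture.
    const insideImprint = u <= 1.0 && v <= 1.0 ? 1.0 : 0.0;

    // The lerp factor is the dynamic texture's alpha gated by the UV test.
    const mix = dynamicAlpha * insideImprint;

    return {
        r: baseColor.r + (dynamicColor.r - baseColor.r) * mix,
        g: baseColor.g + (dynamicColor.g - baseColor.g) * mix,
        b: baseColor.b + (dynamicColor.b - baseColor.b) * mix
    };
}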

Tying it all together with code

The first two components — a mesh with appropriate UVs and a node material — are the biggest parts of creating a product customizer. All that is left is to write the code needed to update and manage the dynamic texture. There are a host of other considerations for the UI and UX of the experience when it comes to the interface, feature set, allowing users to upload their own images, and more. However, those will be influenced by the needs of each individual experience this technique could be used for. The user interface implemented in this example is just a simple illustration of the technique, but it could also be a springboard for other experiences.

The first thing we need to do is create the dynamic texture. Remember that we are stretching the UVs of our imprint area to fit the entirety of 0–1 space. This is where we correct for that distortion by creating the texture with dimensions that match the original aspect ratio of the mesh faces. In the case of the skateboard, the width of the imprint area is about a third of its height. To keep the texel density of the two textures the same, we assign the height of the base color texture to the height of the dynamic texture and a third of that value to its width. Simply use:

let myDynamicTexture = new BABYLON.DynamicTexture("dynamicTex", {width: imageWidth, height: imageHeight}, scene);
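As a concrete sketch, the dimensions passed in above might be derived like this; the 2048-pixel base texture and the one-third ratio are assumptions specific to this skateboard asset:

// Assumed values for this asset: a 2048px-tall base color texture and an
// imprint area roughly a third as wide as it is tall.
const baseTextureHeight = 2048;
const imageHeight = baseTextureHeight;          // match the base color texture's texel density
const imageWidth = Math.round(imageHeight / 3); // preserve the imprint area's original proportions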

Next, we need to load any images that we want to draw into the dynamic texture. Make sure these are in a format that supports an alpha channel like the .png format. With the texture created and the images loaded, we have everything we need to create a custom texture. Dynamic textures in Babylon.js use the standard HTML canvas methods to draw pixels, so we will need to first get the texture context.

const ctx = myDynamicTexture.getContext();
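The graphics themselves can be loaded with standard browser APIs; a minimal sketch, with a hypothetical file path, might be:

// Load a graphic (hypothetical path) so it is ready to draw into the context.
const waveGraphic = new Image();
waveGraphic.onload = () => {
    // Safe to draw this image to the dynamic texture from here on.
};
waveGraphic.src = "textures/waves.png";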

We will use the context to update positions, order, and size of the content we draw. That update will come from user input, usually in the form of a slider value changing or a button being pressed. This input tells us we need to update the dynamic texture. When updating the texture there are a couple of important steps, particularly when we are transforming the content we draw to the context. The first is that we need to clear the texture so that we don’t have any ghosting from previous draw updates.

ctx.clearRect(0, 0, textureWidth, textureHeight);

Next, we need to save the context in its current form as we will rotate and/or translate the context before we draw to it and then need to return it to the original state to render. We do this with:

ctx.save();

With the context saved, we can then translate and rotate the context with:

ctx.translate(positionX, positionY); // translate takes only an x and y offset
ctx.rotate(valueInRadians);

This handles any positioning of the context before we draw our content, such as a rectangle:

ctx.fillStyle = "red";
ctx.fillRect(left, top, width, height);

The last step is to restore the context to the save point we started with, which will undo any translation or rotation we performed on the context:

ctx.restore();

This is the process for drawing content that is not aligned with the texture. We always draw the content in the middle of the canvas with no rotation, so translating and rotating the context before drawing makes the content appear positioned and rotated the way we want.

There is only one more step here, which is to update the dynamic texture so the user can see their changes with:

myDynamicTexture.update();

Those are all the steps we need to draw a custom piece of content to the dynamic texture. To add a layer system, we simply loop through the above steps once per layer. In the case of this example, we limit the number of layers to five, so we loop through drawing to the context five times and then update the texture at the end. All the data for our layers is stored in an array holding the type of content — rectangle or image — as well as its size, position, and rotation. If we want to reorder the layers, we simply move the elements of the array around and then loop through them, drawing each one to the context in turn.
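A rough sketch of that layer loop might look like the following; the layer array and its fields are hypothetical and would be shaped by the needs of each experience:

// Hypothetical layer data: each entry describes one piece of content to draw.
const layers = [
    { type: "rect", color: "#8b1a1a", x: 170, y: 900, width: 300, height: 40, rotation: Math.PI / 4 },
    { type: "image", image: waveGraphic, x: 170, y: 900, width: 256, height: 256, rotation: 0 }
];

function redrawLayers() {
    // Clear the whole texture so previous draws don't ghost through.
    ctx.clearRect(0, 0, textureWidth, textureHeight);

    for (const layer of layers) {
        ctx.save();
        ctx.translate(layer.x, layer.y); // move the origin to the layer's position
        ctx.rotate(layer.rotation);      // rotate around that origin

        // Draw the content centered on the translated origin.
        if (layer.type === "rect") {
            ctx.fillStyle = layer.color;
            ctx.fillRect(-layer.width / 2, -layer.height / 2, layer.width, layer.height);
        } else {
            ctx.drawImage(layer.image, -layer.width / 2, -layer.height / 2, layer.width, layer.height);
        }

        ctx.restore();
    }

    // One update after all layers are drawn pushes the result to the texture.
    myDynamicTexture.update();
}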

Beyond this example

As we can see in the playground linked above, even re-drawing the dynamic texture as quickly as a user might change a slider value is still extremely responsive, creating a compelling experience for the user. And the ability to add, delete, move, rotate, and reorder graphics in the dynamic texture applied to a product will result in a sticky experience. Even with only a few images and colors, the variety of designs that can be achieved is quite surprising.

An image of several skateboards with different custom designs. The top and bottom of the raw wood deck are shown on the left with three unique designs shown to the right. The first is a scowling blue face on a blue background with white stripes, the second is a vector illustration of a horse head with orange and purple swirling mane, and the third a yelling orange face with orange and blue stripey background.
The default material for the skateboard on the left creates the base for a surprising amount of variation in graphic layouts on the right that can be created in real time by the user.

There are so many other things that can be added to this experience to expand its power. Everything from allowing the user to assign colors to graphics drawn into the dynamic texture, to uploading their own images, to allowing users to create as many layers as they want, to allowing users to save and load designs can create a memorable and exciting experience. I hope this sparks some ideas for how a custom product experience can elevate an e-commerce experience and how easy it can be to implement a similar system. Happy customizing!

Patrick Ryan (@PatrickCRyan)

Written by Babylon.js

Babylon.js: Powerful, Beautiful, Simple, Open — Web-Based 3D At Its Best. https://www.babylonjs.com/
