Another take at decals

Captain Caveman forever!

Before getting to the real subject of this post, let me introduce myself in a few words, as is the custom when a new team member publishes their first post on Medium!

So, I’m Evgeni Popov and I’ve been a (proud!) new member of the Babylon.js team for 3 weeks now… Talk about telling two lies in the same sentence! Truth be told, my name is Alexis Vaginay, I’m from France, and even though I was hired by Microsoft 3 weeks ago, I’ve been working with the Babylon.js team for two years, in 2020 and 2021. People who lurk on the Babylon.js forums probably know me because I’m a pretty active member (this badge is a bit scary knowing that I got it a month ago and that for the first 9 months of this year I was on “vacation” — at least I wasn’t working with the team…).

Evgeni Popov is a name I chose a long time ago (around 1997–98, I think) to avoid using my real name when I was surfing the Internet, which was just getting started. I used this name because I love Russian names 😃 (“Evgeni” comes from Ievgueni Kafelnikov, who was a great tennis player, and “Popov” from Aleksandr Popov, who was a great swimmer).

“Goldorak, Go!”

I am especially interested in computer graphics, even though I followed a general engineering curriculum, because no university in France taught computer graphics at the time (yes, I’m that old). That’s why I was part of a small team of demo makers (named Realtime — how appropriate!): with my friend Mythos, we were two coders… and that was it! For some of our productions we asked for help with graphics or music, but overall we produced most of the content ourselves. If you want to check it out, try pouet, but beware of the infamous coder’s art! Our most “famous” demo is Goldorak, which won the Somewhere in Holland contest in 1995. Personally, I’m most proud of the Doom part of the After Death demo, because I coded it before the Doom source code was released by John Carmack (and I remember spending a lot of time fixing my rendering of perspective-correct textured triangles!).

In the Babylon.js team, my main task will be to fix and improve everything related to the core renderer, but that doesn’t mean that I won’t touch other parts of the engine, of course. A number of issues have already been assigned to me, which will keep me busy for at least a few months!

Oh, and about my avatar: I just loved Captain Caveman when I was (very) young and I think it suits me very well because I’m a bit of a caveman myself (I like to stay at home, like an old bear in his cave).

Now that the introductions are done, let’s dive in!

As you know (or maybe not!), decals are textures that you can apply (project) onto any mesh in your scene. They are usually used to add detail to a surface: bullet holes, wall tags, bloodstains, etc.

Babylon.js added decals in v2.1, and we recently improved the support by allowing decals to be added to rigged meshes (see the documentation). This works by creating a mesh that closely matches the mesh onto which the decal is projected (the destination mesh); the decal image is used as the diffuse (or emissive) texture for the material applied to that new mesh.

This works quite well, but there are a number of drawbacks:

  • Each time a decal is created, we have to create a new mesh. This can be time consuming if the destination mesh is complex, with many faces. Creating a large number of decals in a short period of time (such as the bullet holes of a machine gun) may cause the rendering to stutter during mesh creation. Also, the number of draw calls per frame is increased by the number of decals.
  • Depending on the destination mesh, you may need to tweak the material.zOffset property of the decal to avoid z-fighting.
  • This does not work for meshes with morph targets (nor for meshes with custom vertex deformations). If a mesh has morph targets, the decal will not follow the deformations of the morph. For example:

The current method can be called “Mesh Decals” because it creates a new mesh for each new decal. There is (at least) one other method, called “Texture Decals”, which instead creates a (decal) texture for the destination mesh.

It works by rendering the decal in a texture and using that texture as an additional diffuse/detail texture when rendering the target mesh. It solves all the problems listed above:

  • There are no new meshes created and therefore the number of draw calls remains the same with or without the decals.
  • The generation of the decal in the texture is fast because it is done by the GPU, so you can create a lot of decals in a short time.
  • Since it is simply an additional texture that is combined with the other textures in the material shader, it also works if a mesh has morph targets or any other custom vertex deformation.

Of course there are some drawbacks to this method:

  • You need an extra texture per mesh that can receive decals. If you want to support special rendering effects for the decal (like bump, roughness, etc), you will need even more textures (one for bump, one for roughness, etc).
  • If your mesh has a large spatial extent, you may need large decal textures to get enough detail for the decals.
  • You cannot selectively remove some decals and not others in a decal texture: you can either remove them all (by clearing the decal texture or disabling the effect in the mesh), or none.
  • The texture coordinates of the mesh must be unique, which means that each triangle of the mesh has to map to a different texture area.
  • Mipmaps are generally required to limit aliasing issues. If you update a large number of decal textures often, this can have a substantial impact on performance.

Now that we have a better idea of what Texture Decals are, let’s do some coding!

Our main task is therefore to render the decal image into a texture before using that texture as an additional diffuse/detail texture in the mesh material.

However, it is not enough to copy the decal image into the destination texture. If you do that, you will end up with distortions because the mesh usually has some curvature that you have to take into account when drawing the decal in the texture.

For example, let’s draw the lion’s head decal in the decal texture as a simple bit blit (copy):

Wrong decal rendering

As you can see, the lion’s head has been copied directly into the texture (upside down, so that it renders upright on the sphere), but we don’t get the expected result when rendering on the sphere: the texture is deformed because of the curvature of the sphere.

We have to take the curvature into account when we draw the decal image in the texture:

Right decal rendering

Now the lion’s head is distorted in the texture but it appears correctly on the sphere!

How do you distort the image in the first place? Enter texture space rendering!

The uv texture space is where you want to render the decal image, because uvs were (normally!) defined to avoid distortions when applying a texture to a mesh.

To render in texture space, simply use the uv coordinates of the vertex instead of its coordinates in 3D space in the vertex shader:

Vertex shader
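A minimal GLSL sketch of such a vertex shader (the attribute and varying names are illustrative, not necessarily those of the Playground code):

```glsl
attribute vec2 uv;   // the mesh's texture coordinates
varying vec2 vUV;

void main(void) {
    vUV = uv;
    // Remap uv from the [0,1] range to the [-1,1] clip space range.
    // z is not used (except for clipping), so 0 works in both WebGL and WebGPU.
    gl_Position = vec4(uv * 2.0 - 1.0, 0.0, 1.0);
}
```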

The value you set in gl_Position is a clip space position, so the x/y/z components must be between -1..1 (0..1 in WebGPU for the z coordinate). Note that the z coordinate is not used (except by the GPU for clipping), so 0 is a good value for both WebGL and WebGPU.

For demonstration purposes, let’s make a simple fragment shader that simply reads the diffuse texture:

Fragment shader
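A matching GLSL sketch of that fragment shader (again with illustrative names):

```glsl
varying vec2 vUV;
uniform sampler2D diffuseSampler;

void main(void) {
    // Simply output the diffuse texture at the mesh's uv coordinates.
    gl_FragColor = texture2D(diffuseSampler, vUV);
}
```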

So, what do you get when this code is used to generate a texture? The diffuse texture as used by the mesh!

Take this PG: https://www.babylonjs-playground.com/frame.html#HJRLG3

Texture space rendering

Note that we don’t get the full diffuse texture because other parts of the alien use other parts of the texture and we have only rendered the head.

So what we need to do now is to find where the decal image will be projected into this texture and draw it there.

What we need is to generate a projection matrix corresponding to the decal projector:

In this screenshot, the projector is represented by the red box.

We want to project the decal image onto the parts of the mesh that intersect this box. To make this easier, the matrix we are going to compute will project all the points inside the box to the coordinates 0..1 for x/y/z. This way, it is easy to reject (clip) the points that are not inside the box. And conveniently, the x/y coordinates are also the coordinates we need to use to read from the decal image! See the createDecalMatrix function in the demo PG below for the math used to create the matrix.
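To give an idea of the math, here is a self-contained JavaScript sketch (plain arrays, no Babylon.js dependency) of the mapping such a matrix performs. It assumes the projector box is a cube of edge length size, centered on position and oriented along normal; the function and helper names are made up for this illustration:

```javascript
// Returns a function mapping world-space points to decal coordinates:
// points inside the projector box land in [0,1] on x, y and z.
function createDecalTransform(position, normal, up, size) {
  const cross = (a, b) => [
    a[1] * b[2] - a[2] * b[1],
    a[2] * b[0] - a[0] * b[2],
    a[0] * b[1] - a[1] * b[0],
  ];
  const dot = (a, b) => a[0] * b[0] + a[1] * b[1] + a[2] * b[2];
  const normalize = (a) => {
    const l = Math.hypot(a[0], a[1], a[2]);
    return [a[0] / l, a[1] / l, a[2] / l];
  };

  // Orthonormal basis for the projector.
  const zAxis = normalize(normal);           // projection direction
  const xAxis = normalize(cross(up, zAxis)); // "right" of the projector
  const yAxis = cross(zAxis, xAxis);         // recomputed "up"

  // Project the point onto each axis, then remap [-size/2, size/2] to [0,1].
  return (p) => {
    const d = [p[0] - position[0], p[1] - position[1], p[2] - position[2]];
    return [
      dot(xAxis, d) / size + 0.5,
      dot(yAxis, d) / size + 0.5,
      dot(zAxis, d) / size + 0.5,
    ];
  };
}

// The x/y components are also the uv used to read the decal image.
const toDecal = createDecalTransform([0, 0, 0], [0, 0, 1], [0, 1, 0], 2);
console.log(toDecal([0, 0, 0])); // box center -> [0.5, 0.5, 0.5]
console.log(toDecal([1, 0, 0])); // +x face    -> [1, 0.5, 0.5]
```

In the real shader this is packed into a single 4×4 matrix (basis change, recentering and rescaling combined), but the mapping is the same.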

Once we have calculated the matrix, the shader code is simple (only the relevant bits are extracted):

Main shader code for decal texture space projection
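A GLSL sketch of those relevant bits, assuming decalMatrix maps world-space points inside the projector box to the 0..1 range as described above (variable names are illustrative; in the actual PG, the position and normal already include morph/bone deformations):

```glsl
// --- Vertex shader (excerpt) ---
uniform mat4 world;
uniform mat4 decalMatrix; // world space -> 0..1 projector box
varying vec3 vDecalTC;

vec4 worldPos = world * vec4(position, 1.0);
// Normal in the projector coordinate system (w = 0: ignore translation).
vec3 normalView = (decalMatrix * (world * vec4(normal, 0.0))).xyz;
vDecalTC = (decalMatrix * worldPos).xyz;

// Render in texture space; a z outside 0..1 makes the GPU clip the point.
gl_Position = vec4(uv * 2.0 - 1.0, normalView.z > 0.0 ? 2.0 : 0.0, 1.0);

// --- Fragment shader (excerpt) ---
uniform sampler2D decalSampler;

// Reject the points outside the projector box.
if (any(lessThan(vDecalTC, vec3(0.0))) || any(greaterThan(vDecalTC, vec3(1.0)))) {
    discard;
}
gl_FragColor = texture2D(decalSampler, vDecalTC.xy);
```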

normalView is the normal calculated in the projector coordinate system: if the z component is greater than 0, it means the normal is facing away from the projection direction, so the point must be clipped. The easiest way to do this is to define a z value for gl_Position outside the range 0..1 so that the GPU will clip it.

Now the decal follows all mesh deformations as expected:

Here’s a link to the first demo PG:

https://www.babylonjs-playground.com/frame.html#9BVW2S#12

For simplicity, only one decal can be applied per mesh: once you left-click, assigning decals to that mesh is disabled. You can also right-click to rotate the decal before applying it.

In this PG, I also created a material plugin that merges the decal texture with the diffuse texture, for both the standard and PBR materials. See this blog post and the documentation for details about material plugins.

Here’s another demo:

https://www.babylonjs-playground.com/frame.html#9BVW2S#15

In this one, you can click on the meshes and 5 decals will be projected in turn.

You can use the PG as a basis for your own experiments. Here are some ideas for extending/improving the technique:

  • Support other textures like bump, roughness, etc.
  • If you are projecting a lot of textures in a short period of time, disable the generation of mipmaps when creating the render target texture and manually call engine.generateMipmaps() once enough time has passed since the last projection.
  • Instead of storing colors in the decal texture, store the uvs (the ones you would use to read the decal image). In the material shader, read these uvs and use them to sample the decal image: in effect, we add an indirection before sampling the decal image. The advantage is that even if you want to support multiple textures for your decal (bump, roughness, etc.), you only need one decal texture. You also don’t need to enable mipmapping for the decal texture with this method. If you want to support different decals on a mesh, you will need to store a decal ID in addition to the uv coordinates in the decal texture. In the material shader, in addition to sampling the diffuse decal image, you will also sample the bump decal image, the roughness decal image, and so on. This is more complicated, but it consumes less GPU memory and does not require generating mipmaps, as explained above. See this blog post for more details on this method.
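As a rough illustration of the indirection idea from the last bullet, the material’s fragment shader could do something like this (a sketch with made-up sampler names, not code from the PG):

```glsl
uniform sampler2D decalIndirection; // rg = uv into the decal images, a > 0 = "a decal is here"
uniform sampler2D decalDiffuse;
uniform sampler2D decalBump;

vec4 indirection = texture2D(decalIndirection, vUV);
if (indirection.a > 0.0) {
    vec2 decalUV = indirection.rg;
    // The same decalUV is reused for every decal-related texture.
    baseColor = texture2D(decalDiffuse, decalUV);
    normalFromDecal = texture2D(decalBump, decalUV).xyz;
}
```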

Happy decaling!

GIF courtesy of WiperTags Wiper Covers on GIPHY

Popov — Babylon.js Team
