Animated Gifs in WebGL
The popularity of animated Gifs is only growing, and I have to admit that I totally love them :-) More and more, I receive questions about integrating them into Babylon.js, so here it is!!!

Despite looking like an easy task, unfortunately, you cannot use animated Gifs directly in WebGL. Or at least I am not aware of a way to do it, which would be fantastic despite ruining this article :-)
As there are no browser APIs available for this, it looks like we have to parse the file manually in JavaScript. This sounds a bit crazy at first, but after a bit of research, I found this project and, by the way… it has the best name ever.
We now have a client-side Gif parser, Babylon.js, and some time to build something, so what about using post processes on top of Gifs to make them look even cooler?
Live Demo: https://www.babylonjs.com/demos/animatedgif/.
The entire code of the project is open source and available on GitHub:
I cannot encourage you enough to reuse it in any of your Babylon projects. And if you would like this feature directly embedded in the main repo, share some claps and we’ll definitely make it happen… and no, it is not just a trap.
As I have already detailed all the features we can rely on for video processing in this article, I will simply introduce here how we brought the AnimatedGifTexture to life.
After parsing a Gif in the browser, we get back a list of frame pixels, or rather patches, that we need to convert into a texture to be usable in WebGL.
At first we relied on a Babylon.js DynamicTexture, which is a pretty good fit, but for performance reasons (preventing extra copies between the 2D and WebGL contexts), we decided to go fully WebGL. Again, I thought it would all be easy: “let’s simply update a texture with texSubImage2D (to handle the position of the patches)”… but unfortunately, again, I quickly figured out that patches also have transparency to handle, and well…
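To make the issue concrete, here is a minimal sketch of that naive approach in raw WebGL. The patch fields (left, top, width, height, pixels) are only my assumption of what a typical Gif parser hands back:

```javascript
// Naive approach (sketch): copy a decoded Gif patch straight into an
// existing WebGL texture at the patch offset.
// The patch layout (left, top, width, height, pixels) is assumed here,
// based on what Gif parsers typically return.
function uploadPatch(gl, texture, patch) {
    gl.bindTexture(gl.TEXTURE_2D, texture);
    gl.texSubImage2D(
        gl.TEXTURE_2D,
        0,                          // mip level
        patch.left, patch.top,      // destination offset of the patch
        patch.width, patch.height,  // patch dimensions
        gl.RGBA,
        gl.UNSIGNED_BYTE,
        patch.pixels                // Uint8Array of decoded RGBA pixels
    );
}
// The catch: transparent pixels in the patch overwrite what was composited
// before, instead of letting the previous frame show through.
```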
The simplest solution is then to rely on an offscreen texture where we can accumulate all the Gif frames.
At this point, you are probably thinking it would be a lot of code, so let’s take the boilerplate stuff for granted and focus only on the offscreen texture render part, since it is a recurring need in any rendering app.
For some work projects, I have greatly simplified the way you can render to a texture. You can now rely on our new EffectRenderer and EffectWrapper as a nice combination with our RenderTargetTexture.
Quick demo here: https://playground.babylonjs.com/#3Y3KQR, and here is what we do inside of it:
1. We first create a render target texture we can draw on
2. Then, to render into a texture, we need to get our shaders ready (both steps are sketched right after this list)
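Here is a minimal sketch of those two steps, assuming a Babylon.js engine and scene already exist (the little gradient shader is just a placeholder of mine, not necessarily the one used in the playground):

```javascript
// 1. A render target texture we can draw on (512x512 here, arbitrary size).
const renderTarget = new BABYLON.RenderTargetTexture("renderTarget", 512, scene);

// 2. An EffectWrapper holding the fragment shader we want to render.
// When no vertex shader is provided, EffectWrapper uses its default
// full screen one, which exposes the vUV varying used below.
const effectWrapper = new BABYLON.EffectWrapper({
    engine: engine,
    name: "gradient",
    fragmentShader: `
        varying vec2 vUV;
        void main(void) {
            gl_FragColor = vec4(vUV, 0.0, 1.0);
        }
    `,
});
```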
As you can see, it is pretty straightforward, and obviously there are more options available if you want to pass in custom uniforms or samplers.
Now you have everything set up, and there is one last step: render your effect into your texture:
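A minimal sketch of that last step (the exact render signature has moved a bit across Babylon.js versions, so treat this as the 4.1-era flavor):

```javascript
// The EffectRenderer draws a full screen quad with the wrapped effect.
const effectRenderer = new BABYLON.EffectRenderer(engine);

// Render the effect into our offscreen texture. Omitting the second
// argument renders to the canvas instead. (Newer versions may expect
// renderTarget.renderTarget, the underlying RenderTargetWrapper.)
effectRenderer.render(effectWrapper, renderTarget);

// The render target can then be used like any other texture,
// for instance on a standard material.
const material = new BABYLON.StandardMaterial("mat", scene);
material.emissiveTexture = renderTarget;
```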
That is it: in only a few lines, you can create and run any effect you want inside an offscreen texture or directly on the canvas. For a more complete example, you can definitely check the code of the AnimatedGifTexture.
Oh and I almost forgot…