Web Video Processing Made Easy

If you are like me, and not as young as you’d like, you probably had the awesome opportunity to play some of your favorite games on the Game Boy. As a geek with fond memories of the Game Boy and a huge fan of WebGL, I recently started to wonder how hard it would be to put myself inside of a Game Boy. Yup…you read that correctly!

Live Demo: https://www.babylonjs.com/demos/videoprocessing/.

Ta Da!!!!

Now you’re probably wondering where I want to go from here. Let me share with you how easy it is to create Real Time Video Filters in your web page.

As a matter of fact, it is easy if you use Babylon.js!!!

First things first: developing on the web can be very cumbersome today, especially if you rely on several packages to set up your experience. For this project, I will simply rely on Webpack, TypeScript and the new Babylon.js Controls library to put myself inside of a Game Boy. *Geeky Smile*

You can find the skeleton I started from on GitHub:

As you can see, it only uses a handful of dependencies and includes our new ES6 version of Babylon.js to enable tree shaking and greatly reduce the app size:

Also, it contains a basic Webpack setup to ensure you can easily run it locally or build it for production.

Basically, by default, the Webpack dev server enables you to iterate efficiently locally, while the prod “env” lets you easily create your app bundle, ready to deploy.
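To make that concrete, here is a minimal sketch of such a Webpack setup. The entry name, loader, and port are illustrative assumptions, not the repo’s exact config:

```typescript
// webpack.config.ts — a minimal sketch; entry, loader and port are
// illustrative assumptions, not the repo's exact configuration.
import * as path from "path";

const config = (env?: { prod?: boolean }) => ({
  // Passing the prod "env" switches the bundle to production mode.
  mode: env?.prod ? ("production" as const) : ("development" as const),
  entry: "./src/index.ts",
  resolve: { extensions: [".ts", ".js"] },
  module: { rules: [{ test: /\.ts$/, loader: "ts-loader" }] },
  output: { path: path.resolve("dist"), filename: "bundle.js" },
  // `webpack serve` picks this up for the local iteration loop.
  devServer: { port: 8080 },
});

export default config;
```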

Now that we have our basic setup, we can start creating the application!!!

About the application itself…it’s incredibly simple. It basically just converts a video feed into a WebGL texture and then draws this texture on the canvas while applying a filter to each of the texels (pixels from the texture).

The amazing thing is that the Babylon.js Image Filter Control does exactly this and can even be used in real time.

Enabling real time video filtering on your web page can look as simple as this:

Basically, you instantiate an “ImageFilter” control and create the effect you want to apply (more on that in the next paragraph). During the render loop (i.e. every single frame), the video texture is updated to the latest video frame and then rendered through the filter or effect we created earlier.
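That loop can be sketched like this. Since the exact `ImageFilter` signature lives in the Babylon.js Controls documentation, minimal stand-in interfaces keep the sketch self-contained; in the real app you would import the control from `@babylonjs/controls` instead:

```typescript
// Stand-ins for the real Babylon types (in the real app, import ImageFilter
// from "@babylonjs/controls" — treat these shapes as assumptions).
interface FilterEngine {
  runRenderLoop(renderFunction: () => void): void;
}
interface ImageFilterLike {
  engine: FilterEngine;
  render(texture: unknown, effect: unknown): void;
}

// Wires the render loop: every frame, the video texture holds the latest
// video frame and is drawn to the canvas through the custom effect.
export function runVideoFilter(
  imageFilter: ImageFilterLike,
  videoTexture: unknown,
  gameboyEffect: unknown
): void {
  imageFilter.engine.runRenderLoop(() => {
    imageFilter.render(videoTexture, gameboyEffect);
  });
}
```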

That is it, you don’t need anything else!!!

But even if filtering a static video is nice, remember the goal…I want to be IN the Game Boy…

To actually be in the box, I need to convert my webcam feed into a texture. Babylon has built-in support for this, but I thought it might be interesting to show you what happens under the hood.

As the web is now AWESOME, we can rely on the MediaDevices API to provide the webcam feed as a stream:

We can now plug this stream into a “hidden” video element that will be used as the source of our Babylon Texture:
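Here is a self-contained sketch of that wiring, assuming we already have the `MediaStream` from the previous step. The final Babylon wrapping is left as a comment because it needs a live engine and scene:

```typescript
// Creates the "hidden" <video> element that feeds the texture (browser-only).
export function createHiddenVideo(stream: MediaStream): HTMLVideoElement {
  const video = document.createElement("video");
  video.setAttribute("playsinline", ""); // keep iOS from going fullscreen
  video.muted = true;                    // required by autoplay policies
  video.autoplay = true;
  video.style.display = "none";          // hidden: it only feeds the texture
  document.body.appendChild(video);

  // Cross-browser stream assignment: modern browsers accept the stream
  // directly, while very old ones needed an object URL instead.
  if ("srcObject" in video) {
    video.srcObject = stream;
  } else {
    (video as HTMLVideoElement).src = URL.createObjectURL(stream as unknown as Blob);
  }
  video.play().catch(() => { /* autoplay blocked until a user gesture */ });

  // The element can then be wrapped in a texture, e.g. with Babylon:
  //   new VideoTexture("webcam", video, scene, false);
  return video;
}
```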

First, we create the “hidden” video element. Then, we manage cross-browser usage of the stream as a video source. Finally, we wrap the video in a Babylon Texture.

It might seem a bit complex at first, but don’t forget: all of it is already available with more safety inside the Babylon VideoTexture APIs. I’m simply highlighting here how it all works.

From this point, we are all set with an app containing the render loop, a video texture coming from the webcam and some infrastructure to build and deploy.

It is finally time to reach the heart of the processing: the filter itself. Basically, the filter is a GLSL fragment shader that you can plug into the image filter as detailed in the documentation.

Here is what our shader looks like:

We are trying to make the video look like an old V1 Game Boy. We therefore need to draw in “tiles” (8×8 pixels) with only 4 colors, plus a grid to fake the space between old console pixels:

1. We first find the top-left corner of the tile we are in

const float screenSize = 512.;
const float gridSize = 8.; // 8x8 pixel tiles
vec2 tileUV = floor(gl_FragCoord.xy / gridSize) * gridSize / screenSize;

2. Then we extract the luminance of the current video texel

vec4 tileColor = texture2D(textureSampler, tileUV);
float tileLuminance = getLuminance(tileColor.rgb);

3. And finally we match the luminance with our color palette of 4 colors.

vec4 finalColor = palette[int(tileLuminance * 3. + lumaOffset)];

At this point, only the grid is missing:

1. We check whether we are on the grid, horizontally or vertically,

onGridline(gl_FragCoord.x, gridSize)

2. By using a simple modulo operation to know if our current position matches the defined grid spacing

return mod(floor(distFrom), spacing) == 0.0;

3. And if we are on the grid, we use the gridLineColor instead of what was previously computed from the luminance

gl_FragColor = gridLineColor;
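To make the shader’s flow easier to follow, here is the same per-pixel logic sketched as plain TypeScript. The palette values and the 0.5 rounding (standing in for the shader’s lumaOffset) are illustrative assumptions:

```typescript
// The fragment shader's per-pixel logic as plain TypeScript. Palette values
// and the 0.5 rounding (in place of lumaOffset) are illustrative assumptions.
type RGB = [number, number, number];

// Four-color palette, darkest to lightest (classic green-ish tones).
const PALETTE: RGB[] = [
  [15, 56, 15],
  [48, 98, 48],
  [139, 172, 15],
  [155, 188, 15],
];

const GRID_SIZE = 8; // 8x8 pixel tiles

// Snap a pixel coordinate to the top-left corner of its tile.
export function tileOrigin(x: number, y: number): [number, number] {
  return [
    Math.floor(x / GRID_SIZE) * GRID_SIZE,
    Math.floor(y / GRID_SIZE) * GRID_SIZE,
  ];
}

// Standard Rec. 601 luminance, normalized to 0..1.
export function luminance([r, g, b]: RGB): number {
  return (0.299 * r + 0.587 * g + 0.114 * b) / 255;
}

// Map a 0..1 luminance onto one of the 4 palette entries.
export function paletteColor(luma: number): RGB {
  const index = Math.min(3, Math.floor(luma * 3 + 0.5));
  return PALETTE[index];
}

// True when a coordinate falls on a grid line (the fake gap between pixels).
export function onGridline(coord: number, spacing: number = GRID_SIZE): boolean {
  return Math.floor(coord) % spacing === 0;
}
```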

You can now run the application and see yourself inside of a Game Boy! https://www.babylonjs.com/demos/videoprocessing/.

Come on! This is the COOLEST THING EVER!!!!!

And obviously, the full source code is available on GitHub.

Do not hesitate to reach out to us on our Forum or Twitter for any questions.


