
WebGL 2 fur simulation

WebGL 2 recently became available in the latest Firefox and Chrome, so it was tempting to try out some of its new features. One of the most important features of WebGL 2 (and of OpenGL ES 3.0, which it is based upon) is instanced rendering. It reduces draw call overhead by drawing the same geometry multiple times with altered transformations. Instancing was supported in certain implementations of WebGL 1 too, but required the ANGLE_instanced_arrays extension. Most useful for foliage and particles, this technique is also quite often used to simulate fur.


There are a lot of articles on fur simulation in OpenGL, but our implementation is roughly based on this YouTube tutorial. It describes the process of creating a custom Unity shader, but its step-by-step instructions are really insightful. If you are not familiar with this technique of fur simulation, we recommend spending 13 minutes to watch this video and understand how it works.
All textures for the demo are hand-painted from scratch (by looking at photos of fur) - they are very simple and don’t require any special skills.

You can try a live demo here.


To showcase the fur simulation, let’s first look at the result of rendering only 2 additional fur layers with quite a large fur thickness (the distance between layers). You can clearly see the original object without fur and the two transparent fur layers:

If we increase the layer count and reduce the layer thickness, we get a more realistic result. In this image with 6 relatively thin layers you can see that each layer fades from fully opaque to transparent:

And finally we can get quite realistic results with 20 very thin layers:

Our demo has 5 different presets - 4 fur presets and 1 moss preset. They are all rendered with the same shader but with different inputs.
Each fur preset is defined by the following parameters: start and end fur colors for AO simulation, layer count and layer thickness to specify fur length, diffuse and alpha textures, and finally a wave scale for wind simulation.
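A preset can be sketched as a plain object holding these parameters. This is only an illustration of the shape - the property names and values here are hypothetical, not the demo's actual code:

```javascript
// Hypothetical fur preset: everything the shared shader needs per preset.
const furPreset = {
  colorStart: [0.0, 0.0, 0.0, 1.0], // innermost layer tint (AO), fully opaque
  colorEnd: [1.0, 1.0, 1.0, 0.0],   // outermost layer tint, fully transparent
  layers: 20,                        // fur length = layers * thickness
  thickness: 0.01,
  diffuseTexture: 'fur-diffuse.png', // illustrative file names
  alphaTexture: 'fur-alpha.png',     // defines the pattern of hairs
  waveScale: 0.02,                   // wind strength
};
```

Swapping such an object per preset lets the same shader render all 5 variants.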

First we draw the cube with the same diffuse texture used for the fur layers. It should be dimmed to the color of the first fur layer so that it blends nicely with the fur, so we multiply it by the fur start color. We use a really simple shader here which takes the fragment color from the texture and multiplies it by the specified color.
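The multiply itself is just a component-wise product. A minimal CPU-side sketch (not the demo's actual code) of what the base-cube shader does per fragment:

```javascript
// Component-wise tint: mirrors the shader's `textureColor * startColor`.
function tintColor(texel, tint) {
  return texel.map((c, i) => c * tint[i]);
}

// A texel from the diffuse texture dimmed by a dark fur start color:
const dimmed = tintColor([0.8, 0.6, 0.4, 1.0], [0.2, 0.2, 0.2, 1.0]);
// dimmed RGB is 20% of the original; alpha stays 1.0
```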

Next, we need to draw the fur layers. They are translucent and require the correct blending mode to look as intended. Using a regular glBlendFunc() blending mode resulted in either too dim or too bright fur, because these modes affect the alpha channel and therefore distort fur colors. The glBlendFuncSeparate() function, on the other hand, specifies separate blending modes for the RGB and alpha channels of fragments, which made it possible to keep alpha constant for each layer (controllable in the shader) while nicely blending the fur color. This is the blending function used in the demo:

gl.blendFuncSeparate(gl.SRC_ALPHA, gl.ONE_MINUS_SRC_ALPHA, gl.ZERO, gl.ONE);
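To make the effect of the separate factors concrete, here is a CPU-side sketch of the arithmetic this mode performs per fragment (not the demo's code - the GPU does this): RGB gets standard alpha blending, while the ZERO/ONE pair leaves the destination alpha untouched.

```javascript
// blendFuncSeparate(SRC_ALPHA, ONE_MINUS_SRC_ALPHA, ZERO, ONE), spelled out.
function blendSeparate(src, dst) {
  const a = src[3];
  return [
    src[0] * a + dst[0] * (1 - a), // RGB: SRC_ALPHA / ONE_MINUS_SRC_ALPHA
    src[1] * a + dst[1] * (1 - a),
    src[2] * a + dst[2] * (1 - a),
    src[3] * 0 + dst[3] * 1,       // alpha: ZERO / ONE keeps destination alpha
  ];
}

const out = blendSeparate([1, 0, 0, 0.5], [0, 0, 1, 1]);
// out → [0.5, 0, 0.5, 1]: color is mixed, but alpha stays 1
```

Because the destination alpha never changes, each layer's transparency stays exactly what the shader wrote, which is what keeps the fur colors from drifting.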

This image shows how separate blending compares to other non-separated blending modes:

After the correct blending mode is set, we start drawing the fur. It is implemented in a single draw call - all instancing is done by the GPU and the shader, so all further explanations relate to the shader. Please note that GLSL 3.0 is different from GLSL 1.0 - you can refer to this tutorial on how to update your old shaders to the new version.
To create the fur layers, the shader extrudes each vertex in the direction of its normal (so you can easily adjust the fur direction by changing the model's normals). The higher gl_InstanceID (a built-in variable holding the current instance number) is, the further we extrude the vertex:

float f = float(gl_InstanceID + 1) * layerThickness; // distance of this layer from the surface
vec4 vertex = rm_Vertex + vec4(rm_Normal, 0.0) * f; // move vertex along the normal
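The offset distances themselves are easy to verify on the CPU. A small sketch (illustrative, not the demo's code) of the per-instance distance:

```javascript
// Instance 0 is the innermost fur layer, one layerThickness above the surface;
// the last instance sits at the full fur length (layers * thickness).
function layerOffset(instanceId, layerThickness) {
  return (instanceId + 1) * layerThickness;
}

// With 20 layers of thickness 0.05:
// layerOffset(0, 0.05)  ≈ 0.05 (first layer)
// layerOffset(19, 0.05) ≈ 1.0  (total fur length)
```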

For the fur to look realistic, it should be dense at the base and thin out towards the tips. This is done by reducing the layer alpha. Also, to simulate ambient occlusion, fur should be darker on the inside and brighter on the outside. Both of these parameters are specified by the start and end fur colors. Typically the start color is [0.0, 0.0, 0.0, 1.0] and the end color is [1.0, 1.0, 1.0, 0.0], so the fur starts completely black and ends at the source diffuse color, while alpha fades from fully opaque to transparent.
First, in the vertex shader we calculate the layer color coefficient and interpolate from the start to the end color; then we simply multiply the diffuse color by that value. Finally, the fragment's alpha is multiplied by a value from the alpha map, which defines the pattern of the fur hairs.

// vertex shader
float layerCoeff = float(gl_InstanceID) / float(layersCount); // 0.0 at the innermost layer, approaching 1.0 at the outermost
vAO = mix(colorStart, colorEnd, layerCoeff);

// fragment shader
vec4 diffuseColor = texture(diffuseMap, vTexCoord0); // get diffuse color
float alphaColor = texture(alphaMap, vTexCoord0).r; // get alpha from alpha map
fragColor = diffuseColor * vAO; // simulate AO
fragColor.a *= alphaColor; // apply alpha mask
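Tracing this math for one layer on the CPU shows why it darkens and fades the fur. A JavaScript sketch of the same mix-and-multiply (illustrative only - the real work happens in the shader):

```javascript
// Linear interpolation, same as GLSL's mix().
function mix(a, b, t) {
  return a.map((v, i) => v + (b[i] - v) * t);
}

const colorStart = [0, 0, 0, 1]; // innermost layer: black, opaque
const colorEnd = [1, 1, 1, 0];   // outermost layer: full diffuse, transparent

// Halfway up the fur, e.g. layer 10 of 20:
const vAO = mix(colorStart, colorEnd, 10 / 20);
// vAO → [0.5, 0.5, 0.5, 0.5]: diffuse is halved (AO) and alpha is halved (thinning)
```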

There are multiple ways to simulate fur movement in the wind. In this demo we move each vertex a little, depending on a time uniform passed to the shader. To do this, we need some sort of unique “hash” value shared by vertices with the same coordinates. We cannot rely on the built-in gl_VertexID variable because it differs between vertices - even vertices with the same coordinates can have different gl_VertexID values. So we calculate a “magic sum” of the vertex coordinates and create a sine+cosine wave based on that value. Example of moving a vertex according to the time uniform:

const float PI2 = 6.2831852; // Pi * 2 for sine wave calculation
const float RANDOM_COEFF_1 = 0.1376; // just some random floats
const float RANDOM_COEFF_2 = 0.3726; // (arbitrary values to decorrelate
const float RANDOM_COEFF_3 = 0.2546; // the three axes)
float timePi2 = time * PI2;
vertex.x += sin(timePi2 + ((rm_Vertex.x+rm_Vertex.y+rm_Vertex.z) * RANDOM_COEFF_1)) * waveScaleFinal;
vertex.y += cos(timePi2 + ((rm_Vertex.x-rm_Vertex.y+rm_Vertex.z) * RANDOM_COEFF_2)) * waveScaleFinal;
vertex.z += sin(timePi2 + ((rm_Vertex.x+rm_Vertex.y-rm_Vertex.z) * RANDOM_COEFF_3)) * waveScaleFinal;
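The key property is that the phase depends only on the vertex coordinates, so all layers of the same hair sway together. One axis of the wave, sketched in JavaScript (illustrative, not the demo's code):

```javascript
const PI2 = 6.2831852; // Pi * 2

// Phase comes from the coordinate "magic sum", so vertices sharing a position
// get the same offset regardless of their gl_VertexID.
function waveOffsetX(time, vx, vy, vz, coeff, waveScale) {
  return Math.sin(time * PI2 + (vx + vy + vz) * coeff) * waveScale;
}

// waveOffsetX(0, 0, 0, 0, 0.1376, 1.0) → 0 (wave starts at rest)
```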

Further Improvements

While already achieving quite good results, this implementation could be improved by applying a directional force (wind) to the fur and/or making fur length adjustable with per-vertex coefficients. Feel free to get the sources from GitHub and modify them to your needs; the code uses the MIT license.

