
WebGL 2 fur simulation

WebGL 2 recently became available in the latest Firefox and Chrome, so it was tempting to try out some of its new features. One of the most important features of WebGL 2 (and of OpenGL ES 3.0, which it is based upon) is instanced rendering. It reduces draw call overhead by drawing the same geometry multiple times with altered transformations. It was supported in certain WebGL 1 implementations too, but only through an extension. Most useful for foliage and particles, this technique is also quite often used to simulate fur.
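
For context, a rough sketch of how that difference looks in code (the canvas variable is assumed): in WebGL 2 instancing is part of the core API, while WebGL 1 needs the ANGLE_instanced_arrays extension:

const gl = canvas.getContext('webgl2');
if (gl) {
    // WebGL 2: instancing is core - gl.drawElementsInstanced() and gl.vertexAttribDivisor() are always available
} else {
    const gl1 = canvas.getContext('webgl');
    const ext = gl1 && gl1.getExtension('ANGLE_instanced_arrays');
    // WebGL 1: the same calls exist only as ext.drawElementsInstancedANGLE() and ext.vertexAttribDivisorANGLE()
}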


Concept

There are a lot of articles on fur simulation in OpenGL, but our implementation is roughly based on this YouTube tutorial. It describes the process of creating a custom Unity shader, but its step-by-step instructions are really insightful. If you are not familiar with this technique of fur simulation, we recommend spending 13 minutes to watch this video and understand how it works.
All textures for the demo are hand-painted from scratch (by looking at photos of fur) - they are very simple and don't require any special skills to create.

You can try a live demo here - https://keaukraine.github.io/webgl-fur/.

Implementation

To showcase the fur simulation, first let's see the result of rendering only 2 additional fur layers with quite a large layer thickness (distance between layers). You can clearly see the original object without fur and the two transparent fur layers:

If we increase the layers count and reduce the layer thickness, we get a more realistic result. In this image with 6 relatively thin layers you can see that each layer fades from fully opaque to transparent:

And finally, we can get quite realistic results with 20 very thin layers:

Our demo has 5 different presets - 4 fur presets and 1 moss preset. They are rendered with the same shader but with different inputs.
Each preset is defined by the following parameters: start and end fur colors for ambient occlusion (AO) simulation, layers count and layer thickness to specify the fur length, diffuse and alpha textures, and finally a wave scale for wind simulation.
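
For illustration, such a preset could be described with a plain object like the following sketch - the property names and values here are hypothetical, not taken from the demo sources:

const furPreset = {
    colorStart: [0.0, 0.0, 0.0, 1.0],  // dark and fully opaque at the base (AO)
    colorEnd: [1.0, 1.0, 1.0, 0.0],    // full diffuse color and transparent at the tips
    layersCount: 20,                   // number of instanced fur layers
    layerThickness: 0.15,              // distance between layers, defines fur length
    diffuseTexture: 'fur-diffuse.png', // hand-painted diffuse map
    alphaTexture: 'fur-alpha.png',     // mask with the pattern of fur hairs
    waveScale: 0.5                     // amplitude of the wind wave
};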

First we draw the cube with the same diffuse texture used for the fur layers. To blend nicely with the fur, it should be dimmed to the color of the first fur layer, so the texture color is multiplied by the fur start color. We use a really simple shader here which takes the fragment color from the texture and multiplies it by the specified color.
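
A minimal version of such a fragment shader could look like this sketch (GLSL ES 3.00 source kept in a JavaScript string; the uniform and varying names are assumed to match the fur shader snippets shown later):

const cubeFragmentShader = `#version 300 es
precision mediump float;
uniform sampler2D diffuseMap; // same diffuse texture as the fur layers
uniform vec4 colorStart;      // fur start color used to dim the cube
in vec2 vTexCoord0;
out vec4 fragColor;
void main() {
    fragColor = texture(diffuseMap, vTexCoord0) * colorStart; // dim the texture to the first layer's color
}`;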

Next, we need to draw the fur layers. They are translucent and require the correct blending mode to look as intended. Using the regular glBlendFunc() modes resulted in fur that was either too dim or too bright, because they also affect the alpha channel and therefore distort the fur colors. The glBlendFuncSeparate() function, on the other hand, specifies separate blending modes for the RGB and alpha channels of fragments, which made it possible to keep alpha constant for each layer (controllable in the shader) while nicely blending the fur color. This is the blending function used in the demo:

gl.blendFuncSeparate(gl.SRC_ALPHA, gl.ONE_MINUS_SRC_ALPHA, gl.ZERO, gl.ONE);
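
Note that blending is disabled by default in WebGL, so it also has to be enabled before drawing the translucent fur layers:

gl.enable(gl.BLEND); // without this call, the blend function above has no effect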

This image shows how separate blending compares to other non-separated blending modes:


After the correct blending mode is set, we start drawing the fur. It is implemented in a single draw call - all instancing is done by the GPU and the shader, so all further explanations relate to the shader. Please note that GLSL ES 3.00 (used in WebGL 2) is different from GLSL ES 1.00 (used in WebGL 1) - you can refer to this tutorial on how to update your old shaders to the new version.
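
The instanced draw call itself could look roughly like this sketch - furIndexCount and layersCount are placeholder names for the mesh index count and the preset's layers count:

gl.drawElementsInstanced(
    gl.TRIANGLES,      // primitive type
    furIndexCount,     // number of indices in the fur mesh
    gl.UNSIGNED_SHORT, // index type
    0,                 // offset into the index buffer
    layersCount        // one instance per fur layer
);
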
To create the fur layers, the shader extrudes each vertex in the direction of its normal (so you can easily adjust the fur direction by changing the model normals). The higher gl_InstanceID (a built-in variable with the current instance number) is, the further we extrude the vertices:

float f = float(gl_InstanceID + 1) * layerThickness; // calculate final layer offset distance
vec4 vertex = rm_Vertex + vec4(rm_Normal, 0.0) * vec4(f, f, f, 0.0); // move vertex in direction of normal

For the fur to look realistic, it should be dense near the surface and thin at the tips. This is done by reducing the layer alpha. Also, to simulate ambient occlusion, the fur should be darker inside and brighter outside. Both of these effects are controlled by the start and end fur colors. Typically the start color is [0.0, 0.0, 0.0, 1.0] and the end color is [1.0, 1.0, 1.0, 0.0], so the fur starts completely black and ends with the original diffuse color, while alpha fades from fully opaque to transparent.
First, in the vertex shader we calculate the layer coefficient and interpolate from the start color to the end color; then in the fragment shader we simply multiply the diffuse color by that value. Finally, the alpha value of the fragment is multiplied by the value from the alpha map, which determines the pattern of fur hairs.

// vertex shader
float layerCoeff = float(gl_InstanceID) / layersCount; // 0.0 for the innermost layer, approaching 1.0 for the outermost
vAO = mix(colorStart, colorEnd, layerCoeff); // interpolate layer color/alpha and pass it to the fragment shader

// fragment shader
vec4 diffuseColor = texture(diffuseMap, vTexCoord0); // get diffuse color
float alphaColor = texture(alphaMap, vTexCoord0).r; // get alpha from alpha map
fragColor = diffuseColor * vAO; // simulate AO
fragColor.a *= alphaColor; // apply alpha mask
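
On the JavaScript side, these inputs would be fed to the shader roughly as in the sketch below; furProgram, furPreset, diffuseTexture and alphaTexture are assumed to be an already linked program, the selected preset and two loaded textures, and uniform locations would normally be cached rather than looked up every time:

gl.useProgram(furProgram);
gl.uniform4fv(gl.getUniformLocation(furProgram, 'colorStart'), furPreset.colorStart);
gl.uniform4fv(gl.getUniformLocation(furProgram, 'colorEnd'), furPreset.colorEnd);
gl.uniform1f(gl.getUniformLocation(furProgram, 'layersCount'), furPreset.layersCount);
gl.uniform1f(gl.getUniformLocation(furProgram, 'layerThickness'), furPreset.layerThickness);
gl.activeTexture(gl.TEXTURE0); // diffuse map on texture unit 0
gl.bindTexture(gl.TEXTURE_2D, diffuseTexture);
gl.uniform1i(gl.getUniformLocation(furProgram, 'diffuseMap'), 0);
gl.activeTexture(gl.TEXTURE1); // alpha map on texture unit 1
gl.bindTexture(gl.TEXTURE_2D, alphaTexture);
gl.uniform1i(gl.getUniformLocation(furProgram, 'alphaMap'), 1);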

There are multiple ways to simulate fur movement in the wind. In this demo we move each vertex a little bit depending on a time uniform passed to the shader. To do this, we need some sort of “hash” value that is the same for vertices with the same coordinates. We cannot rely on the built-in gl_VertexID variable because it differs between vertices - even vertices with the same coordinates can have different gl_VertexID values. So we calculate a “magic sum” of the vertex coordinates and create a sine+cosine wave based on that value. Example of moving a vertex according to the time uniform:


const float PI2 = 6.2831852;         // Pi * 2 for sine wave calculation
const float RANDOM_COEFF_1 = 0.1376; // just some random float
const float RANDOM_COEFF_2 = 0.3726; // another arbitrary coefficient (any value works)
const float RANDOM_COEFF_3 = 0.2546; // another arbitrary coefficient (any value works)
float timePi2 = time * PI2; // scale the time uniform to a full wave period
vertex.x += sin(timePi2 + ((rm_Vertex.x + rm_Vertex.y + rm_Vertex.z) * RANDOM_COEFF_1)) * waveScaleFinal; // waveScaleFinal controls the wave amplitude
vertex.y += cos(timePi2 + ((rm_Vertex.x - rm_Vertex.y + rm_Vertex.z) * RANDOM_COEFF_2)) * waveScaleFinal;
vertex.z += sin(timePi2 + ((rm_Vertex.x + rm_Vertex.y - rm_Vertex.z) * RANDOM_COEFF_3)) * waveScaleFinal;
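
One possible way to drive the time uniform from JavaScript is to pass a value that loops from 0.0 to 1.0, so that time * PI2 covers exactly one wave period; this is just a sketch and the period constant is arbitrary:

const WIND_PERIOD_MS = 5000; // one full wind cycle every 5 seconds (arbitrary)
function updateWindTime(gl, furProgram) {
    const t = (Date.now() % WIND_PERIOD_MS) / WIND_PERIOD_MS; // loops in [0.0, 1.0)
    gl.uniform1f(gl.getUniformLocation(furProgram, 'time'), t); // call each frame while the fur program is bound
}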

Further Improvements

While it already achieves quite good results, this implementation can be improved by applying a directional force (wind) to the fur and/or making the fur length adjustable with per-vertex coefficients. Feel free to get the sources from GitHub and modify them according to your needs - the code uses the MIT license.
