How to implement real-time polygon refraction with Three.js

Preface

In this tutorial, you will learn how to make an object look like glass in three steps using Three.js.

When you render a 3D object, whether in 3D software or in real time with WebGL, you always have to assign a material to it to make it visible and give it the desired appearance.

Many types of materials can be mimicked with the ready-made materials that ship with a library like Three.js, but in this tutorial I will show you how to make an object appear glass-like, in three steps.

Step 1: Setup and front refraction

For this demonstration I will be using a diamond geometry, but you can follow along with a simple box or any other geometry.

Let's build our project. We need a renderer, a scene, a perspective camera and our geometry. In order to render our geometry, we need to assign a material to it. Creating this material will be the main focus of this tutorial. So go ahead and create a new ShaderMaterial with basic vertex and fragment shaders.
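
To keep things concrete, here is a minimal sketch of that setup. The uniform names (envMap, resolution) and the variables diamondGeometry, vertexShaderSource and fragmentShaderSource are assumptions for this demo; we will fill in the shader sources over the following steps.

// a sketch of the refraction material; shader sources are plain GLSL strings
this.refractionMaterial = new THREE.ShaderMaterial({
    uniforms: {
        envMap: { value: null }, // will hold the background render target texture
        resolution: { value: new THREE.Vector2(width, height) }
    },
    vertexShader: vertexShaderSource,
    fragmentShader: fragmentShaderSource
});

this.mesh = new THREE.Mesh(diamondGeometry, this.refractionMaterial);
this.scene.add(this.mesh);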

Contrary to what you might expect, our material will not be transparent; instead, we will sample and distort whatever is behind the diamond. To do this we need to render the scene (without the diamond) to a texture. Here I'm just rendering a full-screen plane using an orthographic camera, but this could just as well be a scene filled with other objects. The easiest way to separate the background geometry from the diamond in Three.js is to use "layers".

this.orthoCamera = new THREE.OrthographicCamera( width / - 2, width / 2, height / 2, height / - 2, 1, 1000 );
// assign the camera to layer 1 (layer 0 is default)
this.orthoCamera.layers.set(1);

const tex = await loadTexture('texture.jpg');
this.quad = new THREE.Mesh(new THREE.PlaneBufferGeometry(), new THREE.MeshBasicMaterial({map: tex}));

this.quad.scale.set(width, height, 1);
// also move the plane to layer 1
this.quad.layers.set(1);
this.scene.add(this.quad);
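
Note that loadTexture is not part of Three.js; I'm assuming a small promise wrapper around THREE.TextureLoader, which could look like this:

// promise wrapper so the texture can be awaited
function loadTexture(url) {
    return new Promise((resolve, reject) => {
        new THREE.TextureLoader().load(url, resolve, undefined, reject);
    });
}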

Our rendering loop looks like this:

this.envFbo = new THREE.WebGLRenderTarget(width, height);

this.renderer.autoClear = false;

render() {
    requestAnimationFrame( this.render );

    this.renderer.clear();

    // render background to fbo
    this.renderer.setRenderTarget(this.envFbo);
    this.renderer.render( this.scene, this.orthoCamera );

    // render background to screen
    this.renderer.setRenderTarget(null);
    this.renderer.render( this.scene, this.orthoCamera );
    this.renderer.clearDepth();

    // render geometry to screen
    this.renderer.render( this.scene, this.camera );
};

Okay, now it's time for a little theory. Transparent materials like glass are visible because they bend light. Light travels slower through glass than it does through air, so when a light wave hits a glass object at an angle, this change in speed causes the wave to change direction. This change in the direction of the wave is the phenomenon we call refraction.

To replicate this in code, we will need to know the angle between our eye vector and the diamond's surface (normal) vector in world space. Let's update our vertex shader to calculate these vectors.

varying vec3 eyeVector;
varying vec3 worldNormal;

void main() {
    vec4 worldPosition = modelMatrix * vec4( position, 1.0);
    eyeVector = normalize(worldPosition.xyz - cameraPosition);
    worldNormal = normalize( modelMatrix * vec4(normal, 0.0)).xyz;
    gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);
}

In the fragment shader we can now use eyeVector and worldNormal as the first two parameters of GLSL's built-in refract function. The third parameter is the ratio of the indices of refraction, i.e. the IOR of our fast medium (air) divided by the IOR of our slow medium (glass). In this case that is 1.0/1.5, but you can tweak this value to get the result you want. For example, water has an IOR of 1.33 and diamond has an IOR of 2.42.

uniform sampler2D envMap;
uniform vec2 resolution;

varying vec3 worldNormal;
varying vec3 eyeVector;

void main() {
    // get screen coordinates
    vec2 uv = gl_FragCoord.xy / resolution;

    vec3 normal = worldNormal;
    // index of refraction of glass
    float ior = 1.5;
    // calculate refraction and add to the screen coordinates
    vec3 refracted = refract(eyeVector, normal, 1.0/ior);
    uv += refracted.xy;

    // sample the background texture
    vec4 color = texture2D(envMap, uv);

    gl_FragColor = vec4(color.rgb, 1.0);
}

Very nice! We have successfully written a refraction shader. But the diamond is barely visible… that's partly because we're only dealing with one visual property of glass. Not all light passes through the material to be refracted; some of it is reflected. Let's see how to achieve that!

Step 2: Reflection and the Fresnel equations

For the sake of simplicity, in this tutorial we will not calculate proper reflections and will just use white as the bounced light. Now, how do we know when to reflect and when to refract? In theory, this depends on the refractive index of the material: when the angle between the incident vector and the surface normal is greater than the critical angle, the light wave is reflected.
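
As a side note, you can work out that critical angle from the IOR yourself. A quick sketch in plain JavaScript (not part of the shader):

// critical angle for total internal reflection, going from glass (IOR 1.5) into air
const ior = 1.5;
const criticalAngle = Math.asin(1.0 / ior) * 180 / Math.PI; // ≈ 41.8 degrees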

In the fragment shader, we will use the Fresnel equations to calculate the ratio between the reflected and refracted rays. Unfortunately GLSL doesn't have these equations built in, but you can use the following approximation:

float Fresnel(vec3 eyeVector, vec3 worldNormal) {
    return pow( 1.0 + dot( eyeVector, worldNormal), 3.0 );
}

Now we can simply blend the refraction texture color with the white reflection color based on the Fresnel ratio we just calculated.

uniform sampler2D envMap;
uniform vec2 resolution;

varying vec3 worldNormal;
varying vec3 eyeVector;

float Fresnel(vec3 eyeVector, vec3 worldNormal) {
    return pow( 1.0 + dot( eyeVector, worldNormal), 3.0 );
}

void main() {
    // get screen coordinates
    vec2 uv = gl_FragCoord.xy / resolution;

    vec3 normal = worldNormal;
    // index of refraction of glass
    float ior = 1.5;
    // calculate refraction and add to the screen coordinates
    vec3 refracted = refract(eyeVector, normal, 1.0/ior);
    uv += refracted.xy;

    // sample the background texture
    vec4 color = texture2D(envMap, uv);

    // calculate the Fresnel ratio
    float f = Fresnel(eyeVector, normal);

    // mix the refraction color and reflection color
    color.rgb = mix(color.rgb, vec3(1.0), f);

    gl_FragColor = vec4(color.rgb, 1.0);
}

It's looking much better, but something is still missing: we can't see the other side of the transparent object. Let's fix that!

Step 3: Multi-sided refraction

From what we have learned so far about reflection and refraction, we can understand that light can bounce back and forth several times inside an object before leaving it.

To get physically correct results we would have to trace every ray, but unfortunately this is too computationally intensive to render in real time. So I'm going to show you a simple approximation to at least visually see the back side of our diamond.

In a fragment shader, we need the world normals for the front and back sides of the geometry. Since we can't render both sides at the same time, we need to render the back-facing normals to a texture first.

Let's make a new ShaderMaterial like we did in step 1, but this time we render the world normal to gl_FragColor. Since we want the back of the geometry, this material has to render back faces only (side: THREE.BackSide).

varying vec3 worldNormal;

void main() {
    gl_FragColor = vec4(worldNormal, 1.0);
}
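
Here is a minimal sketch of that material in JavaScript. The shader source variable names are placeholders; the vertex shader computes worldNormal exactly as in step 1:

this.backfaceMaterial = new THREE.ShaderMaterial({
    vertexShader: backfaceVertexShaderSource, // same worldNormal calculation as in step 1
    fragmentShader: backfaceFragmentShaderSource, // the fragment shader above
    side: THREE.BackSide // draw only the back faces of the geometry
});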

Next, we'll update our rendering loop to include the backface pass.

this.backfaceFbo = new THREE.WebGLRenderTarget(width, height);

...

render() {
    requestAnimationFrame( this.render );

    this.renderer.clear();

    // render background to fbo
    this.renderer.setRenderTarget(this.envFbo);
    this.renderer.render( this.scene, this.orthoCamera );

    // render diamond back faces to fbo
    this.mesh.material = this.backfaceMaterial;
    this.renderer.setRenderTarget(this.backfaceFbo);
    this.renderer.clearDepth();
    this.renderer.render( this.scene, this.camera );

    // render background to screen
    this.renderer.setRenderTarget(null);
    this.renderer.render( this.scene, this.orthoCamera );
    this.renderer.clearDepth();

    // render diamond with refraction material to screen
    this.mesh.material = this.refractionMaterial;
    this.renderer.render( this.scene, this.camera );
};

Now we sample the backface normal texture in our refractive material.

vec3 backfaceNormal = texture2D(backfaceMap, uv).rgb;

Finally, we combine the front and back normals.

float a = 0.33;
vec3 normal = worldNormal * (1.0 - a) - backfaceNormal * a;

In this equation, a is just a scalar value indicating how much of the back-facing normal should be applied.
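
For reference, here is a sketch of how those two snippets slot into the fragment shader from step 2; backfaceMap is the uniform name I'm assuming for the back-face normal texture:

uniform sampler2D envMap;
uniform sampler2D backfaceMap;
uniform vec2 resolution;

varying vec3 worldNormal;
varying vec3 eyeVector;

float Fresnel(vec3 eyeVector, vec3 worldNormal) {
    return pow( 1.0 + dot( eyeVector, worldNormal), 3.0 );
}

void main() {
    // get screen coordinates
    vec2 uv = gl_FragCoord.xy / resolution;

    // combine the front and back normals
    vec3 backfaceNormal = texture2D(backfaceMap, uv).rgb;
    float a = 0.33;
    vec3 normal = worldNormal * (1.0 - a) - backfaceNormal * a;

    // calculate refraction and add to the screen coordinates
    float ior = 1.5;
    vec3 refracted = refract(eyeVector, normal, 1.0/ior);
    uv += refracted.xy;

    // sample the background texture
    vec4 color = texture2D(envMap, uv);

    // mix in the white reflection based on the Fresnel ratio
    float f = Fresnel(eyeVector, normal);
    color.rgb = mix(color.rgb, vec3(1.0), f);

    gl_FragColor = vec4(color.rgb, 1.0);
}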

We did it! We can now see every side of the diamond, purely through the refraction and reflection of the material it's made of.

Limitations

As I've already explained, it's impossible to render physically correct transparent materials in real time with this approach. Another problem occurs when rendering multiple glass objects in front of each other: since we only sample the environment once, we can't see through a chain of objects. Finally, the screen-space refraction demonstrated here doesn't work well near the edges of the canvas, because rays may refract to coordinates outside its bounds, and that data wasn't captured when rendering the background scene to the render target.

Of course, there are ways to overcome these limitations, but they may not all be good solutions for your real-time rendering in WebGL.

That wraps up the details of how to implement real-time polygon refraction with Three.js.
