# Interactivity
Before we start implementing the rules of motion for our fluid, we need a way to add fluid to the container in the first place! That will be the focus of this chapter: connecting JavaScript mouse events to our shaders. At the very end we'll be able to draw on our canvas like so:
To create the effect seen above we'll need to know a few things each frame:
- Where to draw the circle
- What color to draw the circle
- How big to draw the circle
These will correspond to:
- Mouse position
- Mouse velocity
- A constant of our choice
If you want to skip implementing mouse tracking yourself, you can copy and paste our code below.
(Spoiler!) Our mouse tracking code
```js
this.mouse = {
    pos: [0, 0],
    vel: [0, 0],
};

let getNextPos = e => [
    e.offsetX * this.settings.dataResolution[0] / this.canvas.clientWidth,
    (this.canvas.offsetHeight - e.offsetY) * this.settings.dataResolution[1] / this.canvas.clientHeight
];

this.mousedown = false;
this.mousemoveTime = performance.now();

this.canvas.addEventListener('mousedown', e => {
    this.mousemoveTime = performance.now();
    this.mouse.pos = getNextPos(e);
    this.mouse.vel = [0, 0];
    this.mousedown = true;
});

// For mouseup we use document in case they dragged off canvas before mouseup
document.addEventListener('mouseup', () => { this.mousedown = false; });

this.canvas.addEventListener('mousemove', e => {
    if (!this.mousedown) return;

    let now = performance.now();
    let dt = (now - this.mousemoveTime) / 1e3;
    this.mousemoveTime = now;

    let nextPos = getNextPos(e);
    this.mouse.vel = [(nextPos[0] - this.mouse.pos[0]) / dt, (nextPos[1] - this.mouse.pos[1]) / dt];
    this.mouse.pos = nextPos;
});
```
For the adventurous, there are plenty of guides (read: StackOverflow pages) which can show you how to add mouse tracking in JavaScript. Here are the requirements we have:
- We'll need to track whether the mouse is currently down (in `this.mousedown`). Ideally, clicking on the canvas triggers this while clicking elsewhere does not. The end of a click anywhere should set it back to false.
- We need the current position, preferably in the same units as our data resolution (in `this.mouse.pos`).
- We need the current velocity, in the same units as position (in `this.mouse.vel`).
If you would like to implement this yourself, take a look at the `mousedown`, `mouseup`, and `mousemove` JavaScript events. You can attach event listeners like so:

```js
htmlElement.addEventListener("mousedown", e => {
    // Do stuff here.
    // The scope of `this` remains the same in here,
    // so you can access any fields of the FluidSimRenderer class.
});
```
Don't forget to initialize these variables to some default value in addition to modifying them in your event listeners. It may be useful to `console.log` the position and velocity from within your event listeners while debugging to make sure they are working correctly.
Great! We have our mouse tracking in JavaScript. Now we need to learn how to send that data to a shader.
Create a new shader at `glsl/forces.glsl` (user interaction is handled the same way as external forces). Don't forget to copy in the same version and precision header as in `render.glsl`!
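If you're not sure which lines those are, a typical WebGL2 fragment shader header looks like this (double-check against your own `render.glsl` from Chapter 3):

```glsl
#version 300 es
precision highp float;
```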
We'll again want to add `in vec2 fragUV`. This will be present in all our fragment shaders, so we'll stop mentioning it from here on out. We also need an output, which you can name as you please; we will use `out vec4 cellValue`.
This is our first shader that requires inputs that aren't from the vertex shader, but instead from our JavaScript! GLSL has the perfect solution: uniforms. Uniforms are inputs to a shader which are the same for all fragments. To specify these inputs we use the `uniform` keyword, just as we have the `in` and `out` keywords. In addition to the inputs mentioned earlier, we'll also include `dt`, the time delta for this frame, and `res`, the data resolution.
```glsl
uniform vec2 mousePos;
uniform vec2 mouseVel;
uniform float radius;
uniform float dt;
uniform vec2 res;
```
We can now use these values in our shader. Our shader is going to be pretty simple. We want to output a rough circle of color around the mouse position. Our mouse position is in a different coordinate system compared to our fragment positions. For now, use the following code to convert between them -- we will explain our coordinate naming conventions and meanings later in this chapter.
```glsl
// fragXY is in the same coordinate system as mousePos
vec2 fragXY = fragUV * res - 0.5;
vec2 dist = fragXY - mousePos;
```
We can then use the GLSL `exp` function to create an exponential decay around the mouse position:

```glsl
float decay = exp(-1. * length(dist / radius));
vec2 impulse = mouseVel * decay * dt;
cellValue = vec4(impulse, 0., 1.);
```
If we were able to run our shader we would see a dot around our mouse whenever it was down. We aren't quite ready for that yet, though.
First things first, we have to add our shader to the JavaScript just like we did with `render.glsl` in Chapter 3 (in the `init` function).
Now we need to do something extra: in order to send these uniforms to the shader we need to store their locations. This is part of the WebGL API and will let us set these values later.
Each shader program we create will have its own uniform locations. To handle this we will create an object for each program which stores these locations. To get a location we must use the `gl.getUniformLocation(<shaderProgram>, <uniformName>)` function. The uniform name must exactly match the variable name from the shader.
You can store these locations however you like. We use a local helper function again to make it easier:
(Spoiler!) Helper function code
```js
let createUniforms = (program, names) => {
    let uniforms = {};
    names.forEach(name => {
        uniforms[name] = gl.getUniformLocation(program, name);
    });
    return uniforms;
};
```
For now we only need this object for the forces shader. Assuming you have created the shader program `this.forcesProgram` and have the local helper function `createUniforms`, we will set up our uniforms object like so:

```js
this.forcesUniforms = createUniforms(this.forcesProgram, ['mousePos', 'mouseVel', 'radius', 'dt', 'res']);
```
Now is the time to fill in the `applyForces` function. To best wrap this shader, add a parameter to the function for each uniform it requires. (You may want to exclude `res` from this since it should always be set to `this.settings.dataResolution`, but that is up to you.)
All of our wrapper functions will begin and end like the render wrapper did, with slight modifications:
```js
gl.useProgram(this.forcesProgram);

gl.bindVertexArray(this.quad.vao);
gl.enableVertexAttribArray(this.forcesProgram.vertexPositionAttribute);
gl.bindBuffer(gl.ARRAY_BUFFER, this.quad.buffer);
gl.vertexAttribPointer(this.forcesProgram.vertexPositionAttribute, this.quad.itemSize, gl.FLOAT, false, 0, 0);

// TODO: Set uniforms before running the shader!

gl.bindFramebuffer(gl.FRAMEBUFFER, null);
gl.viewport(0, 0, ...this.settings.renderResolution);
gl.drawArrays(this.quad.glDrawEnum, 0, this.quad.nItems);
```
To set our uniforms we will use the `gl.uniform*` family of functions (see the WebGL documentation for a full list). For example, to set the `res` uniform we would use:

```js
gl.uniform2fv(this.forcesUniforms.res, this.settings.dataResolution);
```
You can set the rest on your own! If you'd like to check your work, one possible version is sketched below.
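A minimal sketch, assuming an `applyForces(mousePos, mouseVel, radius, dt)` signature for now:

```js
// Sketch only: adjust the names to match your own applyForces signature
gl.uniform2fv(this.forcesUniforms.mousePos, mousePos);
gl.uniform2fv(this.forcesUniforms.mouseVel, mouseVel);
gl.uniform1f(this.forcesUniforms.radius, radius);
gl.uniform1f(this.forcesUniforms.dt, dt);
```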
We've been through a lot, so let's take a break and enjoy the fruits of our labor for a bit. We're not actually done yet, but we can see some temporary progress by making a quick change: change the `animate` function to call `applyForces` instead of `render`. If you load up your page, you should get an effect similar to shining a laser pointer on your canvas.
This is not quite the same as the first GIF we showed because we don't yet have a way to remember the last frame's image and draw on top of it rather than replacing it. That is where render-to-texture comes into play.
(Don't forget to undo your changes to `animate` before moving on!)
We will need two textures (for now):
- The velocity texture will be where we store our main data.
- The output texture is sort of temporary. WebGL does not allow a texture to be both the input and render target of a shader, but we want our textures to do just that in order to evolve over time. Our workaround is to create this output texture, render to it, and then swap the pointers. You'll see how this works by the end of this section.
In our constructor there is a section for creating textures and VAOs. That is what we will fill in now. We're going to need to repeat this same procedure many times, so let's create a local function for it:
```js
let createTexture = () => {
    let texture = gl.createTexture();
    // TODO: initialization
    return texture;
};
```
As you can see, the `gl.createTexture` function will create the texture object for us, but we still need to do some initialization before it's ready.
There are many different kinds of textures we could use, so we need to tell the GPU what these textures will be exactly. To understand what options we have, we'll need a crash course on coordinate systems*.
*Different resources may use different naming conventions for these coordinates. These are the definitions we will use from here on out.
- **UV coordinates** we've seen already in our fragment shader. In UV coordinates (0, 0) is the bottom-left corner and (1, 1) is the top-right. Note that a pixel center is not at (0, 0) -- the edge of the pixel is there. UV coordinates are what must be passed in to GLSL's `texture` function to sample a texture.
- **ST coordinates** are also called "texture coordinates". In a 5x5 texture, ST coordinates can range from 0 to 5. Now perhaps it makes sense why a (0, 0) UV wasn't a pixel center! In ST coordinates you need a half to get to a pixel center, so (0.5, 0.5) is our lower-left pixel and (4.5, 4.5) is our upper-right pixel. ST coordinates can be easily obtained from UV coordinates via `vec2 st = uv * textureResolution`.
- **XY coordinates** are like array indices, but with a bottom-left origin. So (0, 0) is the bottom-left pixel and in a 5x5 texture (4, 4) would be the top-right pixel. In general `vec2 xy = st - 0.5`. Since GLSL uses UVs for textures it is often more convenient to work in UV or ST coordinates than XY. They are defined here to help you understand the relation to 2D array indices.
Trick question time: what happens if I sample a texture at XY coordinates (-1, -1)? What about at XY coordinates (0.25, 0.25)?
Inputs like these aren't illegal in GLSL, which is actually very useful! Instead, we must tell the GPU how we want it to deal with these edge cases. For example, we could wrap it so that (-1, -1) is equivalent to the top-right pixel or we could clamp it to (0, 0). (0.25, 0.25) could be rounded to the nearest pixel or it could return a blend of the surrounding pixels (via bilinear interpolation).
Ok, we're now ready to fill in that initialization. We will want to enable bilinear interpolation (it will be useful later) and disable coordinate wrapping (since we have solid boundaries). After binding the new texture with `gl.bindTexture(gl.TEXTURE_2D, texture)`, we can do so with the following commands:
```js
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.LINEAR);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.LINEAR);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.CLAMP_TO_EDGE);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.CLAMP_TO_EDGE);
```
(See https://www.khronos.org/registry/OpenGL-Refpages/gl4/html/glTexParameter.xhtml for detailed documentation)
We need to do one last piece of initialization by specifying the data format of the texture. We will want floating-point textures, and for maximum compatibility we will use 4-component (RGBA) textures. Remember that while not all of our textures will use all four components, we need each texture to be identical so that we can perform the pointer swap with our output texture. To set the format we will use:

```js
gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA32F, ...this.settings.dataResolution, 0, gl.RGBA, gl.FLOAT, null);
```
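Putting those pieces together, the finished helper might look like the following sketch. (Note that in WebGL2, rendering to and linearly filtering `RGBA32F` textures requires the `EXT_color_buffer_float` and `OES_texture_float_linear` extensions; we assume those were enabled during context setup.)

```js
let createTexture = () => {
    let texture = gl.createTexture();
    gl.bindTexture(gl.TEXTURE_2D, texture);

    // Sampling: bilinear interpolation, no wrapping at the edges
    gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.LINEAR);
    gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.LINEAR);
    gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.CLAMP_TO_EDGE);
    gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.CLAMP_TO_EDGE);

    // Allocate a floating-point RGBA texture at our data resolution
    gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA32F, ...this.settings.dataResolution, 0, gl.RGBA, gl.FLOAT, null);

    return texture;
};
```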
Now that we have this local `createTexture` function, let's use it to create our textures:

```js
this.velocityTexture = createTexture();
this.outputTexture = createTexture();
```
To be able to send textures as input to our shader we'll need to create a new uniform. In GLSL the variable is actually a texture sampler, so we use the type `sampler2D`. In our `forces.glsl` shader, let's add a new uniform:

```glsl
uniform sampler2D data;
```
In our `main` function the only modification we'll make is to add to the color rather than overwriting it:

```glsl
vec4 oldValue = texture(data, fragUV);
cellValue = oldValue + vec4(impulse, 0., 0.);
```
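For reference, here is what the complete `forces.glsl` might look like at this point (a sketch assuming the header shown earlier; your variable names may differ):

```glsl
#version 300 es
precision highp float;

in vec2 fragUV;
out vec4 cellValue;

uniform sampler2D data;
uniform vec2 mousePos;
uniform vec2 mouseVel;
uniform float radius;
uniform float dt;
uniform vec2 res;

void main() {
    // Convert fragUV into the same XY coordinate system as mousePos
    vec2 fragXY = fragUV * res - 0.5;
    vec2 dist = fragXY - mousePos;

    // Exponential falloff around the mouse position
    float decay = exp(-1. * length(dist / radius));
    vec2 impulse = mouseVel * decay * dt;

    // Add to the previous cell value rather than overwriting it
    vec4 oldValue = texture(data, fragUV);
    cellValue = oldValue + vec4(impulse, 0., 0.);
}
```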
Now we need to hook this uniform into our JavaScript. Add a location for it to your uniforms object, and add an argument to `applyForces` for it.
Sending a texture as a uniform is slightly different than other uniforms, so pay attention. Each GLSL sampler is assigned a texture unit (WebGL guarantees that there are at least 8 texture units), and each texture unit is assigned a texture. So all in all we have:

```js
// Set the data sampler to texture unit 0
gl.uniform1i(this.forcesUniforms.data, 0);

// Bind our velocity texture to texture unit 0
gl.activeTexture(gl.TEXTURE0);
gl.bindTexture(gl.TEXTURE_2D, data);
```
We also need to change `applyForces` to render to our `outputTexture` instead of to the canvas. We'll do this by replacing the `gl.bindFramebuffer` call with the following (this assumes you've created a framebuffer during init, e.g. `this.framebuffer = gl.createFramebuffer()`):

```js
gl.bindFramebuffer(gl.FRAMEBUFFER, this.framebuffer);
gl.viewport(0, 0, ...this.settings.dataResolution);
gl.framebufferTexture2D(gl.FRAMEBUFFER, gl.COLOR_ATTACHMENT0, gl.TEXTURE_2D, this.outputTexture, 0);
```
Note the three changes here:

- We have now bound an actual framebuffer instead of `null`. A framebuffer is a way of keeping track of attachments, but we don't use it for much. Binding `null` will automatically render to the screen, but binding `this.framebuffer` will allow us to render to a texture.
- Our viewport size is now `dataResolution` instead of `renderResolution`. This allows us to have a final render with a higher resolution than our data. This will be useful for debugging later.
- We've added a line setting `this.outputTexture` as the render target for the bound framebuffer. This enables render-to-texture.
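Putting this whole section together, `applyForces` might now look something like the following sketch (the signature and the local `gl` alias are assumptions; adapt them to your own class):

```js
applyForces(data, mousePos, mouseVel, radius, dt) {
    let gl = this.gl; // assuming the context is stored on the class

    gl.useProgram(this.forcesProgram);

    gl.bindVertexArray(this.quad.vao);
    gl.enableVertexAttribArray(this.forcesProgram.vertexPositionAttribute);
    gl.bindBuffer(gl.ARRAY_BUFFER, this.quad.buffer);
    gl.vertexAttribPointer(this.forcesProgram.vertexPositionAttribute, this.quad.itemSize, gl.FLOAT, false, 0, 0);

    // Uniforms
    gl.uniform2fv(this.forcesUniforms.mousePos, mousePos);
    gl.uniform2fv(this.forcesUniforms.mouseVel, mouseVel);
    gl.uniform1f(this.forcesUniforms.radius, radius);
    gl.uniform1f(this.forcesUniforms.dt, dt);
    gl.uniform2fv(this.forcesUniforms.res, this.settings.dataResolution);

    // Texture input: bind `data` to texture unit 0
    gl.uniform1i(this.forcesUniforms.data, 0);
    gl.activeTexture(gl.TEXTURE0);
    gl.bindTexture(gl.TEXTURE_2D, data);

    // Render to outputTexture instead of the canvas
    gl.bindFramebuffer(gl.FRAMEBUFFER, this.framebuffer);
    gl.viewport(0, 0, ...this.settings.dataResolution);
    gl.framebufferTexture2D(gl.FRAMEBUFFER, gl.COLOR_ATTACHMENT0, gl.TEXTURE_2D, this.outputTexture, 0);

    gl.drawArrays(this.quad.glDrawEnum, 0, this.quad.nItems);
}
```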
Now we must call `applyForces` from within `update`. Remember that the output of this render is sent to `this.outputTexture`. To get it back into `this.velocityTexture`, create a temporary variable at the top of `update` (we'll use it often). Then add a line after the call to `applyForces` to swap the pointers. In the end it should look like:

```js
let tmp;

applyForces(...);
tmp = this.velocityTexture; this.velocityTexture = this.outputTexture; this.outputTexture = tmp;
```
The last step is to send our data to the render shader and use it to set our output color. The steps for doing so are the same as the steps we took to set up the `forces.glsl` uniforms, so this is a great opportunity to make sure you've been following along well.
Finally! We have our paintable canvas! To get the reset button working, simply call `this.clearTexture(this.velocityTexture)` from within the `reset` function. Now you can draw to your heart's content, reset, and keep drawing!
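If your starter code doesn't already provide `clearTexture`, a minimal sketch might look like this (reusing the same `this.framebuffer` from `applyForces`; the `this.gl` alias is again an assumption):

```js
clearTexture(texture) {
    let gl = this.gl;

    // Attach the texture to our framebuffer and clear it to all zeros
    gl.bindFramebuffer(gl.FRAMEBUFFER, this.framebuffer);
    gl.framebufferTexture2D(gl.FRAMEBUFFER, gl.COLOR_ATTACHMENT0, gl.TEXTURE_2D, texture, 0);
    gl.clearColor(0, 0, 0, 0);
    gl.clear(gl.COLOR_BUFFER_BIT);
}
```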
So far our only data texture is for velocity. Now is a good time to add another texture for dye. Our simulation works by modeling the flow of the fluid, and so it implicitly assumes there is fluid everywhere to carry that flow along. To visualize the flow, we imagine dropping dye into the fluid and watching it move.
Use what you've learned to create a dye texture. Rather than creating a second texture input to `applyForces`, simply call the function twice: once for the velocity texture and again for the dye texture. (For the dye texture you can set `mouseVel`'s y-component to 0 since dye is a scalar value.)
Finally, add the dye texture as a uniform in `render.glsl`. We use the dye's x-channel as the red output and the velocity's xy-channels as the green/blue outputs.
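For reference, the relevant parts of `render.glsl` might then look something like this sketch (`dye`, `velocity`, and the output name `fragColor` are assumed names; match them to your own shader):

```glsl
uniform sampler2D dye;
uniform sampler2D velocity;

void main() {
    float d = texture(dye, fragUV).x;      // dye density drives the red channel
    vec2 v = texture(velocity, fragUV).xy; // velocity drives green/blue
    fragColor = vec4(d, v, 1.);
}
```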
In the next chapter we will add advection, the process by which the fluid moves according to its velocity.