Images Loading and Display

The image editing tools in Possumwood are relatively basic, aimed primarily at preparing image data to be used in OpenGL rendering and 3D algorithms.

This initial tutorial shows how to load an image and display it in the viewport using OpenGL.

Loading an image

To load an image from an external file, we can use the images/load node:

(Screenshot: the images/load node in the node graph)

In this node's properties, we can select an image filename; for now, let's use one of the example files - examples/hdrihaven_envmaps/rathaus_1k.png. Clicking the "Show details..." button will then bring up a simple image viewer, showing the content of the image.

(Screenshot: the image viewer showing the loaded example image)

Image as a texture in OpenGL

To display a loaded image in the viewport, we need to create a piece of geometry to attach the image to, and a pair of vertex and fragment shaders, linked by a program, to display it. We also need to convert the image to a GL Uniform object of type Sampler, to make it accessible from the OpenGL shaders.

As the input geometry, we can use a simple polymesh/grid instance with a single level of subdivision. We pass this through a polymesh/vertex_data node to convert it to a vertex source, which can then be passed into a render/draw node.

To set up the GL Uniform data, we need to instantiate a render/uniforms/texture node, which converts an image to a Sampler object. To allow our shaders to access the viewport transformation, we also need to chain the texture node with an instance of the render/uniforms/viewport node.

Finally, we need to set up the GL Program by linking a render/program node with a render/vertex_shader and a render/fragment_shader instance.
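Each shader node has an editable GLSL source. As a starting point, a minimal pass-through pair along the following lines is enough to get the grid on screen; this is just a sketch using the P vertex attribute and the iProjection/iModelView uniforms we will rely on later, and the defaults provided by the nodes may differ slightly.

Vertex shader:

#version 130

in vec3 P;                     // position attr from the vbo

uniform mat4 iProjection;      // projection matrix
uniform mat4 iModelView;       // modelview matrix

void main() {
	// transform the world-space position into clip space
	gl_Position = iProjection * iModelView * vec4(P, 1);
}

Fragment shader:

#version 130

out vec4 color;

void main() {
	// constant white colour, just to make the quad visible
	color = vec4(1.0, 1.0, 1.0, 1.0);
}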

Linking these appropriately, we should get a simple quad in the scene:

(Screenshot: a simple quad displayed in the viewport)

Mapping the texture

To display our image/texture on the quad, we first need to make sure that our sampler is set up correctly. In the properties of the texture node, we need to set the name under which it is exposed to the OpenGL shaders ("image" in our case). This will expose a uniform sampler2D image on the uniforms output:

(Screenshot: the texture node properties, with the sampler name set to "image")
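In GLSL terms, the uniforms output now carries a declaration equivalent to the following (the name matches whatever is set in the node's properties), which the fragment shader below will reference:

// exposed by the render/uniforms/texture node under the name set above
uniform sampler2D image;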

To map the texture onto a polygon in a fragment shader, we first need to expose a mapping parameter in the vertex shader. This method is usually referred to as UV mapping, with the two parameters u and v describing the position of a texture pixel on the polygon (or polygonal mesh), traditionally in the range [0..1].

In our case, the plane's vertex positions in world space correspond to the UV values - we only need to pass them untransformed into the fragment shader:

#version 130

in vec3 P;                     // position attr from the vbo

uniform mat4 iProjection;      // projection matrix
uniform mat4 iModelView;       // modelview matrix

out vec2 uv;                   // uv texturing parameters

void main() {
	// transform the world-space position into clip space
	vec4 pos4 = vec4(P.x, P.y, P.z, 1);
	gl_Position = iProjection * iModelView * pos4;

	// UV parameters are just untransformed world-space position
	uv = vec2(P.x, P.y);
}

In the fragment shader, we can then use the sampler to fetch a colour for each fragment based on the uv parameters, by changing its code to:

#version 130

out vec4 color;
in vec2 uv;
uniform sampler2D image;

void main() {
	// fetch the colour from the texture at the interpolated uv coordinates
	color = vec4(texture(image, uv));
}

This leads to our texture displayed on our polygon:

(Screenshot: the texture displayed on the quad)

While this setup is mostly functional, our texture is unfortunately flipped. This is caused by a difference in coordinate systems - images traditionally have their origin in the top-left corner, with the Y axis pointing down, while 3D space usually has its Y axis pointing up (the reverse of image space). To correct for this, we can simply invert the Y axis in the fragment shader:

#version 130

out vec4 color;
in vec2 uv;
uniform sampler2D image;

void main() {
	// flip the Y axis of the uv coordinates to match the image's orientation
	color = vec4(texture(image, vec2(uv.x, 1.0-uv.y)));
}
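As an aside, the same correction could instead be applied once in the vertex shader when computing the UVs - since, as noted above, the grid's vertex positions already lie in the [0..1] range, the flip amounts to:

	uv = vec2(P.x, 1.0 - P.y);

Both variants produce the same result; flipping in the fragment shader simply keeps the vertex shader untouched.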

Aspect ratio

While our current setup displays the texture on a square correctly, the original resolution of the image is 1024x512, which corresponds to a rectangle with an aspect ratio of 2:1. To get the correct aspect ratio in our display, we need to extract the metadata from the image and pass them into the vertex shader as uniforms. This can be achieved using the image/metadata node and two render/uniform/unsigned nodes, passing the width and height parameters into uniforms of the same name:

(Screenshot: the image/metadata node connected to two render/uniform/unsigned nodes feeding the width and height uniforms)

Accessing these in the vertex shader, we can then compute the aspect ratio and deform the grid accordingly:

#version 130

in vec3 P;                     // position attr from the vbo

uniform mat4 iProjection;      // projection matrix
uniform mat4 iModelView;       // modelview matrix

uniform uint width;
uniform uint height;

out vec2 uv;                   // uv texturing parameters

void main() {
	// compute the aspect ratio from image width and height
	float aspect = float(width) / float(height);

	// scale the X axis by the aspect ratio and transform to clip space
	vec4 pos4 = vec4(P.x * aspect, P.y, P.z, 1);
	gl_Position = iProjection * iModelView * pos4;

	// UV parameters are just untransformed world-space position
	uv = vec2(P.x, P.y);
}

This leads to a simple setup displaying an image as a texture in an OpenGL viewport:

(Screenshot: the final setup, with the image displayed at the correct aspect ratio)