Clarify "Projective texturing" component - michaliskambi/x3d-tests GitHub Wiki

Comparison of the projective texturing solution in CGE with the new Projective Texture Mapping component in the X3D 4 draft specification.

The Castle Game Engine approach to projective texturing is part of the CGE "shadow maps" extensions, documented at https://castle-engine.io/x3d_extensions_shadow_maps.php .

Of course it can be used without shadow maps too (just to project textures, like a real-world projector casts a movie on a wall). CGE's primary need was to define a way to project a texture from a light source (or a viewpoint) using the ProjectedTextureCoordinate node, which is used much like the standard TextureCoordinateGenerator. In particular, casting a texture from a light source is essential for shadow mapping, which is why I developed my projective texturing extension as a by-product of developing the shadow maps extensions.

A simple example of how this looks is at https://github.com/castle-engine/demo-models/blob/master/shadow_maps/projective_texturing_simple.x3dv . You can open it with view3dscene.
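
Below is a minimal sketch of this usage in X3D classic encoding, following the CGE extensions documentation linked above; the geometry, the light placement and the texture file name are placeholders of my own.

```
#X3D V3.2 utf8
PROFILE Immersive

# A spot light acting as the texture projector.
# projectionNear / projectionFar are CGE extension fields.
DEF MyProjector SpotLight {
  location 0 5 0
  direction 0 -1 0
  projectionNear 0.1
  projectionFar 100
}

Shape {
  appearance Appearance {
    material Material { }
    texture ImageTexture { url "slide.png" }
  }
  geometry IndexedFaceSet {
    coordIndex [ 0 1 2 3 -1 ]
    coord Coordinate { point [ -5 0 5, 5 0 5, 5 0 -5, -5 0 -5 ] }
    # Texture coordinates are generated by projecting from the light above.
    texCoord ProjectedTextureCoordinate {
      projector USE MyProjector
    }
  }
}
```

You can save such a file as .x3dv and open it in view3dscene to see the texture projected along the light's frustum onto the quad.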

There is a big difference between the Castle Game Engine approach and the X3D 4 spec approach:

  1. The X3D 4 approach adds projector nodes that are defined outside of the Shape. A projector node affects many shapes, using the "global" or "scoped" mechanism.

    In contrast, the CGE ProjectedTextureCoordinate is placed within the texCoord field of a geometry node. You need to explicitly add it to each relevant geometry node. ProjectedTextureCoordinate was modeled after the standard TextureCoordinateGenerator, so it cooperates in the same way with multi-texturing and with "slots of textures within materials" (see X3D version 4: New features of materials, lights and textures). E.g. you can have the 1st texture layer coordinates determined by ProjectedTextureCoordinate, and the 2nd texture layer coordinates determined by something else (like TextureCoordinateGenerator or an explicit TextureCoordinate); see the sketch after this list.

  2. X3D 4 projector nodes define their own projection parameters. The node type (TextureProjectorPerspective or TextureProjectorParallel) determines whether the projection is perspective or orthogonal.

    In contrast, my ProjectedTextureCoordinate refers to a light source or a viewpoint node. You can use the DEF / USE mechanism to refer to an existing light source / viewpoint, or you can define a new light source / viewpoint inside. The projection parameters and type are determined by looking at the node type (e.g. OrthoViewpoint or DirectionalLight results in an orthogonal projection, Viewpoint or SpotLight results in a perspective projection). In some cases I needed to add additional fields to lights, see https://castle-engine.io/x3d_extensions_shadow_maps.php#section_light_parameters . E.g. all light sources have new projectionNear and projectionFar fields.

    This was a necessity in my case. We need to have the projection parameters specified at the light source, to make shadow maps possible from this light source. Then we can guarantee using the same projection parameters when (A) generating the shadow maps and when (B) displaying the shadow maps. If the projection parameters used for (A) and (B) were different, shadow maps would not work correctly.

  3. In summary, the CGE approach relies on a few modifications to existing nodes (some extensions to light nodes), and 1 simple new node: ProjectedTextureCoordinate (which is 100% consistent with TextureCoordinateGenerator in how it interacts with other textures).

    In contrast, the X3D 4 approach relies on 2 new nodes, which are more independent of the rest of the specification, so the rest of the spec doesn't need to be modified. They define the orthogonal/perspective projection using their own fields, and are not synchronized with any light or viewpoint.
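
To make the first point concrete, here is a sketch in X3D classic encoding of a shape with two texture layers: the 1st layer's coordinates come from ProjectedTextureCoordinate (projecting from a SpotLight carrying the extension fields projectionNear / projectionFar), while the 2nd layer uses an ordinary TextureCoordinateGenerator. Node and field names follow the CGE documentation linked above; the concrete geometry and texture file names are placeholders.

```
#X3D V3.2 utf8
PROFILE Immersive

DEF ProjectorLight SpotLight {
  location 0 5 0
  direction 0 -1 0
  projectionNear 0.1 # CGE extension, also reused for shadow maps from this light
  projectionFar 100  # CGE extension
}

Shape {
  appearance Appearance {
    material Material { }
    texture MultiTexture {
      texture [
        ImageTexture { url "projected_slide.png" } # layer 1: projected
        ImageTexture { url "base_pattern.png" }    # layer 2: regular
      ]
    }
  }
  geometry IndexedFaceSet {
    coordIndex [ 0 1 2 3 -1 ]
    coord Coordinate { point [ -5 0 5, 5 0 5, 5 0 -5, -5 0 -5 ] }
    texCoord MultiTextureCoordinate {
      texCoord [
        # layer 1 coordinates: projection from the light above
        ProjectedTextureCoordinate { projector USE ProjectorLight }
        # layer 2 coordinates: generated from vertex positions
        TextureCoordinateGenerator { mode "COORD" }
      ]
    }
  }
}
```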

TODO: Address how projective texturing in the X3D 4 spec affects existing textures

TODO: Test how X_ITE answers the questions below; it seems to implement PTM ( http://create3000.de/users-guide/components/#projectivetexturemapping ).

When preparing example scenes for this component, I would suggest addressing the question: "How does projecting a texture work on a shape that already has textures?"

  • What if the shape is already using a simple texture, i.e. it has "Appearance.texture" set to some "ImageTexture" and "IndexedFaceSet.texCoord" set to a "TextureCoordinate" or "TextureCoordinateGenerator"?

  • What if the shape is already using multi-texturing, i.e. it has "Appearance.texture" set to a "MultiTexture" node, e.g. with 2 "ImageTexture" instances, and "IndexedFaceSet.texCoord" contains a "MultiTextureCoordinate"?

    What if it has 2 nodes in "MultiTexture", but only 1 node in "MultiTextureCoordinate"?

  • After X3D version 4: New features of materials, lights and textures, we have texture references inside materials (Material, PhysicalMaterial, UnlitMaterial). This way textures can affect various parameters, similar to how Collada or glTF handle materials. So we can ask a similar question a third time: what happens if Material.diffuseTexture already specifies some texture, and you project a texture onto it?

When you define a texture projector outside of the shape, you need to address the above questions.

One proposal: the projected texture should be added as an additional multi-texture layer to Appearance.texture. If necessary, the sizes of "MultiTexture" and "MultiTextureCoordinate" should be internally synchronized before adding this new layer. If a shape wasn't using multi-texturing (it only had 1 texture), it will effectively be internally converted to use multi-texturing.

Of course I am talking here about an "internal conversion", not something visible to the X3D author. But the X3D player should behave "as if" there was an additional texture, and the X3D specification could formulate its requirements this way ("The rendering result should be as if the projected texture was an additional multi-texture layer...").
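
A hypothetical sketch of this proposal in X3D classic encoding: the first Shape is what the author writes, the second Shape is how the browser should behave under the proposed rule, with the projected texture appended as an extra multi-texture layer. For concreteness I use the CGE SpotLight extensions and ProjectedTextureCoordinate as a stand-in projector; with the X3D 4 projector nodes only the source of the extra layer's coordinates would differ. This is my proposal, not current spec text, and the file names are placeholders.

```
#X3D V3.2 utf8
PROFILE Immersive

# Stand-in projector (it projects the texture "projected.png" used below).
DEF Proj SpotLight {
  location 0 5 0
  direction 0 -1 0
  projectionNear 0.1
  projectionFar 100
}

# What the author writes: a shape with a single ordinary texture.
Shape {
  appearance Appearance {
    material Material { }
    texture ImageTexture { url "wall.png" }
  }
  geometry IndexedFaceSet {
    coordIndex [ 0 1 2 3 -1 ]
    coord Coordinate { point [ -5 0 5, 5 0 5, 5 0 -5, -5 0 -5 ] }
    texCoord TextureCoordinate { point [ 0 0, 1 0, 1 1, 0 1 ] }
  }
}

# How the browser should behave, "as if" the scene contained this instead:
# the projected texture becomes an additional multi-texture layer,
# with coordinates generated by the projection.
Shape {
  appearance Appearance {
    material Material { }
    texture MultiTexture {
      texture [
        ImageTexture { url "wall.png" }      # the author's original texture
        ImageTexture { url "projected.png" } # the projector's texture
      ]
    }
  }
  geometry IndexedFaceSet {
    coordIndex [ 0 1 2 3 -1 ]
    coord Coordinate { point [ -5 0 5, 5 0 5, 5 0 -5, -5 0 -5 ] }
    texCoord MultiTextureCoordinate {
      texCoord [
        TextureCoordinate { point [ 0 0, 1 0, 1 1, 0 1 ] }
        ProjectedTextureCoordinate { projector USE Proj }
      ]
    }
  }
}
```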

If the material uses Material.diffuseTexture != NULL (or PhysicalMaterial.baseTexture != NULL, or UnlitMaterial.emissiveTexture != NULL), then the projected texture in Appearance.texture will in effect be ignored, following "12.2.5 Relation of textures specified at material nodes to the Appearance.texture field". Authors are advised to put their textures in Appearance.texture instead, to make them multiply nicely with projected textures.

And to make this reliable, we should adopt the rule "all textures modulate by default" ( https://github.com/michaliskambi/x3d-tests/wiki/Make-RGB-and-grayscale-textures-treatment-consistent ); otherwise you will have problems, because an ImageTexture without multi-texturing currently has different semantics than a single ImageTexture inside a MultiTexture. The problem is described at that link, and my proposed solution is to "always modulate by default, whether you use multi-texturing or not, and whether the texture is RGB or grayscale". I am addressing it anyway within X3D version 4: New features of materials, lights and textures.
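
A small sketch of the inconsistency, assuming the current (X3D 3) lighting and MultiTexture rules as I read them: both shapes below use the same RGB texture and a red material, but in the first one the RGB texture replaces the diffuse color, while in the second one the same texture, merely wrapped in a MultiTexture (whose default mode is MODULATE), is multiplied by the red diffuse color, so the two shapes can render differently.

```
#X3D V3.2 utf8
PROFILE Immersive

# RGB ImageTexture directly in Appearance.texture:
# the texture color replaces diffuseColor.
Shape {
  appearance Appearance {
    material Material { diffuseColor 1 0 0 }
    texture ImageTexture { url "pattern.png" }
  }
  geometry Box { }
}

# The same texture as the only item of a MultiTexture:
# the default MODULATE mode multiplies it by the red diffuse color.
Transform {
  translation 3 0 0
  children [
    Shape {
      appearance Appearance {
        material Material { diffuseColor 1 0 0 }
        texture MultiTexture {
          texture [ ImageTexture { url "pattern.png" } ]
        }
      }
      geometry Box { }
    }
  ]
}
```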

P.S. Sorry if this sounds complicated :) Texturing is a non-trivial thing to define, and unfortunately the X3D spec has a number of problems related to it; I was already reporting them a few years ago. But thinking about multi-texturing is necessary if you want to be able to project a texture X onto a shape that is already textured with a texture Y.