Data Baker - GhislainGir/BlenderGameToolsDoc GitHub Wiki
The Data Baker tool aims to facilitate and automate the process of baking arbitrary data into vertex colors and UVs.
The concept of baking data into UVs and Vertex Colors is likely as old as real-time rendering itself.
Many artists and tech artists are probably already familiar with using vertex colors to paint masks in the RGBA channels. These masks are commonly used to drive artistic processes, such as customizing a material: fading in moss here, adding cracks to a brick texture there, sprinkling sand between floor cracks, and so on. You get the idea: using vertex colors is usually a very painterly process to control artistic parameters.
Taking a step back, one might wonder if the RGB channels of vertex colors could simply be used to store the XYZ components of a vector, any vector. And with some care, they can. The vertex color RGBA channels can be thought of as a way for each vertex to store four arbitrary 8-bit integers.
Similarly, artists are often taught to view UVs solely as a means of storing coordinates for texture projection onto a 3D surface, along with the constraints that come with this mindset: limiting texture deformation, hiding seams, staying within the [0:1] bounds, etc. However, if we take a step back, just like vertex colors, UVs can be seen as a way for each vertex to store two 16- or 32-bit floats. Most DCC software packages and real-time applications allow up to eight UV maps, which means up to 16 floats can be encoded per vertex.
While you usually need to follow general guidelines when authoring a UV map for texture projection, like limiting stretch, these constraints no longer exist, and the possibilities become endless, once you start thinking of UVs as a way to store arbitrary data: baking pivots, axes, normals, shape keys/morph targets, and so on.
Let’s start with a basic idea and iterate on it. This asset is composed of many individual grass blades, each with its own pivot point, allowing them to rotate around it to potentially simulate wind motion amongst other things.
However, once exported into a game engine, these individual meshes are merged, and their pivot points are lost. As a result, applying a simple oscillating rotation in the vertex shader causes the entire mesh to rotate together.
Therefore, the idea is to store these pivot points so they survive the export process and can be retrieved in the vertex shader. In other words, we can create a UV map where each vertex stores a UV coordinate corresponding to the XY position of the pivot point it needs to rotate around.
Here, each vertex has a UV coordinate corresponding to the XY origin of the object it belongs to, taking into account the differences between Blender's and the target application's world and UV coordinate systems—more on that below.
Important
The position's X and Y components might exceed the [0:1] range, and could even be negative, but that's perfectly fine. Staying within the unit range is only somewhat relevant when using UVs to sample a texture. The GPU might wrap or clamp the coordinates, which alters how the texture is sampled. However, UVs are typically 16-bit floats in most game engines, and can even be 32-bit floats if needed, so they are actually perfectly suited to store any arbitrary number, floating-point precision issues aside.
Once the mesh is exported into a game engine, that extra UV map can be accessed to read the value it stores. In this case, the XY components of the pivot position.
Important
When storing data in UVs, it is crucial to understand that the UV coordinate system in the application where the data is baked (DCC software) may differ from the UV coordinate system in the target application (game engine).
For example, Blender is OpenGL-oriented, so UV(0,0) is located at the bottom-left corner. On the other hand, Unreal Engine is DirectX-oriented, meaning UV(0,0) is at the top-left corner. To convert between the two, a one-minus (1-x) operation is necessary. This operation needs to be performed only once and should ideally be done during the bake process. However, nothing prevents you from performing this operation in the vertex shader when reading the UV map instead.
Additionally, it's important to recognize that the world coordinate system may vary between applications. While Blender and Unreal Engine have similar X and Z world axes, their Y axes are inverted. As a result, any Y component, whether it's a position or a vector, must have its sign flipped.
Moreover, scale is important. Blender's default unit is one meter, while Unreal Engine's default unit is one centimeter. Therefore, a scale factor of 100 may need to be applied, either during the bake or when reading UVs as positions or offsets.
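Taken together, the three conversions above (Y-axis flip, V-coordinate flip, and metre-to-centimetre scale) can be sketched as plain functions. This is a minimal illustration for a Blender-to-Unreal pipeline; the function names are illustrative, not part of the tool:

```python
def blender_pos_to_unreal(x, y, z, scale=100.0):
    # Flip the Y axis and convert metres to centimetres
    return (x * scale, -y * scale, z * scale)

def blender_v_to_unreal(v):
    # OpenGL-style V (origin bottom-left) to DirectX-style V (origin top-left)
    return 1.0 - v

# A pivot 2 m along +Y in Blender reads as -200 cm along Y in Unreal
print(blender_pos_to_unreal(0.0, 2.0, 0.0))  # (0.0, -200.0, 0.0)
```

Whether these conversions happen at bake time or in the vertex shader is up to you, as long as they happen exactly once.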
For this asset, the Z component is irrelevant since all pivot points are placed on the ground—Z is simply 0 and doesn't need to be stored. However, if the Z component of the pivot point did need to be stored, it could be saved in the U coordinate of an additional UV map, leaving the V component available for any other necessary data. In this way, the full XYZ pivot can be reconstructed by reading from multiple texture coordinates.
Because UVs can be stored as 32-bit floats, bit-packing can be used to efficiently combine multiple lower-precision values into a single UV channel—more on this further below.
This tool supports packing up to three floats into a single 32-bit float using 11, 10, and 10 bits of precision, respectively. This adds up to 31 bits, with 1 bit reserved to prevent generating NaN values. Naturally, some precision loss is to be expected—after all, you can't pack three 32-bit floats into a single 32-bit float without sacrificing some accuracy. While this makes bit-packing less ideal for large or highly varied values, like positions, it's still supported if needed. Bit-packing usually works best with unit vectors, as the precision loss typically isn’t a dealbreaker.
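As a rough illustration of the principle (this is not necessarily the tool's exact bit layout), three [0:1] floats can be quantized to 11/10/10 bits and combined into one 32-bit pattern. Here, the top exponent bit of the float is kept clear so the resulting bit pattern can never form a NaN or Inf, which is one way to spend the reserved bit:

```python
import struct

def pack_11_10_10(x, y, z):
    # Quantize three [0, 1] floats to 11, 10, and 10 bits
    a = round(min(max(x, 0.0), 1.0) * 2047)  # 11 bits
    b = round(min(max(y, 0.0), 1.0) * 1023)  # 10 bits
    c = round(min(max(z, 0.0), 1.0) * 1023)  # 10 bits
    bits = a | (b << 11) | (c << 21)         # 31 data bits
    # Relocate bit 30 into the sign bit so the exponent bits can
    # never be all ones, ruling out NaN/Inf patterns
    encoded = (bits & 0x3FFFFFFF) | ((bits >> 30) << 31)
    # Reinterpret the integer bit pattern as a float32
    return struct.unpack("<f", struct.pack("<I", encoded))[0]

def unpack_11_10_10(f):
    encoded = struct.unpack("<I", struct.pack("<f", f))[0]
    bits = (encoded & 0x3FFFFFFF) | ((encoded >> 31) << 30)
    return ((bits & 0x7FF) / 2047,
            ((bits >> 11) & 0x3FF) / 1023,
            ((bits >> 21) & 0x3FF) / 1023)
```

Values outside [0:1] would first need to be remapped into that range using a known min/max, which is exactly why the tool's report panel records those ranges.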
Important
Bit-packing isn't without its drawbacks, and for reasons that we'll explore shortly, it's currently only supported when storing data in the U axis of a UV map.
As briefly mentioned in the 'Theory' section, vertex colors can be used to store up to four 8-bit integers per vertex. As you may know, an 8-bit integer can store up to 256 values, ranging from 0 to 255. This is a fairly limited range to work with and makes vertex colors particularly unsuitable for storing large or highly varied values—such as positions—since these must first be remapped into the [0:255] range.
Additional biasing is required to handle negative values, typically by treating 127 as zero, with 0 representing -127 and 255 representing +128. This effectively limits the range to roughly 128 possible values on either side of zero.
However, the range is usually sufficient to store a unit vector—such as a normal—or a gradient.
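That remapping can be sketched as a pair of helpers; this is a minimal illustration and the helper names are hypothetical:

```python
def encode_unit_8bit(v):
    # Remap a [-1, 1] vector component to an 8-bit [0, 255] value
    return max(0, min(255, round((v * 0.5 + 0.5) * 255.0)))

def decode_unit_8bit(b):
    # Recover the approximate [-1, 1] component (worst-case error ~1/255)
    return (b / 255.0) * 2.0 - 1.0
```

Note that the decoded value is only an approximation: the quantization error is what makes 8-bit storage acceptable for unit vectors and gradients but unsuitable for positions.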
Note
An 8-bit integer is far too limited in precision to make bit-packing techniques viable—at least for our use case.
Just like UVs and vertex colors, normals are vertex attributes that can be used to store data per vertex. However, this comes with significant side effects.
Storing custom data in normals can obviously lead to lighting issues, as the normals will no longer represent the actual vertex normals. It can also cause problems with binormal and tangent calculations. For this reason, storing arbitrary data in normals is generally not recommended, but it can be done in special cases. For example, you might not need accurate normals & tangents for unlit props, vfx meshes, and so on.
As the name suggests, normals are normalized, meaning arbitrary data can’t be easily stored in them. You can’t simply pack data into each component, since the final XYZ vector must have a unit length, which imposes constraints on the values each component can hold to meet this requirement.
This isn’t a limitation when storing a unit vector like the object’s forward, right, or up axis, or a shape key normal, or any other custom unit vector, as these naturally lie on the surface of a unit sphere.
However, when storing a non-unit vector, like the object’s XYZ position, things become more complicated.
Even though the values can be remapped so they fit within the unit sphere, the resulting vectors typically lie inside the volume of the unit sphere rather than on its surface, which means that once normalized, these vectors get projected back onto the surface, corrupting the positions baked in the XYZ components.
Another way to look at it is to imagine a linear gradient being baked into just the X component of the normal, with Y and Z set to 0.0. It becomes clear that if the resulting XYZ vector must have unit length, the X component alone can only be 1.0, effectively destroying the gradient data. To preserve the gradient, another component—Y or Z—must be adjusted to maintain unit length. This can be achieved with the following equation:

Y = √(1 − X²)

or

Z = √(1 − X²)
This normalization is likely enforced at many stages of the pipeline—internally by the DCC software, during FBX export/import, or in the mesh serialization process of a game engine—and you most likely have no control over it.
Therefore, as far as I know, storing an arbitrary non-unit XYZ vector is not feasible. However, it is still possible to store two of its components, creating a 2D vector that lies within the unit circle.
By using the third component, it's possible to reconstruct a unit vector by projecting the 2D vector onto the surface of the sphere. This ensures the final XYZ vector has a unit length while preserving the remapped values stored in the first two components.
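The reconstruction step can be sketched as follows, assuming the X and Y components were already remapped to fit within the unit circle:

```python
import math

def reconstruct_unit(x, y):
    # Given X and Y within the unit circle, derive the Z that makes
    # the (X, Y, Z) vector unit length; max() guards against tiny
    # negative values caused by floating-point error
    z = math.sqrt(max(0.0, 1.0 - x * x - y * y))
    return (x, y, z)
```

The shader-side equivalent is typically the same expression wrapped in a saturate before the square root.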
For this reason, when the tool detects that a non-unit XYZ vector is baked into the normals, it will discard the Z component. This can be observed in the 'report' panel, where one of the data layers will be missing—more on that below. Keep in mind, this does not occur when baking a unit vector, where all three XYZ components of the normal can be used normally.
Additionally, for simplicity’s sake, if only a single value is stored in any one of the X, Y, or Z components of the normal, the other two components will be set to zero, and the first zeroed component will then be adjusted to ensure the vector has unit length.
In short, storing any value in the normal will override the original normal data.
Note
Though it's technically not required, you might want to use 'Shade Smooth' on your objects before baking data into normals to avoid duplicating vertices—if possible—as discussed here.
Important
Storing custom data in normals likely requires very specific export and import settings in both your DCC software and your game engine, like Unreal Engine.
Important
Triangulating the mesh upon export may cause custom normals to be unexpectedly modified or updated, so it's usually necessary to triangulate the mesh before baking custom data into the normals.
Note
Though theoretically possible, this tool does not allow you to bit-pack multiple pieces of data into the normal's XYZ float components, as the enforced normalization—despite the normal being as close as possible to unit length—can unpredictably scramble the bits and corrupt the packed data.
It's important to remember that any data baked into UVs, vertex colors, or normals is likely stored in local space—that is, relative to the baked object's origin. For example, if you bake a pivot into UVs for a mesh positioned 50 cm along the X axis from the final object's origin, it will always read as 50 cm in X.
As a result, once the mesh is imported into a game engine and placed in the world, any transform applied to it—location, rotation, or scale—won’t affect the UV coordinates. This means the pivots stored in them will no longer reflect their actual position in world space. Fortunately, retrieving the pivots in world space is straightforward: simply take the local position and apply the object’s world space matrix.
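As a minimal stand-in for "apply the object's world space matrix", the operation boils down to a 4x4 matrix-point multiply. The matrix values below are hypothetical, chosen to illustrate a rotated and translated object:

```python
import math

def apply_world_matrix(m, p):
    # Multiply a 4x4 row-major world matrix by a point (implicit w = 1)
    x, y, z = p
    return tuple(m[r][0] * x + m[r][1] * y + m[r][2] * z + m[r][3]
                 for r in range(3))

# Hypothetical object transform: 90-degree yaw plus a (10, 0, 0) translation
c, s = math.cos(math.pi / 2.0), math.sin(math.pi / 2.0)
world = [
    [c,   -s,   0.0, 10.0],
    [s,    c,   0.0,  0.0],
    [0.0,  0.0, 1.0,  0.0],
    [0.0,  0.0, 0.0,  1.0],
]

# A pivot baked as (0.5, 0, 0) in local space ends up rotated and translated
pivot_ws = apply_world_matrix(world, (0.5, 0.0, 0.0))
```

In a game engine you would use the engine's own object-to-world matrix in the vertex shader rather than building one by hand.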
To summarize, it's essential to decide whether you want to work in local or world space. The key is consistency:
- Pivots can be read as-is, in local space, with computations performed using the mesh’s local vertex positions.
- Alternatively, pivots can be transformed into world space using the object’s world matrix, enabling computations based on world space vertex positions.
The image below illustrates these two scenarios: at the top, rotation around the world X axis is performed in world space, which requires the pivots stored in UVs to be transformed from local to world space. At the bottom, rotation around the world X axis is performed in local space, relative to the object, which requires the world X axis to be transformed into local space and then normalized to eliminate the effect of the object's scale.
The process of baking data is, for the most part, quite straightforward, but before diving into it, let’s quickly go over the common options and panels.
A panel called 'Mesh' lets you customize several options related to generating & exporting the baked mesh.
- Origin: Optional object to use as the baking origin instead of the world origin. It takes into account the object's location, rotation, and scale, which may lead to unexpected results. For this reason, it's considered experimental, but it might be useful in rare cases.
- Scale: Scale applied during baking (e.g. meters to centimeters)
- Invert X/Y/Z: Invert the world X/Y/Z axis (Y set to True for Unreal Engine compatibility)
- Name: Name of the baked object
- Materials: Enable to copy materials
- Merge: Enable merging of the duplicated selection once baking is complete. Otherwise, keep them separated to allow for additional bakes on the individual objects
- Clear Attributes: Enable this option to remove face corner attributes that store the raw vertex data for each layer. These attributes are named using each layer's unique ID, created and used internally during baking, and are unlikely to be useful after the bake is complete
- Duplicate: Enable this option to preserve the original selection and bake data on the duplicated mesh. Disable it at your own risk—doing so will modify the selection, which may lead to unwanted changes to the source data and unpredictable bake results if data blocks are shared
- Single User: If the selection isn't duplicated, the bake may not work as expected when data blocks are shared. This ensures that meshes are made 'single user' to prevent conflicts during the baking process
- UV|Name: Name of the UVMap to be created or used for baking mesh UVs
- UV|Invert V: Invert the V axis of the UVMap. Typically True for exporting to Unreal Engine or DirectX apps, False for Unity or OpenGL apps
- Export: Enable to export the baked mesh to an FBX file upon bake completion. Only available if the Blender file is saved
- Export|Name: Name for the exported FBX file (without the .fbx extension). A placeholder tag can be used in the name and will be replaced with the object's name
- Export|Path: File path for the exported FBX, excluding the file name. The path is relative to the Blender file
- Export|Advanced|Override: Enable to override any existing .fbx file
A panel called 'XML' lets you customize the export options for the XML file.
- Export: Enable to export an XML file containing information about the bake process (recommended)
- Export|Mode: Select how the XML file name and path are generated
- Mesh Path: Use the same FBX file name and path for the XML file. Defaults to 'Custom' if mesh is not exported
- Custom Path: Specify a custom XML file name and path
- Export|Filename: Name for the exported XML file (without the .xml extension)
- Export|Path: Path for the exported XML file, excluding the file name. The path is relative to the Blender file
- Export|Override: Enable to override any existing .xml file
The Data Baker tool is a complete rewrite of my old, now deprecated 'Data Baker Addon'. It introduces a new paradigm I called Data Layers. Users are now completely free to pack any kind of data into any vertex attribute using layers.
For example, you might create three layers to store an object’s XYZ positions in UV maps, while packing the object's forward XYZ axis into the Vertex Color RGB channels, and a linear mask into the Alpha channel.
While this approach offers a high degree of flexibility, it also introduces more opportunities for conflicts. For instance, two data layers might target the same UV map and channel. The tool is designed to detect such conflicts, which will be indicated by a warning icon next to the affected layer’s name.
If you're unsure what causes such a warning, simply attempt a bake — the tool will display a helpful message pointing out any issues.
Note
This method has its pros and cons. While it offers much greater flexibility, managing and configuring multiple layers may require a bit more clicking than before. To streamline your workflow, consider using the presets feature to save and reuse your commonly used bake setups.
Additionally, the tool is designed to create new layers in an intelligent way. When creating a new layer, it will automatically set certain settings based on the currently selected layer. For example, a Position X layer stored in UVMap 1, U channel, will prompt the creation of a Position Y layer stored in UVMap 1, V channel.
What’s most exciting about this new layer paradigm is the ability to bit-pack any data. Simply select the XY, XYZ or Fraction storage mode and target a layer.
For example, the XYZ components of an object’s position can be bit-packed into a single UV channel using the 'UV - XYZ' storage mode, while the XYZ components of its forward, right or up axis can be packed into another—assuming you can manage the precision loss, afford the minimal unpacking cost, and use 32-bit UVs.
When selecting a storage mode that allows bit-packing, two new buttons will appear to the right of the data layer list, allowing you to change which layer is targeted. This is indicated by a right arrow in front of the targeted layer.
You'll also be able to specify which 'virtual component' the selected layer will use for storage. If you're packing two layers using the 'XY' method, the layer that targets another can decide whether to store its data in the X or Y component, leaving the other component for the targeted layer.
Note
XY or XYZ packing has nothing to do with the X, Y, or Z axes, or even with the X, Y, or Z components of a vector. It’s simply a way of saying that two or three float values—labeled X, Y, and Z—are packed into a single float using a bit-packing method that depends on knowing where each value is encoded within the 32 bits that make up the float. The value labeled as the ‘X’ component might just as well contain a Z position.
Important
Bit-packing one layer into another stored in Vertex Colors or Normals is not allowed. The 8-bit precision of Vertex Colors is insufficient for effective bit-packing, and the automatic normalization of normals can corrupt the stored bits. Additionally, packing a layer into one that is already packed is not permitted, nor can a layer be packed into itself. The tool is designed to prevent these invalid operations or, at the very least, to warn you when conflicts are detected.
Warning
Bit-packing data in the V channel of a UV map is unsupported yet allowed. This may sound counterintuitive, but the reasoning is solid. Unreal Engine's FBX importer assumes the mesh comes from an OpenGL-oriented application (like Blender, Maya, or 3ds Max) and therefore applies a hardcoded one-minus operation to the V channel to conform to DirectX conventions.
This behavior can be found in the FbxStaticMeshImport.cpp file, line 758:

```cpp
UVCoordinates[UVID] = FVector2f(static_cast<float>(UVVector[0]), 1.0f - static_cast<float>(UVVector[1])); // flip the Y of UVs for DirectX
```
This operation is applied to the numerical float representation, which, when used for bit-packing arbitrary data, isn't meant to represent a meaningful value for mathematical operations. As a result, the float bits are irreversibly scrambled, and even applying an additional one-minus operation to undo the importer’s hardcoded one-minus won't restore the original bits. This effectively corrupts the data, making bit-packing in the V channel unusable.
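The effect is easy to reproduce outside the engine. The sketch below uses an arbitrary stand-in bit pattern: it reinterprets 32 bits as a float, applies the importer's one-minus at float32 precision, then attempts to undo it; the original bits are unrecoverable because the tiny float was entirely absorbed by the subtraction from 1.0:

```python
import struct

def float_bits(f):
    # Reinterpret a float32 as its raw 32-bit pattern
    return struct.unpack("<I", struct.pack("<f", f))[0]

def to_f32(x):
    # Round a Python double down to float32 precision
    return struct.unpack("<f", struct.pack("<f", x))[0]

# Arbitrary bit pattern standing in for bit-packed data (~5.7e-28 as a float)
original_bits = 0x12345678
packed = struct.unpack("<f", struct.pack("<I", original_bits))[0]

after_import = to_f32(1.0 - packed)      # the importer's hardcoded one-minus
after_undo = to_f32(1.0 - after_import)  # attempted reversal in the shader

print(hex(float_bits(after_undo)))  # 0x0, not 0x12345678
```

Depending on the bit pattern, the damage ranges from total loss (as here) to subtle rounding corruption, but it is never reliably reversible.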
That said, it's still allowed because game engines or applications that don’t perform this one-minus transformation can handle bit-packed data in the V channel without issue.
Working around this is technically feasible, but not realistically implementable within a Blender Python addon. Deathrey suggested using a LUT (lookup table) to find the 32-bit values that would produce the intended bit pattern after the one-minus operation is applied. However, the required LUT size comes out to approximately 34GB, which makes sharing and parsing it in Python an almost insurmountable challenge.
Another solution would be to submit a pull request to Epic Games, asking them to add an option to the FBX importer that disables the one-minus operation, preserving the original bits. But this is unlikely to be a viable or accepted solution.
A third option would be to revert to a numerically based packing method—which survives the one-minus operation—for storing data into a single float, as I did in my earlier implementation. However, this approach was initially discarded due to the poor precision these methods tend to offer.
Truth be told, storing data in textures—as done with the Object Attributes tool—is likely a superior method for handling arbitrary data at this point. That said, bit-packing is still perfectly viable in the U channel, which should open the door to some interesting possibilities.
The bake should produce a new mesh object that contains the data baked in vertex attributes: UVs, Vertex Colors and/or Normals.
Once the bake is complete, it's a good idea to double-check that the correct vertex attributes were created or modified on the generated mesh. While it may be difficult—or even impossible—to debug specific values when the bake involves remapping, bit-packing, or differences in world and UV coordinate axes, it's still worth verifying that the expected number of UV maps were generated, Vertex Color channels created, and/or custom normals adjusted.
Simply import the .fbx into your game engine. In Unreal Engine, you may want to check the following import options.
Note
If it wasn’t automatically exported, you’ll need to manually export the generated mesh to .fbx. Consider using the following export settings
Once a bake is attempted, a report panel will appear in the Data Baker panel, providing valuable insights into the baked information.
- Export: Exports the report to an XML file following the XML export settings.
- Clear: Clears the report. This may be useful because the report holds pointers to the baked object(s), which keeps them alive in the Blender file even after they are deleted.
This panel provides global information about the bake that won’t be covered in detail in this documentation to avoid unnecessary clutter, as most of the information should be self-explanatory. However, there are a few things worth mentioning.
Each baked layer will be listed, and when bit-packing is used, the layer will indicate which other layer it has packed. The 'report' panel will also display the min/max range to use when remapping the values back to their initial range after they have been unpacked with the adequate algorithm.
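Remapping back to the original range is a simple linear operation. A sketch, assuming the reported range is the min/max pair shown in the panel (function and parameter names are hypothetical):

```python
def remap_from_report(value, range_min, range_max):
    # Expand a normalized [0, 1] baked value back to its original range
    # using the min/max pair reported by the tool
    return value * (range_max - range_min) + range_min

print(remap_from_report(0.5, -10.0, 30.0))  # 10.0
```

The same expression, with the min/max fed in as material parameters, is what you would evaluate in the vertex shader after unpacking.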
Many things can go south when baking arbitrary data into UVs, Vertex Colors or Normals. Here are some things to look out for.
- Double-check that the differences between Blender’s coordinate systems and those of your target application have been properly accounted for:
- Blender’s world axes are +X+Y+Z, while Unreal Engine’s world axes are +X-Y+Z.
- Blender uses OpenGL, meaning UV(0,0) is at the bottom-left corner, whereas Unreal Engine uses DirectX, where UV(0,0) is at the top-left corner.
- Blender’s default unit is 1 meter (though it can be adjusted), while Unreal Engine’s default unit is 1 centimeter, so a scale factor may need to be applied.
- Try 32-bit UVs in your target application if you suspect the issue you're experiencing is related to 16-bit UV imprecision.
- Double-check that the vertex attributes were created as expected: UVs, vertex colors, normals.
- Bit-packed data can't be read as-is and needs to be unpacked with the appropriate algorithm, using the right min/max values, as reported by the tool.
- Ensure UVs are not overridden in your target application by lightmap UVs or any other process automated during mesh import.
- Disable any geometry-related optimization features in your game engine initially. This includes virtualized geometry systems like UE's Nanite, which may need to be deactivated both project-wide and on a per-asset basis.
- Triangulation during export may unexpectedly recompute vertex normals, something to always keep in mind if baking data into normals.
- Make sure you fully understand the implications of working with data baked in formats of varying precision. For example, 8-bit RGBA—such as Vertex Color—offers limited precision and requires remapping to be used accurately.
- The pixel shader interpolates vertex attributes, which can scramble bits and corrupt bit-packed data. To avoid this, always unpack such data in the vertex shader.
- Double check your export/import settings
- ...
The Data Baker tool's configuration and settings can be saved as a preset, which can be easily applied with a single click at any time. Presets can be added or removed using the button located to the right of the Data Baker tool header.
This feature uses Blender's internal preset system, which creates Python files to store all the necessary data. These files are stored in the following directories:
- Windows: C:\Users\<your username>\AppData\Roaming\Blender Foundation\Blender\<version number>\scripts\presets\operator\databaker_data\
- macOS: ~/Library/Application Support/Blender/<version number>/scripts/presets/operator/databaker_data/
- Linux: ~/.config/blender/<version number>/scripts/presets/operator/databaker_data/
Warning
Preset .py files can be copied, pasted, and shared across different computers. However, only install presets from trusted sources, as these files can execute malicious Python code on your machine.
Here are a few words about the underlying Data Baker implementation.
You can add new properties to the `DATABAKER_PG_SettingsPropertyGroup` class located in the `Properties.py` file. This class stores global settings for the tool.
```python
class DATABAKER_PG_SettingsPropertyGroup(PropertyGroup):
    ...
    data_layers: CollectionProperty(type=DATABAKER_PG_DataLayerPropertyGroup, description="List of data layers")
    data_layers_selected_index: IntProperty(name="", default=0, description="Selected data layer")

    mesh_name: StringProperty(name="Name", default="BakedMesh.DATA", description="Name of the resulting baked mesh")
    mesh_target_prop: StringProperty(name="Property", default="BakeSource", description="Custom property name for duplicated objects to be able to point to their original objects")
    ...
```
Take note of the `data_layers` `CollectionProperty`, which references the `DATABAKER_PG_DataLayerPropertyGroup` class. This class defines all properties that can be used to describe a data layer.
```python
class DATABAKER_PG_DataLayerPropertyGroup(PropertyGroup):
    ...
    ID: StringProperty(name="ID", default="", description="")
    ptr_ID: StringProperty(name="Ptr", default="", description="")

    datas = [
        ("POSITION", "Position", "X/Y/Z component of the object's position"),
        ("AXIS", "Axis", "X/Y/Z component of the object's forward/right/up vector"),
        ("SHAPEKEY", "Shape key", "X/Y/Z offset/normal of the object's shapekey"),
        ("MASK", "Mask", "Linear/Spherical mask"),
        ("RANDOM", "Random", "Seeded random value per collection/object/face"),
        ("PARENT_POS", "Parent Position", "X/Y/Z component of the object's parent position"),
        ("PARENT_AXIS", "Parent Axis", "X/Y/Z component of the object's parent forward/right/up vector"),
        ("VALUE", "Value", "Fixed value"),
        ("CUSTOM_PROP", "Custom Property", "Object's Float/Integer custom property"),
        ("FRAME", "Frame", "Vertex offset/normal of the object's vertices at a given frame based on the current frame (vertex count/order must be maintained)"),
    ]
    data: EnumProperty(name="Data", items=datas, default="POSITION", description="Value to bake")

    component_x_y_z = [
        ("X", "X", "X-axis"),
        ("Y", "Y", "Y-axis"),
        ("Z", "Z", "Z-axis")
    ]
    component: EnumProperty(name="Component", items=component_x_y_z, default="X", description="Component to bake")
    ...
```
You're free to add your own properties if needed, but keep in mind that existing ones are meant to be reused as much as possible. For example, the `name: StringProperty` property can describe a shapekey’s name when the data layer is in shapekey mode, or represent something else in a different mode. Speaking of modes, you'll likely need to add a new entry to the `datas` enum list inside the `DATABAKER_PG_DataLayerPropertyGroup` to support your new bake mode.
```python
datas = [
    ("POSITION", "Position", "X/Y/Z component of the object's position"),
    ("AXIS", "Axis", "X/Y/Z component of the object's forward/right/up vector"),
    ("SHAPEKEY", "Shape key", "X/Y/Z offset/normal of the object's shapekey"),
    ("MASK", "Mask", "Linear/Spherical mask"),
    ("RANDOM", "Random", "Seeded random value per collection/object/face"),
    ("PARENT_POS", "Parent Position", "X/Y/Z component of the object's parent position"),
    ("PARENT_AXIS", "Parent Axis", "X/Y/Z component of the object's parent forward/right/up vector"),
    ("VALUE", "Value", "Fixed value"),
    ("CUSTOM_PROP", "Custom Property", "Object's Float/Integer custom property"),
    ("FRAME", "Frame", "Vertex offset/normal of the object's vertices at a given frame based on the current frame (vertex count/order must be maintained)"),
]
```
Still in the `Properties.py` file, there's a property group called `DATABAKER_PG_ReportPropertyGroup`, which defines all the properties that must be saved after a bake. It includes the same `data_layers` list as in the `DATABAKER_PG_SettingsPropertyGroup`, acting as a direct duplicate.
```python
class DATABAKER_PG_ReportPropertyGroup(PropertyGroup):
    ...
    data_layers: CollectionProperty(type=DATABAKER_PG_DataLayerReportPropertyGroup, description="")
    data_layers_selected_index: IntProperty(name="", default=0, description="")

    baked: BoolProperty(name="Baked", default=False, description="")
    success: BoolProperty(name="Success", default=False, description="")
    msg: StringProperty(name="Message", default="", description="")

    name: StringProperty(name="Name", default="", description="")
    ID: StringProperty(name="ID", default="", description="")
    ...
```
Next, any new property must be made visible to the user. This is handled in the `Panels.py` file. Properties added to the `DATABAKER_PG_DataLayerPropertyGroup` can be displayed in the `DATABAKER_PT_DataBaker` panel. From there, the currently selected data layer can be accessed, and the UI updated accordingly based on the configuration (position, shapekey, your new mode, etc.).
```python
class DATABAKER_PT_DataBaker(bpy.types.Panel):
    bl_idname = "DATABAKER_PT_databakerpanel"
    bl_label = "Data Baker"
    bl_space_type = 'VIEW_3D'
    bl_region_type = 'UI'
    bl_category = "Game Tools"
    bl_order = 0
    bl_options = {'DEFAULT_CLOSED'}

    ...

    def draw(self, context):
        layout = self.layout
        scene = context.scene
        settings = scene.DataBakerSettings
        ...
        if settings.data_layers:
            data = settings.data_layers[settings.data_layers_selected_index]
            if data:
                panel_header, panel_body = layout.panel("position")
                if panel_header:
                    panel_header.prop(data, "data")
                if panel_body:
                    if data.data == "POSITION":
                        row = panel_body.row()
                        row.prop(data, "component")
                    elif data.data == "AXIS":
                        ...
```
Properties added to the `DATABAKER_PG_SettingsPropertyGroup` can be shown wherever appropriate. For example, the `DATABAKER_PT_MeshMainPanel` is a subpanel of the main `DATABAKER_PT_DataBaker` panel and displays global settings related to the mesh.
```python
class DATABAKER_PT_MeshMainPanel(bpy.types.Panel):
    bl_idname = "DATABAKER_PT_meshmainpanel"
    bl_parent_id = "DATABAKER_PT_databakerpanel"
    bl_label = "Mesh"
    bl_space_type = 'VIEW_3D'
    bl_region_type = 'UI'
    bl_category = "Game Tools"
    bl_order = 1
    bl_options = {'DEFAULT_CLOSED'}

    def draw(self, context):
        layout = self.layout
        scene = context.scene
        settings = scene.DataBakerSettings

        row = layout.row()
        row.prop(settings, "origin_obj")
        row = layout.row()
        row.prop(settings, "scale")
        ...
```
Newly added settings must also be included in preset files. This takes place in the `DATABAKER_OT_DataBaker_AddPreset` operator, located in the `Operators.py` file.
```python
class DATABAKER_OT_DataBaker_AddPreset(AddPresetBase, bpy.types.Operator):
    bl_idname = 'databaker_databakerpanel.addpreset'
    bl_label = 'Add preset'
    preset_menu = 'DATABAKER_MT_DataBaker_Presets'

    preset_defines = ['settings = bpy.context.scene.DataBakerSettings']

    preset_values = [
        'settings.data_layers',
        'settings.data_layers_selected_index',
        'settings.mesh_name',
        ...
```
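Since `preset_values` is a plain list of property paths, saving a new setting to presets only takes one more entry. A minimal sketch (`settings.my_new_prop` is a hypothetical new setting, not part of the add-on):

```python
# Sketch: preset_values is a plain list of property paths; a hypothetical
# new setting only needs its path appended so presets save/restore it.
preset_values = [
    'settings.data_layers',
    'settings.data_layers_selected_index',
    'settings.mesh_name',
]
preset_values.append('settings.my_new_prop')  # hypothetical new setting
```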
Properties added to the `DATABAKER_PG_ReportPropertyGroup` can be shown in one of the 'Report' panels. You may create your own if needed. The `DATABAKER_PT_ReportPanel` is a good reference: it displays general bake information.
```python
class DATABAKER_PT_ReportPanel(bpy.types.Panel):
    bl_idname = "DATABAKER_PT_reportpanel"
    bl_parent_id = "DATABAKER_PT_databakerpanel"
    bl_label = "Report"
    bl_space_type = 'VIEW_3D'
    bl_region_type = 'UI'
    bl_category = "Game Tools"
    bl_order = 500
    bl_options = {'DEFAULT_CLOSED'}
    ...

    def draw(self, context):
        layout = self.layout
        scene = context.scene
        report = scene.DataBakerReport
        ...
        row = layout.row()
        if report.success:
            row.label(text="Success", icon="CHECKMARK")
        else:
            row.label(text="Fail", icon="ERROR")

        row = layout.row()
        row.prop(report, "ID")

        if not report.success:
            row = layout.row()
            row.label(text=report.msg)

        row = layout.row()
        row.label(text=report.name)
```
Important: any property added to the `DATABAKER_PG_ReportPropertyGroup` must be reset at the start of a new bake. This happens inside the `reset_bake_report()` function in the `Functions.py` file.
```python
def reset_bake_report():
    report = bpy.context.scene.DataBakerReport

    report.data_layers.clear()
    report.data_layers_selected_index = 0

    report.baked = False
    report.success = False
    report.msg = ""
    report.name = ""
    report.ID = ""
    ...
```
The `new_bake_report()` function calls this reset function and sets global values that are easily retrievable from the settings.
```python
def new_bake_report(context: bpy.types.Context):
    settings = context.scene.DataBakerSettings

    reset_bake_report()

    add_bake_report("baked", True)
    add_bake_report("ID", uuid.uuid4().hex)
    ...
```
To set values during the main bake process, use the `add_bake_report()` function.
```python
def add_bake_report(prop_name: str, prop_value: float|int|str):
    setattr(bpy.context.scene.DataBakerReport, prop_name, prop_value)
```
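Because this is just a `setattr()` call, the pattern is easy to exercise outside Blender. A minimal stand-in sketch (the `Report` class replaces the real property group, which the actual function fetches from `bpy.context`):

```python
# Stand-in sketch (no bpy): the report is any object with named attributes;
# setattr() writes whichever property a bake step wants to record.
class Report:
    success = False
    msg = ""

def add_bake_report(report, prop_name, prop_value):
    setattr(report, prop_name, prop_value)

report = Report()
add_bake_report(report, "success", True)
add_bake_report(report, "msg", "baked 2 layers")
```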
There are also utility methods to add, retrieve, and update properties of the data layers stored in the `DATABAKER_PG_ReportPropertyGroup`.
```python
def add_bake_layer_report(data_layer, packing, pack_range=None):
def clear_bake_layer_report(data_layer) -> bool:
def edit_bake_layer_report_range_min(data_layer, prop_value: mathutils.Vector = mathutils.Vector((0.0, 0.0, 0.0))) -> bool:
def edit_bake_layer_report_range_max(data_layer, prop_value: mathutils.Vector = mathutils.Vector((0.0, 0.0, 0.0))) -> bool:
def edit_bake_layer_report_range(data_layer, value, prop_name: str = "range_min") -> bool:
def get_bake_layer_report_range_min(data_layer) -> float:
def get_bake_layer_report_range_max(data_layer) -> float:
def get_bake_layer_report_range_valid(data_layer) -> bool:
def get_bake_layer_report_range(data_layer, prop: str = "min") -> float:
```
Properties stored in the `DATABAKER_PG_ReportPropertyGroup` are meant to be important information exposed to the user and exported to XML. The `export_xml()` function handles formatting the XML output using the properties defined in that property group.
```python
def export_xml(context: bpy.types.Context) -> tuple[bool, str, str]:
    settings = context.scene.DataBakerSettings
    report = context.scene.DataBakerReport

    root = ET.Element("BakedData",
                      type="Data",
                      ID=report.ID,
                      version="1.0")

    # unit
    unit_el = ET.SubElement(root, "Unit",
                            system=report.unit_system,
                            unit=str(report.unit_unit),
                            length=str(report.unit_length),
                            scale=str(report.unit_scale),
                            invert_x=str(report.unit_invert_x),
                            invert_y=str(report.unit_invert_y),
                            invert_z=str(report.unit_invert_z))
    ...
```
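The `ElementTree` part of `export_xml()` is plain standard-library code, so the output shape can be previewed outside Blender. A minimal sketch with hardcoded stand-in values (the real function reads these from the report property group):

```python
import xml.etree.ElementTree as ET

# Build the same element shape as export_xml(), with made-up values
root = ET.Element("BakedData", type="Data", ID="abc123", version="1.0")
ET.SubElement(root, "Unit", system="METRIC", scale="1.0")

# Serialize to a string, as would be written to the exported XML file
xml_str = ET.tostring(root, encoding="unicode")
print(xml_str)
```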
Now, let’s talk about the core of the algorithm: the `bake()` function.
```python
def bake(context: bpy.types.Context) -> tuple[bool, str, str]:
    bpy.ops.object.mode_set(mode="OBJECT")

    settings = context.scene.DataBakerSettings
    new_bake_report(context)

    wm = bpy.context.window_manager
    wm.progress_begin(0, 99)

    #############
    # BAKE INFO #
    bake_start_time = time.time()
    ...

    ########
    # BAKE #
    success, msg = bake_data_layers(context, layers_info, meshes, empties)
    if not success:
        add_bake_report("success", False)
        add_bake_report("msg", msg)
        return (False, 'ERROR', msg)
    ...
```
It handles various preprocessing steps before calling the main baking function, `bake_data_layers(context, layers_info, meshes, empties)`.
```python
def bake_data_layers(context, layers_info, meshes, empties) -> tuple[bool, str]:
    settings = context.scene.DataBakerSettings

    data_layers_uvs = []
    data_layers_vcols = []
    data_layers_normals = []

    # pre bake
    for data_layer, layer_info in layers_info:
        to_bake, packing_mode, packing = layer_info
        if not to_bake:
            continue

        data_layer_min = [0.0, 0.0, 0.0]
        data_layer_max = [0.0, 0.0, 0.0]
        for layer_packed_index, layer_packed in enumerate(packing):
            if layer_packed:
                pre_bake_func = get_data_layer_pre_bake_function(layer_packed)
                ...

    # bake
    bake_data_layer_uv(context, meshes, data_layers_uvs)
    bake_data_layer_vcol(context, meshes, data_layers_vcols)
    bake_data_layer_normal(context, meshes, data_layers_normals)

    return (True, "")
```
For each data layer to bake, this function calls `get_data_layer_pre_bake_function()`, which returns the correct pre-bake function based on the data layer’s configuration.
```python
def get_data_layer_pre_bake_function(data_layer: object):
    if data_layer:
        if data_layer.data == "POSITION":
            return pre_bake_position
        elif data_layer.data == "AXIS":
            return pre_bake_axis
        ...
    return pre_bake_zeros
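The dispatch itself is plain Python, so the fallback behavior is easy to verify outside Blender. A stand-in sketch with dummy pre-bake functions (dict-based layers replace the real property group):

```python
# Stand-in sketch (no bpy): dispatch on the layer's 'data' value and fall
# back to pre_bake_zeros for anything unknown, mirroring the code above.
def pre_bake_position(*args): return (0.0, 0.0)
def pre_bake_axis(*args): return (0.0, 0.0)
def pre_bake_zeros(*args): return (0.0, 0.0)

def get_pre_bake_function(data_layer):
    if data_layer:
        if data_layer.get("data") == "POSITION":
            return pre_bake_position
        elif data_layer.get("data") == "AXIS":
            return pre_bake_axis
    return pre_bake_zeros
```

A new mode only needs one more `elif` branch returning its pre-bake function.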
These sub-functions are defined in `Functions.py`, and the `pre_bake_zeros()` function serves both as the default fallback and as a documented template. You're invited to copy it to create your own pre-bake function. This function is responsible for storing the data to bake, as-is, into face-corner attributes on the mesh, and it returns the min/max range.
```python
def pre_bake_zeros(context: bpy.types.Context, data_layer: object, meshes: list, empties: list) -> tuple[float, float]:
    settings = context.scene.DataBakerSettings
    custom_prop = settings.mesh_target_prop if settings.mesh_target_prop != "" else "BakeSource"

    signed_axis = mathutils.Vector((-1.0 if settings.invert_x else 1.0,
                                    -1.0 if settings.invert_y else 1.0,
                                    -1.0 if settings.invert_z else 1.0))
    signed_scale = signed_axis * settings.scale

    dgraph = bpy.context.evaluated_depsgraph_get()

    bake_min = 0.0
    bake_max = 0.0
    for mesh in meshes:
        mesh_source = mesh.get(custom_prop, mesh)
        #obj_eval = mesh_source.evaluated_get(dgraph)
        #mesh_eval = obj_eval.to_mesh(preserve_all_data_layers=True, depsgraph=dgraph)
        #mesh_eval.transform(obj_eval.matrix_world)
        ...

        # bunch of zeros!
        data_to_bake = 0.0
        data_loop_ids = [data_to_bake] * len(mesh.data.loops)

        if data_layer.ID not in mesh.data.attributes:
            mesh.data.attributes.new(name=data_layer.ID, type='FLOAT', domain='CORNER')
        attr = mesh.data.attributes[data_layer.ID]
        attr.data.foreach_set('value', data_loop_ids)
        ...

    return bake_min, bake_max
```
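The `bpy`-specific parts aside, the contract of a pre-bake function is simple: produce one float per face corner (loop) and report the min/max across all meshes. That core loop can be sketched without Blender (the per-loop values below are made-up stand-ins; `pre_bake_zeros()` initializes its range to 0.0 instead of +/- infinity since it only ever writes zeros):

```python
# Stand-in sketch (no bpy): a pre-bake pass produces one float per loop
# and tracks the overall min/max, which the later bake stage can use to
# normalize/pack the data into UVs or vertex colors.
def pre_bake_values(meshes_loops):
    bake_min = float("inf")
    bake_max = float("-inf")
    baked = []
    for loop_values in meshes_loops:
        baked.append(loop_values)  # real code writes via attr.data.foreach_set
        bake_min = min(bake_min, min(loop_values))
        bake_max = max(bake_max, max(loop_values))
    return baked, bake_min, bake_max

# two meshes with made-up per-loop values
_, lo, hi = pre_bake_values([[0.2, 0.5, 0.1], [0.9, 0.3]])
print(lo, hi)  # 0.1 0.9
```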
It’s called a "pre-bake" function because it doesn't actually bake the data into the final format (like UVs or vertex colors) yet. This design simplifies things: it allows multiple layers to be baked together later on using bit-packing techniques, and it makes the add-on easier to extend. This later-stage processing is handled within the `bake_data_layers()` function, and you generally don’t need to worry about it.
Finally, the data layer's friendly name is computed in the `get_data_layer_name()` function for display in the layer list.
```python
def get_data_layer_name(data_layer: object) -> str:
    if data_layer:
        if data_layer.data == "POSITION":
            prefix = "Parent Position " if data_layer.obj_mode == "PARENT" else "Position "
            return prefix + data_layer.component
        elif data_layer.data == "AXIS":
            prefix = "Parent Axis " if data_layer.obj_mode == "PARENT" else "Axis "
            return prefix + data_layer.component
        elif data_layer.data == "SHAPEKEY":
            prefix = "Parent Shapekey " if data_layer.obj_mode == "PARENT" else "Shapekey "
            if data_layer.vertex_mode == "OFFSET":
                return prefix + "Offset " + data_layer.component
            elif data_layer.vertex_mode == "NORMAL":
                return prefix + "Normal " + data_layer.component
            else:
                pass
        ...
    return "UNKNOWN"
```
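A new mode needs its own branch here too, or it will show up as "UNKNOWN" in the layer list. A stand-in sketch (no `bpy`; the `"HEIGHT"` mode is hypothetical, and dicts replace the real data layer properties):

```python
# Stand-in sketch (no bpy): how a hypothetical "HEIGHT" mode could be
# named, mirroring the branching of get_data_layer_name() above.
def get_layer_name(data_layer: dict) -> str:
    if data_layer:
        if data_layer.get("data") == "HEIGHT":  # hypothetical new mode
            return "Height " + data_layer.get("component", "")
        ...
    return "UNKNOWN"

print(get_layer_name({"data": "HEIGHT", "component": "X"}))  # Height X
```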
Summary:
- Add properties to the appropriate property groups: settings, data layer, or report.
- Make sure these properties are displayed to the user in the appropriate panels.
- Important properties in the report group must be reset at the start of a bake and likely need to be included in the XML export.
- Add your new mode to the `datas` enum list.
- Update `get_data_layer_pre_bake_function()` to return your new pre-bake function for that mode.
- Create a new pre-bake function that writes your data to face-corner attributes on the mesh, following the guidelines explained in the `pre_bake_zeros()` function.
Voilà! Have fun iterating on the tool and feel free to send PRs!