particle_system_design - ryzom/ryzomcore GitHub Wiki


---
title: Particle System Technical Design (2001)
description: Technical design document for the NeL particle system — features, pipeline, system components, forces, collision zones, lights, and UML class model
published: true
date: 2026-03-16T00:00:00.000Z
tags: 
editor: markdown
dateCreated: 2026-03-16T00:00:00.000Z
---

From "en 2001 3d particle system technical design" — an internal Nevrax design document for the NeL particle system. {.is-info}

1. Features We Want to Support

  • Support for atmospheric phenomena such as rain, snow and so on. These effects must support collision detection with the background and/or with dynamic objects when needed; at the very least, rain or snow should not fall where it shouldn't.
  • Support for meshes as particles. Small objects such as rocks could then be used.
  • Support for various particle representations, including dot, line, triangle fans, bitmaps, shockwave, ripples.
  • Additional visual artefacts could be used, such as lens flares.
  • A wide range of pyrotechnic effects should be available. This includes at least fireworks and spell effects, so the system must be able to deal with particle splitting, smoke generation, smoke lighting and everything else needed to make these visual effects attractive.
  • Forces can be applied to particles.
  • Support for collision volumes or collision meshes when needed.
  • These volumes should also enable particle behaviour control: bouncing, acting as a sink for particles...
  • Forces such as wind should affect all systems at the same time (if they're flagged for that).
  • A variety of emitters should be available: mesh as an emitter (for example, when a character goes out of water, his clothes could be wet and let some drops fall), conic, planar, point and spherical emitters.
  • Emitter frequency should be tuneable.
  • Some particles should be able to cast and/or receive light.
  • The system should be able to handle event notifications (collisions, emissions etc.).

2. Pipeline for the Particle System

2.1. Overview

@startuml
skinparam rectangleBorderColor black
skinparam rectangleBackgroundColor white
skinparam arrowColor black

rectangle "System at T" as A
rectangle "Particles emission" as B
rectangle "Position integration and\ncollisions detection" as C
rectangle "Forces computation" as D
rectangle "Speed integration" as E
rectangle "Attributes update" as F
rectangle "Particles Rendering" as G
rectangle "System at T+ΔT" as H

A -down-> B
B -down-> C
C -down-> D
D -down-> E
E -down-> F
F -down-> G
G -down-> H
@enduml

2.1.1. The Motion Part

  • It deals with all the parts of the motion: creating new particles, and moving the existing ones.

Particles in a system can be locked to the system basis or not. When a component expressed in the world basis is applied to an object in the system basis, it must first be converted into that basis, and reciprocally. Example: gravity (world basis) applied to a particle in the system basis. Likewise, when world-basis particles are generated by system-basis emitters, a transformation must be performed.
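The basis conversion described above can be sketched as follows. This is a minimal standalone illustration (NeL itself provides NLMISC::CMatrix and NLMISC::CVector for this); the types and function names here are hypothetical, and the system basis is assumed to be a pure rotation, so its inverse is its transpose:

```cpp
#include <cassert>
#include <cmath>

// Minimal 3D vector for the sketch (NeL would use NLMISC::CVector).
struct CVector { float x, y, z; };

// Orthonormal 3x3 rotation, row-major: world = M * system.
struct CMatrix3 { float m[3][3]; };

// Convert a system-basis vector into the world basis.
CVector systemToWorld(const CMatrix3 &M, const CVector &v)
{
    return { M.m[0][0]*v.x + M.m[0][1]*v.y + M.m[0][2]*v.z,
             M.m[1][0]*v.x + M.m[1][1]*v.y + M.m[1][2]*v.z,
             M.m[2][0]*v.x + M.m[2][1]*v.y + M.m[2][2]*v.z };
}

// Convert a world-basis vector (e.g. gravity) into the system basis.
// For an orthonormal matrix the inverse is the transpose.
CVector worldToSystem(const CMatrix3 &M, const CVector &v)
{
    return { M.m[0][0]*v.x + M.m[1][0]*v.y + M.m[2][0]*v.z,
             M.m[0][1]*v.x + M.m[1][1]*v.y + M.m[2][1]*v.z,
             M.m[0][2]*v.x + M.m[1][2]*v.y + M.m[2][2]*v.z };
}
```

With a system rotated 90° about Z, gravity (0, 0, -9.81) maps to itself, while a horizontal world vector changes components, which is exactly why the conversion matters for horizontal forces such as wind.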

The motion part also deals with collisions: contacts with collision volumes, mesh volumes, background when they are needed. When a collision occurs, we can have several behaviours:

  • Bouncing
  • The particle is destroyed
  • The particle is split into several other particles. In fact, this can be achieved by binding an emitter to a particle. The emitter is activated only when the particle hits something (and thus dies — this behaviour should be removable, though).
  • We could also attach an event notification when a collision occurs. This could be useful when dealing with sound and so on.

2.1.2. Particles Rendering

This part uses the position computed by the previous step in order to display the particle. In this part, specific attributes such as colours are also updated. This topic will be dealt with specifically later.

2.2. Time Discretisation

When performing the integration of the dynamic equation, we use the Euler method with the elapsed time. This may not be very accurate at low frame rates, but it should be sufficient for the purpose of the system (visual effects).
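An explicit Euler step over the elapsed time can be sketched like this; the vector type and function names are illustrative, not NeL's actual code:

```cpp
#include <cassert>
#include <cmath>

// Minimal 3D vector for the sketch (NeL would use NLMISC::CVector).
struct CVector { float x, y, z; };

static CVector add(const CVector &a, const CVector &b) { return { a.x + b.x, a.y + b.y, a.z + b.z }; }
static CVector scale(const CVector &v, float s)        { return { v.x * s,  v.y * s,  v.z * s  }; }

// One explicit Euler step: position first (with the old speed), then speed.
// 'sumForces' is the force sum accumulated for this frame.
void eulerStep(CVector &pos, CVector &speed, const CVector &sumForces, float mass, float dt)
{
    pos   = add(pos,   scale(speed, dt));            // x(t+dt) = x(t) + dt * v(t)
    speed = add(speed, scale(sumForces, dt / mass)); // v(t+dt) = v(t) + dt/m * sum(F)
}
```

Because the position update uses the speed from the previous frame, the scheme stays cheap; the accuracy loss at large dt (low frame rates) is the trade-off mentioned above.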

3. System Components

3.1. Notion of Located

3.1.1. Description

A located is an entity that has a position, a mass, a lifetime, and a velocity vector. Forces and collision zones all operate on locateds. A located by itself does nothing: it is simply moved by forces and undergoes collisions and the like.

Each type of object starts from a located: additional features are then bound to located.

  • Particles binding gives a representation to a located. We can bind any number of particles to a located.
  • Emitter binding
  • Force binding — this gives a located the ability to cast forces. Some forces are not located (gravity...), but having a position at least lets us represent the force easily during editing.
  • Special zones binding (collision volumes etc.)
  • Light binding — this way, a located can cast some light.

Each of these objects will be hard coded, but with editable parameters.

3.1.2. Special Behaviour Flags

These flags enable some special behaviours of a located.

| Flag | Behaviour |
|------|-----------|
| Destroyed on collision | The located disappears when it collides |
| Local basis | The located lies in the basis of the particle system; any rotation of the system also rotates the particle |
| Don't collide | No tests are made with collision zones |
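These flags combine naturally into a bitmask. The sketch below is illustrative only; the enumerator and class names are assumptions, not the actual NeL identifiers:

```cpp
#include <cassert>
#include <cstdint>

// Hypothetical bitmask encoding of the behaviour flags above.
enum TPSLocatedFlag : std::uint32_t
{
    PSDestroyedOnCollision = 1u << 0, // the located disappears when it collides
    PSLocalBasis           = 1u << 1, // positions are expressed in the system basis
    PSDontCollide          = 1u << 2  // skip all collision-zone tests
};

struct CPSLocatedState
{
    std::uint32_t flags = 0;
    bool hasFlag(TPSLocatedFlag f) const { return (flags & f) != 0; }
};
```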

3.2. Particles

3.2.1. Particle Types

They give a representation to a located. A number of representations can be thought of. Here is a non-exhaustive list of what will be implemented.

  • Dot — The particle is shown as a simple dot of the desired color on screen.

  • Line — A line which starts from the previous position of the particle and reaches the current position.

  • Gradient line — Nearly the same as above, but a given number of previous positions are kept in buffer. During the drawing process, a gradient allows fading to black, starting from the particle color.

  • FaceLookAt — This kind of quad is always facing the user. It is a simple sprite which is modulated by the particle color.

  • Faces — The same as above but it also uses a basis to give a direction to the particle.

  • FanLight — This is done by drawing a "circular" triangle fan. For each vertex, the distance to center varies over time (by using a periodical function). All vertices are black except the center vertex which has the light color. A saturation effect can be obtained by using a 1D texture which contains a gradient such as black — light color — white. Colors are then replaced by texture coordinates during the drawing.

FanLight particle diagram showing a circular triangle fan with radially varying vertex distances and center-to-edge color gradient
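The per-vertex radius variation of a FanLight can be sketched as below; the function name, parameters and the choice of a sine as the periodic function are assumptions for illustration:

```cpp
#include <cassert>
#include <cmath>
#include <vector>

// Compute the distance to center for each sector vertex of a FanLight:
// the radius oscillates around baseRadius with a periodic function.
std::vector<float> fanLightRadii(int nbSectors, float baseRadius,
                                 float amplitude, float phase)
{
    const float pi = 3.14159265f;
    std::vector<float> radii(nbSectors);
    for (int i = 0; i < nbSectors; ++i)
    {
        // Periodic perturbation of the radius around the fan.
        float angle = phase + 2.f * pi * float(i) / float(nbSectors);
        radii[i] = baseRadius * (1.f + amplitude * std::sin(angle));
    }
    return radii;
}
```

Animating `phase` over time produces the flickering outline; the LOD section below suggests lowering `nbSectors` when the effect is far away.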

  • Meshes — 3D mesh objects used as particles.

  • Shockwave 1 — Shockwaves can be done simply by scaling faces with a texture on them, but this may not fit the needs: textures can look bad when scaled. A mesh as shown below may give better results (with a texture wrapping on the U coordinate, mapped entirely on each quad).

Shockwave type 1 showing a disc-like mesh with radial quads that can be scaled outward, with UV mapping wrapped per quad

  • Shockwave 2 — A shockwave can also be seen as a ring:

Shockwave type 2 showing a ring mesh that expands outward over time

  • Ribbons — Ribbons are useful for simulating remanence. A ribbon is a mesh that uses previous positions of the particle to be drawn.

Ribbon particle showing a tube-like mesh constructed from previous particle positions, with a regular polygon cross-section

The basic shape for a ribbon is usually a regular polygon with N sides. Other shapes could give interesting results. Texture mapping could be useful in effects such as fire columns or the like...
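The position history a ribbon relies on can be sketched as a small ring buffer (the gradient line above keeps a similar buffer of previous positions). The class and member names are illustrative, not NeL's actual code:

```cpp
#include <cassert>
#include <cstddef>
#include <vector>

// Minimal 3D vector for the sketch (NeL would use NLMISC::CVector).
struct CVector { float x, y, z; };

// Fixed-capacity ring buffer of a particle's previous positions.
class CPositionRibbon
{
public:
    explicit CPositionRibbon(std::size_t capacity)
        : _buf(capacity), _head(0), _size(0) {}

    // Record the particle position for the current frame,
    // dropping the oldest one when the buffer is full.
    void push(const CVector &pos)
    {
        _buf[_head] = pos;
        _head = (_head + 1) % _buf.size();
        if (_size < _buf.size()) ++_size;
    }

    std::size_t size() const { return _size; }

    // age 0 = most recent position, size() - 1 = oldest kept position.
    const CVector &previous(std::size_t age) const
    {
        return _buf[(_head + _buf.size() - 1 - age) % _buf.size()];
    }

private:
    std::vector<CVector> _buf;
    std::size_t _head, _size;
};
```

The ribbon mesh is then rebuilt each frame by sweeping the cross-section polygon along `previous(0)` ... `previous(size() - 1)`.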

  • Flat ribbons — Flat ribbons are a lot like ribbons, but are only generated by 2 vertices.

Flat ribbon particle showing a strip of quads connecting previous positions, forming a flat trail

The possibility to apply a texture can be useful in some cases. The use of EMBM could also allow interesting effects such as a filamentous aspect.

  • Lens flares — Lens flare is a simple, but beautiful effect. We may want to add a lens flare effect to a located. This also involves testing if there is something in front of the particle. Reading back from the Z-Buffer could be an option, but it may stall the rendering which is not desirable. A ray tracing test would also do the job. As it is costly, this means that not too many lens flares should be used at a time.

  • Lightning — Lightning bolts face the user and are generated in a texture.

...TO COMPLETE... {.is-danger}

3.2.2. Light Receiving

We may want some particles to receive light, for example smoke particles. This is not meaningful for every kind of particle (for example, particles that are additively blended). So each particle subclass should report whether receiving light is relevant for it; if so, light receiving can be activated.

3.3. Emitters

Emitters are particle producers. They can be attached to a located; in this design, they in fact must be. This lets us easily add motion to an emitter (it follows the located to which it is attached). Moreover, emitters can emit other emitters this way.

3.3.1. Types of Emitters

As before, emitters will be hard coded. This keeps them simple to use (especially during editing) and doesn't prevent us from creating a more general version later.

Here is a non-exhaustive list of emitter types:

  • Omnidirectional emitter
  • Conic emitter
  • Planar emitter
  • Spherical emitter
  • Mesh emitter — A mesh emitter derives from an emitter, and references a mesh. It uses it to get positions for emission of particles. For a wet character for example, it builds a database of admissible vertices (those whose normal is headed toward -Z).
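Building the mesh emitter's database of admissible vertices can be sketched as a simple filter on vertex normals; the types and function name are illustrative, not NeL's actual code:

```cpp
#include <cassert>
#include <vector>

// Minimal 3D vector for the sketch (NeL would use NLMISC::CVector).
struct CVector { float x, y, z; };

struct CMeshVertex
{
    CVector pos;
    CVector normal;
};

// Keep only the vertices whose normal is headed toward -Z (downward),
// so drops are emitted from downward-facing surface points.
std::vector<CVector> buildAdmissiblePositions(const std::vector<CMeshVertex> &verts)
{
    std::vector<CVector> result;
    for (const CMeshVertex &v : verts)
    {
        if (v.normal.z < 0.f)
            result.push_back(v.pos);
    }
    return result;
}
```

The filtered positions are computed once when the emitter is bound to the mesh, then sampled at emission time.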

3.3.2. Which Basis to Use?

Most of the time, the same basis is used to determine both the position and the direction. This basis can be chosen in one of the following ways:

  1. We work in the world basis
  2. We use the whole system basis
  3. We compute a basis from the velocity vector of the located (using the Gram-Schmidt method)
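Option 3 can be sketched as follows, using one Gram-Schmidt step to build an orthonormal basis (I, J, K) around the velocity; all names are illustrative:

```cpp
#include <cassert>
#include <cmath>

// Minimal 3D vector for the sketch (NeL would use NLMISC::CVector).
struct CVector { float x, y, z; };

static float dot(const CVector &a, const CVector &b) { return a.x*b.x + a.y*b.y + a.z*b.z; }
static CVector cross(const CVector &a, const CVector &b)
{
    return { a.y*b.z - a.z*b.y, a.z*b.x - a.x*b.z, a.x*b.y - a.y*b.x };
}
static CVector normalize(const CVector &v)
{
    float n = std::sqrt(dot(v, v));
    return { v.x / n, v.y / n, v.z / n };
}

// Build an orthonormal basis: I follows the velocity, J is a helper vector
// made orthogonal to I (Gram-Schmidt), K completes the basis.
void basisFromSpeed(const CVector &speed, CVector &I, CVector &J, CVector &K)
{
    I = normalize(speed);
    // Pick a helper vector that is not parallel to I.
    CVector h = (std::fabs(I.z) < 0.9f) ? CVector{0.f, 0.f, 1.f}
                                        : CVector{1.f, 0.f, 0.f};
    float d = dot(h, I);
    // Remove from h its component along I, then renormalize.
    J = normalize(CVector{h.x - d * I.x, h.y - d * I.y, h.z - d * I.z});
    K = cross(I, J);
}
```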

3.4. Forces

3.4.1. Forces Integration

In this system, we model objects as points, whatever their size.

This is done using the dynamic equation:

$$m \frac{d\vec{v}}{dt} = m\vec{a} = \sum \vec{F}$$

The discrete version is:

$$\Delta\vec{v} = \frac{\Delta t}{m} \sum \vec{F}$$

3.4.2. Forces Enumeration

  • Position dependent forces:

    • Gravity
    • Magnetic force
    • Electrostatic force
    • Cylindrical attractors
    • Fluid friction forces
    • Turbulent wind
  • Global forces:

    • Uniform gravity
    • Wind in some cases
    • Convection forces

3.5. Special Zones for Particle Behaviour Modelling

Their purpose is to provide some behaviour for the located such as bouncing for example. They can use the particle system basis or the world basis.

3.5.1. Zone Enumerations

  • Plane: an infinite plane
  • Rectangle: same as above, but with a finite extent
  • Disc
  • Parallelepiped
  • Sphere
  • Cylinder
  • Capped cylinder
  • Collision mesh

3.5.2. Collision Detection

It happens before the computation of the sum of forces.

At the end of the process, we've got:

(1) $\quad \vec{x}(t+\Delta t) = \vec{x}(t) + \Delta t \times \vec{v}(t)$

(2) $\quad \vec{v}(t+\Delta t) = \vec{v}(t) + \frac{\Delta t}{m} \sum{\vec{F}}$

(1) is true when no collision occurs.

Given each collision zone's implicit equation $P \in Z \Leftrightarrow Z(P) = 0$, a collision occurs during the step when:

$$\exists k \in [0,1],\; Z(\vec{x}(t) + k \times \Delta t \times \vec{v}(t)) = 0$$

In this equation the object is treated as a point.

If several zones are crossed, we must find the closest collision volume, i.e. the one with the smallest $k$.
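For a plane zone, the parametric test has a closed form. A minimal sketch, with illustrative names, where the plane is $Z(P) = \vec{n} \cdot (P - \vec{p}) = 0$:

```cpp
#include <cassert>
#include <cmath>

// Minimal 3D vector for the sketch (NeL would use NLMISC::CVector).
struct CVector { float x, y, z; };

static float dot(const CVector &a, const CVector &b) { return a.x*b.x + a.y*b.y + a.z*b.z; }

// Solve Z(x + k * dt * v) = 0 for the plane with normal n through point p.
// Returns true (and fills k) when the step crosses the plane with k in [0,1].
bool collidePlane(const CVector &x, const CVector &v, float dt,
                  const CVector &n, const CVector &p, float &k)
{
    // Displacement over the step.
    CVector d{v.x * dt, v.y * dt, v.z * dt};
    float denom = dot(n, d);
    if (std::fabs(denom) < 1e-12f)
        return false; // moving parallel to the plane: no crossing
    // Signed distance of x to the plane, divided by the approach rate.
    k = -dot(n, CVector{x.x - p.x, x.y - p.y, x.z - p.z}) / denom;
    return k >= 0.f && k <= 1.f;
}
```

Running this test against every zone and keeping the smallest valid `k` gives the closest collision volume.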

3.5.3. Zone Usage

  • Bouncers — They only affect force-driven particles (because they need a speed vector). The bouncing is done by reflecting the velocity vector about the collision surface normal.

  • Sinks — When a collision occurs, the particle is destroyed.
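The bouncer's reflection about the surface normal can be sketched as below; the elasticity parameter is an assumption for illustration (1 gives a perfect bounce, lower values damp the rebound):

```cpp
#include <cassert>
#include <cmath>

// Minimal 3D vector for the sketch (NeL would use NLMISC::CVector).
struct CVector { float x, y, z; };

static float dot(const CVector &a, const CVector &b) { return a.x*b.x + a.y*b.y + a.z*b.z; }

// Reflect the velocity about the collision surface normal.
// n must be unit length; elasticity is in [0, 1].
CVector bounce(const CVector &v, const CVector &n, float elasticity)
{
    float vn = dot(v, n);                   // component of v along the normal
    float s  = (1.f + elasticity) * vn;     // how much of it to remove
    return { v.x - s * n.x, v.y - s * n.y, v.z - s * n.z };
}
```

With elasticity 0 the normal component vanishes entirely, leaving the particle sliding along the surface.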

3.6. Lights

In some occasions, we may want some located to cast lights. For this we use a special object that can be bound to a located. Any light type must derive from a CPSLight class to perform the job. These LocatedBinders should be processed during the Light Traversal, so they need a special observer.

4. Integration in the NeL Library

4.1. Integration with the Rendering Pipeline

We will follow the MOT pattern (Model Observer Traversal).

Here the model will be a particle system... and it will be drawn during the render traversal. As the system is likely to change at every frame, data must be kept in the instance rather than in the observer.

4.2. Events Handling

The following events can be handled:

  • A particle "dies"
  • A particle is created
  • A particle collides with something

Other possibilities could be added of course.

In order to integrate as an event emitter, each particle system must be an emitter. So it must derive from NLMISC::IEventEmitter, and it must register to the event server.

4.3. LOD Balancing

LOD balancing is needed to enable smooth frame rate when a lot of effects are cast at once.

LOD can affect three aspects:

  • Geometry accuracy: when particles are drawn with some special geometry (such as FanLights), quality is degraded on the geometry (for FanLights, the number of sectors is decreased). In some cases the effect may not be drawn at all. The size of the effect must be taken into account, so this must be decided on a per-case basis.

  • Particle emissions: some kinds of emitters are designed to produce particles at a continuous, random rate. The rate of these emitters can be gracefully degraded as the system is viewed from farther away. For other emitters we can't say anything (the main particles of the effect may be instantiated only once).

  • Particle number control: deleting particles of some type as the LOD decreases doesn't seem to be a good idea, as some particles are more relevant to an effect than others. How do we choose which particles are important and which are not? Artists could of course flag some particles as deletable, but we'd rather avoid that.
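Degrading a continuous emitter's rate with distance can be sketched as a linear ramp on the emission period; the function name, parameters, and the linear ramp itself are assumptions for illustration:

```cpp
#include <cassert>
#include <algorithm>

// Stretch the emission period between a near and a far distance:
// at nearDist the period is unchanged, at farDist it is basePeriod * maxScale
// (so the emission rate drops by a factor maxScale).
float lodEmissionPeriod(float basePeriod, float dist,
                        float nearDist, float farDist, float maxScale)
{
    float t = (dist - nearDist) / (farDist - nearDist);
    t = std::max(0.f, std::min(1.f, t));              // clamp to [0, 1]
    return basePeriod * (1.f + t * (maxScale - 1.f)); // 1x near, maxScale far
}
```

This only applies to the continuous-rate emitters mentioned above; one-shot emitters keep their behaviour regardless of distance.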

5. UML Modelling

Here, we give the most important classes and relationships. All classes are located in the NL3D namespace.

Class hierarchy showing IShape/CTransformShape inheritance with CParticleSystemShape/CParticleSystemModel and CParticleSystem, connected via createInstance (deep copy)

Process and located hierarchy showing CParticleSystem containing CPSProcess instances, each containing CPSLocated with CPSLocatedBindable attachments, plus IModel and IStreamable interfaces

CPSLocatedBindable subtypes: CParticle (various particle implementations), CForce (various force implementations), CPSEmitter (various emitter implementations), CPSZone (collision volumes), and CPSLight (light casting by located)

Emitter-observer relationship: CPSEmitter has 0-n references to CPSLocated (type of located to emit), IObs is observed by CParticleSystemObs which has 0-1 reference to CParticleSystem

⚠ïļ **GitHub.com Fallback** ⚠ïļ