How it Works

Contents

  1. Module Factory
  2. Argument Parser
  3. Algorithm Block
  4. Federated Learning Block

Module Factory

The Module Factory is responsible for creating modules according to the structure defined for configuration, input handling and output formatting. A module is designed with compartments that encapsulate communication functionality and act as a facade, concealing internal simulation logic and making modules simple to construct and utilise (see Module). The Config, Handler and Output module compartments are pieced together and packaged for easy access. The Module Factory also defines module identifiers to support standalone execution and seamless mapping between command-line input and configuration.

Module Creation

Once the relevant classes for a Module have been developed, putting them together is a two-step process.

1. Module Tuple - This combines the Config, Handler and Output into a wrapper that acts as a single access point. Using NamedTuple from Python's typing module, the classes are specified and containerised. This ensures that all aspects of the module remain “typed” in external files, so the IDE can present the available properties and methods.

2. Create Module - This function instantiates the Core, Config and Handler, then combines the Config, Handler and Output into the previously specified NamedTuple. Note that the Output is instantiated within the Module Core.

from typing import NamedTuple
# SatSim, SatSimConfig, SatSimHandler and SatSimOutput are the Satellite Simulator classes.

# Step 1: Module Tuple
class SatSimModule(NamedTuple):
    config: SatSimConfig
    handler: SatSimHandler
    output: SatSimOutput

# Step 2: Create Module
def create_sat_sim_module() -> SatSimModule:
    sat_sim = SatSim()                        # module Core (instantiates its own Output)
    sat_sim_config = SatSimConfig(sat_sim)
    sat_sim_handler = SatSimHandler(sat_sim)

    return SatSimModule(sat_sim_config, sat_sim_handler, sat_sim.output)

Module Keys

These keys, stored in an Enum, are intended to match the parent JSON objects within the Modules' JSON configuration file. This streamlines calling a specific create-module function, since only a key needs to be provided, and makes it possible for command-line argument parsing to update module configuration. All that is required is that the key name be the uppercase representation of the existing JSON module options object.

from enum import Enum

class ModuleKey(Enum):
    SAT_SIM = "sat_sim"
    ALGORITHM = "algorithm"
    FL = "federated_learning"

# (Excerpt from options.json) the SAT_SIM value matches the JSON parent object
"sat_sim": {
    "start_time": "2024-09-24 11:00:00",
    "end_time": "2024-09-24 12:00:00",

Argument Parser

The Argument Parser is a hierarchy of commands and options for the CLI to configure modules, run a simulation workflow and update global system settings, as sketched in the example after the list below.

  1. Commands - The first hierarchy level consists of commands, of which there are two types: Workflow Commands and System Commands. All arguments pertaining to a particular workflow are embedded in its command.
  2. Subparsers - The next level introduces subparsers for each workflow. These are generally per module, but a workflow can also have subparsers for unique configurations, such as standalone module execution.
    • To promote modularity, module subparsers are defined independently of workflow commands, so that they can be plugged into any of them. For example, the Satellite Simulator functionality may be applicable to both FLOMPS and another workflow; in this case, a single Satellite Simulator subparser can be provided to both the FLOMPS command and other commands.
  3. Arguments - Finally, within subparsers, arguments are specified that correspond to the user-facing JSON configuration file properties.
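
The following is a minimal sketch of that hierarchy using Python's argparse; the command, subparser and argument names are illustrative assumptions rather than the repository's actual CLI.

import argparse

def add_sat_sim_arguments(parser: argparse.ArgumentParser) -> None:
    # Module subparser arguments mirror the user-facing JSON configuration properties.
    parser.add_argument("--start-time", help="Simulation start time")
    parser.add_argument("--end-time", help="Simulation end time")

def build_parser() -> argparse.ArgumentParser:
    parser = argparse.ArgumentParser(prog="cli")
    commands = parser.add_subparsers(dest="command")

    # Workflow command: all arguments for this workflow are embedded under it.
    flomps = commands.add_parser("flomps")
    flomps_modules = flomps.add_subparsers(dest="module")

    # Module subparser defined once, so it can be plugged into any workflow command.
    add_sat_sim_arguments(flomps_modules.add_parser("sat-sim"))

    # System command: updates global system settings rather than running a workflow.
    settings = commands.add_parser("settings")
    settings.add_argument("--set", nargs=2, metavar=("KEY", "VALUE"))

    return parser

args = build_parser().parse_args(["flomps", "sat-sim", "--start-time", "2024-09-24 11:00:00"])
print(args)  # Namespace(command='flomps', module='sat-sim', start_time='2024-09-24 11:00:00', end_time=None)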

Algorithm Block

What does it do?

The Algorithm Component is responsible for implementing the selection criteria used to choose an aggregator satellite for a given timestamp. The Adjacency Matrices representing the connections between satellites undergo trimming as they flow into the component core. The satellite with the highest number of connections is selected as the aggregator, and the connections that do not directly involve the aggregator satellite are removed.

Hence, trimming retains only the connections that form the primary communication pathway for that timestamp. The component ensures that no single satellite is overwhelmed by selecting satellites with the highest connectivity while prioritising those with fewer prior selections. The FLAM output records data across multiple attributes, including timestamps, satellite counts, the selected satellite, aggregator flags and the trimmed Federated Learning Adjacency Matrices.
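
The following is a minimal sketch of that selection and trimming step, assuming the adjacency matrix is a square 0/1 NumPy array; the function and variable names are assumptions and the repository's actual implementation may differ.

import numpy as np

def select_and_trim(adjacency: np.ndarray, prior_selections: np.ndarray):
    """Pick the aggregator satellite and keep only the links that involve it."""
    degrees = adjacency.sum(axis=1)                    # connections per satellite
    candidates = np.where(degrees == degrees.max())[0] # most-connected satellites
    # Break ties by preferring satellites selected fewer times previously.
    aggregator = candidates[np.argmin(prior_selections[candidates])]

    trimmed = np.zeros_like(adjacency)
    trimmed[aggregator, :] = adjacency[aggregator, :]  # keep links to/from the aggregator
    trimmed[:, aggregator] = adjacency[:, aggregator]
    return aggregator, trimmed

# Example: 4 satellites; satellite 2 has the most connections for this timestamp.
adj = np.array([[0, 1, 1, 0],
                [1, 0, 1, 0],
                [1, 1, 0, 1],
                [0, 0, 1, 0]])
aggregator, flam = select_and_trim(adj, prior_selections=np.zeros(4))
print(aggregator)  # 2
print(flam)        # only rows/columns involving satellite 2 retain connections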

Federated Learning Block

What does it do?

The Federated Learning Block is responsible for starting a number of clients that act as the swarm of edge compute devices for the distributed learning system. The client count and the round count are the main configurable pieces of functionality for this system and represent the desired configuration for the satellite array being simulated (NOTE: the client count will be derived from the number of satellites in the ingested TLE file if the system is run in FLOMPS integrated mode).

The current implementation of the Federated Learning Block relies on the Flower FL library. This framework creates a server and the desired number of clients and facilitates communication between the devices through a local gRPC server. The machine learning model utilised by the clients is set when the FL module is instantiated.
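
The following is a minimal sketch, assuming Flower 1.x (flwr), of how the client count and round count map onto the framework's gRPC server and clients. The placeholder client carries no real model or data, and the address, process handling and names are assumptions rather than the repository's implementation.

import multiprocessing
import time

import flwr as fl
import numpy as np

# Placeholder client: a tiny weight vector stands in for a real model and dataset.
class PlaceholderClient(fl.client.NumPyClient):
    def get_parameters(self, config):
        return [np.zeros(3)]                 # current model weights

    def fit(self, parameters, config):
        return [np.zeros(3)], 1, {}          # updated weights, example count, metrics

    def evaluate(self, parameters, config):
        return 0.0, 1, {}                    # loss, example count, metrics

def run_server(num_rounds: int):
    # The server coordinates the configured number of federated rounds over gRPC.
    fl.server.start_server(server_address="0.0.0.0:8080",
                           config=fl.server.ServerConfig(num_rounds=num_rounds))

def run_client():
    # Each client connects to the local gRPC server started above.
    fl.client.start_numpy_client(server_address="127.0.0.1:8080",
                                 client=PlaceholderClient())

if __name__ == "__main__":
    num_clients = 2        # derived from the TLE satellite count in FLOMPS integrated mode
    num_rounds = 2

    server = multiprocessing.Process(target=run_server, args=(num_rounds,))
    server.start()
    time.sleep(3)          # give the gRPC server a moment to start listening

    clients = [multiprocessing.Process(target=run_client) for _ in range(num_clients)]
    for client in clients:
        client.start()

    server.join()
    for client in clients:
        client.join()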

Configuration Guide

FL:

  • num_clients: Manual configuration of how many clients are created for the FL execution. This will be overridden in a FLOMPS execution, as the client count is decided by the number of satellites present in the ingested TLE file.
  • num_rounds: Manual configuration of the number of rounds for the FL execution to process before returning the results.

ML:

  • model_type: The machine learning model that will be utilised to train on the dataset (currently supported inputs: “ResNet50”, “SimpleCNN”).
  • data_set: The dataset that will be distributed to the clients for training (currently supported inputs: “MNIST”).
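
For illustration, the following is a hedged sketch of how these options might appear in the JSON configuration, following the same parent-object pattern as the sat_sim excerpt above; the exact parent objects and key layout (including whether the ML options sit under the same parent) are assumptions, not the repository's actual options.json.

"federated_learning": {
    "num_clients": 4,
    "num_rounds": 2,
    "model_type": "SimpleCNN",
    "data_set": "MNIST"
}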