Sensors - randyhook/knynet GitHub Wiki
Sensor Overview
Robots contain components to sense their environment, collectively classified as sensors. All incoming sensor data is packaged as an instance of the SensoryData class. In this way, the Agent can handle each kind of data (audio, visual, etc.) in a standard manner.
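Since the wiki does not show the class itself, here is a minimal hypothetical sketch (in Python) of what a uniform SensoryData wrapper could look like; the field names (`modality`, `payload`, `timestamp`) are assumptions, not the project's actual API:

```python
from dataclasses import dataclass, field
import time

@dataclass
class SensoryData:
    """Uniform wrapper for raw sensor input (hypothetical sketch)."""
    modality: str   # kind of data, e.g. "audio" or "visual"
    payload: bytes  # raw bytes exactly as captured by the sensor
    timestamp: float = field(default_factory=time.time)

# Audio and visual samples are handled through the same type:
sample = SensoryData(modality="audio", payload=b"\x00\x01")
```

Because every sensor's output arrives as the same type, downstream code can branch on `modality` instead of special-casing each sensor.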
![Sensing the environment](images/sensing-the-environment.png)
The diagram depicts one audio sensor and one visual sensor, but in practice there can be many more sensors and sensor types. Each sensor passes the data it perceives to the agent's SensoryProcessor, which packages the data as SensoryData. This allows the Agent to handle all incoming sensor data in a standard way.
Note also that the SensoryProcessor is not part of the individual sensor units. This means existing off-the-shelf sensors can be used, with the Agent software processing whatever data they output.
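The decoupling described above can be sketched as follows; this is a hypothetical illustration (the `package` method and its signature are assumptions), showing how a processor outside the sensor hardware could wrap any sensor's raw output:

```python
from dataclasses import dataclass

@dataclass
class SensoryData:
    modality: str
    payload: bytes

class SensoryProcessor:
    """Sits between sensors and the Agent, outside the sensor units.

    Any off-the-shelf sensor's raw output can be wrapped without
    the sensor knowing anything about the Agent.
    """
    def package(self, modality: str, raw: bytes) -> SensoryData:
        return SensoryData(modality=modality, payload=raw)

proc = SensoryProcessor()
data = proc.package("visual", b"<raw camera bytes>")
```

Keeping the processor out of the sensor units is what lets commodity hardware plug in: only the thin `package` step needs to know the Agent's data format.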
SensoryData and SensoryEncoded
SensoryData is raw data obtained from the environment. SensoryEncoded is SensoryData after it has been processed by Agencies employed by the Agent.
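A hedged sketch of this relationship, assuming a simple class hierarchy (the field names and the `SpokenLanguage` fields are illustrative guesses, not the project's real definitions):

```python
from dataclasses import dataclass

@dataclass
class SensoryData:
    """Raw data obtained from the environment."""
    modality: str
    payload: bytes

@dataclass
class SensoryEncoded:
    """SensoryData after processing by one of the Agent's Agencies."""
    source: SensoryData  # the raw data this encoding was derived from

@dataclass
class SpokenLanguage(SensoryEncoded):
    """A specific encoding a NaturalLanguageAgency might produce."""
    text: str

raw = SensoryData(modality="audio", payload=b"<pcm samples>")
utterance = SpokenLanguage(source=raw, text="hello")
```

Keeping a `source` reference on the encoded type preserves the link back to the raw observation it was derived from.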
![SensoryData and SensoryEncoded](images/sensorydata-and-sensoryencoded.png)
SensoryData is sent to the Decision Engine (1). The data is then consumed by any of the Agent's Agencies that are interested in it (2). For example, SensoryData obtained from an audio sensor may interest both the general AudioAgency and the more specific NaturalLanguageAgency. Each agency processes the data using its own particular methods and sends SensoryEncoded data back to the Decision Engine (3). For instance, the NaturalLanguageAgency could process the raw SensoryData into SpokenLanguage, which derives from SensoryEncoded. The Decision Engine then takes this data and may do several things with it, one of which is storing it in its knowledge base (4).
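The four-step flow above can be sketched as a minimal dispatch loop; everything here (the `interested_in`/`process` methods, the list-based knowledge base) is a hypothetical illustration of the described flow, not the project's actual implementation:

```python
from dataclasses import dataclass

@dataclass
class SensoryData:
    modality: str
    payload: bytes

@dataclass
class SensoryEncoded:
    source: SensoryData

class AudioAgency:
    """Example agency that handles any audio-modality data."""
    def interested_in(self, data: SensoryData) -> bool:
        return data.modality == "audio"

    def process(self, data: SensoryData) -> SensoryEncoded:
        return SensoryEncoded(source=data)

class DecisionEngine:
    def __init__(self, agencies):
        self.agencies = agencies
        self.knowledge_base = []  # illustrative stand-in for real storage

    def receive(self, data: SensoryData) -> None:
        # (1) raw SensoryData arrives at the Decision Engine
        for agency in self.agencies:
            # (2) only interested agencies consume the data
            if agency.interested_in(data):
                # (3) the agency returns SensoryEncoded data
                encoded = agency.process(data)
                # (4) one possible action: store it in the knowledge base
                self.knowledge_base.append(encoded)

engine = DecisionEngine([AudioAgency()])
engine.receive(SensoryData(modality="audio", payload=b"<pcm samples>"))
```

Routing by interest means several agencies (e.g. an AudioAgency and a NaturalLanguageAgency) can each produce their own SensoryEncoded result from the same raw data.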