Agent - randyhook/knynet GitHub Wiki

Architecture Overview

![Agent architecture overview](images/agent-architecture.png)

In this overview, SensoryData is sent to the Agent. The Agent checks whether it has an appropriate Agency to handle the data. For example, if the SensoryData is audio data and the Agent knows it has an AudioAgency, it passes the SensoryData to that agency and receives back AudioEncoded, a subclass of SensoryEncoded. Further, if the AudioAgency determines that the audio contains natural language, it engages the NaturalLanguageAgency, if present, for assistance. A SensoryEncoded object is still returned, but with richer data in the form of SpokenLanguage, a subclass of AudioEncoded.
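The encoding hierarchy and the hand-off between agencies can be sketched as follows. This is a minimal illustration, not the project's actual code: the field names (`waveform`, `transcript`) and method names (`encode`, `enrich`, `_contains_speech`) are assumptions, and the speech detector is a stub.

```python
from dataclasses import dataclass
from typing import Optional

# Encoding hierarchy from the overview: SpokenLanguage is an AudioEncoded,
# which in turn is a SensoryEncoded.
class SensoryEncoded:
    pass

@dataclass
class AudioEncoded(SensoryEncoded):
    waveform: bytes

@dataclass
class SpokenLanguage(AudioEncoded):
    transcript: str  # the richer data added by the NaturalLanguageAgency

class NaturalLanguageAgency:
    def enrich(self, audio: AudioEncoded) -> SpokenLanguage:
        # Stub transcription; a real agency would run speech-to-text here.
        return SpokenLanguage(waveform=audio.waveform, transcript="<decoded speech>")

class AudioAgency:
    def __init__(self, nl_agency: Optional[NaturalLanguageAgency] = None):
        self.nl_agency = nl_agency

    def encode(self, raw: bytes) -> SensoryEncoded:
        encoded = AudioEncoded(waveform=raw)
        # When the audio contains natural language and a NaturalLanguageAgency
        # is present, hand off to it and return the richer subclass.
        if self.nl_agency is not None and self._contains_speech(raw):
            return self.nl_agency.enrich(encoded)
        return encoded

    def _contains_speech(self, raw: bytes) -> bool:
        return True  # stub detector: assume speech for illustration
```

Because SpokenLanguage subclasses AudioEncoded, callers that only understand AudioEncoded still work, while callers aware of the richer type can read the transcript.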

Note that if the Agent cannot determine the appropriate Agency to handle the SensoryData, it simply stores the raw data in its KnowledgeBase as is.
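The dispatch-or-fallback behaviour described above might look like this. All names here are illustrative assumptions (the wiki does not specify the Agent's API); `StubAudioAgency` is a hypothetical stand-in for a real Agency.

```python
from typing import Any, Dict, Optional

class KnowledgeBase:
    """Minimal store, standing in for the real KnowledgeBase."""
    def __init__(self):
        self.items = []

    def store(self, item: Any) -> None:
        self.items.append(item)

class StubAudioAgency:
    """Hypothetical Agency: tags raw bytes as encoded audio."""
    def encode(self, raw: bytes):
        return ("AudioEncoded", raw)

class Agent:
    def __init__(self):
        self.agencies: Dict[str, Any] = {}   # modality -> Agency
        self.knowledge_base = KnowledgeBase()

    def perceive(self, modality: str, raw: bytes) -> Optional[Any]:
        agency = self.agencies.get(modality)
        if agency is None:
            # No appropriate Agency: store the raw SensoryData as is.
            self.knowledge_base.store(raw)
            return None
        encoded = agency.encode(raw)
        self.knowledge_base.store(encoded)
        return encoded
```

With an AudioAgency registered, audio input comes back encoded; data of an unknown modality lands in the KnowledgeBase untouched.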

Now that the Agent has SensoryEncoded data to deal with, it stores the data as a type of Knowledge. Combining this new knowledge with existing knowledge, the Agent may decide that action is necessary. If so, the intended action is passed to Laws to make certain the Three Laws of Robotics are obeyed. If the Laws test succeeds, the robot takes action.
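One way to sketch the Laws gate is to model an intended action with flags for each potential violation and check them in priority order. The `IntendedAction` fields and `permits` method are hypothetical, purely for illustration; a real implementation would need actual predicates over the Agent's knowledge, not pre-set booleans.

```python
from dataclasses import dataclass

@dataclass
class IntendedAction:
    description: str
    would_harm_human: bool = False    # violates the First Law
    disobeys_human_order: bool = False  # violates the Second Law
    endangers_robot: bool = False     # violates the Third Law

class Laws:
    """Test an intended action against the Three Laws of Robotics."""
    @staticmethod
    def permits(action: IntendedAction) -> bool:
        if action.would_harm_human:       # First Law
            return False
        if action.disobeys_human_order:   # Second Law
            return False
        if action.endangers_robot:        # Third Law
            return False
        return True

def maybe_act(action: IntendedAction) -> bool:
    """Take the action only if the Laws test succeeds."""
    if Laws.permits(action):
        print(f"acting: {action.description}")
        return True
    return False
```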