# NLU understanding (socrob/mbot_documentation GitHub wiki)
## Google SyntaxNet

SyntaxNet assigns POS tags to words. Example (from Pedro's thesis, page 36):

- find: verb
- room: noun
- it: pronoun

From the tagged sentence, SyntaxNet builds a parse tree from which the whole sentence can be divided into phrases.
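A minimal sketch of what the tags buy us, assuming hand-coded tags instead of a real SyntaxNet model (which runs as a TensorFlow graph): contiguous tagged tokens are grouped into crude phrases, mimicking what the parse tree enables. The tag table below is a hypothetical annotation for this example only.

```python
# Hypothetical hand-coded POS tags; a real system would get these from SyntaxNet.
TAGS = {"find": "VERB", "the": "DET", "room": "NOUN", "it": "PRON"}

def chunk(tokens):
    """Group tokens into crude phrases: a verb becomes a verb phrase (VP),
    a run of determiners/nouns/pronouns becomes a noun phrase (NP)."""
    phrases = []
    current = []
    for tok in tokens:
        tag = TAGS.get(tok, "X")
        if tag == "VERB":
            if current:                      # close any open noun phrase
                phrases.append(("NP", current))
                current = []
            phrases.append(("VP", [tok]))
        elif tag in ("DET", "NOUN", "PRON"):
            current.append(tok)
    if current:
        phrases.append(("NP", current))
    return phrases

print(chunk(["find", "the", "room"]))  # [('VP', ['find']), ('NP', ['the', 'room'])]
```

This is only the flat-chunking idea; the actual dependency tree also encodes the relations between words that the approach relies on.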
## NLTK

NLTK is the main competitor of Google SyntaxNet, but it does not provide the relations between words, which this approach relies on heavily.
## Transforming words into feature vectors

This is currently done with word2vec; a better alternative would be Stanford GloVe.
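To illustrate why embeddings matter, here is a toy sketch with invented 3-dimensional vectors (real word2vec/GloVe vectors are learned from a corpus and have hundreds of dimensions): semantically close words get similar vectors, measured by cosine similarity, so the downstream networks generalise across paraphrases.

```python
import math

# Made-up vectors purely for illustration; not real word2vec/GloVe output.
EMBED = {
    "water": [0.9, 0.1, 0.2],
    "juice": [0.8, 0.2, 0.3],
    "room":  [0.1, 0.9, 0.4],
}

def cosine(u, v):
    """Cosine similarity between two vectors: 1.0 means same direction."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

print(cosine(EMBED["water"], EMBED["juice"]))  # high: related words
print(cosine(EMBED["water"], EMBED["room"]))   # lower: unrelated words
```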
## About the neural network

An LSTM (long short-term memory) network is used.
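For intuition, here is a minimal one-dimensional LSTM cell forward step in plain Python. The gate structure (forget/input/output plus a candidate value) is what lets the network keep or discard information across a sentence; the fixed weights below are made up for the sketch, whereas real weights are learned during training.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def lstm_step(x, h_prev, c_prev, w):
    """One step of a 1-d LSTM cell. w maps each gate name to an
    (input weight, recurrent weight, bias) triple."""
    f = sigmoid(w["f"][0] * x + w["f"][1] * h_prev + w["f"][2])    # forget gate
    i = sigmoid(w["i"][0] * x + w["i"][1] * h_prev + w["i"][2])    # input gate
    o = sigmoid(w["o"][0] * x + w["o"][1] * h_prev + w["o"][2])    # output gate
    g = math.tanh(w["g"][0] * x + w["g"][1] * h_prev + w["g"][2])  # candidate
    c = f * c_prev + i * g    # new cell state: keep some memory, add some new
    h = o * math.tanh(c)      # new hidden state, passed to the next step
    return h, c

# Toy fixed weights (hypothetical, not trained values).
w = {k: (0.5, 0.5, 0.0) for k in "fiog"}

h, c = 0.0, 0.0
for x in [1.0, 0.5, -0.5]:    # a toy 1-d input sequence
    h, c = lstm_step(x, h, c, w)
print(h, c)
```

In the real pipeline the inputs at each step would be the word feature vectors from the previous section, not scalars.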
## About the two neural networks + the SVM

- The first NN takes phrases as input and outputs actions.
- For each action, a second NN is used to detect its arguments (slots).

The score values for each action (similar to, but not exactly, probability values) are the input to an SVM that decides whether the action belongs to the known action set or is "other". If it is "other", the result is empty and you get a NO INTERPRETATION outcome. The SVM effectively acts as a threshold; a plain threshold would be easier to implement, but prof. Rodrigo wanted an SVM at this step. A threshold value would have to be obtained manually through repeated testing.
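A sketch of this rejection step, using the simpler threshold variant mentioned above rather than the SVM actually used in the pipeline. The action names and the 0.6 threshold are invented for illustration; as noted, a real threshold would have to be tuned by repeated testing.

```python
THRESHOLD = 0.6  # hypothetical value; would need manual tuning

def interpret(scores, threshold=THRESHOLD):
    """scores: dict mapping action name -> confidence-like value from the NN.
    Return the best action, or None for a NO INTERPRETATION outcome."""
    best_action = max(scores, key=scores.get)
    if scores[best_action] < threshold:
        return None              # "other": command outside the action set
    return best_action

print(interpret({"go": 0.1, "grasp": 0.8, "find": 0.1}))  # grasp
print(interpret({"go": 0.4, "grasp": 0.3, "find": 0.3}))  # None
```

The SVM replaces the hard-coded threshold with a decision boundary learned over the whole score vector, which is why it was preferred here despite being harder to implement.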
## Slot filling

Slot filling is the process of selecting the arguments of a command.

Example: "bring me the water"

Here the command is "bringing" (in FBM3; in GPSR it is "take"). The arguments are "me", which is the beneficiary (the person), and "water", which is an object. In FBM3, "the water" is the theme.
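A toy rule-based illustration of the slot-filling idea for this example. In the actual pipeline a per-action NN detects the slots; the lookup tables and slot names below (following the beneficiary/theme terminology above) are invented for this sketch.

```python
# Hypothetical gazetteers; the real system learns slots with a per-action NN.
PERSONS = {"me", "him", "her"}
OBJECTS = {"water", "coke", "apple"}

def fill_slots(tokens):
    """Map a tokenised command to a frame: action + beneficiary + theme."""
    slots = {"action": tokens[0], "beneficiary": None, "theme": None}
    for tok in tokens[1:]:
        if tok in PERSONS:
            slots["beneficiary"] = tok       # the person, e.g. "me"
        elif tok in OBJECTS:
            slots["theme"] = tok             # "the water" -> theme in FBM3 terms
    return slots

print(fill_slots(["bring", "me", "the", "water"]))
# {'action': 'bring', 'beneficiary': 'me', 'theme': 'water'}
```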