Decision Tree from Nodes - clumsyspeedboat/Decision-Tree-Neo4j GitHub Wiki

DTP_Methodology

Steps

  1. Query the data: either map separate sets of nodes as training and testing data, OR split a single set of nodes into training and testing subsets by specifying the training ratio
  2. Run a decision tree procedure, specifying the class label (target attribute)
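As a rough sketch of step 1, one way to split a single set of data nodes into training and testing sets directly in Cypher. The node label `Data`, the split labels `Training`/`Testing`, and the 80/20 ratio are illustrative assumptions, not part of this plugin; the actual mapping depends on your schema.

```cypher
// Illustrative only: tag roughly 80% of Data nodes as Training, the rest as Testing.
// Labels and ratio are assumptions; adapt them to your own graph.
MATCH (n:Data)
WITH collect(n) AS rows, count(n) AS total
UNWIND range(0, total - 1) AS i
WITH rows[i] AS n, i, total
FOREACH (_ IN CASE WHEN i < toInteger(total * 0.8) THEN [1] ELSE [] END | SET n:Training)
FOREACH (_ IN CASE WHEN i >= toInteger(total * 0.8) THEN [1] ELSE [] END | SET n:Testing)
```

Note this split is not randomized; for a random split you would shuffle or sample the nodes first.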

Create the decision tree from nodes in the graph database.

Create the Information Gain decision tree.

```cypher
RETURN main.createTreeIG("targetAttribute","prune","max_depth")
```

Procedure to create the Information Gain decision tree (DT) from the dataset in the graph database. "targetAttribute" is the class label (target attribute). After running the procedure, it creates the DT in Neo4j and displays the prediction time, generation time, confusion matrix, and accuracy.

"prune": "True" to prune the tree, "False" otherwise.

"max_depth": the maximum depth when pruning, "0" otherwise. For example, "3" for a depth level of 3.
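For example, assuming a dataset whose class label attribute is named `class` (an illustrative name, not part of the plugin), an Information Gain tree pruned to depth 3 could be created with:

```cypher
// "class" is an assumed attribute name; replace it with your target attribute.
RETURN main.createTreeIG("class", "True", "3")
```

The unpruned variant would pass "False" and "0" instead.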

Create the Gini Index decision tree.

```cypher
RETURN main.createTreeGI("targetAttribute","prune","max_depth")
```

Procedure to create the Gini Index decision tree (DT) from the dataset in the graph database. "targetAttribute" is the class label (target attribute). After running the procedure, it creates the DT in Neo4j and displays the prediction time, generation time, confusion matrix, and accuracy.

"prune": "True" to prune the tree, "False" otherwise.

"max_depth": the maximum depth when pruning, "0" otherwise. For example, "3" for a depth level of 3.
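For example, again assuming a class label attribute named `class` (an illustrative name), an unpruned Gini Index tree could be created with:

```cypher
// "class" is an assumed attribute name; "False" and "0" disable pruning.
RETURN main.createTreeGI("class", "False", "0")
```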

Create the Gain Ratio decision tree.

```cypher
RETURN main.createTreeGR("targetAttribute","prune","max_depth")
```

Procedure to create the Gain Ratio decision tree (DT) from the dataset in the graph database. "targetAttribute" is the class label (target attribute). After running the procedure, it creates the DT in Neo4j and displays the prediction time, generation time, confusion matrix, and accuracy.

"prune": "True" to prune the tree, "False" otherwise.

"max_depth": the maximum depth when pruning, "0" otherwise. For example, "3" for a depth level of 3.
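For example, with the same illustrative class label attribute `class`, a Gain Ratio tree pruned to depth 2 could be created with:

```cypher
// "class" is an assumed attribute name; "True" and "2" prune the tree to depth 2.
RETURN main.createTreeGR("class", "True", "2")
```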