ieee 7139367 - hassony2/inria-research-wiki GitHub Wiki

ICRA 2015

[ieee-7139367] A Scalable Approach for Understanding the Visual Structures of Hand Grasps [PDF] [notes]

Minjie Cai, Kris M. Kitani, Yoichi Sato

Objective

Learn the visual appearance of hand grasps

Infer, through visual clustering, a grasp structure that is consistent with expert-designed taxonomies

Synthesis

Pipeline

  • Hand segmentation (claimed state of the art)

    • multi-model hand detector: a collection of per-pixel hand classifiers
    • for a given frame, a global color histogram selects the best-suited hand classifier
    • output: a pixel-level hand probability map
    • fixed-size bounding boxes obtained by binarizing the probability map with a threshold; at most 2 regions (one per hand) are preserved
  • Grasp visual features

    • HOG for hand shape
    • SIFT + BoW representation for object context
  • Classification

    • one-vs-all multi-class grasp classifiers that discriminate between grasp types as defined by Feix's taxonomy
  • Visual similarity estimation

    • based on misclassification rates between grasp classes
    • symmetric: number of instances misclassified between two grasps, divided by the total number of instances of the two grasps
    • iteratively merge the 2 most similar grasps and retrain the classifiers on the merged classes to extract structure
    • deduce a grasp dendrogram (tree diagram of taxonomic relationships)
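The segmentation post-processing above (binarize the probability map, keep at most 2 regions) can be sketched as follows; the threshold value, function name, and 4-connectivity are illustrative assumptions, not the paper's exact implementation:

```python
from collections import deque

def keep_top_regions(prob_map, threshold=0.5, max_regions=2):
    """Binarize a per-pixel hand-probability map and keep the largest
    connected regions (at most two, one per hand). Threshold and
    connectivity are assumptions for illustration."""
    h, w = len(prob_map), len(prob_map[0])
    binary = [[p >= threshold for p in row] for row in prob_map]
    seen = [[False] * w for _ in range(h)]
    regions = []
    for y in range(h):
        for x in range(w):
            if binary[y][x] and not seen[y][x]:
                # flood fill to collect one 4-connected component
                comp, queue = [], deque([(y, x)])
                seen[y][x] = True
                while queue:
                    cy, cx = queue.popleft()
                    comp.append((cy, cx))
                    for ny, nx in ((cy+1, cx), (cy-1, cx), (cy, cx+1), (cy, cx-1)):
                        if 0 <= ny < h and 0 <= nx < w and binary[ny][nx] and not seen[ny][nx]:
                            seen[ny][nx] = True
                            queue.append((ny, nx))
                regions.append(comp)
    # keep only the largest regions; a fixed-size bounding box would
    # then be centered on each surviving region
    regions.sort(key=len, reverse=True)
    return regions[:max_regions]
```

Each returned region is a list of pixel coordinates from which a fixed-size hand bounding box can be placed.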
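The one-vs-all classification step can be sketched as below. The notes do not specify the classifier family, so a nearest-centroid scorer stands in for whatever discriminative classifier the paper trains per grasp type; all names are hypothetical:

```python
def train_one_vs_all(features, labels):
    """Fit one scorer per grasp class. Here each class scorer is just
    the class mean feature vector (a stand-in assumption; the paper
    trains a discriminative classifier per grasp type)."""
    grouped = {}
    for f, y in zip(features, labels):
        grouped.setdefault(y, []).append(f)
    return {y: [sum(col) / len(fs) for col in zip(*fs)]
            for y, fs in grouped.items()}

def predict(centroids, f):
    """Score the feature vector against every class scorer and return
    the class with the highest score (negative squared distance)."""
    def score(c):
        return -sum((a - b) ** 2 for a, b in zip(c, f))
    return max(centroids, key=lambda y: score(centroids[y]))
```

The feature vector here would be the concatenation of the HOG hand-shape descriptor and the SIFT/BoW object-context histogram described above.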
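The similarity estimation and iterative merging above can be sketched from a confusion matrix. One assumption to flag: the paper retrains the one-vs-all classifiers after each merge, while this sketch approximates retraining by summing the merged rows and columns of the confusion matrix:

```python
def grasp_similarity(conf, i, j):
    """Symmetric similarity between grasp classes i and j: instances
    misclassified between the two grasps divided by the total number
    of instances of the two grasps (row sums of the confusion matrix)."""
    n_i, n_j = sum(conf[i]), sum(conf[j])
    return (conf[i][j] + conf[j][i]) / (n_i + n_j)

def build_dendrogram(conf, names):
    """Iteratively merge the two most similar grasps; the recorded
    merge order defines the grasp dendrogram. Merging rows/columns of
    the confusion matrix stands in for retraining (an assumption)."""
    conf = [row[:] for row in conf]
    names = list(names)
    merges = []
    while len(names) > 1:
        # find the most similar pair of remaining grasp classes
        _, i, j = max((grasp_similarity(conf, a, b), a, b)
                      for a in range(len(names))
                      for b in range(a + 1, len(names)))
        merges.append((names[i], names[j]))
        # fold class j's row and column into class i, then drop j
        for k in range(len(conf)):
            conf[i][k] += conf[j][k]
        for row in conf:
            row[i] += row[j]
        del conf[j]
        for row in conf:
            del row[j]
        names[i] = "(%s+%s)" % (names[i], names[j])
        del names[j]
    return merges
```

Reading the merge list bottom-up gives the tree of taxonomic relationships between grasp types.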

Dataset

UT Grasp Dataset: 17 grasp types (a subset of Feix's taxonomy)