# Ideas

- visually detect contact (using proximity, shading, finger appearance (whitening?)); see the proximity sketch after this list
- as in SMPLify, fit a 3D mesh of an existing model and infer joints (partly done in nyu-dataset, where depth maps are generated from the model to fit the target); see the fitting sketch below
- use hand parts (possibility to create a 3D synthetic dataset)
- unsupervised approach to grasp classification; see the clustering sketch below
- assume that an optimal grasp is used
- visually infer independent contact regions (see definition in grasps)
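
A minimal sketch of the proximity cue from the first idea, assuming 3D fingertip positions and an object point cloud are already available (e.g. from a fitted hand model and a depth map). The function name `detect_contacts` and the 5 mm threshold are illustrative choices, not something stated on this page.

```python
# Proximity-based contact detection sketch: a fingertip is "in contact"
# if it lies within a small distance of the object surface (approximated
# here by a point cloud). Threshold and data are placeholders.
import numpy as np
from scipy.spatial import cKDTree


def detect_contacts(fingertips, object_points, threshold=0.005):
    """Return a boolean mask: True where a fingertip is within
    `threshold` meters of the object point cloud."""
    tree = cKDTree(object_points)      # nearest-neighbour index on the object
    dists, _ = tree.query(fingertips)  # distance from each fingertip to the object
    return dists < threshold


# Example with random placeholder data (stands in for real detections)
fingertips = np.random.rand(5, 3)        # 5 fingertips, xyz in meters
object_points = np.random.rand(1000, 3)  # object point cloud
print(detect_contacts(fingertips, object_points))
```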
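
For the SMPLify-style idea: SMPLify fits a parametric body model to detected 2D keypoints by minimizing reprojection error plus priors. The sketch below shows only that optimization pattern; `hand_model` is a toy linear stand-in for a real model (SMPL, MANO, ...), the camera is a plain orthographic projection, and all names and constants are assumptions for illustration.

```python
# SMPLify-style fitting sketch: optimize model parameters so projected
# 3D joints match target 2D keypoints. The "model" is a toy linear basis,
# not a real hand/body model.
import numpy as np
from scipy.optimize import minimize

N_JOINTS, N_PARAMS = 21, 10
BASIS = np.random.randn(N_JOINTS, 3, N_PARAMS) * 0.01  # toy linear pose basis
MEAN_JOINTS = np.random.rand(N_JOINTS, 3)               # toy rest pose


def hand_model(theta):
    """Placeholder parametric model: 3D joints as a linear function of theta."""
    return MEAN_JOINTS + BASIS @ theta


def project(joints_3d):
    """Orthographic projection: keep x, y and drop depth."""
    return joints_3d[:, :2]


def fit(keypoints_2d, theta0=None):
    """Minimize 2D reprojection error plus a weak prior keeping theta small."""
    theta0 = np.zeros(N_PARAMS) if theta0 is None else theta0

    def objective(theta):
        reproj = project(hand_model(theta)) - keypoints_2d
        return np.sum(reproj ** 2) + 1e-3 * np.sum(theta ** 2)

    return minimize(objective, theta0, method="L-BFGS-B").x


# Fit to keypoints rendered from a known parameter vector, as a sanity check
theta_true = np.random.randn(N_PARAMS)
keypoints = project(hand_model(theta_true))
theta_est = fit(keypoints)
print(np.linalg.norm(project(hand_model(theta_est)) - keypoints))
```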
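
For the unsupervised grasp classification idea, one simple baseline is to cluster hand-pose feature vectors (e.g. joint angles) and treat each cluster as a candidate grasp type. The feature choice, the number of clusters, and the use of k-means are illustrative assumptions, not taken from this page.

```python
# Unsupervised grasp classification sketch: k-means over joint-angle
# features; each cluster id acts as an unsupervised grasp label.
import numpy as np
from sklearn.cluster import KMeans

n_samples, n_joint_angles = 500, 20
features = np.random.rand(n_samples, n_joint_angles)  # placeholder for real joint angles

kmeans = KMeans(n_clusters=6, n_init=10, random_state=0).fit(features)
grasp_labels = kmeans.labels_            # one cluster id per hand pose
print(np.bincount(grasp_labels))         # cluster sizes
```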