Project number: 2018K-234  Date: 2019/04/07
Research title: Study on an Automatic Teaching System for Human Body Motion Based on Visual and Other Information
Researcher: Affiliation (at the time) / Position / Name
(Principal Investigator) Faculty of Science and Engineering, School of Creative Science and Engineering, Professor 大谷 淳 (Jun Ohya)
Summary of Research Results
Recently, with the development of computer vision, automatic guitar-fingering teaching systems have attracted considerable attention in academic research. This research proposes a Particle Filter combined with an ROI-based method for (1) hand extraction, (2) fingertip candidate detection, and (3) fingertip tracking, as follows. (1) We propose an end-to-end CNN framework (some related works call this type of network an FCN, or Fully Convolutional Network; it is a variant of the CNN, so we keep the term CNN to respect its origin), trained with hundreds of labelled hand-segmentation images, which accurately segments the hand region in each frame of guitar-playing videos. (2) We combine Template Matching and a reversed Hough Transform as features for accurately locating fingertip candidates. (3) We apply temporal grouping of the candidates based on ROI (region of interest) association, so that candidates for the same fingertip are grouped across consecutive frames, and distribute particles in the area surrounding each associated fingertip candidate, in order to cope with the fast movements and self-occlusions of the fingertips. Minimal illustrative sketches of these three stages are given below.
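As a rough illustration of step (1), the sketch below shows a minimal fully convolutional encoder–decoder in PyTorch that maps an RGB frame to a per-pixel hand-probability mask. The actual network architecture, layer sizes, and training setup of this study are not described in this summary, so the class name HandSegNet, the channel counts, and the input size are illustrative assumptions only.

```python
# Minimal sketch of a fully convolutional hand-segmentation network.
# The real architecture of this study is not specified in the summary;
# all layer sizes below are illustrative assumptions.
import torch
import torch.nn as nn

class HandSegNet(nn.Module):
    """Tiny encoder-decoder mapping an RGB frame to a per-pixel hand mask."""
    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                          # 1/2 resolution
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                          # 1/4 resolution
        )
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(32, 16, 2, stride=2), nn.ReLU(),
            nn.ConvTranspose2d(16, 1, 2, stride=2),   # back to input resolution
        )

    def forward(self, x):
        return torch.sigmoid(self.decoder(self.encoder(x)))  # hand probability map

if __name__ == "__main__":
    net = HandSegNet()
    frame = torch.rand(1, 3, 240, 320)   # dummy video frame
    mask = net(frame)                    # (1, 1, 240, 320) hand probabilities
    print(mask.shape)
```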
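For step (2), the following sketch approximates the candidate-detection stage with OpenCV: template matching supplies correlation peaks at fingertip-like patches, and a circular Hough transform supplies small round structures. The "reversed Hough Transform" named above is not a standard OpenCV routine, so the HoughCircles call, the thresholds, and the helper name fingertip_candidates are stand-in assumptions rather than the exact features used in the study.

```python
# Sketch of fingertip-candidate detection inside a segmented hand region.
# cv2.matchTemplate and the circular Hough transform stand in for the
# Template Matching / reversed Hough features described in the summary;
# the specific parameters below are assumptions.
import cv2
import numpy as np

def fingertip_candidates(gray_hand, tip_template, match_thresh=0.7):
    """Return (x, y) fingertip candidates from template and circle evidence."""
    candidates = []

    # Template matching: keep locations with high normalized cross-correlation.
    scores = cv2.matchTemplate(gray_hand, tip_template, cv2.TM_CCOEFF_NORMED)
    ys, xs = np.where(scores >= match_thresh)
    th, tw = tip_template.shape
    candidates += [(x + tw // 2, y + th // 2) for x, y in zip(xs, ys)]

    # Circular Hough transform: fingertips appear as small circular arcs.
    circles = cv2.HoughCircles(gray_hand, cv2.HOUGH_GRADIENT, dp=1.2,
                               minDist=10, param1=100, param2=15,
                               minRadius=3, maxRadius=12)
    if circles is not None:
        candidates += [(int(x), int(y)) for x, y, _r in circles[0]]

    return candidates
```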
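For step (3), the sketch below illustrates one simple way to associate fingertip candidates across consecutive frames within an ROI and to spread particles around each associated candidate before weighting them against the next observation. The nearest-neighbour association rule, the Gaussian particle placement, and the weighting model are assumptions; the summary does not specify the actual grouping and particle-filter details.

```python
# Sketch of ROI-based temporal grouping plus particle spreading, assuming a
# nearest-neighbour association and Gaussian particle placement; the paper's
# exact association rule and weighting model are not given in the summary.
import numpy as np

def associate(prev_tips, curr_candidates, roi_radius=15.0):
    """Match each tracked tip to the nearest candidate inside its ROI."""
    matches = {}
    for i, (px, py) in enumerate(prev_tips):
        best, best_d = None, roi_radius
        for j, (cx, cy) in enumerate(curr_candidates):
            d = np.hypot(cx - px, cy - py)
            if d < best_d:
                best, best_d = j, d
        if best is not None:
            matches[i] = curr_candidates[best]
    return matches                     # tip index -> associated candidate

def spread_particles(center, n=100, sigma=5.0, rng=np.random):
    """Distribute particles around an associated fingertip candidate."""
    return rng.normal(loc=center, scale=sigma, size=(n, 2))

def estimate(particles, observation, obs_sigma=4.0):
    """Weight particles by closeness to the observation and take the mean."""
    d2 = np.sum((particles - observation) ** 2, axis=1)
    w = np.exp(-d2 / (2 * obs_sigma ** 2))
    w /= w.sum() + 1e-12
    return (particles * w[:, None]).sum(axis=0)   # weighted position estimate
```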
Experiments were conducted on videos of guitar playing recorded under different conditions. For both hand-region segmentation and fingertip tracking, the proposed method outperforms the related works.