Recognising Complex Activities with Histograms of Relative Tracklets

Sebastian Stein (Lead / Corresponding author), Stephen McKenna (Lead / Corresponding author)

Research output: Contribution to journal › Article

6 Citations (Scopus)
225 Downloads (Pure)

Abstract

One approach to the recognition of complex human activities is to use feature descriptors that encode visual interactions by describing properties of local visual features with respect to trajectories of tracked objects. We explore an example of such an approach in which dense tracklets are described relative to multiple reference trajectories, providing a rich representation of complex interactions between objects of which only a subset can be tracked. Specifically, we report experiments in which reference trajectories are provided by tracking inertial sensors in a food preparation scenario. Additionally, we provide baseline results for HOG, HOF and MBH, and combine these features with others for multi-modal recognition. The proposed histograms of relative tracklets (RETLETS) showed better activity recognition performance than dense tracklets, HOG, HOF, MBH, or their combination. Our comparative evaluation of features from accelerometers and video highlighted a performance gap between visual and accelerometer-based motion features and showed a substantial performance gain when combining features from these sensor modalities. A considerable further performance gain was observed in combination with RETLETS and reference tracklet features.
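The core idea described above — characterising a short tracklet by its motion relative to a reference trajectory — can be sketched as follows. This is an illustrative reconstruction, not the authors' implementation: the function name, binning scheme (polar binning of frame-wise offset vectors), and all parameter values are assumptions for the sake of the example.

```python
import numpy as np

def relative_tracklet_histogram(tracklet, reference,
                                n_angle_bins=8, n_mag_bins=4, max_mag=50.0):
    """Sketch of a relative-tracklet descriptor: histogram the polar
    coordinates of frame-wise offsets between a tracklet and a reference
    trajectory. All names and parameters here are illustrative."""
    # Frame-wise offset vectors between tracklet and reference, shape (T, 2)
    offsets = np.asarray(tracklet, float) - np.asarray(reference, float)
    angles = np.arctan2(offsets[:, 1], offsets[:, 0])   # in [-pi, pi]
    mags = np.linalg.norm(offsets, axis=1)

    # Quantise angle and magnitude into fixed bins
    a_idx = np.floor((angles + np.pi) / (2 * np.pi) * n_angle_bins).astype(int)
    a_idx = np.clip(a_idx, 0, n_angle_bins - 1)
    m_idx = np.clip((mags / max_mag * n_mag_bins).astype(int), 0, n_mag_bins - 1)

    # Accumulate a 2D angle-magnitude histogram and L1-normalise it
    hist = np.zeros((n_angle_bins, n_mag_bins))
    for a, m in zip(a_idx, m_idx):
        hist[a, m] += 1
    return (hist / hist.sum()).ravel()
```

In the paper's setting, descriptors like this would be computed for many dense tracklets against each reference trajectory (here, tracked inertial sensors) and pooled into a per-sequence representation for classification.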
Original language: English
Pages (from-to): 82-93
Number of pages: 19
Journal: Computer Vision and Image Understanding
Volume: 154
Early online date: 1 Sep 2016
Publication status: Published - Jan 2017

Keywords

  • activity recognition
  • relative tracklets
  • sensor fusion
  • food preparation
