Event detection using quantized binary code and spatial-temporal locality preserving projections

Hanhe Lin, Jeremiah D. Deng, Brendon J. Woodford

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

1 Citation (Scopus)


We propose a new video manifold learning method for event recognition and anomaly detection in crowd scenes. A novel feature descriptor is proposed to encode regional optical flow features of video frames, where quantization and binarization of the feature code are employed to improve the differentiation of crowd motion patterns. Based on the new feature code, we introduce a new linear dimensionality reduction algorithm called "Spatial-Temporal Locality Preserving Projections" (STLPP). The generated low-dimensional video manifolds preserve both intrinsic spatial and temporal properties. Extensive experiments have been carried out on two benchmark datasets and our results compare favourably with the state of the art.
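The abstract does not include implementation details, but the "quantized binary code" idea can be illustrated with a minimal sketch: regional optical-flow magnitudes are quantized into discrete levels and then binarized (here with a thermometer-style encoding). The number of levels, the magnitude range, and the encoding scheme below are illustrative assumptions, not the paper's actual settings.

```python
import numpy as np

def quantized_binary_code(mags, n_levels=4, max_mag=8.0):
    """Encode regional optical-flow magnitudes as a binary code.

    mags: 1-D array of per-region mean flow magnitudes for one frame.
    Quantization into `n_levels` bins over [0, max_mag] and the
    thermometer binarization are assumptions for illustration only.
    """
    # Quantize: map each magnitude to a discrete level 0..n_levels-1
    edges = np.linspace(0.0, max_mag, n_levels + 1)[1:-1]
    levels = np.digitize(mags, edges)
    # Binarize: level k sets bits 0..k (thermometer code), which keeps
    # Hamming distance proportional to the difference in motion level
    code = (np.arange(n_levels) <= levels[:, None]).astype(np.uint8)
    return code.ravel()

# Two regions: weak motion (0.5) and strong motion (5.0)
print(quantized_binary_code(np.array([0.5, 5.0])))  # [1 0 0 0 1 1 1 0]
```

Concatenating such per-region codes over a frame yields a fixed-length binary descriptor that a linear projection method such as the paper's STLPP could then reduce to a low-dimensional manifold.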

Original language: English
Title of host publication: AI 2013
Subtitle of host publication: Advances in Artificial Intelligence
Editors: Stephen Cranefield, Abhaya Nayak
Place of Publication: Cham
Number of pages: 12
ISBN (Electronic): 978-3-319-03680-9
Publication status: Published - 2013
Event: 26th Australasian Joint Conference on Artificial Intelligence - Dunedin, New Zealand
Duration: 1 Dec 2013 – 6 Dec 2013

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349


Conference: 26th Australasian Joint Conference on Artificial Intelligence
Abbreviated title: AI 2013
Country/Territory: New Zealand


Keywords

  • Anomaly detection
  • Event recognition
  • Manifold learning

ASJC Scopus subject areas

  • Theoretical Computer Science
  • General Computer Science


