Visual tracking

Our laboratory has an extensive, decade-long track record of work on visual tracking, spanning basic research on tracking methods and the application of our algorithms to real-life scenarios (e.g. robots and drones).

Discriminative correlation filter tracking

We explore online learning of target visual models via discriminative correlation filters. The research spans hand-crafted features and optimization techniques for CPU-based tracking as well as deep learning variants with discriminative feature adaptation and online segmentation.
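To illustrate the core idea, a single-channel discriminative correlation filter can be learned in closed form in the Fourier domain. The sketch below follows the classic MOSSE-style formulation; the function names, regularization value, and Gaussian label are illustrative assumptions, not the lab's actual implementation:

```python
import numpy as np

def gaussian_label(shape, center, sigma=2.0):
    """Desired filter response: a Gaussian peak at the target centre."""
    ys, xs = np.mgrid[0:shape[0], 0:shape[1]]
    return np.exp(-((ys - center[0]) ** 2 + (xs - center[1]) ** 2)
                  / (2 * sigma ** 2))

def train_dcf(patch, label, lam=1e-2):
    """Closed-form (ridge-regression) correlation filter in the Fourier
    domain; `lam` regularizes the element-wise division."""
    F = np.fft.fft2(patch)
    G = np.fft.fft2(label)
    return G * np.conj(F) / (F * np.conj(F) + lam)  # conjugate filter H*

def localize(H_conj, patch):
    """Filter response on a new patch; the response peak gives the
    estimated target position."""
    response = np.real(np.fft.ifft2(np.fft.fft2(patch) * H_conj))
    return np.unravel_index(np.argmax(response), response.shape)
```

In a full tracker the filter is additionally updated online with a running average, multi-channel (possibly deep) features replace the raw patch, and a cosine window suppresses circular boundary effects.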

Visual object tracking performance evaluation

One of the problems in visual tracking evaluation is the lack of a consistent evaluation methodology. This hampers cross-paper comparison of trackers and slows the advancement of the field. In our research we investigate different aspects of tracking evaluation. A continuous part of this effort is the Visual Object Tracking Challenge (VOT).

Apparent motion patterns

We propose to go beyond pre-recorded benchmarks with post-hoc annotations by presenting an approach that uses omnidirectional videos to generate realistic, consistently annotated, short-term tracking scenarios with exactly parameterized motion patterns.

Drone tracking

The tracking algorithms we developed can be applied to autonomous robots such as drones. Below are some results from this research application.

Local-global visual models in visual tracking

We address the problem of tracking objects that undergo rapid and significant appearance changes. We explore coupled-layer visual models that combine the target's global and local appearance.

Multiple interacting target tracking

We addressed the problem of tracking multiple visually similar, interacting targets. The problem is formulated in the probabilistic framework of particle filters. Combined color/motion visual models and new methods for modelling highly dynamic targets are explored. The presented solutions have been applied to commercial sports tracking systems we developed between 2006 and 2013.
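A minimal single-target bootstrap particle filter over 2D positions illustrates the probabilistic formulation. This is an illustrative sketch assuming a random-walk motion model and a Gaussian measurement likelihood; the actual systems used richer color/motion models and explicitly handled target interactions:

```python
import numpy as np

def particle_filter_step(particles, weights, measurement,
                         motion_std=1.0, meas_std=2.0, rng=None):
    """One predict-update-resample cycle of a bootstrap particle filter
    over 2D target positions (illustrative sketch)."""
    rng = rng or np.random.default_rng()
    n = len(particles)
    # Predict: propagate each particle with a random-walk motion model
    particles = particles + rng.normal(0.0, motion_std, particles.shape)
    # Update: Gaussian likelihood of the measurement given each particle
    d2 = np.sum((particles - measurement) ** 2, axis=1)
    weights = weights * np.exp(-0.5 * d2 / meas_std ** 2)
    weights = weights / weights.sum()
    # Resample (systematic) to combat weight degeneracy
    positions = (rng.random() + np.arange(n)) / n
    idx = np.minimum(np.searchsorted(np.cumsum(weights), positions), n - 1)
    return particles[idx], np.full(n, 1.0 / n)
```

Run over a sequence of measurements, the particle cloud concentrates around the target; the posterior mean of the particles serves as the state estimate at each frame.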