Publication:
Object tracking in the presence of occlusions using multiple cameras: a sensor network approach

dc.contributor.author: Ercan, Ali Özer
dc.contributor.author: El Gamal, A.
dc.contributor.author: Guibas, L. J.
dc.contributor.department: Electrical & Electronics Engineering
dc.contributor.ozuauthor: ERCAN, Ali Özer
dc.date.accessioned: 2014-07-08T06:29:25Z
dc.date.available: 2014-07-08T06:29:25Z
dc.date.issued: 2013
dc.description: Due to copyright restrictions, access to the full text of this article is available only via subscription.
dc.description.abstract: This article describes a sensor network approach to tracking a single object in the presence of static and moving occluders using a network of cameras. To conserve communication bandwidth and energy, we combine a task-driven approach with camera subset selection. In the task-driven approach, each camera first performs simple local processing to detect the horizontal position of the object in the image. This information is then sent to a cluster head to track the object. We assume the locations of the static occluders to be known, but only prior statistics on the positions of the moving occluders are available. A noisy perspective camera measurement model is introduced, where occlusions are captured through occlusion indicator functions. An auxiliary particle filter that incorporates the occluder information is used to track the object. The camera subset selection algorithm uses the minimum mean square error of the best linear estimate of the object position as a metric, and tracking is performed using only the selected subset of cameras. Using simulations and preselected subsets of cameras, we investigate (i) the dependency of the tracker performance on the accuracy of the moving occluder priors, (ii) the trade-off between the number of cameras and the occluder prior accuracy required to achieve a prescribed tracker performance, and (iii) the importance of having occluder priors to the tracker performance as the number of occluders increases. We find that computing moving occluder priors may not be worthwhile unless they can be obtained cheaply and to high accuracy. We also investigate the effect of dynamically selecting the subset of camera nodes used in tracking on the tracking performance. We show through simulations that a greedy selection algorithm performs close to the brute-force method and outperforms other heuristics, and that the performance achieved by greedily selecting a small fraction of the cameras is close to that of using all the cameras.
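The selection metric named in the abstract (minimum mean square error of the best linear estimate) lends itself to a brief illustration. Below is a minimal, hypothetical Python sketch of greedy camera subset selection under a simplified linear-Gaussian stand-in for the article's noisy perspective model: each camera i is assumed to contribute a rank-one information update h_i h_i^T / r_i, with its noise variance r_i inflated to reflect an occlusion probability, and the metric is the trace of the posterior covariance. The arrays H, q, and r are illustrative assumptions, not the article's implementation.

import numpy as np

def posterior_trace(P_prior, H, r, subset):
    # MSE (trace of posterior covariance) of the best linear estimate of the
    # object position when only the cameras in `subset` report measurements.
    info = np.linalg.inv(P_prior)              # prior information matrix
    for i in subset:
        h = H[i][:, None]
        info = info + (h @ h.T) / r[i]         # rank-one information update per camera
    return np.trace(np.linalg.inv(info))

def greedy_select(P_prior, H, r, k):
    # Greedy heuristic: at each step add the camera that most reduces the MSE.
    selected, remaining = [], set(range(len(H)))
    for _ in range(k):
        best = min(remaining,
                   key=lambda i: posterior_trace(P_prior, H, r, selected + [i]))
        selected.append(best)
        remaining.remove(best)
    return selected

# Hypothetical example: 20 cameras observing a 2-D object position.
rng = np.random.default_rng(0)
H = rng.normal(size=(20, 2))                   # assumed linearized measurement vectors
q = rng.uniform(0.0, 0.8, size=20)             # assumed per-camera occlusion probabilities
r = 0.1 / (1.0 - q)                            # noise variance inflated by occlusion likelihood
P_prior = np.eye(2)                            # prior covariance of the object position
print(greedy_select(P_prior, H, r, k=4))

Brute-force selection would evaluate the same metric over every size-k subset; the abstract reports that the greedy choice comes close to that optimum at far lower cost.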
dc.description.sponsorship: DARPA Microsystems Technology Office; NSF; CNS; ARO; DoD Multidisciplinary University Research Initiative.
dc.identifier.doi: 10.1145/2422966.2422973
dc.identifier.issn: 1550-4867
dc.identifier.issue: 2
dc.identifier.scopus: 2-s2.0-84876066534
dc.identifier.uri: http://hdl.handle.net/10679/461
dc.identifier.uri: https://doi.org/10.1145/2422966.2422973
dc.identifier.volume: 9
dc.identifier.wos: 000316964400007
dc.language.iso: eng
dc.peerreviewed: yes
dc.publicationstatus: published
dc.publisher: Association for Computing Machinery
dc.relation.ispartof: ACM Transactions on Sensor Networks
dc.relation.publicationcategory: International Refereed Journal
dc.rights: restrictedAccess
dc.subject.keywords: Auxiliary particle filter
dc.subject.keywords: Camera sensor network
dc.subject.keywords: Collaborative signal processing
dc.subject.keywords: Noisy perspective camera model
dc.subject.keywords: Occlusion
dc.subject.keywords: Selection
dc.subject.keywords: Sensor fusion
dc.subject.keywords: Sensor tasking
dc.subject.keywords: Tracking
dc.title: Object tracking in the presence of occlusions using multiple cameras: a sensor network approach
dc.type: article
dspace.entity.type: Publication
relation.isOrgUnitOfPublication: 7b58c5c4-dccc-40a3-aaf2-9b209113b763
relation.isOrgUnitOfPublication.latestForDiscovery: 7b58c5c4-dccc-40a3-aaf2-9b209113b763

Files

License bundle (1 file)

Name: license.txt
Size: 1.71 KB
Description: Item-specific license agreed upon to submission