In this paper we study the ability to sparsify the neural networks used for person re-identification in multi-camera CCTV systems. A sparse neural network significantly reduces computational complexity, which in turn increases processing speed for large volumes of data. The main idea of our research is to decrease computational complexity while preserving the network's effectiveness at person re-identification. The paper consists of four parts and a conclusion. The first part, the introduction, describes the application domain of person re-identification and the key problems in this field. The second part, related work, surveys the main state-of-the-art approaches to person re-identification. In the third part, the proposed approach, we formulate a technique for training sparse neural networks. The fourth part, experimental results, describes the experimental conditions, constraints, training datasets, and results. In the conclusion we make recommendations on the use of the new technique.
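The abstract does not specify how the network is sparsified; as an illustration only, a minimal magnitude-based weight-pruning sketch is given below. Magnitude pruning is one common way to obtain a sparse network; the function name, the NumPy-based setup, and the 90% sparsity level are all assumptions for illustration, not the authors' method.

```python
import numpy as np

def magnitude_prune(weights, sparsity):
    """Zero out the smallest-magnitude fraction `sparsity` of the weights.

    Returns the pruned weight matrix and the boolean mask of kept entries.
    """
    flat = np.abs(weights).ravel()
    k = int(sparsity * flat.size)  # number of weights to remove
    if k == 0:
        return weights.copy(), np.ones_like(weights, dtype=bool)
    # k-th smallest absolute value becomes the pruning threshold
    threshold = np.partition(flat, k - 1)[k - 1]
    mask = np.abs(weights) > threshold
    return weights * mask, mask

# Toy example: prune 90% of a random dense layer's weights
rng = np.random.default_rng(0)
W = rng.normal(size=(256, 128))
W_sparse, mask = magnitude_prune(W, sparsity=0.9)
print("achieved sparsity:", 1.0 - mask.mean())
```

In practice such pruning is typically interleaved with fine-tuning so the remaining weights can compensate for the removed ones; the pruned matrices can then be stored and multiplied in a sparse format to realize the speed-up the abstract refers to.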
Document Type: Research Article
Publication date: 29 January 2017
For more than 30 years, the Electronic Imaging Symposium has been serving the broad community, from academia and industry, who work on imaging science and digital technologies. The breadth of the Symposium covers the entire imaging science ecosystem, from capture (sensors, cameras) through image processing (image quality, color and appearance) to how we and our surrogate machines see and interpret images. Applications covered include augmented reality, autonomous vehicles, machine vision, data analysis, digital and mobile photography, security, virtual reality, and human vision. IS&T began sole sponsorship of the meeting in 2016. All papers presented at EI's 20+ conferences are open access.
Please note: For purposes of its Digital Library content, IS&T defines Open Access as papers that will be downloadable in their entirety for free in perpetuity. Copyright restrictions on papers vary; see individual paper for details.