The International Arab Journal of Information Technology (IAJIT)


Occlusion-aware Visual Tracker using Spatial Structural Information and Dominant Features

To overcome the problem of occlusion in visual tracking, this paper proposes an occlusion-aware tracking algorithm. The proposed algorithm divides the object into discrete image patches by clustering the pixel distribution of the object. To avoid drifting towards false targets, the algorithm extracts dominant features, such as the color histogram or the Histogram of Oriented Gradients (HOG), from these patches and uses them as cues for tracking. To enhance robustness, the algorithm also exploits the implicit spatial structure between the patches as an additional tracking cue. Finally, these components are incorporated into a particle filter framework, yielding a robust and precise tracker. Experimental results on color image sequences of different resolutions show that the proposed tracker outperforms the comparison algorithms in handling occlusion.
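The core pipeline described above, appearance cues weighted inside a particle filter, can be sketched in a few lines. The sketch below is a minimal illustration, not the authors' implementation: it uses a single grayscale intensity histogram as a stand-in for the per-patch dominant features (color histogram or HOG), a Bhattacharyya coefficient for histogram similarity, and a random-walk motion model; the function names and parameters (`color_histogram`, `particle_filter_step`, `motion_std`) are illustrative choices, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def color_histogram(patch, bins=8):
    """Normalized intensity histogram of a patch (stand-in for the
    color/HOG 'dominant features' described in the abstract)."""
    hist, _ = np.histogram(patch, bins=bins, range=(0.0, 1.0))
    hist = hist.astype(float)
    return hist / (hist.sum() + 1e-12)

def bhattacharyya(p, q):
    """Similarity between two normalized histograms (1.0 = identical)."""
    return float(np.sum(np.sqrt(p * q)))

def particle_filter_step(particles, weights, image, ref_hist,
                         patch_size=8, motion_std=2.0):
    """One predict/update/resample cycle of a histogram-based particle filter.

    particles : (N, 2) array of candidate object centers (x, y).
    """
    n = len(particles)
    # Predict: propagate particles with a random-walk motion model.
    particles = particles + rng.normal(0.0, motion_std, particles.shape)
    # Update: weight each particle by the similarity of its local patch
    # to the reference histogram of the object.
    h, w = image.shape
    half = patch_size // 2
    for i, (x, y) in enumerate(particles):
        x = int(np.clip(x, half, w - half - 1))
        y = int(np.clip(y, half, h - half - 1))
        patch = image[y - half:y + half, x - half:x + half]
        weights[i] = bhattacharyya(color_histogram(patch), ref_hist)
    weights = weights + 1e-12          # guard against an all-zero update
    weights = weights / weights.sum()
    # Resample particles proportionally to their weights.
    idx = rng.choice(n, size=n, p=weights)
    return particles[idx], np.full(n, 1.0 / n)

# Toy demo: track a bright square on a dark background.
image = np.zeros((64, 64))
image[20:28, 30:38] = 1.0                      # object centered near (34, 24)
ref_hist = color_histogram(image[20:28, 30:38])
particles = rng.uniform(0, 64, size=(100, 2))
weights = np.full(100, 1.0 / 100)
for _ in range(10):
    particles, weights = particle_filter_step(particles, weights, image, ref_hist)
estimate = particles.mean(axis=0)              # posterior mean of object center
```

The paper's tracker differs from this sketch in that it splits the object into several clustered patches, scores each patch with its own dominant feature, and adds a spatial-structure term between patches before fusing the scores into the particle weights.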
