Challenges and Solutions in Vision-Based Target Tracking for Autonomous Systems
Author(s): Ahmed S. El-Khidir¹ and Fatima R. Al-Mahdi²
Affiliation: ¹Department of Electrical Engineering, University of Tripoli, Tripoli, Libya; ²Department of Computer Science, University of Tripoli, Tripoli, Libya
Page No: 9-22
Volume issue & Publishing Year: Volume 2, Issue 1, Jan 2025
Journal: International Journal of Advanced Engineering Application (IJAEA)
ISSN NO: 3048-6807
DOI: https://doi.org/10.5281/zenodo.17672456
Abstract:
This paper provides a comprehensive survey of advances made over the past decade in vision-based target tracking for autonomous vehicle navigation. The introduction highlights the motivations for, and the wide-ranging applications of, vision-based target tracking; these applications span diverse domains and underscore the need for robust algorithms capable of handling the challenges autonomous vehicles face in dynamic environments. The discussion establishes that resilient vision-based tracking is crucial to the efficient operation of autonomous systems. The review is organized into three primary categories: land, underwater, and aerial vehicles, with each category examining the techniques and methodologies developed for target tracking in its domain. For land-based autonomous vehicles, the focus is on approaches that handle obstacles, uneven terrain, and dynamic road conditions. For underwater vehicles, the survey examines challenges such as poor visibility, varying water conditions, and the need for energy-efficient operation. For aerial vehicles, it highlights the importance of precise tracking in three-dimensional space, which is critical for applications such as surveillance, delivery, and disaster response. The paper also examines the growing trend of integrating data fusion techniques to improve the performance and robustness of vision-based tracking systems: by combining data from multiple sensors and modalities, data fusion mitigates limitations such as occlusion, noise, and environmental variability, thereby improving tracking accuracy and reliability. Finally, the paper identifies unresolved research challenges, including computational efficiency, real-time processing, and adaptation to highly dynamic environments, and explores future research directions, such as leveraging advances in artificial intelligence, deep learning, and multi-modal data integration, to further enhance vision-based target tracking for autonomous navigation.
Keywords: Computer Vision; Autonomous Vehicles; Mobile Robots; Target Tracking; Navigation; Sensor Data Fusion
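
Illustrative example (not from the paper): the abstract's data-fusion argument, namely that combining measurements from multiple sensors mitigates occlusion and noise, can be made concrete with a minimal Kalman-filter sketch. The sketch below fuses a low-noise camera position measurement with a noisier second sensor in a 1-D constant-velocity model; all names, noise levels, and the simulated target are assumptions chosen purely for illustration.

```python
# Minimal sketch (illustrative, not the paper's method): fusing two noisy
# position sensors (e.g., a camera and a range sensor) in a 1-D
# constant-velocity Kalman filter.
import numpy as np

rng = np.random.default_rng(0)
dt = 0.1
F = np.array([[1.0, dt], [0.0, 1.0]])   # constant-velocity motion model
Q = 0.01 * np.eye(2)                    # process noise (assumed)
H = np.array([[1.0, 0.0]])              # both sensors observe position only

x = np.array([0.0, 1.0])                # state: [position, velocity]
P = np.eye(2)                           # state covariance

def predict(x, P):
    # Propagate the state and covariance one time step forward.
    return F @ x, F @ P @ F.T + Q

def update(x, P, z, r):
    # Standard Kalman update for a scalar position measurement z
    # with measurement variance r.
    S = H @ P @ H.T + r                 # innovation covariance
    K = P @ H.T / S                     # Kalman gain
    x = x + (K * (z - H @ x)).ravel()
    P = (np.eye(2) - K @ H) @ P
    return x, P

for k in range(50):
    true_pos = 1.0 * (k + 1) * dt       # simulated ground-truth target
    x, P = predict(x, P)
    # Fuse both modalities sequentially each step; if one sensor drops
    # out (e.g., the camera is occluded), its update is simply skipped
    # and the other modality keeps the track alive.
    z_cam = true_pos + rng.normal(0.0, 0.05)   # camera: low noise (assumed)
    z_rng = true_pos + rng.normal(0.0, 0.20)   # range sensor: high noise (assumed)
    x, P = update(x, P, z_cam, 0.05 ** 2)
    x, P = update(x, P, z_rng, 0.20 ** 2)

print(f"fused position estimate: {x[0]:.3f}, truth: {true_pos:.3f}")
```

Because each sensor is folded in through its own update step, an occluded camera frame can simply be skipped while the second modality sustains the track, which is the robustness benefit the survey attributes to multi-sensor data fusion.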
