Sensor-Independent Vision-based Terrain Classification for Autonomous Outdoor Robot Navigation
DOI: https://doi.org/10.64751/ijdim.2026.v5.n2(2).785

Keywords: Terrain segregation, Feature extraction, MobileNetV2, Outdoor navigation, Robotic perception

Abstract
Autonomous outdoor robots are being widely adopted across critical sectors such as agriculture, environmental surveillance, disaster response, security monitoring, and smart transportation systems. Their effectiveness in navigating complex, unstructured environments depends heavily on precise terrain identification, as outdoor surfaces vary significantly in texture, composition, and stability. Traditional navigation techniques that rely on sensors such as ultrasonic, infrared, or Light Detection and Ranging (LiDAR) often struggle to interpret visually complex or ambiguous terrains accurately, reducing navigation reliability and performance. Additionally, conventional terrain recognition methods that depend on manual inspection or basic sensor data are generally time-intensive, error-prone, and difficult to scale to real-time, large-scale applications. To overcome these challenges, this work introduces an automated vision-based terrain classification framework that combines computer vision with machine learning to improve robotic navigation efficiency. The system employs MobileNetV2 as a deep feature extractor to obtain meaningful visual patterns from terrain images, which are then classified using multiple supervised learning models, including the Logistic Regression Classifier (LRC), Naïve Bayes Classifier (NBC), Ridge Classifier (RC), and eXtreme Gradient Boosting (XGBoost). The workflow incorporates structured image acquisition, preprocessing, feature extraction, and classification to ensure accurate multi-class terrain recognition. By minimizing reliance on traditional sensors and manual processes, the proposed approach enhances accuracy, scalability, robustness, and cost efficiency, ultimately enabling safer and more intelligent autonomous robotic operations in diverse outdoor conditions.
License

This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.