CREATING ALERT MESSAGES BASED ON WILD ANIMAL ACTIVITY DETECTION USING HYBRID DEEP NEURAL NETWORKS

Authors

  • T.Narendranath Teja
  • Pujari Kiran Naik

DOI:

https://doi.org/10.64751/ijdim.2025.v4.n3.pp164-172

Keywords:

Animal detection, VGG-19, Bi-LSTM, deep learning, activity recognition, video surveillance, wild animal monitoring, alert system

Abstract

The escalating conflicts between humans and wildlife have become a pressing global concern, especially in regions where agricultural land, villages, and forest reserves coexist in close proximity. Wild animals, when entering human settlements, not only cause property and crop damage but also pose significant threats to human lives. Similarly, forest officials and conservation workers often face risks when monitoring animals, particularly during nighttime or in dense terrains. Therefore, a reliable and real-time animal activity detection and alert generation system is crucial. Traditional computer vision techniques and early machine learning models, while effective in certain contexts, often fall short in accuracy, adaptability, and scalability when applied to highly variable environments such as forests. Recent advances in deep learning, particularly convolutional neural networks (CNNs) and recurrent neural networks (RNNs), have enabled significant improvements in image and video analysis tasks. This paper introduces a hybrid model combining Visual Geometry Group Network (VGG-19) with Bidirectional Long Short-Term Memory (Bi-LSTM) networks for the detection, classification, and monitoring of animal activities in forest environments. By leveraging the spatial learning ability of CNNs and the temporal sequence modeling capability of LSTMs, the proposed model achieves enhanced accuracy in identifying animal types, tracking locomotion, and generating alert messages. The system incorporates preprocessing, object detection via YOLOR, classification through VGG-19, and sequential analysis with Bi-LSTM to produce context-aware SMS alerts containing animal type, location, and activity. The approach was evaluated using four benchmark datasets—Camera Trap, WildAnim, Hoofed Animal, and CDNet—comprising over 45,000 images and 90,000 video frames. Experimental results demonstrated an average classification accuracy of 98%, a mean Average Precision (mAP) of 77.2%, and a frame rate of 170 FPS, outperforming several state-of-the-art models such as YOLOv5, Faster R-CNN, and SSD. The research contributes not only a novel hybrid architecture but also a comprehensive experimental validation that establishes its robustness in real-world scenarios. Moreover, this system has significant implications for wildlife conservation, forest security, and rural safety. By providing timely alerts, it bridges the gap between advanced artificial intelligence techniques and practical conservation needs, thereby safeguarding both human and animal lives.
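For readers who want a concrete picture of the hybrid architecture described above, the following is a minimal sketch in Python/PyTorch, not the authors' implementation. It assumes per-frame VGG-19 convolutional features pooled to 512-dimensional vectors and passed as a sequence to a bidirectional LSTM with two classification heads (animal type and activity), whose outputs are formatted into an SMS-style alert string. The YOLOR detection stage, the benchmark datasets, the SMS gateway, and the training procedure are omitted; the class names, hidden size, and location string are illustrative assumptions only.

    import torch
    import torch.nn as nn
    from torchvision import models

    ANIMALS = ["elephant", "tiger", "wild boar", "deer"]    # illustrative class names, not the paper's label set
    ACTIVITIES = ["resting", "walking", "running"]          # illustrative activity names

    class VGG19BiLSTM(nn.Module):
        """Per-frame VGG-19 features -> Bi-LSTM over the frame sequence -> two heads."""
        def __init__(self, hidden=256):
            super().__init__()
            vgg = models.vgg19(weights=None)     # ImageNet-pretrained weights would likely be used in practice
            self.backbone = vgg.features         # convolutional blocks only (spatial features)
            self.pool = nn.AdaptiveAvgPool2d(1)  # collapse each frame's feature map to a 512-d vector
            self.lstm = nn.LSTM(512, hidden, batch_first=True, bidirectional=True)
            self.animal_head = nn.Linear(2 * hidden, len(ANIMALS))
            self.activity_head = nn.Linear(2 * hidden, len(ACTIVITIES))

        def forward(self, clip):                 # clip: (batch, frames, 3, 224, 224)
            b, t = clip.shape[:2]
            feats = self.pool(self.backbone(clip.flatten(0, 1))).flatten(1)  # (b*t, 512)
            seq, _ = self.lstm(feats.view(b, t, -1))                         # (b, t, 2*hidden)
            summary = seq[:, -1]                 # last time step summarises the clip
            return self.animal_head(summary), self.activity_head(summary)

    def make_alert(animal_logits, activity_logits, location="camera trap #12"):
        """Format an SMS-style alert; the location string here is purely hypothetical."""
        animal = ANIMALS[animal_logits.argmax(1).item()]
        activity = ACTIVITIES[activity_logits.argmax(1).item()]
        return f"ALERT: {animal} detected {activity} near {location}."

    if __name__ == "__main__":
        model = VGG19BiLSTM().eval()
        clip = torch.randn(1, 8, 3, 224, 224)    # a single 8-frame video clip
        with torch.no_grad():
            animal_logits, activity_logits = model(clip)
        print(make_alert(animal_logits, activity_logits))

Coupling the pooled per-frame CNN features to a Bi-LSTM in this way is one plausible reading of how the spatial and temporal stages are combined; the paper itself should be consulted for the exact layer configuration, training details, and alert format.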

Published

2025-09-02

How to Cite

T.Narendranath Teja, & Pujari Kiran Naik. (2025). CREATING ALERT MESSAGES BASED ON WILD ANIMAL ACTIVITY DETECTION USING HYBRID DEEP NEURAL NETWORKS. International Journal of Data Science and IoT Management System, 4(3), 164-172. https://doi.org/10.64751/ijdim.2025.v4.n3.pp164-172
