
Border Security Robot Vehicle with YOLO V4

Ashish Dewangan, Ankit Singh Chauhan, Abhinav Sahrawat, Aditya Kumar Singh, Abhishek Gautam


The objective of this project is to design and fabricate a prototype of a border-security surveillance UGV based on the YOLO V4 algorithm. This versatile robot vehicle uses a remote camera to track moving persons, fire, hazardous chemicals, metals, and obstructions in remote regions and transmits the data to a central location. To provide an instant response to sensor events, the proposed system employs YOLO V4 and machine intelligence. While the robot is operating, its onboard sensors can notify the user when an intruder enters range, and the intelligence system decides on further action. The vehicle runs in a user-controllable mode in which all sensors, such as the metal detector, smoke detector, and ultrasonic sensors, perform automatic actions, and a gun is mounted on the chassis. Using an RF module, the operator can drive the robot vehicle and relay commands to it. The user can watch the surroundings through the built-in camera, redirect the vehicle's path as needed, and fire at objects deemed suspicious.
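The abstract describes detections and sensor readings being fused into alerts relayed to the operator. A minimal sketch of that decision logic is given below; all class names, thresholds, and the action format are illustrative assumptions, not values taken from the paper.

```python
# Hypothetical sketch of the alert logic described in the abstract:
# YOLO V4 detections plus sensor readings are mapped to actions that
# would be relayed to the operator over the RF link. Class names and
# thresholds are illustrative assumptions only.

THREAT_CLASSES = {"person", "fire", "weapon"}
CONFIDENCE_THRESHOLD = 0.5  # minimum YOLO detection score to act on
MIN_CLEARANCE_CM = 30       # ultrasonic stop distance

def decide_actions(detections, metal_detected, smoke_detected, distance_cm):
    """Return a list of alert actions for the operator.

    detections      -- list of (class_name, confidence) pairs from YOLO V4
    metal_detected  -- bool, metal-detector output
    smoke_detected  -- bool, smoke-detector output
    distance_cm     -- ultrasonic range to the nearest obstacle
    """
    actions = []
    for cls, conf in detections:
        if cls in THREAT_CLASSES and conf >= CONFIDENCE_THRESHOLD:
            actions.append(f"alert:{cls}")
    if metal_detected:
        actions.append("alert:metal")
    if smoke_detected:
        actions.append("alert:smoke")
    if distance_cm < MIN_CLEARANCE_CM:
        actions.append("stop")  # obstacle too close; halt the vehicle
    return actions
```

For example, `decide_actions([("person", 0.9)], False, True, 120)` yields `["alert:person", "alert:smoke"]`: the high-confidence person detection and the smoke-detector reading each raise an alert, while the 120 cm clearance does not trigger a stop.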


Surveillance; UGV; YOLO V4; Sensors; RF module; Metal detector; Ultrasonic sensor.

Full Text:



N.I. Ismail, H.S. Zurriati, M. Ali, A.A. Shariffuddin, and N.I. Kamel. "Computational aerodynamics study on neo-ptero micro unmanned aerial vehicle," Evergreen, 8 (2) 438-444 (2021).

A. Mishra, R. Priyadarshini, S. Choudhary, and R. M. Mehra, "Qualitative Analysis of Intra-Class and Inter-Class Clustering Routing and Clusterization in Wireless Sensor Network," Evergreen, 358-373 (2021).

H. T. Lee, W.C. Lin, C.H. Huang, and Y.J. Huang, “Wireless indoor surveillance robot,” In SICE Annual Conference 2011, 2164-2169, IEEE.

M. Grega, A. Matiolanski, P. Guzik, and M. Leszczuk, "Automated detection of firearms and knives in a CCTV image," Sensors, 16 (1) (2016).

H. Kukreja, N. Bharath, C. S. Siddesh, and S. Kuldeep. "An introduction to artificial neural network." Int J Adv Res Innov Ideas Educ, 1 27-30 (2016).

J. Redmon, S. Divvala, R. Girshick, and A. Farhadi, "You only look once: unified, real-time object detection," in Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Las Vegas, NV, USA, June 2016.

J. Gu, Z. Wang, J. Kuen, L. Ma, A. Shahroudy, B. Shuai, T. Liu et al. "Recent advances in convolutional neural networks." Pattern Recognition, 77 354-377 (2018).

T.N. Dief, and S. Yoshida, “System identification for quad-rotor parameters using neural network,” Evergreen, 3 (1) 6–11 (2016). doi:10.5109/1657380.

J. Wu, "Introduction to convolutional neural networks." National Key Lab for Novel Software Technology. Nanjing University. China 5 (23) 495 (2017).

L. Pang, H. Liu, Y. Chen, and J. Miao, "Real-time concealed object detection from passive millimeter wave images based on the YOLOv4 algorithm," Sensors, 20 (6) (2020).

P. Mustamo, "Object detection in sports: TensorFlow Object Detection API case study." University of Oulu 1 (2018).

S. Lagouvardos, J. Dolby, N. Grech, A. Antoniadis, and Y. Smaragdakis, "Static analysis of shape in TensorFlow programs." In 34th European Conference on Object-Oriented Programming (ECOOP 2020). Schloss Dagstuhl-Leibniz-Zentrum für Informatik, (2020).

A. Goyal, S.B. Anandamurthy, P. Dash, S. Acharya, D. Bathla, D. Hicks and P. Ranjan, “Automatic border surveillance using machine learning in remote video surveillance systems,” In Emerging Trends in Electrical, Communications, and Information Technologies, 751-760 (2020), Springer, Singapore.

R. Olmos, S. Tabik, and F. Herrera, “Automatic handgun detection alarm in videos using deep learning,” Neurocomputing, 275 66-72 (2018).

K.L. Masita, A.N. Hasan, and T. Shongwe, "Deep learning in object detection: A review." In 2020 International Conference on Artificial Intelligence, Big Data, Computing and Data Communication Systems (icABCD) IEEE, 1-11 (2020).

S.K. Pal, A. Pramanik, J. Maiti, and P. Mitra, "Deep learning in multi-object detection and tracking: state of the art." Applied Intelligence, 51 (9) 6400-6429 (2021).

S. Thomas, and A. Devi, “Design and implementation of unmanned ground vehicle (UGV) for surveillance and bomb detection using haptic arm technology,” In 2017 International Conference on Innovations in Green Energy and Healthcare Technologies (IGEHT), 1-5 (2017). IEEE.

P. Tokekar, J. Vander Hook, D. Mulla, and V. Isler, "Sensor planning for a symbiotic UAV and UGV system for precision agriculture." IEEE Transactions on Robotics, 32 (6) 1498-1511 (2016).

L. Matthies, T. Litwin, K. Owens, A. Rankin, K. Murphy, D. Coombs, J. Gilsinn, S. Legowik, M. Nashman, and B. Yoshimi. "Performance evaluation of UGV obstacle detection with CCD/FLIR stereo vision and LADAR." In Proceedings of the 1998 IEEE International Symposium on Intelligent Control (ISIC) held jointly with IEEE International Symposium on Computational Intelligence in Robotics and Automation (CIRA), 658-670. IEEE (1998).

S. Haykin, ed. "Kalman Filtering and Neural Networks," 284. New York: Wiley, 2001.

O. Hosam, and A. Alraddadi, “K-means clustering and support vector machines approach for detecting fire weapons in cluttered scenes,” Life Science Journal, 11 (9) (2014).

M.V. Kale, Ratnaparkhe, and A. Bhalchandra, "Sensors for landmine detection and techniques: A review." International Journal of Engineering Research and Technology, 2 (1) 1-7 (2013).

A. Mohamed, M. El-Gindy, and J. Ren, "Advanced control techniques for unmanned ground vehicle: literature survey," International Journal of Vehicle Performance, 4 (1) 46-73 (2018).

O. Elijah, T.A. Rahman, I. Orikumhi, C.Y. Leow, and M.N. Hindia, "An overview of Internet of Things (IoT) and data analytics in agriculture: benefits and challenges," IEEE Internet of Things Journal, 5 (5) 3758-3773 (2018). doi: 10.1109/JIOT.2018.2844296.

G.D. Nugraha, B. Sudiarto, and K. Ramli, "Machine learning-based energy management system for prosumer," Evergreen, 7 (2) 309-313 (2020). doi: 10.5109/4055238.


