ABSTRACT

Our research group focuses on developing applications for agricultural (agri-) advancement and security. In this study, we aimed to develop a deep-learning-based, manually piloted drone system for agri-workplaces that captures and transmits visual data (e.g., stumbling workers or shivering workers who may be unwell) to inform agri-managers and the workers' family members. The system is based on a small drone, the Squared Cam ver. 1.0 (G-FORCE Inc., Shenzhen, China), enclosed in full-protection frames. For on-board processing, we use an NVIDIA Jetson Nano (NVIDIA Inc., California, USA), which carries an NVIDIA artificial intelligence (AI)-oriented chip. A web camera, the CMS-V36 BK (SANWA Supply Inc., Okayama, Japan), mounted on the drone detects diverse objects using a You Only Look Once (YOLO)-based algorithm. We conducted indoor experimental trials using video datasets previously recorded in real outdoor farmland. The datasets contain experienced and inexperienced subjects performing 1) natural working motions in sitting positions and 2) stumbling motions in four directions. Our proposed deep-learning- and drone-based real-time posture-judgement function differs from existing mobile agri-machine technologies and products in its main aims and structural features. We confirmed the system's practical benefit and its potential to improve agri-security levels.