Population growth, climate change, and the worldwide COVID-19 pandemic are imposing increasing pressure on global agricultural production. The challenge of increasing crop yield while ensuring the sustainable development of environmentally friendly agriculture is a common issue throughout the world. Autonomous systems, sensing technologies, and artificial intelligence offer great opportunities to tackle this issue. In precision agriculture (PA), non-destructive and non-invasive remote and proximal sensing methods have been widely used to observe crops in visible and non-visible spectral ranges. Nowadays, the integration of high-performance imagery sensors (e.g., RGB, multispectral, hyperspectral, thermal, and SAR) and unmanned mobile platforms (e.g., satellites, UAVs, and terrestrial agricultural robots) is yielding a huge number of high-resolution farmland images, in which rich crop information is compressed. However, this has been accompanied by challenges, namely, how to swiftly and efficiently make full use of these images and then perform fine-grained crop management based on information-supported decision making. In the past few years, deep learning (DL) has shown great potential to reshape many industries because of its powerful capability to learn features from massive datasets, and the agriculture industry is no exception. More and more agricultural scientists are paying attention to applications of deep learning in image-based farmland observations, such as land mapping, crop classification, biotic/abiotic stress monitoring, and yield prediction. To provide an update on these studies, we conducted a comprehensive investigation with a special emphasis on deep learning in multiscale agricultural remote and proximal sensing. Specifically, the applications of convolutional neural network-based supervised learning (CNN-SL), transfer learning (TL), and few-shot learning (FSL) in crop sensing at land, field, canopy, and leaf scales are the focus of this review. We hope that this work can act as a reference for the global agricultural community regarding DL in PA and can inspire deeper and broader research to promote the evolution of modern agriculture.
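For readers unfamiliar with the workflow, the following is a minimal, hypothetical sketch of the transfer learning (TL) setting discussed above: an ImageNet-pretrained CNN backbone is frozen and only a new classification head is fine-tuned on crop imagery. The class count, hyperparameters, dummy data, and PyTorch/torchvision (≥ 0.13) usage are illustrative assumptions, not taken from any specific study covered by the review.

```python
# Hypothetical sketch: fine-tuning a pretrained CNN backbone for crop classification.
# All names and values below are assumed for illustration only.
import torch
import torch.nn as nn
from torchvision import models

NUM_CROP_CLASSES = 5  # e.g., maize, wheat, rice, soybean, background (assumed)

# Start from an ImageNet-pretrained backbone and fine-tune only the classifier head.
backbone = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
for p in backbone.parameters():
    p.requires_grad = False                                   # freeze feature extractor
backbone.fc = nn.Linear(backbone.fc.in_features, NUM_CROP_CLASSES)  # new trainable head

optimizer = torch.optim.Adam(backbone.fc.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

# One illustrative training step on a dummy batch of 224x224 RGB patches.
images = torch.randn(8, 3, 224, 224)
labels = torch.randint(0, NUM_CROP_CLASSES, (8,))
loss = criterion(backbone(images), labels)
loss.backward()
optimizer.step()
```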
To address the complexity of traditional motion intention recognition methods that rely on multi-modal sensor signals, as well as the lag in the recognition process, this paper proposes an inertial sensor-based motion intention recognition method for a soft exoskeleton. Compared with traditional motion recognition, in addition to the five classic terrain types, recognition of terrain transitions is also included. During mode acquisition, data from sensors mounted on the thigh and calf are collected in different motion modes. After a series of preprocessing steps, such as filtering and normalization, a sliding window is used to augment the data so that each frame of inertial measurement unit (IMU) data retains the last half of the previous frame's history. Finally, we designed a deep convolutional neural network that learns to extract discriminative features from the temporal gait cycle to classify different terrains. The experimental results show that the proposed method can recognize the pose of the soft exoskeleton on different terrains, including walking on flat ground, going up and down stairs, and walking up and down slopes. The recognition accuracy reaches 97.64%. In addition, the recognition delay for transitions between the five modes accounts for only 23.97% of a gait cycle. Finally, oxygen consumption was measured with a wearable metabolic system (COSMED K5, The Metabolic Company, Rome, Italy); compared with operation without the recognition method, net metabolic cost was reduced by 5.79%. The method in this paper can greatly improve the control performance of the flexible lower-limb exoskeleton system and enable natural, seamless switching between multiple motion modes according to the wearer's motion intention.
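As an illustration of the pipeline described above, the sketch below splits a multi-channel IMU stream into 50%-overlapping sliding windows (so each frame keeps the last half of the previous frame) and feeds them to a small 1-D convolutional terrain classifier. The window length, channel count, sampling rate, and layer sizes are illustrative assumptions, not the authors' exact configuration.

```python
# Hypothetical sketch of the described pipeline: overlapping IMU windows -> 1-D CNN classifier.
import numpy as np
import torch
import torch.nn as nn

WINDOW, STEP, CHANNELS, N_TERRAINS = 128, 64, 12, 5  # 6-axis IMU on thigh + calf (assumed)

def sliding_windows(signal: np.ndarray, window: int = WINDOW, step: int = STEP) -> np.ndarray:
    """Split a (T, C) IMU stream into overlapping (window, C) frames; with
    step = window // 2, each frame retains the last half of the previous frame."""
    return np.stack([signal[i:i + window]
                     for i in range(0, len(signal) - window + 1, step)])

class TerrainCNN(nn.Module):
    def __init__(self, channels: int = CHANNELS, n_classes: int = N_TERRAINS):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(channels, 32, kernel_size=5, padding=2), nn.ReLU(), nn.MaxPool1d(2),
            nn.Conv1d(32, 64, kernel_size=5, padding=2), nn.ReLU(), nn.MaxPool1d(2),
        )
        self.classifier = nn.Linear(64 * (WINDOW // 4), n_classes)

    def forward(self, x):                      # x: (batch, channels, window)
        return self.classifier(self.features(x).flatten(1))

# Dummy usage: 10 s of 100 Hz IMU data -> overlapping frames -> terrain logits.
stream = np.random.randn(1000, CHANNELS).astype(np.float32)
frames = torch.from_numpy(sliding_windows(stream)).permute(0, 2, 1)  # (N, C, W)
logits = TerrainCNN()(frames)
print(logits.shape)                            # (N, 5)
```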