Recently, many methods to interpret and visualize deep neural network predictions have been proposed, and significant progress has been made. However, explanations that are more class-discriminative and visually pleasing are still needed. This paper therefore proposes a region-based approach that estimates feature importance over appropriately segmented regions. By fusing the saliency maps generated from multi-scale segmentations, a more class-discriminative and visually pleasing map is obtained. We incorporate this regional multi-scale concept into a model-agnostic prediction difference method. An input image is segmented at several scales using a super-pixel method, and the exclusion of a region is simulated by sampling from a normal distribution constructed using the boundary prior. The experimental results demonstrate that the regional multi-scale method produces much more class-discriminative and visually pleasing saliency maps.
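A minimal sketch of the regional multi-scale idea described above (not the authors' code): segment the image with super-pixels at several scales, "exclude" each region by filling it with samples drawn from a normal distribution fit to the image-boundary pixels (the boundary prior), record the drop in the class score, and average the resulting maps across scales. `predict_class_prob` is a hypothetical stand-in for any image classifier's forward pass, and the scale and sample counts are illustrative assumptions.

```python
import numpy as np
from skimage.segmentation import slic

def boundary_prior_stats(image, border=8):
    """Per-channel mean/std of pixels near the image border (boundary prior)."""
    top, bottom = image[:border], image[-border:]
    left, right = image[:, :border], image[:, -border:]
    border_pixels = np.concatenate(
        [top.reshape(-1, 3), bottom.reshape(-1, 3),
         left.reshape(-1, 3), right.reshape(-1, 3)], axis=0)
    return border_pixels.mean(axis=0), border_pixels.std(axis=0) + 1e-6

def regional_saliency(image, predict_class_prob, target_class,
                      n_segments_list=(50, 150, 400), n_samples=5, seed=0):
    rng = np.random.default_rng(seed)
    mu, sigma = boundary_prior_stats(image)
    p_full = predict_class_prob(image, target_class)   # score on the intact image
    fused = np.zeros(image.shape[:2], dtype=np.float64)

    for n_segments in n_segments_list:                 # one saliency map per scale
        labels = slic(image, n_segments=n_segments, start_label=0)
        scale_map = np.zeros_like(fused)
        for region in np.unique(labels):
            mask = labels == region
            drops = []
            for _ in range(n_samples):                 # simulate exclusion by sampling
                perturbed = image.copy()
                perturbed[mask] = rng.normal(mu, sigma, size=(mask.sum(), 3))
                drops.append(p_full - predict_class_prob(perturbed, target_class))
            scale_map[mask] = np.mean(drops)           # importance of this region
        fused += scale_map / len(n_segments_list)      # fuse maps across scales
    return fused
```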
Crop monitoring is essential for the efficient and stable performance of tasks such as planting, spraying, and harvesting, and several studies are therefore underway to develop and improve crop monitoring robots. Deep learning is also increasingly applied in agricultural robotics, since convolutional neural networks have demonstrated outstanding performance in image classification, segmentation, and object detection. However, most of these applications focus on harvesting robots, and only a few studies have used deep learning to develop or improve monitoring robots. We therefore aimed to develop a real-time robot monitoring system for the generative growth of tomatoes. The presented method detects tomato fruits grown in hydroponic greenhouses using Faster R-CNN (region-based convolutional neural network). In addition, we selected a color model robust to external light and used hue values to develop an image-based maturity standard for tomato fruits; the developed maturity standard was verified through comparison with expert classification. Finally, the number of tomatoes was counted using a centroid-based tracking algorithm. We trained the detection model on an open dataset and tested the whole system in real time in a hydroponic greenhouse. A total of 53 tomato fruits were used to verify the developed system, which achieved 88.6% detection accuracy when fruits completely obscured from the camera were included and 90.2% when they were excluded. For the maturity classification, we conducted qualitative evaluations with the assistance of experts.
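A minimal sketch of the two post-detection steps described above: classifying maturity from the mean hue inside each detected bounding box (HSV is less sensitive to illumination than RGB), and counting fruits by matching detection centroids across frames. The hue thresholds, class names, and distance threshold are illustrative assumptions, not the values from the paper.

```python
import cv2
import numpy as np

def maturity_from_hue(frame_bgr, box):
    """Classify a detected tomato as 'ripe'/'turning'/'unripe' from mean hue."""
    x1, y1, x2, y2 = box
    hsv = cv2.cvtColor(frame_bgr[y1:y2, x1:x2], cv2.COLOR_BGR2HSV)
    mean_hue = hsv[..., 0].mean()            # OpenCV hue range: 0-179 (red wraps near 0)
    if mean_hue < 15:                        # reddish -> ripe (assumed threshold)
        return "ripe"
    elif mean_hue < 35:                      # orange/yellow -> turning
        return "turning"
    return "unripe"                          # greenish

class CentroidCounter:
    """Count unique fruits by greedily matching detection centroids between frames."""
    def __init__(self, max_dist=50):
        self.max_dist = max_dist
        self.tracks = {}                     # track_id -> last known centroid
        self.next_id = 0

    def update(self, boxes):
        centroids = [((x1 + x2) / 2, (y1 + y2) / 2) for x1, y1, x2, y2 in boxes]
        unmatched = dict(self.tracks)
        new_tracks = {}
        for c in centroids:
            # match to the nearest existing track within max_dist, else start a new ID
            best = min(unmatched.items(),
                       key=lambda kv: np.hypot(kv[1][0] - c[0], kv[1][1] - c[1]),
                       default=None)
            if best and np.hypot(best[1][0] - c[0], best[1][1] - c[1]) < self.max_dist:
                new_tracks[best[0]] = c
                del unmatched[best[0]]
            else:
                new_tracks[self.next_id] = c
                self.next_id += 1
        new_tracks.update(unmatched)         # keep unmatched tracks so re-detections are not double counted
        self.tracks = new_tracks
        return self.next_id                  # total unique fruits seen so far
```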
Will a person be seen as having higher standing if he or she receives an award in front of a large audience rather than a small one? We predicted that this would hold true for East Asians, whose cultural logic of face asserts that a person's worth can only be conferred by collective others, but not for European Americans, whose cultural logic of dignity promotes judging a person's worth from that person's own perspective. This study found an audience-size effect for East Asians: participants gave higher appraisals to a target when they imagined the target's high performance to have been seen by 10 other people (vs. one other person), even though the target's performance level remained constant. In contrast, Westerners were not affected by the size of the audience witnessing the target's performance. In addition, perceived social reputation was found to mediate the audience-size effect; participants imagining the target performing well in front of 10 others (vs. one other) perceived others as thinking more highly of the target, which in turn led them to give higher appraisals to the target. As expected, this mediation effect was found only for East Asians.
The rating of perceived exertion (RPE) scale has been found to reflect physiological responses, and this study aimed to assess the validity of using the Borg CR-10 scale and velocity loss to evaluate muscle fatigue quantified by surface electromyography during back squat (BS) exercise. A total of 15 collegiate male athletes performed three non-explosive BS tasks of low, medium, and high volume at 65% of their one-repetition maximum. RPE, spectral fatigue index (SFI), and velocity loss were assessed throughout the trials. Significant differences in overall RPE (p < 0.001) and average SFI (p < 0.05) were observed between conditions, whereas no significant difference was observed in average velocity loss. Significant increases in RPE and SFI (p < 0.001) were observed over the course of exercise, whereas no significant increase in velocity loss was observed. Correlation analyses indicated a significant correlation between RPE and SFI obtained during exercise (r = 0.573, p < 0.001); however, no significant correlation was observed between velocity loss and SFI. These results demonstrate that RPE can be used as a predictor of muscle fatigue in BS exercise, but that velocity loss may not accurately reflect muscle fatigue when participants cannot, or are not required to, perform the BS explosively. Practitioners should therefore not rely on velocity loss as a muscle fatigue indicator in some resistance exercise situations, such as rehabilitation, beginner, and hypertrophy programs.
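For reference, a small sketch of how the two quantitative pieces above are commonly computed (the paper's exact definitions may differ): velocity loss as the percentage drop from the fastest repetition of a set to the last, and a Pearson correlation such as the reported RPE-SFI relationship. The per-set numbers below are hypothetical.

```python
import numpy as np
from scipy.stats import pearsonr

def velocity_loss_percent(rep_velocities):
    """% drop in mean concentric velocity from the fastest rep to the last rep of a set."""
    v = np.asarray(rep_velocities, dtype=float)
    return 100.0 * (v.max() - v[-1]) / v.max()

# Hypothetical per-set data: RPE ratings and average spectral fatigue index.
rpe = [3, 4, 5, 6, 7, 8, 9]
sfi = [1.1, 1.3, 1.2, 1.6, 1.8, 1.7, 2.0]
r, p = pearsonr(rpe, sfi)                    # same analysis behind the reported r = 0.573
print(f"velocity loss example: {velocity_loss_percent([0.55, 0.52, 0.48]):.1f}%")
print(f"Pearson r = {r:.3f}, p = {p:.3f}")
```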