2021
DOI: 10.3390/ani11102774
Automatic Recognition of Fish Behavior with a Fusion of RGB and Optical Flow Data Based on Deep Learning

Abstract: The rapid and precise recognition of fish behavior is critical for perceiving health and welfare, allowing farmers to make informed management decisions in recirculating aquaculture systems while reducing labor. Conventional recognition methods obtain movement information by implanting sensors on the skin or in the body of the fish, which can affect the fish's normal behavior and welfare. We present a novel nondestructive method with spatiotemporal and motion information based on deep learning…


Cited by 32 publications (13 citation statements)
References 26 publications
“…According to Laughlin et al. (2017), the behavioural activity of organisms represents the final cohesive result of diversified physiological and biochemical alterations. The behavioural alterations observed in this study agree with previous reports (Pulgar et al., 2019; Xia et al., 2018; Wang et al., 2021; Boyle et al., 2020). The behavioural changes observed may be attributed to the neurotoxic effect of the chemical by its inhibition (Ogungbemi et al., 2019).…”
Section: Behavioural Responses (supporting)
confidence: 93%
“…This review includes articles with Qualsyst score percentages ≥60%. As a result, 23 articles were included for in-depth analysis [7, 13, 14, 15, 16, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 44]; the remaining 5 articles were excluded because their scores fell below the acceptance threshold [45, 46, 47, 48, 49]. The flow diagram of the review phases is shown in Figure 2.…”
Section: Results (mentioning)
confidence: 99%
“…The first study identified dates back to 2016 [39], and 95.7% (22/23) of the studies were conducted in the last five years (2018–2022). Most of the 23 studies were conducted in Europe (11/23, 47.8%) [7, 15, 16, 29, 30, 32, 33, 38, 39, 40, 41], followed by Asia (8/23, 34.7%) [13, 27, 34, 35, 36, 42, 43, 44] and Oceania (3/23, 13%) [31, 37, 44]. An overview of the results is presented in Table 2, and a detailed analysis of the selected studies is displayed in Table 3, Table 4 and Table 5, according to sensor fusion level.…”
Section: Results (mentioning)
confidence: 99%
“…Videos can capture both spatial and temporal information on fish feeding behaviour, providing context for fish feeding behaviour [14], [15]. Converting the original RGB video into an optical flow image sequence and then feeding it into a 3D CNN is a common approach for video-based FFIA, which outperforms image-based models [12], [14], [30]. However, processing video data is computationally demanding, making it impractical for on-device applications.…”
Section: A. Visual-Based FFIA Methods (mentioning)
confidence: 99%
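The RGB-to-optical-flow conversion step mentioned above can be illustrated with a minimal sketch. The papers cited typically compute dense flow with established algorithms (e.g. Farneback or TV-L1, as available in OpenCV); the pure-Python block-matching version below is only a hypothetical toy illustration of what a flow field between two consecutive grayscale frames represents, not any cited paper's method.

```python
def block_flow(prev, curr, block=4, search=2):
    """Return a (dy, dx) displacement field, one vector per block.

    prev, curr: 2D lists of equal size holding grayscale intensities.
    block:      side length of each matching block, in pixels.
    search:     maximum displacement tested in each direction.
    """
    h, w = len(prev), len(prev[0])
    flow = []
    for by in range(0, h - block + 1, block):
        row = []
        for bx in range(0, w - block + 1, block):
            best, best_vec = None, (0, 0)
            for dy in range(-search, search + 1):
                for dx in range(-search, search + 1):
                    # Skip shifts that fall outside the current frame.
                    if not (0 <= by + dy and by + dy + block <= h
                            and 0 <= bx + dx and bx + dx + block <= w):
                        continue
                    # Sum of absolute differences between the block in the
                    # previous frame and the shifted block in the current one.
                    sad = sum(
                        abs(prev[by + y][bx + x]
                            - curr[by + dy + y][bx + dx + x])
                        for y in range(block) for x in range(block))
                    if best is None or sad < best:
                        best, best_vec = sad, (dy, dx)
            row.append(best_vec)
        flow.append(row)
    return flow

# Synthetic pair of 8x8 frames: a bright square moves one pixel to the right.
f1 = [[0] * 8 for _ in range(8)]
f2 = [[0] * 8 for _ in range(8)]
for y in range(2, 6):
    for x in range(1, 5):
        f1[y][x] = 255
    for x in range(2, 6):
        f2[y][x] = 255

flow = block_flow(f1, f2)  # blocks covering the square report (dy, dx) = (0, 1)
```

In a video-based pipeline like the one described, such flow fields would be computed for every consecutive frame pair and stacked into a flow sequence alongside the RGB frames before being fed to a 3D CNN.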