Acoustic cameras are increasingly used in monitoring studies of diadromous fish populations, even though analyzing their data is time-consuming. In complex in situ contexts, anguilliform fish can be especially difficult to identify automatically in acoustic camera data because the undulation of their bodies frequently produces fragmented targets. Our study aimed to develop a method, based on a succession of computer vision techniques, to automatically detect, identify, and count anguilliform fish in data from multiple models of acoustic cameras. Indeed, the several camera models used to monitor fish populations each have specific technical characteristics, leading to major differences in the shape and resolution of the recorded data. The method was applied to two large datasets recorded at two distinct monitoring sites whose European eel populations have different length distributions. It yielded promising results for large eels: more than 75% were correctly identified automatically in datasets from ARIS and BlueView cameras. However, only 42% of eels shorter than 60 cm were detected, and the best performance was observed at detection ranges of 4-9 m. Although improvements are needed to overcome this fish-length limitation, our cross-camera method is promising for automatically detecting and counting large eels in long-term monitoring studies in complex environments.
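The abstract does not detail the computer vision steps themselves. As a purely illustrative sketch of the kind of succession such a pipeline might chain together on acoustic-camera frames (background subtraction, morphological merging of fragmented echoes, contour extraction), the following Python/OpenCV snippet is offered under stated assumptions: the function `candidate_targets`, its thresholds, and the synthetic test frames are hypothetical and are not taken from the study.

```python
# Illustrative sketch only: a generic background-subtraction / morphology /
# contour pipeline of the kind often applied to acoustic-camera frames.
# All names and thresholds are assumptions, not the authors' method, and
# would need tuning per camera model (e.g. ARIS vs. BlueView exports).

import cv2
import numpy as np


def candidate_targets(frame: np.ndarray, background: np.ndarray,
                      diff_thresh: int = 25, close_kernel: int = 7,
                      min_area: int = 50):
    """Return bounding boxes of candidate fish targets in one 8-bit frame.

    A moving fish shows up as pixels that differ from a static background
    estimate; morphological closing merges the fragmented echoes that an
    undulating body can produce before contours are extracted.
    """
    # Foreground mask from the absolute difference with the background.
    diff = cv2.absdiff(frame, background)
    _, mask = cv2.threshold(diff, diff_thresh, 255, cv2.THRESH_BINARY)

    # Closing bridges small gaps between fragments of the same target.
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE,
                                       (close_kernel, close_kernel))
    mask = cv2.morphologyEx(mask, cv2.MORPH_CLOSE, kernel)

    # Keep contours large enough to be plausible fish rather than noise.
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    return [cv2.boundingRect(c) for c in contours
            if cv2.contourArea(c) >= min_area]


if __name__ == "__main__":
    # Synthetic 8-bit frames stand in for real acoustic-camera data.
    rng = np.random.default_rng(0)
    background = rng.integers(0, 20, (256, 512)).astype(np.uint8)
    frame = background.copy()
    # Fake elongated echo roughly mimicking an eel-like target.
    cv2.ellipse(frame, (300, 128), (60, 6), 15, 0, 360, 200, -1)
    print(candidate_targets(frame, background))
```

In practice, boxes produced by a per-frame step like this would still have to be tracked across frames and classified by shape or length before anything could be counted as an eel; those later stages are not sketched here.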