The application of long-range infrared observation systems is challenging, especially with the currently available high-spatial-resolution infrared camera systems, whose resolution is comparable with that of their visual counterparts. As a result of these developments, the obtained infrared images are no longer limited by the quality of the system, but by atmospheric effects instead. For instance, atmospheric transmission losses and path radiance reduce the contrast of objects against the background, and optical turbulence limits the spatial resolution in the images. Furthermore, severe image distortion can occur due to atmospheric refraction, which limits the detection and identification of objects at longer ranges. EOSTAR is a computer program under development that estimates these atmospheric effects using standard meteorological parameters and the properties of the sensor. Tools are provided to design targets and to calculate their infrared signature as a function of range, aspect angle, and weather condition. Possible applications of EOSTAR include mission planning, sensor evaluation and selection, and education. The user interface of EOSTAR is fully mouse-controlled, and the code runs on a standard Windows-based PC. Many features of EOSTAR execute almost instantaneously, which makes the code user-friendly. Its modular setup allows it to be configured to specific user needs and provides a flexible output structure.
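As an illustration of how transmission losses and path radiance degrade apparent contrast with range, the following is a minimal sketch of the classical contrast-transmittance relation. The extinction coefficient and sky-to-ground ratio values are illustrative assumptions, not parameters taken from EOSTAR's actual atmospheric model:

```python
import math

def apparent_contrast(c0, beta, sky_ground_ratio, range_km):
    """Apparent contrast of a target at range, after atmospheric
    extinction (Beer-Lambert law) and path radiance.

    c0:               inherent (zero-range) target/background contrast
    beta:             atmospheric extinction coefficient [1/km] (assumed)
    sky_ground_ratio: path-radiance to background-radiance ratio (assumed)
    range_km:         sensor-to-target distance [km]
    """
    tau = math.exp(-beta * range_km)  # atmospheric transmittance over the path
    # Path radiance adds equally to target and background, diluting contrast.
    return c0 * tau / (tau + sky_ground_ratio * (1.0 - tau))

# Contrast equals the inherent value at zero range and falls off with distance:
print(apparent_contrast(0.5, 0.2, 1.0, 0.0))   # 0.5
print(apparent_contrast(0.5, 0.2, 1.0, 10.0))  # ~0.068
```

This captures only the contrast-loss effect mentioned in the abstract; turbulence blur and refraction distortion require separate models.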
Operating in a coastal environment, with a multitude of boats of different sizes, detection of small extended targets is only one problem. A further difficulty lies in discriminating detections of possible threats from alarms due to sea and coastal clutter, and from boats that are neutral for a specific operational task. Adding target features to detections allows clutter to be filtered out before tracking. Features can also be used to add labels resulting from a classification step. Both help tracking by facilitating association. Labeling and information from features can aid an operator, or can reduce the number of false alarms in more automatic systems. In this paper we present work on clutter reduction and classification of small extended targets in infrared and visual-light imagery. Several methods for discriminating between classes of objects were examined, with an emphasis on less complex techniques, such as rules and decision trees. Similar techniques can be used to discriminate between targets and clutter, and between different classes of boats. Different features are examined that may allow discrimination between several classes. Data recordings in infrared and visual light are used, with a range of targets including RHIBs (rigid-hulled inflatable boats), cabin boats, and jet-skis.
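A rule-based discriminator of the kind referred to above can be sketched as follows. The feature names (pixel area, bounding-box aspect ratio, intensity variance) and all thresholds are hypothetical, chosen for illustration rather than taken from the paper:

```python
def classify(features):
    """Toy rule-based classifier for detections in IR/visual imagery.

    features: dict with hypothetical keys
      'area'          - detection size in pixels
      'aspect_ratio'  - bounding-box width / height
      'intensity_var' - variance of pixel intensities inside the box
    Returns a class label; all thresholds are illustrative.
    """
    # Very small, low-variance detections are likely sea clutter (glints).
    if features['area'] < 20 and features['intensity_var'] < 5.0:
        return 'clutter'
    # Elongated detections suggest boat-like shapes; size separates classes.
    if features['aspect_ratio'] > 2.5:
        return 'cabin boat' if features['area'] > 200 else 'RHIB'
    # Compact detections with internal structure: jet-ski-like.
    return 'jet-ski'

print(classify({'area': 10, 'aspect_ratio': 1.2, 'intensity_var': 2.0}))    # clutter
print(classify({'area': 300, 'aspect_ratio': 3.0, 'intensity_var': 40.0}))  # cabin boat
```

Such hand-written rules are transparent to an operator; a decision tree learned from labeled recordings has the same if/else structure but with thresholds fitted to the data.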
For naval operations in a coastal environment, detection of boats is not sufficient. When conducting surveillance near a supposedly friendly coast, or self-protection in a harbor, it is important to find the one object that means harm among many others that do not. For this, it is necessary to obtain information on the many observed targets, which in this scenario are typically small vessels. Determining the exact type of ship is not enough to declare it a threat. However, in the whole process from (multi-sensor) detection to the decision to act, classifying a ship into a more general class is already of great help when this information is combined with other data to assist an operator. We investigated several aspects of the use of electro-optical systems. With regard to classification, this paper concentrates on discriminating classes of small vessels with different electro-optical systems (visual and infrared) as part of the larger process involving an operator. It addresses both the selection of features (based on shape and texture) and ways of using these in a system to assess threats. Results are presented on data recorded in coastal and harbor environments for several small targets.
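Shape and texture features of the kind mentioned can be computed from a segmented detection as sketched below. The sparse-mask input format and the specific features (aspect ratio, fill ratio, intensity variance) are illustrative assumptions, not the paper's actual feature set:

```python
def shape_texture_features(pixels):
    """Extract simple shape and texture features from one detection.

    pixels: dict mapping (row, col) -> intensity for the segmented target.
    Returns hypothetical features: bounding-box aspect ratio and fill
    ratio (shape), and intensity variance (texture).
    """
    rows = [r for r, _ in pixels]
    cols = [c for _, c in pixels]
    height = max(rows) - min(rows) + 1
    width = max(cols) - min(cols) + 1
    vals = list(pixels.values())
    mean = sum(vals) / len(vals)
    return {
        'aspect_ratio': width / height,                      # elongation
        'fill_ratio': len(pixels) / (width * height),        # compactness
        'intensity_var': sum((v - mean) ** 2 for v in vals) / len(vals),
    }

# A 1x3 horizontal strip of uniform intensity:
f = shape_texture_features({(0, 0): 10, (0, 1): 10, (0, 2): 10})
print(f)  # aspect_ratio 3.0, fill_ratio 1.0, intensity_var 0.0
```

Feature vectors like this, computed per detection, are what a rule set or decision tree then operates on to separate vessel classes from clutter.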
Efficient military operations require insight into the capabilities of the available sensor package to reliably assess the operational theatre, as well as insight into the adversary's capabilities to do the same. This paper presents the EOSTAR model suite, an end-to-end approach to assessing the performance of electro-optical sensor systems in an operational setting. EOSTAR provides the user with coverage diagrams ("where can I see the threat?") and synthetic sensor images ("how do I perceive the threat?"), and allows assessing similar parameters for threat sensors. The paper discusses the elements of EOSTAR and outlines a few of the possible applications of the model.