The article presents an overview of theoretical and experimental work on the estimation of unmanned aerial vehicle (UAV) motion parameters based on the integration of video measurements from the on-board optoelectronic camera with data from the UAV's own inertial navigation system (INS). Various approaches described in the literature perform well in computer simulations or in relatively simple, near-laboratory conditions, but their use reveals the considerable difficulty of adapting camera parameters to the changing conditions of a real flight. In our experiments, we applied computer simulation methods to real images and processed video sequences obtained during real flights. For example, it was noted that reference images differing greatly in scale and aspect angle from the images observed in flight make the keypoint (singular point) methodology very difficult to use. At the same time, matching of the observed and reference images using rectilinear segments, such as road sections and building walls, looks quite promising. In addition, in our experiments we computed the frame-to-frame projective transformation matrix, which, together with filtered estimates of the coordinate and angular velocities, provides additional possibilities for estimating the UAV position. Data on UAV position determination by video navigation methods obtained during real flights are presented. New approaches to video navigation based on matching rectilinear segments, characteristic curvilinear elements, and segmentation of textured and colored regions are demonstrated. The application of frame-to-frame projective transformation computation is also shown; it yields estimates of the vehicle's displacements and rotations and thereby supports UAV position estimation by filtering. Thus, the aim of the work was to analyze various approaches to UAV navigation using video data as an additional source of information about the position and velocity of the vehicle.
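To make the frame-to-frame projective transformation step concrete, the following is a minimal sketch of homography estimation between consecutive frames. It assumes OpenCV with ORB features and RANSAC; the detector, thresholds, and file names are illustrative and not specified in the abstract.

```python
# Minimal sketch: frame-to-frame projective transformation (homography) estimation.
# Assumes OpenCV and NumPy; feature detector, thresholds, and file names are illustrative.
import cv2
import numpy as np

def frame_to_frame_homography(prev_gray, curr_gray, max_features=1000):
    """Estimate the 3x3 projective transformation mapping prev_gray to curr_gray."""
    orb = cv2.ORB_create(nfeatures=max_features)
    kp1, des1 = orb.detectAndCompute(prev_gray, None)
    kp2, des2 = orb.detectAndCompute(curr_gray, None)
    if des1 is None or des2 is None:
        return None

    # Brute-force Hamming matching with cross-check to reject ambiguous matches.
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)
    if len(matches) < 4:
        return None

    src = np.float32([kp1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kp2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)

    # RANSAC suppresses outliers caused by moving objects and mismatches.
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, ransacReprojThreshold=3.0)
    return H

if __name__ == "__main__":
    prev = cv2.imread("frame_000.png", cv2.IMREAD_GRAYSCALE)   # illustrative file names
    curr = cv2.imread("frame_001.png", cv2.IMREAD_GRAYSCALE)
    H = frame_to_frame_homography(prev, curr)
    if H is not None:
        cx, cy = prev.shape[1] / 2.0, prev.shape[0] / 2.0
        centre = np.array([cx, cy, 1.0])
        moved = H @ centre
        moved /= moved[2]
        print("centre shift (pixels):", moved[:2] - centre[:2])
```

The pixel-scale shift of the frame centre obtained this way is the kind of displacement increment that can be combined with INS data in a filter, as the abstract describes.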
A nonlinear stochastic control problem related to flow control is considered. It is assumed that the state of a link is described by a controlled hidden Markov process with a finite state set, while the loss flow is described by a counting process with intensity depending on the current transmission rate and the unobserved link state. The control is the transmission rate, and it has to be chosen as a nonanticipating process depending on the observation of the loss process. The aim of the control is to maximize a utility function that takes into account losses of the transmitted information. Originally, the problem belongs to the class of stochastic control problems with incomplete information; however, optimal filtering equations that provide estimation of the current link state based on observations of the loss process allow one to reduce it to a standard stochastic control problem with full observations. Then a necessary optimality condition is derived in the form of a stochastic maximum principle, which allows us to obtain explicit analytic expressions for the optimal control in some particular cases. Optimal and suboptimal controls are investigated and compared with the flow control schemes used in TCP/IP (Transmission Control Protocol/Internet Protocol) networks. In particular, the optimal control demonstrates much smoother behavior than the TCP/IP congestion control currently used.
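To illustrate the filtering step that turns the partially observed problem into a fully observed one, the following is a small simulation sketch of the standard filter for a Markov chain observed through a counting process. The two-state link, the generator, the loss-intensity model, and the constant transmission rate are illustrative assumptions, not values taken from the article.

```python
# Illustrative sketch: filtering the hidden link state from the observed loss (counting)
# process. The two-state link, generator Q, and loss-intensity model lambda(rate, state)
# are assumptions for illustration only.
import numpy as np

rng = np.random.default_rng(0)

Q = np.array([[-0.2, 0.2],      # generator of the hidden link-state Markov chain
              [ 0.5, -0.5]])    # state 0 = "good", state 1 = "congested"
loss_prob = np.array([0.01, 0.2])   # per-packet loss probability in each state (assumed)

dt, T = 1e-3, 30.0
rate = 50.0                      # constant transmission rate for this illustration
state = 0
pi = np.array([0.5, 0.5])        # filtering distribution of the hidden state

for _ in range(int(T / dt)):
    # Simulate the hidden chain and the observed loss process.
    if rng.random() < -Q[state, state] * dt:
        state = 1 - state
    loss_observed = rng.random() < rate * loss_prob[state] * dt

    # Filter: deterministic drift between losses, Bayes-type correction at a loss.
    lam = rate * loss_prob                  # state-wise loss intensities
    lam_bar = pi @ lam                      # intensity predicted by the filter
    pi = pi + (pi @ Q - pi * (lam - lam_bar)) * dt   # between-jump dynamics
    if loss_observed:
        pi = pi * lam / lam_bar                      # jump-time correction
    pi = np.clip(pi, 1e-12, None)
    pi /= pi.sum()

print("final P(congested | losses) =", round(pi[1], 3), "true state =", state)
```

In the reduced problem, the transmission rate would be chosen as a function of this filtering distribution rather than of the unobserved state itself.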
An automatic landing of an unmanned aerial vehicle (UAV) is a non-trivial task requiring the solution of a variety of technical and computational problems. The most important is the precise determination of altitude, especially at the final stage of approach to the ground. With current altimeters, the magnitude of measurement errors at the final phase of the descent may be unacceptably high for constructing an algorithm to control the landing manoeuvre. Therefore, it is desirable to have an additional sensor that makes it possible to estimate the height above the surface of the runway. All linear and angular UAV velocities can be estimated simultaneously with the help of the so-called optical flow (OF), determined from the sequence of images recorded by an onboard camera, but only in pixel scale. To transform them into real metric values, one needs to know the current flight altitude and the angular position of the camera. The critical feature of the OF is its sensitivity to the camera resolution and to the shift rate of the observed scene. During the descent phase of the flight, these parameters change by at least a factor of one hundred together with the altitude. Therefore, reliable application of the OF requires coordinating the shooting parameters with the current altitude. However, if the altimeter fails, the altitude itself must also be estimated with the aid of the OF, so another tool is needed to control the camera. One possible and straightforward way is to change the effective camera resolution by pixel averaging in the processing software, performed in coordination with the theoretically estimated and measured OF velocity. The article presents results of testing such algorithms on real video sequences obtained in flights with different approaches to the runway, with simultaneous recording of telemetry and video data.
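The following is a minimal sketch of the resolution-reduction idea: optical flow computed on frames downsampled by pixel averaging, with the mean pixel flow converted to a metric ground velocity. The downsampling rule, focal length, frame rate, and nadir-camera assumption are illustrative choices, not the article's specific algorithm.

```python
# Minimal sketch: dense optical flow between two frames with resolution reduced by
# pixel averaging, and conversion of the mean pixel flow to a metric ground velocity.
# The averaging rule, focal length, frame rate, and nadir-camera model are assumptions.
import cv2
import numpy as np

def downsample_by_averaging(gray, factor):
    """Reduce resolution by averaging factor x factor pixel blocks."""
    if factor <= 1:
        return gray
    h, w = gray.shape
    h, w = (h // factor) * factor, (w // factor) * factor
    blocks = gray[:h, :w].reshape(h // factor, factor, w // factor, factor)
    return blocks.mean(axis=(1, 3)).astype(np.uint8)

def mean_metric_velocity(prev_gray, curr_gray, altitude_m, focal_px, fps, factor):
    """Mean ground-plane velocity (m/s) from Farneback optical flow, nadir camera assumed."""
    p = downsample_by_averaging(prev_gray, factor)
    c = downsample_by_averaging(curr_gray, factor)
    # Arguments: pyr_scale=0.5, levels=3, winsize=15, iterations=3, poly_n=5, poly_sigma=1.2.
    flow = cv2.calcOpticalFlowFarneback(p, c, None, 0.5, 3, 15, 3, 5, 1.2, 0)
    mean_flow_px = flow.reshape(-1, 2).mean(axis=0) * factor      # back to full-frame pixels
    # Pinhole model: ground displacement per frame ~ pixel shift * altitude / focal length.
    return mean_flow_px * altitude_m / focal_px * fps

def averaging_factor(altitude_m, reference_altitude_m=10.0):
    """Grow the averaging factor with altitude so the apparent pixel velocity stays
    within the range the flow estimator handles reliably (assumed rule of thumb)."""
    return max(1, int(round(altitude_m / reference_altitude_m)))
```

In this sketch the averaging factor plays the role of the camera-control variable that the abstract says must be coordinated with the theoretically estimated and measured OF velocity when no altimeter reading is available.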