Objective: Prediction of moving objects with uncertain motion patterns is rapidly emerging as an exciting new paradigm and is important for law-enforcement applications such as criminal tracking analysis. However, existing prediction algorithms for spatio-temporal databases focus on discovering frequent trajectory patterns from historical data. Moreover, these methods overlook the effect of important factors such as speed and moving direction, which limits their generality, as moving objects may follow dynamic motion patterns in real life. Methods: We propose a framework for predicting uncertain trajectories in moving-objects databases. Based on Continuous Time Bayesian Networks (CTBNs), we develop a trajectory prediction algorithm called PutMode (Prediction of uncertain trajectories in Moving objects databases). It comprises three phases: (i) construction of TCTBNs (Trajectory CTBNs), which obey the Markov property and consist of states combining three important variables: street identifier, speed, and direction; (ii) trajectory clustering to remove outlying trajectories; (iii) prediction of the motion behaviors of moving objects to obtain the possible trajectories based on TCTBNs. Results: Experimental results show that PutMode can predict the possible motion curves of objects accurately and efficiently on distinct trajectory data sets, with an average accuracy higher than 80%. Furthermore, we illustrate the crucial role of trajectory clustering, which benefits both prediction time and prediction accuracy.
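The abstract describes states built from street identifier, speed, and direction, with transitions obeying the Markov property. A minimal sketch of that idea is shown below; the state tuples, function names, and greedy most-frequent-transition rule are illustrative assumptions, not the paper's actual TCTBN learning or inference procedure.

```python
from collections import defaultdict

# Hypothetical state: (street_id, speed_level, direction).
# These labels are illustrative; the paper's states are richer.

def learn_transitions(trajectories):
    """Count state-to-state transitions from historical trajectories
    (a crude stand-in for TCTBN parameter learning)."""
    counts = defaultdict(lambda: defaultdict(int))
    for traj in trajectories:
        for cur, nxt in zip(traj, traj[1:]):
            counts[cur][nxt] += 1
    return counts

def predict_path(counts, start, steps):
    """Greedily follow the most frequent outgoing transition.
    Markov property: the next state depends only on the current one."""
    path = [start]
    cur = start
    for _ in range(steps):
        successors = counts.get(cur)
        if not successors:
            break  # no historical data beyond this state
        cur = max(successors, key=successors.get)
        path.append(cur)
    return path
```

For example, if two historical trajectories continue from street s1 to s2 heading north at low speed and only one turns east, the greedy predictor follows the majority transition.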
Purpose: This study aimed to quantitatively assess fundus tessellated density (FTD) and its associated factors on the basis of fundus photographs using artificial intelligence. Methods: A detailed examination of 3468 individuals was performed. The proposed method for FTD measurement consists of image preprocessing, sample labeling, a deep learning segmentation model, and FTD calculation. Fundus tessellation was extracted as the region of interest, and FTD was then obtained by calculating the average exposed choroid area per unit area of fundus. In addition, univariate and multivariate linear regression analyses were conducted for the statistical analysis. Results: The mean FTD was 0.14 ± 0.08 (median, 0.13; range, 0–0.39). In multivariate analysis, FTD was significantly (P < 0.001) associated with thinner subfoveal choroidal thickness, longer axial length, larger parapapillary atrophy, older age, male sex, and lower body mass index. Correlation analysis suggested that FTD increased by 33.1% (r = 0.33, P < 0.001) for each decade of life. Correlation analysis also indicated a negative correlation between FTD and spherical equivalent (SE) in the myopic participants (r = −0.25, P < 0.001), and no correlation between FTD and SE in the hypermetropic and emmetropic participants. Conclusions: It is feasible and efficient to extract FTD information from fundus images by artificial intelligence-based image processing. FTD can be widely used in population screening as a new quantitative biomarker for subfoveal choroidal thickness. The association of FTD with pathological myopia and lower visual acuity warrants further investigation. Translational Relevance: Artificial intelligence can extract valuable clinical biomarkers from fundus images and assist in population screening.
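The FTD calculation described above reduces to the exposed-choroid area divided by the fundus area. The sketch below assumes both regions are already available as boolean segmentation masks; the deep-learning segmentation step itself is not reproduced, and the function name is illustrative.

```python
import numpy as np

def fundus_tessellated_density(choroid_mask, fundus_mask):
    """FTD = exposed-choroid area per unit area of fundus.

    choroid_mask, fundus_mask: boolean arrays of the same shape,
    marking exposed-choroid pixels and fundus pixels respectively.
    The masks would come from an upstream segmentation model.
    """
    fundus_area = int(fundus_mask.sum())
    if fundus_area == 0:
        raise ValueError("fundus mask is empty")
    # Only count choroid pixels that lie inside the fundus region.
    exposed_area = int(np.logical_and(choroid_mask, fundus_mask).sum())
    return exposed_area / fundus_area
```

On a 10×10 all-fundus image with 14 exposed-choroid pixels, this yields 0.14, matching the scale of the mean FTD reported in the abstract.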