This paper presents improvements in image gap restoration through the incorporation of edge-based directional interpolation within multi-scale pyramid transforms. Two types of image edges are reconstructed: (a) local edges or textures, inferred from the gradients of neighbouring pixels, and (b) global edges between image objects or segments, inferred using a Canny detector. Through a process of pyramid transformation and downsampling, the image is progressively transformed into a series of reduced-size layers until, at the pyramid apex, the gap size is one sample. At each layer an edge 'skeleton' image is extracted for edge-guided interpolation. The process is then reversed; from the apex, at each layer, the missing samples are estimated (an iterative method is used in the last stage of up-sampling), up-sampled and combined with the available samples of the next layer. The discrete cosine transform and a family of discrete wavelet transforms are utilized as alternatives for pyramid construction. Evaluations over a range of images, with regular and random loss patterns at loss rates of up to 40%, demonstrate that the proposed method improves PSNR by 1 to 5 dB compared with a range of the best published works.

Index Terms - Error concealment, multi-scale DCT/DWT pyramid, edge detection, image gap recovery, packet loss concealment.

I. INTRODUCTION

Image gap restoration has a wide range of applications that include in-painting of missing or damaged segments in still images and the replacement of image data packets lost in transmission. Further examples of applications and environments where image gap restoration can be usefully applied are enhancement of distorted biomedical signals [1], restoration of damaged archived images [2] and packet loss concealment over internet protocol (IP) networks [3]. A main current application of image gap restoration is packet loss concealment. Packet loss errors may occur due to network congestion or due to signal loss in mobile devices.
IP networks are best-effort environments [4,5] in which packet delivery is not guaranteed. The rapid growth in demand for relatively high-bandwidth image/video streaming applications over IP networks motivates the need for packet loss recovery and concealment in order to provide more reliable network services and a more acceptable user experience [6]. There are three broad approaches for mitigating the loss of quality in received images due to packet loss: (a) automatic repeat request (ARQ) for retransmission of the lost packets, (b) error control via forward error correction (FEC) methods, and (c) error concealment (EC) methods. The first method retransmits a copy of the damaged/lost packet and results in an increase in bandwidth and delay proportional to the error rate [5]. It can be used only in networks where there is interaction between sender and receiver, so that retransmission can be requested. The second category of methods, FEC, employs error correction coding to recover lost pixels from the received information. This implies that the pixel values in successive...
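The coarse-to-fine recovery procedure described in the abstract above can be illustrated with a simplified sketch. The code below is a hypothetical stand-in, not the authors' DCT/DWT pyramid with edge-guided interpolation: it halves the image by 2x2 averaging of the known samples until the gap vanishes, then propagates coarse estimates back down to fill the missing samples. It assumes power-of-two image dimensions and uses plain block averaging in place of the paper's transforms.

```python
import numpy as np

def fill_gap_pyramid(img, mask):
    """Coarse-to-fine gap filling (mask True = missing sample).
    Assumes power-of-two dimensions; a simplified stand-in for the
    paper's DCT/DWT pyramid, not the authors' actual method."""
    levels = [(img.astype(float), mask.astype(bool))]
    # Downward pass: 2x2-average only the known samples at each level,
    # until no gaps remain or the image shrinks to a single sample.
    while levels[-1][1].any() and min(levels[-1][0].shape) > 1:
        im, m = levels[-1]
        h, w = im.shape[0] // 2, im.shape[1] // 2
        known = ~m
        s = (im * known).reshape(h, 2, w, 2).sum(axis=(1, 3))
        c = known.reshape(h, 2, w, 2).sum(axis=(1, 3))
        levels.append((np.where(c > 0, s / np.maximum(c, 1), 0.0), c == 0))
    # Apex: fill any residual gap with the mean of the known samples.
    im, m = levels[-1]
    if m.any():
        im[m] = im[~m].mean()
    # Upward pass: fill each finer level's gaps from the coarser estimate.
    for finer, fm in reversed(levels[:-1]):
        up = np.repeat(np.repeat(im, 2, axis=0), 2, axis=1)
        finer = finer.copy()
        finer[fm] = up[fm]
        im = finer
    return im
```

In the real method, the edge 'skeleton' extracted at each layer would steer the interpolation direction instead of the isotropic block average used here.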
The diversity of data types, user needs, and application problems has led to a range of methods for discovering patterns and dependency relationships. Association rule mining identifies which sets of objects influence other sets of objects: association rules predict the occurrence of an object based on the occurrence of other objects. Associative algorithms face the challenge of redundant association rules and patterns, and a study of the various association rule methods shows that recent research has focused on solving the challenges of tree and lattice structures and their combinations in association algorithms. In this paper, associative algorithms and their operation are described, and finally the new improved association algorithms and the proposed solutions to these challenges are explained.
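As a minimal illustration of the rule-mining idea described above (a generic Apriori-style sketch, not any of the improved algorithms surveyed), the following code counts frequent itemsets level by level and derives rules that predict one set of items from another:

```python
from itertools import combinations

def frequent_itemsets(transactions, min_support):
    """Apriori-style level-wise search: an itemset is kept only if it
    appears in at least min_support transactions."""
    items = {frozenset([i]) for t in transactions for i in t}
    level = {s for s in items
             if sum(s <= t for t in transactions) >= min_support}
    freq, k = {}, 1
    while level:
        for s in level:
            freq[s] = sum(s <= t for t in transactions)
        k += 1
        # Candidate k-itemsets are unions of frequent (k-1)-itemsets.
        cands = {a | b for a in level for b in level if len(a | b) == k}
        level = {c for c in cands
                 if sum(c <= t for t in transactions) >= min_support}
    return freq

def rules(freq, min_conf):
    """Emit rules A -> B with confidence = support(A|B) / support(A)."""
    out = []
    for s, sup in freq.items():
        for r in range(1, len(s)):
            for a in map(frozenset, combinations(s, r)):
                if a in freq and sup / freq[a] >= min_conf:
                    out.append((set(a), set(s - a), sup / freq[a]))
    return out
```

For example, over the baskets `[{'bread','milk'}, {'bread','butter'}, {'bread','milk','butter'}, {'milk'}]` with `min_support=2` and `min_conf=1.0`, the only rule produced is butter -> bread, since every transaction containing butter also contains bread. The redundancy challenge mentioned above arises because the number of candidate rules grows combinatorially with itemset size.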
Clustering is one of the data mining methods. All clustering algorithms aim to minimize intra-cluster distances and maximize inter-cluster distances; the better an algorithm achieves this goal, the better its performance. Although much research has been done in the field of clustering algorithms, they still face challenges such as processing time, scalability, and accuracy. A comparison of the various clustering methods shows that recent research has focused on solving the clustering challenges of the partition method. In this paper, the partitioning clustering method is introduced, the procedure of the clustering algorithms is described, and finally the new improved methods and the proposed solutions to these challenges are explained.
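The partition-based objective described above is exemplified by k-means, the canonical partitioning algorithm; the sketch below is a plain textbook version (not one of the improved methods the paper surveys), alternating between assigning points to the nearest centroid and recomputing each centroid as its cluster mean:

```python
import random

def kmeans(points, k, iters=100, seed=0):
    """Plain k-means on tuples of floats: alternately assign each
    point to its nearest centroid, then recompute each centroid as
    the mean of its cluster, until assignments stabilize."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    clusters = [[] for _ in range(k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            j = min(range(k),
                    key=lambda i: sum((a - b) ** 2
                                      for a, b in zip(p, centroids[i])))
            clusters[j].append(p)
        # Empty clusters keep their previous centroid.
        new = [tuple(sum(c) / len(c) for c in zip(*cl)) if cl else centroids[i]
               for i, cl in enumerate(clusters)]
        if new == centroids:
            break
        centroids = new
    return centroids, clusters
```

The scalability and accuracy challenges mentioned above show up directly here: each iteration costs O(n·k) distance computations, and the result depends on the random initial centroids.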
In medical science, collecting and classifying data from various diseases is a vital task. Confused and large amounts of data are a problem that prevents us from achieving acceptable results. One of the major problems for diabetic patients is failure to properly diagnose the disease. As a result of misdiagnosis or failure of early diagnosis, the patient may suffer complications such as blindness, kidney failure, and amputation of the toes. Nowadays, doctors diagnose the disease by relying on their experience and knowledge and by performing complex and time-consuming tests. One of the problems with current diabetes diagnostic methods is the lack of appropriate features for diagnosing the disease, and consequently weakness in its diagnosis, especially in its early stages. Since diabetes diagnosis relies on large amounts of data with many parameters, it is necessary to use machine learning methods such as the support vector machine (SVM) to predict the complications of diabetes. One of the disadvantages of SVM is its parameter adjustment, which can be accomplished using metaheuristic algorithms such as particle swarm optimization (PSO), the genetic algorithm, or the grey wolf optimizer (GWO). In this paper, after preprocessing and preparing the dataset for data mining, we use SVM to predict complications of diabetes based on selected parameters of a patient, acquired by laboratory tests, using an improved GWO. We improve the selection process of GWO by employing a dynamic adaptive middle filter, a nonlinear filter that assigns an appropriate weight to each value based on the data value. Comparison of the final results of the proposed algorithm with classification methods such as a multilayer perceptron neural network, decision tree, naive Bayes, and a temporal fuzzy min-max neural network (TFMM-PSO) shows the superiority of the proposed method over comparable ones.
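To make the SVM-tuning role of GWO concrete, here is a hedged sketch of the plain grey wolf optimizer (not the improved variant with the dynamic adaptive filter proposed in the paper). In practice the objective would be SVM cross-validation error over hyperparameters such as (C, gamma); the sketch minimizes a toy quadratic standing in for that objective, and all names are illustrative:

```python
import random

def gwo(objective, bounds, wolves=20, iters=100, seed=0):
    """Plain grey wolf optimizer: every wolf is pulled toward the three
    current best wolves (alpha, beta, delta). The control parameter `a`
    decays from 2 to 0, shifting the search from exploration to
    exploitation. Returns the best position found."""
    rng = random.Random(seed)
    dim = len(bounds)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(wolves)]
    for t in range(iters):
        pop.sort(key=objective)
        # Copy the leaders so updating the pack does not move them mid-step.
        alpha, beta, delta = (list(pop[0]), list(pop[1]), list(pop[2]))
        a = 2 - 2 * t / iters
        for w in pop:
            for d in range(dim):
                x = 0.0
                for leader in (alpha, beta, delta):
                    A = a * (2 * rng.random() - 1)
                    C = 2 * rng.random()
                    x += leader[d] - A * abs(C * leader[d] - w[d])
                lo, hi = bounds[d]
                w[d] = min(max(x / 3, lo), hi)  # average pull, clipped
    return min(pop, key=objective)
```

To tune an SVM, `objective` would wrap a cross-validated fit (e.g. over log-scaled C and gamma bounds); the paper's contribution replaces this plain position update with one weighted by its dynamic adaptive filter.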