The article shows that the problem of increasing the noise immunity (interference resistance and secrecy of functioning) of ICS can be solved by using systems of nonlinear signals with improved ensemble, structural and correlation properties. Two classes of nonlinear complex discrete signals are considered: characteristic discrete signals (CDS) and cryptographic signals (CS). Methods for the synthesis of these signals are presented. The paper describes a statistical simulation model for studying the noise immunity of various classes of signals in a Gaussian channel. Using this model, estimates of the dependence of the error probability on the signal-to-noise ratio were obtained for several classes of signals, namely CDS, CS and standard BPSK and AFM-16 signals. It is shown that for a signal-to-noise ratio of 10 the error probability is 4.6875e-06 for CDS, 3.515625e-06 for CS, and 0.002025 for AFM-16. Thus, the use of nonlinear complex discrete signals, in particular CDS and CS, can significantly increase the noise immunity of signal reception in modern ICS. At the same time, given the improved ensemble and structural properties of these nonlinear signals, the crypto- and imitation-resistance indicators of the systems' operation can also be improved significantly.
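For illustration, the following minimal Python sketch shows the generic Monte-Carlo approach behind such a statistical simulation: transmit known bits, add white Gaussian noise, demodulate and count errors. It uses plain BPSK only; the CDS and CS signal classes, their synthesis methods and the paper's actual model parameters are not reproduced here, and the function name is a placeholder.

```python
# Minimal Monte-Carlo sketch of a BER-vs-SNR experiment in an AWGN channel.
# This is NOT the authors' simulation model; it only illustrates the generic
# approach (send known symbols, add Gaussian noise, count errors) for BPSK.
import numpy as np

def ber_bpsk_awgn(snr_db: float, n_bits: int = 1_000_000, seed: int = 0) -> float:
    """Estimate the bit error probability of BPSK over an AWGN channel."""
    rng = np.random.default_rng(seed)
    bits = rng.integers(0, 2, n_bits)          # random information bits
    symbols = 2 * bits - 1                     # BPSK mapping: 0 -> -1, 1 -> +1
    snr_lin = 10 ** (snr_db / 10)              # Eb/N0 in linear scale
    noise_std = np.sqrt(1 / (2 * snr_lin))     # noise sigma for unit bit energy
    received = symbols + noise_std * rng.standard_normal(n_bits)
    decisions = (received > 0).astype(int)     # hard-decision demodulation
    return np.mean(decisions != bits)          # empirical error probability

if __name__ == "__main__":
    for snr in (0, 2, 4, 6, 8, 10):
        print(f"SNR = {snr:2d} dB  ->  BER ~ {ber_bpsk_awgn(snr):.2e}")
```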
Information systems in general, and databases in particular, are vulnerable to accidental or malicious attacks aimed at compromising data integrity. Security is easier to achieve when there is a clear model that formally expresses the security policy. The paper explores known security models related to data integrity, as well as their applicability and significance for databases. The analysis of formal models for ensuring data integrity shows that each of them, having certain advantages and disadvantages, has its place. The decisive factor is the assessment of a specific situation, which makes it possible to make the right choice, including the combined application of several models. In this regard, the paper notes that the Clark-Wilson model, whose undoubted advantages are its simplicity and its ease of joint use with other security models, is best applied as a set of practical recommendations for building an integrity assurance system in information systems. While stating that traditional DBMSs support many of the mechanisms of the Clark-Wilson model, the article points out that implementations based on standard SQL require some compromise solutions. Analyzing the Biba model, the paper concludes that it is relatively simple and relies on a well-studied mathematical apparatus. It is noted that in practice, to create secure information systems that ensure both confidentiality and data integrity, it is important to combine the Bell-LaPadula and Biba models. Moreover, this combination should be built on one common lattice but with two security labels (confidentiality and integrity) whose orderings have the opposite character. It is exactly this variant of combining the Bell-LaPadula and Biba models that is recommended for modern information systems and DBMSs where a mandatory security policy is implemented.
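As a rough illustration of the combination described above, the sketch below checks read and write access against two labels (confidentiality and integrity) defined on one common lattice with opposite orderings. The lattice levels, label names and helper functions are illustrative assumptions, not part of the models' formal definitions or of any DBMS API.

```python
# A minimal sketch of combined Bell-LaPadula (confidentiality) and Biba
# (integrity) mandatory checks. The lattice and the labels are assumptions
# made for illustration only.
from dataclasses import dataclass

LEVELS = {"low": 0, "medium": 1, "high": 2}   # one common linear lattice

@dataclass(frozen=True)
class Label:
    confidentiality: str   # Bell-LaPadula label
    integrity: str         # Biba label (ordered in the opposite direction)

def can_read(subject: Label, obj: Label) -> bool:
    # Bell-LaPadula "no read up" + Biba "no read down"
    return (LEVELS[subject.confidentiality] >= LEVELS[obj.confidentiality]
            and LEVELS[subject.integrity] <= LEVELS[obj.integrity])

def can_write(subject: Label, obj: Label) -> bool:
    # Bell-LaPadula "no write down" + Biba "no write up"
    return (LEVELS[subject.confidentiality] <= LEVELS[obj.confidentiality]
            and LEVELS[subject.integrity] >= LEVELS[obj.integrity])

if __name__ == "__main__":
    analyst = Label(confidentiality="high", integrity="medium")
    report  = Label(confidentiality="medium", integrity="high")
    print(can_read(analyst, report))   # True: both read rules are satisfied
    print(can_write(analyst, report))  # False: blocked by both write rules
```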
The subject matter of the paper is the development of fingerprint local structures based on a new method of minutia vicinity decomposition (MVD) for the task of fingerprint verification. The task is essential because attempts are being made to introduce biometric technology into different areas of social and state life: criminology, access control systems, mobile device applications and banking. The goal is to develop real-number vectors that satisfy the criteria for biometric template protection schemes, such as irreversibility, with the corresponding accuracy expressed as the equal error rate (EER). The problem to be solved is the accuracy of verification, since there are false minutiae, disappearing true minutiae, and also linear and angular deformations. The method is a new variant of MVD that uses graphs with the number of points ranging from 7 down to 3. This decomposition scheme is shown in the paper; such a variant of decomposition has not been used in scientific articles before. The following results were obtained: a description of a new method for fingerprint verification. A new metric for creating vectors of real numbers is suggested: the minimal path over the points of a graph. An algorithm for finding minimal paths over the points of a graph is also proposed, because the classical algorithm fails in some cases when the number of points is 6; these problems are crossing arcs and arcs excluded from the path. A way of resolving such problems is suggested, and examples are given for the number of points equal to 20. Results for the false rejection rate (FRR), the false acceptance rate (FAR) and the EER are presented in the paper. The obtained EER is 33 % with a full search; 78,400 false and 1,400 true tests were conducted. The method does not use metrics such as distances and angles, which are used in the classical MVD method and will be used in future papers. This result is obtained for total coincidences of real numbers, not for the similarity measure usually used in verification. It is a good result in this case, because the corresponding result for the index-of-max method is 40 %.
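As a side illustration of the reported error metrics, the sketch below shows one common way FAR, FRR and the EER can be estimated from genuine and impostor comparison scores. The scores here are synthetic placeholders and the thresholded-score setup is only an assumption, since the paper itself counts total coincidences of real numbers rather than similarity scores.

```python
# Minimal sketch of FAR/FRR/EER estimation from comparison scores.
# The score distributions are synthetic; only the test counts (78,400 false,
# 1,400 true) are taken from the abstract.
import numpy as np

def far_frr(genuine: np.ndarray, impostor: np.ndarray, threshold: float):
    """FRR: genuine attempts rejected; FAR: impostor attempts accepted."""
    frr = np.mean(genuine < threshold)
    far = np.mean(impostor >= threshold)
    return far, frr

def equal_error_rate(genuine: np.ndarray, impostor: np.ndarray) -> float:
    """Scan thresholds and return the error rate where FAR and FRR are closest."""
    lo = min(genuine.min(), impostor.min())
    hi = max(genuine.max(), impostor.max())
    best_gap, best_eer = None, None
    for t in np.linspace(lo, hi, 1000):
        far, frr = far_frr(genuine, impostor, t)
        gap = abs(far - frr)
        if best_gap is None or gap < best_gap:
            best_gap, best_eer = gap, (far + frr) / 2
    return best_eer

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    genuine = rng.normal(0.7, 0.1, 1_400)     # synthetic "true test" scores
    impostor = rng.normal(0.4, 0.1, 78_400)   # synthetic "false test" scores
    print(f"EER ~ {equal_error_rate(genuine, impostor):.3f}")
```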