Abstract. Finding the smallest eigenvalue of a given square matrix A of order n is a computationally very intensive problem. The most popular method for this problem is the Inverse Power Method, which uses LU decomposition and forward and backward solving of the factored system at every iteration step. An alternative to this method is the Resolvent Monte Carlo method, which represents the resolvent matrix [I − qA]^(−m) as a series and then performs Monte Carlo iterations (random walks) on the elements of…
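The Inverse Power Method mentioned in the abstract can be sketched briefly. The matrix `A` below is purely illustrative; the key point, as the abstract notes, is that one LU factorization is computed up front and each iteration then costs only a forward and backward solve:

```python
import numpy as np
from scipy.linalg import lu_factor, lu_solve

def inverse_power_method(A, iters=50, seed=0):
    """Estimate the smallest-magnitude eigenvalue of A by iterating
    with A^{-1}, reusing a single LU factorization of A."""
    n = A.shape[0]
    lu, piv = lu_factor(A)          # one LU decomposition up front
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(n)
    x /= np.linalg.norm(x)
    for _ in range(iters):
        y = lu_solve((lu, piv), x)  # forward/backward solve per step
        x = y / np.linalg.norm(y)
    # Rayleigh quotient of the converged vector estimates the eigenvalue
    return x @ A @ x

# Small illustrative symmetric matrix (not from the paper)
A = np.diag([5.0, 3.0, 1.0]) + 0.01 * np.ones((3, 3))
print(inverse_power_method(A))  # close to the smallest eigenvalue of A
```

The Resolvent Monte Carlo alternative replaces the per-iteration solves with random walks on the matrix elements, which is what makes it attractive when factorization is too expensive.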
“…Monte Carlo methods are an established technique for solving problems where the unknown quantity can be represented as the mathematical expectation of some random variable, so that by sampling this variable one can obtain estimates of the true value by averaging. In the domain of linear algebra, Monte Carlo methods have been used for solving linear systems, estimating eigenvalues, and inverting matrices (see, e.g., Alexandrov, 2005; Dimov, 1999). In machine learning they have wide applicability, since the problems at hand are inherently stochastic.…”
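The core idea in the snippet above, writing the unknown as an expectation and averaging samples, can be illustrated with a toy example (not taken from the cited works): the fraction of uniform points in the unit square that fall inside the quarter disc has expectation π/4, so averaging indicator samples estimates π.

```python
import numpy as np

def mc_pi(n_samples=100_000, seed=42):
    """Monte Carlo estimate of pi: E[indicator(point in quarter disc)]
    equals pi/4, so the sample mean times 4 estimates pi."""
    rng = np.random.default_rng(seed)
    pts = rng.random((n_samples, 2))
    inside = (pts ** 2).sum(axis=1) <= 1.0
    return 4.0 * inside.mean()

print(mc_pi())  # approximately pi, with O(N^(-1/2)) statistical error
```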
Support Vector Machines (SVMs) are a widely used tool in Machine Learning. They have some important advantages over the more popular Deep Neural Networks. For the problem of image classification, multiple SVMs may be used, and finding the best hyperparameters adds complexity and increases the overall computational time required. Our goal is to develop and study Monte Carlo algorithms that allow faster discovery of good hyperparameters and faster training of the SVMs, without negatively impacting the final accuracy of the models. We also employ GPUs and parallel computing in order to make good use of the available hardware. In this paper we describe our methods, provide implementation details, and present numerical results obtained on the publicly available Architectural Heritage Elements image Dataset.
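Monte Carlo hyperparameter discovery for an SVM can be sketched as a random search over the (C, gamma) space. This is a hypothetical illustration using scikit-learn and a synthetic dataset, not the authors' actual pipeline or the AHE data:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

# Synthetic stand-in for image features (the paper uses the AHE dataset)
X, y = make_classification(n_samples=300, n_features=20, random_state=0)

rng = np.random.default_rng(0)
best_score, best_params = -np.inf, None
for _ in range(10):  # Monte Carlo sampling of the hyperparameter space
    C = 10.0 ** rng.uniform(-2, 3)      # log-uniform over [1e-2, 1e3]
    gamma = 10.0 ** rng.uniform(-4, 0)  # log-uniform over [1e-4, 1]
    score = cross_val_score(SVC(C=C, gamma=gamma), X, y, cv=3).mean()
    if score > best_score:
        best_score, best_params = score, {"C": C, "gamma": gamma}

print(best_params, round(best_score, 3))
```

Sampling on a log scale reflects the usual practice that good values of C and gamma vary over orders of magnitude; each draw is an independent trial, so the search parallelizes naturally across GPUs or cores.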
“…There has been renewed interest in MCMs in recent times, for example [18,11,12,10,7,13,1,2,3,9,8]. The primary reason for this is the efficiency of parallel MCMs in the presence of high communication costs.…”
We present quasi-Monte Carlo analogs of Monte Carlo methods for some linear algebra problems: solving systems of linear equations, computing extreme eigenvalues, and matrix inversion. Reformulating the problems as solving integral equations with special kernels and domains permits us to analyze the quasi-Monte Carlo methods with bounds from numerical integration. Standard Monte Carlo methods for integration provide a convergence rate of O(N^(−1/2)) using N samples. Quasi-Monte Carlo methods use quasirandom sequences, with a resulting convergence rate for numerical integration as good as O((log N)^k N^(−1)). We show, both theoretically and through numerical tests, that the use of quasirandom sequences improves both the magnitude of the error and the convergence rate of the considered Monte Carlo methods. We also analyze the complexity of the considered quasi-Monte Carlo algorithms and compare it to the complexity of the analogous Monte Carlo and deterministic algorithms.
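The convergence-rate contrast in the abstract can be seen on a toy integral. The integrand below is illustrative (∫ x₀x₁ dx = 1/4 on the unit square); SciPy's Sobol generator supplies the quasirandom points:

```python
import numpy as np
from scipy.stats import qmc

def integrand(x):
    # Smooth test integrand on [0,1]^2 with known integral 1/4
    return x[:, 0] * x[:, 1]

n = 1024  # power of two, as Sobol sequences prefer

# Pseudorandom (Monte Carlo) estimate: error ~ O(N^(-1/2))
rng = np.random.default_rng(0)
mc_est = integrand(rng.random((n, 2))).mean()

# Quasirandom (Sobol) estimate: error ~ O((log N)^k N^(-1))
sobol = qmc.Sobol(d=2, scramble=False)
qmc_est = integrand(sobol.random_base2(m=10)).mean()  # 2^10 = 1024 points

# The Sobol error is typically far smaller at the same sample count
print(abs(mc_est - 0.25), abs(qmc_est - 0.25))
```

The low-discrepancy structure of the Sobol points is what buys the faster rate; the same substitution underlies the quasi-Monte Carlo linear algebra methods once the problems are recast as integration.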