We propose a new approach to SAT solving which solves SAT problems in vector spaces as cost minimization of a differentiable cost function J_sat. In our approach, a solution (satisfying assignment) of a SAT problem in n variables is represented by a binary vector u ∈ {0, 1}^n such that J_sat(u) = 0. We search for such a u in the vector space R^n by cost minimization, i.e., starting from an initial point u_0, we iteratively update u by Newton's method until J_sat reaches zero. We implemented our approach as MatSat, an incomplete, matrix-based differentiable SAT solver. Whereas existing mainstream SAT solvers, whether of the conflict-driven clause learning (CDCL) type or the stochastic local search (SLS) type, decide each bit of a solution assignment one by one, MatSat fundamentally differs from them in that it updates all variables at once and continuously approaches a solution in a vector space. We conducted experiments measuring the scalability of MatSat on random 3-SAT problems. These experiments showed, for example, that MatSat implemented on a GPU can solve problems with n = 3 × 10^5 variables, demonstrating the feasibility of GPU hardware acceleration for matrix-based solvers like MatSat. We also compared MatSat with nine state-of-the-art CDCL and SLS SAT solvers in terms of execution time on several random and non-random data sets. On easy random SAT instances, MatSat's performance falls between that of the SLS solvers and the CDCL solvers, whereas it ranks first on the difficult ones. On the other hand, MatSat performed poorly on non-random SAT problems. To improve this, we introduced weighted variables and clauses and confirmed the effectiveness of the weighted version of MatSat on non-random SAT.
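To make the idea concrete, the following is a minimal sketch of differentiable SAT solving in the spirit described above. The abstract does not define J_sat, so this sketch assumes a common product-form relaxation (a clause's cost is the product of its literals' falsity values, so J_sat(u) = 0 exactly when every clause contains a fully-true literal) and uses plain gradient descent rather than the paper's Newton updates; the function names and clause encoding are illustrative, not MatSat's actual API.

```python
import numpy as np

def literal_values(u, clause):
    # Relaxed truth value of each literal in [0, 1]:
    # u[i] for a positive literal x_i, 1 - u[i] for a negative literal.
    return [u[i] if pos else 1.0 - u[i] for i, pos in clause]

def J_sat(u, clauses):
    # Assumed product-form cost: a clause contributes prod(1 - literal value),
    # which is 0 iff at least one literal is fully true. J_sat(u) = 0 iff
    # u encodes a satisfying assignment.
    return sum(np.prod([1.0 - v for v in literal_values(u, clause)])
               for clause in clauses)

def grad_J(u, clauses):
    # Analytic gradient of the product-form cost above.
    g = np.zeros_like(u)
    for clause in clauses:
        vals = literal_values(u, clause)
        for k, (i, pos) in enumerate(clause):
            rest = np.prod([1.0 - v for j, v in enumerate(vals) if j != k])
            # d(1 - value)/du[i] is -1 for a positive literal, +1 for a negative one.
            g[i] += -rest if pos else rest
    return g

def solve(clauses, n, steps=500, lr=0.5, seed=0):
    # All variables are updated at once, continuously, in R^n
    # (clipped to the unit cube), then rounded to {0, 1}^n.
    rng = np.random.default_rng(seed)
    u = rng.uniform(0.3, 0.7, n)  # initial point near the centre of [0, 1]^n
    for _ in range(steps):
        if J_sat(u, clauses) < 1e-6:
            break
        u = np.clip(u - lr * grad_J(u, clauses), 0.0, 1.0)
    return u > 0.5
```

A clause here is a list of (variable index, polarity) pairs, e.g. (x1 ∨ x2 ∨ ¬x3) becomes `[(0, True), (1, True), (2, False)]`. Newton's method, as used in the paper, would additionally exploit second-order information for faster convergence; the rounding step is what makes this an incomplete solver, since failure to reach J_sat = 0 proves nothing about unsatisfiability.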