Scalar-on-function regression, in which the response is scalar valued and the predictors consist of random functions, is one of the most important tools for exploring the functional relationship between a scalar response and functional predictors. The functional partial least-squares method estimates the regression coefficient function more accurately than existing alternatives such as least squares, maximum likelihood, and maximum penalized likelihood. It is typically based on the SIMPLS or NIPALS algorithm, but both can be computationally slow for large datasets. In this study, we propose two modified functional partial least-squares methods that efficiently estimate the regression coefficient function in the scalar-on-function regression model. In the proposed methods, the infinite-dimensional functional predictors are first projected onto a finite-dimensional space using a basis expansion. Two partial least-squares algorithms, based on re-orthogonalization of the score and loading vectors, are then used to estimate the linear relationship between the scalar response and the basis coefficients of the functional predictors. The finite-sample performance and computing speed of the proposed methods are evaluated through a series of Monte Carlo simulation studies and a sugar process dataset.
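
To illustrate the general basis-expansion-then-PLS idea described above (not the re-orthogonalized algorithms proposed here), the following is a minimal Python sketch: simulated curves are projected onto an assumed Fourier basis, and a standard NIPALS-style PLS1 is run on the resulting basis coefficients to recover an estimate of the coefficient function. The helper functions `fourier_basis` and `pls1`, the basis choice, and the simulated data are all illustrative assumptions.

```python
# Minimal sketch of scalar-on-function regression via basis expansion + PLS.
# This is an illustrative assumption of the generic approach, not the authors' method.
import numpy as np

def fourier_basis(t, n_basis):
    """Evaluate a Fourier basis on a grid t assumed to span [0, 1]."""
    cols = [np.ones_like(t)]
    for k in range(1, (n_basis - 1) // 2 + 1):
        cols.append(np.sin(2 * np.pi * k * t))
        cols.append(np.cos(2 * np.pi * k * t))
    return np.column_stack(cols[:n_basis])

def pls1(Z, y, n_comp):
    """NIPALS-style PLS1 for a scalar response; returns the regression vector."""
    Zc, yc = Z - Z.mean(axis=0), y - y.mean()
    W, P, q = [], [], []
    for _ in range(n_comp):
        w = Zc.T @ yc
        w /= np.linalg.norm(w)          # weight vector
        s = Zc @ w                      # score vector
        p = Zc.T @ s / (s @ s)          # loading vector
        q.append(s @ yc / (s @ s))      # response loading
        Zc -= np.outer(s, p)            # deflate predictors
        yc -= q[-1] * s                 # deflate response
        W.append(w); P.append(p)
    W, P, q = np.array(W).T, np.array(P).T, np.array(q)
    return W @ np.linalg.solve(P.T @ W, q)

# Simulated example: n curves observed on a common grid of m points
rng = np.random.default_rng(0)
n, m, n_basis = 100, 101, 7
t = np.linspace(0.0, 1.0, m)
dt = t[1] - t[0]
B = fourier_basis(t, n_basis)                       # m x K basis matrix
X = rng.normal(size=(n, n_basis)) @ B.T             # functional predictors on the grid
beta_true = np.sin(2 * np.pi * t)                   # true coefficient function
y = X @ beta_true * dt + rng.normal(scale=0.1, size=n)

# Step 1: basis coefficients of each curve (least-squares projection)
C = np.linalg.lstsq(B, X.T, rcond=None)[0].T        # n x K coefficient matrix
# Since y_i ~ integral of X_i(t) beta(t) dt ~ C_i^T J d, regress y on Z = C @ J
J = B.T @ B * dt                                    # basis Gram matrix (Riemann approx.)
Z = C @ J

# Step 2: PLS between the scalar response and the transformed basis coefficients
d_hat = pls1(Z, y, n_comp=3)                        # basis coefficients of beta
beta_hat = B @ d_hat                                # estimated coefficient function on the grid
```

Replacing `pls1` with a faster variant (for example, one that re-orthogonalizes score and loading vectors) only changes Step 2; the projection in Step 1 is what reduces the infinite-dimensional problem to a finite multivariate regression.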