Relying on fixed-point techniques, Mahey, Oualibouch and Tao introduced in [11] the scaled proximal decomposition on the graph of a maximal monotone operator (SPDG) algorithm and analyzed its performance on inclusion problems with strongly monotone and Lipschitz continuous operators. The SPDG algorithm generalizes Spingarn's partial inverse method by allowing scaling factors, a key strategy for speeding up the convergence of numerical algorithms.

In this note, we show that the SPDG algorithm can alternatively be analyzed within Spingarn's original partial inverse framework, dating back to Spingarn's 1983 paper. We simply show that, under the assumptions considered in [11], the Spingarn partial inverse of the underlying maximal monotone operator is strongly monotone, which allows one to employ recent results on the convergence and iteration complexity of proximal point type methods for strongly monotone operators. In doing so, we additionally obtain a potentially faster convergence rate for the SPDG algorithm and a more accurate upper bound on the number of iterations needed to achieve prescribed tolerances, especially on ill-conditioned problems.
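
For context, we recall the standard definition of Spingarn's partial inverse, in notation of our own choosing since none is fixed in this section: given a maximal monotone operator $T$ on a real Hilbert space $H$ and a closed subspace $V \subseteq H$, with $P_V$ and $P_{V^{\perp}}$ denoting the orthogonal projections onto $V$ and its orthogonal complement, the partial inverse $T_V$ is defined by
\[
  v \in T_V(u)
  \quad\Longleftrightarrow\quad
  P_V v + P_{V^{\perp}} u \;\in\; T\bigl(P_V u + P_{V^{\perp}} v\bigr),
  \qquad u, v \in H.
\]
In other words, the graph of $T_V$ is obtained from the graph of $T$ by swapping the $V^{\perp}$-components of points and images; it is the strong monotonicity of this operator, under the assumptions of [11], that underlies the analysis outlined above.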