With increasing manufacturing and environmental uncertainties in the nanometer regime, testing digital chips under different operating conditions has become mandatory. Traditionally, stuck-at tests were applied at slow speed to detect structural defects, while transition fault tests were applied at-speed to detect delay defects. Recently, it was shown that certain cell-internal defects can only be detected by at-speed stuck-at testing. Stuck-at test patterns are power hungry, causing excessive voltage droop on the power grid, delaying test responses, and ultimately leading to false delay failures on the tester. This motivates the need for peak power minimization during at-speed stuck-at testing. In this paper, we use input toggle minimization as a means of minimizing the circuit's power dissipation during at-speed stuck-at testing under the CSP-scan DFT scheme. For circuits whose test sets are dominated by don't cares, we map the problem of optimal X-filling for peak input toggle minimization to a variant of the interval coloring problem and propose a dynamic programming algorithm (DP-fill) for it, together with a proof of its optimality. For circuits whose test sets are not dominated by don't cares, we propose a max-scatter Hamiltonian path algorithm, which orders the test cubes so that the don't cares are evenly distributed in the final ordering, thereby yielding greater input toggle savings than DP-fill. The proposed algorithms, when evaluated on the ITC99 benchmarks, produced peak power savings of up to 48% over the best known algorithms in the literature. We have also refined the solutions thus obtained using greedy and simulated annealing strategies with an iterative 1-bit neighborhood, validating our idea of optimal input toggle minimization as an effective technique for minimizing peak power dissipation during at-speed stuck-at testing.