The Blum-Kalai-Wasserman (BKW) algorithm was originally proposed for solving the learning parity with noise (LPN) problem and was later adapted by Albrecht et al. to the learning with errors (LWE) problem. Duc et al. further applied BKW to the learning with rounding (LWR) problem. The number of blocks is a parameter of BKW, and choosing it appropriately minimizes the time complexity of the algorithm. However, Duc et al. did not derive the optimal number of blocks theoretically; they searched for it numerically. Duc et al. also showed that the number of samples required by BKW for solving LWE can be decreased dramatically using Lyubashevsky's idea, but it has not been shown whether this idea is also applicable to LWR. In this paper, we theoretically derive the asymptotically optimal number of blocks and then analyze the resulting minimum asymptotic time complexity of the algorithm. We also show that Lyubashevsky's idea can be applied to LWR-solving BKW, under a heuristic assumption that is regularly used in the analysis of LPN-solving BKW. Furthermore, we derive an equation relating the Gaussian parameter σ of LWE to the modulus p of LWR: when σ and p satisfy this equation, the asymptotic time complexities of BKW for solving LWE and LWR are the same.
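To fix ideas, the following is a minimal, illustrative sketch (not the paper's exact variant) of the block-wise reduction phase of BKW on LWE samples (a, c) with c = ⟨a, s⟩ + e mod q. The function name `bkw_reduce` and the parameter `a_blocks` (the number of blocks whose optimal choice is analyzed in this paper) are chosen here for illustration only.

```python
from collections import defaultdict

def bkw_reduce(samples, n, q, a_blocks):
    """Zero out all but the last block of coordinates by colliding samples.

    samples: list of (a_vec, c) with a_vec a length-n tuple over Z_q, c in Z_q.
    Each of the a_blocks - 1 rounds roughly halves the sample count and
    doubles the noise, which is why the choice of the block count trades
    sample/noise growth against the cost of handling each block.
    """
    b = -(-n // a_blocks)  # block width, ceil(n / a_blocks)
    for r in range(a_blocks - 1):
        lo, hi = r * b, min((r + 1) * b, n)
        buckets = defaultdict(list)
        for a_vec, c in samples:
            # Group samples that agree on the current block of coordinates.
            buckets[a_vec[lo:hi]].append((a_vec, c))
        reduced = []
        for group in buckets.values():
            # Subtract pairs within a bucket: the current block cancels.
            for (a1, c1), (a2, c2) in zip(group[::2], group[1::2]):
                diff = tuple((x - y) % q for x, y in zip(a1, a2))
                reduced.append((diff, (c1 - c2) % q))
        samples = reduced
    # Remaining samples are nonzero only on the last block, which can then
    # be recovered by exhaustive search or a distinguishing step.
    return samples
```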