We propose bootstrap learning as a computational account of why human learning is modular and incremental, and identify key components of bootstrap learning that allow artificial systems to learn more like people. Originating in developmental psychology, bootstrap learning refers to people's ability to extend and repurpose existing knowledge to create new and more powerful ideas. We view bootstrap learning as a solution to how cognitively bounded reasoners grasp complex environmental dynamics far beyond their initial capacity: by searching ‘locally’ and recursively to extend their existing knowledge. Drawing on techniques from Bayesian library learning and resource-rational analysis, we propose a computational modeling framework that achieves human-like bootstrap learning performance in inductive conceptual inference. In addition, we present modeling and behavioral evidence that highlights bootstrap learning as a double-edged sword: people who process the same information in different batch orders can reach drastically different causal conclusions and generalizations, as a result of the different sub-concepts they construct in earlier stages of learning.