Visual perceptual learning (VPL) is a long-term improvement in performance resulting from visual perceptual experience. Task-relevant VPL of a feature results from training on a task for which that feature is relevant. Task-irrelevant VPL arises from mere exposure to a feature that is irrelevant to the trained task. At least two serious problems remain unresolved. First, it is controversial which stage of information processing is changed in association with task-relevant VPL. Second, no model has yet explained both task-relevant and task-irrelevant VPL. Here we propose a dual plasticity model comprising feature-based plasticity, a change in the representation of the learned feature, and task-based plasticity, a change in the processing of the trained task. While both types of plasticity underlie task-relevant VPL, only feature-based plasticity underlies task-irrelevant VPL. This model provides a new comprehensive framework within which apparently contradictory results can be explained.