In many practical situations the outputs of a plant are not measured exactly, but are corrupted by quantization errors. Often the effect of the quantization error is neglected in the control design phase, which can lead to undesirable effects such as limit cycles and even chaotic behavior once the controller has been implemented. In this paper we present a method based on $\ell_1$ optimal control that minimizes the amplitude of the oscillations in the to-be-controlled variables. Analytical and numerical examples illustrate the elegance of the $\ell_1$-theory in this setting.
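The link between quantization and the $\ell_1$ framework can be sketched as follows; this is the standard $\ell_1$ argument, and the symbols $\Delta$, $e$, $\Phi$, and $z$ used here are illustrative and are not introduced in the abstract itself. If the quantizer rounds to the nearest level with step $\Delta$, the quantization error $e$ satisfies $\|e\|_\infty \le \Delta/2$ and can be treated as an unknown but bounded disturbance entering the loop. For a stable closed-loop map $\Phi$ from $e$ to the to-be-controlled variable $z$, with impulse response $\phi(k)$,
\[
  \|z\|_\infty \;\le\; \|\Phi\|_1 \, \|e\|_\infty \;=\; \frac{\Delta}{2} \sum_{k=0}^{\infty} |\phi(k)|,
\]
so minimizing the $\ell_1$ norm of $\Phi$ over all stabilizing controllers minimizes the worst-case amplitude of the quantization-induced oscillations.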