The normal and reverse short-channel effects of LDD MOSFET's with lateral channel engineering (pocket or halo implant) have been investigated. An analytical model is developed which can predict V_th as a function of L_e, V_DS, V_BS, and the pocket parameters down to 0.1-µm channel length. The new model shows that the V_th roll-up component due to the pocket implant has an exponential dependence on channel length and is determined roughly by (N_p)^{1/4} L_p. The validity of the model is verified by both experimental data and two-dimensional (2-D) numerical simulation. On the basis of the model, a methodology to optimize the minimum channel length L_min is presented. The theoretical optimum of pocket-implant performance is an L_min approximately 55–60% that of a uniform-channel MOSFET without pocket implant, which is a significant (over one technology generation) improvement. The process design window of the pocket implant is analyzed. The design tradeoff between the improvement in short-channel immunity and the other aspects of device electrical performance is also discussed.

I. INTRODUCTION

THE REDUCTION of threshold voltage with decreasing channel length and increasing drain voltage is widely used as an indicator of the short-channel effect in evaluating CMOS technologies. This adverse roll-off effect is perhaps the most daunting roadblock in future MOSFET design. The minimum acceptable channel length of a device, L_min, is primarily determined by the V_th roll-off. The roll-off can be reduced or even reversed, i.e., V_th increases with decreasing channel length, by locally raising the channel doping next to the drain or the drain/source junctions. The so-called reverse short-channel effect (RSCE) was originally observed in MOSFET's due to oxidation-enhanced diffusion [1] or implant-damage-enhanced diffusion [2], both of which are very difficult to control. In the past few years, the locally high doping concentration in the channel near the source/drain junctions has been implemented via lateral channel engineering,