The present study aimed to evaluate and compare loudness adaptation between normal-hearing and cochlear-implant subjects. Loudness adaptation for 367-s pure tones was measured in five normal-hearing subjects at three frequencies (125, 1,000, and 8,000 Hz) and three levels (30, 60, and 90 dB SPL). In addition, loudness adaptation for 367-s pulse trains was measured in five Clarion cochlear-implant subjects at three stimulation rates (100, 991, and 4,296 Hz), three levels (10, 50, and 90% of the electric dynamic range), three stimulation positions (apical, middle, and basal), and two stimulation modes (monopolar and bipolar). The method of successive magnitude estimation was used to quantify loudness adaptation. Consistent with previous results, we found that loudness adaptation in normal-hearing subjects increases with decreasing level and increasing frequency. However, we also found a small but significant loudness enhancement at 90 dB SPL in acoustic hearing. Despite large individual variability, we found that loudness adaptation in cochlear-implant subjects likewise increases with decreasing level, but is not significantly affected by the rate, place, or mode of stimulation. A phenomenological model is proposed to predict loudness adaptation as a function of stimulus frequency and level in acoustic hearing. The present results are not fully compatible with either the restricted-excitation hypothesis or the neural-adaptation hypothesis. Loudness adaptation may instead have a central component that depends on the peripheral excitation pattern.
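
For concreteness, one conventional way to quantify adaptation from successive magnitude estimates (a common metric in this literature; the abstract itself does not specify the exact formula used) is the percent change in the loudness estimate relative to the estimate at stimulus onset:
\[
  \mathrm{Ad}(t) = 100 \times \frac{\hat{L}(0) - \hat{L}(t)}{\hat{L}(0)},
\]
where $\hat{L}(0)$ is the subject's loudness magnitude estimate at stimulus onset and $\hat{L}(t)$ is the estimate after $t$ seconds of continuous stimulation. Under this convention, positive values indicate adaptation (loudness decay over time), while negative values, such as those observed at 90 dB SPL in acoustic hearing, indicate loudness enhancement.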