AgI-sensitized TiO2 nanotube arrays (AgI/TiO2-NTs) with an adjustable
β/γ phase ratio of AgI were prepared
by a simple dissolution–precipitation–calcination process.
The samples were characterized by various techniques, including X-ray
diffraction, X-ray photoelectron spectroscopy, scanning electron microscopy,
ultraviolet–visible diffuse reflectance spectroscopy, linear
sweep voltammetry, electrochemical impedance spectroscopy, and Mott–Schottky
plots. We found that calcination temperature (100–500 °C)
had a significant effect on the phase composition of AgI;
calcination at 350 °C yielded the highest β/γ ratio of AgI.
Moreover, a greatly enhanced photocurrent response and a reduced
charge-transfer resistance were observed for this sample, which together
facilitated the generation and separation of photogenerated electron–hole pairs.
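As a minimal sketch of how a charge-transfer resistance can be read from EIS data, the snippet below assumes a single ideal semicircle in the Nyquist plot, i.e. a simple Randles-type circuit (solution resistance R_s in series with R_ct in parallel with a double-layer capacitance C_dl); all component values are hypothetical and not taken from this work.

```python
import numpy as np

# Hypothetical Nyquist data for a single ideal semicircle: a simple
# Randles-type circuit with solution resistance R_s in series with the
# charge-transfer resistance R_ct, which is in parallel with a
# double-layer capacitance C_dl. Values are illustrative only.
R_s, R_ct, C_dl = 20.0, 500.0, 1e-5            # ohm, ohm, F
omega = 2 * np.pi * np.logspace(5, -1, 60)     # angular frequency, rad/s
Z = R_s + R_ct / (1 + 1j * omega * R_ct * C_dl)  # complex impedance

# For an ideal semicircle, R_ct equals its diameter: the difference
# between the low- and high-frequency intercepts on the real axis.
r_ct_est = Z.real.max() - Z.real.min()
print(f"estimated R_ct = {r_ct_est:.0f} ohm")  # ~500 ohm
```

A smaller semicircle diameter in the Nyquist plot therefore corresponds directly to a lower R_ct, which is the basis for the comparison made above.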
Thus, for the reduction of Cr(VI) under visible light, significantly
enhanced photoelectrocatalytic (PEC) performance was observed using
AgI/TiO2-NTs calcined at 350 °C (denoted AgI/TiO2-NTs350) as the photoanode and Ti foil as the cathode.
At a very low AgI content (1.25%), the estimated rate constant for
Cr(VI) reduction, k_Cr(VI) (0.0155 min⁻¹), was nearly 5 times
that of pure TiO2-NTs350.
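For illustration, the sketch below shows how such a rate constant is typically estimated, assuming pseudo-first-order kinetics, ln(C0/Ct) = k·t; the kinetic model is an assumption here (the abstract does not state it), and the concentration points are synthetic values generated from the reported k, not the authors' measured data.

```python
import numpy as np

# Synthetic C_t/C_0 points generated from the reported rate constant
# (0.0155 min^-1); these are not the authors' measured data.
t = np.array([0, 20, 40, 60, 80, 100, 120], dtype=float)  # min
c_ratio = np.exp(-0.0155 * t)                             # C_t / C_0

# Pseudo-first-order model: ln(C_0 / C_t) = k * t, so a linear fit of
# ln(C_0/C_t) against t returns k as the slope.
k, intercept = np.polyfit(t, np.log(1.0 / c_ratio), 1)
print(f"k_Cr(VI) = {k:.4f} min^-1")                       # ~0.0155
```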