Understanding the role of dopant deactivation in determining the plasmon frequency and extinction is important for the rational design of plasmonic semiconductor
nanocrystals (PSNCs). Aliovalent dopants do not always contribute
a free carrier to a localized surface plasmon resonance (LSPR) for
many reasons, including the existence of a depletion region, the pinning
of carriers at neutral defect sites, or even the formation of a secondary
insulating microphase (inclusions) not observable by powder X-ray
diffraction (pXRD). Here, we investigate such possibilities and their
role in determining the LSPR frequency of Al-, Ga-, and In-doped ZnO
NCs. Elemental analysis, pXRD, and absorption measurements are used
to examine the impact of dopant incorporation on the resulting structural and optical properties.
Both simple and advanced effective-mass Drude models are used to fit
the mid-infrared plasmons, while chemical titrations with a one-electron oxidant
serve as an independent measure of the free electron concentrations.
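For context, the simple Drude description underlying such fits relates the fitted plasma frequency to the free electron concentration n and effective mass m*, and for a spherical NC in a medium of dielectric constant ε_m the dipolar LSPR appears near the frequency given below (a generic sketch, not necessarily the exact parametrization of the advanced effective-mass model used in this work):

\varepsilon(\omega) = \varepsilon_\infty - \frac{\omega_p^2}{\omega^2 + i\gamma\omega},
\qquad
\omega_p^2 = \frac{n e^2}{\varepsilon_0 m^{*}},
\qquad
\omega_{\mathrm{LSPR}} \approx \sqrt{\frac{\omega_p^2}{\varepsilon_\infty + 2\varepsilon_m} - \gamma^2}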
The results of these analyses indicate that dopant/host lattice mismatch
leads to inefficient carrier generation for aliovalent substitution,
potentially due to local spinel-like inclusions. Smaller dopant ions
are more likely to incorporate interstitially and form spinel phases,
which results in an increased number of pinned carriers. Improved
size matching from Al³⁺ to In³⁺ results in increased
substitution efficiency and, consequently, higher free carrier concentrations
and LSPR frequencies. Drude model correction factors are calculated
for each sample and compared to the literature value for n-ZnO determined via full band structure calculations. Each dopant
is shown to have a unique correction factor, further illustrating
the effect of differing ionic radii on the resulting LSPR.
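As an illustrative sketch (not the authors' procedure), the following shows how a fitted plasma frequency could be converted into a free electron concentration via the simple Drude relation and compared against a nominal dopant density to estimate dopant activation; the effective mass, plasma frequency, and dopant density values are assumptions chosen only for illustration.

# Back-of-envelope estimate linking a Drude-fit plasma frequency to a free
# carrier concentration and a dopant activation fraction. All numerical
# inputs below (effective mass, plasma frequency, nominal dopant density)
# are illustrative assumptions, not values taken from this work.
from scipy.constants import e, epsilon_0, m_e, c, pi

m_eff = 0.28 * m_e                    # assumed ZnO conduction-band effective mass
wp_cm1 = 9000.0                       # hypothetical fitted plasma frequency (cm^-1)
wp = 2.0 * pi * c * wp_cm1 * 100.0    # convert wavenumber (cm^-1) to rad/s

# Simple Drude relation: omega_p^2 = n e^2 / (epsilon_0 m*)
n_drude_m3 = epsilon_0 * m_eff * wp**2 / e**2
n_drude_cm3 = n_drude_m3 * 1e-6       # carriers per cm^3

n_dopant_cm3 = 4.0e20                 # assumed nominal dopant density (cm^-3)
activation = n_drude_cm3 / n_dopant_cm3
print(f"n(Drude) = {n_drude_cm3:.2e} cm^-3, activation ~ {activation:.0%}")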