HgCdTe heteroepitaxy on low-cost, large-lattice-mismatched substrates such as Si continues to be plagued by large threading dislocation densities that ultimately reduce the operability of thermal imaging detector arrays. Molecular-beam epitaxy (MBE) of 10 µm- to 15 µm-thick CdTe buffer layers has played a crucial role in reducing dislocation densities to current state-of-the-art levels. Herein, we examine the possibility that growth on locally back-thinned substrates could prove advantageous in further reducing dislocation densities in the CdTe/Si heteroepitaxial system. Using defect decoration techniques, a decrease in dislocation (etch-pit) density of up to ≈42% has been measured in CdTe regions where the underlying Si substrate was chemically back-thinned to ≈20 µm. A theoretical explanation is proposed, in which a substrate-thickness-dependent dislocation image force is identified as a likely cause of the experimentally observed reduction in threading dislocation density. These observations raise the prospect of combining localized substrate thinning with other techniques to further reduce dislocation densities to the levels sought for HgCdTe/CdTe/Si and other large-lattice-mismatched systems.
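As an illustrative sketch only (not the specific model developed in this work), the image-force argument can be motivated by the standard isotropic-elasticity result for a dislocation lying parallel to a traction-free surface. For a screw dislocation a distance $d$ from the surface, the attractive image force per unit length is

$$ F_{\text{image}} = \frac{\mu b^{2}}{4\pi d}, $$

where $\mu$ is the shear modulus and $b$ is the magnitude of the Burgers vector; for an edge dislocation the denominator carries an additional factor $(1-\nu)$, giving $F_{\text{image}} = \mu b^{2}/[4\pi(1-\nu)d]$. In this simplified picture, locally thinning the substrate reduces the distance between threading dislocations and the nearest free surface, increasing the force that drives them toward that surface and plausibly enhancing their removal.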