Snakes, or active contours, have been widely used in image processing applications. An external force for snakes called gradient vector flow (GVF) addresses the traditional snake problems of initialization sensitivity and poor convergence to concavities, while generalized GVF (GGVF) aims to improve GVF snake convergence to long and thin indentations (LTIs). In this paper, we show that GVF and GGVF snakes yield essentially the same performance in capturing LTIs of odd widths, and that in general neither can converge to LTIs of even widths. Through a thorough investigation of the GVF and GGVF fields within an LTI during their iterative processes, we identify the crux of the convergence problem and accordingly propose a novel external force, termed component-normalized GGVF (CN-GGVF), to eliminate it. CN-GGVF is obtained by normalizing each component of the initial GGVF vectors with respect to its own magnitude. Experimental results and comparisons against GGVF snakes show that the proposed CN-GGVF snakes can capture LTIs regardless of odd or even widths at a remarkably faster convergence speed, while preserving the other desirable properties of GGVF snakes and requiring lower computational complexity for vector normalization.
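
The component-wise normalization described above can be illustrated with a minimal sketch; the function names, NumPy usage, and the zero-handling convention below are assumptions for illustration, not the paper's implementation. Dividing each component by its own magnitude reduces to taking its sign, which avoids the per-pixel square root required by conventional vector normalization:

```python
import numpy as np

def component_normalize(u, v):
    """Component-wise normalization (CN) of a field (u, v):
    each component is divided by its own magnitude, which is
    equivalent to taking its sign (zero components remain zero)."""
    return np.sign(u), np.sign(v)

def vector_normalize(u, v, eps=1e-12):
    """Conventional vector normalization for comparison:
    both components are divided by the vector magnitude
    sqrt(u^2 + v^2), which requires a square root per pixel."""
    mag = np.sqrt(u**2 + v**2)
    mag = np.where(mag < eps, 1.0, mag)  # guard against division by zero
    return u / mag, v / mag
```

Under this reading, the lower computational complexity mentioned in the abstract plausibly comes from replacing the magnitude computation with simple sign extraction per component.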