Rail cross-section profile detection assesses the wear of measured rails and provides a crucial reference for railway maintenance. Conventional methods, which register only the rail head profile, struggle to achieve accurate alignment, and after rail edge adjustment on some conventional railways it becomes difficult to determine the base point for profile registration. Considering the wear of the non-working rail edge, a rail profile registration method is proposed. First, a wear prediction model based on a generalized regression neural network (GRNN) is constructed, with its smoothing parameter optimized through ten-fold cross-validation. The model predicts the wear values on the non-working rail edge, providing reliable coordinates of two wear points that serve as alignment reference points. Second, an initial alignment between the measured profile and the target profile is obtained with the iterative closest point (ICP) algorithm, which ensures that the two profiles lie in the same region and are similarly oriented. Third, weights are assigned to the wear measurement points according to their respective wear values, and the predicted positions of the wear points are used to calculate the translation and rotation parameters that align the measured profile with the target profile, completing the final registration. Finally, experimental profiles measured under rail adjustment conditions were registered to verify the accuracy of the proposed method. The results show that, under rail adjustment conditions, the mean squared error (MSE) between the calculated and actual lateral wear values is 0.094 mm², which is lower than the MSE of manual measurements, and the lateral wear calculated for the experimental profiles attains high alignment and calculation accuracy. The method can be applied in practical projects, offering an effective solution for rail head profile alignment and a reference for profile alignment when measurement data on the non-working rail edge are incomplete.
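To make two steps of the pipeline concrete, the sketch below illustrates, under stated assumptions, (a) choosing a GRNN smoothing parameter by ten-fold cross-validation and (b) computing a weighted rotation and translation from wear reference points. It is not the paper's implementation: the function names, data shapes, candidate parameter grid, and the use of scikit-learn's KFold are all assumptions for illustration.

```python
# Minimal sketch (assumed interfaces, not the authors' code):
# GRNN wear prediction with ten-fold cross-validated smoothing parameter,
# plus a weighted rigid (rotation + translation) fit for 2-D profile alignment.
import numpy as np
from sklearn.model_selection import KFold

def grnn_predict(X_train, y_train, X_query, sigma):
    """GRNN prediction: Gaussian-kernel-weighted average of training targets."""
    d2 = ((X_query[:, None, :] - X_train[None, :, :]) ** 2).sum(-1)
    w = np.exp(-d2 / (2.0 * sigma ** 2))
    return (w @ y_train) / np.clip(w.sum(axis=1), 1e-12, None)

def select_sigma(X, y, candidates, n_splits=10):
    """Pick the smoothing parameter that minimizes ten-fold CV mean squared error."""
    kf = KFold(n_splits=n_splits, shuffle=True, random_state=0)
    best_sigma, best_mse = None, np.inf
    for sigma in candidates:
        fold_errs = []
        for tr, te in kf.split(X):
            pred = grnn_predict(X[tr], y[tr], X[te], sigma)
            fold_errs.append(np.mean((pred - y[te]) ** 2))
        mse = float(np.mean(fold_errs))
        if mse < best_mse:
            best_sigma, best_mse = sigma, mse
    return best_sigma

def weighted_rigid_transform(src, dst, weights):
    """Rotation R and translation t minimizing sum_i w_i ||R src_i + t - dst_i||^2."""
    w = weights / weights.sum()
    mu_s, mu_d = w @ src, w @ dst
    H = (src - mu_s).T @ np.diag(w) @ (dst - mu_d)   # weighted cross-covariance
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:                         # guard against a reflection
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = mu_d - R @ mu_s
    return R, t
```

In this sketch, the predicted positions of the two non-working-edge wear points (measured profile vs. target profile) would be passed as `src` and `dst`, with `weights` derived from the wear values, and the returned `R`, `t` applied to the measured profile after the ICP-based initial alignment.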