Personalized 3D human avatars have attracted considerable interest because many people, particularly Generation Z, find it appealing to have digital twins in their own likeness to live, work, interact, and shop in the metaverse. Nevertheless, personalized avatars are rarely used in practice because of the computational cost and hardware constraints of the creation process. As a result, avatars of diverse topologies are used across different platforms and systems for various applications, which further hinders the adoption of personalized avatars. This paper reports a new method for personalizing human avatars that reconstructs personalized face models from single images and transfers the reconstructed 3D facial shape and appearance to avatars of varying topologies. The new method is compared with state-of-the-art face reconstruction and personalized avatar reconstruction methods, and the results show that it creates more realistic and true-to-life avatars. The method has been applied in an augmented reality (AR) mobile application that enables users to virtually try on fashion items. The code will be released once the paper is published.