In this study, we propose the Uncertainty-aware Motion Planning Network (UaMPNet) to address the challenges of learning-based motion planning in out-of-distribution scenarios, such as novel environments, with a primary focus on enhancing motion planning performance. UaMPNet comprises a feature extraction network and an uncertainty-aware sampling network (UaSN). The feature extraction network is constructed as a variational autoencoder with a normalizing flow. It not only extracts features from complex 3D point cloud data but also models them as a multimodal distribution, enabling fine-grained clustering of environments with similar characteristics. Additionally, UaSN leverages evidential learning to provide both predictions and their associated uncertainties, allowing the sampling range to be adjusted according to the uncertainty of each prediction. This promotes guided sampling and exploration within limited regions in new environments. We integrate UaMPNet with the rapidly exploring random trees (RRT)-Connect algorithm, creating a learning-based motion planning algorithm capable of both exploration within limited ranges and exploitation toward the goal region in new environments. We evaluate the proposed algorithm in novel environments, including both simple 3D spaces and intricate office environments with a 7-DoF Franka Emika Panda robot, and demonstrate superior performance compared with state-of-the-art learning-based motion planning algorithms.
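To illustrate the uncertainty-aware sampling idea in isolation, the following minimal sketch widens a local sampling radius around a predicted waypoint as the prediction uncertainty grows; the function names, the linear radius rule, and the uncertainty range are illustrative assumptions rather than the paper's exact formulation.

```python
import numpy as np

def uncertainty_aware_sample(predicted_point, uncertainty,
                             bounds, r_min=0.05, r_max=0.5):
    """Sample a configuration near a predicted waypoint, widening the
    sampling radius as the prediction uncertainty grows.

    Hypothetical sketch: the linear mapping from uncertainty to radius
    is an assumption, not the paper's formulation.
    """
    # Map uncertainty in [0, 1] to a sampling radius in [r_min, r_max].
    radius = r_min + np.clip(uncertainty, 0.0, 1.0) * (r_max - r_min)
    # Draw a uniform sample in a box of that radius around the prediction.
    offset = np.random.uniform(-radius, radius, size=predicted_point.shape)
    sample = predicted_point + offset
    # Keep the sample inside the configuration-space bounds.
    return np.clip(sample, bounds[0], bounds[1])

# Example: a 7-DoF configuration prediction with moderate uncertainty.
q_pred = np.zeros(7)
joint_bounds = (-np.pi * np.ones(7), np.pi * np.ones(7))
q_sample = uncertainty_aware_sample(q_pred, uncertainty=0.3, bounds=joint_bounds)
```

In such a scheme, confident predictions keep the sampler close to the learned guidance (exploitation), while uncertain predictions enlarge the local region handed to the RRT-Connect sampler (bounded exploration).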