In the manufacturing industry, automated quality inspection can improve both product quality and productivity. Deep learning-based computer vision, given its strong performance in many applications, is a promising solution for automated quality inspection. However, collecting large amounts of annotated training data for deep learning is expensive and time-consuming, especially for processes that involve a variety of products and human activities, such as assembly. To address this challenge, we propose a method for automated assembly quality inspection using synthetic data generated from computer-aided design (CAD) models. The method involves two steps: automatic data generation and model implementation. In the first step, we generate synthetic data in two formats: two-dimensional (2D) images and three-dimensional (3D) point clouds. In the second step, we apply state-of-the-art deep learning approaches to these data for quality inspection, including unsupervised domain adaptation, which adapts models across different data distributions, and transfer learning, which transfers knowledge between related tasks. We evaluate these approaches in a case study of pedal car front-wheel assembly quality inspection to identify the most suitable one for assembly quality inspection. Our results show that transfer learning on 2D synthetic images outperforms the other approaches: it achieves 95% accuracy after fine-tuning with only five annotated real images per class. These promising results suggest that our method can be applied to other, similar quality inspection use cases. By utilizing synthetic CAD data, the method reduces the need for manual data collection and annotation. Furthermore, it performs well on test data with varied backgrounds, making it suitable for diverse manufacturing environments.
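
The following is a minimal sketch of the transfer-learning strategy described above: an ImageNet-pretrained classifier is first trained on synthetic CAD renders and then fine-tuned with a handful of annotated real images per class. The backbone (ResNet-18), directory layout, class count, and hyperparameters are illustrative assumptions, not the paper's specific implementation.

```python
# Sketch of two-stage transfer learning for assembly quality inspection.
# Assumptions: ResNet-18 backbone, two classes (e.g., correct vs. faulty
# assembly), ImageFolder directories "data/synthetic" and "data/real_fewshot".
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

NUM_CLASSES = 2  # assumed: correct vs. faulty front-wheel assembly

preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

def fine_tune(model, data_dir, epochs, lr, device):
    """Train the full network on an ImageFolder dataset."""
    dataset = datasets.ImageFolder(data_dir, transform=preprocess)
    loader = DataLoader(dataset, batch_size=16, shuffle=True)
    criterion = nn.CrossEntropyLoss()
    optimizer = torch.optim.Adam(model.parameters(), lr=lr)
    model.train()
    for _ in range(epochs):
        for images, labels in loader:
            images, labels = images.to(device), labels.to(device)
            optimizer.zero_grad()
            loss = criterion(model(images), labels)
            loss.backward()
            optimizer.step()
    return model

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# Start from ImageNet weights and replace the classification head.
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, NUM_CLASSES)
model = model.to(device)

# Stage 1: train on synthetic 2D renders generated from CAD models.
model = fine_tune(model, "data/synthetic", epochs=10, lr=1e-4, device=device)

# Stage 2: fine-tune with only a few annotated real images per class
# (e.g., five, as in the case study), using a smaller learning rate.
model = fine_tune(model, "data/real_fewshot", epochs=20, lr=1e-5, device=device)
```

In practice, the second stage could also freeze most backbone layers and update only the classification head; either variant follows the same knowledge-transfer idea the abstract refers to.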