With the widespread application of semiconductor technology in integrated circuits, an increasing number of design studies on analog integrated circuits are being carried out. However, owing to the nature of analog integrated circuits, their design process is time-consuming and inefficient. Many researchers are therefore studying how to shorten the design cycle of analog ICs. Among these efforts, the use of machine learning stands out, as machine learning-based design methods have significantly reduced analog design cycle time. This review will first introduce the relevant machine learning algorithms; the second half will then survey existing applications of machine learning in analog integrated circuit design and compare them.