As mixed reality (MR) gains attention, various studies on user satisfaction in MR have been conducted. The user interface (UI) is one of the key factors affecting interaction satisfaction in MR. On conventional platforms such as mobile devices and personal computers, adaptive UIs have been studied extensively, and such studies have recently been extended to MR environments. However, few studies have addressed providing an adaptive UI based on interaction satisfaction. Therefore, in this paper, we propose a method for providing an adaptive UI in MR based on interaction-satisfaction prediction. The proposed method predicts interaction satisfaction from interaction information (gaze, hand, head, and object data) and adapts the UI according to the predicted satisfaction. To develop the proposed method, a data-collection experiment was conducted, and a user-satisfaction-prediction model was built from the collected data. Next, to evaluate the proposed method, an application that provides an adaptive UI using the developed user-satisfaction-prediction model was implemented. Experimental results obtained with the implemented application confirmed that the proposed method improves user satisfaction compared with the conventional method.
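
The abstract describes a two-stage pipeline: predict interaction satisfaction from interaction features, then adapt the UI when predicted satisfaction is low. The following is a minimal sketch of that flow under stated assumptions; the names `InteractionFeatures`, `predict_satisfaction`, and `adapt_ui`, the features, the heuristic scoring, and the threshold are all hypothetical placeholders, not the paper's trained model or actual adaptation policy.

```python
from dataclasses import dataclass


@dataclass
class InteractionFeatures:
    # Hypothetical features standing in for the paper's gaze/hand/head/object data.
    gaze_dwell_time: float   # seconds gaze rests on the target element
    hand_travel_dist: float  # cumulative hand movement (meters)
    head_rotation: float     # cumulative head rotation (degrees)
    object_distance: float   # distance to the interacted object (meters)


def predict_satisfaction(f: InteractionFeatures) -> float:
    """Placeholder predictor returning a satisfaction score in [0, 1].

    Stands in for the user-satisfaction-prediction model trained on the
    experiment data; the weights here are illustrative only.
    """
    score = 1.0
    score -= min(f.hand_travel_dist / 5.0, 0.5)   # long reaches lower the score
    score -= min(f.head_rotation / 360.0, 0.3)    # excessive head turning lowers it
    score -= min(f.object_distance / 10.0, 0.2)   # distant targets lower it
    return max(score, 0.0)


def adapt_ui(satisfaction: float, threshold: float = 0.6) -> str:
    """Trigger a UI adaptation only when predicted satisfaction is low."""
    if satisfaction < threshold:
        return "reposition UI closer to the user's gaze and dominant hand"
    return "keep current UI layout"


if __name__ == "__main__":
    features = InteractionFeatures(gaze_dwell_time=1.2, hand_travel_dist=3.4,
                                   head_rotation=150.0, object_distance=2.0)
    s = predict_satisfaction(features)
    print(f"predicted satisfaction: {s:.2f} -> {adapt_ui(s)}")
```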