The last decade has seen significant growth in HCI research on mental health technologies, while artificial intelligence raises both challenges and opportunities for better supporting symptom identification and the personalization of interventions. There has also been a growth of commercial AI-based mobile apps for mental health. Despite emerging HCI work reviewing mental health apps, those that are AI-based have received limited attention. To address this gap, we report a functionality review of 13 apps selected from 127 apps from the Apple Store, with a minimum rating of 4 out of 5 as a selection criterion. After eliminating duplicate, irrelevant, and low-rated apps, an expert evaluation and an auto-ethnographic approach were used to explore the apps' functionalities. Findings indicate that the apps support functions such as tracking and detecting emotions and moods, providing recommendations for therapy and well-being interventions, and supporting talking therapy through conversational agents powered by Natural Language Processing models. A critical finding is the apps' limited support for AI literacy and explainability, as well as limited consideration of ethical concerns regarding personal data, its reliability, and algorithmic biases. Our paper concludes with three design implications for AI-based mental health apps: developing conversational agents to support Cognitive Behavioural Therapy interventions based on tracked multimodal data, addressing the ethics of NLP biases, and supporting user exploration of AI-based models and their explainability.