The variability in conversational responses within and between AI chatbots is likely important. Running a query multiple times is therefore likely to be useful with current systems. Our blinded assessor results highlight a critical need to examine and refine how AI chatbots address sensitive health-related queries, particularly in paediatric contexts. Therapeutic interactions should validate pain and adopt a nuanced and empathetic approach, which may include asking curious questions, showing reasoning in a step-by-step manner, employing a gentle tone, posing validating statements and questions, and checking in on feelings during the interaction. While the chatbots we assessed affirmed the reality of children's pain ("your pain is real"), this may not be sufficient. Such affirmations lack understanding of individual context and may inadvertently provoke frustration (e.g., "this is not relevant to me"), much as a generic leaflet or information handout might. Some chatbots refused to respond, perhaps indicating some sensitivity to the individual context of pain experiences, but no chatbot sought to clarify that context with questions. Some chatbots checked in with users via statements about usefulness and invited further questions, which could reflect basic aspects of therapeutic communication and support for patient autonomy.

Our commentary highlights the need for more extensive testing of AI chatbots, particularly in light of the risks of "hallucinations" and "falsehood mimicry" inherent in AI responses. Our data align with findings recommending that users be cautious when interpreting healthcare-related advice from interactions with current AI chatbots.5 Assessment tools designed specifically to assess the usefulness and consistency of AI chatbot responses should be developed and tested so that future analyses can be interpreted with high confidence. Research in languages other than English is also needed. AI chatbots hold tremendous potential to provide information and support, but their current capabilities in addressing complex and sensitive issues such as paediatric chronic pain need refinement.

Recommending one chatbot over another is challenging, given the frequent updates and new developments currently occurring in this industry. Our commentary highlights the potential for AI chatbots to engage with users, especially children, in a manner that is empathetic, validating, and supportive, though current interactions appear more instructional. The development and testing of custom chatbots ("GPTs") that consistently meet these important criteria should be explored.