Most of us are familiar with the uncomfortable feeling that results from skepticism about others' capacity to see the world as we do. If we dig too deep into this solipsistic worry, we might feel alone in the universe: no one can feel what it is like to be me. One might say that it is a question of attitude. Regardless of whether the other actually can empathize with us, our attitude prevents us from believing it. In a correspondence article in Nature Human Behaviour, Perry (2023) recently made her attitude toward the prospect of empathic AI clear: it will never know how it feels to be human! This sentiment is part of a broader aversion toward the prospect of artificial empathy (AE) (e.g., Montemayor et al., 2022; Zaki, 2023). While we agree that these dystopic concerns should be taken seriously, we also believe that the debate would benefit from additional nuance. More precisely, we argue that the AI systems of today, which AE skeptics such as Perry take as their reference point, are not the appropriate metric for evaluating the potential of AE and should not be used to support dismissing AE as non-genuine empathy.

At the core of Perry's critique is the observation that AE is well-received until recipients realize it was generated by an AI. Perry provides two explanations for this "artificial-empathy paradox". Firstly, "AI can learn to say the right words – but knowing that AI generated them demolishes any potential for sensing that one's pain or joy is genuinely being shared". Secondly, human empathy is valued because it is demanding and finite, and since "AI entails no emotional or time cost", it fails to indicate that "the recipient holds any unique importance". However, we argue that neither explanation succeeds in discrediting the prospect of artificial empathy.

Empathy is a notoriously convoluted concept (see Cuff et al., 2016, for a review), and researchers often highlight cognitive, affective, and motivational components of empathy (Zaki, 2014; Perry, 2023). Cognitive empathy, sometimes called perspective-taking or mentalizing, is the intellectual ability to understand how the other perceives and experiences their situation (Decety and Cowell, 2014; Zaki and Ochsner, 2016; Marsh, 2018). Cognitive empathy is, to a degree, already achievable for AI, which can detect and identify human emotions (Montemayor et al., 2022; Perry, 2023). Affective empathy, or experience sharing, refers to how one vicariously feels and experiences the other's emotional states (Decety and Cowell, 2014; Zaki and Ochsner, 2016; Marsh, 2018). This kind of experience sharing may not be attainable for AI: lacking lived subjective experience (Montemayor et al., 2022; Perry, 2023), an AI that tries to share the experience of an emotional human may fail to resonate adequately, as it presumably does not feel anything (Turkle, 2007). The motivational component, also called empathic concern, can be understood as a motivation to support others' wellbeing or help them alleviate suffering (Decety and Cowell, 2014; Zaki and Ochsner, 2016; Marsh, 2018). However, while it ...