Recent developments in digital technology facilitate the translation, amplification, and extension of individual and collective ideas, beliefs, and attitudes towards specific concepts and events across different social platforms. In this complex arena, social media manipulation campaigns and corresponding mechanisms such as disinformation and misinformation rely on techniques like deepfakes and fake news to, for example, alter existing information, spread manufactured information to targeted and diverse audiences, or produce polarization among communities and users. Academic and practitioner efforts to capture, control, and limit such manipulation techniques exist in the form of strategies and policies based on human intelligence, Artificial Intelligence, or a combination thereof. However, these mechanisms and the techniques behind them continue to grow in adaptivity and complexity and can reach and impact ever broader communities. In this context, and especially in light of the ongoing war in Ukraine, increased attention is devoted to both current and earlier events surrounding this conflict, e.g., the annexation of Crimea and the MH17 crash in 2014. Such events mark earlier battles of an ongoing conflict and can teach important lessons about the role and involvement of Russian and Ukrainian diaspora communities in the corresponding social manipulation discourses on social platforms like Twitter and Facebook. To address this, multidisciplinary research is conducted using the Design Science Research methodology, following a Data Science approach in which a series of Machine Learning models is built. Accordingly, this research aims to build social awareness and resilience among both users and social media policy decision-makers regarding the role, involvement, and implications of diaspora digital communities in conflicts.