Military Cyber Operations have become an integral part of modern warfare and national security strategies: they have moved beyond the realm of science fiction, constitute a real operational battlefield, and have developed into an established option in military toolboxes. Ongoing technological advancements that enable the creation and use of complex mechanisms and technologies, the increasing digitalization of critical infrastructure, the growing abundance of data collected, generated, and exchanged between multiple parties, and the rising number of stakeholders building and/or executing military Cyber Operations, together with the increasing number of such operations conducted around the globe, point to an important lesson: the core element of such operations is the human. Humans build, acquire, execute, and assess these operations, and are also affected by them, e.g., through psychological or physical injury or death, or through damage to or destruction of human infrastructure. Moreover, the key process governing all life cycle phases of military Cyber Operations is human decision-making, or human decision-making assisted or augmented by advanced intelligent methods built with AI. Nevertheless, building and conducting military Cyber Operations should be done in a legal, responsible, and effective way, which implies a deep understanding of the context and the adversary, proper target and cyber weapon selection, development, and use, and a clear overview of the potential effects produced. These are important aspects that should be properly defined and addressed in this domain. Hence, this research introduces the Human-Centred AI concept and approach in the military cyber domain to illustrate ways to prioritize human involvement and interaction, human understanding, effective decision-making, and ethical considerations when building and conducting military Cyber Operations. To this end, an extensive literature review is conducted across the military, cyber, and AI domains, together with an instantiation on military Cyber Operations.