Hyperscale artificial intelligence (AI) has recently been developing at an astonishing pace. These hyperscale AI models, built on deep neural networks (DNNs), require vast numbers of matrix-vector multiplication operations for training and inference and are known to consume substantial amounts of power. Currently, digital accelerators such as graphics processing units (GPUs) and neural processing units (NPUs) are used to perform these operations. However, developing DNN accelerators that are more energy-efficient and capable of faster processing is becoming increasingly essential. Analog in-memory computing (or neuromorphic hardware) technologies, which combine crossbar array architectures with memristor devices, are expected to be the most efficient hardware for DNN operations. This topical article introduces, from the perspective of electronic materials (memristors), the technical challenges in developing neuromorphic hardware and the alkali ion-based memristor devices designed to overcome these challenges.
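Since the efficiency argument for crossbar arrays rests on performing matrix-vector multiplication directly in the analog domain, the following minimal sketch may help illustrate the idea. It assumes an idealized crossbar (no device noise, wire resistance, or nonlinearity); the array sizes, conductance and voltage ranges, and variable names are illustrative assumptions, not values from this article. Weights are stored as device conductances, inputs are applied as row voltages, and Ohm's and Kirchhoff's laws yield each column current as the multiply-accumulate result in a single analog step.

```python
import numpy as np

# Idealized memristor crossbar performing analog matrix-vector multiplication.
# Weight W[i, j] is encoded as conductance G[i, j]; input activations are row
# voltages V[i]; the current collected on column j is I[j] = sum_i G[i, j]*V[i].
# All numerical values below are illustrative assumptions.

rng = np.random.default_rng(0)

n_rows, n_cols = 4, 3                           # crossbar size (rows x columns)
G = rng.uniform(1e-6, 1e-4, (n_rows, n_cols))   # device conductances (siemens)
V = rng.uniform(0.0, 0.2, n_rows)               # input voltages (volts)

# Ideal crossbar output: column currents (amperes), computed in one step
I = G.T @ V

# Element-wise digital reference computation for comparison
I_ref = np.array([sum(G[i, j] * V[i] for i in range(n_rows))
                  for j in range(n_cols)])

print("column currents (A):", I)
print("matches digital MVM:", np.allclose(I, I_ref))
```

In a physical crossbar the summation along each column is performed by the wiring itself rather than by sequential digital multiply-accumulate operations, which is the basis of the energy-efficiency advantage discussed in this article.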