The exponential growth of data has driven technological progress in computing systems that mine it to discover patterns and produce actionable insights. Neural network algorithms, inspired by the structure and function of biological synapses and neurons in the brain, have been mapped onto conventional silicon transistor-based hardware to perform highly parallel computation. However, synapses built from many transistors can store only binary states, and the elaborate silicon neuron circuits needed to process these digital states make low-power, low-latency computing difficult to achieve. This chapter examines the importance of developing memory and switching devices for the synaptic and neural elements of neuromorphic systems that can perform cognitive and recognition tasks efficiently. It reviews and assesses recent progress in neuromorphic computing, focusing on its implications for edge and Internet of Things technologies. The use of compact switches and short-term (volatile) memory to emulate the behavior of neurons is also considered. This groundwork should enable further studies across many disciplines to address the design, circuitry, and devices of neuromorphic systems.
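As a concrete illustration of how a volatile, short-term memory element can emulate neuronal behavior, the sketch below simulates a leaky integrate-and-fire neuron, a model commonly used in neuromorphic studies. The model choice, parameter values, and function names are illustrative assumptions rather than specifics drawn from this chapter.

```python
# Minimal sketch (assumption): a leaky integrate-and-fire (LIF) neuron,
# often used to illustrate how a volatile switch with short-term memory
# can mimic a biological neuron. Parameter values are illustrative only.

def simulate_lif(input_current, v_rest=0.0, v_threshold=1.0,
                 leak_rate=0.1, dt=1.0):
    """Return the membrane-potential trace and spike times.

    input_current : sequence of input values, one per time step
    leak_rate     : fraction of potential lost per step (short-term decay)
    """
    v = v_rest
    trace, spikes = [], []
    for t, i_in in enumerate(input_current):
        # Integrate the input, then let the stored state decay (volatility).
        v = v + i_in * dt
        v = v - leak_rate * (v - v_rest)
        if v >= v_threshold:          # threshold-switching event
            spikes.append(t)          # emit a spike
            v = v_rest                # reset, analogous to the switch relaxing
        trace.append(v)
    return trace, spikes


if __name__ == "__main__":
    # Constant weak drive: the neuron integrates, leaks, and fires sparsely.
    trace, spikes = simulate_lif([0.3] * 50)
    print("spike times:", spikes)
```

The key behavior, namely that the state decays unless reinforced and produces a spike only when a threshold is crossed, is what volatile switching devices are expected to provide directly in hardware, without the multi-transistor circuits discussed above.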