Emerging energy-efficient neural network processors are a promising hardware approach to accelerating neural network algorithms with high performance and low power consumption. Typically, static random-access memory (SRAM) is employed to build the large on-chip buffers used in such processors. An SRAM bit cell contains six transistors, leading to low density and large leakage current. In particular, several AI processors require multi-port and transpose SRAMs, which further decrease the density and increase the power consumption. Recently, emerging spin-orbit torque magnetic random-access memory (SOT-MRAM) has become a possible replacement for SRAM as working memory. However, SOT-MRAM must support additional operations to provide sufficient functionality, such as multi-port access, transpose access, and data-streaming operations. In this paper, we develop the working memory of a neural network processor with SOT-MRAM, building a design library that includes transpose operations, multi-port memory, and data-streaming buffer arrays. Equipped with these operations provided by SOT-MRAM, we can build high-performance and energy-efficient neural network processors.
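To make the role of transpose memory concrete, here is a minimal sketch (an illustrative assumption, not taken from the paper): a neural-network layer reads its weight matrix row by row in the forward pass, but backpropagation needs the transposed matrix, i.e. the same buffer read column by column. A memory with a hardware transpose port serves both patterns directly, whereas a plain single-port SRAM must perform strided reads or keep a second, transposed copy.

```python
def read_rows(buf, n_rows, n_cols):
    """Row-major read: the access pattern of the forward pass."""
    return [[buf[r * n_cols + c] for c in range(n_cols)] for r in range(n_rows)]

def read_cols(buf, n_rows, n_cols):
    """Column-major (transposed) read: the access pattern of backpropagation.
    Without hardware transpose support, each element access is strided."""
    return [[buf[r * n_cols + c] for r in range(n_rows)] for c in range(n_cols)]

# A 2x3 weight matrix stored flat in the working-memory buffer.
W = [1, 2, 3,
     4, 5, 6]

print(read_rows(W, 2, 3))  # [[1, 2, 3], [4, 5, 6]]
print(read_cols(W, 2, 3))  # [[1, 4], [2, 5], [3, 6]]
```

Both reads touch the same stored data; the transpose capability only changes the order in which the memory delivers it, which is why it can be provided by the memory array itself rather than by duplicating the buffer.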