The Liquid State Machine (LSM) is a brain-inspired architecture used for tasks such as speech recognition and time-series prediction. An LSM comprises a randomly connected recurrent network of spiking neurons whose non-linear neuronal and synaptic dynamics transform input streams into a high-dimensional state. Maass et al. have argued that these non-linear dynamics are essential for the LSM's capability as a universal computer. The Lyapunov exponent (µ), used to characterize the non-linearity of the network, correlates well with LSM performance. We propose a complementary approach that approximates the LSM dynamics with a linear state-space representation. The spike rates produced by this model correlate well with the spike rates of the LSM. This equivalence allows a memory metric (τM) to be extracted from the state transition matrix. τM is highly correlated with performance; further, systems with high τM require fewer epochs to achieve a given accuracy. Being computationally cheap (1800× more time-efficient than simulating the LSM), the τM metric enables exploration of the vast parameter design space. We observe that the performance correlation of τM surpasses that of the Lyapunov exponent (µ) by 2–4× in the high-performance regime across multiple datasets. In fact, while µ increases monotonically with network activity, performance reaches a maximum at a specific activity level described in the literature as the edge of chaos. τM, in contrast, remains correlated with LSM performance even as µ increases monotonically. Hence, τM captures the useful memory in the network activity that enables LSM performance, and it allows rapid design-space exploration and fine-tuning of LSM parameters for high performance.
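The sketch below illustrates the general idea of the approach described above: fit a linear state-space model to binned spike-rate trajectories by least squares, then read a memory time constant off the state transition matrix. It is a minimal illustration under stated assumptions, not the paper's implementation; in particular, the exact definition of τM is assumed here to be the decay time constant of the slowest eigenmode of the fitted transition matrix, and all variable names and the synthetic data are hypothetical.

```python
import numpy as np

def fit_linear_state_space(X, U):
    """Least-squares fit of x[t+1] = A x[t] + B u[t] to spike-rate
    states X (T x N) and inputs U (T x M).
    Illustrative sketch; the paper's fitting procedure may differ."""
    Z = np.hstack([X[:-1], U[:-1]])            # regressors: current state and input
    Y = X[1:]                                  # targets: next state
    W, *_ = np.linalg.lstsq(Z, Y, rcond=None)  # solve Z @ W ~= Y
    N = X.shape[1]
    A = W[:N].T                                # state transition matrix
    B = W[N:].T                                # input matrix
    return A, B

def memory_metric(A, dt=1.0):
    """One plausible memory metric from the state transition matrix:
    decay time constant of the slowest eigenmode of A.
    Assumed definition of tau_M, not necessarily the paper's."""
    lam = np.max(np.abs(np.linalg.eigvals(A)))
    lam = min(lam, 1.0 - 1e-12)                # guard against (near-)unstable fits
    return -dt / np.log(lam)

# Usage with synthetic data standing in for binned LSM spike rates and inputs.
rng = np.random.default_rng(0)
T, N, M = 500, 20, 3
U = rng.normal(size=(T, M))
A_true = 0.9 * np.eye(N) + 0.01 * rng.normal(size=(N, N))
B_true = rng.normal(size=(N, M))
X = np.zeros((T, N))
for t in range(T - 1):
    X[t + 1] = A_true @ X[t] + B_true @ U[t]

A_hat, B_hat = fit_linear_state_space(X, U)
print("estimated tau_M:", memory_metric(A_hat, dt=1.0))
```

Because the fit and the eigenvalue computation involve only small dense matrices, evaluating such a metric is far cheaper than simulating the spiking network itself, which is what makes large design-space sweeps practical.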