The Gripon-Berrou neural network (GBNN) is a recently proposed recurrent neural network that employs an LDPC-like sparse encoding scheme, which makes it extremely resilient to noise and errors. A natural use of GBNN is as an associative memory. There are two activation rules for the neuron dynamics, namely SUM-OF-SUM and SUM-OF-MAX, and the latter outperforms the former in retrieval rate by a wide margin. Prior discussions and experiments suggest that, although SUM-OF-SUM may cause the network to oscillate, SUM-OF-MAX always converges to an ensemble of neuron cliques corresponding to previously stored patterns. However, this is not entirely correct. In fact, SUM-OF-MAX often converges to bogus fixed points, in which the desired ensemble of cliques constitutes only a small subset of the converged state. By exploiting this overlooked fact, we can greatly improve the retrieval rate. We discuss this issue and propose a number of heuristics to push SUM-OF-MAX beyond these bogus fixed points. To address the problem directly and completely, we also develop a novel post-processing algorithm tailored to the structure of GBNN. Experimental results show that the new algorithm yields a substantial improvement in both retrieval rate and run time compared with the standard SUM-OF-MAX rule and all the other heuristics.
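
For readers unfamiliar with the two activation rules contrasted above, the following is a minimal sketch of one possible implementation, assuming the standard clustered GBNN formulation: clusters of fanals, a binary weight matrix, cluster-wise winner-take-all for SUM-OF-SUM, and per-cluster support counting for SUM-OF-MAX. All identifiers (n_clusters, l, gamma, etc.) are illustrative, and the stopping criteria and tie-breaking details of the actual algorithms are not reproduced here.

import numpy as np

def sum_of_sum_step(W, s, n_clusters, l, gamma=1.0):
    # One SUM-OF-SUM iteration (sketch): each fanal sums the signals of all
    # active fanals it is connected to, plus a self-reinforcement term gamma,
    # and a winner-take-all is then applied within every cluster.
    # W: (n_clusters*l, n_clusters*l) binary weights; s: binary activity vector.
    scores = W @ s + gamma * s
    new_s = np.zeros_like(s)
    for c in range(n_clusters):
        block = slice(c * l, (c + 1) * l)
        best = scores[block].max()
        if best > 0:
            new_s[block] = (scores[block] == best).astype(s.dtype)
    return new_s

def sum_of_max_step(W, s, n_clusters, l):
    # One SUM-OF-MAX iteration (sketch): each fanal counts the number of
    # *clusters* from which it receives at least one signal (the max over each
    # cluster's active fanals) and stays active only if every cluster
    # contributes. Treating a fanal's own activity as support from its own
    # cluster is an assumption of this sketch.
    new_s = np.zeros_like(s)
    for i in range(n_clusters * l):
        support = 0
        for c in range(n_clusters):
            block = slice(c * l, (c + 1) * l)
            contrib = np.max(W[i, block] * s[block])
            if c == i // l:
                contrib = max(contrib, s[i])
            support += contrib
        if support == n_clusters:
            new_s[i] = 1
    return new_s

The key difference illustrated here is that SUM-OF-SUM counts individual active neighbors, whereas SUM-OF-MAX counts supporting clusters; it is the latter, cluster-level criterion whose fixed points are examined in this work.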