Based on theoretical neuroscience, G. Cotardo and A. Ravagnani in [6] introduced a class of asymmetric binary codes called combinatorial neural codes (CN codes for short), equipped with a "matched metric" $\delta_r$, called the asymmetric discrepancy, in place of the Hamming distance $d_H$ used for usual error-correcting codes. They also presented the Hamming, Singleton and Plotkin bounds for CN codes with respect to $\delta_r$, and asked how to construct CN codes $C$ with large size $|C|$ and large $\delta_r(C)$. In this paper we first show that a binary code $C$ reaches one of the above bounds for $\delta_r(C)$ if and only if $C$ reaches the corresponding bound for $d_H$ and $r$ is sufficiently close to $1$. This means that all optimal CN codes come from the usual optimal codes. Secondly, we present several constructions of CN codes with good and flexible parameters $(n, K, \delta_r(C))$ by using bent functions.
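The bent functions underlying such constructions are the Boolean functions $f:\mathbb{F}_2^n\to\mathbb{F}_2$ ($n$ even) whose Walsh-Hadamard coefficients $W_f(a)=\sum_{x\in\mathbb{F}_2^n}(-1)^{f(x)+a\cdot x}$ all have absolute value $2^{n/2}$. As a small illustration of this standard criterion, and not of the specific CN-code constructions of this paper, the following Python sketch verifies bentness for the classical inner-product function $f(x)=x_1x_2+x_3x_4$ on $\mathbb{F}_2^4$:

```python
# Minimal sketch: verify that f(x) = x1*x2 + x3*x4 is a bent function
# on F_2^4 by checking that every Walsh-Hadamard coefficient
# W_f(a) = sum_{x} (-1)^(f(x) + a.x) has absolute value 2^(n/2) = 4.
# This is the textbook bent-function test, not the paper's construction.
from itertools import product

n = 4  # bent functions exist only for even n

def f(x):
    # classical inner-product (Maiorana-McFarland type) function
    return (x[0] & x[1]) ^ (x[2] & x[3])

def walsh(a):
    # Walsh-Hadamard coefficient W_f(a) = sum over x of (-1)^(f(x) + a.x)
    total = 0
    for x in product((0, 1), repeat=n):
        dot = sum(ai & xi for ai, xi in zip(a, x)) % 2  # a.x over F_2
        total += (-1) ** (f(x) ^ dot)
    return total

# f is bent iff |W_f(a)| = 2^(n/2) for every a in F_2^n
assert all(abs(walsh(a)) == 2 ** (n // 2) for a in product((0, 1), repeat=n))
print("f(x) = x1*x2 + x3*x4 is bent on F_2^%d" % n)
```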