In this paper, we consider the opportunities and constraints that arise when quantization is used as a guiding principle for data representation and compression. In particular, we propose a novel Symmetric Quantile Quantizer (SQQ) model and describe its parameterization in detail. We suggest a simple method for the offline precalculation of its parameters, and we examine the inevitable loss of information introduced by the SQQ, an important part of the bit-optimization task at the traditional network level that appears in many contemporary solutions. We anticipate that such precalculated values can be leveraged in a deterministic quantization process. We highlight that this observation relies heavily on the assumption that the values of interest follow the Laplacian distribution, which we consider throughout the paper. The basic difference between our SQQ and the previously established asymptotically optimal quantizer model, the scalar companding quantizer (SCQ), is that in the SCQ model both the decision thresholds and the representation levels are determined by the specified compressor function, whereas our SQQ model follows the SCQ procedure only for the straightforward calculation of the decision thresholds, while the representation levels are determined optimally for those decision thresholds and the assumed Laplacian distribution. As a result, our SQQ outperforms the SCQ in terms of signal-to-quantization-noise ratio (SQNR). As we argue in this paper, there are strong indications that appropriate quantizer parameterization brings us closer to optimizing the amount of transferred data in bits, which depends strongly on the achieved SQNR.
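The construction described above can be sketched numerically. The snippet below is a minimal illustration, not the paper's implementation: it assumes a unit-variance Laplacian source, the standard asymptotically optimal compressor for the Laplacian density (proportional to 1 - exp(-x/(3b))), and illustrative design choices of N = 32 levels and support limit x_max = 6.0. The SCQ places both thresholds and levels via the compressor, while the SQQ keeps the SCQ thresholds and replaces each level with the conditional mean (centroid) of the Laplacian density inside the cell, which is the distortion-optimal level for fixed thresholds.

```python
import math

def laplace_pdf(x, b):
    """Laplacian density with scale b (variance 2*b*b)."""
    return math.exp(-abs(x) / b) / (2 * b)

def scq_params(n_levels, x_max, b):
    """Symmetric companding quantizer: thresholds and levels from the
    compressor c(x) proportional to 1 - exp(-x/(3b)) on [0, x_max]."""
    half = n_levels // 2
    k = 1.0 - math.exp(-x_max / (3 * b))
    inv = lambda u: -3 * b * math.log(1.0 - u * k)  # inverse compressor, u in [0, 1]
    thresholds = [inv(i / half) for i in range(half + 1)]
    levels = [inv((i - 0.5) / half) for i in range(1, half + 1)]
    return thresholds, levels

def sqq_levels(thresholds, b):
    """SQQ: keep the SCQ thresholds, but set each representation level to
    the centroid of the Laplacian density in its cell (closed form)."""
    cells = list(zip(thresholds[:-1], thresholds[1:]))
    levels = []
    for a, c in cells[:-1]:
        ea, ec = math.exp(-a / b), math.exp(-c / b)
        levels.append(((a + b) * ea - (c + b) * ec) / (ea - ec))
    # last cell absorbs the overload region [a, inf); its centroid is a + b
    levels.append(cells[-1][0] + b)
    return levels

def sqnr_db(thresholds, levels, b, n_steps=50000, tail=40.0):
    """Numerical SQNR in dB for a symmetric quantizer (positive half doubled)."""
    var = 2 * b * b  # Laplacian variance
    hi = max(4 * thresholds[-1], tail)
    dx = hi / n_steps
    dist = 0.0
    for j in range(n_steps):
        x = (j + 0.5) * dx
        y = levels[-1]  # default: overload region maps to the last level
        for c, lv in zip(thresholds[1:], levels):
            if x < c:
                y = lv
                break
        dist += (x - y) ** 2 * laplace_pdf(x, b) * dx
    dist *= 2.0  # symmetry of source and quantizer
    return 10 * math.log10(var / dist)
```

Because the centroid minimizes the mean-squared error within each cell for fixed thresholds, the SQQ distortion is never larger than the SCQ distortion under the same thresholds, which is the source of the SQNR gain claimed above.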