Entropy coding, an essential part of audio compression, must always balance the tradeoff between compression efficiency and computational complexity, and the best strategy for doing so depends strongly on the distribution of the input. In this paper, we present a method for managing this tradeoff that enhances the compression efficiency of Golomb-Rice (GR) coding, one of the simplest entropy coding methods, which is optimal for Laplacian distributions. We show that the proposed invertible, low-complexity mapping of integers enables GR coding to assign nearly optimal code lengths for a wider class of distributions, namely generalized Gaussian distributions, while maintaining low computational cost. A simulation with random numbers reveals that the proposed coder based on this scheme runs about six times faster than a state-of-the-art arithmetic coder for Gaussian-distributed integers while keeping the increase in relative redundancy to around 2.6%, which is much lower than that of a conventional GR coder. Additionally, an application to a practical speech and audio coding scheme is presented, and an objective evaluation on real speech and audio signals confirms the compression advantages of the proposed method. The method is expected to extend the capabilities of low-complexity entropy coding, enabling more flexible codec designs.
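As background for readers unfamiliar with GR coding, the sketch below illustrates plain Golomb-Rice encoding and decoding of a nonnegative integer with Rice parameter k: the quotient is written in unary and the remainder in k binary bits. The function names and the string-based bit buffer are illustrative conveniences only, and the paper's proposed integer mapping is not reproduced here.

```python
def gr_encode(n: int, k: int) -> str:
    """Golomb-Rice encode a nonnegative integer n with Rice parameter k."""
    q = n >> k                      # quotient, coded in unary
    r = n & ((1 << k) - 1)          # remainder, coded in k binary bits
    unary = "1" * q + "0"
    binary = format(r, "b").zfill(k) if k > 0 else ""
    return unary + binary


def gr_decode(bits: str, k: int) -> int:
    """Decode one Golomb-Rice codeword (parameter k) from a bit string."""
    q = bits.index("0")             # length of the unary part
    r = int(bits[q + 1:q + 1 + k], 2) if k > 0 else 0
    return (q << k) | r


# Example: with k = 2, the integer 11 has quotient 2 and remainder 3,
# so its codeword is "110" + "11" = "11011".
assert gr_encode(11, 2) == "11011"
assert gr_decode("11011", 2) == 11
```

For a fixed k, the code length grows linearly with the encoded value, which is why plain GR coding matches Laplacian (geometric) statistics well but loses efficiency on other input distributions.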