This paper investigates probability density functions (PDFs) that are continuous everywhere, nearly uniform around the mode of the distribution, and adaptable to a variety of distribution shapes ranging from bell-shaped to rectangular. From the viewpoint of computational tractability, the PDF based on the Fermi-Dirac, or logistic, function is advantageous for estimating its shape parameters. The most appropriate PDF for an n-variate distribution is of the form

f(x) ∝ [1 + exp{(x − m)^T Σ^{−1} (x − m) − r}]^{−1},

where x, m ∈ R^n, Σ is an n × n positive-definite matrix, and r > 0 is a shape parameter. Such flat-topped PDFs can be used as components of mixture models in machine learning to improve goodness of fit while keeping the model as simple as possible.
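As a rough illustration of the behavior described above, the sketch below evaluates a one-dimensional Fermi-Dirac-type density of the stated form and normalizes it numerically. The function name, the parameter values, and the grid-based normalization are all illustrative assumptions, not the paper's procedure; they only demonstrate how larger r flattens the top toward a rectangular shape.

```python
import numpy as np

def flat_topped_pdf(x, m=0.0, sigma=1.0, r=8.0):
    """Unnormalized Fermi-Dirac-type density in one dimension (illustrative).

    q plays the role of the Mahalanobis-type quadratic form; larger r
    widens the nearly uniform region around the mode m.
    """
    q = ((x - m) / sigma) ** 2
    return 1.0 / (1.0 + np.exp(q - r))

# Normalize numerically on a wide grid, since the normalizing constant
# has no elementary closed form for general r.
grid = np.linspace(-10.0, 10.0, 20001)
dens = flat_topped_pdf(grid)
dens /= np.trapz(dens, grid)

# For large r the density is nearly uniform around the mode: the value
# at x = 0.5 is very close to the value at the mode x = 0, while the
# tail (e.g. x = 4) is negligible in comparison.
mode_val = dens[grid.size // 2]
near_val = dens[np.searchsorted(grid, 0.5)]
tail_val = dens[np.searchsorted(grid, 4.0)]
print(abs(mode_val - near_val) / mode_val, tail_val / mode_val)
```

Varying r interpolates between the two regimes mentioned in the text: small r gives a bell shape close to a logistic curve, while large r approaches a rectangular (uniform) shape with sharp shoulders.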