Recent work has exploited polynomial approximation to solve distributed nonconvex optimization problems. This idea enables arbitrarily precise global optimization without requiring local gradient evaluations at every iteration. Nonetheless, a gap remains between existing theoretical guarantees and diverse practical requirements for dependability, including privacy preservation and robustness to network imperfections (e.g., time-varying directed communication, asynchrony, and packet drops). To fill this gap while retaining the above strengths, we propose a Dependable Chebyshev-Proxy-based distributed Optimization Algorithm (D-CPOA). Specifically, to ensure both the accuracy of solutions and the privacy of local objective functions, a new privacy-preserving mechanism is designed. This mechanism leverages the randomness in block-wise insertions of perturbed data and separate subtractions of the added noises, and its effects are thoroughly analyzed through (α, β)-data-privacy. In addition, to gain robustness to various network imperfections, we adopt the push-sum consensus protocol as a backbone, discuss its specific enhancements, and evaluate the performance of the proposed algorithm accordingly. Thanks to the linear consensus-based structure of the iterations, we avoid both the privacy-accuracy trade-off and the burden of selecting appropriate step-sizes in different settings. We provide rigorous treatments of accuracy, dependability, and complexity, and show that the advantages brought by polynomial approximation are fully preserved even when all of the above challenging requirements are present. Simulations demonstrate the efficacy of the developed algorithm.
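To illustrate the polynomial-proxy idea underlying this line of work (not D-CPOA itself, which additionally handles privacy and network imperfections), the following minimal Python sketch assumes two agents with hypothetical local objectives f1 and f2 on [-1, 1]. Each agent fits a Chebyshev interpolant of its own objective; because the coefficients of the average objective are the average of the local coefficients, a linear consensus step on coefficient vectors suffices, after which the global proxy can be minimized without any further gradient evaluations:

```python
import numpy as np
from numpy.polynomial import chebyshev as C

# Hypothetical local objectives of two agents (illustrative assumptions)
f1 = lambda x: np.sin(3 * x) + x**2
f2 = lambda x: np.cos(2 * x) - x

deg = 32  # interpolation degree; raise it for tighter accuracy

# Each agent builds a Chebyshev proxy of its local objective on [-1, 1]
c1 = C.Chebyshev.interpolate(f1, deg).coef
c2 = C.Chebyshev.interpolate(f2, deg).coef

# Averaging coefficients = coefficients of the average objective,
# so consensus on the proxy is a purely linear operation
c_avg = (c1 + c2) / 2
p = C.Chebyshev(c_avg)

# Globally minimize the proxy: critical points of p plus the endpoints
crit = p.deriv().roots()
crit = crit[np.isreal(crit)].real
crit = crit[(crit >= -1.0) & (crit <= 1.0)]
cands = np.concatenate([crit, [-1.0, 1.0]])
x_star = cands[np.argmin(p(cands))]
```

Note the contrast with gradient-based schemes: the only distributed computation is the linear averaging of coefficient vectors, which is what allows both the privacy mechanism and the step-size-free iterations described above.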
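The push-sum backbone can likewise be sketched in a toy form. The version below assumes a fixed directed ring with column-stochastic weights (the actual algorithm handles time-varying graphs, asynchrony, and packet drops); it only shows the core ratio-consensus mechanism by which each node recovers the network-wide average over directed links:

```python
import numpy as np

n = 4
x = np.array([1.0, 2.0, 3.0, 10.0])  # local values to be averaged
w = np.ones(n)                       # push-sum weights, initialized to 1

# Column-stochastic weight matrix for a directed ring (illustrative assumption):
# each node splits its mass between itself and its successor
A = np.zeros((n, n))
for j in range(n):
    for i in (j, (j + 1) % n):
        A[i, j] = 0.5

for _ in range(200):
    x = A @ x  # push values along directed edges
    w = A @ w  # push weights along the same edges

ratio = x / w  # each entry converges to the average of the initial values
```

Column-stochasticity preserves the sums of x and w, which is why the ratio converges to the true average even though A is not row-stochastic; this is the property that makes push-sum robust over directed, unbalanced communication.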