Assume that in the independent two-dimensional random vectors $(X_1,\theta_1),\dots,(X_N,\theta_N)$, each $\theta_i$ is distributed according to some unknown prior density $g$, and that, given $\theta_i$, $X_i$ has the conditional density function $q(x-\theta_i)$, $i=1,\dots,N$. In each pair the first component is observable, but the second is not. After the $(N+1)$-th pair $(X_{N+1},\theta_{N+1})$ is obtained, the objective is to construct the empirical Bayes estimator of a polynomial $b(\theta_{N+1})=\sum_j b_j\,\theta_{N+1}^{\,j}$ with given coefficients $b_j$. In the paper we derive the empirical Bayes estimator of $b(\theta)$ without any parametric assumptions on $g$. An upper bound for the mean squared error of this estimator is obtained. A lower bound for the mean squared error over the class of all possible empirical Bayes estimators is also derived. It is shown that the estimators constructed in the paper attain optimal or nearly optimal convergence rates. Examples for familiar families of conditional distributions are considered.
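For concreteness, the following display is a sketch of the target quantity under the standard squared-error loss (the loss and the notation $p$ for the marginal density are assumptions made here, not taken from the abstract). The Bayes estimator that an empirical Bayes procedure mimics is the posterior mean
$$
t_g(x)\;=\;\mathbb{E}\bigl[\,b(\theta_{N+1})\mid X_{N+1}=x\,\bigr]
\;=\;\frac{\int b(\theta)\,q(x-\theta)\,g(\theta)\,d\theta}{\int q(x-\theta)\,g(\theta)\,d\theta},
\qquad
p(x)\;=\;\int q(x-\theta)\,g(\theta)\,d\theta,
$$
which cannot be computed directly because $g$ is unknown. An empirical Bayes estimator replaces the unknown numerator and denominator by estimates constructed from the observable $X_1,\dots,X_N$ and evaluates the resulting ratio at $X_{N+1}$.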