Random Fourier features (RFF) have been successfully applied to kernel approximation in large-scale settings. The rationale behind RFF builds on Bochner's theorem, but its positive-definiteness requirement is too restrictive and excludes many widely used kernels, e.g., dot-product kernels and indefinite kernels. In this paper, we present a unified RFF framework for indefinite kernel approximation in Reproducing Kernel Kreĭn Spaces (RKKS). Moreover, our model also applies to dot-product kernels on the unit sphere, since such a kernel can be transformed into a shift-invariant but indefinite kernel. By the Kolmogorov decomposition, an indefinite kernel in an RKKS can be decomposed into the difference of two unknown positive definite (PD) kernels. The spectral distribution of each underlying PD kernel can be formulated as a nonparametric Bayesian Gaussian mixture model. Based on this, we propose a double-infinite Gaussian mixture model for RFF by placing a Dirichlet process prior on each spectral distribution. This takes full advantage of the flexibility in the number of mixture components and is capable of approximating a broad class of indefinite kernels. For model inference, we develop a non-conjugate variational algorithm with a sub-sampling scheme for posterior inference; it handles the non-conjugacy in our model and is efficient due to the sub-sampling strategy. Experimental results on several large classification datasets demonstrate the effectiveness of our nonparametric Bayesian model for indefinite kernel approximation compared to other representative random-features-based methods.