Multilayer perceptrons (MLPs) and radial basis function networks (RBFNs) have received wide attention in recent years. In this paper, building on the k-plane clustering (kPC) algorithm, we propose a novel artificial neural network model, termed the Plane-Gaussian network, to enlarge the arsenal of neural networks. This network adopts a so-called Plane-Gaussian activation function (PGF) in its hidden neurons. By replacing the center point of the traditional Gaussian radial basis function (RBF) with a central hyperplane, the PGF forms a band-shaped rather than sphere-shaped receptive field, which gives it a distinctive pair of geometric characteristics: locality and globality. Importantly, we also prove that a PGF network (PGFN) with one hidden layer is capable of universal approximation. As a universal approximator, the PGFN offers an informal way of bridging the gap between MLPs and RBFNs. Experiments comparing training times and classification accuracies on artificial and UCI datasets show that (1) the PGFN trains significantly faster than the MLP and (2) the PGFN achieves comparable or better classification performance than the MLP and RBFN, especially on subspace-distributed datasets.
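The core idea — measuring distance to a hyperplane instead of to a center point — can be sketched as follows. This is a minimal illustration, assuming the PGF takes the form of a Gaussian applied to the point-to-hyperplane distance; the exact parameterization and normalization in the paper may differ, and the names `plane_gaussian` and `gaussian_rbf` are ours:

```python
import numpy as np

def plane_gaussian(x, w, b, sigma):
    """Sketch of a Plane-Gaussian activation: a Gaussian of the distance
    from x to the hyperplane {z : w.z + b = 0}, rather than to a center
    point as in a standard RBF. (Assumed form, not the paper's exact one.)"""
    w = np.asarray(w, dtype=float)
    # Signed point-to-hyperplane distance.
    d = (np.asarray(x, dtype=float) @ w + b) / np.linalg.norm(w)
    return np.exp(-(d ** 2) / (2.0 * sigma ** 2))

def gaussian_rbf(x, c, sigma):
    """Standard Gaussian RBF for comparison: spherical receptive field
    around the center point c."""
    r = np.linalg.norm(np.asarray(x, dtype=float) - np.asarray(c, dtype=float), axis=-1)
    return np.exp(-(r ** 2) / (2.0 * sigma ** 2))

# The PGF's receptive field is a band around the plane w.x + b = 0,
# so activation is constant along directions parallel to the plane:
w, b, sigma = np.array([1.0, 0.0]), 0.0, 1.0
on_plane = np.array([[0.0, 0.0], [0.0, 5.0], [0.0, -50.0]])  # all on the plane x1 = 0
print(plane_gaussian(on_plane, w, b, sigma))   # all 1.0 (global along the plane)
print(gaussian_rbf(on_plane, np.zeros(2), sigma))  # decays with distance from the center
```

This contrast is what the abstract calls locality and globality: the PGF is local transverse to its hyperplane but global along it, whereas the ordinary RBF is local in every direction.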