An idea that has recently appeared in the neural network community is that networks with heterogeneous neurons and non-standard neural behaviors can provide computational advantages. A theoretical investigation of this idea for spiking neurons was given by Kampakis (2013), and for artificial neural networks it has recently been explored through Neural Diversity Machines (Maul, 2013). However, the idea has not yet been tested experimentally for spiking neural networks. This paper provides a first experimental investigation of whether neurons with non-standard behaviors can provide computational advantages. A spiking neural network with a biologically realistic neuron model is tested on a supervised learning task. In the first experiment, the network is optimized for the task by adjusting the parameters of its neurons, thereby adapting their behaviors. In the second experiment, the same parameter optimization is applied after the weights have been trained, in order to further improve the network's performance. The results confirm that neurons with non-standard behaviors can provide computational advantages for a network. Further implications of this study and suggestions for future research are discussed.
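To make the kind of neuron-parameter optimization referred to above more concrete, the following is a minimal sketch, not the paper's method: it assumes an Izhikevich-style neuron (the abstract only says "biologically realistic neuron model") and uses a simple random search to tune the neuron's parameters so that its firing behavior matches a toy supervised target. The input current, target spike count, and optimizer are all illustrative assumptions.

```python
# Minimal sketch (assumptions: Izhikevich neuron, toy spike-count target, random search).
import numpy as np

def izhikevich_spike_count(a, b, c, d, input_current=10.0, t_ms=200, dt=0.5):
    """Simulate a single Izhikevich neuron with Euler steps and return its spike count."""
    v, u = -65.0, b * -65.0
    spikes = 0
    for _ in range(int(t_ms / dt)):
        v += dt * (0.04 * v ** 2 + 5 * v + 140 - u + input_current)
        u += dt * (a * (b * v - u))
        if v >= 30.0:           # spike threshold of the model
            v, u = c, u + d     # after-spike reset
            spikes += 1
    return spikes

def loss(params, target_spikes=8):
    """Squared error between the neuron's spike count and a toy supervised target."""
    a, b, c, d = params
    return (izhikevich_spike_count(a, b, c, d) - target_spikes) ** 2

# Random search over the four neuron parameters, standing in for the optimization
# of neural behaviors described in the abstract.
rng = np.random.default_rng(0)
best_params, best_loss = None, np.inf
for _ in range(200):
    candidate = (rng.uniform(0.01, 0.1),   # a: recovery time scale
                 rng.uniform(0.1, 0.3),    # b: sensitivity of the recovery variable
                 rng.uniform(-70, -50),    # c: after-spike reset of v
                 rng.uniform(2, 8))        # d: after-spike increment of u
    l = loss(candidate)
    if l < best_loss:
        best_params, best_loss = candidate, l

print("best parameters:", best_params, "loss:", best_loss)
```

In this toy setting the search moves the neuron between firing regimes (e.g., regular spiking versus bursting) by changing its parameters rather than its weights, which is the spirit of adapting neural behaviors; the actual experiments in the paper optimize parameters of a full network on a supervised learning task.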