Neuromorphic, embodied Cell Assembly agents that learn are one application under development for the Human Brain Project (HBP). The HBP is building tools, available to all researchers, for constructing brain simulations. Existing simulated neural Cell Assembly agents are being translated to the platforms provided by the HBP; these agents run on neuromorphic chips in addition to von Neumann-based computers. Whilst translating the agents to the software technology demanded by the HBP platforms is relatively straightforward, porting them to the neuromorphic chips is a non-trivial software engineering task. Versions of a simple agent, CABot1, have been developed using fatiguing leaky integrate-and-fire neurons, Izhikevich neurons, and leaky integrate-and-fire neurons, and run in Java, PyNN, NEST, and on neuromorphic hardware. All variants are roughly equivalent: the agents view a picture, carry out simple commands, and respond to a context-sensitive directive involving the content of the picture. Running variants of these agents on different platforms, and with different simulated neural models, reveals implicit assumptions in those models. Once these Cell Assembly agents have been translated and embodied in a virtual environment, they will be extended to learn more effectively. Neural hardware supports the real-time simulation of many more neurons, providing a platform for exploring more complex simulated neural systems.
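To make the neuron models named above concrete, the following is a minimal sketch of a leaky integrate-and-fire neuron under Euler integration. It is not taken from the CABot1 agents; all parameter values (time constant, threshold, reset) are illustrative assumptions, and the fatiguing variant would add an adaptation variable that raises the effective threshold after each spike.

```python
def simulate_lif(input_current, dt=1.0, tau=20.0, v_rest=0.0,
                 v_reset=0.0, v_thresh=1.0):
    """Return spike times (time-step indices) for a leaky
    integrate-and-fire neuron driven by `input_current`.

    Parameters are illustrative, not those of any CABot1 variant.
    """
    v = v_rest
    spikes = []
    for t, i_in in enumerate(input_current):
        # Membrane potential leaks toward rest and is driven by input.
        v += dt * (-(v - v_rest) + i_in) / tau
        if v >= v_thresh:
            spikes.append(t)  # record the spike
            v = v_reset       # reset after spiking
    return spikes

# Constant supra-threshold input produces regular spiking.
spikes = simulate_lif([1.5] * 100)
print(spikes)
```

Simulator-independent descriptions such as PyNN exist precisely so that a model like this can be expressed once and executed on NEST or on neuromorphic hardware; the hand-rolled loop above is only meant to show the underlying dynamics.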