Neural networks (NNs) have achieved superhuman accuracy on multiple tasks, but the certainty of their predictions is often debatable, especially when confronted with out-of-distribution data. Averaging the predictions of an ensemble of NNs can recalibrate the certainty of the predictions, but an ensemble is computationally expensive to deploy in practice. Recently, a hardware-efficient multi-input multi-output (MIMO) NN was proposed to fit an ensemble of independent NNs within a single NN. In this work, we propose adding early exits to the MIMO architecture, with inferred depth-wise weightings, to produce multiple predictions for the same input, yielding a more diverse ensemble. We denote this combination MIMMO: a multi-input, massive multi-output NN, and we show that it achieves better accuracy and calibration than the MIMO NN, simultaneously fits more NNs, and is similarly hardware-efficient to the MIMO NN or an early-exit ensemble.
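To make the combination concrete, the following is a minimal NumPy sketch of the idea described above: a MIMO-style backbone that concatenates M inputs and predicts M outputs, with an early-exit head at every depth, and a depth-wise weighting that averages the exits into one prediction per ensemble member. All shapes, layer choices, and names (e.g. `mimmo_forward`, `depth_logits`) are illustrative assumptions, not the paper's exact implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

M = 3   # ensemble members (MIMO subnetworks)
D = 4   # backbone depth; each layer hosts an early-exit head (assumption)
F = 8   # feature width per input
H = 16  # hidden width
C = 5   # number of classes

def softmax(z, axis=-1):
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

# Backbone weights: the first layer consumes the M concatenated inputs (MIMO).
Ws = [rng.normal(size=(M * F, H))] + [rng.normal(size=(H, H)) for _ in range(D - 1)]
# One early-exit classifier head per depth, each predicting all M outputs.
heads = [rng.normal(size=(H, M * C)) for _ in range(D)]
# Depth-wise weighting over the D exits (learned in the paper; random logits here).
depth_logits = rng.normal(size=D)

def mimmo_forward(xs):
    """xs: list of M inputs, each of shape (F,). Returns (M, C) probabilities."""
    h = np.concatenate(xs)                 # MIMO: stack the M inputs together
    exit_probs = []
    for W, head in zip(Ws, heads):
        h = np.tanh(h @ W)                 # one backbone layer
        logits = (h @ head).reshape(M, C)  # early-exit prediction per member
        exit_probs.append(softmax(logits))
    alphas = softmax(depth_logits)         # depth-wise weights over the D exits
    # Weighted average of the early exits -> final per-member predictions.
    return np.tensordot(alphas, np.stack(exit_probs), axes=1)

probs = mimmo_forward([rng.normal(size=F) for _ in range(M)])
print(probs.shape)  # (M, C): one calibrated prediction per ensemble member
```

Averaging the per-member rows of `probs` would then give the usual ensemble prediction, while the D exits per member are what makes the ensemble "massive" relative to plain MIMO.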