The outcome of a single quantum experiment is unpredictable, except in a pure-state limit. The definite process that takes place in the apparatus may either be intrinsically random or be explainable from a deeper theory. While the first scenario is the standard lore, the latter implies that quantum mechanics is emergent. In that case, one likely has to reconsider radiation by accelerated charges as a physical effect, which must then be compensated by an energy input. Stochastic electrodynamics, for example, asserts that the vacuum energy arises from classical fluctuations with energy ½ħω per mode. In such theories the stability of the hydrogen ground state arises from energy input from the fluctuations and output by radiation, hence from an energy throughput. That flux of energy constitutes an arrow of time, which we call the "subquantum arrow of time". It is related to the stability of matter and is more fundamental than, e.g., the thermodynamic and cosmological arrows.²

2 The situation is that after we adopt this or that interpretation, we run into oddities. A logical conclusion, however, is not to blame the theory but the making of such assumptions.

3 Edward Witten's similar statement about string theory, "String theory is a part of 21st-century physics that fell by chance into the 20th century", is to be regarded as equally remarkable. This theory has not reproduced the standard model of particle physics, and even if it did, for what reason would the necessary set of parameters be singled out? In fact, string theory looks more like a framework, applicable to different situations, including Fermi liquid theory, than a fundamental theory of Nature.

4 That this is not obvious to everybody may be seen from several approaches to the interpretation of QM that try to do away with measurements, sometimes even banning the term. See [4] for a review.
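The energy throughput invoked for the hydrogen ground state can be made concrete with a standard back-of-the-envelope balance from the stochastic-electrodynamics literature; the following is an illustrative sketch under that theory's assumptions, not a derivation from the text (the symbols ρ₀, P_rad, P_abs are ours):

```latex
% Zero-point spectral energy density corresponding to an energy
% of \tfrac{1}{2}\hbar\omega per mode:
\rho_{0}(\omega)\,\mathrm{d}\omega
  = \frac{\hbar\,\omega^{3}}{2\pi^{2}c^{3}}\,\mathrm{d}\omega .

% Power radiated by the orbiting electron with acceleration a
% (Larmor formula, SI units):
P_{\mathrm{rad}} = \frac{e^{2}a^{2}}{6\pi\varepsilon_{0}c^{3}} .

% Stationarity of the ground state as an energy throughput:
% on average, the power absorbed from the zero-point field
% balances the power lost to radiation,
\langle P_{\mathrm{abs}}\rangle = \langle P_{\mathrm{rad}}\rangle ,

% a condition which, for a circular orbit, is reported in that
% literature to single out the Bohr radius
r = \frac{4\pi\varepsilon_{0}\hbar^{2}}{m e^{2}} = a_{0} .
```

The point of the sketch is only to exhibit the claimed mechanism: stability not as the absence of radiation but as a steady flux through the system, whose direction defines the subquantum arrow of time.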