While the governance of algorithms is of growing societal concern, the energy sector has been slow to engage with this issue. We argue that there are at least three systemic concerns regarding the design and operation of algorithms in the new, digital energy era: namely, that reliance on algorithms can bias considerations towards the easily quantifiable, that they can inhibit explainability, transparency and trust, and that they could undermine energy users' autonomy and control. We examine these tensions through an interdisciplinary study that reveals the diversity and materiality of algorithms. Our study focuses on neighbourhood-scale batteries (NSBs) in Australia as a case study of new energy algorithms. We conducted qualitative research with energy sector professionals and citizens to understand the range of perceived benefits and risks of NSBs and the algorithms that drive their behaviour. Issues raised by stakeholders were integrated into our development of multiple NSB optimisation algorithms, whose impacts on NSB owners and customers we quantified through techno-economic modelling. Our results show that the allocation of benefits and risks varies considerably between different algorithm designs. This insight highlights a need to improve energy algorithm governance, enabling accountability and responsiveness across the design and use of algorithms so that the digitisation of energy technology does not lead to adverse public outcomes. Taken together, our study underscores the importance of researchers and developers of new algorithms taking a holistic view of stakeholders and public benefit, and demonstrates one method to practise responsible algorithm design.
Main

Algorithms are playing ever more pervasive and critical roles in the operation of the electricity system as it becomes digitised and decentralised. The design of these algorithms (considering the values and biases they encode) has to date received scant attention from energy researchers and developers. We believe this to be an important omission given recent controversies regarding the reach of algorithmic authority in ever more areas, including healthcare1,2, hiring practices3,4, law enforcement5,6 and new media's role in social and political life7, which have exposed deep ethical shortcomings that challenge the "arc of inevitability" of technical solutions8.

Existing discussions of algorithms in the energy field have tended to focus on questions of data privacy and the cybersecurity of smart meters and in-home devices9,10. While important, we argue that these overlook at least three systemic concerns. The first is that algorithms tend to narrow considerations to factors that offer plentiful, easily quantified data, such as financial values, excluding public values that may be harder to quantify. Such omissions narrow the conception of energy and have the potential to introduce structural biases11. Secondly, the challenging explainability of algorithms exacerbates accountability concerns and distrust of the energy system, and risks creating hidden outcomes11 such as new forms o...