Driven by the promoted integration of renewable energy sources, further growth of highly transient, distributed generation is expected. As a result, the existing electrical grid may reach its physical limits. To counteract this, and to fully exploit the potential of renewables, grid-balancing measures are crucial. In this work, battery storage systems are embedded in a grid simulation to evaluate their potential for grid balancing. The overall setup is based on a real low-voltage distribution grid topology, real smart-meter household load profiles, and real photovoltaic generation data. An autonomous optimization routine, driven by a one-way communicated incentive, determines the prospective battery operation mode. Different battery placements and incentives are compared to evaluate their impact. The configurations comprise a baseline simulation without storage, a single central battery storage, and multiple distributed battery storages that together provide the same power and capacity. The incentives address market conditions, grid balancing, optimal photovoltaic utilization, load shifting, or self-consumption. Simulations show that grid-balancing incentives yield the lowest peak-to-average power ratios while causing negligible voltage changes compared to a reference case. Incentives reflecting market conditions for electricity generation, such as real-time pricing, degrade power quality, especially with respect to the peak-to-average power ratio. A central storage tied to the feed-in point better minimizes voltage drop and rise and exhibits lower distribution losses, whereas distributed storages attached at nodes with photovoltaic generation achieve lower peak-to-average power ratios.
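For clarity, the peak-to-average power ratio used as the headline metric above is assumed to follow its standard definition; a minimal sketch, where \(P(t)\) denotes the power profile at the point of evaluation and \(T\) the simulation horizon (both symbols are illustrative and not defined in the original):

\[
\mathrm{PAPR} \;=\; \frac{\max_{t \in [0,\,T]} \lvert P(t) \rvert}{\frac{1}{T} \int_0^T \lvert P(t) \rvert \,\mathrm{d}t}
\]

A ratio close to one indicates a flat power profile, which is the target of the grid-balancing incentives.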