Federated learning (FL) is emerging as one of the most promising approaches to collaboratively train a machine learning (ML) model on a common task without centralizing data. During each FL round, participants locally train a partial model on their on-premises data. Such partial models are subsequently aggregated to derive a global one. How these partial models are combined is a primary concern. Traditional approaches usually rely on a parameter server, which introduces several weaknesses, such as a single point of failure, a lack of trust among unknown participants, and an inability to handle the traffic generated by millions of devices. To overcome these concerns, blockchain has recently been proposed as a valuable means to improve the robustness of FL approaches. The benefits of blockchain make it possible to tackle the limits of centralized servers. However, energy consumption remains one of the significant factors inhibiting its widespread adoption, especially in light of the current discussions on climate change and sustainability. Recently, a growing number of research works have focused on integrating FL and blockchain; nevertheless, an adequate analysis and estimate of their energy and power consumption is often lacking. This paper presents an estimate of the power consumption of FlowChain, an architecture that integrates FL with blockchain to simplify the use of FL. Experimental results demonstrate that the overall power consumption depends significantly on the adopted ML model.
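For reference, the aggregation step mentioned above is commonly realized with a FedAvg-style weighted average of the partial models; the formula below is a generic sketch under that assumption and is not claimed to be the exact rule used by FlowChain. Here $w_k^{t+1}$ denotes the partial model trained by participant $k$ during round $t$, $n_k$ the number of local samples held by participant $k$, $K$ the number of participants, and $n$ the total number of samples (all symbols introduced here for illustration).

% Generic FedAvg-style aggregation (illustrative; not necessarily FlowChain's exact rule):
% the new global model is the data-size-weighted average of the K partial models.
\begin{equation}
    w^{t+1} \;=\; \sum_{k=1}^{K} \frac{n_k}{n}\, w_k^{t+1},
    \qquad n = \sum_{k=1}^{K} n_k .
\end{equation}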