Measurement of effusion rate is a primary objective for studies that model lava flow and magma system dynamics, as well as for monitoring efforts during ongoing eruptions. However, its exact definition remains a source of confusion, and problems occur when comparing volume flux values that are averaged over different time periods or spatial scales, or measured using different approaches. Thus our aims are to: (1) define effusion rate terminology; and (2) assess the various measurement methods and their results. We first distinguish between instantaneous effusion rate and time-averaged discharge rate. Eruption rate is next defined as the total volume of lava emplaced since the beginning of the eruption divided by the time since the eruption began. The ultimate extension of this is mean output rate, defined as the final volume of erupted lava divided by the total eruption duration. It also needs to be specified whether these values are total, i.e. the flux feeding all flow units across the entire flow field, or local, i.e. the flux feeding a single active unit within a flow field across which many units are active. No approach is without its problems, and all can have large errors (up to ∼50%). However, good agreement between diverse approaches shows that reliable estimates can be made if each approach is applied carefully and takes into account the caveats we detail here. There are three important factors to consider and state when measuring, giving or using an effusion rate: first, the time period over which the value was averaged; second, whether the measurement applies to the entire active flow field or to a single lava flow within that field; and third, the measurement technique and its accompanying assumptions.
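As an illustrative sketch of how these quantities relate, the definitions above can be written as follows (the symbols V for cumulative erupted volume, t_0 for eruption onset, t_end for eruption end, and the averaging interval [t_1, t_2] are assumed here for illustration, not notation from the text itself):

\begin{align*}
E_{\mathrm{inst}}(t) &= \frac{\mathrm{d}V}{\mathrm{d}t} && \text{(instantaneous effusion rate)}\\
\mathrm{TADR} &= \frac{V(t_2) - V(t_1)}{t_2 - t_1} && \text{(time-averaged discharge rate over } [t_1, t_2]\text{)}\\
E_{\mathrm{r}}(t) &= \frac{V(t) - V(t_0)}{t - t_0} && \text{(eruption rate since onset } t_0\text{)}\\
\mathrm{MOR} &= \frac{V(t_{\mathrm{end}}) - V(t_0)}{t_{\mathrm{end}} - t_0} && \text{(mean output rate over the full eruption)}
\end{align*}

In this scheme the instantaneous effusion rate is the limiting case of a time-averaged discharge rate as the averaging interval shrinks to zero, while the eruption rate and mean output rate are simply discharge rates averaged from eruption onset to the present time and to the end of the eruption, respectively.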