<p><strong>Abstract.</strong> Many applications of geophysical data &#8211; whether from surface observations, satellite retrievals, or model simulations &#8211; rely on aggregates produced at coarser spatial (e.g. degrees) and/or temporal (e.g. daily, monthly) resolution than the highest available from the technique. Almost all of these aggregates report the arithmetic mean and standard deviation as summary statistics, and these are what data users employ in their analyses. Such statistics are most meaningful for Normally distributed data; however, for some quantities, such as aerosol optical depth (AOD), it is well known that large-scale distributions are closer to Lognormal, for which the geometric mean and geometric standard deviation would be more appropriate. This study presents a method, based on the Shapiro-Wilk test, to assess whether a given sample of data is more consistent with an underlying Normal or Lognormal distribution, and applies it to AOD frequency distributions at a spatial scale of 1&#176; and at daily, monthly, and seasonal temporal scales. A broadly consistent picture is observed using Aerosol Robotic Network (AERONET), Multi-angle Imaging SpectroRadiometer (MISR), Moderate Resolution Imaging Spectroradiometer (MODIS), and Goddard Earth Observing System Version 5 Nature Run (G5NR) data. These data sets are complementary: AERONET has the highest AOD accuracy but is spatially sparse; MISR and MODIS represent different satellite retrieval techniques and sampling; as a model simulation, G5NR is spatiotemporally complete. As time scales increase from days to months to seasons, the data become increasingly consistent with Lognormal rather than Normal distributions, and the difference between arithmetic and geometric mean AOD grows, with the geometric mean systematically smaller. Assuming Normality therefore systematically overstates both the typical level of AOD and its variability. There is considerable regional heterogeneity in the results: in low-AOD regions such as the open ocean and mountains, the difference is often small enough (<&#8201;0.01) to be unimportant for many applications, especially on daily time scales. However, in continental outflow regions and near source regions over land, and on monthly or seasonal time scales, the difference is frequently larger than the Global Climate Observing System (GCOS) goal uncertainty for a climate data record (the larger of 0.03 or 10&#8201;%). This is important because it shows that the sensitivity to the averaging method can, and often does, introduce systematic effects larger than the total GCOS goal uncertainty. Using three well-studied AERONET sites, the magnitude of estimated AOD trends is shown to be sensitive to the choice of arithmetic vs. geometric means, although the signs of the trends are consistent. The main recommendations from the study are that (1) the distribution of a geophysical quantity should be analysed in order to assess how best to aggregate it; (2) ideally, AOD aggregates such as satellite level 3 products (but also ground-based data and model simulations) should report the geometric mean or median rather than (or in addition to) the arithmetic mean AOD; and (3) as this is unlikely in the short term because of the computational burden involved, users can calculate geometric mean monthly aggregates from widely available daily mean data as a stopgap, since daily aggregates are less sensitive to the choice of aggregation scheme than monthly or seasonal aggregates are.
Further, distribution shapes can have implications for the validity of statistical metrics often used for comparison and evaluation of data sets. The methodology is not restricted to AOD and can be applied to other quantities.</p>
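<p>A minimal sketch of the distribution check and the stopgap aggregation described above, assuming Python with NumPy and SciPy; the daily mean AOD values, random seed, and decision rule below are illustrative assumptions rather than data or code from the study itself:</p>
<pre><code># Sketch: assess whether a sample of daily mean AOD is more consistent with a
# Normal or a Lognormal distribution, then compare arithmetic and geometric
# monthly means computed from the daily means. Synthetic example values only.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
daily_aod = rng.lognormal(mean=np.log(0.15), sigma=0.6, size=30)  # one month of daily mean AOD

# Shapiro-Wilk test for Normality applied to the raw and log-transformed data;
# testing log(AOD) for Normality is equivalent to testing AOD for Lognormality.
# As a simple decision rule (an assumption here, not necessarily the paper's
# exact procedure), prefer the transformation giving the larger W statistic.
w_norm, p_norm = stats.shapiro(daily_aod)
w_lognorm, p_lognorm = stats.shapiro(np.log(daily_aod))

# Arithmetic vs. geometric monthly mean from the daily means;
# gmean(x) is equivalent to exp(mean(log(x))).
arith_mean = daily_aod.mean()
geo_mean = stats.gmean(daily_aod)

print(f"Shapiro-Wilk W, raw AOD (Normal hypothesis):     {w_norm:.3f}  p={p_norm:.3g}")
print(f"Shapiro-Wilk W, log(AOD) (Lognormal hypothesis): {w_lognorm:.3f}  p={p_lognorm:.3g}")
print(f"Arithmetic monthly mean AOD: {arith_mean:.3f}")
print(f"Geometric monthly mean AOD:  {geo_mean:.3f}")
</code></pre>
<p>For a right-skewed sample such as this one, the geometric mean is smaller than the arithmetic mean and the log-transformed values yield the higher Shapiro-Wilk statistic, in line with the behaviour summarised in the abstract.</p>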