In the present article, we detail the method used to experimentally determine the power of the CROCUS zero-power reactor, and to subsequently calibrate its ex-core monitor fission chambers. Accurate knowledge of the reactor power is mandatory for safe operation. Furthermore, most experimental research programs rely on absolute fission rates for design and interpretation – for instance, tally normalization in dosimetry reaction rate studies, or normalization of the power spectral density in neutron noise measurements. Minimizing the associated uncertainties therefore requires an accurate power determination method. The main experiment consists of the irradiation, and thereby the activation, of several axially distributed Au-197 foils along the central axis of the core, whose activities are measured with a High-Purity Germanium (HPGe) gamma spectrometer. The effective cross sections are determined with MCNP and Serpent Monte Carlo simulations. We quantify the reaction rate in each gold foil and derive the corresponding fission rate in the reactor. The variance-weighted average over the distributed foils then provides a calibration factor for the count rates measured in the fission chambers during the irradiation. We detail the calibration process and the minimization of the uncertainties arising from each sub-step, from power control after reactivity insertion to the calibration of the HPGe gamma spectrometer. Biases arising from different nuclear data choices are also discussed.
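To make the chain from measured activity to calibration factor concrete, a minimal sketch of the two governing relations is given below; the notation ($R_i$, $A_i$, $t_{\mathrm{irr}}$, $t_d$, $C_i$, $\sigma_i$) is illustrative rather than taken from the article, and the expressions assume constant power during the irradiation and negligible burn-up of the foils. The Au-198 activity $A_i(t_d)$ of foil $i$, measured a decay time $t_d$ after an irradiation of duration $t_{\mathrm{irr}}$, yields the capture rate
\[
R_i = \frac{A_i(t_d)\, e^{\lambda t_d}}{1 - e^{-\lambda t_{\mathrm{irr}}}},
\]
with $\lambda$ the Au-198 decay constant. Converting each $R_i$ to a core fission rate through the simulated effective cross sections gives per-foil calibration factors $C_i$ with standard uncertainties $\sigma_i$, which are combined as a variance-weighted mean,
\[
\bar{C} = \frac{\sum_i C_i/\sigma_i^{2}}{\sum_i 1/\sigma_i^{2}},
\qquad
\sigma_{\bar{C}} = \Bigl(\sum_i \sigma_i^{-2}\Bigr)^{-1/2},
\]
so that the foils with the smallest combined uncertainties dominate the final factor applied to the fission-chamber count rates.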