The application of a high-purity germanium (HPGe) γ spectrometer to determining the burnup of fuel elements in a future reactor is studied. The HPGe detector is exposed to a 60Co source at irradiation rates varying from 10×10³ s⁻¹ to 150×10³ s⁻¹ to simulate the input counting rate in a real reactor environment. A 137Cs source and a 152Eu source are positioned at given distances to generate defined event rates in the detector, with the former proposed as the labeling nuclide for measuring the burnup of a fuel element. It is shown that the energy resolution, which increases only slightly with the irradiation rate, and the passthrough rate at high irradiation levels both meet the requirements of the real application. The influence of the background is studied for the different parameter sets used in a specially developed background-subtraction procedure. It is demonstrated that, at the typical input irradiation rate and 137Cs intensity relevant to a deep-burnup situation, the precision of the 137Cs counting rate in the present experiment is consistently better than 2.8%, indicating the promising feasibility of utilizing an HPGe detector for burnup measurement in future pebble-bed-like reactors.
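
For illustration only, the sketch below shows a generic sideband background subtraction for a single γ peak and the propagation of its purely statistical uncertainty; it is not the specially developed procedure referred to above, and the function name, channel windows, and synthetic spectrum are hypothetical placeholders.

# Illustrative sketch (assumptions, not the paper's procedure): estimate the net
# 137Cs peak count rate by subtracting a background interpolated from side
# windows of the spectrum, then report the relative statistical precision.
import numpy as np

def net_peak_rate(spectrum, peak_win, left_win, right_win, live_time):
    """Return (net_rate, relative_uncertainty) for one peak.

    spectrum  : 1-D array of counts per channel
    peak_win  : (lo, hi) channel range covering the full peak
    left_win, right_win : background windows on either side of the peak
    live_time : acquisition live time in seconds
    """
    gross = spectrum[peak_win[0]:peak_win[1]].sum()
    n_peak = peak_win[1] - peak_win[0]

    # Background level per channel estimated from the two side windows.
    bkg_counts = (spectrum[left_win[0]:left_win[1]].sum()
                  + spectrum[right_win[0]:right_win[1]].sum())
    n_bkg = (left_win[1] - left_win[0]) + (right_win[1] - right_win[0])
    bkg_under_peak = bkg_counts * n_peak / n_bkg

    net = gross - bkg_under_peak
    # Poisson statistics: var(net) = gross + (n_peak/n_bkg)^2 * bkg_counts
    sigma_net = np.sqrt(gross + (n_peak / n_bkg) ** 2 * bkg_counts)
    return net / live_time, sigma_net / net

# Hypothetical usage with a synthetic spectrum (toy peak near channel 2000).
rng = np.random.default_rng(0)
spectrum = rng.poisson(lam=50.0, size=4096)             # flat background
spectrum[2000:2010] += rng.poisson(lam=500.0, size=10)  # toy 661.7 keV peak
rate, rel_err = net_peak_rate(spectrum, (1998, 2012), (1970, 1990),
                              (2020, 2040), live_time=300.0)
print(f"net rate = {rate:.2f} s^-1, relative precision = {rel_err:.1%}")

Under these assumptions the relative precision of the net counting rate scales roughly as the inverse square root of the accumulated net counts, which is why a quoted figure such as 2.8% is tied to a specific irradiation rate, 137Cs intensity, and measurement time.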