Gamma-ray spectrometry using collimated detectors is a well-established examination method for irradiated nuclear fuel. However, the feasibility of examining a particular nuclide of interest is subject to constraints: the peak must be statistically determinable with the desired precision, and the total spectrum count rate in the detector must not cause throughput issues.

A set of methods was assembled for gamma spectrum prediction, both to optimize instruments for gamma emission tomography and to enable a priori evaluation of the feasibility of determining single peaks of irradiated nuclear fuel. The aim was to obtain reliable predictions (~10% accuracy) of total-spectrum and peak count rates with a faster computation time than a full Monte Carlo approach. To this end, the method is based on depletion calculations with SERPENT2, a point-source kernel method for the collimator response, and a rig response matrix and a detector response matrix, both computed with MCNP6. The computational methodology takes as input the fuel properties (dimensions, materials, power history, and cooling time) and the instrumental setup (collimator and detector dimensions and materials).

The prediction method was validated using measured data from a high-burnup, short-cooled test fuel rodlet from the Halden reactor. Absolute count rates and ratios of characteristic peaks were compared between predicted and measured spectra, showing a total count rate overestimation of 7% and discrepancies of 2–20% for the single peaks (of the same order of magnitude as the uncertainty). This level of agreement is deemed sufficient for planning measurement campaigns and for optimizing spectroscopic instruments for use in gamma scanning and tomography of nuclear fuel.
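The prediction chain described above (depletion-code emission spectrum, point-kernel collimator transmission, rig and detector response matrices) can be sketched as a sequence of linear operations. The following is a minimal illustrative sketch, not the authors' implementation: all matrix values, bin counts, and variable names are hypothetical placeholders standing in for quantities that would come from SERPENT2 and MCNP6.

```python
import numpy as np

# Illustrative sketch of the response chain:
#   spectrum = D @ R @ (emission * collimator_kernel)
# where D and R are the detector and rig response matrices.
# All numbers below are made up for demonstration only.

n_e = 4  # number of energy bins (coarse, for illustration)

# Fuel emission spectrum (photons/s per energy bin), in a real case
# obtained from a depletion calculation
emission = np.array([1.0e9, 5.0e8, 2.0e8, 1.0e8])

# Point-kernel collimator transmission per energy bin
# (solid angle and attenuation folded into one factor)
collimator_kernel = np.array([1.0e-6, 2.0e-6, 3.0e-6, 4.0e-6])

# Rig response matrix: attenuation on the diagonal, downscatter from
# higher-energy bins (column j) into lower detected bins (row i < j)
R = 0.9 * np.eye(n_e) + 0.05 * np.eye(n_e, k=1)

# Detector response matrix: full-energy peak efficiency on the diagonal,
# Compton continuum in the strictly upper triangle (detected bin below
# the incident bin)
D = 0.6 * np.eye(n_e) + 0.1 * np.triu(np.ones((n_e, n_e)), k=1)

# Photon flux at the collimator exit, then predicted count-rate spectrum
phi = emission * collimator_kernel
spectrum = D @ (R @ phi)

total_count_rate = spectrum.sum()
print(spectrum)
print(f"total count rate: {total_count_rate:.3e} counts/s")
```

In the actual method, each factor here would be replaced by its computed counterpart: the emission spectrum by SERPENT2 depletion output, and the rig and detector matrices by MCNP6 calculations; a peak's feasibility can then be judged from its predicted count rate against the total.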