The X-ray emission from a simulated massive stellar cluster is investigated. The emission is calculated from a 3D hydrodynamical model which incorporates the mechanical feedback from the stellar winds of three O stars embedded in a giant molecular cloud (GMC) clump containing 3240 M⊙ of molecular material within a 4 pc radius. A simple prescription for the evolution of the stars is used, with the first supernova explosion occurring at t = 4.4 Myr. We find that the presence of the GMC clump causes short-lived attenuation effects on the X-ray emission of the cluster. However, once most of the material has been ablated away by the winds, the remaining dense clumps have no noticeable effect on the attenuation compared with the assumed interstellar medium (ISM) column. We determine the evolution of the cluster X-ray luminosity, L_X, and spectra, and generate synthetic images. The intrinsic X-ray luminosity drops from nearly 10³⁴ ergs s⁻¹ while the winds are 'bottled up' to a near-constant value of 1.7×10³² ergs s⁻¹ between t = 1 and 4 Myr. L_X declines slightly during each star's red supergiant (RSG) stage, owing to the depressurization of the hot gas, but rises to ≈10³⁴ ergs s⁻¹ during each star's Wolf-Rayet (WR) stage. The X-ray luminosity is enhanced by 2-3 orders of magnitude, to ∼10³⁷ ergs s⁻¹, for at least 4600 yr after each supernova (SN) explosion, at which time the blast wave leaves the grid and the X-ray luminosity drops. The X-ray luminosity of our simulation is generally considerably fainter than predicted by spherically-symmetric bubble models, because hot gas leaks through gaps in the outer shell; this leakage reduces the pressure within our simulation and hence the X-ray emission. However, the X-ray luminosities and temperatures which we obtain are comparable to those of similarly powerful massive young clusters.
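To make the post-processing step concrete, the sketch below shows one minimal way to compute an intrinsic X-ray luminosity by summing n_e n_H Λ(T) dV over hydrodynamic cells and attenuating it by a foreground column. This is an illustration only: the cooling-function fit, the effective cross-section, and the cell values are hypothetical placeholders, not the emissivity tables, absorption model, or data used in the simulation described above.

```python
# Minimal sketch of the kind of post-processing described in the abstract:
# sum the intrinsic X-ray emissivity over grid cells, then attenuate by an
# intervening hydrogen column. All numerical values are illustrative.
import numpy as np

def cooling_function(T):
    """Toy X-ray cooling function Lambda(T) [erg cm^3 s^-1].

    A hypothetical power-law stand-in for a tabulated plasma-emissivity
    table; only gas hotter than ~10^6 K is taken to emit in the X-ray band.
    """
    T = np.asarray(T, dtype=float)
    return np.where(T > 1e6, 3e-23 * (T / 1e7) ** -0.7, 0.0)

def intrinsic_lx(n_e, n_H, T, cell_volume):
    """Intrinsic L_X = sum over cells of n_e * n_H * Lambda(T) * dV [erg/s]."""
    return np.sum(n_e * n_H * cooling_function(T) * cell_volume)

def attenuate(lx, N_H, sigma_eff=2e-22):
    """Attenuate L_X by a foreground column N_H [cm^-2].

    sigma_eff is an assumed effective photoelectric cross-section per
    H atom [cm^2] near 1 keV; a real calculation would be energy-dependent.
    """
    return lx * np.exp(-sigma_eff * N_H)

# Example: three hot shocked-wind cells plus one cool dense clump cell,
# which contributes nothing to the X-ray sum.
T   = np.array([5e6, 1e7, 3e7, 1e4])        # temperature [K]
n_e = np.array([0.1, 0.05, 0.01, 100.0])    # electron density [cm^-3]
n_H = 0.9 * n_e                             # rough ionized-gas H density
dV  = np.full(4, (0.1 * 3.086e18) ** 3)     # (0.1 pc)^3 cells, in cm^3

lx = intrinsic_lx(n_e, n_H, T, dV)
print(f"intrinsic  L_X ~ {lx:.2e} erg/s")
print(f"attenuated L_X ~ {attenuate(lx, N_H=1e21):.2e} erg/s")
```

In this picture the short-lived attenuation from the GMC clump corresponds to a temporarily large N_H along the line of sight; once the clump material is ablated away, N_H falls back to the assumed ISM column and the exponential factor approaches unity.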