Radiation damage limits the accuracy of macromolecular structures in X-ray crystallography. Cryogenic (cryo-) cooling reduces the rate of global radiation damage and has therefore become the method of choice over the past decades. The recent advent of serial crystallography, which spreads the absorbed energy over many crystals and thereby reduces damage, has made room temperature (RT) data collection more practical and extendable to microcrystals, both enabling and requiring the study of specific and global radiation damage at RT. Here, we performed sequential serial raster-scanning crystallography using a microfocused synchrotron beam, which allowed the collection of two series of 40 and 90 full datasets at 2- and 1.9-Å resolution at a dose rate of 40.3 MGy/s on hen egg white lysozyme (HEWL) crystals at RT and cryotemperature, respectively. The diffraction intensity dropped to half its initial value at average doses (D1/2) of 0.57 MGy at RT and 15.3 MGy at 100 K. Specific radiation damage at RT was observed at disulfide bonds but not at acidic residues, first increasing and then apparently reversing, a peculiar behavior that can be modeled by accounting for the differential decay of diffraction intensity caused by the nonuniform illumination by the X-ray beam. Specific damage to disulfide bonds is evident early on at RT and proceeds at a fivefold higher rate than global damage. The decay modeling suggests that a dose of 0.38 MGy per dataset should not be exceeded in static and time-resolved synchrotron crystallography experiments at RT. This rough yardstick may change for proteins other than HEWL and at resolutions other than 2 Å.
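As a rough illustration of the reported figures, the sketch below assumes a simple first-order (exponential) intensity decay with dose, characterized by the half-dose D1/2; the exponential form and the helper function are illustrative assumptions, not the authors' full decay model. Under this assumption, a 0.38-MGy dataset at RT (D1/2 = 0.57 MGy) ends with roughly 63% of the initial diffraction intensity, whereas at 100 K (D1/2 = 15.3 MGy) the same dose costs under 2%.

```python
import math

def remaining_intensity(dose_mgy: float, d_half_mgy: float) -> float:
    """Fraction of initial diffraction intensity surviving a given dose,
    assuming first-order exponential decay with half-dose D1/2 (assumption)."""
    return 2.0 ** (-dose_mgy / d_half_mgy)

D_HALF_RT = 0.57        # MGy, room temperature (reported D1/2)
D_HALF_CRYO = 15.3      # MGy, 100 K (reported D1/2)
DOSE_PER_DATASET = 0.38  # MGy, suggested per-dataset limit at RT

rt_left = remaining_intensity(DOSE_PER_DATASET, D_HALF_RT)
cryo_left = remaining_intensity(DOSE_PER_DATASET, D_HALF_CRYO)
print(f"RT:   {rt_left:.2f} of initial intensity remains")   # ~0.63
print(f"cryo: {cryo_left:.3f} of initial intensity remains")  # ~0.983
```

This back-of-the-envelope calculation is consistent with the suggested 0.38-MGy per-dataset yardstick: at RT the dose budget is spent within a single dataset, while at cryotemperature the same dose is nearly negligible.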