Spaceborne microwave radiometry provides an essential contribution to Earth monitoring, with a spatial resolution that depends on both the reflector size and the frequency of operation. ESA's Copernicus Imaging Microwave Radiometer (CIMR) mission aims at collecting geophysical observables at spatial resolutions ranging from 60 km in L band to 4 km in Ka band. This goal can be achieved by equipping CIMR with a large unfurlable mesh reflector antenna. A limitation of this antenna design is that the antenna pattern includes grating lobes, which contaminate the scene measurement with contributions originating far from the nominal footprint. This effect introduces inaccuracies in brightness temperature (T_B) measurements, particularly near radiometric discontinuities such as coastlines and sea ice edges, where the error can exceed the mission-required maximum of 0.5 K. The aim of this article is to assess a technique able to correct the antenna pattern effects and yield reliable T_B measurements. The analyzed technique is based on a regularized deconvolution of the antenna pattern to reconstruct the actual brightness temperatures. The technique was tested on a synthetic scenario that mimics both steep and smooth variations in the spatial and thermal domains.
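
The article's implementation details are not reproduced here; as a rough illustration of the idea, the sketch below applies a Tikhonov-regularized deconvolution in the Fourier domain to a synthetic "coastline" step scene blurred by a notional antenna pattern with grating-lobe replicas. Everything in it is an assumption for illustration: the function names, the Gaussian pattern shape, the lobe offsets and amplitudes, and the regularization weight are hypothetical, not the CIMR design values or the authors' exact algorithm.

```python
import numpy as np

def tikhonov_deconvolve(tb_measured, antenna_pattern, lam=1e-3):
    """Tikhonov-regularized deconvolution via the FFT (assumes periodic boundaries).

    Restores X from Y = H * X by X = conj(H) Y / (|H|^2 + lam), where lam
    damps frequencies at which the pattern response |H| is small.
    """
    H = np.fft.fft2(np.fft.ifftshift(antenna_pattern))  # pattern transfer function
    Y = np.fft.fft2(tb_measured)
    X = np.conj(H) * Y / (np.abs(H) ** 2 + lam)
    return np.real(np.fft.ifft2(X))

# Demo on a hypothetical coastline: 150 K ocean next to a 270 K land step.
n = 128
scene = np.full((n, n), 150.0)
scene[:, n // 2:] = 270.0

# Illustrative pattern: Gaussian main lobe plus two weak displaced grating lobes.
y, x = np.mgrid[-n // 2:n // 2, -n // 2:n // 2]
main = np.exp(-(x**2 + y**2) / (2 * 3.0**2))
grating = 0.05 * (np.exp(-((x - 30)**2 + y**2) / (2 * 3.0**2))
                  + np.exp(-((x + 30)**2 + y**2) / (2 * 3.0**2)))
pattern = main + grating
pattern /= pattern.sum()  # normalize so convolution preserves mean T_B

# Forward model: the measured T_B is the true scene convolved with the pattern.
measured = np.real(np.fft.ifft2(np.fft.fft2(scene) *
                                np.fft.fft2(np.fft.ifftshift(pattern))))
restored = tikhonov_deconvolve(measured, pattern, lam=1e-3)
print(f"max |T_B error| before: {np.abs(measured - scene).max():.2f} K, "
      f"after: {np.abs(restored - scene).max():.2f} K")
```

The regularization weight trades resolution against noise amplification: a larger value suppresses the high spatial frequencies where the pattern response is weak (keeping the inversion stable near steep T_B discontinuities), at the cost of a smoother, less sharp reconstruction.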