Nowadays, smartphone cameras capture a burst of raw photographs whenever the shutter button is pressed. These frames are then fused to produce a single picture of higher quality. This paper details our implementation of the "Handheld Multi-Frame Super-Resolution" algorithm by Wronski et al. (used in the Google Pixel 3 camera), which simultaneously performs multi-image super-resolution, demosaicking, and denoising from a burst of images. Hand tremor during the exposure causes subpixel motions which, combined with the Bayer color filter array of the sensor, result in a collection of aliased and shifted raw photographs of the same scene. The algorithm efficiently aligns and fuses these frames into a single high-resolution image by leveraging the aliasing to reconstruct the high frequencies of the signal up to the Nyquist rate of the sensor. This approach yields digitally zoomed images up to a factor of 2, the limit naturally set by the pixel integration of the sensor. We present an in-depth description of the algorithm, along with numerous implementation details that we found necessary to reproduce the results of the original paper, whose code is not publicly available.