The face is arguably the most widely used and most thoroughly studied biometric trait. Because this unimodal trait visually represents an individual's identity, preserving the security of face-based authentication systems is a prime concern. This work proposes a framework for generating cancellable templates from raw facial images. Our scheme is built upon the notion of locality-sensitive hashing (LSH), specifically its locality sampled code (LSC) realisation. Facial features are first extracted using the binarized statistical image features (BSIF) descriptor. These binary features are then hashed using the random bit sampling mechanism of LSC. Finally, the resulting local hashes are stored permanently in a non-invertible manner. We have empirically analysed the security requirements of unlinkability, non-invertibility, and revocability in our model. We have also validated our work on the benchmark AR, ORL, Yale, and CASIA-Facev5 databases under multiple scenarios. Across all the resulting cases, the best performance of our model was observed at minimum equal error rates (EERs) of 2.69%, 4.45%, 1.2%, and 2.66% for the four data sets, respectively.
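To make the core idea concrete, the sketch below illustrates LSH by random bit sampling applied to a precomputed binary feature vector, which is the mechanism LSC builds on. The function name, the parameters (number of hashes, bits per hash), and the use of a user-specific seed as a revocable key are illustrative assumptions, not the paper's exact construction; the binary input is a stand-in for a BSIF descriptor.

```python
import numpy as np

def lsc_bit_sampling_hash(binary_features, n_hashes=16, bits_per_hash=8, seed=0):
    """Illustrative LSH by bit sampling: each hash is formed by sampling a
    random subset of bit positions from the binary feature vector and packing
    the sampled bits into an integer. The seed acts as a revocable key:
    changing it yields a new, unlinkable template from the same face."""
    rng = np.random.default_rng(seed)
    d = binary_features.shape[0]
    hashes = []
    for _ in range(n_hashes):
        idx = rng.choice(d, size=bits_per_hash, replace=False)  # random bit positions
        bits = binary_features[idx]                              # sampled bits
        hashes.append(int("".join(map(str, bits)), 2))           # pack bits into one value
    return np.array(hashes)

# Toy usage with a random stand-in for a 256-bit BSIF descriptor.
bsif_bits = np.random.randint(0, 2, size=256)
template = lsc_bit_sampling_hash(bsif_bits, seed=42)   # enrolled template
probe = lsc_bit_sampling_hash(bsif_bits, seed=42)      # probe hashed with the same key
print((template == probe).mean())                       # 1.0 for identical inputs
```

Because each hash keeps only a small sample of the original bits, similar feature vectors collide on most hashes while the full descriptor cannot be recovered from the stored values, which is the intuition behind the non-invertibility and matching behaviour described above.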