Facial expression conveys nonverbal information that helps humans perceive physical or psychophysical states. Accurate 3D imaging provides stable topographic cues for reading facial expression. In particular, light‐field cameras (LFCs) have high potential for constructing depth maps, thanks to a simple configuration of microlens arrays and an objective lens. Herein, machine‐learned near‐infrared LFCs (NIR‐LFCs) are reported that read facial expressions by extracting pairwise Euclidean distances between 3D facial landmarks. The NIR‐LFC contains microlens arrays with an asymmetric Fabry−Perot filter and an NIR bandpass filter on a CMOS image sensor, fully packaged with two vertical‐cavity surface‐emitting lasers. The NIR‐LFC not only increases image contrast by 2.1 times compared with conventional LFCs, but also reduces reconstruction errors by up to 54%, regardless of ambient illumination conditions. A multilayer perceptron (MLP) classifies input vectors consisting of 78 pairwise distances on the facial depth maps of happiness, anger, sadness, and disgust, achieving an average accuracy of 0.85 (p < 0.05). This LFC provides a new platform for quantitatively labeling facial expression and emotion in point‐of‐care biomedical, social perception, or human−machine interaction applications.
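The feature extraction described above can be sketched as follows: the upper triangle of the pairwise Euclidean distance matrix of 3D landmarks is flattened into a feature vector for the MLP. Note the landmark count of 13 is only an inference here, since C(13, 2) = 78 matches the stated 78 pairwise distances; the abstract does not name the actual landmark set.

```python
import numpy as np

def pairwise_distance_features(landmarks):
    """Flatten the upper triangle of the pairwise Euclidean
    distance matrix of 3D landmarks into a feature vector."""
    lm = np.asarray(landmarks, dtype=float)   # shape (n, 3)
    diffs = lm[:, None, :] - lm[None, :, :]   # (n, n, 3) pairwise offsets
    dists = np.linalg.norm(diffs, axis=-1)    # (n, n) distance matrix
    iu = np.triu_indices(len(lm), k=1)        # upper triangle, no diagonal
    return dists[iu]

# 13 landmarks (an assumption) yield C(13, 2) = 78 pairwise
# distances, matching the 78-dimensional MLP input above.
features = pairwise_distance_features(np.random.rand(13, 3))
assert features.shape == (78,)
```

Pairwise distances are invariant to rigid head translation and rotation, which is one reason they make convenient expression features.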
Although pain plays an important role in clinical practice, conventional assessment methods such as patients' self-reporting limit the possibility of easy, continuous pain monitoring. Here we report a pain assessment method using a 3D face-reading camera assisted by dot-pattern illumination. The face-reading camera module (FRCM) consists of a stereo camera and a dot projector, which together allow the quantitative measurement of facial expression changes without subjective human judgment. The rotational offset microlens arrays (roMLAs) in the dot projector form a uniform, dense dot pattern on a human face. The dot projection facilitates evaluating three-dimensional changes in facial expression by improving the 3D reconstruction of non-textured facial surfaces. In addition, the FRCM provides consistent pain ratings from 3D data, regardless of head movement. This pain assessment method can provide a new guideline for precise, real-time, and continuous pain monitoring.
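The stereo depth recovery that the dot projector supports follows classic triangulation: depth Z = f·B/d, where f is the focal length in pixels, B the camera baseline, and d the matched disparity. A minimal sketch, with illustrative numbers that are not from the paper:

```python
def stereo_depth(f_px, baseline_m, disparity_px):
    """Classic stereo triangulation: Z = f * B / d.
    The projected dot pattern adds texture so that disparity
    can be matched reliably on otherwise featureless skin."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return f_px * baseline_m / disparity_px

# Hypothetical values: a 1400 px focal length, 50 mm baseline,
# and 140 px disparity give a depth of 0.5 m.
z = stereo_depth(1400.0, 0.05, 140.0)
```

Without the dot pattern, disparity matching fails on smooth, texture-poor regions of the face, which is exactly the failure mode the roMLA projector addresses.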
Quantitative in vivo measurement helps physicians accurately determine abnormal tissue size or resection margins. Herein, in vivo 3D imaging of abnormal features during endoscopic operation using a light-switching microprojector is reported. The microprojector features rotational offset microlens arrays and a customized illumination fiber bundle, fully integrated through a single illumination channel of a clinical endoscope. The illumination channel switches from white light to structured laser patterns on demand. The 3D profiles are precisely extracted by calculating the distortion of uniform structured patterns on a target surface. The 3D endoscope allows the precise measurement of the size and volume of polyp phantoms within 7.70% and 13.33% errors, respectively. The experimental results show accurate measurements of abnormal ex vivo human tissue and of in vivo volume changes in the inflated stomach wall of an anesthetized pig. The microprojector can provide a new opportunity for in vivo 3D endoscopic imaging and biometric applications.
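Once a 3D profile has been reconstructed from the pattern distortion, a polyp-like volume can be estimated by integrating the height map over the pixel footprint. This is a generic sketch of that final step, not the paper's calibration or reconstruction pipeline:

```python
import numpy as np

def surface_volume(height_map_mm, pixel_area_mm2):
    """Estimate the volume enclosed between a reconstructed
    height map and its base plane by summing column volumes."""
    h = np.clip(np.asarray(height_map_mm, dtype=float), 0.0, None)
    return float(h.sum() * pixel_area_mm2)

# Sanity check on a synthetic hemisphere of radius 10 mm, whose
# exact volume is (2/3) * pi * r^3 ~ 2094.4 mm^3.
r = 10.0
xs = np.linspace(-r, r, 400)
xx, yy = np.meshgrid(xs, xs)
hm = np.sqrt(np.clip(r**2 - xx**2 - yy**2, 0.0, None))
vol = surface_volume(hm, (xs[1] - xs[0]) ** 2)
```

The sum converges to the true volume as the grid is refined; in practice, the accuracy of such an estimate is bounded by the reconstruction error of the structured-light depth map itself.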