Background Ocular changes are traditionally associated with only a few hepatobiliary diseases. These changes are non-specific and have a low detection rate, limiting their potential use as clinically independent diagnostic features. Therefore, we aimed to engineer deep learning models to establish associations between ocular features and major hepatobiliary diseases and to advance automated screening and identification of hepatobiliary diseases from ocular images.

Methods We did a multicentre, prospective study to develop models using slit-lamp or retinal fundus images from participants in three hepatobiliary departments and two medical examination centres. Included participants were older than 18 years and had complete clinical information; participants diagnosed with acute hepatobiliary diseases were excluded. We trained seven slit-lamp models and seven fundus models (with or without hepatobiliary disease [screening model] or one specific disease type within six categories [identifying model]) using a development dataset, and we tested the models with an external test dataset. Additionally, we did a visual explanation and occlusion test. Model performances were evaluated using the area under the receiver operating characteristic curve (AUROC), sensitivity, specificity, and F1 score.
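As a note on the evaluation metrics named above, AUROC, sensitivity, specificity, and F1 score can all be computed directly from labels and model scores. A minimal pure-Python sketch on toy data (the function names and numbers are illustrative only, not the study's code or results):

```python
def auroc(y_true, y_score):
    """Probability that a random positive scores above a random negative."""
    pos = [s for t, s in zip(y_true, y_score) if t == 1]
    neg = [s for t, s in zip(y_true, y_score) if t == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

def binary_metrics(y_true, y_pred):
    """Sensitivity (recall), specificity, and F1 for binary predictions."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    f1 = 2 * tp / (2 * tp + fp + fn)
    return sensitivity, specificity, f1

# toy labels (1 = disease present) and model scores
y_true = [0, 0, 1, 1, 1, 0, 1, 0]
y_score = [0.1, 0.65, 0.8, 0.7, 0.9, 0.3, 0.6, 0.2]
y_pred = [1 if s >= 0.5 else 0 for s in y_score]

print(auroc(y_true, y_score))          # 0.9375 (one discordant pos/neg pair)
print(binary_metrics(y_true, y_pred))  # (1.0, 0.75, 0.888...)
```

With a 0.5 threshold, all four positives are caught (sensitivity 1.0) but one negative is flagged (specificity 0.75), illustrating the screening/identification trade-off the models are evaluated on.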
Low-light image enhancement (LLE) remains challenging owing to the prevailing low-contrast and weak-visibility problems of single RGB images. In this paper, we address an intriguing learning-related question: can leveraging both readily accessible unpaired over/underexposed images and high-level semantic guidance improve the performance of cutting-edge LLE models? We propose an effective semantically contrastive learning paradigm for LLE (namely SCL-LLE). Going beyond existing LLE wisdom, it casts the image enhancement task as multi-task joint learning, in which LLE is converted into three constraints of contrastive learning, semantic brightness consistency, and feature preservation that simultaneously ensure exposure, texture, and color consistency. SCL-LLE allows the LLE model to learn from unpaired positives (normal-light) and negatives (over/underexposed), and enables it to interact with scene semantics to regularize the image enhancement network; this interaction between high-level semantic knowledge and the low-level signal prior has seldom been investigated in previous methods. Trained on readily available open data, our method surpasses state-of-the-art LLE models on six independent cross-scene datasets in extensive experiments. Moreover, we discuss SCL-LLE's potential to benefit downstream semantic segmentation under extremely dark conditions. Source Code: https://github.com/LingLIx/SCL-LLE.
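The abstract does not spell out the loss formulation, but the contrastive constraint it describes (pull the enhanced image's features toward an unpaired normal-light positive while pushing them away from over/underexposed negatives) is commonly written as a feature-distance ratio. A minimal sketch under that assumption, with hypothetical names and a plain L1 distance standing in for the paper's actual feature extractor:

```python
import numpy as np

def l1_dist(a, b):
    """Mean absolute distance between two feature vectors/maps."""
    return float(np.mean(np.abs(a - b)))

def contrastive_loss(feat_enhanced, feat_positive, feat_negatives, eps=1e-8):
    """Ratio-form contrastive loss: small when the enhanced features are
    close to the normal-light positive and far from the exposure negatives."""
    d_pos = l1_dist(feat_enhanced, feat_positive)
    d_neg = sum(l1_dist(feat_enhanced, n) for n in feat_negatives)
    return d_pos / (d_neg + eps)

# toy "features": the positive is near the enhanced output, negatives are not
rng = np.random.default_rng(0)
f_enh = rng.normal(size=(64,))
f_pos = f_enh + 0.01 * rng.normal(size=(64,))
f_negs = [rng.normal(size=(64,)) for _ in range(2)]
print(contrastive_loss(f_enh, f_pos, f_negs))  # small: output resembles normal light
```

Because positives and negatives need not depict the same scene as the input, this constraint works with unpaired data; the semantic brightness consistency and feature preservation terms would be added to it in the joint objective.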
This paper studied the effect of n-Na2B4O7 additives at different contents on the tribological performance of an ion-nitrocarburized layer. A 7% n-Na2B4O7 additive was found to greatly improve the friction reduction and wear resistance of the ion-nitrocarburized layer under various conditions. This is because a synergistic friction-reduction and wear-resistance effect arises between the n-Na2B4O7 additive and the ion-nitrocarburized layer at higher temperature, frequency, and load: the chemical reaction films formed on the friction surface, including oxides, nitrides, BN, and sulphides, act as solid lubricants, while the n-Na2B4O7 particles on the friction surface act as "micron nanobearings" that translate sliding friction into rolling friction, giving the ion-nitrocarburized layer its excellent tribological performance.