The interplay between Hsf4 and Hsf1 plays an important role in the regulation of lens homeostasis; however, the mechanism of their intermolecular association remains unclear. Here we find that reconstitution of Hsf4b into Hsf4-/- lens epithelial cells (mLEC/Hsf4-/-) simultaneously downregulates Hsp70 expression and upregulates the expression of the small heat shock proteins Hsp25 and αB-crystallin at both the RNA and protein levels. ChIP assays indicate that Hsf4b, which binds to the promoters of Hsp90α, Hsp70.3, Hsp25 and αB-crystallin but not Hsp70.1, can inhibit Hsf1 binding to the Hsp70.3 promoter and suppress heat-shock-mediated Hsp70 promoter activity by reducing Hsf1 protein expression. The N-terminal hydrophobic region of Hsf4b can interact with the N-terminal hydrophobic region of Hsf1; this interaction impairs Hsf1's intramolecular interaction between its N- and C-terminal hydrophobic regions, leading to Hsf1's cytosolic retention and protein degradation. Both lysosome inhibitors (chloroquine, and pepstatin A plus E64d) and the proteasome inhibitor MG132 can inhibit Hsf4-mediated Hsf1 protein degradation, although MG132 also induces Hsf1 activation. Upregulation of Hsf4b significantly inhibits cisplatin- and staurosporine-induced lens epithelial cell apoptosis through direct upregulation of Hsp25 and αB-crystallin expression. Taken together, our results suggest that upregulation of Hsf4b modulates the expression pattern of heat shock proteins in lens tissue either by binding directly to their promoters or by promoting Hsf1 protein degradation, and that it protects lens cell survival by upregulating anti-apoptotic pathways. These findings reveal a novel regulatory mechanism between Hsf1 and Hsf4b in maintaining lens epithelial cell homeostasis.
Non-autoregressive models generate target words in parallel, achieving faster decoding at the cost of translation accuracy. To remedy the flawed translations produced by non-autoregressive models, a promising approach is to train a conditional masked translation model (CMTM) and refine the generated results over several iterations. Unfortunately, such an approach hardly considers the sequential dependency among target words, which inevitably degrades translation quality. Hence, instead of solely training a Transformer-based CMTM, we propose a Self-Review Mechanism to infuse sequential information into it. Concretely, we apply a left-to-right mask to the same decoder of the CMTM and then induce it to autoregressively review whether each word generated by the CMTM should be replaced or kept. Experimental results on WMT14 En↔De and WMT16 En↔Ro demonstrate that our model requires dramatically less training computation than a typical CMTM and outperforms several state-of-the-art non-autoregressive models by over 1 BLEU. With knowledge distillation, our model even surpasses a typical left-to-right Transformer while significantly speeding up decoding.
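To make the two decoding modes concrete, the sketch below reuses a single Transformer decoder in two passes: an unmasked pass for the parallel CMTM word predictions, and a causally masked pass that autoregressively decides, per position, whether each generated word should be kept or replaced. This is a minimal PyTorch sketch under our own assumptions; the class and head names (SelfReviewCMTM, review_head, etc.) are illustrative, not the paper's actual implementation.

```python
import torch
import torch.nn as nn

class SelfReviewCMTM(nn.Module):
    """One decoder, two modes: parallel CMTM generation and autoregressive
    self-review (a hypothetical sketch, not the authors' code)."""

    def __init__(self, vocab_size, d_model=512, nhead=8, num_layers=6):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        layer = nn.TransformerDecoderLayer(d_model, nhead, batch_first=True)
        self.decoder = nn.TransformerDecoder(layer, num_layers)  # shared by both passes
        self.generator = nn.Linear(d_model, vocab_size)  # CMTM word-prediction head
        self.review_head = nn.Linear(d_model, 2)         # keep (0) / replace (1)

    def generate(self, tgt_tokens, memory):
        # CMTM pass: no target-side mask, so every position is predicted
        # in parallel from the (partially masked) target and encoder memory.
        h = self.decoder(self.embed(tgt_tokens), memory)
        return self.generator(h).argmax(dim=-1)

    def review(self, generated, memory):
        # Self-review pass: the *same* decoder, now with a left-to-right
        # (causal) mask, so each keep/replace decision only attends to
        # the words preceding it.
        T = generated.size(1)
        causal = torch.triu(torch.ones(T, T, dtype=torch.bool), diagonal=1)
        h = self.decoder(self.embed(generated), memory, tgt_mask=causal)
        return self.review_head(h).argmax(dim=-1)  # 1 → replace this word
```

In an iterative refinement loop of the kind the abstract describes, positions flagged as "replace" would presumably be re-masked and re-predicted by the parallel CMTM pass in the next iteration.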