Integrating Ni passivation with surface functionalization by hydrophobic ammonium cations is demonstrated to enhance the moisture stability of the perovskite. The functionalized perovskite photoanode performs steady water oxidation for more than 30 min.
Developing all-perovskite tandem solar cells has proved to be an effective approach to push efficiency beyond the Shockley–Queisser limit. However, Sn-based narrow-bandgap (NBG) perovskite solar cells (PSCs) suffer from relatively low photostability, which limits their application in all-perovskite tandems. In this work, the instability of NBG PSCs is traced to the commonly used acidic hole-transporting material PEDOT:PSS, which reacts with the indispensable basic additive SnF2 in the perovskite layer. By controlling the acidity of PEDOT:PSS with aqueous ammonia, the NBG PSCs achieve an efficiency of 22.0% with much improved photostability, maintaining 91.3% of the initial value after 800 h of illumination under AM 1.5G. The corresponding all-perovskite tandem cells exhibit a stabilized efficiency of 25.3%, retaining 92% of the initial value after 560 h of illumination. This work reveals an origin of instability in NBG PSCs and provides an effective approach to enhancing device stability, which can promote the development of all-perovskite tandem solar cells.
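To make the reported stability figures concrete, a brief back-of-the-envelope calculation follows. This is a sketch only: the efficiency and retention numbers come from the abstract above, while the linear-degradation averaging is our own simplifying assumption, not a claim from the paper.

```python
# Back-of-the-envelope check of the reported stability figures.
# Numbers are taken from the abstract; the linear-decay averaging
# below is an illustrative assumption only.
initial_pce_single = 22.0          # %, NBG single-junction PSC
retained_single = 0.913            # fraction retained after 800 h, AM 1.5G
initial_pce_tandem = 25.3          # %, all-perovskite tandem (stabilized)
retained_tandem = 0.92             # fraction retained after 560 h

print(f"NBG PSC after 800 h:  {initial_pce_single * retained_single:.1f}%")  # ~20.1%
print(f"Tandem after 560 h:   {initial_pce_tandem * retained_tandem:.1f}%")  # ~23.3%

# Naive average relative loss per 100 h, assuming linear decay:
print(f"NBG loss rate:    {(1 - retained_single) / 8.0 * 100:.2f}% per 100 h")   # ~1.09
print(f"Tandem loss rate: {(1 - retained_tandem) / 5.6 * 100:.2f}% per 100 h")   # ~1.43
```

Under this naive linear reading, both devices lose roughly 1% (relative) of their efficiency per 100 h of illumination.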
Graph Transformers are gaining increasing attention in machine learning and have demonstrated state-of-the-art performance on graph representation learning benchmarks. However, because current Graph Transformer implementations focus primarily on small-scale graphs, the quadratic complexity of global self-attention makes full-batch training infeasible on larger graphs. Conventional sampling-based methods, in turn, fail to capture the necessary high-level contextual information, resulting in a significant loss of performance. In this paper, we introduce the Hierarchical Scalable Graph Transformer (HSGT) to address these challenges. HSGT scales the Transformer architecture to node representation learning on large-scale graphs while maintaining high performance. By exploiting graph hierarchies constructed through coarsening, HSGT efficiently updates and stores multi-scale information in node embeddings at different levels. Combined with sampling-based training, HSGT captures and aggregates multi-level information on the hierarchical graph using only Transformer blocks. Empirical evaluations demonstrate that HSGT achieves state-of-the-art performance on large-scale benchmarks containing millions of nodes, with high efficiency.
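The following is a minimal, hypothetical PyTorch sketch of the hierarchical idea described above: node features are pooled into super-nodes via a coarsening assignment, Transformer blocks run at both levels, and the coarse context is broadcast back and fused into the fine-level embeddings. The class and function names, the random toy coarsening, and the fusion step are all illustrative assumptions rather than the paper's implementation; for brevity the fine level uses plain global attention on a small graph, whereas HSGT combines sampling with the hierarchy precisely to avoid that quadratic cost.

```python
# Illustrative two-level hierarchical graph transformer in the spirit of
# HSGT. All names and the toy coarsening are assumptions for exposition.
import torch
import torch.nn as nn


def toy_coarsen(num_nodes: int, num_clusters: int) -> torch.Tensor:
    """Stand-in for a real graph-coarsening algorithm: random hard
    assignment of each node to one super-node (cluster)."""
    assign = torch.zeros(num_nodes, num_clusters)
    assign[torch.arange(num_nodes), torch.randint(num_clusters, (num_nodes,))] = 1.0
    return assign  # (N, C) assignment matrix


class HierarchicalGraphTransformer(nn.Module):
    def __init__(self, dim: int = 64, heads: int = 4):
        super().__init__()
        # One Transformer block per hierarchy level; HSGT-style models
        # would stack several and attend only within sampled subgraphs.
        self.fine_block = nn.TransformerEncoderLayer(
            d_model=dim, nhead=heads, dim_feedforward=2 * dim, batch_first=True
        )
        self.coarse_block = nn.TransformerEncoderLayer(
            d_model=dim, nhead=heads, dim_feedforward=2 * dim, batch_first=True
        )
        self.fuse = nn.Linear(2 * dim, dim)  # merge multi-scale embeddings

    def forward(self, x: torch.Tensor, assign: torch.Tensor) -> torch.Tensor:
        # x: (N, dim) node features; assign: (N, C) coarsening assignment.
        h_fine = self.fine_block(x.unsqueeze(0)).squeeze(0)       # (N, dim)
        # Pool fine embeddings into super-node embeddings (mean per cluster).
        sizes = assign.sum(dim=0, keepdim=True).clamp(min=1.0)    # (1, C)
        h_coarse = (assign.T @ h_fine) / sizes.T                  # (C, dim)
        h_coarse = self.coarse_block(h_coarse.unsqueeze(0)).squeeze(0)
        # Broadcast coarse context back to each fine node and fuse.
        context = assign @ h_coarse                               # (N, dim)
        return self.fuse(torch.cat([h_fine, context], dim=-1))   # (N, dim)


if __name__ == "__main__":
    n, c, d = 1000, 32, 64
    model = HierarchicalGraphTransformer(dim=d)
    out = model(torch.randn(n, d), toy_coarsen(n, c))
    print(out.shape)  # torch.Size([1000, 64])
```

The design point the sketch illustrates is that every aggregation step, both within a level and across levels, can be expressed with standard Transformer blocks plus simple pooling and broadcast matrices derived from the coarsening hierarchy.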