Development of shape memory polymer materials with integrated self-healing ability, shape memory behavior, and outstanding mechanical properties is a challenge. Herein, isophorone diisocyanate, polytetramethylene ether glycol, dimethylglyoxime, and glycerol were used to prepare a polyurethane by reacting at 80 °C for 6 h. Graphene oxide (GO) was then added and the reaction was kept at 80 °C for 4 h to obtain a polyurethane/GO composite with self-healing and shape memory properties. Scanning electron microscopy shows that the GO sheets were dispersed uniformly in the polyurethane matrix. The thermal stability was characterized by thermogravimetric analysis. Tensile tests show that the Young's modulus of the composites increases from 38.57 ± 4.35 MPa for pure polyurethane to 95.36 ± 10.35 MPa for the composite with a GO content of 0.5 wt%, and the tensile strength increases from 6.28 ± 0.67 to 15.65 ± 1.54 MPa. The oxime carbamate bonds and hydrogen bonds endow the composite with good self-healing ability; the healing efficiency can reach 98.84%. In addition, the composite exhibits excellent shape memory behavior, with a shape recovery ratio of 88.6% and a shape fixation ratio of 55.2%. This work provides a promising route to fabricate stimulus-responsive composites with versatile functions.
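For reference, the reported metrics are usually computed from tensile strength before and after healing and from the programmed, fixed, and recovered strains. The definitions below are the commonly used ones and are assumed here; the abstract does not state the exact protocol used in this work.

```latex
% Commonly used definitions (assumed; the paper's exact protocol may differ):
% sigma_healed / sigma_original : tensile strength after / before healing
% eps_m : programmed strain, eps_u : fixed strain after unloading, eps_p : residual strain after recovery
\eta_{\mathrm{healing}} = \frac{\sigma_{\mathrm{healed}}}{\sigma_{\mathrm{original}}} \times 100\%
\qquad
R_f = \frac{\varepsilon_u}{\varepsilon_m} \times 100\%
\qquad
R_r = \frac{\varepsilon_m - \varepsilon_p}{\varepsilon_m} \times 100\%
```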
As an inevitable phenomenon in most optical remote-sensing images, the effect of shadows is prominent in urban scenes. Shadow detection is critical for exploiting shadows and recovering the distorted information. Unfortunately, automatic shadow detection methods for urban aerial images generally cannot achieve satisfactory performance due to the limitation of feature patterns and the lack of consideration of non-local contextual information. To address this challenging problem, this paper develops a global-spatial-context-attention (GSCA) module that self-adaptively aggregates global contextual information over the spatial dimension for each pixel. The GSCA module was embedded into a modified U-shaped encoder–decoder network derived from the UNet architecture to output the final shadow predictions. The network was trained on a newly created shadow detection dataset, and the binary cross-entropy (BCE) loss function was modified to enhance the training procedure. The performance of the proposed method was evaluated on several typical urban aerial images. Experimental results suggest that the proposed method achieves a better trade-off between automaticity and accuracy. The F1-score, overall accuracy, balanced-error-rate, and intersection-over-union metrics of the proposed method were higher than those of other state-of-the-art shadow detection methods.
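The abstract does not specify the internal layout of the GSCA module; the sketch below assumes a non-local, self-attention-style formulation in which every pixel attends to all spatial positions, which matches the stated goal of aggregating global context over the spatial dimension. Names such as `GlobalSpatialContextAttention` and the `reduction` parameter are illustrative, not taken from the paper.

```python
# Hypothetical sketch of a global-spatial-context-attention (GSCA) style module,
# assuming a non-local / self-attention formulation over the spatial dimension.
import torch
import torch.nn as nn

class GlobalSpatialContextAttention(nn.Module):
    def __init__(self, channels, reduction=8):
        super().__init__()
        inter = max(channels // reduction, 1)
        self.query = nn.Conv2d(channels, inter, kernel_size=1)   # per-pixel query
        self.key = nn.Conv2d(channels, inter, kernel_size=1)     # per-pixel key
        self.value = nn.Conv2d(channels, channels, kernel_size=1)
        self.softmax = nn.Softmax(dim=-1)
        self.gamma = nn.Parameter(torch.zeros(1))                 # learnable residual weight

    def forward(self, x):
        b, c, h, w = x.shape
        n = h * w
        q = self.query(x).view(b, -1, n).permute(0, 2, 1)        # (B, N, C')
        k = self.key(x).view(b, -1, n)                            # (B, C', N)
        attn = self.softmax(torch.bmm(q, k))                      # (B, N, N): each pixel attends to all pixels
        v = self.value(x).view(b, c, n)                           # (B, C, N)
        out = torch.bmm(v, attn.permute(0, 2, 1)).view(b, c, h, w)
        return self.gamma * out + x                               # residual aggregation of global context

# Example: attach the module to a 64-channel encoder feature map.
feats = torch.randn(2, 64, 32, 32)
gsca = GlobalSpatialContextAttention(64)
print(gsca(feats).shape)  # torch.Size([2, 64, 32, 32])
```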
Convolutional Neural Networks (CNNs), such as U-Net, have shown competitive performance in the automatic extraction of buildings from Very High-Resolution (VHR) aerial images. However, due to unstable multi-scale context aggregation, insufficient combination of multi-level features, and the lack of consideration of semantic boundaries, most existing CNNs produce incomplete segmentation for large-scale buildings and yield predictions with high uncertainty at building boundaries. This paper presents a novel network with a dedicated boundary-aware loss, called the Boundary-Aware Refined Network (BARNet), to address these gaps. The distinguishing components of the proposed BARNet are the gated-attention refined fusion unit, the denser atrous spatial pyramid pooling module, and the boundary-aware loss. The performance of the BARNet is tested on two popular datasets that include various urban scenes and diverse patterns of buildings. Experimental results demonstrate that the proposed method outperforms several state-of-the-art approaches in both visual interpretation and quantitative evaluations.
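The exact form of the boundary-aware loss is not given in the abstract; the sketch below shows one common way such a loss is built, by up-weighting the binary cross-entropy near ground-truth building edges extracted with a max-pooling "morphological gradient". The function name, kernel size, and weighting scheme are assumptions for illustration and may differ from the paper's formulation.

```python
# Hypothetical boundary-aware loss sketch: standard BCE plus an extra penalty
# on pixels near building boundaries derived from the ground-truth mask.
import torch
import torch.nn.functional as F

def boundary_aware_loss(logits, target, kernel_size=3, boundary_weight=2.0):
    """logits, target: (B, 1, H, W); target holds 0/1 building labels."""
    bce = F.binary_cross_entropy_with_logits(logits, target, reduction="none")

    # Dilation minus erosion of the mask marks pixels adjacent to a boundary.
    pad = kernel_size // 2
    dilated = F.max_pool2d(target, kernel_size, stride=1, padding=pad)
    eroded = -F.max_pool2d(-target, kernel_size, stride=1, padding=pad)
    boundary = (dilated - eroded).clamp(0, 1)

    # Up-weight the BCE on boundary pixels so the network focuses on crisp edges.
    weights = 1.0 + boundary_weight * boundary
    return (weights * bce).mean()

# Example usage with random predictions and a toy mask.
pred = torch.randn(2, 1, 64, 64)
mask = (torch.rand(2, 1, 64, 64) > 0.5).float()
print(boundary_aware_loss(pred, mask).item())
```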