The present paper is concerned with a special group of approximants with B2 superstructures. In the first part, recent work on structural features of the B2 superstructure approximants is summarized. Experimental results obtained in the Al-Cu-Mn and Al-Cu systems are presented, where a series of B2-based approximants is observed. These phases all have similar valence electron concentrations, in full support of the e/a-constant definition of approximants. Special emphasis is laid on the chemical twinning modes of the B2 basic structure in relation to the Al-Cu approximants. It is revealed that the B2 twinning mode responsible for the formation of local pentagonal atomic arrangements is of the 180°/[111] type. This is also the origin of the 5-fold twinning of the B2 phase on quasicrystal surfaces. Crystallographic features of the B2, τ2, τ3, and γ phases, and of other newly discovered phases, are also discussed. In all these phases, local pentagonal configurations are revealed. In the second part, dry tribological properties of some Al-Cu-Fe samples containing B2-type phases are presented. The results indicate that B2 phases with a valence electron concentration near that of the quasicrystal possess a low friction coefficient under various loads, comparable with that of the annealed quasicrystalline ingot. Such a result indicates that a B2-type phase with e/a near that of the quasicrystal is indeed an approximant, in full support of the valence electron criterion for approximants.
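The e/a-constant criterion above rests on a composition-weighted average of per-element valences. As a minimal sketch of that bookkeeping, the snippet below computes e/a for an Al-Cu-Fe composition; the valence values are an assumption, following the convention common in the quasicrystal literature (Al = +3, Cu = +1, negative valences for transition metals after Raynor), and exact values vary between authors.

```python
# Hedged illustration: composition-weighted valence electron concentration.
# Per-element valences are assumptions (Raynor-style values commonly used
# in the quasicrystal literature); different authors use slightly
# different numbers.
VALENCE = {"Al": 3.0, "Cu": 1.0, "Fe": -2.66, "Mn": -3.66}

def e_over_a(composition):
    """composition: dict mapping element symbol -> atomic fraction."""
    total = sum(composition.values())
    return sum(frac * VALENCE[el] for el, frac in composition.items()) / total

# Icosahedral Al62.5Cu25Fe12.5 comes out near the often-quoted e/a of ~1.8:
print(e_over_a({"Al": 0.625, "Cu": 0.25, "Fe": 0.125}))  # close to 1.79
```

With this convention, a B2-type phase is classed as an approximant when its computed e/a falls close to the quasicrystal's value, which is the comparison the abstract describes.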
A certain degree of ductility may occur during friction tests on quasicrystalline materials that are intrinsically brittle. This is, at least in part, due to a solid-state phase transition from the icosahedral phase to a BCC phase. The present paper first summarizes the phase transition features of quasicrystals and then examines the microstructural mechanism of scratch indentation on an icosahedral Al-Cu-Fe sample. The last part of the paper is devoted to a discussion of how this BCC phase is correlated with the quasicrystal.
Inspired by two basic mechanisms in animal visual systems, we introduce a feature transform technique that imposes invariance properties in the training of deep neural networks. The resulting algorithm requires less parameter tuning, trains well with an initial learning rate of 1.0, and generalizes easily to different tasks. We enforce scale invariance with local statistics in the data to align similar samples generated in diverse situations. To accelerate convergence, we enforce a GL(n)-invariance property using global statistics extracted from a batch, so that the gradient-descent solution remains invariant under a change of basis. Tested on the ImageNet, MS COCO, and Cityscapes datasets, our proposed technique requires fewer iterations to train, surpasses all baselines by a large margin, works seamlessly with both small and large batch sizes, and applies to different computer vision tasks: image classification, object detection, and semantic segmentation.
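The abstract does not spell out the exact transform, but the two ingredients it names can be sketched as follows: a per-sample scale normalization using local statistics, followed by a whitening step using global batch statistics, which makes the output covariance the identity and therefore insensitive (up to rotation) to any invertible linear basis change of the inputs. This is a minimal NumPy illustration of those two mechanisms, not the paper's actual algorithm; the function names and the choice of RMS normalization and ZCA-style whitening are assumptions.

```python
import numpy as np

def scale_normalize(x, eps=1e-5):
    # Local statistics: divide each sample (row) by its own RMS, so the
    # output is invariant to per-sample rescaling of the input.
    rms = np.sqrt((x ** 2).mean(axis=1, keepdims=True) + eps)
    return x / rms

def batch_whiten(x, eps=1e-5):
    # Global batch statistics: center the batch, then apply a ZCA-style
    # whitening so the output covariance is (approximately) the identity.
    # An invertible linear map applied to all inputs -- a GL(n) basis
    # change -- is undone up to an orthogonal rotation.
    xc = x - x.mean(axis=0, keepdims=True)
    cov = xc.T @ xc / len(xc) + eps * np.eye(x.shape[1])
    vals, vecs = np.linalg.eigh(cov)
    w = vecs @ np.diag(vals ** -0.5) @ vecs.T
    return xc @ w

x = np.random.randn(128, 16)
y = batch_whiten(scale_normalize(x))
# Output covariance is close to the identity:
print(np.allclose(np.cov(y, rowvar=False, bias=True), np.eye(16), atol=1e-2))
```

In a training setting, transforms of this kind are typically inserted between layers, with running estimates of the batch statistics kept for inference, in the same spirit as batch normalization.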