Empirical statistical downscaling methods are becoming increasingly popular in climate change impact assessments that require downscaling projections from multiple global climate models (GCMs). Here, empirical statistical downscaling methods are classified based on calibration strategies [bias correction (BC) and change factor (CF)] and statistical transformations (mean based, variance based, quantile mapping, quantile correcting and transfer function methods). Ten combinations of calibration strategies and transformation methods are used to represent a range of empirical statistical downscaling methods. To test the performance of these methods in downscaling daily precipitation and temperature, an inter-model cross-validation is carried out using an ensemble of 16 GCMs from the Coupled Model Intercomparison Project Phase 5 (CMIP5) dataset over the Huai River Basin in China. These downscaling methods are further applied to downscale the climate for the future period to assess the associated uncertainties. The results show that the CF-based methods outperform the BC-based methods in projecting the probability distribution of downscaled daily temperature, while both calibration strategies give comparable results in the case of precipitation. With the CF calibration strategy, simply adding (for temperature) or multiplying (for precipitation) the mean CF is sufficient to represent most of the relative changes projected by GCMs. The use of quantile-based methods appears to be advantageous only at the tails of the distribution. More sophisticated BC methods are needed to remove the biases in the higher-order statistics of the GCM outputs. The two calibration strategies lead to fundamentally different temporal structures and spatial variability of the downscaled climatic variables. The BC-based methods produce larger uncertainty bands of inter-annual variability than the CF-based methods.
For downscaled future precipitation, the uncertainty arising from the downscaling methods is comparable to the uncertainty arising from GCMs, while more uncertainty is introduced by calibration strategies than by statistical transformation methods.
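The contrast between the two calibration strategies with a mean-based transformation can be sketched as follows. This is an illustrative reconstruction, not the study's actual code; function and variable names are assumptions. CF perturbs the observed series by the GCM-projected change (additive for temperature, multiplicative for precipitation), so the observed day-to-day sequence is retained; BC adjusts the future GCM series by the historical bias, so the GCM's temporal structure is retained, which is one reason the two strategies yield different temporal structures and uncertainty bands.

```python
import numpy as np

def cf_mean_downscale(obs, gcm_hist, gcm_fut, multiplicative=False):
    """Change-factor (CF) calibration, mean-based transformation:
    perturb the observed series by the GCM-projected mean change.
    Additive CF is typical for temperature, multiplicative for precipitation."""
    if multiplicative:
        cf = np.mean(gcm_fut) / np.mean(gcm_hist)  # ratio change factor
        return obs * cf
    cf = np.mean(gcm_fut) - np.mean(gcm_hist)      # difference change factor
    return obs + cf

def bc_mean_downscale(obs, gcm_hist, gcm_fut, multiplicative=False):
    """Bias-correction (BC) calibration, mean-based transformation:
    adjust the future GCM series by the historical-period mean bias."""
    if multiplicative:
        bias = np.mean(obs) / np.mean(gcm_hist)
        return gcm_fut * bias
    bias = np.mean(obs) - np.mean(gcm_hist)
    return gcm_fut + bias

# Synthetic illustration (not real data): note that the CF output keeps
# the observed day-to-day sequence, while the BC output keeps the GCM's.
rng = np.random.default_rng(0)
obs = 15 + 5 * rng.standard_normal(365)       # "observed" daily temperature (degC)
gcm_hist = 14 + 6 * rng.standard_normal(365)  # GCM output, historical period
gcm_fut = 16 + 6 * rng.standard_normal(365)   # GCM output, future period

t_cf = cf_mean_downscale(obs, gcm_hist, gcm_fut)
t_bc = bc_mean_downscale(obs, gcm_hist, gcm_fut)
```

Quantile-based variants generalize the same idea by applying a separate change factor (or bias correction) at each quantile of the distribution rather than a single mean adjustment, which, as noted above, matters mainly at the distribution tails.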
Moulding compounds are used as encapsulation materials for electronic components. Their task is to protect the components from mechanical shocks and environmental effects such as moisture. Moulding compounds are epoxy resins filled with inorganic (silica) particles, carbon black and processing aids. They show a clear viscoelastic behaviour which is not only temperature but also cure dependent. Due to both thermal and reaction shrinkage, moulding compounds introduce residual stresses which may eventually result in product failure. Therefore they can be considered as key materials for the overall thermomechanical reliability. This paper deals with the characterization and modelling of the mechanical behaviour of such moulding compounds. The focus will be on the effects of the degree of cure and the filler concentration.
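A cure-dependent viscoelastic response of this kind is often represented by a Prony-series relaxation modulus whose parameters depend on the degree of cure. The sketch below is purely illustrative and is not the paper's model; all parameter values are hypothetical.

```python
import math

def relaxation_modulus(t, degree_of_cure):
    """Illustrative cure-dependent relaxation modulus (hypothetical values,
    NOT the paper's model): a one-term Prony series
        E(t) = E_inf + (E0 - E_inf) * exp(-t / tau),
    where the rubbery modulus E_inf and the relaxation time tau
    increase with the degree of cure p in [0, 1]."""
    p = degree_of_cure
    E0 = 20e9                    # glassy modulus, Pa (assumed)
    E_inf = 0.5e9 * p            # rubbery modulus grows with cure (assumed)
    tau = 10.0 ** (2.0 * p)      # relaxation time grows with cure, s (assumed)
    return E_inf + (E0 - E_inf) * math.exp(-t / tau)

# A more fully cured compound relaxes more slowly, so at the same
# elapsed time it retains a higher modulus:
E_low = relaxation_modulus(t=100.0, degree_of_cure=0.5)
E_high = relaxation_modulus(t=100.0, degree_of_cure=0.95)
```

The same structure extends to multi-term Prony series and to a time-temperature-cure shift factor when both temperature and cure dependence must be captured.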
Closely spaced, through-wafer interconnects are of great interest in RF MEMS and MEMS packaging. In this paper, a suitable technique to realize large arrays of small-size through-wafer holes is presented. This approach is based on macroporous silicon formation in combination with wafer thinning. Very high aspect ratio (>100) structures are realized. The wafers containing the large arrays of 2-3 µm wide holes are thinned down to 200-150 µm by lapping and polishing. Copper electroplating is finally employed to realize arrays of high aspect ratio Cu plugs.
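A quick arithmetic check on the figures quoted above: with hole widths of 2-3 µm and a final wafer thickness of 150-200 µm, the resulting Cu plugs span aspect ratios of roughly 50 to 100.

```python
# Aspect ratio = plug depth / plug width, using the ranges from the
# abstract above (hole width 2-3 um, thinned wafer 150-200 um).
widths_um = (2.0, 3.0)           # hole width range, micrometres
thicknesses_um = (150.0, 200.0)  # final wafer thickness range, micrometres

ar_min = min(thicknesses_um) / max(widths_um)  # 150 / 3
ar_max = max(thicknesses_um) / min(widths_um)  # 200 / 2
print(ar_min, ar_max)  # 50.0 100.0
```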