This study was conducted to determine the family, social and economic factors associated with deaths of children aged under 5 years. A registry-based nested case-control study was conducted of the deaths of all children aged under 5 years in Kohgilooyeh and Boyer-Ahmad Province in the Islamic Republic of Iran. For each death, two controls were randomly selected among children of the same age, sex and place of residence (186 cases and 372 controls). Congenital abnormality (37.6%) and preterm birth (29.0%) were the two most frequent causes of death among children aged under 5 years. No vaccine-preventable disease was reported as a cause of death. The strongest associations were found with consanguinity of the parents (OR = 3.92; 95% CI = 2.27-6.85 for being first cousins in comparison with no family relation; P < 0.001) and with domestic violence against the mother during pregnancy (OR = 3.13; 95% CI = 1.60-6.17; P < 0.01). The main causes of death of children aged under 5 years in the Province were congenital abnormality and prematurity.
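As a side note on the statistics reported above, here is a minimal Python sketch of how an unadjusted odds ratio and its Woolf 95% confidence interval are computed from a 2x2 exposure table. The counts below are invented for illustration and are not the study's data; a matched design like this one would properly be analyzed with conditional logistic regression rather than a crude OR.

```python
import math

# Hypothetical 2x2 table (illustrative counts only, NOT the study's data):
# exposure = parental consanguinity; outcome = under-5 death.
#            exposed  unexposed
# cases         a=40      b=60
# controls      c=20     d=120
a, b, c, d = 40, 60, 20, 120

odds_ratio = (a * d) / (b * c)            # (40*120)/(60*20) = 4.0
se = math.sqrt(1/a + 1/b + 1/c + 1/d)     # Woolf's SE of log(OR)
log_or = math.log(odds_ratio)
lo = math.exp(log_or - 1.96 * se)
hi = math.exp(log_or + 1.96 * se)
print(f"OR = {odds_ratio:.2f}, 95% CI = {lo:.2f}-{hi:.2f}")
# -> OR = 4.00, 95% CI = 2.15-7.44
```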
Problem statement: This review focuses on studies of dominance, marital satisfaction and female aggression (physical and psychological). The article contains two parts: the first covers findings showing that female dominance is related to aggression, and the second covers studies showing that marital satisfaction is correlated with female aggression. According to the existing literature, the rate of female aggression is associated with both dominance and marital satisfaction. Nevertheless, relatively little research has examined dominance, marital satisfaction and female aggression together, despite its importance for better family life and a better society in general. Conclusion: Without studies of women's aggression, aggression-related conflict behavior in the family will not be resolved. Researchers must pay more attention to female aggression.
This paper reviews studies of female aggression (physical and psychological). The article contains three parts: the first focuses on findings showing that rates of aggression in men and women are equal; the second focuses on research indicating that rates of physical aggression are higher in women than in men; and the third covers studies reporting higher levels of psychological aggression in women than in men. According to the existing literature, the rate of female aggression is equal to that of men, and in some studies the rates of physical and psychological aggression among women are found to be higher than among men. Thus, it is concluded that the rate of women's aggression is not lower than men's: it is either equal or higher.
GPT is an auto-regressive Transformer-based pre-trained language model that has attracted a lot of attention in the natural language processing (NLP) domain due to its state-of-the-art performance on several downstream tasks. The success of GPT is mostly attributed to its pre-training on a huge amount of data and its large number of parameters (from 100M to billions). Despite the superior performance of GPT (especially in few-shot or zero-shot setups), its overparameterized nature can be very prohibitive for deploying the model on devices with limited computational power or memory. This problem can be mitigated with model compression techniques; however, compressing GPT models has not been investigated much in the literature. In this work, we use Kronecker decomposition to compress the linear mappings of the GPT-2 model. Our Kronecker GPT-2 model (KnGPT2) is initialized from the Kronecker-decomposed version of GPT-2 and then undergoes very light pre-training on only a small portion of the training data with intermediate-layer knowledge distillation (ILKD). Finally, KnGPT2 is fine-tuned on downstream tasks using ILKD as well. We evaluate our model on both language modeling and General Language Understanding Evaluation benchmark tasks and show that, with more efficient pre-training and a similar number of parameters, KnGPT2 significantly outperforms the existing DistilGPT2 model.
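For intuition, here is a minimal NumPy sketch of one standard way to initialize Kronecker factors from a pretrained dense weight: Van Loan's nearest-Kronecker-product method, which rearranges the matrix so the best Kronecker factors fall out of a rank-1 SVD. The factor shapes below are illustrative assumptions, not the shapes or initialization procedure the paper necessarily uses.

```python
import numpy as np

def nearest_kronecker(W, a_shape, b_shape):
    """Best A kron B approximation to W in Frobenius norm (Van Loan).
    Rearranging W turns the Kronecker structure into a rank-1 structure,
    so the leading SVD component gives the optimal factors."""
    (m1, n1), (m2, n2) = a_shape, b_shape
    assert W.shape == (m1 * m2, n1 * n2)
    # For an exact product, W[i1*m2+i2, j1*n2+j2] = A[i1,j1] * B[i2,j2],
    # so group (i1,j1) as rows and (i2,j2) as columns.
    R = W.reshape(m1, m2, n1, n2).transpose(0, 2, 1, 3).reshape(m1 * n1, m2 * n2)
    U, S, Vt = np.linalg.svd(R, full_matrices=False)
    A = np.sqrt(S[0]) * U[:, 0].reshape(m1, n1)
    B = np.sqrt(S[0]) * Vt[0, :].reshape(m2, n2)
    return A, B

# Illustrative shapes only: compress a GPT-2-sized feed-forward weight,
# 768 x 3072 -> (12 x 48) kron (64 x 64).
rng = np.random.default_rng(0)
W = rng.normal(size=(768, 3072))
A, B = nearest_kronecker(W, (12, 48), (64, 64))
W_hat = np.kron(A, B)
print(W.size, "->", A.size + B.size, "parameters")  # 2359296 -> 4672
print("relative error:", np.linalg.norm(W - W_hat) / np.linalg.norm(W))
```

Note that applying A kron B never requires materializing the full matrix: with the input x reshaped row-major to an n1 x n2 matrix X, (A kron B)x equals the row-major flattening of A @ X @ B.T, which is what makes the compression pay off in compute as well as memory at inference time.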