2010 International Conference on Digital Manufacturing & Automation
DOI: 10.1109/icdma.2010.328
Numerical Simulation and Analysis of Flow and Heat Transfer in the High-Pressure Vortex Tube

Cited by 14 publications (7 citation statements)
References 4 publications
“…These findings were largely confirmed even on the single-item level (see Supplementary Table 1). Diagnostic agreement between the interviewers and both independent raters was "substantial" to "almost perfect" for most disorders, with the exceptions of ADHD hyperactive-impulsive type and DMDD (Landis and Koch, 1977). With regard to diagnosing ADHD and its subtypes, we found substantial agreement for any ADHD diagnosis, for ADHD combined type, and for ADHD inattentive type.…”
Section: Discussion (mentioning)
confidence: 68%
“…Fleiss' kappa was calculated between the interviewers and two raters. To interpret kappa values, Landis and Koch (1977) suggested the following benchmarks: slight ≤ 0.20; fair = 0.21-0.40; moderate = 0.41-0.60; substantial = 0.61-0.80; almost perfect agreement ≥ 0.81.…”
Section: Discussion (mentioning)
confidence: 99%
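The benchmark ranges quoted above translate directly into a lookup. The following Python sketch is illustrative only, not code from the cited studies: the helper name and the toy ratings array are invented, and the computation uses statsmodels' Fleiss' kappa as one possible implementation of the statistic named in the excerpt.

```python
# Minimal sketch, assuming hypothetical ratings data; shows Fleiss' kappa
# plus the Landis and Koch (1977) verbal benchmarks quoted above.
import numpy as np
from statsmodels.stats.inter_rater import aggregate_raters, fleiss_kappa

def landis_koch_label(kappa: float) -> str:
    """Map a kappa value to the Landis and Koch (1977) agreement category."""
    if kappa < 0.0:
        return "poor"
    if kappa <= 0.20:
        return "slight"
    if kappa <= 0.40:
        return "fair"
    if kappa <= 0.60:
        return "moderate"
    if kappa <= 0.80:
        return "substantial"
    return "almost perfect"

# Hypothetical ratings: rows = subjects, columns = raters, values = diagnostic category.
ratings = np.array([
    [1, 1, 1],
    [1, 1, 2],
    [2, 2, 2],
    [0, 0, 0],
    [2, 1, 2],
])

# Convert rater-wise ratings to a subjects-by-categories count table, then compute kappa.
table, _ = aggregate_raters(ratings)
kappa = fleiss_kappa(table, method="fleiss")
print(f"Fleiss' kappa = {kappa:.2f} ({landis_koch_label(kappa)} agreement)")
```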
“…Once every two weeks, each child was observed in couples while playing with either the DS doll or the control doll. Interrater reliability was Kappa = 0.58, which corresponds to a moderate agreement (Landis and Koch 1977).…”
Section: Methods (mentioning)
confidence: 92%
“…Interobserver variability was reflected using weighted Kappa statistics for ordinal variables and intraclass correlation coefficients (ICC) with a two-way random model for continuous variables. Kappa and ICC values were interpreted according to Landis and Koch [17], where < 0.0 reflects 'poor', 0.0-0.20 'slight', 0.21-0.40 'fair', 0.41-0.60 'moderate', 0.61-0.80 'substantial' and > 0.81 'almost perfect' agreement.…”
Section: Discussion (mentioning)
confidence: 99%
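As a rough illustration of the statistics named in this excerpt, the sketch below computes a quadratic-weighted kappa with scikit-learn and a two-way random-effects ICC with pingouin. The raters, subjects, and scores are invented for the example and do not come from the cited study; the choice of quadratic weights is likewise an assumption, since the excerpt does not specify the weighting scheme.

```python
# Minimal sketch, assuming made-up data for two raters: weighted kappa for an
# ordinal variable and a two-way random-effects ICC for a continuous variable.
import numpy as np
import pandas as pd
import pingouin as pg
from sklearn.metrics import cohen_kappa_score

# Two raters scoring the same eight cases on an ordinal scale (hypothetical data).
rater_a = [0, 1, 2, 2, 3, 1, 0, 2]
rater_b = [0, 1, 2, 3, 3, 1, 1, 2]
weighted_kappa = cohen_kappa_score(rater_a, rater_b, weights="quadratic")
print(f"Weighted kappa = {weighted_kappa:.2f}")

# Continuous measurements in long format: one row per (subject, rater) pair.
rng = np.random.default_rng(0)
long = pd.DataFrame({
    "subject": np.tile(np.arange(8), 2),
    "rater":   np.repeat(["A", "B"], 8),
    "score":   np.r_[rater_a, rater_b] + rng.normal(0, 0.1, 16),
})
icc = pg.intraclass_corr(data=long, targets="subject", raters="rater", ratings="score")
# ICC2 corresponds to a two-way random-effects, single-rater, absolute-agreement model.
print(icc.loc[icc["Type"] == "ICC2", ["Type", "ICC", "CI95%"]])
```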