2016
DOI: 10.24069/2542-0267-2016-1-4-25-31
A «Basket of Metrics»—the Best Support for Understanding Journal Merit

Abstract: The aim of this article is to test the claim that effective evaluation based on quantitative indicators requires the use of not one but a whole set of specific indicators (a kind of "basket of metrics") for a more multifaceted and deeper analysis of journal quality. Methods. To study opinions, a survey was conducted (number surveyed: 204; response rate: 61%) of the international scientific community on the use of metrics (indicators) in systems for evaluating the quality of journals and publications. Results. The answer "…

Cited by 8 publications (5 citation statements). References 3 publications.
“…As a scientist, I fully agree with proposals that have been around for a long time that the impact factor should not be the sole element for the evaluation of individual researchers and/or articles [6]. It may be useful: I know from my own and my colleagues' experience that it is more difficult to publish a paper in high-impact factor journals -if I publish there I know that, on average (!…”
Section: Invited Commentary
confidence: 83%
“…Similar to Principle 3, but with a global perspective, the diverse nature of research at the institution, as well as in the field, should be highlighted, and appropriate denominators and indicators should be requested. Using a range of appropriate metrics from which at least two indicators are chosen for assessment is a reasonable approach (Colledge and James, 2015).…”
Section: If Openness (As Defined in the Open Definition [15])
confidence: 99%
“…This recommendation facilitates transparency of assessment practices, enables scrutiny of indicators and allows for inclusion of disciplines and data sources that are not standard (specific databases from the humanities; Danish Research Indicator Network (FIN), 2016). Bornmann and Haunschild (2016) point out that altmetrics are a valuable resource adding to the “basket of metrics” (Colledge and James, 2015) and expert assessment. Although they neither replace traditional metrics nor peer review, altmetrics can almost immediately provide a broader picture of engagement with scholarly products, provided they are equipped with persistent identifiers, e.g.…”
Section: Evaluation of Leiden Manifesto for Library Work
confidence: 99%
“…In PlumX, for example, the metrics available include usage (clicks, views, downloads, library holdings, video plays), captures (bookmarks, favourites, reference manager saves), mentions (blog posts, news mentions, comments, reviews, Wikipedia mentions), social media (tweets, +1s, likes, shares) and citations (citation indexes, patent citations, clinical citations, policy citations) [12]. Many studies have examined the advantages and disadvantages of altmetrics and their potential correlations with citation counts [13,14], and whether researchers have an appetite and willingness to use additional metrics such as usage data [15]. Whilst this paper does not go into the pros and cons of alternative metrics, we recognise a growing willingness to use such metrics in the available "basket of metrics".…”
Section: Introduction
confidence: 99%
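The citation statement above enumerates PlumX's five metric categories. As a rough illustration of what a "basket of metrics" might look like in practice, the sketch below groups several indicators per category rather than relying on a single number. The category names follow the PlumX grouping quoted above, but the data structure, counts, and `category_totals` helper are invented for illustration and do not reflect any real PlumX API.

```python
# Hypothetical sketch of a "basket of metrics": several indicators
# grouped per category, instead of one headline figure such as the
# impact factor. Counts are made-up example data.
basket = {
    "usage": {"views": 1200, "downloads": 340},
    "captures": {"bookmarks": 25, "reader_saves": 80},
    "mentions": {"blog_posts": 3, "news_mentions": 2},
    "social_media": {"tweets": 150, "shares": 40},
    "citations": {"citation_index": 18, "policy_citations": 1},
}

def category_totals(basket):
    """Sum the individual indicators within each category."""
    return {cat: sum(vals.values()) for cat, vals in basket.items()}

print(category_totals(basket))
```

Reporting the per-category totals side by side, rather than collapsing them into one score, is the kind of multifaceted view the surveyed community favoured.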