2016
DOI: 10.1093/reseval/rvw018
Integrating metrics to measure research performance in social sciences and humanities: The case of the Spanish CSIC

Abstract: Knowledge dissemination in the social sciences and humanities (SSH) is characterized by an assorted set of publication channels and a more prevalent use of local languages, so international bibliographic databases do not provide a practical study source by themselves. This paper aims at laying the foundations for a comprehensive study of the research performance of SSH CSIC researchers from a micro-level perspective. Both the WoS and an internal CSIC database ('ConCiencia') are used in combination with a set o…

Cited by 8 publications (4 citation statements)
References 33 publications
“…We believe that these mapping and contextual approaches to altmetrics can be particularly relevant for exploring the impacts (both academic and societal) of the social sciences and the humanities. These are scholarly areas which are very badly covered by scientific databases and societal impact indicators, because non-standard publications (Díaz-Faes et al 2016) and informal interactions regarding socioeconomic and cultural issues are more common than in other fields (Olmos et al 2013). Another issue of special interest is the use of these approaches to explore the engagement of researchers with local or global peers or stakeholders, an issue that is hotly debated in countries in the 'scientific peripheries' with pressure to publish internationally (Piñeiro & Hicks 2015;Chavarro et al 2017).…”
Section: Discussion
confidence: 99%
“…The same applies to journals in languages other than English. An extensive bibliography acknowledges these limitations [15]. • Inside GesBIB, information is processed and enriched following this iterative process:…”
Section: Methodology and Scope
confidence: 99%
“…Given the perverse incentives enabled by current mechanisms of evaluation in higher education, is it any surprise that scholars act in self-serving ways: they treat graduate students with disdain (K. J. Baker, 2018;Braxton et al, 2011;Noy and Ray, 2012), steal each other's ideas (Bouville, 2008;Douglas, 1992;Grossberg, 2004;Hansson, 2008;Lawrence, 2002;Martin, 1997;Resnik, 2012), engage in citation gaming practices (Baccini et al, 2019; Cronin, 2014; Gruber, 2014; Rouhi, 2017; Sabaratnam and Kirby, 2014) such as "citation cartels" (Franck, 1999;Onbekend et al, 2016) or even outright citation malpractice (Davenport and Snyder, 1995), cite only those with whom they agree (Hojat et al, 2003;Mahoney, 1977), insist that their Ph.D. students cite them in every work (Hüppauf, 2018;Sugimoto, 2014), require undergraduates buy their $200 book, 4 manipulate images to better suit their argument (Clark, 2013;Cromey, 2010;Jordan, 2014), manipulate p-values (Gelman and Loken, 2013;Head et al, 2015;Wicherts et al, 2016), 5 denigrate competitors' research in peer review (Balietti et al, 2016;Lee and Schunn, 2011;Mahoney, 1977;Mallard et al, 2009;Penders, 2018;Rouhi, 2017)-or openly ridicule earnest peer review of what turn out to be hoax papers (Mounk et al, 2018;Schliesser, 2018;White, 2004)-or change their research to suit the metrics, as Aagaard et al (2015) and Díaz-Faes et al (2016) and many others …”
Section: Perverse Incentives
confidence: 99%