Publication: Unbiased evaluation of ranking metrics reveals consistent performance in science and technology citation data
Citations
Xu, S., Mariani, M., Lü, L., & Medo, M. (2020). Unbiased evaluation of ranking metrics reveals consistent performance in science and technology citation data. Journal of Informetrics, 14(1), 101005. https://doi.org/10.1016/j.joi.2019.101005
Abstract
Despite the increasing use of citation-based metrics for research evaluation purposes, we do not know yet which metrics best deliver on their promise to gauge the significance of a scientific paper or a patent. We assess 17 network-based metrics by their ability to identify milestone papers and patents in three large citation datasets. We find that traditional information-retrieval evaluation metrics are strongly affected by the interplay between the age distribution of the milestone items and age biases of the evaluated metrics. Outc
Additional indexing
Creators (Authors): Xu, S.; Mariani, M.; Lü, L.; Medo, M.
Volume: 14
Number: 1
Page range/Item number: 101005
Item Type: Journal Article
Scope
Language: English
Publication date: 2020
Date available
ISSN or e-ISSN
OA Status
Publisher DOI: https://doi.org/10.1016/j.joi.2019.101005
Other Identification Number