Publication:
Unbiased evaluation of ranking metrics reveals consistent performance in science and technology citation data

Date: 2020
Journal Article
Published version
cris.lastimport.scopus: 2025-06-02T03:42:31Z
cris.lastimport.wos: 2025-07-22T01:31:18Z
cris.virtual.orcid: https://orcid.org/0000-0003-1032-5821
cris.virtualsource.orcid: 82316fd6-d371-4ddd-a4dc-b79fe57b38d6
dc.contributor.institution: University of Zurich
dc.date.accessioned: 2020-02-20T10:47:51Z
dc.date.available: 2020-02-20T10:47:51Z
dc.date.issued: 2020-02-01
dc.description.abstract: Despite the increasing use of citation-based metrics for research evaluation purposes, we do not know yet which metrics best deliver on their promise to gauge the significance of a scientific paper or a patent. We assess 17 network-based metrics by their ability to identify milestone papers and patents in three large citation datasets. We find that traditional information-retrieval evaluation metrics are strongly affected by the interplay between the age distribution of the milestone items and age biases of the evaluated metrics. Outcomes of these metrics are therefore not representative of the metrics’ ranking ability. We argue in favor of a modified evaluation procedure that explicitly penalizes biased metrics and allows us to reveal metrics’ performance patterns that are consistent across the datasets. PageRank and LeaderRank turn out to be the best-performing ranking metrics when their age bias is suppressed by a simple transformation of the scores that they produce, whereas other popular metrics, including citation count, HITS and Collective Influence, produce significantly worse ranking results.
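The abstract mentions suppressing PageRank's age bias "by a simple transformation of the scores". The record does not spell that transformation out; the sketch below assumes one common choice, z-scoring each paper's score against papers of similar age, applied to a plain power-iteration PageRank on a citation graph. The function names, the `window` parameter, and the z-score choice are all illustrative assumptions, not the paper's stated method.

```python
# Hedged sketch: PageRank on a citation graph, then an age-rescaling
# step (z-score against similarly aged papers) as one plausible
# reading of the "simple transformation" the abstract refers to.
from statistics import mean, stdev


def pagerank(nodes, edges, d=0.85, iters=100):
    """Power-iteration PageRank on a directed citation graph.

    edges: (citing, cited) pairs; score flows from citing to cited.
    """
    n = len(nodes)
    out = {v: 0 for v in nodes}
    for src, _ in edges:
        out[src] += 1
    score = {v: 1.0 / n for v in nodes}
    for _ in range(iters):
        # Redistribute the score of dangling (never-citing) nodes uniformly.
        dangling = sum(score[v] for v in nodes if out[v] == 0)
        new = {v: (1 - d) / n + d * dangling / n for v in nodes}
        for src, dst in edges:
            new[dst] += d * score[src] / out[src]
        score = new
    return score


def age_rescaled(score, year, window=1):
    """Z-score each paper's score against peers published within
    `window` years of it (assumed form of the age-bias suppression)."""
    rescaled = {}
    for v, s in score.items():
        peers = [score[u] for u in score if abs(year[u] - year[v]) <= window]
        mu = mean(peers)
        sd = stdev(peers) if len(peers) > 1 else 0.0
        rescaled[v] = (s - mu) / sd if sd > 0 else 0.0
    return rescaled


# Toy usage: paper "a" is the oldest and most cited.
nodes = ["a", "b", "c", "d"]
edges = [("b", "a"), ("c", "a"), ("d", "c")]
year = {"a": 2000, "b": 2001, "c": 2001, "d": 2002}
pr = pagerank(nodes, edges)
rr = age_rescaled(pr, year)
```

The z-score form makes scores of papers from different epochs comparable: an old paper with a high raw score is only ranked highly if it also stands out among its contemporaries.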
dc.identifier.doi: 10.1016/j.joi.2019.101005
dc.identifier.issn: 1751-1577
dc.identifier.other: merlin-id:19177
dc.identifier.scopus: 2-s2.0-85083714909
dc.identifier.uri: https://www.zora.uzh.ch/handle/20.500.14742/168020
dc.identifier.wos: 000528948000013
dc.language.iso: eng
dc.subject.ddc: 330 Economics
dc.title: Unbiased evaluation of ranking metrics reveals consistent performance in science and technology citation data
dc.type: article
dcterms.accessRights: info:eu-repo/semantics/openAccess
dcterms.bibliographicCitation.journaltitle: Journal of Informetrics
dcterms.bibliographicCitation.number: 1
dcterms.bibliographicCitation.originalpublishername: Elsevier
dcterms.bibliographicCitation.pagestart: 101005
dcterms.bibliographicCitation.volume: 14
dspace.entity.type: Publication
uzh.contributor.affiliation: University of Electronic Science and Technology of China
uzh.contributor.affiliation: University of Electronic Science and Technology of China|University of Zurich
uzh.contributor.affiliation: University of Electronic Science and Technology of China|Hangzhou Normal University
uzh.contributor.affiliation: University of Electronic Science and Technology of China|UniversitätsSpital Bern|University of Fribourg
uzh.contributor.author: Xu, Shuqi
uzh.contributor.author: Mariani, Manuel
uzh.contributor.author: Lü, Linyuan
uzh.contributor.author: Medo, Matúš
uzh.contributor.correspondence: No
uzh.contributor.correspondence: No
uzh.contributor.correspondence: No
uzh.contributor.correspondence: Yes
uzh.document.availability: postprint
uzh.eprint.datestamp: 2020-02-20 10:47:51
uzh.eprint.lastmod: 2025-07-22 01:36:39
uzh.eprint.statusChange: 2020-02-20 10:47:51
uzh.harvester.eth: Yes
uzh.harvester.nb: No
uzh.identifier.doi: 10.5167/uzh-184445
uzh.jdb.eprintsId: 42202
uzh.oastatus.unpaywall: green
uzh.oastatus.zora: Green
uzh.publication.citation: Xu, Shuqi; Mariani, Manuel; Lü, Linyuan; Medo, Matúš (2020). Unbiased evaluation of ranking metrics reveals consistent performance in science and technology citation data. Journal of Informetrics, 14(1):101005.
uzh.publication.originalwork: original
uzh.publication.publishedStatus: final
uzh.publication.scope: disciplinebased
uzh.scopus.impact: 28
uzh.scopus.subjects: Computer Science Applications
uzh.scopus.subjects: Library and Information Sciences
uzh.workflow.chairSubject: Marketing and Market Research
uzh.workflow.chairSubject: ProfReneAlgesheimer1
uzh.workflow.doaj: uzh.workflow.doaj.false
uzh.workflow.eprintid: 184445
uzh.workflow.fulltextStatus: public
uzh.workflow.revisions: 51
uzh.workflow.rightsCheck: offen
uzh.workflow.status: archive
uzh.wos.impact: 30
Files

Original bundle

Name: A_comparison_of_complex_network_metrics_in_identification_of_seminal_nodes.pdf
Size: 3.06 MB
Format: Adobe Portable Document Format