Citation Statistics and Citation Rings

Citations lend credibility to a scientific paper, and sometimes even result in a cash bonus. But while most researchers simply hope their paper will be cited by others, some authors have found more ‘creative’ ways of getting their papers noticed. A recent comment on PubPeer offers a glimpse behind the scenes.

Science builds upon science, and no research stands by itself. Researchers build on each other’s work: they read older studies, form new hypotheses, do experiments, write papers, and credit previously published articles by listing them in the References section of their own publications. In turn, those researchers hope that their own papers will inspire new research and be cited by others.

A paper that has never been cited will not look as good as a paper that has been cited by many. A paper cited by dozens or hundreds of other papers will look reliable and trustworthy, and it will probably rank high in search engines such as Google Scholar, which use a complex set of parameters to decide which papers appear at the top of your search results.

Citation statistics

It is easy to see how many times a paper has been cited and by which other papers. For example, PubMed has a “Cited by” option, highlighted here with a red box. This is one of my favorite microbiome papers, and it has been cited 331 times as of today. However, PubMed will only count papers published in journals that are indexed in PubMed.
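If you would rather script this lookup, NCBI’s free E-utilities service exposes the same “Cited by” links through its elink endpoint. Here is a minimal sketch in Python; the PMID is a placeholder, not the paper shown above:

```python
import json
import urllib.request

# Fetch the PubMed IDs that cite a given paper, using NCBI's E-utilities
# "elink" endpoint with the pubmed_pubmed_citedin link name.
PMID = "12345678"  # placeholder PMID; substitute the paper you care about

url = (
    "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/elink.fcgi"
    f"?dbfrom=pubmed&linkname=pubmed_pubmed_citedin&id={PMID}&retmode=json"
)

with urllib.request.urlopen(url) as response:
    data = json.load(response)

# Each linkset may contain a "linksetdbs" entry holding the citing PMIDs.
citing = []
for linkset in data.get("linksets", []):
    for linksetdb in linkset.get("linksetdbs", []):
        if linksetdb.get("linkname") == "pubmed_pubmed_citedin":
            citing = linksetdb.get("links", [])

print(f"PMID {PMID} is cited by {len(citing)} PubMed-indexed papers")
```

As noted above, this will only count citing papers that are themselves indexed in PubMed.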

Another very helpful tool is Dimensions. It works best if you paste a paper’s unique identifier (its DOI) into the search box and check the radio button for ‘DOI’. Here, the number of citations for the same paper is 580, presumably because Dimensions indexes a wider range of journals.
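Dimensions also offers a free Metrics API keyed on DOIs. Here is a minimal sketch; both the metrics-api.dimensions.ai endpoint and the times_cited field are my assumptions here, so check them against the current Dimensions documentation:

```python
import json
import urllib.request

# Look up a Dimensions citation count for a DOI. The endpoint and the
# "times_cited" field are assumptions based on the free Dimensions
# Metrics API; verify against the current Dimensions documentation.
DOI = "10.1000/example"  # placeholder DOI

url = f"https://metrics-api.dimensions.ai/doi/{DOI}"
with urllib.request.urlopen(url) as response:
    metrics = json.load(response)

print("Dimensions citations:", metrics.get("times_cited"))
```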

For this particular paper, its publisher BioMedCentral also lists the number of citations on the paper’s landing page, which it reports as 474.

Google Scholar is another popular site that makes it easy to see how many times a paper has been cited. Here, it shows that the paper has been cited 707 times.

Source: Google Scholar

Altmetric is a nice tool for gauging the media attention a paper has received. Its score depends on how many times news outlets, blogs, social media posts, etc. have mentioned a particular paper, but it also lists the number of citations, which it takes from Dimensions (see above).
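Altmetric also has a free public API. A minimal sketch, assuming the v1 DOI endpoint and its ‘score’ field:

```python
import json
import urllib.request

# Fetch the Altmetric attention details for a DOI via the free public
# v1 API. The "score" field is the Altmetric attention score.
DOI = "10.1000/example"  # placeholder DOI

url = f"https://api.altmetric.com/v1/doi/{DOI}"
with urllib.request.urlopen(url) as response:
    details = json.load(response)

print("Altmetric attention score:", details.get("score"))
```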

The Web of Science Science Citation Index (SCI), owned by Clarivate, is the most ‘official’ scientific indexing tool. Like Elsevier’s Scopus, another such database, it requires an institutional subscription, which I do not have.

As you can see above, the exact number of citations varies across all these tools, presumably because each count is calculated differently and each search engine looks at a different set of journals. Google Scholar usually gives the highest number of citations for a given paper, perhaps because it includes books and some magazines that other tools might not count.
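You can see this spread for yourself by querying two more free citation sources, Crossref and OpenAlex. These two are my additions for illustration and are not among the tools above, but each reports its own count for a given DOI:

```python
import json
import urllib.request

# Compare citation counts for one DOI across two free, no-key APIs:
# Crossref ("is-referenced-by-count") and OpenAlex ("cited_by_count").
DOI = "10.1000/example"  # placeholder DOI

def fetch_json(url):
    with urllib.request.urlopen(url) as response:
        return json.load(response)

crossref = fetch_json(f"https://api.crossref.org/works/{DOI}")
openalex = fetch_json(f"https://api.openalex.org/works/https://doi.org/{DOI}")

# The two numbers will usually differ, because each service indexes a
# different set of journals and counts citations differently.
print("Crossref:", crossref["message"]["is-referenced-by-count"])
print("OpenAlex:", openalex["cited_by_count"])
```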

Although it might not be straightforward to know exactly how many times a paper has been cited, these tools give some insight into the ‘scientific popularity’ of a paper amongst other researchers.

Incentives to get cited

Citations are important. The influence of researchers is often measured by the number of times their papers are cited. The most popular tool to measure an author’s citation record is the h-index: an author has an h-index of h if h of their papers have each been cited at least h times. Whether you love or hate this measure, large egos love to boast about their large h-index.
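If you want to verify such a boast, the h-index is easy to compute from a list of citation counts. A minimal sketch:

```python
def h_index(citations):
    """Return the h-index: the largest h such that at least h papers
    have at least h citations each."""
    counts = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

# Example: five papers cited 10, 8, 5, 4, and 3 times give an h-index
# of 4, because four papers have at least four citations each.
print(h_index([10, 8, 5, 4, 3]))  # 4
```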

Citations are also important for university rankings. For example, QS World University Rankings measures institutional research quality by calculating a normalized citations-per-faculty metric, which determines 20% of a university’s score. Similarly, Times Higher Education’s World University Rankings gives a weight of 30% – the most of all metrics – to the number of citations.
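To see how such a weight plays out, here is a toy calculation. The universities, the numbers, and the simple min-max normalization are all invented for illustration; the actual QS and THE methodologies are more involved:

```python
# Toy illustration of how a citations-per-faculty indicator can feed a
# weighted ranking score. The universities, numbers, and the min-max
# normalization are invented; real QS/THE methods are more involved.
universities = {
    "Univ A": {"citations_per_faculty": 120, "other_indicators": 0.70},
    "Univ B": {"citations_per_faculty": 45,  "other_indicators": 0.85},
}

values = [u["citations_per_faculty"] for u in universities.values()]
lo, hi = min(values), max(values)

for name, u in universities.items():
    normalized = (u["citations_per_faculty"] - lo) / (hi - lo)
    # 20% weight on citations, 80% on everything else (QS-style split).
    score = 0.20 * normalized + 0.80 * u["other_indicators"]
    print(f"{name}: composite score {score:.2f}")
```

Even in this toy example, a big citation count moves a university’s composite score noticeably, which is exactly why institutions care.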

In a 2013 publication archived here, UNESCO commented on university rankings, stating that “Rightly or wrongly, they are perceived as a measure of quality and so create intense competition between universities all over the world”.

Thus, institutions find citation scores very important and they might use them to evaluate their staff and faculty members.

Some countries even offer monetary rewards to academic authors, not only for getting a paper published but also for getting it cited. A 2017 arXiv preprint showed that Chinese universities give cash rewards to authors who publish in journals indexed by Web of Science, and that the reward policies at those universities increasingly counted citations as well.

Source: Wei Quan et al., arXiv (2017), https://doi.org/10.48550/arXiv.1707.01162


Gaming the metric: Citation rings

The cash reward system was set up to encourage Chinese researchers to publish and get cited, and to enhance China’s position as a top player in the international research community. And these incentives paid off: a recent study (paywalled) showed that the quality of research from China – as measured by the number of citations a paper receives – has exceeded that of the US since 2019.

But, as you can imagine, these monetary incentives also encourage questionable research practices. As reported in Nature, ‘the focus on metrics has also driven a rise in inappropriate practices, such as researchers submitting plagiarized or fraudulent papers, or inappropriately citing their own or a colleague’s work to boost citations’. In a 2020 order, China’s Ministries of Science and Education told universities to stop these cash rewards.

In the meantime, researchers had found some creative ways of getting their papers cited. One way is to form a group with other researchers and promise to cite each other’s papers. This practice is called ‘clubbing’, a ‘citation ring’, or a ‘citation cartel’. For example, Retraction Watch reported on a set of 60 papers that got retracted after a “peer review and citation ring” was discovered.
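One simple heuristic for spotting such a ring is to look for pairs of authors who repeatedly cite each other. A minimal sketch with invented authors and an invented threshold; real cartel detection relies on far more sophisticated citation-network analysis:

```python
from collections import defaultdict
from itertools import combinations

# "citations" maps a citing author to the authors they cite, with one
# entry per citation. The names and the threshold are invented.
citations = {
    "author_A": ["author_B", "author_B", "author_C"],
    "author_B": ["author_A", "author_A", "author_A"],
    "author_C": ["author_D"],
}

counts = defaultdict(int)
for citer, cited_list in citations.items():
    for cited in cited_list:
        counts[(citer, cited)] += 1

THRESHOLD = 2  # flag pairs that cite each other at least this often
for a, b in combinations(citations, 2):
    if counts[(a, b)] >= THRESHOLD and counts[(b, a)] >= THRESHOLD:
        print(f"Possible citation ring: {a} <-> {b}")
```

Of course, as the next paragraphs explain, reciprocal citations alone do not prove misconduct: related research groups legitimately cite each other all the time.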

A 2015 article found that ‘highly cited Chinese papers are more likely than are similar highly cited U.S. papers to be cited by works from China and from the same institution or author; that is, substantial citation differences between U.S. and Chinese highly cited papers exist at all three levels of internal citation’. As the authors explain, ‘The norms of interpersonal relationships (guanxi) in China may lead Chinese scholars to cite the work of their colleagues in the same institute, who they meet frequently, or leading scholars in their own country, who have an influence in proposal review and external evaluation for promotion.’

But, as Iztok Fister Jr. explains in this Retraction Watch article, ‘There is often a very thin line between groups of researchers in the same field who cite each other because their research is related and a citation cartel.’ In a 2016 article, he and his coauthors write ‘Today, citation cartels, where members of these mutually cited papers of authors with which they are known or not known, have become reality in the research domain. In the everyday harder citation race, the cartels imply an easy way to obtain scientific excellence by increasing the number of one’s own citations.’

A recent commenter on PubPeer revealed a more organized way of increasing the number of times a paper gets cited. Under the pseudonym Oriensubulitermes inanis, they talk about agencies that offer citations to authors. The way it works is that, after paying a fee, an author first has to cite an article from a list of papers provided by the agent. After that, the author can add their own paper to the list, and then gets cited in return. It’s sort of a Ponzi scheme for citations.
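To see why ‘Ponzi scheme’ is an apt description, here is a toy simulation of the agent’s list (all numbers invented). Each new paying author cites one paper already on the list and is then added to it, so the earliest entrants collect citations from everyone who joins later:

```python
import random

# Toy simulation of an agent-run citation list (all numbers invented).
# Each new paying author must cite one paper already on the list, and
# is then added to the list so later entrants can cite them in return.
random.seed(42)

the_list = ["paper_0"]          # the agent seeds the list with one paper
citations = {"paper_0": 0}

for i in range(1, 101):         # one hundred paying authors join in turn
    cited = random.choice(the_list)
    citations[cited] += 1
    new_paper = f"paper_{i}"
    the_list.append(new_paper)
    citations[new_paper] = 0

top = sorted(citations.items(), key=lambda kv: kv[1], reverse=True)[:3]
print("Most-cited papers on the list:", top)
# Early entrants tend to dominate: as in a Ponzi scheme, the first to
# join benefit most from everyone who pays in after them.
```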

O. inanis writes: ‘Not only the authors of this paper but also the authors of the reference papers are in a big and complex network. They do not need to know each other, but they can work together to generate a pile of “precious” “ESI highly cited papers”, guided and managed by the agent.’

It does not matter whether the cited paper is relevant; as long as its topic vaguely matches the author’s paper, it is OK. Most peer reviewers do not check all cited papers. Or, as PubPeer top-commenter Hoya camphorifolia found, the authors might add a final sentence to a manuscript after its acceptance that purely serves as a ‘Citation Delivery Vehicle’. In this particular case, that newly added sentence carried 10 references that were not directly related to the topic of the manuscript.

So that is another thing for us science paper sleuths to look out for: Ponzi citations.

4 thoughts on “Citation Statistics and Citation Rings”

  1. Retraction Watch reported on a set of 60 papers that got retracted after a “peer review and citation ring” was discovered.

    The emphasis in that case was on “peer review”. The 60 papers that were retracted from Journal of Vibration and Control were all by the same authors. As were additional papers retracted from Human Factors and Ergonomics in Manufacturing & Service Industries: see https://retractionwatch.com/2016/02/22/author-in-2014-peer-review-ring-loses-3-papers-for-peer-review-problems/, https://retractionwatch.com/2016/04/28/authors-in-2014-peer-review-ring-loses-4-more-papers-each-for-compromised-review/.

    It sounds as if those authors had created false identities so they could review their own papers: “evidence indicating that the peer review of this paper was compromised”. There was also an element of “inappropriate manipulation of citations”, but that might have been to boost the authors’ own citations, rather than anyone else’s. No-one else lost any papers.

