A study on Oncotarget papers

In 2016, with coauthors Arturo Casadevall and Ferric Fang, I published a study of 20,000 biomedical papers with photographic images, in which we found that about 4% contained inappropriately duplicated images.

Not surprisingly, that percentage varied by journal. Some of the 40 journals we investigated had much higher percentages of image duplicates than others.

Percentage of papers with duplicated images in 40 different journals. Based on a subset of 17,816 papers published from 2005 to 2014. Source: Bik EM et al., The Prevalence of Inappropriate Image Duplication in Biomedical Research Publications, mBio (2016), doi:10.1128/mBio.00809-16.

Three journals in our study, published by Hindawi and Spandidos, had the highest percentages of duplicated images (top of the graph above): 10.4-12.4%.

Scanning Oncotarget papers

One journal that was not included in our dataset but that appeared to have a higher percentage of duplicated images was Oncotarget, an Open Access journal. I therefore decided to do a targeted (pun intended) study on the percentage of problematic papers in Oncotarget. I also wanted to see if this percentage had been changing over the years.

Over the past months, I scanned a total of 675 Oncotarget papers from 2015-2019, an average of 135 papers (range: 83-167) per publication year. Only papers with photographic or flow cytometry images were included in this count; papers with only line graphs or tables were not scanned.

Among these 675 papers, I found a total of 96 papers with image issues (14.2%). Here is the graph of the percentage, plotted per year:
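As a rough sanity check on a proportion like this, one can compute the overall rate together with a 95% Wilson score confidence interval, a standard interval for binomial proportions. The sketch below is my own illustration and not part of the original analysis; it only uses the counts reported above (96 flagged papers out of 675 screened):

```python
import math

def wilson_interval(k, n, z=1.96):
    """95% Wilson score confidence interval for a binomial proportion k/n."""
    p = k / n
    denom = 1 + z**2 / n
    center = (p + z**2 / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    return center - half, center + half

k, n = 96, 675               # papers with image issues / papers screened
print(f"{k / n:.1%}")        # prints 14.2%
lo, hi = wilson_interval(k, n)
print(f"95% CI: {lo:.1%} - {hi:.1%}")
```

Even the lower bound of this interval (roughly 12%) sits well above the ~4% average found in the earlier 40-journal study.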

Percentage of papers with duplicated images published from 2015-2019 in Oncotarget. For each year, at least 80 papers were screened. For publication years 2015-2018, that was typically the number of research papers in two issues. For 2019, a total of 155 papers in 40 issues were scanned.

In August, as I was doing this scan, I wrote to Oncotarget to ask them what their thoughts were about this high percentage of duplicated images in their journal. The Editorial Office replied:

We share the same principles of scientific integrity, and we seek to do everything we can to ensure that scientists to not publish problematic images. (…) We are addressing the issue of problematic images in different ways. First, we have started using an image forensic service provided by one of the largest and most reputable companies in the industry. Right now, after making all the necessary technical arrangements, we are extending this service to make sure all papers pass through image forensics. Second, we investigate all reports of irregularity that we receive.

Oncotarget Editorial office response (per email, August 2019)

In a later reply, Oncotarget wrote:

We have our own database containing information about all questionable papers. The authors and some institutions (there are some institutions with higher rate of the questionable papers) from that database are “red flagged” in the submission system. New submissions are checked against the database during quality control.
Our Assistant Editors are often able to catch previously published images during quality control. Here we are, of course, depending on the skill of our Editors, but this measure definitely helps us to catch duplications.
Peer-reviewers are pretty efficient in catching questionable images. To help them, we have contracted one of the most reputable companies in the industry to provide image forensic analyses. Moreover, after completing the necessary technical and financial arrangements, we are extending this service to ensure that all submitted papers pass through image forensics and that the results are available to the reviewers and authors.

Oncotarget Editorial office response (per email; August 2019)

Indeed, from the graph above, it looks as if the percentage of problematic papers in Oncotarget has been decreasing, starting in 2018. At 9.7%, however, it is still higher than that of many other journals. I therefore encourage the journal to increase its editorial screening of images in submitted papers.

Drop in Oncotarget papers after being dropped from science paper indexes

Of note, the Oncotarget issues from late 2018 and 2019 contained significantly fewer research papers than those from previous years. Oncotarget publishes a new issue every 4-7 days. The interval was 7 days from 2015 through August 2017, but was shortened to 4 days from September 2017 onwards, possibly because the journal was gaining in popularity. The first 2 issues of 2019 contained fewer papers than those of previous years.

Here is a graph of the number of research papers per issue. Note that I only included the first 2 issues of each year, so the ones published in January, and that the numbers only include papers with photographic images.

Only the first 2 issues of each year (published in the first week(s) of January) are plotted here.

There is a sudden drop in the number of research papers in the January 2019 issues compared to the January issues of previous years. This sudden drop in papers might be linked to recent decisions of several scientific index services to no longer include Oncotarget.

In the fall of 2017, Medline stopped including new Oncotarget papers in their database (see coverage by Retraction Watch). In 2018, the journal was dropped from Clarivate Analytics’ Journal Citation Reports and the Science Citation Index Expanded (SCIE) (as per LetPub and Wikipedia). However, new Oncotarget issues remain available on NCBI’s PubMed, PMC, and Scopus.

Oncotarget’s exclusion from these scientific paper databases might have been caused by the journal’s inclusion on Beall’s list of predatory journals (which no longer exists). In a January 2018 editorial, Oncotarget’s Editor in Chief Mikhail Blagosklonny wrote:

We remain hopeful that the decision will be reversed. Despite these attacks, Oncotarget has continued to serve as a well-recognized and respected international journal, and has continued to flourish. (…)

For thousands of years libraries served science and scientists. Indexes were created to disseminate information, not to suppress it. For the first time in history, in response to rapid technological progress, librarians are suppressing science and refusing to serve science. Of course, this is not true of all librarians. The new generation of librarians resist Beall’s ideas, including Beall’s supervisor, whose recent article should be read by everyone.

We, the scientists, should change the situation and change the policies of Indexes such as Clarivate’s Web of Science and MEDLINE, and make indexes work for us. Otherwise, why should these organizations exist?

Librarians against scientists: Oncotarget’s Lesson – Mikhail V. Blagosklonny, Oncotarget 2018, doi: 10.18632/oncotarget.24272

Oncotarget’s exclusion from these scientific indexes might have made the journal less attractive for researchers to publish in, and might have caused the number of submissions to drop significantly in 2018. With its high, although decreasing, percentage of duplicated images, the journal clearly still has a long way to go to regain the trust of scientists and science indexing services.

8 thoughts on “A study on Oncotarget papers”

  1. Another possible interpretation is that the decreased reputation of the journal made it higher quality, perhaps because there are fewer incentives to submit fraud to it and therefore the quality of submissions has improved. This would be in keeping with the observation that “Prestigious Science Journals Struggle to Reach Even Average Reliability” https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5826185/ . Of course, correlation does not imply causation.


  2. Regarding your tweet, “Being left off a paper you worked on”: here is a good paper. The abstract:

    The Ghost Collaborator
    David Shaw, Ph.D., M.A., M.Sc., M.M.L. & Bernice Elger, M.D., Ph.D.

    Collaboration is increasingly important for researchers in all disciplines. Universities and funding bodies tend to prefer projects that involve interdisciplinarity, collaboration between different institutions, and international consortiums. Such projects can yield great benefits, but they also pose particular challenges for certain aspects of research integrity, and particularly for awarding credit and authorship. In this article, we describe and analyze the phenomenon of the ghost collaborator, who is initially fully involved and makes a full contribution to a project’s design, but then finds him- or herself excluded from meetings and publications.



  3. From Oncotarget: Thank you for a targeted study, but please clarify a few issues.
    (Before posting this commentary here, we contacted Elisabeth Bik by email.)
    Dear Elisabeth Bik,
    We agree with you that a growing number of papers with inappropriate image duplication is a very serious issue. It is common to many journals. Oncotarget is not an exception and we do address this very serious issue in different ways, including an image forensic service (you have the examples of our image forensic results).
    However, this particular study raises a number of questions, including the following three:
    1. The study compares the percentages of papers with duplicated images published between 2005 and 2014 with Oncotarget papers published between 2015 and 2019.
    Is it scientifically correct to compare data from two different time periods (2005-2014 and 2015-2019), especially considering the great likelihood that in many journals, the numbers of duplicate images might be higher during the 2015-2019 interval?

    2. Do you have the numbers for other journals for the 2015-2019 time period?
    If so, please provide them now. If you do not have the numbers, then the conclusions might be misleading.

    3. If it is a study, then a conflict of interest statement should be provided. In our summer communication with you, we asked you whether there is any potential conflict of interest, but we have not yet received an answer.
    So, can you state that there is no conflict of interest for this study?

    A prompt reply from you would be very much appreciated. You can simply answer YES or NO to each question.
    It is ironic that Elisabeth Bik posted her study under the heading “Category: Peer Review Fail” …
    By the way, MEDLINE denies that they consider Beall’s list to be a reliable source of information. (There is a presumption about that in Elisabeth Bik’s study.)


    1. Thanks for your reaction, Oncotarget. I was planning to reply on Monday to your email which you sent on Friday night, but you did not give me much time to reply. However, it is fine with me to have the discussion in public. So here we go.

      1. In our 2016 mBio paper for which we scanned 20,000 papers published in 40 journals, we found the percentage of papers with image duplications to be fairly stable at 4-5 percent between 2005 and 2013, with a hint of a decline in 2014. You can see the graph here: https://mbio.asm.org/content/mbio/7/3/e00809-16/F5.large.jpg
      Perhaps because of our study, and because of postings on social media and PubPeer of image duplications in biomedical papers, more and more journals have implemented stricter figure preparation guidelines and editorial review of images, which have led to a lower percentage of these types of figures. You can see this for example in our study on increased image scrutiny in Molecular and Cellular Biology, which led to a steady decrease in image duplications in recent years. See this figure here: https://mcb.asm.org/content/38/20/e00309-18/figures-only

      2. I have not done such a study, but I have done (see above) a study of 40 journals and 20,000 papers spanning 1995-2014. The average percentage of duplications in those journals in 2013 and 2014, as shown above, was around 4%. In Oncotarget, it was 14.5% in 2015. That is a huge difference – and only 1 year apart. It seems unlikely that the percentage of image duplication in other journals would have suddenly jumped from 4% in 2012-2014 to above 10% in 2015. But, if you like, I could do a systematic scan of Oncotarget papers from 2014.

      3. My conflicts of interest are: I have been unemployed since March 1, 2019. I do not receive any salary. I am an unpaid peer reviewer for several journals, including ISME Journal and the American Society for Microbiology journals. I have done some small paid consulting jobs in image forensics analysis for a publisher, a lawyer, and a university. This consulting work was all done after I scanned 90% of the papers published in Oncotarget, and the combined earnings from this image forensics consulting work were less than the publication fee for one research paper in Oncotarget. I have also received honoraria for talks at two events. Before March 2019, I worked in two microbiome sequencing biotech companies and my name is on a couple of patents about microbiome sequencing analysis.


  4. The drop in Oncotarget papers after indexing was stopped in Medline clearly shows that the NIH/US National Library of Medicine has the key to improving the scholarly literature. They should implement quality requirements for journals indexed in PubMed.

    Now it is quantity and not quality that apparently counts. Some quality requirements would limit the number of journals indexed in PubMed and improve reproducibility.

