
Beyond the Plagiarism Report

Posted by Jessica Gopalakrishnan on Oct 22, 2013 11:25:00 AM
What are the precise steps an editorial team takes in determining whether or not a manuscript contains plagiarism? Plagiarism presents editors with a complex challenge, one they take very seriously and handle cautiously and thoroughly. In this webcast, Charlotte Seidman, managing editor for the American Journal of Preventive Medicine (an Elsevier journal), starts inside the plagiarism report—describing best practices for identifying duplication and interpreting results—then takes us beyond the plagiarism report, illustrating how it can be used as a multi-faceted decision-making tool.

Watch the video discussion (19:14 minutes):

Beyond the Plagiarism Report


Gopalakrishnan: Welcome to today’s iThenticate webcast. I am pleased to introduce the managing editor for the American Journal of Preventive Medicine, Charlotte Seidman. Welcome, Charlotte, great to speak with you today. The American Journal of Preventive Medicine (AJPM) uses iThenticate through the CrossCheck service and has been a customer for about 2.5 years. Charlotte was gracious enough to come on today and talk a bit about what goes into the decision of whether or not a manuscript contains plagiarism. I think that this will be very interesting for our listeners. I thought we would start with why AJPM decided to start screening manuscripts for plagiarism.

Seidman: Sure. I have been part of the AJPM editorial team for 19 years, during which time our unsolicited manuscripts have increased tenfold, from 140/year to over 1400. Of the 1400 manuscripts we’ll be handling this year, we’ll publish about 17%. Deciding which papers will be published is an arduous task. Editors are under pressure to make the best decisions for their particular journal: to select and report the highest-quality science; to avoid papers that may present problems – the “standard” ethical misconduct issues of duplication and plagiarism (or self-plagiarism); and to avoid having to run corrections or retractions. (Fabrication and falsification are separate issues, which unfortunately are generally detected only after a paper has been published.) So we arm ourselves with a set of tools to assist in these decisions – our years of experience and topic experts in the form of peer reviewers. And the latest addition to our toolbox is the ability to run each manuscript through a plagiarism-detection program.

But like any tool, we have to figure out how to use it. Having the tool is great – using it to help increase the validity and reliability of your journal is even better. As an editor, the first decision you’ll need to make is when to use this tool. Some journals run all submissions through the program, which is helpful for several reasons: first, the editors can decide upfront not to send a paper for peer review, and second, they may decide to share the results with the external peer reviewers. However, it may be too expensive for every journal to run every paper.

Gopalakrishnan: So that’s where iThenticate comes in, I assume. How have things changed since incorporating iThenticate?

Seidman: Because AJPM rejects over 50% of unsolicited papers after the first read, we chose to use the iThenticate tool for all revised papers. We started using iThenticate in January 2011, and have run over 800 manuscripts through the program to date. Our journal staff has a “composition” meeting each month to decide which papers will be in a particular issue. In the past, the final decision was based on input from the peer reviewers’ comments and the various editors (usually 4) participating in that meeting. After January 2011, however, the iThenticate report became part of the decision-making process. As each revised paper is presented for consideration, the score and details of the report are made available.

Gopalakrishnan: Just for clarification: are the reports made available to the editors, the authors, or both?

Seidman: For the editorial team.

Gopalakrishnan: OK. What has your experience with iThenticate reports been like?

Seidman: Great. One word of advice for anyone using this tool: looking at the score alone does not tell the whole story. Setting a fixed threshold score above or below which you’ll accept or reject a paper is misusing this tool. For example, a score of 30% may represent plagiarism or duplicate publication, but reading the manuscript may indicate that the author was using material in quotation marks or “chunks” of text from other papers, correctly cited. The message here is that you should not accept or reject based on the score alone; always dig deeper into the report and read the original manuscript.

Gopalakrishnan: That is certainly sound advice. Once you have the report and have read the paper, and you’re still suspicious that the author truly has crossed a line – what do you do?  

Seidman: The first step is to discuss the report with your editorial team – those people who are making a decision about the appropriateness of this manuscript for your journal. Using the report, you can decide if the similar material represents duplicate publication, where the majority of the material is from the same author and has already been published; “global” plagiarism, where major pieces of text are taken from one other source, are not reworded and are not cited; “cut and paste” plagiarism, where large sections of text are taken from multiple sources; self-plagiarism, where chunks of text are taken from the same author’s previous work; or “patchwork plagiarism”.

The most common problem among manuscripts that we’re seeing today is what Steven Shafer at Anesthesia and Analgesia calls “patchwork plagiarism” – the pulling of small sections of material from a number of other publications, made more of a problem by increased access to material on the Internet. As opposed to “global plagiarism,” where the authors take from only one other source, patchwork plagiarism is more difficult to evaluate (and indeed is undetectable without programs like iThenticate). After studying the problem, Shafer recommended that these papers be rejected without review, concluding that this type of plagiarism is a good predictor of inadequate scientific merit.

Gopalakrishnan: Interesting. It sounds like this patchwork is verbatim plagiarism caused by a lot of copying and pasting…which definitely crosses ethical boundaries – but is fortunately easy for software to detect. What happens if you determine there is an issue in a manuscript?

Seidman: Once you’ve worked through the report and have identified that there is a problem, the decision will be whether to reject the paper at that point, with a letter to the authors (or their department), or to work with the authors to correct the paper. This will be determined by your overall interest in the paper and the level of the misconduct. This is how we would handle papers in each category (I’ll talk about duplicate publication and global plagiarism in a minute):

1. patchwork plagiarism throughout the paper – the paper would be rejected with a standard letter to the author.

2. cut and paste plagiarism – the authors would be given the opportunity to revise the paper based on input from the iThenticate report, including rewording sections as needed and citing the proper source.

3. self-plagiarism – this is a controversial topic, with editors divided on how to handle the problem. Some editors allow self-plagiarism in the Methods sections, but nowhere else in the paper. Some take a stricter stance, that ANY plagiarism is plagiarism. The important point here is that the editors be made aware of the potential problem and create a policy for their journal.

4. global plagiarism and duplicate publication are the major breaches of ethical publication that will be revealed by the iThenticate program.

For these more substantive problems, our editorial office uses flowcharts from the Committee on Publication Ethics (COPE), taking into consideration the individual’s situation. (For example, we had a Chinese author who was guilty of duplicate publication; we did not take the action suggested by the COPE flowcharts, which would have had us reporting his action to his department. We did, however, write him a personal letter with information about the basics of scientific integrity in reporting his research.)

Gopalakrishnan: Great, let me bring up these COPE flowcharts.

Seidman: COPE also provides sample letters for each of these situations. The importance of following a set of guidelines like those provided by COPE is that you’re protecting your editorial office as well. The ethical guidelines set out by COPE are supported by thousands of journals around the world, which gives you confidence that you’re taking the right approach. For this talk, the most important flowcharts are those that deal with the actions needed for duplicate publication and for plagiarism:

a. what to do if you suspect duplicate publication in a submitted manuscript
b. what to do if you suspect duplicate publication in a published manuscript
c. what to do if you suspect plagiarism in a submitted manuscript
d. what to do if you suspect plagiarism in a published manuscript
    
These flowcharts can be found at http://publicationethics.org/resources/flowcharts

Gopalakrishnan: Great, Charlotte! I'm glad you mentioned COPE. They have some excellent resources should you ever find yourself in a tricky situation. It looks like you (AJPM) have a very efficient editorial process to prevent plagiarism. I’m glad to hear that iThenticate is working out for you. I wish you all the best. Thanks for joining us today, Charlotte. It was a pleasure having you.

