I have always been idealistic about science and scientists. Whenever I go to a
scientific meeting, I get inspired all over again by the enthusiastic and
excellent work done by scientists, whether old professors, young professors,
grad students, or undergrads. (The best contributed-paper presentation I ever
saw was by an undergrad.) Sometimes you find BS in science, but not often. When
I took an ecology course from Joe Connell at Santa Barbara, in 1978, he had us
read papers that he claimed were really terrible but got published anyway. Our
job as students was to rip them to shreds. Now that I look back on it, I see
that the papers were flawed but at least represented a good faith effort by the
authors. I remember that one of the flaws in a paper by Martin Cody was that he
relied on a local fisherman to give him some information about shore birds. We
science majors looked down on fishermen. But, I now realize, fishermen may know
a lot about nature. The whole citizen-science movement has exploded since those
old days of scientific snobbery. Cody may have accepted the fisherman’s
testimony a little too readily, but there was nothing unscientific in getting
observations from him.
A paper that appeared in Ecological Monographs in the 1980s described the problem
of “pseudoreplication,” which is a real pitfall to avoid in research. The
author said that too many papers had been published with this kind of error. He described scientific papers as “the coin of the realm,” a coinage that needs to hold its value. He was right then and still is. In many universities, the
number of peer-reviewed publications is used as one measure of scholarly
productivity for hiring, tenure, and promotion purposes.
Since the rise of online publishing—something that was inevitable and can be really
good—a whole new kind of problem has arisen that makes pseudoreplication and other statistical errors look innocent by comparison. I refer to the large number of fake scientific papers. In the old days (I did not say the good old days), a fake publisher could be quickly identified and driven out of business, since paper and ink and graphic design and postage were costly. But today it costs hardly
anything to set up a “scientific journal” and start issuing online papers.
There must be dozens of such journals now on the web. If the author pays them, they
will publish anything. Joe Connell’s jaw would have dropped at these.
A contributor to the Ottawa Citizen whipped up a fake paper and sent it off to some of these “peer-reviewed” journals. He took half of the title and half of the material from geology, and the other half from hematology, producing the article “Acidity and aridity: Soil
inorganic carbon storage exhibits complex relationship with low-pH soils and
myeloablation followed by autologous PBSC infusion.” Gotta admit it was
creative, especially with the author’s invention of “seismic platelets.” He got
several acceptances. A few journals noticed his plagiarism but told him that with a little rewording it could be published.
What’s there to get upset about here? Anyone who hires an applicant or awards a grant to someone whose publications have titles like this has only himself or herself to
blame. These “peer-reviewed” papers apparently have no peer review, depending
on your interpretation of peer. One could interpret “peer” in such a way as to
make all of us peers. Anyone who is hiring a scientist should give credence
only to journals that are known to be reliable, even if not widely read (such
as our own Proceedings of the Oklahoma Academy of Science and Oklahoma Native
Plant Record). You cannot distinguish between real and bogus publishers by
asking whether they charge a fee to the author; nearly all journals have “page
charges.” Occasionally even the most prestigious journals publish articles that
turn out to be hoaxes (such as the work of Woo-suk Hwang) or the work of
scientists who jumped to conclusions a little too fast (such as Felisa Wolfe-Simon).
Peer review (in the opinion of any of us who have had papers rejected) is frequently
a capricious affair. But at least it is pretty good at catching fakes. I once reviewed a paper for American Biology Teacher that was plagiarized. Lazy
plagiarists are often lazy in more than one way. In this case, the author cited
articles that were not listed in the references. Fifteen seconds of work on a
search engine showed me that the paper was plagiarized. I told the editor to
contact the author’s supervisor. Now that I look back on it, I wonder whether the paper was submitted just to test the internal quality control of the journal’s review process.
So, go ahead and have a laugh at the articles published by bogus journals; and if you believe them, the joke’s on you.
On a related note, I’m thinking about starting a journal called Zeitschrift für Recherches en las Ciencias Naturalistas. I will charge only $10,000 per article. If you are interested in publishing there, let me know, and remember, I’m a peer.
Stan Rice, president