
Thursday, July 18, 2013

It's a Jungle Out There

This morning I cleaned out my inbox, deleting, without opening, three invitations (to submit manuscripts) from journals I've never heard of. Such invitations have become a weekly occurrence. Even though I've set my spam filter to block emails from repeat offenders, I continue to get new ones.

Coincidentally, I received a link to a blog post about predatory publishers by Ian Woolley writing for Soapbox Science. He goes into detail about the practices of these journals, how to identify them, and about a website that purports to be a watchdog of predatory publishers. You might find it interesting and thought-provoking.

Wednesday, August 29, 2012

Self-Plagiarism

Is self-plagiarism possible?

Let's first define what this is. Basically, it refers to the reuse of significant portions of one's own work, with identical or nearly identical wording or formatting and without citing the original work or otherwise revealing that you are repeating something previously published.

I think the term self-plagiarism is confusing and potentially inaccurate since plagiarism is generally defined as the presentation of someone else's words or ideas as your own.  Based on this definition, it seems impossible to plagiarize oneself.  However, if the definition of plagiarism includes the presentation of words or ideas as new and original, then cutting and pasting from your previously published work without attribution might be considered plagiarism.

Whatever it's called, some ethicists seem to consider reuse of previously published material to be dishonest.  However, there are some gray areas.

Duplicate publications are clearly fraudulent, but what about reusing short descriptions...of methods, for example?  Many of us use the same techniques in multiple studies, and there are only so many ways one can describe these procedures.  How different should each version be?  Earlier in my career, it never occurred to me that reusing a short description of a method was wrong.  I'm sure I've stated some variation on this sentence in many papers: "Data were transformed where necessary to meet assumptions of statistical tests."  Is that self-plagiarism?  Or is it simply more efficient to reuse a standard description?  What about several sentences or a paragraph describing a specific sequence of steps in a protocol?  Again, there are only so many ways one can reword such descriptions without producing a longer and more abstruse narrative.

What I do now is write methods descriptions from scratch, envisioning each step but without referring to previous written descriptions.  Unfortunately, I often unconsciously repeat what I've previously written...in some cases, almost verbatim.  When that happens, I deliberately rephrase the new version so that it differs from earlier publications.  Another solution is to describe the method in detail in the original paper and then cite that paper in succeeding articles, in which you provide a much briefer summary.  

Another situation is the review paper.  Some authors freely extract summaries and other sections of their published work and combine them into a review article.  Along with text, figures and tables from earlier work may be included in the review.  I know some authors who would not see anything wrong with this since it is their published work they are copying.  However, ethicists would likely consider this self-plagiarism.  If, on the other hand, the review was a true synthesis with a new analysis and novel discussion of the body of work, then brief summaries (rephrased) of previous work would be necessary to lay the groundwork for the new work.

Most people recognize that recycling of published work is unethical; however, it can also be illegal if the author has signed over the copyright to a publisher.  For most scientific articles, this would be the case. This is why you should not reuse figures or tables (in their original formatting) from one of your published articles in a review paper or book, for example, unless you've gotten written permission from the original publisher. It does not matter if you created those original figures using your own data.  When you signed that copyright transfer form, you turned over the rights to those formatted components to the publisher.  The same would be true of any segments of text taken verbatim from a published article and inserted into a new paper.  Most publishers (in my experience), however, will give permission to reuse or modify a figure, photograph, or table, especially by the original author.

Bottom line...it's probably prudent to avoid reusing text from your published articles. It's easy enough to rephrase things sufficiently to avoid charges of self-plagiarism.

Friday, August 17, 2012

Is It Plagiarism?

With the recent news about journalist Fareed Zakaria's suspension (and reinstatement) by his employers because of an alleged instance of plagiarism, I thought it worthwhile to again muse a bit about this topic.  In case you've not read about the Zakaria case, you can find more about it online.  In brief, Zakaria is employed by Time Magazine, to which he regularly contributes; he also hosts the CNN show Fareed Zakaria GPS.  He has been accused of plagiarizing an article in The New Yorker magazine written by the academic Jill Lepore. 

I don't want to rehash the news stories, so here are just the bare facts: the original text that was purportedly plagiarized and what Zakaria wrote and published:

Original written by Jill Lepore in The New Yorker magazine: 

"As Adam Winkler, a constitutional-law scholar at U.C.L.A., demonstrates in a remarkably nuanced new book, “Gunfight: The Battle Over the Right to Bear Arms in America,” firearms have been regulated in the United States from the start. Laws banning the carrying of concealed weapons were passed in Kentucky and Louisiana in 1813, and other states soon followed: Indiana (1820), Tennessee and Virginia (1838), Alabama (1839), and Ohio (1859). Similar laws were passed in Texas, Florida, and Oklahoma. As the governor of Texas explained in 1893, the 'mission of the concealed deadly weapon is murder. To check it is the duty of every self-respecting, law-abiding man.'"

Text written by Fareed Zakaria and published in his Time Magazine column:

"Adam Winkler, a professor of constitutional law at UCLA, documents the actual history in Gunfight: The Battle over the Right to Bear Arms in America. Guns were regulated in the U.S. from the earliest years of the Republic. Laws that banned the carrying of concealed weapons were passed in Kentucky and Louisiana in 1813. Other states soon followed: Indiana in 1820, Tennessee and Virginia in 1838, Alabama in 1839 and Ohio in 1859. Similar laws were passed in Texas, Florida and Oklahoma. As the governor of Texas (Texas!) explained in 1893, the "mission of the concealed deadly weapon is murder. To check it is the duty of every self-respecting, law-abiding man."

Zakaria's employers, Time Magazine and CNN, apparently thought this was (or might be) plagiarism when they suspended him, and Zakaria has also indicated that he thinks it was plagiarism by apologizing for his actions:  "I made a terrible mistake. It is a serious lapse and one that is entirely my fault. I apologize unreservedly to her [Lepore], to my editors at Time, and to my readers."

Others argue that this is not a case of plagiarism.  Jay Epstein, writing for The Daily Beast, concludes that it's not plagiarism because Zakaria acknowledged Winkler as the source of the information.  Epstein clearly misses the point because it's not Winkler that Zakaria is accused of plagiarizing, but Jill Lepore.  Those are apparently her words, her phrasing, and her interpretation of Winkler's work.  It's clear that the overall structure of the paragraph, the general points being made, and the specific examples were taken without attribution from Lepore's article.  A few words have been changed here and there, but it's essentially the same paragraph. 

Although some would argue that it's not plagiarism because the paragraphs are not exactly the same, this is a common misconception.  Students and other novice writers often think that, by changing a few words and phrases here and there, they can avoid the charge of plagiarism.  However, this belief is not supported by most definitions of plagiarism: "an act or instance of using or closely imitating the language and thoughts of another author without authorization and the representation of that author's work as one's own, as by not crediting the original author."  There are a number of definitions out there, but I like this one because it specifies "imitating the language and thoughts of another author", not just the text.  It leaves no wiggle room; changing a few words does not alter the fact that you've taken someone else's ideas and thoughts and portrayed them as your own.  

However, it is possible that two authors could use similar phraseology to describe something.  Could this have been an accident...that both Zakaria and Lepore, after reading Winkler's work, would write almost the same paragraph? I doubt it, and Zakaria clearly admits that he copied Lepore.  The only question is whether Zakaria thought that by changing a few words it would not be plagiarism or whether he was well aware that he was taking another author's ideas and representing them as his own.  He also might have read Lepore's article and concluded that if he had studied Winkler's work (as Lepore apparently did), he also would have come up with the same interpretation and was equally capable of writing a similarly well-worded summary....hence he felt no compunction about copying Lepore's summary.  I've encountered people who were both lazy and arrogant and would justify taking someone else's wording without attribution because they believe that they would have written something similar, had they the time to read the original work.  I'm not saying that's the case with Zakaria; I'm just pointing out that this is one way some plagiarizers justify their actions (based on my past encounters with such people). 

We will likely never know what Zakaria was thinking when he wrote his article, but there is little doubt that this paragraph was plagiarized from another author. It was a stupid thing to do, particularly for someone so much in the public eye.  The question is whether it was part of a pattern of unethical behavior or just an isolated lapse in judgment.  Ultimately, CNN and Time Magazine weighed this one instance against all the other work Zakaria has produced and decided that it was an "unintentional error".  Unfortunately, that's not the end of the story.  Others have begun coming out of the woodwork, making additional accusations (which were later retracted).  It will be instructive to see how this ultimately affects Zakaria's career.

Those of us in science will likely encounter a case of plagiarism at some point in our careers...a student's, perhaps, or possibly an accusation made against us.  Even someone who is basically ethical and honest can inadvertently commit plagiarism through ignorance or sloppy note taking.  And we are all vulnerable to false accusations.  I posted previously about plagiarism but did not explore the subject deeply.  I started out in this post with an example that clearly meets the definition of plagiarism...as set forth by a number of authorities on the subject...but one that not everyone understands or believes to be plagiarism.  In upcoming posts, I will examine plagiarism a bit more and particularly consider some types of plagiarism where things are not so clear cut.  

Wednesday, August 1, 2012

Reciprocity

In previous posts (starting here), I talked about con artists and social manipulators.  I'd like to elaborate on one of the types of con artists listed in Gavin de Becker's book, The Gift of Fear.  The Loan Shark is someone who does you a favor so that when s/he asks you for a return favor, you find it difficult, if not impossible, to refuse. 

Here's a hypothetical situation.  You are invited to participate in an event, and your expenses are paid by your host.  You had never met your host before, but during the event you get to know them a bit and are not favorably impressed.  At the end of the event, your host approaches you and asks you to write a letter of support for a prestigious fellowship.   

Here's another hypothetical situation:  A colleague nominates you for a prestigious fellowship or award, which you receive.  A short time later, this person asks you to recommend them for the same award.  What do you do if you think they don't deserve it?  If you think they deserve it but resent being manipulated? 

Do you:

1. Agree and write a glowing recommendation because you owe them.

2. Agree and write a lukewarm recommendation because you suspect they will retaliate if you decline.

3. Decline and make up some lame excuse.

4. Decline and tell them exactly why.

If you find yourself in such circumstances, it's important first to recognize that you've been manipulated and that the people you are dealing with are not ethical.  In both situations, you've been deliberately put into their debt so that you cannot easily refuse their request; you know this because they've asked for a favor in return.  Even if you cannot satisfactorily extricate yourself from this situation, you will be forewarned regarding any future interactions (and avoid them like the plague). 

Reading the above options, some people will say that #4 is the only choice.  However, if you've ever been in such a situation and actually face-to-face with someone, you know it's difficult to actually decline such a request in person.  This is what the manipulator is counting on.  You've been backed into a corner and to fail to comply makes you the bad guy.  Some people, however, will feel that they do owe a debt and should reciprocate.  In some cultures, such reciprocity is not only common but expected.  Others may feel they have to do whatever it takes to protect themselves and pick #2.  There is no easy choice, which is why it's a dilemma.

It's also instructive to recognize that you stepped into this trap by accepting their support/nomination in the first place.  How do you distinguish between people who genuinely wish to help you and those who are looking for a way to help themselves? There may be few or no clues, especially if you have not had any extensive interactions with them previously.  One clue is how ambitious the other person is, especially in relation to their qualifications.  The greater the discrepancy, the greater the likelihood that they will use tactics such as loan-sharking to get their way.  It's wise to be cautious in accepting favors, especially if you do not know the other person well.

In the case of the manipulative host, you are actually not in their debt if you performed whatever the invitation entailed.  You've already reciprocated by showing up and doing whatever you were invited to do, whether it was to serve on an advisory panel, give an invited lecture, or collaborate on a project.  There should be no additional obligation to the person who invited you.  Consequently, you may accept or decline the request based on other criteria, such as whether you know them and their work well enough to make an informed recommendation.  If you decline, though, there is the likelihood that they will react badly.

In the other case, it's likely that the person who has to ask to be nominated is not qualified. Otherwise, you or someone else would think to nominate them without any prompting.  If they are not qualified, then you will have a very difficult time writing an honest letter of support.  If they do happen to be qualified for the award, do you then take into account the fact that they put you in their debt or do you just forget about it and write something based only on their qualifications? Again, there is no easy answer.

Such ethical dilemmas are not uncommon in science, and you may find yourself at some point in your career struggling to deal with similar situations. 

Sunday, July 29, 2012

Solar Redux

Some time ago I read the novel Solar, by Ian McEwan, which chronicles the shenanigans of its anti-hero, Michael Beard, a fictional Nobel Prize laureate and all-around despicable character (although I was strangely sympathetic to his predicaments).  I wrote a comment on another blog about this novel.  If you read the post and the related comments, you'll see that there was a lot of arguing by people who had not read the novel. The discussion devolved into a back-and-forth about the novel's description of an encounter between Beard and a feminist, and about his ultimate downfall after he commits a blunder very reminiscent of the Larry Summers incident.

Anyway, I just came across an interview in which McEwan talks to Matt Ridley of The Guardian about the inspiration for the character of Beard.  I've always suspected that this character was patterned after a real scientist and Nobel Prize winner and pondered who it might be.  Apparently, I was right, although it seems that it was not a single person but a group of scientists who inspired this character.  In the interview, McEwan reveals some details about his encounter with the real people who inspired him, but he was understandably not as forthcoming with specific names.

At the beginning of the interview, Ridley remarks that the novel's lead character, Beard, is corrupt (among other indiscretions, he steals an idea from someone else) and predictably asks McEwan whether he thinks science is corrupt.  McEwan's response is no, he does not believe this; he then proceeds to describe a climate change conference he was invited to, one that had assembled a group of 35 Nobel Laureates to speak. 

McEwan characterizes these scientists as "big beasts of the scientific jungle", "all men", "super alpha males", "men of a certain age" (meaning that they were past their prime).  McEwan was invited to give a speech as a sort of "after-dinner mint", as he wittily describes his experience.  He colorfully likens the scene to a watering hole in Botswana...here comes an elephant or rhinoceros or some other majestic animal...the huge egos...men who had control of big institutions and budgets but who had not done any significant research since their twenties. He was then struck with the idea of writing about a character who was like this...a scientist who was "living in his own shadow".  A great idea, and one I think works well in this novel.

McEwan goes on to say that he could not help but think about these men (the non-fictional ones) and the later climate change summit he attended in Copenhagen. He characterized this as an international gathering of supposedly rational minds in collision with what are clearly enormous egos.  He attributes the failures of this summit, at least in part, to the ego-driven behavior of such alpha male types.  While at this summit, McEwan received the proofs of his novel and decided to add a scene in which Michael Beard is invited to speak at Copenhagen because the character would love to speak in front of such a gathering of international power players.

The author suggests that climate change poses a unique challenge for which humans must bring all their intelligence and creativity to bear, but fail because of other aspects of their nature.  This dichotomy is encapsulated in the novel. I think this is an important point to understand about the novel and the need to create a character like Beard, as opposed to a hero(ine) of science.  The latter would not only have made for a very boring novel, it would not have allowed the exploration of how the egos of these alpha males roaming the jungle of science are impacting society and society's ability to deal with major problems.

Image Credit: NASA

Monday, May 14, 2012

The PI's Nightmare

In the last post, I described a nightmare situation for any PI.  A trusted postdoc has turned out to be a fraud and has skipped town after fabricating data for a major project of yours. The postdoc also turns out to have lied about his Ph.D. degree--he never received it.

At this point, you may be thinking about all the things the PI could have done to avoid landing in this situation, and I could go through these points one by one.  However, at this stage, none of that is going to help.  One could also argue that taking all possible precautions does not guarantee that such a situation will never happen, especially if you are dealing with a real con artist. 

The term, "con", comes from confidence, which is what a con artist gains from his victims and allows him to succeed.  Such people are highly skilled at social manipulation and understand how to gain people's confidence.  Fortunately, there are not that many con artists who end up in the field of science.  However, scientists may on occasion encounter such people, either people within the field of science or people peripheral to science. 

For the latter situation, there is a very interesting film called House of Games, written and directed by David Mamet.  You may be more familiar with Mamet's other films, such as The Spanish Prisoner.  Both films are about confidence men (and women).  House of Games, however, features a female psychologist, Dr. Margaret Ford (Lindsay Crouse), who becomes the target of a group of grifters led by Mike (Joe Mantegna). 

Margaret has published a successful book, which has led to professional recognition, modest fame, and a small fortune.  It's this book and consequent fortune that have attracted the attention of this gang of con artists.  Their leader, Mike, designs an elaborate con to ensnare Margaret and eventually relieve her of her money.  They play on her professional interests and desire to find another subject to research and write about...one that may lead to another successful book. 

The set-up is initiated by a young, small-time gambler who goes to Margaret for treatment.  He tells her that he owes a large sum of money and will be killed unless he pays (which he can't).  Margaret decides to intervene and visits the man the gambler owes, who turns out to be Mike, the con man.  Mike agrees to forgive the young gambler's debt if Margaret will accompany him to a poker game as his girlfriend and help him spot the "tells" of the other card players.  Margaret immediately sees an entry into an underworld that she can exploit for her own professional goals.

Margaret falls headlong into the trap set by this gang of grifters and eventually loses a large sum of money.  The con is so clever that Margaret does not realize she's been conned...at first.  By chance, she discovers the ruse and proceeds to turn the tables on Mike.  I won't reveal all that transpires, except to say that the film is intriguing on many levels and carefully crafted to keep you guessing until the end. It's not as well done as Mamet's later films, and one could quibble about the acting, especially Lindsay Crouse's (see previous posts about female stereotypes in film). 

My point in describing this little film is that it is a cautionary tale about a successful female professional who is taken in by unscrupulous people, even though she knows she's dealing with people who are basically criminals.  Despite that knowledge, she's still fooled by Mike, who cleverly gains her confidence.  And that's all it takes.

How likely is it that a science professional might be victimized by a con artist?  Although the Mamet film is fictional, the postdoc story I described in the previous post is not.  The situation I described happened to my adviser (I was one of the graduate students working on the project).  I also was fooled once by a student worker who was falsifying his time sheets and forging signatures on them.  Fortunately for me, the student worker's actions did not threaten my entire research program or my job.  These experiences tell me that such situations are not that unusual in science, although perhaps not as prevalent as in other professions. 

Credit: Still image from House of Games, Filmhaus

Saturday, May 12, 2012

Catch Me If You Can

Imagine the following situation:

You are a PI in charge of a large project to conduct an environmental assessment of the potential impact of a large Energy Utility on an adjacent natural ecosystem.  With the sizable grant awarded to you, you hire a postdoctoral researcher, technicians, and graduate students to conduct the work.  It is the responsibility of the postdoc, in addition to conducting the field research in his specific area of expertise, to collate all the data from all tasks and to write the final report to the Energy Utility.  

Although this postdoc is fresh out of graduate school, he is an experienced teacher and administrator who returned to graduate school to get a Ph.D. later in life.  You feel very confident in his abilities, especially because of his maturity and past experience, which is a primary reason you hired him for this job.  He is gregarious and well-liked by all on the research team and appears to be a diligent worker.  Things seem to go well during the data acquisition phase; everyone is working hard, spending many hours in the field and laboratory.  As the project nears its final months, the postdoc announces that he has accepted an offer to return to his previous position (in another country) but, not to worry, he's almost finished writing the report.  You are not worried: you have high confidence in him, you've been working closely with the technicians and graduate students responsible for particular aspects of the report, and you have been shown portions of the work supervised by the postdoc.  As often happens in large projects, there are delays.  When it becomes clear that the report cannot be finalized by the date the postdoc must leave, you obtain a no-cost extension for the project, and the postdoc agrees to finish up the writing remotely.  

A day or two after the postdoc's departure, you receive a message from him saying that his briefcase containing all the data files and the only copy of the draft report was stolen in the airport.  He's very sorry about this, but he won't be able to meet his obligations.  You are stunned and try to contact him.  All your efforts to communicate fail.  You try not to panic, thinking that at worst, you will just reconstruct everything from the copies of the files he left.  You meet with the technicians and students and explain the situation.  

Your head technician begins delving into the files and discovers some puzzling things.  After several days, he comes to your office and tells you that he thinks the postdoc fabricated much of the data he was supposed to have collected personally.  The only data the technician is confident of are those that he, the other technicians, and the students collected.  However, cross-checking against the original datasheets from field notebooks and lab books indicates that the postdoc cooked some of the summary files (altered the data), so everything will have to be reconstructed from the original datasheets.  And the data the postdoc collected will have to be discarded and either recollected or eliminated from the report. 

You begin to investigate further and discover that the postdoc never actually received a Ph.D.--his graduate committee failed him when it became clear he had fabricated some of his dissertation research.  The letters of reference sent to you by the adviser and other faculty were all forged.  Because the letters came from well-respected professors at a credible university and department, you never bothered to check with the postdoc's graduate school to confirm that he actually received his degree from that university....


In other words, you've been the victim of an elaborate con.

If you were the PI in this situation, what would you do?  If you were one of the students or technicians and saw something that made you suspicious, would you report it to the PI?  If you (PI or team member) later encountered this person at a conference or other venue, what would you do?

Monday, January 24, 2011

The Audit Society


This post is about a topic that affects us all, scientists and non-scientists alike.  Those of you who work in government agencies are already immersed in the "audit society", which has expanded in recent years.  The amount of time devoted to accountability now nearly exceeds the amount we actually spend doing our jobs as scientists.  Moreover, the "accountability regime" that is prevalent in government agencies is now becoming common in institutions of higher learning.

In her book, Wannabe University, Gaye Tuchman describes the "audit society" as one that "enables 'coercive accountability' carried out in the guise of transparency, trust, and public service....It entails both forced and voluntary surveillance, as individuals and organizations audit themselves and subject themselves to audit by others." She uses a pseudonymous university to illustrate her main thesis: that institutions of higher learning have been transformed by being run like businesses, primarily to achieve the goal of being ranked among the top universities in the country (Wannabe University Syndrome).  She spent years observing a large state university, which is never named. Wannabe universities are run by administrators who model themselves after CEOs and hop from job to job in their quest for an ever more prestigious position/corporation to run. 

Seeking top ranking (and profit), university administrators market a product (a university education) and in the process undermine university faculties by instituting changes from the top down.  At first, the changes are subtle and don't cause too much direct trouble for the faculty (who may be unaware of what's happening).  Eventually, there is increasing emphasis on winning more and more grants and contracts, developing patents, and marketing (athletics, etc.).  Later, there is interference in the classroom (see previous posts: administrative dominance, violation of academic freedom, domesticated foxes and feral dogs).  If you are contributing to these corporate goals, you are rewarded; if not, look out.  You know you are working at a "wannabe university" when, for example, the university president has a stronger background in business than in academic achievement, describes the university as on the "cusp of greatness" (even if it is currently ranked at #100 or lower), spends millions on new construction and on hiring "rock star" faculty, develops a slick advertising campaign, and talks about "flagship universities" as engines for supporting state economies.

Of course, university professors are difficult to control.  They have tenure (so far...that will be next to go) and tend to speak their minds.  The new corporate administrators of wannabe universities handle such a vocal workforce by implementing an "accountability regime", which leads to policies of surveillance and control (presented as ways to measure success and to improve the university's ranking). Tuchman argues that the corporatization of higher education negatively impacts students, faculty, and society as a whole.

Those of us working for government agencies are already very familiar with the "accountability regime".  However, the accountability aspect has recently become so intrusive that it is interfering with our ability to do science. I estimate that I now spend two workdays out of five mostly on non-science administrative tasks: filling out forms and justifications for travel (to do fieldwork or attend conferences) and expense vouchers; taking required training courses (diversity, leadership, supervisory, records management, security, whistleblowing, EEO, etc.); writing and getting official approval for "study plans" to do research; getting internal reviews and official approval of all science products (including manuscripts to be submitted to journals and abstracts for meetings); doing official performance reviews of staff; and many other miscellaneous tasks, including filling out daily "time and attendance" forms, which document the hours we work.  Virtually everything I do requires at least one or two signatures of superiors.

I don't know if this is just a US phenomenon or not.  It would be interesting to hear from professors and scientists working in other countries.

Tuesday, October 12, 2010

But I Didn't Cut and Paste Text!

A commenter recently questioned my definition of plagiarism as well as my recommendation to avoid using students or other trainees in conducting manuscript reviews.  These are important points that warrant further discussion, so I'll spend a bit of time in this post expounding on my views.  Note that these are my views, based on my experience, my discussions with colleagues, and my reading about the issues.  Others certainly have the right to their own opinions (especially what is acceptable within their specific fields), and I'm not trying to say that my view is the only acceptable view.

Plagiarism:

The American Association of University Professors defines plagiarism as "taking over the ideas, methods, or written words of another, without acknowledgment and with the intention that they be taken as the work of the deceiver." The Office of Research Integrity (ORI) also defines plagiarism as involving the taking of words, ideas, etc. from an author and presenting them as one’s own.  The Office of Science and Technology Policy (1999) defines plagiarism as "... the appropriation of another person’s ideas, processes, results, or words without giving appropriate credit, including those obtained through confidential review of others’ research proposals and manuscripts."

I would define plagiarism similarly, even without having read these organizations' definitions. However, I recognize that some may wish to limit the use of the term "plagiarism" to the appropriation of text only.  The problem with this (aside from not being an accepted definition) is that students may get the idea that the taking of other things (ideas, methods, hypotheses) without attribution is OK because it's not "technically" plagiarism.  I've encountered students who expressed this belief to me.

The ORI describes plagiarism of ideas as "appropriating an idea (e.g., an explanation, a theory, a conclusion, a hypothesis, a metaphor) in whole or in part, or with superficial modifications without giving credit to its originator." The ORI goes on to say "In the sciences, as in most other scholarly endeavors, ethical writing demands that ideas, data, and conclusions that are borrowed from others and used as the foundation of one’s own contributions to the literature, must be properly acknowledged. The specific manner in which we make such acknowledgment varies from discipline to discipline. However, source attribution typically takes the form of either a footnote or a reference citation."

The ORI offers an interesting example of a situation in which an ethical author cited an unusual source of inspiration for his theory on light perception:

"Even in such cases, we still have a moral obligation to credit the source of our ideas. A good illustrative example of the latter point was reported by Alan Gilchrist in a 1979 Scientific American article on color perception. In a section of the article, which describes the perception of rooms uniformly painted in one color, Gilchrist states: 'We now have a promising lead to how the visual system determines the shade of gray in these rooms, although we do not yet have a complete explanation. (John Robinson helped me develop this lead.)' (p.122; Gilchrist, 1979). A reader of the scientific literature might assume that Mr. Robinson is another scientist working in the field of visual perception, or perhaps an academic colleague or an advanced graduate student of Gilchrist’s. The fact is that John Robinson was a local plumber and an acquaintance of Gilchrist in the town where the author spent his summers. During a casual discussion, Robinson’s insights into the problem that Gilchrist had been working on were sufficiently important to the development of his theory of lightness perception that Gilchrist felt ethically obligated to credit Robinson’s contribution."

Some scientists would scoff at Gilchrist's acknowledgment of a plumber and argue that this was unnecessary.  I think his action shows integrity and, moreover, a deep understanding of the concept of plagiarism.  Gilchrist clearly recognizes that his reported insights on light perception would not have occurred (or would have been quite different) had he not had the input of the plumber--and was obligated to acknowledge that source of inspiration.

If students are taught that plagiarism is only the cutting and pasting of text, they may think that appropriation of ideas, hypotheses, methods, etc. is not unethical (or at least not labeled as plagiarism and therefore not subject to sanction).  This would be a serious mistake with potentially severe consequences.

Even if one is aware of this aspect of plagiarism, it is very easy to inadvertently appropriate someone else's idea or concept (sometimes called "unconscious plagiarism").  Our minds can play tricks on us, and an idea that we think is original may in fact be something we read and later remembered as our own (as my memory deteriorates with age, I'm more concerned with this point now than when I was younger).  Most of us, however, are careful to cite the originator of major theories, hypotheses, or concepts in our papers.  But sometimes, the way the text is worded, the impression may be given that another's idea is our own.  Another error is when an author, working from notes, inadvertently uses the exact wording of another author, thinking that the notes were his/her own words summarizing the other work (always place word-for-word notes in quotes so that you do not make this error).

One exception to the plagiarism of ideas is "common knowledge".  It is appropriate to make statements based on widely recognized phenomena without attribution, e.g., "plants capture CO2 through the process of photosynthesis".  A rule of thumb offered by the ORI is that if the idea or concept is widely known among high school and college students, then it is common knowledge.  What about ideas that are not common knowledge among students but are widely recognized by experts in the field?  Here's where things can get tricky, and the decision requires some experience and understanding of what's common knowledge and what requires citation (students often need guidance here).  If the work is to be published in a technical journal, and the target audience is the expert, then statements based on a large body of work might not need a citation.  For example, one might have an opening statement such as "The sensitivity of higher plants to elevated concentrations of CO2 depends on the specific photosynthetic pathway of each species.....we compared the responses of C3 versus C4 species." That is perhaps not common knowledge for the average student, but it certainly is for people working on photosynthesis.  However, if you stated that 82% of C3 species respond to elevated CO2 with increased rates of photosynthesis, then that would require one or more citations.

Another possible exception is the semi-technical article or book chapter requiring a less "formal" style of writing.  Editors may ask for text that is unbroken by numerous citations (as one would expect in a technical paper) so that the writing appeals to the non-expert reader.  In these cases, the article or chapter would be accompanied by a list of "additional reading", which was used in the preparation of the piece and contains the cited material.

I plan to write more about plagiarism in future posts--it's a complex topic, many aspects of which authors are not always fully aware (including me).  Even the most experienced can unknowingly commit errors or may be uncertain how to proceed in specific situations.  I'm certainly no expert on plagiarism, but hope to explore the topic by writing about it and, in the process, refine my understanding of its various forms.

The views of readers of this blog definitely help shape such explorations.

Students and Trainees as Manuscript Reviewers

The concern here is about using students or other trainees to perform manuscript or proposal reviews for their mentors (who were asked to do the review).  As an editor, I would question the capability of a trainee (especially someone who has never authored anything) to provide an expert review--which is what the journal expects (or should be seeking in soliciting a review). As an author, I would be concerned that my work was reviewed by an inexperienced trainee, even under the mentorship of a senior person.  I'm expecting a fair evaluation carried out by a peer who is well-versed in the topic of my work and who has published (i.e., is an expert and therefore qualified to assess the quality of my work and whether it contributes significantly to the field).

No matter how good or conscientious a trainee, they are not equal to an expert.  If they need "close supervision" by a senior person, one might argue that this confirms they are unqualified to be conducting an official peer review.  How would the journal or funding agency defend such a review, if challenged? They would have no way of determining whether the PI closely supervised the trainee or instead simply forwarded the trainee's review without looking at it.  I know the latter happens because I was often asked by a previous lab director to do his reviews for him (when I was a graduate student).  Back then, I did not know any better and never questioned this practice.  One might argue that I probably did a better and more thorough job than the director would have, but what if I had not?  He did not even read the manuscripts or proposals, so he did not know if my reviews were fair or accurate. 

The point is not whether a trainee can provide a passable review (some certainly can) or that they are supervised by a mentor.  The concern is the author's expectation that their manuscript or proposal has been assessed by an expert and that the scores and ultimate acceptance/rejection are based on the evaluations of those qualified to make that judgment.  The trainee (especially a student) may not meet that expectation.  A post-doc who has published at least one first-authored paper or prepared one proposal may be qualified to conduct reviews of manuscripts/proposals.  However, if the mentor is the one asked to do the review, s/he should inform the journal that the review is to be carried out by a trainee and how much supervision will be involved.

If a trainee (e.g., a post-doc) has the necessary credentials to be considered a "peer" and is capable of performing a review (based on the mentor's judgment), then it would be safe to recommend that trainee as a reviewer.  If the journal or funding agency has a mechanism to allow the use of "assistant reviewers", then at least the review can be assessed with that knowledge.  More importantly, the identity of all contributors to the review are formally documented and known to the journal or funding agency (in the event of a challenge).  Journals in my field, however, have no such mechanism (that I'm aware of).  In that case, it seems most appropriate for the mentor to suggest their post-doc as a substitute and let the journal editor extend the invitation--which provides a means to formally document the person who actually carries out the review.

Personally, I would not want to become embroiled in an investigation in which an author claims that a review was unfair (and it's discovered that my review was mostly written by a trainee, a substitution that was not formally documented by the journal).  So, my advice would be to proceed with caution if you have trainees doing...I mean, helping with...your reviews.

I don't think that the need to train students is a valid reason for using trainees to conduct reviews.  Students can be trained to review manuscripts without involving them in the actual review process. A mentor can use published papers ranging from excellent to poor (there are plenty in the literature to choose from) and use them to train students to conduct reviews.  Another possible method is to use unpublished manuscripts that the mentor reviewed in the past, have the trainee conduct a mock review, and then compare the trainee review with the actual review submitted by the mentor (caution would need to be exercised in ensuring that the trainee not know the identity of the author or make use of any information contained in the manuscript).   

Training may be the motivation of some PIs in using assistant reviewers, but such training may be accomplished in other ways.  In my experience, the reason that some (many?) PIs use assistant reviewers is simply to relieve themselves of the task (and justify it as training).  As I said before, if you don't have time to do the review, you can decline the request.

What distinguishes this situation is that there are two competing obligations. A mentor definitely has a moral obligation to help their trainees, but there is also the obligation to ensure an "expert" review.  If the journal welcomes "assistant reviewers" and has a mechanism for documenting their involvement, and the trainee is capable, then the PI may be safe in using them.  A side benefit may be experience for the trainee, but that should not be the primary justification.


Image Credits (created with images from Flickr, iStockphoto, and http://www.rmu.edu/SentryHTML/images/gallery/students/group2/student_professor3.jpg)

Sunday, October 10, 2010

Don't Tell Anyone I Gave This To You

In a previous post, I described an ethical dilemma, one that probably occurs fairly frequently. I've encountered variations of it during my career.  A full description of this dilemma along with an expert opinion can be found here.  The following is my modified version that is similar to one of my experiences:

Cynthia is an ambitious post-doc having a problem with one of her laboratory techniques--extracting an enzyme from plant tissue containing lots of phenolic compounds (which bind proteins).  She's been trying for weeks to resolve this, but has been unsuccessful.  She is at her wit's end and finally goes to her PI to ask for help.  After listening to her tale of woe, he tells Cynthia not to worry--that he'll have a solution for her tomorrow.  The next day, she finds a manuscript on her desk with a note from the PI.  It says, "Check out the methods section...it has the solution to your problem.  However, don't make a copy of this or give it to anyone else.  Also, don't tell anyone that I gave this to you." She tries the method described in the paper and lo and behold, it works! When she excitedly reports this to her PI, she says, "I've scoured the literature and can find no mention of this technique.  Where did you get this paper?" The PI smiles mysteriously and says, "I've got dozens of them in my files."

I asked if the PI's actions were unethical.  The answer is, it depends.  Here is a summary of the expert's opinion (see the link above for the full version):

The moral dilemma hinges on the source of the paper.  Was it one of the PI's old, unpublished papers? One of his student's unpublished papers? Or was it a manuscript he got for review?

1. If the paper was written by the PI (and based on work done in his lab), then the PI is free to give the information and data to the post-doc to use.

2.  What may be less clear is why the former student's paper might also be given by the PI to the post-doc.  In this example, the student never published the paper and has left the university.  The post-doc can be given access to the contents because the PI's university likely owns the student's work as intellectual property.  According to the expert, the only way the student can claim the work is if s/he had previously gotten university permission to copyright the material.  Contrary to what many people think, the work conducted by a researcher (including their ideas) at an organization such as a university becomes the intellectual property of that organization.  In this case, the former student had no such copyright, so the PI, acting as the university's representative, has the authority to give the post-doc access to the contents of the paper.  It would be appropriate to contact the student to notify them that their unpublished information is to be used--and then cite them as the source in the acknowledgments section.  Unless the student makes a substantial contribution to the writing of the paper and a major intellectual contribution to the current work, it is not appropriate to make them an author (though the post-doc and the PI could potentially offer this option to the former student).

3. If the paper is one that the PI received for review, then giving it to the post-doc without the permission of the journal or the author is unethical.  The PI may be conflicted over his desire to help his stressed-out post-doc--and this may take precedence in his decision.  However, doing so is a breach of the confidentiality agreement that he entered into in accepting the role of reviewer.  In other words, the confidentiality agreement takes precedence over the PI's "moral" obligation to help the post-doc.  Furthermore, the PI is not exercising good judgment about how he and the post-doc will eventually use the information in their own publication and how they will acknowledge its source.  Giving the paper and the method to the post-doc may solve her immediate problem, but it creates an even bigger problem for her in the future.  Unauthorized use of intellectual property obtained through privileged communication (the review process) in another paper or proposal is plagiarism.  If they publish based on this method, they would be representing as their own ideas that were taken from someone else's work.  If caught, both the PI and the post-doc could be subject to severe sanctions.

The expert opinion goes on to point out the likelihood that the plagiarism will be uncovered eventually.  If you think about it, any future paper by this PI and post-doc containing the plagiarized material will likely be read by the author of the original paper.  The author and the PI clearly work in the same specialized field, which is why the PI got the paper for review.  Isn't it just as likely that the author will get the PI's future paper for review?  In any case, the author will eventually see it when it is published.  This outcome is especially likely in a highly specialized field in which there are few experts.

My take on this scenario is that it is probably more common than you think.  I'm aware of colleagues who pass around papers they are reviewing or discuss the contents with others.  When confronted, they may admit they shouldn't do it, but then act as if it is nothing of great consequence.  Some pass on papers to students or post-docs (e.g., as exercises).  They may remove the author's name and affiliation, but the content of the manuscript is still confidential and should not be shown to anyone else or copied.  What if your student copies something from that paper without your knowledge, and it eventually ends up in a proposal or paper--where it is later recognized by the original author?

Sometimes you hear about reviewers asking journals for permission to have their post-doc or students review a manuscript (or maybe even do so without asking).  I don't think this is a good idea either.  If you don't have time to do a review, then decline and provide a list of people to substitute for you (you can suggest your post-doc); then, the journal can decide if your recommendation is a suitable reviewer, and the review will occur without your direct involvement.

The ethical situation described above and the hypothetical actions of the PI (#3) illustrate how easy it is for someone to get into deep trouble if they fail to take the time to consider the consequences of their actions.  I can easily imagine a PI making such a decision hastily and/or without thinking--but with no real malicious intent to injure the author.  However, such a decision commits at least two ethical transgressions--passing along confidential information without permission and putting another person into a tenuous and potentially compromising situation (plagiarism would be the third, if they take the final step and publish).  The PI's moral obligation to "help" his post-doc clouds the larger ethical issues, and in the end, his action could instead seriously harm his post-doc's reputation and career.

Perhaps you have done something similar to this--we all make mistakes at some point, especially when we are inexperienced or under pressure.  Most people, though, would likely have a nagging feeling in their gut that such an action is wrong.  If you have this feeling about something you are facing or that someone else is telling you, pay attention.  Your gut is probably right.

Image Credit (modified from http://www.uwo.ca/nca/education/images/student_3.jpg)

Friday, October 8, 2010

A Difficult Decision

Did you guess which one was the real situation?  It was number 3, which I've reproduced below:

Beth is an assistant professor whose undergraduate student worker (a senior) is discovered to have been falsifying his time sheets.  Let's say that Beth's lab technician has reported this to her.  Beth contacts the head of student affairs for guidance.  She is told that the student's actions, if he is found to have committed them, are considered by the university to be a crime and will be turned over to the campus police; the student will also be expelled.  Let's say Beth is reasonably certain that at least a portion of the time he claimed has been falsified.  Reporting this student will lead to his possible arrest and prosecution and will definitely end his academic pursuits.  She is hesitant to cause this student to be arrested and expelled.  What if she just fires the student but does not report him to the authorities?  Is that ethical or unethical? What would you do?

I won't reveal what my role was (except to say that I wasn't the student!).  Here is what actually transpired and the reasoning behind the decisions.  Beth went to great effort to document the falsified time claimed by the student.  She compared all work hours against the student's course schedule, and found many instances in which he was in class (confirmed by the course instructors) when he claimed to have been working.  She also found that work hours were claimed for times when the lab was closed (for holidays, etc.) and determined that the student could not have been there (confirmed by graduate students who were working after hours).  All of this information was carefully documented.  Beth also determined that the approval signatures on some of the time sheets were forged.

Beth and the technician decided to confront the student, who, when shown the evidence, confessed to both the falsified hours and the forgeries.  Although Beth felt some reluctance about turning in the student (because of the severe consequences), she ultimately decided that she had no choice.  She turned over her documentation, including signed statements by her and the technician as to what had transpired in their confrontation with the student, to the university business affairs office and to student affairs.  The case was given to the campus police, who proceeded to arrest the student.  Beth had advised the student to pay back the funds, which he did; the prosecutor consequently decided not to pursue the case, and the criminal charges were dropped.  However, the student was expelled in his senior year.

If Beth had failed to report the student's theft (that's what it was), she could have found herself in trouble with the authorities.  Once she was informed about the theft by the technician, she had no choice but to investigate and then act on her findings.  In fact, the funding that paid the student came from a Federal grant, a situation that might have triggered an even bigger investigation if there had been any attempt at a cover-up.  Regardless of the funding source, Beth was obligated to report the misuse of funds and to try to recover them.  Beth may have felt sympathy for the student, but the student was solely responsible for his actions and, moreover, was clearly aware that what he was doing was wrong (hence the forgeries).  Furthermore, if Beth had simply fired him, he would likely have gone on to work for someone else at the university or elsewhere, possibly repeating this crime.  Reporting the student to the authorities may have been painful for Beth, but failing to do so, and thus failing to stop him from doing further harm, would have been unethical.

Image Credit (Modified from http://www.onlinedegreesaccredited.net/wp-content/uploads/2009/09/medical-lab.jpg and http://www.writeshop.com/blog/wp-content/uploads/2010/04/Teen_boy_writing.jpg)

Monday, October 4, 2010

Swimming with Sharks

When we start out in science, there are many things that will influence our careers.  We are aware of all the obvious technical skills that must be mastered in order to succeed in our particular science field, but less aware of other challenges that may be our biggest obstacles.  In the previous post, I talked about social anxiety and how this might greatly affect one's ability to function professionally.  

Another area that we don't think about much in the beginning, and that may be neglected in academic programs, is ethics (and how to deal with unethical people).

In every profession, there are people who try to get ahead by engaging in unethical behavior.  This type of behavior seems to be exacerbated in situations where competition is intense.  When resources are scarce or where there are "territorial" issues, people who cannot prevail based on their skills may resort to under-handed measures.  My sense is that there are only a small number of "true sharks" in science fields--people who have no scruples and will stop at nothing to get what they want.  They exist, but do not predominate.  I don't think scientists are more ethical than the average person, just that the field does not particularly attract people who are unethical.  At the opposite end of the spectrum are people who cannot be tempted under any circumstance to make an unethical decision.  Perhaps more common are those people who under normal circumstances would not do anything unethical, but when put under pressure will turn into sharks, i.e., they are "latent sharks".  The breaking point obviously varies from person to person and with the situation.  

Most of us tend to focus on the "true sharks" and what harm they might do to us and our careers.  However, it's possible that the "latent sharks", who are more abundant, may be more likely to harm us.  Or possibly the ethical choices we make ourselves, if wrong, can do far more damage to us than any deliberate act by someone else.  You may be thinking that you would always know the right thing to do and would not make an ethical mistake, but are you sure?

That shark threatening your well-being may not be a person, but an ethical dilemma.

When we start out in our careers, we often don't realize the difficult ethical choices that we may face as scientists.  I'm not talking here about falsifying data or other obviously fraudulent actions that normal people recognize as being wrong.  We all know these are not only unethical, but absolutely not tolerated in science.  No, I'm talking about situations that scientists (and other professionals) face, but in which some people might find it difficult (under pressure) to do the right thing.  Perhaps even more challenging are those situations in which the correct response is not always clear. 

Here are a few examples to get us thinking about these ideas:

1.  Mary is an assistant professor who receives a research proposal for review that focuses on the exact same questions she is currently pursuing.  The proposal describes a unique approach that is far superior to what she has been using in her own project and addresses a key issue that has been a stumbling block for her.  Imagine further that she has not been as successful as she needs to be and will not get tenure unless she publishes more and gets a decent-sized grant--soon.  She's already invested all her time and start-up funds in this research question, and it's too late to start over.  Solving her methodological problem would clear the way for her research to take off.  Let's further say that the proposal author already has a lot of funding (information revealed in the current and pending support section).  Mary rationalizes that she would have eventually come up with this technique and decides to use it in her own research.  She also decides that the proposal author has already had more than his share of funding and won't be hurt if he doesn't get this grant.  She gives the proposal a "good", rather than "excellent", score, knowing that this will probably sink it and perhaps buy her some time to implement the new approach.

Would you find it difficult, if you were in Mary's situation, to do the right thing?  Is taking another scientist's ideas a form of plagiarism? Was there ever any possibility that Mary could have provided an unbiased review of this proposal?  What, if any, are the possible negative repercussions of Mary's actions (for her)?

2.  Here's a variation on the first example.  Cynthia is an ambitious post-doc having a problem with one of her laboratory techniques.  She's been trying for weeks to resolve it, but has been unsuccessful.  She is at her wit's end and finally goes to her PI to ask for help.  After listening to her tale of woe, he tells Cynthia not to worry--that he'll have a solution for her tomorrow.  The next day, she finds a manuscript on her desk with a note from the PI.  It says, "Check out the methods section...it has the solution to your problem.  However, don't make a copy of this or give it to anyone else.  Also, don't tell anyone that I gave this to you."  She tries the method described in the paper and, lo and behold, it works!  When she excitedly reports this to her PI, she asks, "I've scoured the literature and can find no mention of this technique.  Where did you get this paper?"  The PI smiles mysteriously and says, "I've got dozens of them in my files."

Cynthia's PI was trying to help her by giving her this method, but is what he did unethical?  What if he won't tell her the source (author) of the paper?  If she uses this method and describes it in her paper without acknowledging the source, is this unethical?  What might happen to the two of them if others find out?

3.  Here's a very different dilemma.  Beth is an assistant professor and has an undergraduate student worker who is discovered to have been falsifying his time sheets.  Let's say that Beth's lab technician has reported this to her.  Beth contacts the head of student affairs for guidance.  She is told that the student's actions, if he is found guilty, are considered by the university to be a crime and will be turned over to the campus police; the student will also be expelled.  Let's say Beth is reasonably certain that at least a portion of the time he claimed has been falsified.  Reporting this student will lead to his possible arrest and prosecution and will certainly end his academic career.  She is hesitant to cause this student to be arrested and expelled.  What if she just fires the student but does not report him to the authorities?  Is that ethical or unethical?  What would you do?

One of the above examples is real; the others are fiction.  In the next post, I'll describe the outcome of the real example and try to explain what might happen in the fictional scenarios.

Image Credit (modified from http://scrapetv.com/News/News%20Pages/usa/images-4/great-white-shark-2.jpg)

Sunday, May 9, 2010

After Hours Dilemma

The Problem:

What should a PI do if a new technician asks to be excused from working after hours? In this case, the technician has expressed a reluctance to work in an empty building. If the PI had made the expectation for after-hours work clear from the beginning, then the obligation to make allowances in this technician’s case would be lessened. We’ll assume in this instance, however, that there was no explicit discussion during hiring. It may be that the problem did not occur to the technician until after she spent a few evenings alone in the lab. It’s also possible that she is simply looking for a way to avoid work.

Either way, the request by this new technician requires a careful response.  The PI cannot excuse the technician without giving the rest of the staff the same option.  Even if most of the staff would continue to come in anyway, the fact that one person does not will ultimately lead to general disgruntlement (someone will have to take over this technician’s after-hours duties).  Also, if the research depends on after-hours work, the PI cannot afford to grant the request.  So it would be a mistake to give this new technician or anyone else permission to opt out of these duties. If the research can be accomplished without after-hours work, then another option might be investigated. But we’ll assume that’s not the case here.

The Solution:

The PI should begin by meeting with the technician and explaining that work after hours is necessary for the success of the lab’s research program. She should acknowledge the technician's concerns about safety, but explain that she would like to come up with an alternative solution.

The PI should then ask the technician specifically what she is concerned about. The technician has stated only that she does not want to work alone in an empty building. Is she concerned about having an accident with no one there to help, about being the victim of an attack in an empty building or parking lot, or about something else entirely? Once the PI has a better idea of the specific concern, she can ask the technician what would alleviate it (other than not working after hours). She should ask whether the presence of other people would ease her anxiety or whether better building security would help. The PI should also ask what the technician might do to contribute to the solution (e.g., make use of a campus transport system or have a spouse or friend drop her off and pick her up on the days she must work late).

Once some other options are identified, then the PI and technician can work out the best solution together. It may be that some simple security changes would be sufficient to allay the technician’s fears.

If all else fails, the PI may have to implement a “buddy system” requirement for work after hours—for all employees and students. I realize this option sounds inconvenient and possibly unfeasible in some situations. But it may be the only solution in this particular case. Furthermore, this might be something to consider for the lab anyway--aside from the situation with this technician. Accidents can occur anywhere and any time, and less experienced staff or students may do some really dumb/dangerous things in the lab, particularly when supervisors are not around.

So, the answer to the question I posed initially is not completely straightforward. The PI is not legally obligated to excuse the technician from working after hours, since this was never promised during the interview. On the other hand, the PI failed to ensure that the technician understood the after-hours requirement for the position. So the PI bears some responsibility for the situation and therefore shouldn’t just turn down the request. The PI is also obligated to ensure the safety of her staff.  The correct response is to address the technician's concerns about safety and to find a solution that does not jeopardize the PI's research program.

In the future, the PI should ensure that new hires understand (and acknowledge) the duties and expectations of the job.

Saturday, May 8, 2010

PI Precautions

This post continues the discussion about what a PI is obligated to do in the event her technician asks to be excused from working after hours.  See previous two posts for background.  In this post, I suggest some basic precautions that the PI should have taken. 

1. Job description. It is always a good idea to spell out in writing (even if your institution doesn’t require it) what the official duties are for each position in the lab and specifically what this work entails—working long hours, travel involving overnight stays, exposure to bad weather or hazardous conditions, etc. These specifics should be part of any job description and explained verbally to job candidates. To be doubly sure, a PI can request the new employee’s signature (or email response) on the job description, acknowledging that they understand the duties and have agreed to them. This precaution will avoid future disagreements over who said what and when.

2. Other staff. Applying policies unequally among staff is a good way to create a disgruntled lab group, or even to invite an employment discrimination suit. The solution here is simple: apply all policies equally.  This means that if a PI excuses one technician from working after hours, then the option must be extended to all of them. If the research depends on staff being able to work after hours, then the PI would want to avoid having to offer this option. If this expectation is made clear to job applicants and expressly stated in the duties, then there will be less likelihood that employees will ask to be excused.

3. Safety. First, the PI must ensure that her lab and staff are meeting basic safety and OSHA regulations. This is to protect them and the PI (in the event of an incident).

a. At a minimum, the PI should have protocols in place for dealing with likely emergencies (chemical spills, fire, etc.)—and these protocols should be in writing (preferably in a binder readily accessible in the lab).

b. There should be emergency numbers posted prominently.

c. Standard safety equipment (coats, goggles, masks, eyewash station, etc.) should be available and in working order.

d. MSDS (Material Safety Data Sheets) should be available for all chemicals in the lab and compiled in a single place—for easy access.

e. I would also suggest holding regular lab group meetings to go over safety protocols (minimum once per year or whenever there is a large turnover of staff)—and document that you held this meeting.

f. If your staff does fieldwork, there should be standard protocols such as filing a “float plan”, holding a “tailgate safety meeting” prior to departure, and carrying emergency phone numbers, a cell phone or satellite phone, first aid kits, etc.

g. It’s a good idea for all staff to have taken first aid and CPR training. My agency requires it, along with annual refreshers. Many institutions arrange for such training through the Red Cross.

h. If your group handles especially hazardous materials or operates special machinery or vehicles, ensure that all have received official training and adhere to regulations.

If the PI has done all the above, then in the event of an incident, there will be less chance that the PI will be blamed for negligence. It’s not a guarantee, of course, but failure to have done these things can be used against the PI in the event of litigation. It is unwise to assume that an accident is unlikely or that a subordinate (or their family) is unlikely to sue in the event of injury or death. A PI must assume that accidents will happen no matter how benign the setting or how careful the staff.

In terms of building security, a PI cannot usually afford to install surveillance cameras or alarms, hire security guards, or implement elaborate building access restrictions. However, these are all suggestions that can be made to the institution—and the PI should be sure to document that she made these suggestions and what the institution’s response was.

All of the above suggestions are what should be done prior to being faced with the hypothetical issue of the fearful employee. But they don’t really provide a solution to this case. The next post provides a more specific solution.

Thursday, May 6, 2010

Decisions, Decisions

This post is a follow-up to the previous post that described a hypothetical situation in which a PI has hired a new technician who later expresses a fear of working after hours in a nearly deserted building. Other staff members regularly work outside normal hours because of the nature of the research. I asked whether the PI is obligated to excuse her from those duties.

The majority of readers (so far) think that the PI is not obligated to comply with this request (see poll at bottom of page).

In this post, I will begin to examine the issue, and in succeeding posts suggest some solutions to this specific situation.

There are several aspects to this issue. Let’s take a look at each one.

1. Job description. First, the PI must consider what this employee’s job description says and what was said when she was hired. If the employee expressed a concern about working after hours during the interview, and she was told that she would not be required to do so, then there is an implied contract.  The PI may be obligated to uphold this contract. If the PI (or HR) specified, on the other hand, that she would occasionally or routinely have to work after hours (or whatever the case might be), then she accepted the job under that agreement. 

2. Other staff.  The PI must apply any policy equally to all staff in her lab.  She cannot excuse one staff member from some aspect of the work without extending the same possibility to everyone (an exception might be made for someone with a disability, for example, but that would likely have been discussed upon hiring). Failure to do this could lead to a discrimination lawsuit and land the PI in very hot water.

3. Safety. Even if the PI thinks that safety is not a real issue in the building or whatever the work situation entails, failure to take proper precautions could expose the PI (and institution) to legal problems if an accident occurs.  The PI is responsible for the safety of those staff she supervises (and anyone who spends time in her lab). Anyone working alone in an office, a lab, or in the field can have an accident (chemical spill, falls), suffer some health problem (allergic reaction, heart attack), or be the victim of a criminal act. Not only does the PI not want any harm to come to her staff on ethical/moral grounds, she also does not want to put herself into a position of liability. If an accident occurs, the first thing that the institution will investigate is whether the PI followed safety regulations (OSHA and institutional regulations) and did everything to ensure staff safety. If the PI hasn’t taken reasonable precautions, guess who is going to be blamed?

In the next post, I'll outline some options a PI might consider in dealing with this situation.

Tuesday, May 4, 2010

You Are Excused?

Hypothetical situation:

You are a PI and have hired a technician (female). Your lab's research occasionally requires work after hours and on weekends to monitor experiments or to complete critical analyses. After a few months, the new technician expresses fear at working in a nearly deserted building and asks to be excused from such work. The security at your workplace is not great, but there have never been any "incidents".  You have other staff who routinely perform after-hours research.  Are you obligated to agree to the new technician's personal preferences?

What do you think? Express your overall decision in the poll (for results, see bottom of page).  I'll provide my thoughts on the issue later.

Tuesday, April 13, 2010

Proceed with Caution

Should you have professional liability insurance? Consider the following examples (most based on real situations, a few hypothetical):

Example 1: An employee, who was given a poor performance rating by you, initiates an EEO suit against your agency, claiming discrimination. The agency loses and initiates an adverse action against you.

Example 2: You learn that your graduate advisor has plagiarized a paper you wrote, but never published. Your complaints to the university are brushed off.

Example 3: Following budget cuts, you are assigned to a position for which you are not qualified. When you complain, your employing agency threatens adverse action.

Example 4: One of your subordinates falsifies a financial document, which you sign off on (without knowing that the information was false). Years later, an anonymous “tip” and subsequent audit uncover the inaccuracies, and you are held responsible.

Example 5: You (a technician) report a professor for violating radiation safety regulations and are fired a few days later.

Example 6: You provide campus police with information about a colleague in your department who has been engaging in “erratic, stalking behavior”. This person is eventually dismissed from his position. You are later named as a defendant in a lawsuit brought by this person against the university and individual department members.

Example 7: You (an instructor) accuse a student in one of your classes of plagiarism. The student complains to your department, which has to go through a great deal of effort to resolve the situation. Your contract is not renewed the next semester. Your spouse, who works in the same department, is also let go.

Example 8: Someone uses your office computer to download pornography, which is discovered after an anonymous “tip” to campus police. You are dismissed from your position.

Example 9: You (a student) write an anonymous criticism of your university administration in a popular blog. The university uncovers your identity and not only threatens you with university disciplinary action, but sues you.

Example 10: You are a minor co-author on a student’s paper that is later found to contain plagiarized material and must be retracted. The student admits to the plagiarism, and the student’s advisor (a tenured professor and a senior author on the paper) also accepts responsibility. They are in a different department from you.  You go up for tenure shortly after this debacle and are denied, despite having a strong tenure package.  You suspect that the plagiarized paper played a role in this decision.

In some of the above cases, it cost between $10,000 and $70,000 in legal assistance to resolve the situation. Note that it doesn't matter whether you are "innocent" of wrongdoing; you will still need legal advice and/or representation. Other situations can lead to even greater costs, especially if criminal charges are involved.

I never thought about needing liability insurance until I saw some colleagues go through some experiences like the ones above. I was especially disturbed to learn that not only might your employing institution or agency not provide legal assistance to you, but that they might even initiate adverse actions against you in some cases.

I know of only a few colleagues who have liability insurance.  Most of those I've spoken with at academic institutions think they don't need it (or don't want to think about it).  More colleagues in the Federal government, however, seem to have liability insurance.  Supervisors are particularly vulnerable, at some point in their careers, to being accused of wrongdoing by subordinates.  The government may provide a lawyer for you in such cases, but only if it's in the agency's (not your) best interests.  Thus, Federal employee associations recommend professional liability insurance, because members have ended up paying $30,000 or more to defend themselves against accusations.