We're talking about the Blue Ocean Strategy (BOS) (see previous posts for background). The second approach in the BOS is to look across alternative strategies within an industry to identify potential blue oceans. An example in the automotive industry would be manufacturers of expensive luxury cars vs. makers of low-budget, practical cars: two different groups of customers within a larger industry. The luxury car makers (BMW, Mercedes, Jaguar) compete with each other, but not with the low-budget manufacturers. A blue ocean strategist might combine some of the luxury features with the more practical aspects in an intermediate-priced car--potentially creating a new market.
An example of a company that merged alternative strategies in an industry is Curves, a popular women's fitness company. They looked across strategies in the fitness industry and came up with a combination of health club and home exercise program. They reasoned that women who were struggling to stay fit avoided health clubs where they were intimidated by complicated exercise machines and hated being scrutinized by men. Women who used home exercise videos could work out in private at a fraction of the cost of a health club and with little or no equipment.
Women switch from home exercise to health clubs because of the difficulty of sticking to an exercise routine at home alone. Curves combined the advantages of the two exercise strategies and eliminated everything else (complicated machines, pools, spas, locker rooms). They established clubs for women in which hydraulic exercise machines (simple and nonthreatening) are arranged in a circle to promote socializing. Few mirrors hang on the walls to remind customers of their less-than-perfect bodies. Women move around the circle from machine to machine at their own pace, and the routine can be completed in 30 minutes. The cost is more than a home exercise video, but a fraction of a typical health club membership. Curves created a blue ocean--one that has been highly successful.
What might be a situation in science analogous to the automotive and fitness examples described above? We might consider the expensive, complicated research program vs. a research field that uses inexpensive, simpler methods. For example, we might combine molecular biology (expensive instruments, complicated or meticulous laboratory techniques, specialized knowledge) and ecology (few or no instruments, simpler and less demanding methods, basic knowledge). We might also combine basic and applied research approaches.
Consider an ecologist studying how "nurse" plants facilitate the establishment and growth of other plant species (by trapping seeds or ameliorating environmental conditions). The initial interest is in understanding how natural ecosystems function--specifically, plant-plant interactions. By incorporating molecular techniques into a basic ecological study, one might gain better insight into the underlying mechanism of facilitation. Different genotypes within a nurse species might vary in their "nurse" characteristics due to morphological or other features. Such genotypic differences could be exploited to identify suitable nurse plant material for ecosystem restoration projects at sites where environmental conditions are stressful to the target species being restored. The genotypes identified in the laboratory would then be field tested and evaluated using standard ecological methods, and the results used to guide restoration of disturbed or degraded ecosystems. Restoration sites could be prepared by introducing the selected nurse genotype(s), which might promote natural recruitment of the target species or modify the site for later planting (reducing overall project costs).
Such work would address basic science questions related to plant-plant interactions while also addressing important applied aspects. An ecologist might team up with a molecular biologist and perhaps seek funding from sources that neither would normally consider or be successful with separately. Or the ecologist could acquire the knowledge and some basic equipment to incorporate a few molecular techniques into their own repertoire.
In the past, scientists trained in different disciplines rarely talked, much less collaborated to address a research question. Although this segregation is dissolving, many ecologists, geologists, microbiologists, and others still work primarily within their own disciplines and do not seek partnerships with those outside their specialty. Yet the interface between scientific disciplines is one place where blue oceans exist. If you look at major breakthroughs in science, you often see the collaboration of two or more disciplines. Many of our most difficult scientific and societal challenges (environmental pollution, climate change) will require multi-disciplinary teams that combine disparate knowledge, methods, and approaches to come up with viable solutions.
So, to summarize: the key is to identify alternative strategies (e.g., scientific disciplines) that, when combined, lead to a blue ocean--one in which there is little or no competition, at least in the beginning. Once your approach or idea is published, others will follow. But you will have a head start on the crowd and, with luck, will be well established as a leader in the new field by the time the competition catches up.
Image credit: www.flickr.com (modified from a photo by M. Smith)
Sunday, October 24, 2010
Uncharted Waters
This post continues the discussion of the Blue Ocean Strategy (BOS)--a business concept that strives to make the competition irrelevant by creating new, uncontested market space. I've been attempting to apply some of the strategic moves of this approach to building a science career and dealing with competition.
There are six assumptions underlying the strategies used by businesses. I've modified them here to apply to the science professional:
1. We define the profession similarly and strive to be the best within it.
2. We view science through the lens of accepted strategic groups (e.g., basic and applied research).
3. We focus on the same user group: other scientists (research), students (teaching), private clients (consulting), or the public (government).
4. We define the scope of science products and services similarly.
5. We accept the scientific profession's functional or emotional orientation.
6. We focus on the same point in time in formulating strategy.
The more you adhere to this conventional wisdom, the greater the overlap with competitors (a red ocean). BOS says that to break out of this red ocean, you have to look outside the conventional boundaries to create a blue ocean.
BOS suggests several paths that one might take to break free of the red ocean.
In this post, I'll consider one path: look across alternative professions for inspiration. BOS focuses on alternative industries that provide very different products or services but serve a similar function. One example is restaurants vs. cinemas: the two have few features in common but serve the same purpose--pleasure and entertainment. Another example is transportation: driving vs. flying. Southwest Airlines looked to driving as the alternative to flying (rather than competing with other airlines for customers). Their goal was to provide fast travel by air at the price of car travel. In one fell swoop, they made the other airlines irrelevant and created a blue ocean.
What are the alternatives for science professionals?
First, let's consider what we do as science professionals. Our ultimate goals are to discover new knowledge and to educate others. We conduct research in a scientific field and publish the findings in professional journals or government reports. Some of us teach science to students who go on to do research and teaching (academia), take government jobs (science policy, resource management, regulation), or do consulting (private industry). A few of us participate in "outreach" activities--taking science directly to the public. Most, however, leave that last role to "science writers" and the general media, feeling that a scientist's job is to conduct science, not to translate it for the non-professional.
However, the interface between the scientist and the journalist is a potential blue ocean. A number of scientists have succeeded as "science popularizers": Carl Sagan, Jared Diamond, Oliver Sacks, David Suzuki, Stephen Jay Gould, and others. Many have written popular books or articles; some host science shows on TV; a few have written screenplays that were made into movies; some have popular blogs (Pharyngula). I found it interesting that of the 100 or so science popularizers listed in Wikipedia, only three women were included, of whom only one appeared to have worked as a scientist (Kirsten Sanford). There seems to be an opportunity here for female scientists who have a talent for explaining science to the public.
I'm not proposing that all scientists should become science popularizers. I'm simply pointing out a niche revealed by looking across alternative professions (science and journalism). This niche is occupied mostly by non-scientists (with the few notable exceptions listed above), which creates an opportunity for those with a science background. The difference is that the role of science reporter is then filled by a scientist, who is more knowledgeable than a journalist with only a rudimentary understanding of the topics they cover. If the scientist is an equally good communicator, their science background gives them a decided advantage over the typical journalist: in-depth understanding of science topics, credibility, and contacts in the science community.
That's just one example. Look across other alternative professions: art, architecture, history, horticulture, information technology, law, museums, philanthropy, religion, transportation...to name a few. Lots of potential ideas.
The point is that we limit our opportunities by defining our roles as scientists in a restricted way (researcher, professor) and by thinking that the only way to succeed is to be the best within that limited definition. By breaking free of these traditional roles, we can see new ways to succeed and make a contribution--in non-traditional roles that may be better suited to our talents and in which our science background gives us a competitive edge.
As I said in earlier posts, however, striking off on your own into uncharted waters is risky and scary. But for those who succeed, the payoff can be hugely rewarding.
Image Source: Modified from the painting "Lewis and Clark at Three Forks" by Edgar S. Paxson. Image courtesy of Lewis and Clark 2001, the Montana Historical Society, NOAA/OER. Individuals from left are Coulter, guide; York, Clark's servant; Captain Meriwether Lewis; Captain William Clark; Sacagawea; Charbonneau, Sacagawea's husband.
Thursday, October 14, 2010
The Dark Side of Scientific Competition
In the previous posts, I started out talking about self-promotion, then moved on to competition (and ways to avoid it), and finally, to ethics in science. In this post, I'd like to examine how competitive atmospheres can contribute to unethical behavior by scientists.
In earlier posts, I described the Blue Ocean Strategy, a business concept in which the competition is made irrelevant by creating new market space free of competitors. An example I gave was self-funding by scientists--who use a portion of their income (e.g., from consulting) to cover their research expenses, freeing them from having to write proposals and suffering criticism at the hands of harsh reviewers and panelists. Another example was the idea of submitting proposals to smaller or unusual funding sources where the competition is less intense than at NSF or NIH.
Competition in science can be good when it ensures that resources and other rewards are distributed fairly, after an evaluation of the qualifications and merits of all eligible parties. Without such a system, cronyism prevails. When competition becomes intense, however, it can turn counter-productive and even lead to pathological behavior. Such an atmosphere is particularly problematic within research organizations where PIs are pitted against each other. In some instances, scientists might be forced to compete for limited resources within the organization or to share lab space (a recipe for disaster, in my opinion). We've all heard tales of sabotage, in which one laboratory group interferes with the experiments, equipment, supplies, or data of a rival group. Some of us have experienced sabotage first hand.
My experience (at a previous organization) was quite distressing--not only because of the sabotage itself, but because the lab director failed to correct the atmosphere that encouraged such behavior. One of the instances that stands out in my mind was an occasion in which I had submitted a proposal to the sponsored research office (SRO) for final approval and submission to the funding agency. The submission deadline was near (close of business that day), and I was anxious about this particular proposal, into which I had put a lot of effort. I had walked the proposal through the university system to ensure that there were no delays, and had gotten a final approval signature. I thought everything was clear for the SRO to submit the proposal--and went back to my office.
The contract specialist at the SRO, doing a final check of the proposal, found a problem with the budget that needed to be corrected and called the lab to speak with me. However, I was out of my office at lunch when the call came. The secretary took the call, wrote down the urgent message that I needed to call back immediately, and put it into my mailbox (this was before voice mail). In this particular institution, our mailboxes were simply open slots into which mail was placed--and where anyone else could see it. I came back from lunch, checked my mailbox, and, finding nothing, went on to my office.
About mid-afternoon, I had a nagging feeling and decided to call the SRO contract specialist. I caught her just as she was leaving (early) for the day; not having heard back from me, she had either forgotten about my proposal or didn't care (that's another story). We quickly fixed the budget problem, and the proposal was submitted on time. I later questioned the secretary who took the phone call, and she insisted that she had put the message slip in my box. The next day, the phone message mysteriously appeared in my mailbox. Not only had someone tried to sabotage my funding, but they made sure I knew I had been sabotaged. I suspected who had done it (another PI), although I had no proof. I can't remember whether I complained to the director about this incident--there were so many of them, and my complaining never resulted in any action. I gave up at some point.
My point with this tale is that competition among research groups was strongly encouraged in this particular lab. We also had an open lab plan, which was an open invitation for unethical people to sabotage others' work. Fortunately, most of my experiments were conducted in the field or greenhouse (where access was limited), and I never left samples in unlocked drawers or lab analyses unattended. I don't think any of my experiments were ever sabotaged (that I could detect).
The director of this lab sincerely believed that the competitive model would yield the greatest scientific output. He never considered the costs: lost opportunities for collaboration between the rival groups, time wasted on security measures and on repeating compromised experiments, low morale, and a pathological workplace. The biggest danger is that someone will eventually do something so unethical that it casts suspicion on the research of the entire lab and/or leads to severe sanctions against it. This almost happened as a result of this particular PI's underhanded ways.
He (Dr. X) and another PI with a similar lack of ethics (Dr. Y) plagiarized the proposal of a group of scientists in another department. Dr. Y apparently got access to a draft proposal by this second group (I think he may have gotten a copy from a post-doc or graduate student). Drs. X and Y then wrote their own proposal, lifting whole pages of text from the other group's--perhaps thinking it would never be discovered. Both proposals were to be submitted to NSF. The contract specialist in the SRO noticed the similarity of the two proposals, and an investigation ensued. I thought, "At last, Dr. X has been caught with his hand in the cookie jar."
I should have known better.
Amazingly, Dr. X weaseled his way out of this dilemma. Here is what happened. Instead of attempting to deny or defend his and Dr. Y's actions, he scoured the publications of the authors of the original proposal. He found a recent book chapter in which one of the authors had reproduced a figure from one of Dr. X's early papers (a famous one). The figure in question was used in the chapter to illustrate a well-known phenomenon, but the attribution to the original source somehow was omitted from the figure legend (an oversight caused by a careless post-doc helping with the chapter preparation--not deliberate plagiarism). Dr. X produced this figure as evidence that his work had been plagiarized by the other group. The university administration, faced with this conundrum, decided to sweep it all under the rug. They forced both groups to withdraw their proposals, but no one was sanctioned for plagiarism.
An interesting paper in the journal Science and Engineering Ethics by Melissa Anderson and colleagues describes the counter-productive outcomes of competition among scientists. The abstract:
Competition among scientists for funding, positions and prestige, among other things, is often seen as a salutary driving force in U.S. science. Its effects on scientists, their work and their relationships are seldom considered. Focus-group discussions with 51 mid- and early-career scientists, on which this study is based, reveal a dark side of competition in science. According to these scientists, competition contributes to strategic game-playing in science, a decline in free and open sharing of information and methods, sabotage of others’ ability to use one’s work, interference with peer-review processes, deformation of relationships, and careless or questionable research conduct. When competition is pervasive, such effects may jeopardize the progress, efficiency and integrity of science.
The authors point out that none of the scientists in the focus groups had anything positive to say about the impact of competition on their work--just the opposite. Some of the participants felt that part of the reason is that science is much more competitive today than in the past (when it was easier to be collegial). They pointed out that patents and other related issues are more often at stake, making secretiveness more prevalent today. Increased competition may not be the whole story, but it can obviously exacerbate unethical behavior.
What's disturbing is that those who fund, manage, and regulate scientific research may not be fully aware of the negative impact of competition on a field that thrives on openness, sharing of ideas, and collegiality. There is renewed emphasis on scientific integrity, as evidenced by the formation of new offices throughout government agencies charged with identifying and punishing scientific fraud. I wonder how many of these science managers recognize the role of competition and the pressures it imposes on scientists. My little example above suggests that some managers assume competition is good--that it leads to healthy rivalry and greater scientific output. My own experience, and apparently that of other scientists, suggests otherwise.
Image Credit (modified from photo at flickr.com)
Tuesday, October 12, 2010
But I Didn't Cut and Paste Text!
A commenter recently questioned my definition of plagiarism as well as my recommendation to avoid using students or other trainees in conducting manuscript reviews. These are important points that warrant further discussion, so I'll spend a bit of time in this post expounding on my views. Note that these are my views, based on my experience, my discussions with colleagues, and my reading about the issues. Others certainly have the right to their own opinions (especially what is acceptable within their specific fields), and I'm not trying to say that my view is the only acceptable view.
Plagiarism:
The American Association of University Professors defines plagiarism as "taking over the ideas, methods, or written words of another, without acknowledgment and with the intention that they be taken as the work of the deceiver." The Office of Research Integrity (ORI) likewise defines plagiarism as taking words, ideas, etc. from an author and presenting them as one's own. The Office of Science and Technology Policy (1999) defines plagiarism as "the appropriation of another person's ideas, processes, results, or words without giving appropriate credit, including those obtained through confidential review of others' research proposals and manuscripts."
I would define plagiarism similarly, even without having read these organizations' definitions. However, I recognize that some may wish to limit the use of the term "plagiarism" to the appropriation of text only. The problem with this (aside from not being an accepted definition) is that students may get the idea that the taking of other things (ideas, methods, hypotheses) without attribution is OK because it's not "technically" plagiarism. I've encountered students who expressed this belief to me.
The ORI defines plagiarism of ideas as "appropriating an idea (e.g., an explanation, a theory, a conclusion, a hypothesis, a metaphor) in whole or in part, or with superficial modifications without giving credit to its originator." It goes on to say: "In the sciences, as in most other scholarly endeavors, ethical writing demands that ideas, data, and conclusions that are borrowed from others and used as the foundation of one's own contributions to the literature, must be properly acknowledged. The specific manner in which we make such acknowledgment varies from discipline to discipline. However, source attribution typically takes the form of either a footnote or a reference citation."
The ORI offers an interesting example of a situation in which an ethical author cited an unusual source of inspiration for his theory on light perception:
"Even in such cases, we still have a moral obligation to credit the source of our ideas. A good illustrative example of the latter point was reported by Alan Gilchrist in a 1979 Scientific American article on color perception. In a section of the article, which describes the perception of rooms uniformly painted in one color, Gilchrist states: 'We now have a promising lead to how the visual system determines the shade of gray in these rooms, although we do not yet have a complete explanation. (John Robinson helped me develop this lead.)' (p.122; Gilchrist, 1979). A reader of the scientific literature might assume that Mr. Robinson is another scientist working in the field of visual perception, or perhaps an academic colleague or an advanced graduate student of Gilchrist’s. The fact is that John Robinson was a local plumber and an acquaintance of Gilchrist in the town where the author spent his summers. During a casual discussion, Robinson’s insights into the problem that Gilchrist had been working on were sufficiently important to the development of his theory of lightness perception that Gilchrist felt ethically obligated to credit Robinson’s contribution."
Some scientists would scoff at Gilchrist's acknowledgment of a plumber and argue that this was unnecessary. I think his action shows integrity and, moreover, a deep understanding of the concept of plagiarism. Gilchrist clearly recognizes that his reported insights on light perception would not have occurred (or would have been quite different) had he not had the input of the plumber--and was obligated to acknowledge that source of inspiration.
If students are taught that plagiarism is only the cutting and pasting of text, they may think that appropriation of ideas, hypotheses, methods, etc. is not unethical (or at least not labeled as plagiarism and therefore not subject to sanction). This would be a serious mistake with potentially severe consequences.
Even if one is aware of this aspect of plagiarism, it is easy to inadvertently appropriate someone else's idea or concept (sometimes called "unconscious plagiarism"). Our minds can play tricks on us, and an idea that we think is original may in fact be something we read and later remembered as our own (as my memory deteriorates with age, I'm more concerned about this now than when I was younger). Most of us are careful to cite the originator of major theories, hypotheses, or concepts in our papers. Sometimes, however, the way the text is worded can give the impression that another's idea is our own. Another error occurs when an author, working from notes, inadvertently uses the exact wording of another author, thinking the notes were his/her own words summarizing the other work (always place word-for-word notes in quotation marks so you do not make this mistake).
One exception to the plagiarism of ideas is "common knowledge." It is appropriate to make statements based on widely recognized phenomena without attribution, e.g., "plants capture CO2 through the process of photosynthesis." A rule of thumb offered by the ORI is that if the idea or concept is widely known among high school and college students, then it is common knowledge. What about ideas that are not common knowledge among students but are widely recognized by experts in the field? Here's where things can get tricky, and the decision requires some experience and an understanding of what's common knowledge and what requires citation (students often need guidance here). If the work is to be published in a technical journal and the target audience is the expert, then statements based on a large body of work might not need a citation. For example, one might open with a statement such as "The sensitivity of higher plants to elevated concentrations of CO2 depends on the specific photosynthetic pathway of each species...we compared the responses of C3 versus C4 species." That is perhaps not common knowledge for the average student, but it certainly is for people working on photosynthesis. However, if you stated that 82% of C3 species respond to elevated CO2 with increased rates of photosynthesis, that would require citation(s).
Another possible exception is the semi-technical article or book chapter requiring a less "formal" style of writing. Editors may ask for text that is unbroken by numerous citations (as one would expect in a technical paper) so that the writing appeals to the non-expert reader. In these cases, the article or chapter would be accompanied by a list of "additional reading", which was used in the preparation of the piece and contains the cited material.
I plan to write more about plagiarism in future posts--it's a complex topic, many aspects of which authors are not always fully aware (including me). Even the most experienced can unknowingly commit errors or may be uncertain how to proceed in specific situations. I'm certainly no expert on plagiarism, but hope to explore the topic by writing about it and, in the process, refine my understanding of its various forms.
The views of readers of this blog definitely help shape such explorations.
Students and Trainees as Manuscript Reviewers
The concern here is the use of students or other trainees to perform manuscript or proposal reviews for their mentors (who were the ones asked to do the review). As an editor, I would question the ability of a trainee (especially someone who has never authored anything) to provide an expert review--which is what the journal expects (or should be seeking) when soliciting a review. As an author, I would be concerned that my work had been reviewed by an inexperienced trainee, even under the mentorship of a senior person. I expect a fair evaluation carried out by a peer who is well versed in the topic of my work and who has published (i.e., an expert qualified to assess the quality of my work and whether it contributes significantly to the field).
No matter how good or conscientious a trainee is, they are not the equal of an expert. If they need "close supervision" by a senior person, one might argue that this confirms they are unqualified to conduct an official peer review. How would the journal or funding agency defend such a review, if challenged? They would have no way of knowing whether the PI closely supervised the trainee or simply forwarded the trainee's review without looking at it. I know the latter happens because, as a graduate student, I was often asked by a previous lab director to do his reviews for him. Back then, I did not know any better and never questioned the practice. One might argue that I probably did a better and more thorough job than the director would have, but what if I had not? He did not even read the manuscripts or proposals, so he had no idea whether my reviews were fair or accurate.
The point is not whether a trainee can provide a passable review (some certainly can) or whether they are supervised by a mentor. The concern is the author's expectation that their manuscript or proposal has been assessed by an expert and that the scores and the ultimate acceptance or rejection are based on the evaluations of those qualified to make that judgment. A trainee (especially a student) may not meet that expectation. A post-doc who has published at least one first-authored paper or prepared a proposal may be qualified to review manuscripts or proposals. However, if the mentor is the one asked to do the review, s/he should inform the journal that the review will be carried out by a trainee and how much supervision will be involved.
If a trainee (e.g., a post-doc) has the necessary credentials to be considered a "peer" and is capable of performing a review (based on the mentor's judgment), then it would be reasonable to recommend that trainee as a reviewer. If the journal or funding agency has a mechanism for "assistant reviewers," then at least the review can be assessed with that knowledge. More importantly, the identities of all contributors to the review are then formally documented and known to the journal or funding agency (in the event of a challenge). Journals in my field, however, have no such mechanism (that I'm aware of). In that case, it seems most appropriate for the mentor to suggest their post-doc as a substitute and let the journal editor extend the invitation--which formally documents who actually carries out the review.
Personally, I would not want to become embroiled in an investigation in which an author claims that a review was unfair (and it's discovered that my review was mostly written by a trainee, a substitution that was not formally documented by the journal). So, my advice would be to proceed with caution if you have trainees doing....I mean helping with, your reviews.
I don't think the need to train students is a valid reason for having trainees conduct reviews. Students can be trained to review manuscripts without involving them in the actual review process. A mentor can select published papers ranging from excellent to poor (there are plenty in the literature to choose from) and use them to teach students how to conduct reviews. Another option is to use unpublished manuscripts that the mentor reviewed in the past, have the trainee conduct a mock review, and then compare the trainee's review with the actual review the mentor submitted (taking care to ensure that the trainee does not learn the identity of the author or make use of any information contained in the manuscript).
Training may be the motivation of some PIs in using assistant reviewers, but such training may be accomplished in other ways. In my experience, the reason that some (many?) PIs use assistant reviewers is simply to relieve themselves of the task (and justify it as training). As I said before, if you don't have time to do the review, you can decline the request.
What distinguishes this situation is that there are two competing obligations. A mentor definitely has a moral obligation to help their trainees, but there is also the obligation to ensure an "expert" review. If the journal welcomes "assistant reviewers" and has a mechanism for documenting their involvement, and the trainee is capable, then the PI may be safe in using them. A side benefit may be experience for the trainee, but that should not be the primary justification.
Image Credits (created with images from Flickr, iStockphoto, and http://www.rmu.edu/SentryHTML/images/gallery/students/group2/student_professor3.jpg)
Plagiarism:
The American Association of University Professors defines plagiarism as..."...taking over the ideas, methods, or written words of another, without acknowledgment and with the intention that they be taken as the work of the deceiver." The Office of Research Integrity (ORI) also defines plagiarism as involving the taking of words, ideas, etc. from an author and presenting them as one’s own. The Office of Science and Technology Policy (1999) defines plagiarism as: "... the appropriation of another person’s ideas, processes, results, or words without giving appropriate credit, including those obtained through confidential review of others’ research proposals and manuscripts."
I would define plagiarism similarly, even without having read these organizations' definitions. However, I recognize that some may wish to limit the use of the term "plagiarism" to the appropriation of text only. The problem with this (aside from not being an accepted definition) is that students may get the idea that the taking of other things (ideas, methods, hypotheses) without attribution is OK because it's not "technically" plagiarism. I've encountered students who expressed this belief to me.
Plagiarism of ideas is "Appropriating an idea (e.g., an explanation, a theory, a conclusion, a hypothesis, a metaphor) in whole or in part, or with superficial modifications without giving credit to its originator." The ORI goes on to say "In the sciences, as in most other scholarly endeavors, ethical writing demands that ideas, data, and conclusions that are borrowed from others and used as the foundation of one’s own contributions to the literature, must be properly acknowledged. The specific manner in which we make such acknowledgment varies from discipline to discipline. However, source attribution typically takes the form of either a footnote or a reference citation."
The ORI offers an interesting example of a situation in which an ethical author cited an unusual source of inspiration for his theory on light perception:
"Even in such cases, we still have a moral obligation to credit the source of our ideas. A good illustrative example of the latter point was reported by Alan Gilchrist in a 1979 Scientific American article on color perception. In a section of the article, which describes the perception of rooms uniformly painted in one color, Gilchrist states: 'We now have a promising lead to how the visual system determines the shade of gray in these rooms, although we do not yet have a complete explanation. (John Robinson helped me develop this lead.)' (p.122; Gilchrist, 1979). A reader of the scientific literature might assume that Mr. Robinson is another scientist working in the field of visual perception, or perhaps an academic colleague or an advanced graduate student of Gilchrist’s. The fact is that John Robinson was a local plumber and an acquaintance of Gilchrist in the town where the author spent his summers. During a casual discussion, Robinson’s insights into the problem that Gilchrist had been working on were sufficiently important to the development of his theory of lightness perception that Gilchrist felt ethically obligated to credit Robinson’s contribution."
Some scientists would scoff at Gilchrist's acknowledgment of a plumber and argue that this was unnecessary. I think his action shows integrity and, moreover, a deep understanding of the concept of plagiarism. Gilchrist clearly recognizes that his reported insights on light perception would not have occurred (or would have been quite different) had he not had the input of the plumber--and was obligated to acknowledge that source of inspiration.
If students are taught that plagiarism is only the cutting and pasting of text, they may think that appropriation of ideas, hypotheses, methods, etc. is not unethical (or at least not labeled as plagiarism and therefore not subject to sanction). This would be a serious mistake with potentially severe consequences.
Even if one is aware of this aspect of plagiarism, it is very easy to inadvertently appropriate someone else's idea or concept (sometimes called "unconscious plagiarism"). Our minds can play tricks on us, and an idea that we think is original may in fact be something we read and later remembered as our own (as my memory deteriorates with age, I'm more concerned with this point now than when I was younger). Most of us, however, are careful to cite the originator of major theories, hypotheses, or concepts in our papers. But sometimes, the way the text is worded, the impression may be given that another's idea is our own. Another error is when an author, working from notes, inadvertently uses the exact wording of another author, thinking that the notes were his/her own words summarizing the other work (always place word-for-word notes in quotes so that you do not make this error).
One exception to the plagiarism of ideas is "common knowledge". It is appropriate to make statements based on widely-recognized phenomena without attribution, e.g., "plants capture CO2 through the process of photosynthesis". A rule of thumb offered by the ORI is that if the idea or concept is widely-known among high school and college students, then it is common knowledge. What about ideas that are not common knowledge of students, but are widely recognized by experts in the field? Here's where things can get tricky, and the decision requires some experience and understanding of what's common knowledge and what requires citation (students often need guidance here). If the work is to be published in a technical journal, and the target audience is the expert, then statements based on a large body of work might not need a citation. For example, one might have an opening statement such as "The sensitivity of higher plants to elevated concentrations of CO2 depends on the specific photosynthetic pathway of each species.....we compared the responses of C3 versus C4 species." Not perhaps common knowledge of the average student, but certainly so for people working on photosynthesis. However, if you made the statement that 82% of C3 species respond to elevated CO2 with increased rates of photosynthesis, then this would require citation(s).
Another possible exception is the semi-technical article or book chapter requiring a less "formal" style of writing. Editors may ask for text that is unbroken by numerous citations (as one would expect in a technical paper) so that the writing appeals to the non-expert reader. In these cases, the article or chapter would be accompanied by a list of "additional reading", which was used in the preparation of the piece and contains the cited material.
I plan to write more about plagiarism in future posts--it's a complex topic, many aspects of which authors are not always fully aware (including me). Even the most experienced can unknowingly commit errors or may be uncertain how to proceed in specific situations. I'm certainly no expert on plagiarism, but hope to explore the topic by writing about it and, in the process, refine my understanding of its various forms.
The views of readers of this blog definitely help shape such explorations.
Students and Trainees as Manuscript Reviewers
The concern here is about using students or other trainees to perform manuscript or proposal reviews for their mentors (who were asked to do the review). As an editor, I would question the capability of a trainee (especially someone who has never authored anything) to provide an expert review--which is what the journal expects (or should be seeking in soliciting a review). As an author, I would be concerned that my work was reviewed by an inexperienced trainee, even under the mentorship of a senior person. I'm expecting a fair evaluation carried out by a peer who is well-versed in the topic of my work and who has published (i.e., is an expert and therefore qualified to assess the quality of my work and if it contributes significantly to the field).
No matter how good or conscientious a trainee, they are not equal to an expert. If they need "close supervision" by a senior person, one might argue that this confirms they are unqualified to be conducting an official peer review. How would the journal or funding agency defend such a review, if challenged? They would have no way of determining whether the PI closely supervised the trainee or instead simply forwarded the trainee's review without looking at it. I know the latter happens because I was often asked by a previous lab director to do his reviews for him (when I was a graduate student). Back then, I did not know any better and never questioned this practice. One might argue that I probably did a better and more thorough job than the director would have, but what if I had not? He did not even read the manuscripts or proposals, so he did not know if my reviews were fair or accurate.
The point is not whether a trainee can provide a passable review (some certainly can) or that they are supervised by a mentor. The concern is the author's expectation that their manuscript or proposal has been assessed by an expert and that the scores and ultimate acceptance/rejection are based on the evaluations of those qualified to make that judgment. The trainee (especially a student) may not meet that expectation. A post-doc who has published at least one first-authored paper or prepared one proposal may be qualified to conduct reviews of manuscripts/proposals. However, if the mentor is the one asked to do the review, s/he should inform the journal that the review is to be carried out by a trainee and how much supervision will be involved.
If a trainee (e.g., a post-doc) has the necessary credentials to be considered a "peer" and is capable of performing a review (based on the mentor's judgment), then it would be safe to recommend that trainee as a reviewer. If the journal or funding agency has a mechanism to allow the use of "assistant reviewers", then at least the review can be assessed with that knowledge. More importantly, the identity of all contributors to the review are formally documented and known to the journal or funding agency (in the event of a challenge). Journals in my field, however, have no such mechanism (that I'm aware of). In that case, it seems most appropriate for the mentor to suggest their post-doc as a substitute and let the journal editor extend the invitation--which provides a means to formally document the person who actually carries out the review.
Personally, I would not want to become embroiled in an investigation in which an author claims that a review was unfair (and it's discovered that my review was mostly written by a trainee, a substitution that was not formally documented by the journal). So, my advice would be to proceed with caution if you have trainees doing...I mean, helping with...your reviews.
I don't think the need to train students is a valid reason for having trainees conduct reviews. Students can be taught to review manuscripts without involving them in the actual review process. A mentor can select published papers ranging from excellent to poor (there are plenty in the literature to choose from) and use them to train students in conducting reviews. Another possibility is to use unpublished manuscripts that the mentor reviewed in the past, have the trainee conduct a mock review, and then compare the trainee's review with the actual review submitted by the mentor (care would need to be taken to ensure that the trainee does not learn the identity of the author or make use of any information contained in the manuscript).
Training may be the motivation of some PIs in using assistant reviewers, but such training may be accomplished in other ways. In my experience, the reason that some (many?) PIs use assistant reviewers is simply to relieve themselves of the task (and justify it as training). As I said before, if you don't have time to do the review, you can decline the request.
What distinguishes this situation is that there are two competing obligations. A mentor definitely has a moral obligation to help their trainees, but there is also the obligation to ensure an "expert" review. If the journal welcomes "assistant reviewers" and has a mechanism for documenting their involvement, and the trainee is capable, then the PI may be safe in using them. A side benefit may be experience for the trainee, but that should not be the primary justification.
Image Credits (created with images from Flickr, iStockphoto, and http://www.rmu.edu/SentryHTML/images/gallery/students/group2/student_professor3.jpg)
Sunday, October 10, 2010
Don't Tell Anyone I Gave This To You
In a previous post, I described an ethical dilemma, one that probably occurs fairly frequently. I've encountered variations of it during my career. A full description of this dilemma along with an expert opinion can be found here. The following is my modified version that is similar to one of my experiences:
Cynthia is an ambitious post-doc having a problem with one of her laboratory techniques--extracting an enzyme from plant tissue containing lots of phenolic compounds (which bind proteins). She's been trying for weeks to resolve this, but has been unsuccessful. She is at her wit's end and finally goes to her PI to ask for help. After listening to her tale of woe, he tells Cynthia not to worry--that he'll have a solution for her tomorrow. The next day, she finds a manuscript on her desk with a note from the PI. It says, "Check out the methods section...it has the solution to your problem. However, don't make a copy of this or give it to anyone else. Also, don't tell anyone that I gave this to you." She tries the method described in the paper and lo and behold, it works! When she excitedly reports this to her PI, she says, "I've scoured the literature and can find no mention of this technique. Where did you get this paper?" The PI smiles mysteriously and says, "I've got dozens of them in my files."
I asked if the PI's actions were unethical. The answer is, it depends. Here is a summary of the expert's opinion (see the link above for the full version):
The moral dilemma hinges on the source of the paper. Was it one of the PI's old, unpublished papers? One of his student's unpublished papers? Or was it a manuscript he got for review?
1. If the paper was written by the PI (and based on work done in his lab), then the PI is free to give the information and data to the post-doc to use.
2. What may be less clear is whether the PI can also give a former student's unpublished paper to the post-doc. In this example, the student never published the paper and has left the university. The post-doc can be given access to the contents because the PI's university likely owns the student's work as intellectual property. According to the expert, the only way the student can claim the work is if s/he had previously gotten university permission to copyright the material. Contrary to what many people think, the work conducted by a researcher (including their ideas) at an organization such as a university becomes the intellectual property of that organization. In this case, the former student had no such copyright, so the PI, acting as the university's representative, has the authority to give the post-doc access to the contents of the paper. It would be appropriate to contact the student to notify them that their unpublished information is to be used--and then credit them as the source in the acknowledgments section. Unless the student makes a substantial contribution to the writing of the paper and a major intellectual contribution to the current work, it is not appropriate to make them an author (the post-doc and the PI could potentially offer this option to the former student).
3. If the paper is one that the PI received for review, then giving it to the post-doc without the permission of the journal or the author is unethical. The PI may be conflicted by his desire to help his stressed-out post-doc--and this may take precedence in his decision. However, doing so is a breach of the confidentiality agreement that he entered into in accepting the role of reviewer. In other words, the confidentiality agreement takes precedence over the PI's "moral" obligation to help the post-doc. Furthermore, the PI is not exercising good judgment about how they will eventually use the information in their own publication or how they will acknowledge its source. Giving the paper and the method to the post-doc may solve her immediate problem, but it creates an even bigger problem for her in the future. Unauthorized use of intellectual property obtained through privileged communication (the review process) in another paper or proposal is plagiarism. If they publish based on this method, they would be representing as their own ideas that were in fact taken from someone else's work. If caught, both the PI and the post-doc could be subject to severe sanctions.
The expert opinion goes on to point out the likelihood that the plagiarism will be uncovered eventually. If you think about it, any future paper by this PI and post-doc containing the plagiarized material will likely be read by the author of the original paper. The author and the PI clearly work in the same specialized field, which is why the PI got the paper for review. Isn't it just as likely that the author will get the PI's future paper for review? In any case, the author will eventually see it when it is published. This outcome is especially likely in a highly specialized field in which there are few experts.
My take on this scenario is that it is probably more common than you think. I'm aware of colleagues who pass around papers they are reviewing or discuss the contents with others. When confronted, they may admit they shouldn't do it, but then act as if it is nothing of great consequence. Some pass on papers to students or post-docs (e.g., as exercises). They may remove the author's name and affiliation, but the content of the manuscript is still confidential and should not be shown to anyone else or copied. What if your student copies something from that paper without your knowledge, and it eventually ends up in a proposal or paper--where it is later recognized by the original author?
Sometimes you hear about reviewers asking journals for permission to have their post-docs or students review a manuscript (or perhaps even doing so without asking). I don't think this is a good idea either. If you don't have time to do a review, then decline and provide a list of people who could substitute for you (you can suggest your post-doc); the journal can then decide whether the person you recommend is a suitable reviewer, and the review will occur without your direct involvement.
The ethical situation described above and the hypothetical actions of the PI (#3) illustrate how easy it is to get into deep trouble by failing to take the time to consider the consequences of one's actions. I can easily imagine a PI making such a decision hastily and without thinking--with no real malicious intent to injure the author. However, such a decision commits at least two ethical transgressions--passing along confidential information without permission and putting another person in a tenuous and potentially culpable position (plagiarism would be the third, if they take the final step and publish). The PI's moral obligation to "help" his post-doc clouds the larger ethical issues, and in the end, his action could instead seriously harm his post-doc's reputation and career.
Perhaps you have done something similar to this--we all make mistakes at some point, especially when we are inexperienced or under pressure. Most people, though, would likely have a nagging feeling in their gut that such an action is wrong. If you have this feeling about something you are facing or that someone else is telling you, pay attention. Your gut is probably right.
Image Credit (modified from http://www.uwo.ca/nca/education/images/student_3.jpg)
Friday, October 8, 2010
A Difficult Decision
Did you guess which one was the real situation? It was number 3, which I've reproduced below:
Beth is an assistant professor and has an undergraduate student worker (a senior) who is discovered to have been falsifying his time sheets. Let's say that Beth's lab technician has reported this to her. Beth contacts the head of student affairs for guidance. She is told that, if the student is guilty, his actions are considered by the university to be a crime and the case will be turned over to the campus police; the student will also be expelled. Let's say Beth is reasonably certain that at least a portion of the time he claimed has been falsified. Reporting this student will lead to his possible arrest and prosecution and will definitely put an end to his academic pursuits. She is hesitant to cause this student to be arrested and expelled. What if she just fires the student but does not report him to the authorities? Is that ethical or unethical? What would you do?
I won't reveal what my role was (except to say that I wasn't the student!). Here is what actually transpired and the reasoning behind the decisions. Beth went to great effort to document the falsified time claimed by the student. She compared all work hours against the student's course schedule, and found many instances in which he was in class (confirmed by the course instructors) when he claimed to have been working. She also found that work hours were claimed for times when the lab was closed (for holidays, etc.) and determined that the student could not have been there (confirmed by graduate students who were working after hours). All of this information was carefully documented. Beth also determined that the approval signatures on some of the time sheets were forged.
Beth and the technician decided to confront the student, who, when shown the evidence, confessed to both the falsified time and the forgeries. Although Beth felt some reluctance about turning in the student (because of the severe consequences), she ultimately decided that she had no choice. She turned over her documentation, including signed statements by her and the technician describing what had transpired during their confrontation with the student, to the university business affairs office and to student affairs. The case was given to the campus police, who proceeded to arrest the student. Beth had advised the student to pay back the funds, which he did; the prosecutor consequently decided not to pursue the case, and the criminal charges were dropped. However, the student was expelled in his senior year.
If Beth had failed to report the student's theft (that's what it was), she could have found herself in trouble with the authorities. Once she was informed of the theft by the technician, she had no choice but to investigate and then act on her findings. In fact, the funding that paid the student came from a Federal grant, a situation that might have triggered an even bigger investigation if there had been any attempt at a cover-up. Regardless of the funding source, Beth was obligated to report the misuse of funds and to try to recover them. Beth may have felt sympathy for the student, but the student was solely responsible for his actions and, moreover, was clearly aware that what he was doing was wrong (hence the forgeries). Furthermore, if Beth had simply fired him, he likely would have gone on to work for someone else at the university or elsewhere, possibly repeating this crime. Reporting the student to the authorities may have been painful for Beth, but failing to report him--and thus to stop him from doing further harm--would have been unethical.
Image Credit (Modified from http://www.onlinedegreesaccredited.net/wp-content/uploads/2009/09/medical-lab.jpg and http://www.writeshop.com/blog/wp-content/uploads/2010/04/Teen_boy_writing.jpg)
Monday, October 4, 2010
Swimming with Sharks
When we start out in science, there are many things that will influence our careers. We are aware of all the obvious technical skills that must be mastered in order to succeed in our particular science field, but less aware of other challenges that may be our biggest obstacles. In the previous post, I talked about social anxiety and how this might greatly affect one's ability to function professionally.
Another area that we don't think about much in the beginning and may be neglected in academic programs is ethics (and dealing with unethical people).
In every profession, there are people who try to get ahead by engaging in unethical behavior. This type of behavior seems to be exacerbated in situations where competition is intense. When resources are scarce or where there are "territorial" issues, people who cannot prevail based on their skills may resort to under-handed measures. My sense is that there are only a small number of "true sharks" in science fields--people who have no scruples and will stop at nothing to get what they want. They exist, but do not predominate. I don't think scientists are more ethical than the average person, just that the field does not particularly attract people who are unethical. At the opposite end of the spectrum are people who cannot be tempted under any circumstance to make an unethical decision. Perhaps more common are those people who under normal circumstances would not do anything unethical, but when put under pressure will turn into sharks, i.e., they are "latent sharks". The breaking point obviously varies from person to person and with the situation.
Most of us tend to focus on the "true sharks" and what harm they might do to us and our careers. However, it's possible that the "latent sharks", who are more abundant, may be more likely to harm us. Or possibly the ethical choices we make ourselves, if wrong, can do far more damage to us than any deliberate act by someone else. You may be thinking that you would always know the right thing to do and would not make an ethical mistake, but are you sure?
That shark threatening your well-being may not be a person, but an ethical dilemma.
When we start out in our careers, we often don't realize the difficult ethical choices that we may face as scientists. I'm not talking here about falsifying data or other obviously fraudulent actions that normal people recognize as being wrong. We all know these are not only unethical, but absolutely not tolerated in science. No, I'm talking about situations that scientists (and other professionals) face, but in which some people might find it difficult (under pressure) to do the right thing. Perhaps even more challenging are those situations in which the correct response is not always clear.
Here are a few examples to get us thinking about these ideas:
1. Mary is an assistant professor who receives a research proposal for review that focuses on the exact same questions she is currently pursuing. The proposal describes a unique approach that is far superior to what she has been using in her own project and addresses a key issue that has been a stumbling block for her. Imagine further that she has not been as successful as she needs to be and will not get tenure unless she publishes more and gets a decent sized grant--soon. She's already invested all her time and start-up funds on this research question, and it's too late to start over. Solving her methodological problem would clear the way for her research to take off. Let's further say that the proposal author already has a lot of funding (information revealed in the current and pending support). Mary rationalizes that she would have eventually come up with this technique and decides to use it in her own research. She also decides that the proposal author already has had more than his share of funding and won't be hurt if he doesn't get this grant. She gives it a "good", rather than "excellent" score, knowing that this will probably sink it and perhaps give her some time to implement the new approach.
Would you find it difficult, if you were in Mary's situation, to do the right thing? Is taking another scientist's ideas a form of plagiarism? Was there ever any possibility that Mary could have provided an unbiased review of this proposal? What, if any, are the possible negative repercussions of Mary's actions (for her)?
2. Here's a variation on the first example. Cynthia is an ambitious post-doc having a problem with one of her laboratory techniques. She's been trying for weeks to resolve this, but has been unsuccessful. She is at her wit's end and finally goes to her PI to ask for help. After listening to her tale of woe, he tells Cynthia not to worry--that he'll have a solution for her tomorrow. The next day, she finds a manuscript on her desk with a note from the PI. It says, "Check out the methods section...it has the solution to your problem. However, don't make a copy of this or give it to anyone else. Also, don't tell anyone that I gave this to you." She tries the method described in the paper and lo and behold, it works! When she excitedly reports this to her PI, she says, "I've scoured the literature and can find no mention of this technique. Where did you get this paper?" The PI smiles mysteriously and says, "I've got dozens of them in my files."
Cynthia's PI was trying to help her by giving her this method, but is what he did unethical? What if he won't tell her the source (author) of the paper? If she uses this method and describes it in her paper without acknowledging the source, is this unethical? What might happen to the two of them if others find out?
3. Here's a very different dilemma. Beth is an assistant professor and has an undergraduate student worker who is discovered to have been falsifying his time sheets. Let's say that Beth's lab technician has reported this to her. Beth contacts the head of student affairs for guidance. She is told that, if the student is guilty, his actions are considered by the university to be a crime and the case will be turned over to the campus police; the student will also be expelled. Let's say Beth is reasonably certain that at least a portion of the time he claimed has been falsified. Reporting this student will lead to his possible arrest and prosecution and will definitely put an end to his academic pursuits. She is hesitant to cause this student to be arrested and expelled. What if she just fires the student but does not report him to the authorities? Is that ethical or unethical? What would you do?
One of the above examples is real, the others are fiction. In the next post, I'll describe the outcome of the real example and try to explain what might happen in the fictional scenarios.
Image Credit (modified from http://scrapetv.com/News/News%20Pages/usa/images-4/great-white-shark-2.jpg)
Friday, October 1, 2010
Socially-Inept Scientists
I'll come back to the Blue Ocean Strategy a bit later, but I thought I would say a few more words about social interactions. A few commenters have mentioned that they really appreciate advice about social situations. For example, I talked in a previous post about how to approach a Famous Scientist at a conference mixer. People who are naturally comfortable in social situations perhaps think everyone is like this and do not realize how awkward some of our colleagues feel when they venture outside their laboratories. Those of us who have learned through experience how to navigate socially are aware of this, but soon forget what it feels like to be in an awkward social situation.
As a young woman, I suffered from social anxiety--big time. I could barely bring myself to speak in front of more than one other person. And if someone intimidating was present--a Famous Scientist or someone in authority--I was paralyzed. It took many years of forcing myself to learn how to interact with people and to shed my self-consciousness, but I finally overcame this problem. It held me back not only socially but also professionally. I rationalized that it did not matter--that I could do my research alone or with close collaborators and would succeed. I did not realize how crucial this ability was to my career development. Only after overcoming this social anxiety (to an extent) did I realize what an impediment it was.
I'm still not a social butterfly and people probably do not view me as someone they would really like to get to know, but I am now comfortable in social situations. Even if others don't feel totally at ease with me, I feel comfortable talking to strangers or just standing or dining alone. It just doesn't bother me any more. At conference mixers, I often look around the room for someone who is alone and looking uncomfortable. I will strike up a conversation with them and try to make them feel more comfortable.
At most of the conferences I typically attend, I know a lot of people and am fairly well-known myself. I am approached frequently by students who have read my papers and want to meet me. However, I just recently attended a conference where I did not know many of the attendees, and the conference focus was somewhat outside my field (so no one had heard of me or my work). This was a small conference--about 150 people. I chatted with the one or two people I knew (distant acquaintances).
Most of the other attendees seemed to know everyone else and naturally congregated in animated groups during breaks. It would have been very difficult to approach one of these groups as a lone stranger. There were very few people standing around alone--as I was. I decided this was an interesting situation--one in which I was a total stranger to most of the people--and decided to do a little experiment.
During the session coffee breaks, I stood by myself to see if anyone would spontaneously start up a conversation with me. When people passed by, I would smile or nod, but not initiate a conversation myself. The first day, no one approached me--despite the obvious fact that I was alone and knew few people there. Then, on the second day, I gave my presentation (in the plenary session), which was something of a departure from the other talks. After this, people began approaching me during breaks. Some had questions about my work. Others just seemed to feel more comfortable about approaching me since I had been "introduced" via my talk. In one case, I was invited to come give a seminar later in the year.
There are a couple of lessons here for the socially-disadvantaged. One lesson is that no one is going to come to your rescue in a semi-social setting like a conference mixer. Part of the reason is that people want to feel comfortable, and talking to strangers is usually not comfortable--especially if you have to make the first move. Another reason is that people are there to make important contacts and to make themselves known to potential advisers or employers. They don't have time to waste on someone they view as being "unimportant" to them.
If you want to meet people, you have to make the first move. The experience I described above showed that people only felt comfortable approaching me (a stranger) after 1) they became aware of me, 2) had been "introduced" to me via my talk, 3) had something specific to discuss with me, and/or 4) saw me as someone important to meet.
The second lesson is that if you give an oral presentation, you become "known" to other people, and they feel more inclined to approach you in a professional or semi-social setting. They may be interested in your work or impressed with how you delivered your talk. If someone comes up to you after your talk and compliments you, try to start a conversation. Don't just say "Thanks" and then turn away, tongue-tied. You might ask what they enjoyed most about your presentation or whether they have any questions. Mention some aspect that you thought might have been unclear and ask for an opinion. Always ask whether they do similar work and invite them to tell you about it.
The key to engaging people is to get them talking about themselves.
A final point is to realize that scientists as a group tend to be more socially inept than other groups. So the chances that someone else will rescue you from a socially awkward situation are much lower at a gathering of scientists. The motives behind people's behavior at a professional gathering are also different from those in a purely social setting. It's important to be aware of these distinctions when planning your strategy. The lesson here is that you have to change your own behavior instead of waiting for others to change their behavior toward you.
How do you begin to change if you are really paralyzed in social or professional settings? What ways might you meet people at conferences and other gatherings of scientists?
One easy and relatively painless way to meet people is during the poster sessions. There are lots of people standing by their posters expecting (hoping) that others will approach them. It's very awkward for poster presenters to stand there waiting for someone to stop by, so they will often be relieved when someone comes along and starts a conversation. You also have lots of opportunities to meet many people--especially people doing work in your field. However, I've found it's sometimes easier to talk to people who work on topics I know little about. Confiding to the poster presenter that you don't know anything about their field puts them at ease. Students and young scientists are especially afraid that some expert is going to come along and ask them a question they can't answer or disparage their work. So they will be especially open to someone who knows little about their topic. Ask them to explain their work to you (you can say you've always been fascinated with the topic, but that it is outside your field). By doing so, you put them in the role of expert and yourself in the role of interested listener. Few people can resist an opportunity to be looked upon as the more knowledgeable party in a conversation. You must be sincere, of course. If you are not, people will see right through you.
You can also approach speakers after their presentations, but this is sometimes difficult if they are surrounded by other people who also want to talk to them. However, in every session there will be the "stars" who are immediately surrounded during the break and the "unknowns" who won't be so tied up. Approach the unknowns and ask them a question about their talk. They'll be grateful to you. After you gain some practice, you might try approaching a Famous Scientist after their talk or at the coffee break.
I also make a point of complimenting students who have given especially good talks. I do this both for students I know and for students I do not know or who are in other fields. When I was a student, it would have meant a great deal to me to have an established scientist compliment me, so I know it has an effect on their self-esteem. You can do this too--even as a student. Compliment other students or even established scientists. I guarantee that even the most Famous Scientist will be pleased if someone comes up after their talk and says how much they enjoyed it.
Finally, don't get discouraged if you get rebuffed initially. You are learning a very difficult skill. It's to be expected that you'll make mistakes at first and that it may take some experience before you become successful.