Podcast

#AskDifferent – the Podcast of the Einstein Foundation
#AskDifferent, the Einstein Foundation’s podcast series, offers a unique behind-the-scenes opportunity to learn more about the pioneering minds affiliated with and funded by the Foundation, and to find out how their outstanding careers were shaped both by chance and circumstance. What is it that drives them to ask differently, to perpetually ask new questions, and explore the world in all its detail?

How dangerous is fraud in science, Elisabeth Bik?

Portrait of Elisabeth Bik. Photo: Michel & Co.

#AskDifferent #39 – In this episode of #AskDifferent, Elisabeth Bik takes us behind the scenes of her work as a ‘science detective’, where she meticulously scans thousands of published research papers for signs of fraud – having uncovered manipulated and duplicated images in over 7,600 scientific papers. But what drives scientists to fake data, and how does this impact all of us? Tune in to learn about the challenges of detecting fraud and hear Elisabeth Bik’s compelling message on the urgent need for systemic change. For her tireless fight to preserve integrity in research, she was awarded the 2024 Einstein Foundation Award for Promoting Quality in Research.


Intro: I only find what we call the dumb fraudsters. The people who leave traces for us to find. So let's say you see a duplication between two panels, like an overlap. The sample was under a microscope, you take some photos, then the scientist moves the sample around and takes some more photos. If you find two photos that are overlapping, this could be just an honest error, or it could be deliberate. Sometimes we see six panels and four of them overlap; they're supposed to be six experiments, but they're all the same sample. There we assume: Okay, that is either extremely sloppy or it was done with an intention to mislead. AskDifferent, the podcast by the Einstein Foundation.

Marie Roeder: Hello, and welcome to today's episode. My name is Marie Roeder, and I'll be your host. Imagine this: A new study presents exciting results about a potential treatment for a common disease. The findings are published in a prestigious journal, gaining widespread attention. But years later, upon closer inspection, researchers realize that something doesn't add up. Some images appear manipulated, data points don't match, and the study turns out to be based on fraudulent science. This is not just a hypothetical scenario. Fraud in science undermines research integrity and shakes public trust. But how widespread is this problem? Luckily, I don't have to answer this question all alone. With me today is someone who has dedicated her career to uncovering research misconduct, a true science detective, Doctor Elisabeth Bik. Thank you so much for joining me today.

Elisabeth Bik: Glad to be here. 

Roeder: Doctor Bik, before we dive into the topic, I want to introduce you to our listeners. After earning your PhD in microbiology from Utrecht University in The Netherlands, you spent many years working in academic research before shifting your focus to scientific integrity. Through your meticulous analysis, you have uncovered thousands of problematic scientific papers. You were honored with the Einstein Foundation Individual Award last year in recognition of your relentless efforts in this field. So congratulations. 

Bik: Thank you. It's been a great honor.

Roeder: So let's start with your origin story. You began as a microbiologist, as I already said. You studied microbes, but now you spend your days investigating scientific misconduct. When did you realize that something was wrong in the world of research, and what made you take action?

Bik: Maybe I was a little bit naive about fraud in science. I always pictured that every scientist is very honest, and I'd heard a little bit about plagiarism or so, but, yeah, I had not realized that there is indeed fraud in science. And I guess we're all humans, so maybe that should not come as a surprise. But how I discovered this is that I heard about plagiarism and I just thought, let's take a sentence I have written and put it in between quotes in Google Scholar. And I found a hit: Somebody had copied a paragraph I had written into some online book chapter. So, it wasn't really a scientific paper. It was a book chapter, you know, a free book chapter that somebody could download. But it had two paragraphs stolen from me, and it also had paragraphs from other people. And I was very mad about that because it was my sentence. And I decided to work on plagiarism for a while, and I did it as a hobby. I was full-time employed at Stanford, so I did lots of research. But on the weekends, I was scanning for plagiarized texts. And by another coincidence, I found a PhD thesis that had duplicated images. They had photos that had been reused to represent different experiments. There were photos of Western blots, and one of the photos had been mirrored. So, it was the same photo, it was just flipped. And again, I was very mad about that, but I guess I have a talent for spotting these things. So, I decided to switch from plagiarism to image manipulation and image duplication, still fully employed. And a couple of years ago, I decided: Okay, I want to do this. I want to switch from microbiology to doing this type of work. I quit my job. That was a big risk, of course, because, you know, I didn't have an income anymore. And I thought maybe I can do this for a year, see if I can get by, and I can. So I'm now working as a consultant and do this type of work as my new job.

Roeder: So, what I understand from what you were just saying is that you look at pictures and check if they're manipulated. I mean, how does that work? Is this something anybody could do? I assume you have an extraordinary eye for detail, but how do you really go about checking those images?

Bik: When I started doing this work, I found these duplicated images. That was my first find. And it was only later that I also found manipulated images. Most of the problems I'm finding are duplications – so the same photo being reused. That can be exactly the same photo, or it can be rotated, flipped, mirrored, stretched, all those things. Or it can be manipulated, where within a photo you see duplicated elements. So, I do think I have some talent to do that, but I think other people see it as well. Most people will see it once you point it out. I usually draw colored boxes around it: this red box here is the same as that one in the other figure. And then people usually will see it.

Roeder: I mean, I'm sure there are mistakes in science, and some may be deliberate, and some might be just honest mistakes. How do you differentiate between those?

Bik: You cannot always differentiate between that, because I'm just looking at a paper, at the published images, and I don't know what the originals really look like. What happened in that lab? What is the photo that ended up in the lab notebook? Is that actually the correct photo, or has it been duplicated or manipulated? There are some rules we apply. So, when I find an image that is exactly the same photo being used twice – let's say you have one of those complicated panels where you have, I don't know, five by four photos, and two of the photos are identical – we usually assume that's an honest error, where people just had tons of photos and grabbed the wrong one. But when a photo has been manipulated, where within a photo there are duplications – you know, one cell is visible three or four times – we assume that was intentional. But it's not always clear. So, when we post these things online, or I talk about it on Twitter or Bluesky, I'll just say these images are of concern. We try not to put any stamp on whether this was honest or not. We just raise these issues, and we invite the authors to reply. And if they come up with: Oopsie, we made an error, here's the correct photo – it's all good. It would still probably require a correction if it's in a paper. But, yeah, if they don't reply, it's maybe sort of an admission of guilt. We don't know that. But, yeah, we try not to make any assumptions. We just raise concerns, post them on a website called PubPeer, and invite the authors to reply.

Roeder: Let's put a number on this. I have read that you have scanned more than 20,000 papers. How many manipulations did you find? 

Bik: I scanned 20,000 papers. This was a couple of years ago. Now I'm using software to help me find these things. At that time, when I did that scan of 20,000, I used just my eyes. So, I probably missed a lot of duplications. And, again, I'm scanning for duplications, either within a photo or between photos, but I cannot detect a very well done Photoshop job. If you're really good in Photoshop, I'm not going to catch you. So, we're really only catching the tip of the iceberg. In that set of 20,000 – that initial scan in 2015, which we published in 2016 – we found that four percent of papers contained duplications within the paper. We also look between papers now; I do that because I have software to help me. About half of those duplications we thought were deliberately done. One third of the set contained what looked like manipulation, where the same photo contained duplicated elements – let's say a duplicated cell, or, if it's a blot or gel, the same band visible twice. Those we assume could be misconduct, could be, you know, really an intention to mislead. Although, occasionally, we come across examples where the manipulation has been done by the journal – usually to remove a scale bar, or because the journal didn't like the fonts the authors used for their labels, like a, b, c, and wanted a different font. I would say roughly one third of the examples we're finding – so one third of four percent, like one point something percent – contains what we think is misconduct.

Roeder: That is quite a lot, wouldn't you say? 

Bik: Yeah. And it's an underestimate, because I only find what we call the dumb fraudsters, the people who leave traces for us to find. So, let's say you see a duplication between two panels, like an overlap. The sample was under a microscope, you take some photos, then the scientist moves the sample around and takes some more photos. If you find two photos that are overlapping, this could be just an honest error, or it could be deliberate. Sometimes we see six panels and four of them overlap; they're supposed to be six experiments, but they're all the same sample. There we assume: Okay, that is either extremely sloppy or it was done with an intention to mislead. They were very lazy. They didn't do six experiments. But if you would move the sample a little bit farther under the microscope, we would not find that overlap. Or if you're a good photoshopper, we would not catch you. So, it's really an underestimate of what we think is really going on. I think that the percentage of fraud in science papers has to be higher than one point something percent. It's probably in the five to ten percent range, which is shocking if you think about how many papers that would be. I have to say, these were all papers with images, with photos. We focused on molecular biology papers, because those usually have a lot of photos. But if you also think about bar graphs or line graphs or heat maps, things like that that are not photos, it is much harder to find fraud in those, because you just see a bar graph with some dots, and, you know, you can make one in Excel, in a spreadsheet, in five minutes without doing the experiment. So, it's very hard to know if that data is real. With a photo, at least, we tend to think: Okay, at least some experiment happened. And a lot of the duplications are just sloppiness. But we also find these labs where you see so much sloppiness that it borders on misconduct. Basically, it's a very fuzzy line to draw between, you know, what is an error and what is misconduct. So, these estimates are always very hard to make.

Roeder: I see. And what does the publication of manipulated images mean for science as a whole? I was saying before that it probably affects trust in research, doesn't it?

Bik: It is very devastating for science on many different levels. Scientists try to build on each other's work. So, if a paper contains misconduct, that could lead a whole bunch of other scientists down a dead-end street. If they try to replicate the results, they might not be able to. So that's a lot of wasted time and effort for scientists. Fake results for a particular patient treatment could also end up in a meta-analysis and actually skew all the results in the wrong direction. So it could even sometimes end up in a public health policy based on fraudulent results. And that is wrong. It's also devastating for science as a field, because we talk about science fraud, and so maybe your listeners might think, well, maybe all science is fraud. And that is not the case. Even if it's ten percent, which is perhaps an overestimate – even if it's five percent – that is obviously very bad, and a lot of taxpayers' money and patients' trust is wasted. But there's still a lot of good science. Fraud is still pretty rare. And we should still believe in science, and we should still know that science is the way out of all the problems we're dealing with, like pandemics and pollution and climate change. We need science to solve these problems. Yes, there is fraud in science, like in banking or construction, but we should focus on the good science, obviously, and fight the bad science. It's not that all science is fraudulent. And I want to stress that, because it's very easy to talk about science fraud and come to that wrong conclusion.

Roeder: And from the point of view of those scientists who manipulate those images: What do you think are the main drivers of this problem? I mean, why do they do that?

Bik: Roughly, you can summarize it as: As scientists, we need to publish. That is what we're held accountable for. The more you publish, the better your career. And there are different scenarios that could lead people to commit misconduct. I think the most common one I've seen is where there's just pressure from above. Let's say a young early-career scientist is in a lab with a very bullying, demanding professor. Maybe the grad student is on a visa, and they know that they will lose their visa and their chance to work if they are fired. And so they will try to get out of the lab with a paper, with results, and they will try to please the professor. They will give them the results that the professor wants, and that could lead to misconduct. Very often, if we find misconduct, the whole lab seems to be infected, if you will, with that type of behavior: you see paper after paper after paper, with different first authors, different grad students, different postdocs, all seemingly doing some type of faking the results. And we think those are labs where the PI sort of creates this culture of “it doesn't really matter what you do as long as you give me the results I want”. So we see that, yeah, where young people just cannot push back because they would lose their job. Because of the strong hierarchy in science, some labs have this culture. And luckily, this is again still rare, but we've seen some of these cases where we scan hundreds of papers and find dozens of examples of visible photo manipulation.

Roeder: What really needs to change for this problem to stop being a problem? How does the academic system really need to change?

Bik: Well, it's easy to say we shouldn't focus on the number of publications a scientist has, but we still do that. We look at a resume with 10 papers published in Science or Nature very differently than a resume with only a couple of papers published in a low-impact-factor journal. So, we do make these assumptions about the quality of a scientist. And as long as we focus on metrics, we're going to have people trying to cheat the metrics, trying to boost the number of their citations or the impact factor. And we see a whole industry built on that. We see these so-called “paper mills”, where even complete journals are infected with what we believe are all fake papers. We should focus less on these metrics, but that's easier said than done, because it's so easy to focus on them. And on the other hand, I feel there need to be a little bit more consequences for people who are caught cheating. Unfortunately, we've seen these big cases where we found researchers in Europe, in the US, in Asia who have lots of papers with problematic results, and nothing seems to happen. We science sleuths, we detectives, we find these things, we report them, and journals don't seem to really want to retract those papers. They're very slow, and institutions don't seem to fire these people. These people are still employed. They still produce fraud, just less visibly: suddenly, after you report them, all their papers are completely clean. There's this imbalance, almost, where the rewards for science fraud are high, because you have a better career and a better paper, but the consequences are very low. And I think in that climate, people are going to take risks. They see other people commit fraud, they see that those people will have a better resume, and then they start to commit fraud themselves. I usually use the analogy of the Tour de France, you know, a bike race where at some point a lot of people were using doping, and people were not winning the race because they were the best athlete, but because they used the best doping, I guess. And if you see other people use doping and win the race, you're going to be tempted to do that yourself too, because otherwise you're going to lose. At some point, it was decided: Okay, we want to keep the sport a real sport, not about doping. And there were doping checks, and people got caught doping. And I think that in the end makes the sport a better sport, because now it's focusing on the athletes. But in science, we haven't gotten to that point yet. We're still at the point where everybody uses doping, i.e. fraud, to reach that goal, and we don't seem to care too much about it. I care about it, but I think more people should, especially publishers and institutions. Research institutions should focus more not just on preventing fraud, but also on having consequences for fraud once people get caught.

Roeder: And you post your findings on X, formerly Twitter, and other platforms. How do people react to what you're doing? 

Bik: I switched from X to Bluesky. So, a little plug there: I'm on Bluesky as Elisabeth Bik. I make it into a game, sort of a puzzle, if you will. I post these under the hashtag #ImageForensics, and I post a little challenge. I post two images or some figure from a scientific paper with duplications, usually with overlaps. And I ask people: “Well, can you spot the overlap?” And you can win an emoji award. So secretly, I've been training everybody to find these things better, so that if they ever peer review a paper or read a paper, they're trained to find these things. That's my secret mission. Usually, I don't disclose which paper it is from. I'll just present the image. If you're smart, of course, you can perhaps look at the labels and find out which paper it was. So, I'm not really trying to hide it, but I don't want to focus on the authors or the paper itself. I want to focus on the problems in the images. I also play that game when I give a talk – I'll just ask: raise your hand if you see it. And people seem to like it and appreciate that, because I don't try to bash the authors. I do make exceptions, because sometimes it's so bad you have to sort of laugh and say: Okay, here's the paper, go cut it into pieces and spit it out, because this is just horrible. But I try not to send my followers into harassing authors, because there are multiple authors on a paper, and you never know which of the authors was the guilty one.

Roeder: Have you ever gotten criticism from the authors you exposed? I mean, I'm sure it's hard to admit that you have made a mistake or even manipulated something.

Bik: Yes. You know, not everybody likes to be criticized. Probably nobody likes to be criticized. So, yes, I have made some authors very mad at me. And there were several instances in the past couple of years where I have been harassed by the fans of those authors. One case was a biotech company where the stockholders were not happy with my criticism of papers associated with the company. Another one was a professor in France who really didn't like my criticism and called me all kinds of names. So very often these people will make it very personal. They will harass me. They will talk about the way I look, the way I talk, what I've done in my past. I have indeed worked a couple of years for a company that turned out to be fraudulent. That doesn't make me fraudulent, I hope. But, yeah, there was fraud in the company I worked for. So, I've made quite some people mad. In all those cases, it turned out that papers were retracted or received expressions of concern. You know, in the end, I just know I have every reason to be concerned about the papers. And the fact that they make it about me, that they do these personal attacks, is horrible: I will have sleepless nights, I will eat a whole bag of cookies, because that's how I deal with stress. In the end, it's about the science, and I just feel very strongly that I'm right. That sounds very horrible to say, but I will only raise concerns if I'm really concerned. And if the authors don't have any scientific answers but instead try to attack me, then I feel maybe I did raise a good concern. Because: just show me the original pictures and tell me I'm wrong. I'm happy to be told I'm wrong, but, yeah, they never do that. So, it encourages me to dig even deeper.

Roeder: So, it's really seeking the truth that motivates you, despite all of this criticism?

Bik: It does. It's knowing that some science is bad and that we need to call that out. And, yeah, like you said, science is about finding the truth, and cheating is not science. Therefore, it goes against everything that science should be. And we need to call that out, because we always say that science is self-correcting, but it's not really – at least not because of what journals or peer reviewers do. Very often, it's up to a little group of people like me who scan the literature for all kinds of problems, call them out, and get those papers retracted. But it's hard if you are an active scientist, if you are working in that hierarchy where your boss can fire you, where your boss, your PI, is very important for your career. It's very hard if you're an early-career scientist to call out problems. You'll often be seen as a troublemaker. But I'm in the more comfortable position that I'm older, I got my PhD, and I don't even have a job – I quit my job – so hopefully nobody can really harm me in terms of writing a bad letter of recommendation or trying to ruin my career, which people definitely have tried. But I don't really have a career, I guess. So I'm in a position where I can call out people; I'll take the horrible insults and the harassment. But for younger people it's very hard to do that. So, I try to be their voice. A lot of people send me concerns about papers or about labs. I'll check whether I can see something visible, and I'll try to post that. So I work off of tips, and I try to be the voice of those people who have similar concerns but are not in a position to call out bad science.

Roeder: Well, I would say your career is really being a science detective. Is there any message you want to send to the scientific community about integrity and transparency?

Bik: Well, if you are an early-career scientist and you are in a lab where your PI tells you, “We all cheat in science. This is how we all do it” – don't believe that. You should be honest and careful about your experiments. You should take the time to label everything correctly. If people try to rush you, or pressure you to report different results than you actually obtained, don't do that. That is not normal. I hope you can get out of that lab and start somewhere else, where hopefully things are better. And, yeah, don't fall for the temptation to cheat, because eventually it will come out. It's much better to do science a little bit slower and do it rigorously, conscientiously, with integrity, than to try to rush results or massage them and change them. Science should be about finding the truth, and other people might rely on your work.

Roeder: Thank you so much for taking the time to speak with me today, and thank you to our listeners. If you like this podcast, please feel free to share. My name is Marie Roeder, and I'm looking forward to the next episode of AskDifferent. AskDifferent, the podcast by the Einstein Foundation.