Essay
Unleashing viral apocalypse
Just how important is health and safety?
14 July 2014
www.lablit.com/article/830

Warning: lab security can matter
We all know people who only don a lab coat for the annual safety inspection. Experience is no proof against hubris. It breeds it.
I still find it hard to believe. The US Centers for Disease Control and Prevention (CDC) have closed the labs handling anthrax and pathogenic avian flu (H5N1) following the revelation that both agents had been sent accidentally to lower-security labs, potentially or actually exposing unwitting staff to infection. Even worse, vials of smallpox turned up when an ancient freezer was being cleaned out. The virus was found to be viable.
At the same time, researchers conducting so-called gain-of-function (GOF) experiments with influenza (in other words, trying to see what factors can increase the transmissibility of avian flu viruses in mammals, in order to understand them better) have found themselves the focus of attention, and not the right kind, after concerns were raised that, rather than protecting against the next pandemic, such work might produce it.
All of these issues touch upon the awkward subject of laboratory safety. And I have been reflecting on how I feel about it, changing some of my views in the process. My credentials: I’m an infectious disease epidemiologist who has worked on bacteria and viruses. Labs are categorized as Biosafety Level (BSL) 1 to 4, and I’ve worked in all except BSL4, the highest level of containment. I’ve even been a departmental safety officer. I’m also of a quantitative frame of mind, which means I do ‘dry’ work as well. Twenty years ago, while still an undergraduate, I worked in the same department as Yoshi Kawaoka, now a prominent practitioner of GOF research. And (full disclosure) I am now in the same department as Marc Lipsitch, one of GOF’s foremost critics.
But right now I am not so interested in the rights or wrongs of the GOF program. I’m more focused on lab safety and our attitude to it.
My concern is that scientists’ attitude to safety is bound up with a bunch of other stuff that has little to do with the objective assessment of risk. When you are starting out in a lab, there is an undeniable thrill when you first come across the stuff that you know is dangerous. The pathogen. The radiation. Hell, even the liquid nitrogen or dry ice. You become habituated to them. When talking to people outside the field, there is a tendency to be blasé while at the same time exaggerating the risks you take. I’ve heard more than one student bragging about the ‘deadly pathogens’ they work with. And it has to be said that while Neisseria meningitidis can kill in hours, it’s not exactly Ebola.
The geeky part of us does not respond rationally to these things. When I told a student recently that I might have the opportunity to tour a BSL4 facility, her response was “cool!”, and I grinned in agreement. But on reflection this attitude is bizarre. You just need such a facility to do your job safely. It’s like the ear protectors on a guy running a pneumatic drill – necessary for the safe conduct of a job. Not really cool.
And as for the blasé bit, we all know people who only don a lab coat for the annual safety inspection. Experience is no proof against hubris. It breeds it. Like expert motorcyclists who think helmets are only for riders so bad they expect to fall off, people who get more experienced in the lab alter their attitude and begin to cut corners.
The good ones, at least, always do it rationally. Based on their skill and judgment they know the thing they are pipetting won’t harm them if it spills on their skin.
Then there are the bad ones. The ones who contaminate the whole lab other than the hot bench with radioactive phosphorus-32 because they “didn’t want to get the hot bench dirty”.
Of course the bad ones aren’t present in the labs handling the most dangerous material, are they?
One of the best books I’ve read in recent years is Thinking, Fast and Slow by Daniel Kahneman. Kahneman, a psychologist who won the Nobel Memorial Prize in Economics, has spent decades showing why what we think is obviously true is often nothing of the sort, and one of his most bracing findings is the incompetence of experts. Not in all circumstances, for sure. If there is some genuine weird shit going down, an expert may be in the best position to spot it. But most of the time, it’s not weird shit. It’s ordinary everyday shit. But the expert still tends to see the weirdness, or at least waste time considering it. The knowledge that they are experts means they fail to consider the run-of-the-mill circumstances that apply most of the time.
I once worked in a lab that contained a freezer holding DNA from Burkholderia pseudomallei, which, for not awfully good reasons, is designated a potential bioweapon. This was not the live organism, which is a common soil saprophyte in South East Asia, but its DNA. The genome had already been determined and was online. But the freezer still had to be under lock and key, behind a locked door in a corridor with swipe-card access, which itself was beyond two other doors accessible only with a swipe card. I was annoyed by this. It struck me as dumb in the extreme. How could I take a regulatory body seriously if they thought DNA could cause disease?
I now think differently. Yes, there is a tension between the people who make the rules and the people who have to implement them. Yes, the decisions handed down by non-experts, in their sheer ignorance, may make an expert furious. But for all the evident flaws, the alternative is worse. This is because the alternative is decisions made by an ever-smaller pool of experts, who are increasingly invested in the outcome of their deliberations, and who are likely to have an increasingly poor conception of what will really happen once their brilliant safe plan of research collides with the real world.
That’s what scares me when I hear a scientist declare that their lab is safe, unlike the other ones. Everybody thinks their lab is the safest. Until one day it isn’t.