With all the travelers to the San Francisco Bay Area, especially from Asia, I was surprised it wasn't one of the first places in the US to diagnose an infected person. I am encouraged that the US is responding faster than during earlier outbreaks, such as ^SARS^ (2002-2003), ^MERS^ (2012-2013), and Ebola (2014-2016). I am also somewhat scared by what this quicker response implies about the danger of Coronavirus. While SFO is one of the ^20 airports screening for Coronavirus^, neither San Jose (SJC) nor Oakland (OAK) is on the list.
Much of what is currently known about the Coronavirus is likely false -- that is, partially true but treated as entirely true. The portions that are wrong may lead to people being unnecessarily exposed to the disease and to people who are infected not getting timely treatment. My intent here is to encourage a healthy skepticism of official pronouncements by presenting situations from previous outbreaks. This may also help you infer non-malicious motives when officials (inevitably) get caught in a falsehood. Also, I anticipate that there will be inquiries to the City government about what it can do now and what additionally it should be preparing for.
----You can't handle the truth!----
A consistent concern of officials during an emergency or disaster is public panic, and this shapes their narrative. However, research dating back to the 1950s has consistently found the contrary, and those findings have been prominently publicized. So why does this belief persist? My speculation is that officials use a different criterion for "panic": the public not following their directions, even when that noncompliance stems from the public having lost faith and confidence in officials. For example, officials, fearing panic, release false information of the category "Who are you going to believe? Me or your own lying eyes." For the public, it is rational not to follow instructions predicated on falsities, whereas officials interpret this as "panic".
----Officials have false certainty: They believe they know things that they should realize they don't know----
In disease outbreak after outbreak, I have seen officials repeatedly make statements that start with a disclaimer about how little is currently known about the disease, and then announce measures that fail to accommodate those uncertainties. It is as if they are oblivious to the patterns of discovery during similar outbreaks.
Example: During the early days of the AIDS epidemic, the public was repeatedly told that HIV could not be transmitted by X, only to be told "Whoops. It is transmitted by X," sometimes with both statements made at the same conference. Yes, this is the way that science works: hyping claims and confidence in results in order to get published and win more funding, followed by "refinements" (corrections). But it is definitely not the way that public policy is supposed to work -- policy decisions involve risk management and tradeoffs, including the knowns versus the unknowns.
Similarly, during and after the ^2014-2016 Ebola epidemic in Guinea and nearby countries^, the list of ways that the virus could be transmitted expanded greatly. Initially, the virus was said to be very short-lived outside a living host, transmitting through "bodily fluids" such as blood, sweat, mucus, and saliva. Later it was discovered to be able to survive for days on many surfaces in a hospital environment, including stainless steel and protective suits (Tyvek).(foot#1)
----Lying with statistics----
Seeking to reassure the public (preventing "panic"? see above), officials produce statistics that distort reality by various means. For example, when the first case of Ebola in the US was diagnosed, a statistic was produced claiming that you had an absurdly minuscule chance of getting Ebola while in public. It turned out the calculation was based on there being a single infectious person in Dallas (CORRECTION: originally "New Orleans"), while treating the whole population of the US as having an equal chance of getting the disease. That's right: a person sitting on a Dallas bus next to an infectious person was portrayed as at the same risk as someone living in Fort Yukon, Alaska (^map^).
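To see how the choice of denominator drives such a statistic, here is a minimal sketch of the two calculations. All population figures below are round, purely illustrative assumptions, not real epidemiological data:

```python
# Illustrative sketch: how averaging a localized risk over the whole
# country produces an absurdly reassuring statistic.
# All numbers are round assumptions, not real epidemiological data.

US_POPULATION = 330_000_000    # rough US population
DALLAS_POPULATION = 1_300_000  # rough city-of-Dallas population
INFECTIOUS_PEOPLE = 1          # the single diagnosed case

# The "reassuring" version: everyone in the US treated as equally exposed.
naive_risk = INFECTIOUS_PEOPLE / US_POPULATION

# A still-crude localized version: only Dallas residents share the exposure.
local_risk = INFECTIOUS_PEOPLE / DALLAS_POPULATION

understatement = local_risk / naive_risk
print(f"naive nationwide 'risk':  1 in {US_POPULATION:,}")
print(f"crude Dallas-only 'risk': 1 in {DALLAS_POPULATION:,}")
print(f"the nationwide average understates Dallas risk ~{understatement:.0f}x")
```

Even the localized figure is misleading, since risk concentrates further among the patient's actual contacts; the point is only that whoever picks the denominator picks the headline number.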
Another example of "reassurance" from the Ebola epidemic: We were told that you couldn't catch it from someone you were sitting next to on a plane, with no explanation of why this might be true. Anyone who has traveled in "cattle class" knows that skin-to-skin contact is often unavoidable, as is touching things that someone else has just touched. And you can get stuck next to someone who is sneezing and coughing because there is no empty seat to move to. For the common flu, the "effective range" of coughs and sneezes is about 6 feet.
Add in the estimate that 20% of contagious people were asymptomatic. Screening passengers for symptoms before boarding was a useful measure, but it was no guarantee with a disease that could progress from asymptomatic to highly contagious in the time the plane was in the air, such as on a flight from West Africa to the US or across the US.
----Optimism gets promoted----
Research on organizational behavior repeatedly finds that optimistic people are much more likely to be promoted than those with more realistic viewpoints. Managers like good news and those who bring it. Those who point out potential or developing problems often get dismissed with "Bring me solutions, not problems!" In the classic cautionary tale, an expert renders a decidedly negative assessment of a proposal, and that assessment gets progressively softened as it passes up through the levels of management until it crosses the threshold into a positive assessment, finally emerging as an enthusiastic endorsement.
Top management may think the organization is well-prepared for an outbreak when the reverse is true. For example, well into the Ebola epidemic, a nurse who had been working with infected people in Africa flew into Newark airport. Officials wanted to play it safe by temporarily isolating/quarantining her, but there was no facility -- they had to quickly cobble together one. This at one of the US's top-20 international airports over 10 years after SARS. Unbelievable.
When a person with Ebola symptoms showed up at a Dallas hospital, he was misdiagnosed and sent home. While the intake nurse had entered into his online record that he had just arrived from West Africa, the doctor had failed to notice the comment. Reportedly, instructions/advice issued by the CDC hadn't taken this common and well-known error into account. Attaching bracelets with key information to the patients themselves was already an established practice in many -- not all -- medical facilities.
When the patient was subsequently put in treatment, two of the nurses became infected, despite being trained in the use of the bio-isolation suits. Turned out that getting safely out of the suit was very difficult and the training had missed some minor but crucial techniques. With only one active case in the US, wouldn't the CDC have sent someone with extensive experience with the suits to observe and mentor those nurses, both to help them and to see where the instructions might need to be improved? Uhh, no -- the CDC didn't.
----Scaling up has surprises----
The experience with the Ebola patient in Dallas drove home the extensive resources needed. My recollection is that when a detailed calculation was made, the estimate was that the whole of the US could provide that level of care to roughly 20 patients at a time. The US was lucky that we didn't have to test that limit. When one reads accounts of the 1918 Flu Pandemic and sees the pictures of hospital wards chock-full of patients, it is very sobering.
----The predictable limitations of bureaucracies----
Bureaucracies were created to address the problems of haphazard and inefficient administration. They are intended to provide consistent treatment of known problems and to develop new procedures when previously unknown problems are discovered. There is a 2x2 matrix in which the first of these is called the "known-knowns" and the second the "known-unknowns". The third matrix item is the "unknown-knowns", that is, information that is known but is inaccessible, rendering it effectively unknown -- for example, a web page that doesn't show up in a web search. Bureaucracies tend to be good at organizing information to minimize such occurrences.
That leaves the "unknown-unknowns", that is, what you don't know (realize) that you don't know. Bureaucracies tend to handle this category very badly, typically by forcing them into known categories.
In a major disease outbreak, we need a bureaucracy to handle the scale of the problem and the scaling up. But being designed for stability and consistency inhibits its ability to rapidly evolve and to be flexible enough to handle the uncertainties from the unknowns. That's right, we need a tall, thin, short, fat man.
In response to the SARS outbreak, the 2007 emergency preparedness exercise for our area had a scenario starting with numerous concert-goers becoming infected with a highly communicable disease and then infecting others on their way home and more the next day. One part of the exercise was to distribute medication, prioritizing those most at risk. The planners from the medical community treated this like a flu-shot clinic, with people lining up to receive their dose. They didn't recognize that those lines could increase the spread of the disease by placing infectious people among the as-yet uninfected. A better scheme was pointed out by the then-Chief of Police (Lynne Johnson): curbside distribution, with cars serving as isolation pods. Aside: Palo Alto didn't participate because, despite months of lead time, it was unable to be ready. This small part of the scenario was tested in a later exercise.
----Psychology: A current victim takes precedence over preventing future victims----
In the early days of the AIDS crisis in San Francisco, it was recognized that the bathhouses were a major factor in the spread of the disease. Yet closing them was fiercely resisted, and the delay cost untold lives. The psychology, sociology, and politics of this battle were complex and controversial, but if you are interested in the problems of quarantines, this could be an interesting variant. It is also off-topic here (too large and complex an issue).
During the Ebola epidemic, most of the medical personnel returning to the US quarantined themselves. However, there were some exceptions, most notably the nurse mentioned above. She rejected the need for quarantine and resisted it. The dominant narrative of the media was that since she had sacrificed to work in the Ebola zone, she should not be subjected to the inconvenience of the quarantine. One reporter went so far as to break the quarantine by going up to her and shaking her hand. The media's "logic" seemed to be that since there wasn't proof that she was infected, she wouldn't become infectious in the future. In effect, the media was rejecting the tradeoff that protecting the lives of the whole country from Ebola was worth the temporary inconvenience of a single individual.
A similar situation occurred after the ^Pulse Nightclub shooting in Orlando FL in 2016^. Because of unconscionable delays by the police, some of the wounded bled to death and the killer was given time to go back and kill others. As it turned out, the local blood banks were able to provide the needed supply. Not knowing this, many people laudably rushed to donate. When gay men were rejected under the then-current rules, there were protests that this represented discrimination and slander. However, the FDA's periodic review of the donation eligibility rules had occurred very recently (months earlier?) and had decided that although the technology for screening the blood itself had advanced considerably, it wasn't yet reliable enough to justify lifting the ban on sexually active gay men. Despite this, numerous advocates and prominent politicians -- including at least one currently seeking the Democratic Presidential nomination -- called for the ban to be immediately lifted (unnamed because I couldn't quickly find the supporting news articles).
Such is the state of our current political environment that powerful decision-makers would prioritize ^virtue signaling^ over protecting the nation's blood supply. This does not bode well.
----Summary and Disclaimer----
I am in no position to make predictions about the Coronavirus nor the current state of preparedness -- national and local -- for such a disease. My contribution here is intended to alert you to some of the relevant lessons from history so that you can be a better consumer of information about this disease. It can also provide some background for residents interested in participating in the City's emergency preparation programs and activities, but I have no contact information -- it may not yet exist.
----My other blogs on coronavirus (COVID-19)----
"Coronavirus (COVID-19): Underappreciated Unknowns & inexplicable failures", 2020-02-28.
"Preparing for COVID-19: An epidemic is not a hurricane. Panic buying harmful", 2020-03-03.
"COVID-19: Critiquing News Releases: What's missing + teachable opportunities", 2020-03-19.
1. ^Ebola Virus Lives for Days on Steel, Plastic^, MD Magazine, 2015-05-04. Reporting results from the US Centers for Disease Control and Prevention (CDC).
An ^abbreviated index by topic and chronologically^ is available.
----Boilerplate on Commenting----
The ^Guidelines^ for comments on this blog are different from those on Town Square Forums. I am attempting to foster more civility and substantive comments by deleting violations of the guidelines.
I am particularly strict about misrepresenting what others have said (me or other commenters). If I judge your comment as likely to provoke a response of "That is not what was said", do not be surprised to have it deleted. My primary goal is to avoid unnecessary and undesirable back-and-forth, but such misrepresentations also indicate that the author is unwilling/unable to participate in a meaningful, respectful conversation on the topic. A slur is not an argument. Neither are other forms of vilification of other participants.
If you behave like a ^Troll^, do not waste your time protesting when you get treated like one.