Category Archives: Medicine and Health

And the beat goes on

“Time, said Heraclitus, is like a river that flows endlessly through the universe; you cannot step into the same river twice.”

Darn good thing too – can’t think of anyone who’d want a repeat of the last few years. Like most people I rather went through the last years in a bit of a daze, but here we are, still standing. Or lurching as the case may be.

Nonetheless I suspect there were those who rather enjoyed themselves (oh, you know, virologists, epidemiologists, biostatisticians, Pfizer…).

OK, maybe on an individual level it was somewhat subconscious but let’s face it, these people are human and it probably was a bit of a treat having their every utterance and half-baked hypothesis treated with reverence and quoted on the news. This may explain why there are plans afoot looking forward to the next great plague. We barely made it past the oh-no-we’re-all-going-to-die stage (not to mention monkeypox looming), masks, social distancing, general gloom and doom, and now here’s Bill Gates with a Cunning Plan for Covid 2.0. He’s written a book suggesting an international body with the clever acronym GERM (Global Epidemic Response and Mobilization), on standby, ready to leap in when there’s even a hint of some killer microbe. The cost? A scant $1 billion/year. Here’s a thought, Bill: why don’t you take a course in basic physiology? I’m sure they’d let you audit, and maybe then you’d stop thinking about disease in binary, tech terms. To paraphrase Larry Brilliant, one of the epidemiologists who helped to eradicate smallpox, outbreaks are part of nature and inevitable, but the human response, the “pandemic” part, is optional. There are choices to be made in terms of how we react, and, gosh, wouldn’t it have been great if some better decisions had been made.

Perhaps the part that upset me the most was the mental health aspect: all manner of folks who had been skirting along the edge, but managing with their normal routines in place, were jolted out of that state over the last years and are now slipping into the abyss of behavioural, cognitive and what we now call neuro, er, diverse problems.

Broken China

Then there’s China, which has decided on a “zero covid” policy. It would seem head-smackingly obvious that a virus would be, er, less susceptible to a diktat from Mr. Xi than the beleaguered citizens of Shanghai, under house arrest/lockdown for so many months, but I suppose the Chinese Communist Party assumes what it says goes, everywhere in the micro- and macro-universe. The mind boggles.

Unsurprisingly, mental health issues in China, like everywhere, have skyrocketed – even though China doesn’t really acknowledge their existence. Perhaps it’s a holdover from Mao’s day when dissidents were confined for psychiatric “care” (though apparently that still happens). Depression and suicide are on the rise. Well, as they are everywhere. You can only amp up stress and anxiety levels so long before there are consequences.


Stressed? Who you callin’ stressed?

Hypervigilant Hypochondriacs

Meanwhile, our understanding of how bodies work continues to deteriorate, ably aided by corporations pushing the idea that health is really the sum of its (measurable) parts which we can follow, eagle-eyed, with wearable devices. There is, after all, money to be made. So beady little tech eyes are now turned to what has been termed the “quantitative patient”.

And we’re merrily going along with this, loving our Apple watches and FitBits and apps. Improvements in wireless and other technologies are poised to track your every heartbeat and more – so you can obsess not just about the number of steps you took but your sleep, glucose levels, whatever else they can think of to measure. As if Dr. Google weren’t bad enough, now we can be hypervigilant hypochondriacs 24/7.

We used to refer to these measures as “objective” health (versus “subjective”, which is how you actually feel) but this is taking every breath, hiccup and heartbeat to a whole new level. And I would remind you that the algorithms tracking these rely on large banks of energy-devouring servers, adding to climate change as you compare yourself to some mythical standard.

The Minotaur maybe?

Inevitably it will be the young and tech savvy, those least likely to suffer from anything dire, who go nuts over this medical micromanaging – the ones most likely to panic when their heart skips a beat or their REM sleep doesn’t conform to some average. Given how dynamic physiology is, not to mention self-regulating, knowing these minutiae will probably only serve to increase anxiety, not health. After all, everything from stress to food to falling in love affects these mechanistic medical benchmarks.

Soooooo … gazing into my crystal ball I see nervous zombies rising through the mist, upset and worried because their watch told them something was abnormal. I observe health care systems overwhelmed by ever younger people panicked by shrill apps. Social skills further eroded as our phones not only remind us there’s an update on Twitter but that our pulse was a touch thready for 4.3 seconds. Then the doctor-du-jour on some telehealth app will try to reassure us we won’t be dead by next Tuesday.

Julie, don’t go

I don’t suppose the Wayne and Shuster skits would do too well these days; they’re awfully clever and literary. But they still make me laugh. One of the best was the Julius Caesar sketch, with Caesar’s wife repeating, mournfully, that she warned him not to go out on the Ides of March. “Julie, I said, Julie don’t go.” But did he listen? Of course not. By the same token I feel like that sometimes, constantly telling people not to check Dr. Google and go down that rabbit hole of medical misinformation. But do they listen? Of course not. There’s always something going on in that Forum.

To illustrate the point, a true story. Big-wig American CEO, recently retired, with excellent insurance, feels a twinge in his left arm. Naturally he races to the internet, realizes he must be having a heart attack, and must run, not walk, to the ER. Hours, possibly days, at the hospital where he’s probed and poked, scanned and tested, and then it turns out it wasn’t his heart at all. He had recently taken up gardening (see: retirement) and overdid things a bit, straining his left arm and shoulder. Oops.

So you see, Virginia, sometimes there is no sanity clause.


No matter how bad things got the vast majority of us are still here. Statistically, as always, those who did poorly with covid were the poor, the frail, the elderly, the dispossessed; those without access to clean food and water, health care or care in general. The differences between areas that locked down hard (like California) and those that did not (Florida) in terms of covid cases or deaths were insignificant. So much as I hate to say I told you so (OK, I lie, I love it), much of our general freakout and fighting about masks and proximity was misplaced. Basic care and common sense measures would have sufficed. Protecting the most vulnerable. How things might have played out without the panic of the last few years is not knowable; suffice it to say that the current global economic disruption(s) could have been averted, at least somewhat, had saner heads prevailed. Risk, alas, continues to be misunderstood even as the vast majority of us live longer, healthier lives than any other cohort in history.

So, in closing, I leave you with the mental picture of a graphic I recently saw. It read, what’s important to remember is not whether the glass is half full or half empty but that the glass can be refilled. Next to the glass: a bottle of wine.








Chicken Little and Henny Penny were off on a quest: to terrify everyone and tell them the sky was falling. Close to a year later it would seem they succeeded and everyone is indeed scared witless. As the world moves in slow motion Christmas appears to have been cancelled, adding to the general gloom.

Never mind that 1.5 million people die each year of TB or that 300,000 people (mostly children) die of malaria. And some 55+ million more die annually of various and sundry causes which, on a planet with close to eight billion people, is par for the course. Honestly, you’d think that without covid nobody ever died.

For now the burning question is whether that godforsaken mask is the holy grail or a fiendish mind control plot.

A friend tells me that apparently it is illegal to enter a bank wearing a mask. Implication is that you’re a nefarious miscreant. (Come to think of it, this would be the perfect time for a heist; just melt into the masked crowd with your loot.)

Stop the world I want to get off

As always, health discourse has gone reductio ad absurdum, pared down to a basic, binary level – a topic I have whinged on about at length. In one corner the coronavirus, crowned headband and mean little eyes; at the other, the economy, pale, with quavery legs, propped up by government handouts.

Weirdly, nobody seems to notice or care that the state of the economy impacts heavily on health. It’s not a zero-sum game. Stress, especially that of not knowing whether you’ll have enough money to feed your family, is especially bad for health, as is isolation, as is locking ourselves away from everything that makes life worth living. Social contact. Seeing friends and family. Art. Music. Dance. Theatre. Travel.

Immunologically, the organism, the person matters, not that you’d know it these days. You don’t just get sick because you’re exposed to a pathogen, whether that’s covid or TB. You get sick when your immune system reacts to said pathogen. In the dry, wry words of an old epidemiology text, the virus (or bacterium) is necessary but not sufficient. In other words, the TB bacterium must be present if you have TB, but simply having the tuberculosis bacterium does not mean you have TB.

With any kind of testing that becomes obvious; all sorts of people harbour microbes they never knew they had. My mother never had chicken pox but clearly had been exposed since later in life she developed shingles (when she was taking immune suppressants). Chicken pox’s revenge as one nurse called it.

The dratted virus isn’t some poison perfume chalice, the mere whiff of which can fell anyone walking by. But hey, who cares about science when you can post scolding messages on Facebook?

The big news this week is the vaccine. And how it will be distributed. Not much analysis of the actual mRNA vaccine that frankly spooks the hell out of me. Makes sense in the abstract but real people are adaptive, dynamic physiologic systems: neurologic, immunologic, endocrine etc etc. Altering the genetics of a cell: well, no reset button there. I truly hope there are no negative consequences six months, two years, down the road. What I read, even when it tries to be reassuring is awfully vague.

[And always, along the same binary lines – if you discuss vaccines in anything but glowing terms well, clearly you’re one of them lunatic anti-vaxxers. No grey in this discourse. ]

We are the Borg. Resistance is futile. 

What’s up doc?

I was asked an intriguing question a while back: if we infect others with a cold or flu through coughs and sneezes, how is it possible to transmit a virus when we are asymptomatic (or presymptomatic) and not coughing/sneezing? I had taken it on faith that one could transmit a virus before getting sick. So I tried to find out.

Turns out there’s a process called viral shedding.

When you have been exposed to a virus and it is attempting to take hold (but your immune system is pushing back), the virus works its way into a cell, often killing it (apoptosis). Minuscule bits of the virus can then be shed and make their way out of your system. Possible, but highly improbable in these socially distanced times. Reminder: even the symptoms of a cold result from the immune system’s reaction to the virus, not from the virus itself.

So, if you have somehow been exposed to a virus and it has somehow entered your system, this shedding might get passed along if you get seriously close to another person and these shreds get into their eyes or mouth. Bear in mind that these specks of virus aren’t especially numerous. That’s a lot of ifs. This, incidentally, explains why genital herpes or HIV can be transmitted through, er, bodily fluids even when the person is asymptomatic. It’s more complicated with a respiratory virus, particularly since we have this enormous immune organ called the skin protecting us.

But, the numbers are rising, I hear you cry. Surely that’s worrisome.

Maybe. Numbers alone can’t tell a story. For instance, what irks me immensely is the term “case”, as I can’t bloody tell whether we’re talking about people who have tested positive or people who are actually sick. (Different groups use the term differently.) Particularly since the majority of people who get sick recover. With more testing it stands to reason that more people will test positive, thereby becoming a “case”. (And do not get me started on test accuracy, as no test is fully accurate.) But with language like “cases surging” we lose all nuance.

It’s year end, furthermore, by which time statistics would, under normal circumstances, have recorded some 56+ million deaths globally from various and sundry causes. As with most things, most at risk are the poor, the dispossessed, the fragile, the elderly. Nothing new there.

So, yes, this appears to be a bad flu but keeping one’s distance and keeping your hands clean should keep the worst of it at bay. Have there been massive numbers of excess deaths? I’m not sure. I suspect not but honestly, life’s too short to try and dig up those statistics that nobody’s keen to publicize.

In any event, when it’s cold more people get colds and flu. We just don’t usually keep track of them with such eagle-eyed negativity.

Alas, what is on the rise, along with the freakout factor, is stress, depression, anxiety, gloom. Especially hard hit are those living alone, especially the elderly, cut off from human contact. Ah, just asking: did anyone actually ask these individuals what they wanted? Guess not.

Lost in the mist of misinformation is the dynamic nature of organisms, viral and human. Forgotten is immunologic adaptation. Things evolve. Long before there was a vaccine for polio or smallpox the incidence of those diseases was well on the wane. I hate using the term “herd immunity” as it sounds more like animal husbandry but it really is a thing. Places where there were outbreaks last spring are now virtually covid free (e.g., Lombardy in northern Italy). Immunity is not static.

Post viral stupid pills

Now I read that – gak! – post covid complications seem to happen in people who’ve recovered, particularly those who were in hospital. Several months later these individuals are not back to “normal”.

Really? A bad viral infection didn’t totally clear in a few months? Shocker. Um, ever hear the term “post-viral fatigue” (or post-viral syndrome)? There are always individual differences, but for many people recovery post-infection is slow. This isn’t new and scary. We used to call it life.

Bodies aren’t high-tech devices you can reboot. Homeostasis takes time.

Yup. Stupid pills. It took over six months for some genius to figure out that steroids might help those covid patients who had had an inflammatory over-reaction. We’ve known for decades that steroids reduce inflammation, but it didn’t occur to anyone to give dying patients prednisone, or something along those lines, to reduce the inflammation that was killing them? (See what I mean when I say the immune system appears to have disappeared from this virus discourse altogether.) Yes, I get that steroids aren’t good for an infection – but see “dying” above. Still, best not to stress about all this. Stress is really bad for immunity – and wearing that damn mask is stressful enough.

What I will not do is let the buk-buk-buking of those Chicken Littles get me down. There have always been viruses, some worse than others; this too shall pass. In any event, I am bored with this particular fairy tale. So my new hero is Bugs Bunny and, like him, I plan to munch on carrots, sing my songs and bounce along being as silly as I can. Just because the world’s gone nuts is no reason to follow suit. B-dda B-dda B-daa.

That’s all folks.


Coronavirus Blues

So here we are, essentially still in thrall to a coronavirus – with horrid terms like “social distancing” and “lockdown” having worked their way into the lexicon even as we are inundated with nauseating corporate platitudes along the lines of “we’re all in this together”. Then there’s “Zoom fatigue” and “brain fog”, giving us all an excuse to whinge about how tired we are. Which I, personally, enjoy a great deal as it gives me an excellent excuse not to do the things I don’t want to do.

Inevitably, as with most things health related, discourse has descended into a kind of bipolar idiocy: either you’re a virtuous human being who defers to “scientists” (who these people are or what their expertise is doesn’t seem to factor in), keeps their distance (leaping ostentatiously away if anyone gets anywhere close), dutifully wears a mask (even in the car, alone) – or you’re a conspiracy theory nutter roaming the streets and howling at the moon.

This, of course, isn’t new but it gained traction with social media some decade or so ago when anybody, anywhere, could find like-minded lunatics instantaneously. So discourse on anything even marginally ambiguous (which anything scientific inevitably is) now descends into the subtle dynamics of a nursery school spat: T’is! T’isn’t. T’is too! There are only two settings, right or wrong. (By the way, if you learned everything you needed to know in kindergarten, you’re an idiot. Get your GED.)

Trouble is, while binary codes seem to work OK in the digital world, they don’t work so well in the real one. And science, especially, doesn’t lend itself to this kind of thinking.

Computer language(s) rule?

Science isn’t some sort of “amorphous blob” (to quote British physicist Brian Cox in the BMJ online). Science is dynamic, self-questioning and oftentimes wrong; it’s a method designed to grope for some measure of truth via empiricism.

With time, as the concepts become more refined, so do the answers. Which is how we moved from the notion of “miasmas” of bad air in sick rooms causing infectious disease to an understanding of microbes and germ theory. An understanding that continues to evolve. The development of better microscopes and laboratory techniques played a part as well, not to mention keen thinking by the likes of Pasteur and Koch. Later, a happy lab accident led to antibiotics and this daft idea we’ve formulated as a result that there’s a pill for every ill.

Science is not dogma and a great many hypotheses fall by the wayside. Just check the sheer number of unreproducible results in, say, preclinical cancer research, if you don’t believe me. (One biotech company, Amgen, hoping to develop a miracle cancer drug, tried to replicate 53 “landmark” cancer studies – and managed to replicate only six, even with help from some of the original researchers. But those other 47 studies undoubtedly made headlines in their day.)

The real strength of science is that it is self-correcting: it is not a religion or a deity or the final word.

By last March, alas, between the media shrieking blue bloody murder and epidemiologists insisting we were all going to die, it all built up to a crescendo of panic which has yet to subside. The zombie apocalypse would be upon us any minute. (Never mind that there aren’t a lot of good brains around these days for those poor zombies to eat.)

Now, months later, it seems to me, governments and policy wonks, epidemiologists and virologists, having scared the living daylights out of everyone, are a tad unclear as to what to do next. We’re tip-toeing into phase 3, with stores and such re-opening, but nobody’s all that clear about how close to get to that plexiglass or how essential those masks really are. I detect a hint of disquiet as the economy tanks and people are pushed into poverty all over the world, which every gibbon knows is what’s really bad for health.

Trouble is, once you’ve scared everyone half to death un-scaring them isn’t especially easy.


For my part, I have wearily realized that this pandemic business was inevitable. The world was waiting for a pandemic, it was wanting a pandemic – and a pandemic is what it got. Now we’re stuck with the dregs and the aftermath.

There have been rumblings about the Next Great Plague for decades. There was SARS in 2003 (where under a thousand people died), then H1N1 (but then there was a vaccine) and its variants some years later. Each time the so-called experts began their siren song, warning us that any second now the plague would be upon us.

It certainly didn’t help that popular culture – films, TV shows – has gone on incessantly about the dangers of microbes, be it bioterrorism, laboratory mismanagement or just bad luck. Every other day some actor strode about in some show, unrecognizable in a HazMat suit, looking suitably grave – even as virologists threatened us with the next “war” against some killer pathogen. (I recall a profile in The New Yorker of some chap “hunting” viruses, a la Indiana Jones, your man chasing the virus that was going to kill us all.)

In short, we were primed.

The 1918 flu is always held up as a template – millions dead and pandemonium in the streets, and boy will you be sorry you didn’t listen. Never mind that in 1918 they barely understood antisepsis, and antibiotics didn’t exist. Not to mention that pesky war to end all wars.

This time it took. First off there was that term, “coronavirus”: so easy to pronounce. (Quick reminder: many common colds are caused by coronaviruses.) Plus, it originated in China, a country many of us have reservations about. Its genome was sequenced there, making the provenance of this “novel” virus suitably creepy. At first it may even have seemed exciting, in a horror-movie sort of way. People do love to be scared, after all. And the word “pandemic”. So scary. Boo.

After that it took on a life of its own – and nobody took a breath, stopped to consider how focusing solely on a virus, using the most primitive medical model, could have dire, lateral, contextual consequences. Contrary to the idiocies posted on Facebook and its ilk, the reality of health is that it is not an either/or, zero-sum game. Either we all wear our masks and keep our distance or everybody dies. Dodge any other human within several metres or the virus “wins”. Military metaphors are sunk so deep within public health discourse and public consciousness that we don’t even see they’re there. All of it ignoring the basic immunologic point: virus + organism could = disease. It’s not a given. It’s not just about the virus but the host. And not everyone exposed to a virus will react to said virus.

People live in the real world, with real lives, and this freakout has had real consequences for work, housing, education, and health – which is about far more than an immune response to a virus. It was as though we forgot, watching the numbers rising at the edge of the screen, that we all die, with or without a virus. In 2018, according to the World Economic Forum, some 56 million people died. No coronavirus then, just, well, life. And flu and TB and a myriad of other things. All made worse by deprivation and poverty and hopelessness and stress and all those other things.

To paraphrase Norman Bethune, there are two kinds of tuberculosis: the rich man’s TB and the poor man’s TB. The rich man gets better; the poor man dies. As we saw with this virus. The worst afflicted – other than the few whose immune systems responded with that oft-mentioned cytokine storm – were the poor, the marginalized, the malnourished, the warehoused. Sorry, but camping out at your holiday cottage, complaining of Zoom fatigue as you do your work online, is not the same thing as losing the one (or three) poorly paid jobs that are all that’s keeping your family fed and housed. In the U.S. or Bangladesh.

Nobody likes to talk about the larger context; too complicated, so muddled, damn ambiguous, non-linear. Can’t be neatly graphed or displayed in bright colours. It isn’t something an infectious disease expert or virologist really knows much (or cares) about. It became about the virus; never mind the consequences of shutting it all down or how this will affect communities, cities, families, children.

Keeping it clean

Perhaps the one (semi) positive note I can see in all this is that some basics around hygiene we appear to have forgotten will get some attention. You know, things like keeping hospitals and schools and busses clean. Maybe we’ll remember that it’s a good idea to wash our hands more. Or hey, we could consider paying the people who take care of the elderly a living wage so they don’t need to work four jobs to make ends meet. Revolutionary concept.

Maybe we can revisit this notion that buildings need to be hermetically sealed, with no windows that open, giving some of us sick headaches. Or stop already with those horrid digital faucets that spurt water at random. That last one comes from personal experience (but I’m not bitter) from the times I tried to get my hands clean enough to take my contacts out on campus. But even though I brought a nail brush and soap from home, it never worked. I could never get enough water to scrub my hands properly. (Old joke: how do you know someone’s had to use a hand dryer in the washroom? Their trousers are wet.)

There’s a Danger Lurk!

There is simply no way to do away with risk. Risk is all around us. Nothing is 100% safe. Not crossing the street, or going to the dentist; no medical treatment, no drug, no surgery, nothing. It’s a risk to sit too much, stand too much; exercise too much, exercise too little.  You get my drift. (Incidentally, even though everybody’s all worried about a virus doesn’t mean bacteria aren’t still out there too.)

So here we are, singing the coronavirus blues, trying to figure out who decides what this “new normal” is, and watching people pander to their OCD tendencies, get all worked up about masks and otherwise indulge their inner Stasi. Hoping, I guess, that virtue trumps risk.

Trouble is, as the French philosopher Bernard-Henri Levy, who was pilloried for writing about the plight of the Rohingya in the age of coronavirus, has rightly said, our response to this virus has been more pathological than the virus itself. Aptly, he quotes Virchow: “An epidemic is a social phenomenon comprised of some medical aspects.” Not the other way round.



There ain’t no Sanity Clause

So, here we are, in the throes of insanity as Covid-whatsit (I think the virus may have reached the age of majority right about now) rages on, accompanied by what I read is panic buying of toilet paper and Lysol wipes and now food. Various grocery store shelves bare. Then there’s the conflation of Corona beer with the coronavirus. The mind boggles.

Incidentally, “corona”, in viral terms, simply refers to the shape of the virus, which looks a bit like a crown – it isn’t anything especially dire. Many common colds are caused by coronaviruses.

The other day I dashed into Canadian Tire and whatever it was I was trying to find (and didn’t) was right next to a large display of masks. Now these weren’t those little white medical ones you see everybody wearing (which means you can’t understand a damn thing they say, their voices being muffled and all) but, well, the closest thing I’ve ever seen to these is in old movies from around 1920: those weird gas masks they had to protect against chemical warfare. The ones that made wearers look like aliens – and not the nice friendly kind either. Good grief. It’s come to this. Browbeating some poor little Somali kid who works at Canadian Tire because the store’s run out of surgical masks that won’t protect anybody from anything.

Marx Brothers, here we come.


[From the 1935 movie, A Night at the Opera, where Groucho attempts to explain a business contract to Chico, who rightly points out there ain’t no sanity clause.]

Phantom of the Opera?

So, in response to one of the questions I get asked: will a mask help? No, it will not. Surgeons wear the damn things to protect the patient they’ve just cut open – the skin, as you might recall, is our largest organ and keeps us safe from marauding germs pretty much all the time – because they don’t want to get the bacteria from their own mouths and noses into the open wound. Otherwise, your grubby fingers on that mask will probably do more harm than good. Try not to touch your eyes. If you don’t want to listen to me, listen to this York University prof.

The sky was falling with SARS (it didn’t fall – some 800 people died). The world was ending with H1N1 – and that was ten years ago. Didn’t happen. As for the Spanish flu, I’ve already told you times were different in 1918.

What perturbs me somewhat is that there are all these experts out there, expounding, yet nobody really explains what a virus is or how it works. Because without a host organism – in this case you – a virus is just a harmless little package of genetic material wrapped in protein.

A bit like a SIM card sitting on your desk. Useless. Until it’s inserted into a phone, it’s just a chunk of plastic and metal. Viruses are like that.

Immunology 101

Viruses are small. Very very small. Somewhere between 20 and 250 nanometers in diameter. A nanometer is one billionth of a meter. I have no metaphor to help you visualize this. Try to think of it in terms of billionaires or something.

Viruses were discovered, or rather deduced, in the 1890s by two different scientists in two different parts of the world, when something was found to infect tobacco plants that didn’t seem to be bacteria, since those had been filtered out. (Bacteria are some 100 times larger than viruses, which is why they were observed long before, in the 1600s.) Nobody actually saw a virus until the 1930s, when viruses finally became visible under the electron microscope.

[Officially the first microbes were observed by the Dutch draper van Leeuwenhoek, who saw what he called “small animals” in a droplet of rainwater. Bacteria, in other words. Of course van Leeuwenhoek wasn’t always reliable – he maintained he had seen a complete little man, a homunculus, in sperm. Still, the man was brilliant and, though not a trained scientist, was invited to join Britain’s prestigious Royal Society.]

Viruses are not cells and aren’t alive, at least not in any conventional sense, since they cannot reproduce, grow or metabolize on their own. A virus can only replicate within a living host; it does this by co-opting the host cell’s genetic “machinery”. So, without a living host, a virus is just a package of genetic material wrapped in protein. Analogous to that SIM card sitting on the table.

And it’s not that easy for a virus to gain access to that host, since we are all encased in this large immune organ called the skin, which is pretty much impenetrable unless there’s a cut or burn or wound. (Bacteria, by contrast, are living cells that can survive on surfaces, and can infect any part of the skin that is open.)

The skin also contains enzymes, like lysozyme, which further repel microbes.

The fact that a virus can only exist within a host cell explains why antiviral drugs tend to be toxic; anything that kills a virus will also damage the cellular environment, i.e., the host. That’s why antibiotics, which kill self-contained bacteria, work really well and antiviral drugs do not: they tend to be toxic and often don’t work very well. Which is why we haven’t “cured” the common cold.

The odds of actually getting sick from a virus are slight if you are a normal healthy human being (since it’s the immune response to the virus that causes symptoms, not the virus itself). True, if you are elderly and/or immune-suppressed or ill you are at greater risk. But if you are elderly or sick or undergoing treatment for cancer, well, you are at risk for all sorts of things.

Damn lies/statistics

I am perplexed as to where these dire statistics that are being hurled about willy nilly are coming from, particularly since hypothesizing about the future is not science; it’s speculation. And that, in turn, is affected by one’s agenda. After all, if I am trying to convince you of something it is to my advantage to make my point seem as dramatic as possible. Since the people who are putting out these stats have a vested interest in making them sound important … well, I rest my case.

Has it not crossed anybody’s mind yet that if we were to graph, map or pie-chart the regular flu the numbers would be far, far greater? That if we tracked normal winter deaths from flu or flu-like illness each year we’d see the same type of picture?

Don’t forget keeping us indoors is good for a lot of people, not least those who peddle e-sports and on-line courses and such. Just saying. On the flip side, who will graph and chart the fallout, all those stores and cafes that will have gone out of business, all those people who will have lost their jobs, all the attendant misery this nonsense will end up causing? No, that won’t make the front pages, I guarantee you that.

Knowing how a microbe will behave within a human immune system is … tricky. To put it mildly. Largely unknowable. Debatable. And it’s not like such predictions were especially accurate with SARS or any other of those pandemics that were going to kill us all.

In any event, this virus doesn’t appear to be especially virulent given the numbers. Yes, the numbers. Think about it. On a planet with 7.7+ billion people (so really close to 8 billion), some 198 thousand are thought to have the virus, of whom close to half have recovered. Some 7900 as of this writing, globally, have died, pretty much half of those in China where the, er, outbreak began and most of the rest in the two hotspots, Italy and Iran. Contrast that to the 1.5 million people who die of TB every year. Or the tens of thousands who die of the regular flu. (The CDC estimated that some 61,000 Americans died of the flu in 2017-18.) Never mind poverty and cancer and war and whatnot.

Now I am sorry for those individuals’ families and friends, but for heaven’s sake people. When the weather is cold, people get flu or what’s called flu-like illness. That is, people get sick with symptoms they think are the flu but, when tested, turn out not to actually have the flu. They’re still sick.

(I suppose someone will respond that the reason these numbers aren’t so bad is because of all the measures taken. Which reminds me of one of those elephant jokes I used to like as a kid. Why do elephants have little red eyes? So they can hide in cherry trees. Have you ever seen an elephant in a cherry tree? No. See how well they hide! You can’t win.)

A foolish consistency, said Emerson, is the hobgoblin of little minds.

In BC, as of today’s newspaper, four elderly people – in a province with over 3 million people – have died. Again, I’m sorry for their families and friends, but honestly. Does this really warrant panic and hoarding, not to mention all the doom and gloom? Serious faces? Empty streets? Good grief. According to Worldometer in Canada, the total number of cases is well under 600. What’s 600 out of 35 million? I tried to calculate the percentages in terms of global population but none of my calculators have enough zeroes and then I got confused and gave up. You do the math.
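For anyone whose calculator does have enough zeroes, the arithmetic I gave up on takes two lines of Python. (A sketch only – the case and death counts are the ones quoted above, and they were moving targets at the time.)

```python
# What fraction of the population do these counts actually represent?
cases_canada = 600               # Worldometer figure quoted above
population_canada = 35_000_000

deaths_global = 7_900            # global deaths as of this writing
population_global = 7_700_000_000

pct_canada = cases_canada / population_canada * 100
pct_global = deaths_global / population_global * 100

print(f"Canada: {pct_canada:.5f}% of the population")    # ~0.00171%
print(f"Globally: {pct_global:.7f}% of humanity has died")  # ~0.0001026%
```

That’s roughly two people in every hundred thousand Canadians, and one death per million humans. You do the math, indeed.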

Calm down. And remember that a virus is not a bomb, it’s not anthrax; in and of itself it has no power to do anything. And frankly, this one doesn’t appear to be all that virulent.

Cause of death isn’t as easy to determine as most people think it is. Particularly with those who are older and have several things wrong with them, it’s bloody difficult to determine a single cause – which is all the form has room for. The writer Calvin Trillin’s wife died of heart problems some years back; I know this because he wrote a beautifully moving tribute to her. She had had breast cancer some decades earlier; the treatment (radiation notably) badly damaged her heart muscle. So, 20 years later, did she die of heart disease, breast cancer or the treatment for that cancer? See, not so easy.

Um, you needed to be told to wash your hands?!

Yes, wash your hands. Because it’s a good idea generally since the world is full of microbes and if you’re especially tired or sleep deprived or stressed or it’s cold outside and you get gunk on your hands and touch your eyes you could get infected. Or not. Mostly not. Wash your hands not because it will protect you against this apparently killer virus since viruses can’t survive on an inert surface for very long; wash your hands because practicing basic hygiene is something you should just do. Duh.

Here’s a thought. Live a little and splurge on a nail brush. Wet it, run it on the soap, scrub your hands, nails, fingers …  Now you can scratch your eye.

Stop stressing. Turn off the news. The end is not nigh.

And if it is, all the toilet paper in the world won’t save you.



[Thanks to my friend, science blogger Maryse, for links and for helping me focus my thinking.]


Getting rid of junk (nutrition)

What more can one add?

Junk mail, junk science, junk food, junk DNA: The list of terms we can plunk “junk” in front of seems endless. Usually the point is to negate the meaning of the second word but all too often it refers to things we don’t understand (DNA), are trying to avoid (food) or have no other way of describing (science). But since the word’s out there I think we need a new term, one that describes all the outrageously stupid, yet ostensibly expert, advice out there with respect to food – let’s call it junk nutrition. Which, in the best tradition of junk anything, masquerades as solid scientific advice, uses pseudo scientific terminology and does its level best to terrify us into giving up all the real foods, like butter, that make life worth living.

Worse, in its attempt to confuse us with bafflegab, junk nutrition is far too easy to get wrong, which explains why – as I was desultorily skimming an article in The New Yorker about some person, Stephen A. Smith, who apparently “shapes the discourse of the sports world” (be still my beating heart) – I read the following:

“I used to think almond milk was best, but then somebody told me – a trainer told me – there’s too much estrogen up in there. In the almond milk. That’s right. [And] you don’t want to walk around with man boobs if you don’t have to.” (beat, chuckle, rat-a-tat)

P-pardon? Almond “milk” will do what?

Now I absolve Mr. Smith as he appears to be quoting his trainer, but honestly, somebody, somewhere, needs to get their nether regions out of whatever black hole they have become stuck in – as they have clearly confused soy and almond “milk”. (In quotes, incidentally, to remind you that no matter what marketers have told you neither soybeans nor almonds actually contain milk, which comes from female mammals). Soy, er, juice does indeed contain small amounts of phytoestrogens that mimic estrogen and could, if ingested in sufficiently high quantities, cause said man boobs. Though you’d have to drink an awful lot of it. But how many people will read this and believe Mr. Smith’s trainer? More important, why didn’t the editor catch this?

Who knows, maybe it was one of those product placement things, paid for by the soy drink people.

Junk nutrition, in a nutshell. Eat/drink X so you’ll be healthy, avoid Y (salt, meat, sugar) so you’ll live longer. Trouble is, the stuff they tell you to replace the real food with, like butter, is often a pale imitation (margarine) cooked up by a chemist for a specific purpose. In the case of margarine, a French chemist who was trying to find a fat that could travel with the French army without going rancid. And therein lies the rub. The fake stuff is (1) manufactured, (2) additive-full and often contains genetically-modified whatsits (which they do not have to declare because it is not a genetically modified organism or GMO that’s grown/farmed) and (3) untested, so could easily cause some kind of weird side effect down the road. But there it is, large as life, dumb advice pretending it’s health news. This is how you convince otherwise reasonable people that a dark pink fake substance calling itself a hamburger is somehow … superior.

If you want a hamburger with a vegetarian spin, eat a felafel. At least you’ll know human hands were involved there somewhere, not a robotic arm in some factory. Plus it won’t contain synthetic additives most of which I cannot pronounce and wouldn’t even try. I can see why the EU has rules about what can call itself sausage or cheese – the terms mean something. But of course nobody would eat the stuff if you called it what it is: fake mushed-up chemicals mixed with a bit of beans and plant-based something-or-t’others, with flavorists (yes, that’s a thing) adding some of that (flavour) at the end of the process. That latter bit, incidentally, is based on a molecular breakdown of what real food, like an orange or a carrot, tastes like.

Human beings cannot make certain necessary nutrients, vitamins, which we get from food. So if you don’t eat meat, milk, eggs, fish, cheese and so on, be careful to get enough B12 and niacin and various other trace vitamins that aren’t present in plants. Iron too, as it is difficult to absorb iron from plant-based sources. (Pair your plant-based protein with something containing vitamin C, like tomatoes or red peppers. It will aid iron absorption.)

In the late 19th century, when they also believed they had the nutrition thing down pat, an Estonian scientist, Nikolai Lunin, noticed something puzzling. When he fed mice all the known nutrients (fat, carbohydrates, protein, broken down chemically), the mice sickened and died. Then he gave them milk and they were just fine thank you very much. Lunin didn’t know why; it was decades later before anyone realized vitamins were essential to life. (Think pellagra and scurvy and other vitamin deficiency diseases.)

Salt of the Earth

The junk nutrition advice, of course, doesn’t stop there. Take salt. (No, really, have some salt. It will make your food taste a lot better. Just ask any chef.)

One of the few things Ronald Reagan said that I actually remember had to do with salt. When someone asked the Gipper about salt for some obscure reason he replied that yes, he did try not to oversalt things but honestly, eating something like a hard boiled egg without salt, well, only a raccoon could do it.

I don’t know where this notion that salt is evil came from since I can’t find any real research to back it up. But one day I realized that when I reached for the salt cellar some random stranger was glaring at me. And so it began.

It is true that if you have  congestive heart failure (i.e., if your heart muscle is not especially effective, usually when you’re quite old or have had heart attacks), there can be a buildup of fluid in your system (edema) and eating too much salt can exacerbate that.  Often, people with this disorder are prescribed diuretics, which makes cutting back on salt a good plan. And a small number of people with high blood pressure are sensitive to salt.

But normal healthy people sprinkling a bit of salt on their food, or cooking with salt? Not a problem.

Bodies are awfully good at maintaining balance. Homeostasis, as Walter Cannon called it about a century ago. So, if you go a little crazy with the salt, as I have been known to do with popcorn, what happens, genius? You get … thirsty. And you drink more water. Whereupon said salt is flushed out of your system.

Alas, the salt-is-bad-for-you talk has taken on a life of its own and like most truisms no longer needs to prove itself; it’s simply become true by dint of repetition.  It is so because everybody says it is so, even though nobody knows why.

A rather elegant Scandinavian study, in fact, demonstrated much of what I’ve said here. Currently my desk has eaten my hard copy and I can’t seem to find it on my hard drive, but the gist was this: In a large study (tens of thousands of people) researchers found that increasing salt intake resulted in people drinking more fluid and passing out the salt in their urine. (The study measured the amount of salt excreted.) Voila. Balance, courtesy of physiology.

In hospital, where I recently spent a fair bit of time with a family member, the food tray arrives with tasteless food accompanied by a teensy packet of pepper. No salt anywhere. And let me tell you, that food needs something to make it palatable. I took to ducking into every McDonald’s I passed, just to take a few salt packets to toss in my purse.

What I found especially galling with the hospital food and its nanny-ish refusal to provide even a teeny packet of salt was that almost everyone there was old, frail and needed to gain weight and strength. Needed food, in other words. But everybody was picking at their dinner, probably because it didn’t have a lot of taste. Ah, news flash people. Making food tasteless means the patient is far less likely to eat it, thereby exacerbating said frailty. Recuperating after a long hospital stay without food, now that’s what is dangerous. Not that teeny packet of salt. Seems to me the not-starving-to-death thing trumps the possible (minor) risk to your heart down the road since, let’s face it, for most people pushing 90 – which a lot of people on this ward were – it’s not 20 years down the road you should be worrying about but next week.

[On a completely different note, I wonder if we’re going to start seeing the incidence of goiter increase as ever younger people decide no salt is the right amount of salt. Iodized salt, as you’ll recall, is what made an enlarged thyroid, aka goiter, a thing of the past.]

Aside from anything else, salt is vital to keeping your electrolytes/fluid in balance. In hot weather when you reach for a Gatorade, all you’re doing is replacing the sugar and salt you’ve lost. And, for some of us, whose heart valves do not function as well as the norm, extra salt is necessary to keep enough blood coursing through our system.

Salt was so precious it was used as currency at a certain point in time. But gosh, we know better now. We have Google.

As with much junk nutrition, the voices are loud and crabby and rude. But, to paraphrase a maxim attributed to Louis XIV, “Do not assess the justice of a claim by the vigour with which it is pressed.”

How Sweet It is

Then there’s sugar. Public enemy #1 (or #2, depending on your stance on salt). Meanwhile, inclusion criteria for Type 2 diabetes have been lowered (which means more of us are being subsumed into the “diseased” category) so more of us are being told we should check our glucose regularly (thereby making Bayer, the largest maker of those glucometers, rich and helping it take over the world), obsess over our diets and of course start taking the drugs. I have of course rambled on about this before but, oddly, my post does not appear to have changed the world.

Yes, too-high glucose is bad. It means that there is sugar circulating in our system that can’t be used as fuel – and our cells need that fuel to function. If your cells can’t get the glucose they need from the food you’re metabolizing (which pretty much all food becomes), they will turn to stored fat for that glucose – because frankly you’re just as heavy as you need to be in order to survive. At some point that tilts into pathology (diabetes). But our cells, all our cells, especially our brain cells, need glucose to function. Lower the levels too much and your brain won’t work. This is especially true for older people.

Metabolism is highly complex: a deft and sophisticated dance between intake and output. Unfortunately, the numbers we revere and try to adhere to (using our personal beepy machines) do not reflect that complexity at all; on the contrary they turn this nuanced, balanced system on its metaphorical ear.

For the elderly this lowering of blood sugar – with the so-called ideal being somewhere around 6, which is too damn low – can have dire consequences in terms of cognitive ability. I wonder sometimes how much of the increase in dementia we’re told we’re seeing has to do with the ostensible increase in the diagnosis of type 2 diabetes and the pushing of glucose levels lower than anyone over 60 or thereabouts can take.

A 25-year-old can fast, detox or do any one of a number of crazy things and usually be OK. (Of course in a very small number of cases this could uncover an underlying heart problem.) At that age bodies by and large function at peak efficiency. So if there’s no food available glucose will be pulled from elsewhere – fat, muscle. With an older person, not so much. As we age our ability to keep that balance becomes less efficient, so if we don’t take in nutrients the first thing to go is the brain. We have large brains and they need fuel. Glucose. If they don’t get it, they falter. And I would remind you that once you hit your forties, immunologically and physiologically you are on the downhill slope. In fact, to quote an article in the journal Progress in Cardiovascular Diseases (61 [2018] 10-19) entitled “In Defense of Sugar: A Critique of Diet-Centrism”, the author, Dr. Edward Archer, bluntly states that “without sugar we die”. And no, he is not in the pocket of Big Sugar.

Dr. Archer adds that sugars are so “foundational” to biological life and so central to human health that the simple sugar glucose is “one of the World Health Organization’s Essential Medicines”. Furthermore, given that sugar has been part and parcel of our diet for a long long time, it makes no sense to suddenly blame sugar for everything from obesity to metabolic diseases. No one substance can be responsible for all of society’s ills, such as poverty and packaged foods and fast food and sedentary lifestyles, not to mention time-strapped parents unable to make (more expensive) food from scratch after working two jobs just to pay the rent. Never mind drug companies keen to sell their drugs. This “diet-centrism”, writes Dr. Archer, is neither evidence based nor scientific. And he has the three long pages of references to prove it.

What concerned me most while I was in and out of the hospital “advocating” as they call it (such a bizarre notion, that patients need someone there to prevent harm when, presumably, everyone’s goal is to treat the patient and get them home in one piece) was watching the glee with which the glucose numbers were tested. No matter that in an elderly person a blood sugar level of 5.5 (ideal from the hospital’s perspective) is way too low. In fact, better too high than too low, since too low can kill you before next Tuesday, versus long-term high glucose, which will kill you eventually, years down the road. Don’t know about you, but I’d rather give up some mythical tomorrow for a today when my brain actually works.

Keeping up with the Numbers (game)

But numbers – like the ones the glucometer spits out – are easy. They give us an easy benchmark against which to compare ourselves to some “ideal”, and today the high-tech, beepy machines that can measure everything from body temperature and glucose to blood pressure and heart rate are everywhere, especially in acute care settings. These, alas, by reducing the complexities of physiology to a checklist, make us think we’re safe and on top of things.

Trouble is, we rarely if ever genuinely understand what those numbers mean, much less how they alter with time.

The other day I semi-watched as a man, who had clearly googled “taking your blood pressure at the pharmacy” before heading out of the house, explained to his female companion why lower was always better. I hovered for an instant then moved on. Nothing to see here folks.

It seemed wiser to keep going and not engage; not stop to explain to this gentleman that the blood pressure measure (even if the pharmacy apparatus was dead on, which it probably was not) reflects a dynamic element of life. Blood pressure rises and falls with exertion, with stress, after eating. One measurement means nothing – the only way to have a bit of a sense of the trend is to take your bp several times a day, several times a week, then average it out over time. (Personally, having a man loom over me telling me lower was always better as I took my blood pressure at a pharmacy would make my blood pressure skyrocket.)
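The averaging-over-time idea is simple enough to sketch in a few lines of Python. (The readings here are invented for illustration; any resemblance to your arteries is coincidental.)

```python
from statistics import mean

# A hypothetical week of home readings, (systolic, diastolic), several per day.
# Any single reading is mostly noise; the average over days is the signal.
readings = [
    (128, 82), (135, 88), (122, 79),   # day 1: morning, after coffee, evening
    (131, 84), (126, 80), (138, 90),   # day 2
    (124, 78), (129, 83), (133, 86),   # day 3
]

avg_systolic = mean(r[0] for r in readings)
avg_diastolic = mean(r[1] for r in readings)
print(f"Average so far: {avg_systolic:.0f}/{avg_diastolic:.0f}")  # 130/83
```

Note how the individual readings bounce around by 15 points or more – which is exactly why one measurement at a pharmacy, with a stranger looming over you, tells you essentially nothing.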

And there definitely is such a thing as too-low blood pressure. Take postural hypotension, as it’s called: blood pressure plummeting when you suddenly get up. It can cause dizziness, queasiness, even a fall. At worst, a fracture. Pushing those numbers too low may look pretty but it sure as God made little green apples doesn’t translate into good health. True, persistently high blood pressure is indeed a risk for heart attack and stroke. But simply pushing bp down because lower is always better is idiotic. That heart, after all, resides within an actual person; a person who might want to feel well enough to actually have a life, which too-low blood pressure would not allow.

The trick, as with anything else, is balance. And understanding that individuals differ; what is right for one person may be totally wrong for another. Understanding that age needs to be factored in, not to mention our state of health. But in averaging out numbers and deciding how we compare to some mythical norm it’s all too easy to lose sight of the actual physiology of what’s going on.

I can understand the impulse to stay on top of things, control what happens. Trouble is we can’t. No amount of obsessing over diet and all these numbers will guarantee future health. Much as we’d like to believe in the crystal ball of the numbers game.

All we can do is take each day as it comes, do our best to eat well, get enough rest/sleep, reduce stress to the best of our ability and carry on. Life is complicated enough without our adding junk nutrition to the mix.

In the immortal words of Garfield the cat, what is diet, after all, but die with a “t”?



The prism of life

In the immortal words of Yogi Berra, if you don’t know where you’re going you’re going to end up some place else. Which is where I seem to have landed, at least these past months.

Again with the AI

The fascination with AI continues to irk, given that every second thing I read seems to be extolling the magic of AI and medicine and how It Will Change Everything. Which it will not, trust me. The essential issue of illness remains perennial and revolves around an individual for whom no amount of technology will solve anything without human contact. The change-everything reference, by the bye, is not to the “singularity” here, that point at which some thinkers believe power will shift dramatically and we will all end up serving our robot overlords. That is another story. (One with major movie potential, thank you Skynet and Terminator.)

But in this world, or so we are told by AI proponents, radiologists will soon be obsolete. The adaptational learning capacities of AI mean that reading a scan or x-ray will soon be more ably done by machines than humans. The presupposition here is that we, the original programmers of this artificial intelligence, understand the vagaries of real life (and real disease) so wonderfully that we can deconstruct these much as we do the game of chess (where, let’s face it, Deep Blue ate our lunch) and that analyzing a two-dimensional image of a three-dimensional body, already problematic, can be reduced to a series of algorithms.

Attempting  to extrapolate what some  “shadow” on a scan might mean in a flesh and blood human isn’t really quite the same as bishop to knight seven. Never mind the false positive/negatives that are considered an acceptable risk or the very real human misery they create.

Moravec called it

It’s called Moravec’s paradox: the inability of humans to realize just how complex basic physical tasks are – and the corresponding inability of AI to mimic them. As you walk across the room carrying a glass of water, talking to your spouse/friend/cat/child, place the glass on the counter and open the dishwasher door with your foot while opening a jar of pickles, take a moment to consider just how many concurrent tasks you are doing and just how enormous the computational power these ostensibly simple moves would require.

Researchers in Singapore taught industrial robots to assemble an Ikea chair. Essentially, screw in the legs. A person could probably do this in a minute. Maybe two. The preprogrammed robots took nearly half an hour. And I suspect programming those robots took considerably longer than that.

Commander Data (on Star Trek Next Gen) spent his life trying to emulate humans and understand the notion of having a “gut feeling” about something. Those, as most of us know, can be wrong but usually are based in experience. Something about the situation at hand reminds us of something. We may not remember the details but the feeling lingers and something in the present situation cues that memory. Personally I have great respect for my intuition, especially when it’s telling me not to buy into the hype.

Ironically, even Elon Musk, who has had major production problems with the Tesla cars rolling out of his high tech factory, has conceded (in a tweet) that “Humans are underrated.”

I wouldn’t necessarily go that far given the political shenanigans of Trump & Co. but in the grand scheme of things I tend to agree. But hey, who knows, perhaps soon we will all be Borg, far too involved in flying around the galaxy telling people resistance is futile to worry about petty nonsense like this.

Lean, mean and gene

On a somewhat similar note – given the extent to which genetics discourse has that same linear, mechanistic tone – it turns out all this fine talk of using genetics to determine health risk and whatnot is based on nothing more than clever marketing, since a lot of companies are making a lot of money off our belief in DNA. Truth is, half the time we don’t even know what a gene is, never mind what it actually does; geneticists still can’t agree on how many genes there are in a human genome, as this article in Nature points out.

Along the same lines, I was most amused to read about something called the Super Seniors Study, research following a group of individuals in their 80s, 90s and 100s who seem to be doing really well. Launched in 2002 and headed by Angela Brooks-Wilson, a geneticist at the BC Cancer Agency and SFU Chair of biomedical physiology and kinesiology, this longitudinal work is examining possible factors involved in healthy ageing.

Turns out genes had nothing to do with it, the title of the Globe and Mail article notwithstanding. (“Could the DNA of these super seniors hold the secret to healthy aging?” The answer, a resounding “no”, well hidden at the very end, the part most people wouldn’t even get to.) All of these individuals who were racing about exercising and working part time and living the kind of life that makes one tired just reading about it had the same “multiple (genetic) factors linked to a high probability of disease”. You know, the gene markers they tell us are “linked” to cancer, heart disease, etc., etc. These super seniors had all those markers yet none of the diseases, demonstrating (pretty strongly) that the so-called genetic links to disease are a load of bunkum. Which (she said modestly) I have been saying for more years than I care to remember. You’re welcome.

The fundamental error in this type of linear thinking is in allowing our metaphors (genes are the “blueprint” of life) and propensity towards social ideas of determinism to overtake common sense. Biological and physiological systems are not static; they respond and adapt to life in its entirety, from diet and nutrition to toxic or traumatic insults. Immunity alters, endocrinology changes – even how we think and feel affects the efficiency and effectiveness of physiology. Which explains why as we age we become increasingly dissimilar.

This is important. It means that our personal histories matter more as we age and guidelines and evidence, as useful as they can be in a vague sort of way, need to be used with a large grain of salt, accompanied by a healthy dose of skepticism. Who we are, what we were like throughout our lives and what’s happened to us during that life are part and parcel of our health picture. As I’ve said before, much as we’d like to reduce medical decisions to a question of statistics and probabilities, it’s simply not possible. There are 89-year-olds for whom knee surgery is a perfectly viable alternative; 65-year-olds for whom it is not.

The circle of life seems to have come to a dead halt

Sadly, Super Seniors Study notwithstanding, our template for ageing is rather meager. Pathetic almost. As a sociologist told me many years ago when I was writing a story on geriatrics (when I myself was in my 30’s and didn’t have a clue), our mental picture of a “good” old age is essentially that of a 20-year-old with a few wrinkles and grey hair. We admire seniors who run marathons and lift weights and do all the things they did decades earlier. We don’t value the attributes that actually accompany ageing such as the ability to manage time, ideas and people better. Experience. Quicker thinking in deeper, more analytical subjects. Wisdom. Happiness.

Instead, we read that older folk react more slowly in tests where they’re shown some picture or asked to push a button on some video game; this is then used as some kind of bizarre proof that a 20-year-old brain is somehow superior. It may well be true that certain types of reaction time slow with age but my suspicion is that that’s because usually these are irrelevant in the grand scheme of things. Then again, what do I know. I couldn’t remember names or certain kinds of details when I was 25 and I still can’t. And I’m still here. Still can’t remember details but smart enough to look both ways before crossing the street (and putting my phone down before I step on to the curb, something I can’t say for way, way too many people.)

Life isn’t about one’s ability to do tests. Psychologists and educators finally realized several decades ago that IQ tests weren’t all that good at predicting future performance; all they measured was how well one did on IQ tests. And one’s ability to do well on those tests, all too often, was determined by extraneous factors like class and culture. (Something that I recall annoyed researchers mightily, in the days of language/math IQ tests, was that students in Sri Lanka did better on the language portion of the test than all other countries, including Britain and America. How could this be, researchers cried. Well, presumably Sri Lankan students spoke and read better English, geniuses.)

Just as we aim to declutter our living rooms and our lives, we want health and medical matters to be neat. Predictable. I wrote about this a while back but – amazingly – my one blog post doesn’t appear to have changed the world.

Ideally we will find a balance, learning to live with the ambiguities of life – meanwhile, experts might try to realize that in the end people don’t care that you know – not unless they also know that you care.

He was a cool cat

Speaking of caring, Charlie, our noisy, furry friend died at the ripe old age of 17, which in human years is somewhere in the late 80’s. Not a bad age. He died peacefully at home, surrounded by the people he had bustled about managing all his life.  For as we all know, while dogs have owners, cats have staff.

Charlie contemplating his domain.

I wrote about Charlie some years ago when he was ill. At the time I wasn’t sure he was going to survive the rigours of modern veterinary medicine. But he did, and had what I found out is called a hospice death, according to that fount of all wisdom, the internet. Apparently more of us are opting to let our pets go gently into that good night, contrary to Dylan Thomas’s exhortation. And so the prism of life continues.

Perhaps strong drink is the only solution

To end on what I consider a more pragmatic note, I just heard a Dean Martin song I had never heard before – the chorus of which sounds as though it should perhaps be our new theme song.

Artificial Intelligence and Natural Stupidity

We live in an irritating world, made all the more tiresome by the increasing amount of tech we contend with every day. There are those Facebook algorithms eerily sending you adverts geared to your “likes”, or LinkedIn knowing your fourth-grade classmates better than you do; that tinny voice offering to help you navigate tech support (god help you if you have an accent); or having to prove you’re not a robot to some website. Never mind refrigerators and “smart” TVs that can watch you – and be hacked – or apps professing to predict your cardiac or Alzheimer’s risk that are about as reliable as a palm reader. Human contact, it would seem, is becoming irrelevant. The few times an actual human answers a call or is there to help, well, I don’t know about you but I want to weep: things go so much more smoothly.

Yes, I know, that’s just crazy talk. The future is now and it’s automated.

Then there’s manufacturing, where factories that used to employ hundreds if not thousands of people now have maybe twenty working alongside the robots that do the work. What would one call them, I wonder? It’s a pride of lions – so perhaps a clank of robots?

Danger, Will Robinson, danger.

Algorithms, chatbots, trolls and nameless, faceless tech of all sorts are so ubiquitous we barely notice them any more, up to and including in health care, where you’d think that in dealing with a sick, vulnerable person some human contact would be the basic requirement. Where’s Robot from Lost in Space to warn us, along with Will Robinson, of the danger?

In the health care realm even when there is ostensible human contact it’s technology driven. Hang out in a hospital room for a day or two as I did recently and watch as a nurse or PN (I know they’re not called that any more but I can’t keep up with the changes) wanders in, checks the beeping machines, jots something down and wafts out, without a word. Glance down the corridor and pretty much everyone at the nurses’ station is staring vacantly at a monitor, oblivious to anyone waiting to ask a question. Honestly, it seems a pity to disturb those machines sometimes.

So why it should have come as a surprise to me I don’t know: in Ontario, apparently, there’s some conflict going on about robots in operating rooms. A handful of surgeons are seriously miffed that their toys are going to be taken away from them, since there’s no real evidence that they actually work any better than the old-fashioned kind of surgery done by actual humans.

(Reminds me of a conference I attended years ago where a urologist spoke for nearly an hour about some cool new, massively expensive “microwave” that was going to revolutionize prostate treatment and do away with prostate cancer forevermore. Well, it’s decades later and I haven’t noticed any revolution or reduction in cancer. What was noteworthy at the time, to me at least, was that a later speaker, a woman doctor discussing cervical cancer, focused entirely on low-tech clinical matters and how to make the patient comfortable. I never noticed any patients being mentioned alongside the toys. Interestingly, a group of male prostate cancer patients made their own film around that time in which they related their experiences in the hope that more men would exercise caution before leaping into the high-tech and/or surgical options.)

Granted, tech can be a godsend to the injured and disabled. Some of those new digital prosthetic limbs are truly miraculous. The trouble is when the outcome doesn’t match the time, money and effort it takes to create some piece of hard/software. Take robots in the operating theatre. There has to be some serious proof that the tech is superior to current care. And I don’t think I’m being a Luddite when I say that if I were the person being cut open I’d just as soon have a thinking, adaptable being leaning over me, one whose intelligence was not artificial thank you very much, versus some robot programmed by a 17-year-old who actually believes everyone’s insides look just like the pictures in Gray’s Anatomy (the book, not the TV show). OK, I’m being facetious; presumably live human doctors have input into the development of whatever boffo artificial intelligence (AI) went into the thing. But, as with the cockpit, one would prefer to grump about legroom secure in the knowledge that there are biological beings flying the plane; humans who can react to the unknown based on expertise and experience, not rote. (Highly motivated humans who also understand that they go down with the plane, so it behooves them to figure out how to set the damn thing down safely, maybe on the Hudson River.)

Anyone (read: all of us) who has banged on a mouse in frustration as a drop-down menu or some program insists we haven’t done such-and-such even though we have, seventeen times, with simply no recourse, can see the day coming when a surgical arm needs rebooting and gosh, nobody’s there.

No doubt AI aficionados will object and tell me (in no uncertain terms) that my understanding of artificial intelligence is flawed and as biased as Will Smith’s character in I, Robot. Maybe so. But as someone who read the original Asimov books way back when, I tend to think the potential for harm increases exponentially with the complexity of the task at hand. Deep Blue may have won at chess, and some program has even conquered the game of Go, but surgery and flying a plane aren’t games.

Artificial intelligence, oh my

Alas, the toys ‘r us crowd are happily moving along, research funding at the ready and PR teams salivating at the thought of the coolness of it all. In fact, any day now it won’t just be a robotic arm in the OR but dead silence as STAR (Smart Tissue Autonomous Robot – who thinks of these acronyms anyway?!) takes over.

At the moment STAR’s a bit slow: a simple gastric bypass operation that takes a surgeon approximately eight minutes takes it roughly an hour. But just as chatbots reply in that anodyne, generalized robotic tone to the question you tweeted at some airline, soon so will your surgery. Trouble is, things can go wrong in surgery and I’m not so sure STAR is up to the challenge. Then again, if something goes wrong you can always sue the manufacturer.

Honestly, I think I’d rather go with lions and tigers and bears. At least they’re real. And their babies are cute.

Here’s where it gets seriously creepy

I suspect most people have an image of AI and tech based on the movies, where our hero or heroine leaps in to land the plane or do that tracheotomy when evil terrorists have taken over; but those are, by definition, fiction: scripted, directed, edited. No computer ever crashes on any of those shows – people are being chased by clever villains (who love classical music) but every time they manage to download the vital clue in ten seconds flat and get away. No memory stick ever screws up; nobody ever has an issue finding the right port and naturally no software ever needs rebooting: everything from the GPS on up or down works perfectly. Then there’s the rest of us, who are simply trying to order a book online and have spent the last half hour retyping our home address, only to see that irritating red pop-up telling us our order can’t go through because our information is incomplete.

Inevitably, at times things get seriously creepy. Pornographic androids, and now this female android called Sophia who’s been given Saudi citizenship – I kid you not. I bet you anything this female, now Saudi, android has rights that real women in that Wahhabi country don’t have.

Then again, what can one expect from the desert kingdom where the crown prince throws anybody and everybody in jail at whim and plans some robot-run economic free zone for the future?  Maybe Sophie can work there.

Fake news?

Moving on – and keeping to the theme of technology taking over – androids aside, the so-called health news that social media zips our way, for our eyes only, is all too often dead wrong. Not only is it almost inevitably disease oriented (which most minor problems are not) but its advice can be downright dangerous. I see catchy ads asking me to click on some supplement that will make me smarter, fitter, thinner and no doubt taller. Needless to say I don’t click, but somebody does, otherwise they’d stop posting those ads. Like spam: if nobody ever replied they’d stop doing it.

Research in general has increased exponentially over the last decades, to an extent that’s difficult to conceptualize. What we used to call the Index Medicus indexed tens, eventually hundreds, of thousands of papers in the 20th century; now the numbers are too large to keep track of (and getting larger all the time). And the sad thing is that most of us don’t appreciate just how much the error rate goes up as the numbers increase.

My friend Frogheart, aka Maryse de la Giroday, whose gem of a blog contains everything nano you’d ever want to know, passed along this piece to me; it’s about a single line of breast cancer cells used in research over the last four decades that actually aren’t breast cancer cells at all but melanoma.

The mind boggles at just how much research might have been based on this faulty methodology/thinking and just how many patients and clinicians have been led astray. Not least because breast cancer is such an emotive topic.

Over the years I’ve read many, many research papers and articles; some I’ve been asked to review, others I’ve written about, still others I’ve read simply out of interest (or fury). There’s a lot of bad research out there, people. Fake news isn’t just about politics; it’s also about life, health and everything else. So please use caution when you read or hear that “scientists” have found X or Y may help with this or that, and don’t add supplements or alter your medications based on faith in science. Science, like pretty much everything else human beings engage in, is not only fallible but subject to the same human foibles as anything else: money, position, power, jealousy, idiocy … As the geneticist Lewontin once said, scientists come from the same world as the rest of us.

A world that these days is probably binary, digital – and not that bright.



Random Thoughts and Staircase spirits

Time, said Auden, will say nothing but I told you so. Time also gives one the opportunity to brood – darkly – on so many of the idiocies out there in the ever-expanding world of health information. So here, in no particular order, is what’s been making me especially cranky:

Monster under the bed roams city streets  

Diabetes, the latest health scourge to hit the news, is now a City of Vancouver problem, at least according to a headline in a throw-away newspaper I threw away:

“Vancouver to track and attack diabetes”. With what, one idly wonders. Bicycle spokes dropped on those bicycle lanes? Pointed sticks? Stern warnings? Nothing so mundane, it turns out. This, apparently, is part of some international initiative (a word that sets my teeth on edge), and crème de la crème cities like Houston, Mexico City, Copenhagen, Shanghai and Tianjin (where?) are on board, tracking “people at risk of diabetes” as part of a campaign to promote “healthier cities”. Curiouser and curiouser. Who knew cities were sentient and could get sick.

So the plan is – what? Skulk behind anyone leaving Starbucks with a large, frothy coffee? Tap anyone who seems a bit plump on the shoulder and read them the health riot act? (Honestly officer, it’s this outfit. Makes me look fat.)

Someone with the unlikely title of managing director of social policy at, one assumes, the City of Vancouver will start “consultations” with Vancouver Coastal Health and – wait for it – Novo Nordisk, the sponsor of this demented plan.

Of course. Silly us, not to have realized a drug company had to be involved.

Must be diabetes lurking back there in them there bushes….


Novo Nordisk, a nominally Danish but probably multinational drug company, almost exclusively manufactures diabetes drugs (oral hypoglycemics) as well as some types of insulin. (The old insulin, by the way, the non-patentable kind that came from animal pancreases and was easily tolerated, isn’t around any more, at least on this continent. Banting, bless him, donated his discovery to the people of the world; he didn’t believe anyone should benefit financially from diabetes. Unfortunately he had no way of knowing that by the late 20th century pretty much anything could be “property”: manufactured and sold, up to and including a person’s genome.)

This diabetes sneak attack has already started in Houston, where they “mapped” various areas (for what, one wonders) and went door to door to “educate” people about diabetes. These hapless people will be told to get their fasting glucose and A1C* checked, and if their numbers don’t match some ideal level no doubt they’ll need some of Novo Nordisk’s boffo drugs. (This class of drugs, by the bye, doesn’t tend to have a long shelf life, as they’re usually fairly toxic to the liver and quite a few of them have come and gone.) And down the rabbit hole they will go. We will all go.

These days, after all, it has nothing to do with the actual human being who may be in there somewhere but with the numbers. (There’s an American drug ad that doesn’t even pretend it’s about anything but “bringing your numbers down”.) I suppose racial profiling could play a part as well, given that, statistically, people of South Asian, Hispanic, Asian and First Nations background may be at greater “risk” – whatever that means.

What few people realize is that this ostensible epidemic of type 2 diabetes sweeping the world has much to do with the continual lowering of inclusion criteria. A few decades ago “normal” glucose levels were around ten; now they’re about half that. For people over 50 that lower threshold is especially problematic, as close to half of us, as we age, tend to have somewhat higher levels of glucose; and if you think about it, it simply makes no sense that a physiologic change affecting close to half the population in a particular demographic is a pathology. It’s what’s called, um, normal.

As for me, if anybody tries to corner me and talk to me about my diabetes risk, I plan to run shrieking into oncoming traffic. At least that’s a risk that makes sense.

Fight them on the Beaches

In that previous story what initially struck me was the term “attack”. As though a glucose level that could potentially be problematic were some kind of enemy – not a fluctuating number that moves up and down with the time of day and a myriad of other factors, from weight to diet to sleep.

Physiology is dynamic, not that you’d ever know it these days given how mesmerized we are with the numbers.

Oliver Sacks, RIP

Someone who understood the complexities of physiology – and stood up for clinical knowledge and patient narratives – was Oliver Sacks, who died last August.

Physician, author, eccentric and host of oddball characteristics, Sacks wrote some amazing books (Migraine, The Man Who Mistook His Wife for a Hat, An Anthropologist on Mars and A Leg to Stand On are some of the ones I enjoyed reading). Most important, his writing reminded us of the diversity and variation there is between us, not simply the similarities that clinical trials, statistical averages and guidelines exploit. Sick or well we’re all different and, to paraphrase Hippocrates and Osler and other famous sorts, medically the person with the disease matters as much as the disease. Or ought to. Alas, the trajectory of modern medicine, whether it’s so-called preventive care, apps or genetics, has a tendency to iron out those differences and push us towards some mythical average or “normal” that few of us come close to.

Colourful, thoughtful clinicians like Sacks have become vanishingly rare. Perhaps it was Sacks’s own differences – Jewish, gay, former biker and user of psychoactive drugs, gefilte fish aficionado – that made him realize just how much one’s personal history and narrative play into one’s physiology. Or just how vital it is for clinicians to listen as well as talk.

Dem bones, dem bones

L’esprit de l’escalier is a French phrase referring to all the pithy remarks one ought to have made but which only come to mind some hours later. Usually as one’s interlocutor is long gone.

So, to the pleasant woman who came up to me after my CAIS (Canadian Association of Independent Scholars) talk last year to ask about vitamin supplements, more specifically calcium: what I omitted to mention was that calcium is not a vitamin, it’s a mineral. An element, if one wants to be pedantic – Ca, atomic number 20 on the periodic table. Hence the “elemental calcium” you can buy in the drug store.

The idea that we all need to take calcium supplements for our bones rests on a somewhat simplistic notion, namely that simply ingesting this mineral will somehow magically increase the bone density we are told we are losing at an alarming rate, especially if we are women over 50. Clever advertising ably preys on our fears of “weak” bones, metaphors being what they are.

Bone is an amazing substance. It is dynamic: some cells demineralize and degrade the collagen even as other cells, in sync, remineralize it. It ebbs and flows (how else could a broken bone heal?) to achieve a balance; a balance that alters with age. When we are young and growing, bone builds to its apex in our twenties. It then plateaus for a time; then, as we pass age 35 or thereabouts, we gradually lose bone density. This is what we used to recognize as normal development. And the bone in your body differs in form, hardness and elasticity depending on where it is and what it does – the vertebrae in your spine and the long bones in your limbs are of a different consistency, and respond to changes in pressure differently, than the ribs or the wrist.

The calcium/vitamin D directive has become so ingrained, however, that most people believe supplementation is somehow maintaining or feeding their bones.

But our endocrine system monitors the blood level of calcium and maintains it at our personal set point – one that is different for each person. This means that taking in more calcium is generally pointless, as it simply cannot be absorbed. To quote Nortin Hadler, an MD, in his book The Last Well Person: “If the blood calcium level trends down, vitamin D is converted to an active metabolite, which makes the intestinal absorption of calcium more efficient and vice versa”. More is not better; it’s useless. And potentially harmful, as calcium can deposit in joints and other bits. As for vitamin D, it too has a set point that differs in each person; overly large doses can build up and become toxic. So those generic amounts you’re advised to take may or may not apply to you. Probably don’t, in fact.

We tend to think of the supplements we take as a kind of top-up to diet, like adding oil to a car or salt to soup. Our bones rely on calcium, and as we get older our bones become less dense, so we assume we should “supplement”. It’s a mechanistic way of thinking about the body, one that took off after the Industrial Revolution when an “engineering mentality” took hold about physiology (in anthropologist Margaret Lock’s term). It certainly doesn’t hurt that the nice people at Bayer (who are taking over the world and now sell everything from vitamins to glucose meters) continually tell us we should. Alas, physiology is rarely so cut and dried, and our understanding of how bone (or anything else) works remains primitive.

The real advantage of dietary calcium comes when we are young and our bones are developing, in our teens. Unfortunately, short of building a time machine there’s not much we can do to change the bone mass we accrued before our twenties.

So for now the basics of health remain the same as they were in decades past. Relax, eat well, exercise and stop stressing out about supplements. Most important: stop listening to all that bogus advice out there. If all we do is obsess about our health, our diets, our bodies – well, we won’t actually live any longer but it sure will seem that way.


*A1C (glycated hemoglobin) is a measure of how much glucose has attached to the hemoglobin in red blood cells; it is said to provide a “snapshot” of your glucose levels over the previous three months. It’s rather elegant but is still a correlation. A good one to be sure, but correlation is not, as we all know, causation.





Civil Scientific Discourse RIP

It’s no secret that I am not fond of hot weather in general and summer in particular. Making me especially cranky at the moment is the hyperbole surrounding the science/non-science discourse, e.g., around childhood diseases like chicken pox or measles, mumps and rubella (the three dire conditions the MMR vaccine is supposed to prevent). The crux appears to be that either you’re one of those unscientific, Jenny McCarthy-quoting loons who believes vaccines cause autism – or you’re a normal, nice, sane person who believes in science. Paradoxically, science appears to have gained the status of a deity in this discourse.


No need to get hysterical about skepticism, Hume might say.

Case in point, a headline last year: “Shun anti-vaccine talk, SFU urged”. Some anti-vaccine conference was going to take place at an SFU campus and a group of critics were hopping mad lest the event “lend credibility” to this “dangerous quackery”. This, er, quackery was a symposium, organized by something calling itself the “Vaccine Resistance Movement”, on how “families are facing increasingly intense pressure from the vaccine lobby and big government to comply with vaccine mandates”. Hardly saving the free world from tyranny but hey, the resistance carries on, large as life and flakier than thou.

The 18th-century philosopher David Hume, the granddaddy of skepticism, would no doubt be turning in his grave at this hysterical, humourless assault.

BC’s Chief Medical Officer replied in his usual vein: “Vaccines, like any medicine, can have side effects, but the benefits … outweigh the risks.” Which is true. But in the abstract one can wonder whether suppressing all childhood diseases may have immune consequences – especially given the trend towards vaccinating against diseases “such as chicken pox which cause only inconvenience rather than danger”, in the words of British sociologist and science and technology writer Trevor Pinch (in Dr. Golem: How to Think About Medicine by Harry Collins and Trevor Pinch, University of Chicago Press, 2005), and especially given the sheer number of jabs (approximately 20, I think) that infants now get.

SFU president Andrew Petter apparently refused to cancel anything, merely saying universities stand for freedom of expression, and, as far as I know, the conference went ahead. I have no idea what was discussed but I suspect it was a lot of nonsense. That’s not the point. What’s perturbing is the vitriol of the protesting group and the smug suggestion that if one dares to question the “science”, or wonder out loud whether these vaccines might, just might, have adverse immune or other effects, one has no right to speak. Either you toe the party line or you’re a crazy person – one who should be run out of town on a rail, to coin a phrase. (I’ve never been sure why being run out on a rail – which to me implies a train – would be such a bad thing. Personally I am mega fond of trains.)

The photo of the conference protestor indicates that the protesting group (“The Centre for Inquiry”) is just as obscure as the one it’s protesting. Maybe the whole thing was a publicity stunt or performance art; who knows.

Any child not vaccinated against measles should not be allowed in school, someone said to me firmly last month. Measles can cause deafness and blindness, not to mention encephalitis, someone else said. I mildly agreed, merely pointing out that the numbers on these dire effects in the developed world are actually vanishingly small, at least based on the (admittedly limited) research I had done. Buried in the contradictory numbers, though, one small group of children was clearly at risk from measles: children undergoing cancer treatment.

Years ago, when I wrote a book on the immune system, I did a bit of desultory research on measles; there was some evidence that a natural bout of measles appears to reduce the incidence of allergies and asthma in later life. (Operative word: appears. The data was correlational and based on medical records; there is no way to know for sure if this was cause and effect. Bear in mind, though, that many health recommendations, e.g., lowering cholesterol, are also based on correlation.)

Immunologically, measles might have a modulating effect, in a way allowing the immune system to become less inappropriately reactive and reducing the incidence of asthma, allergies and other autoimmune conditions. Perhaps this struck a chord with me because in my own case a natural bout of German measles (rubella) cleared the bad eczema (also an autoimmune overreaction) I had suffered since I was two or three. Large, itchy welts covering my legs, arms and face, especially knees and elbows. Then poof, I get sick when I am nine or thereabouts, high fever and whatnot, and my eczema essentially clears. I still occasionally get eczema, usually in reaction to an allergen (like aloe). But, by and large, I’m fine. The research I did years later gave me a context for that (better than my grandmother’s “well, the high fever burned it off”, which made the eczema sound like a forest fire – though, come to think of it, that’s not the worst description).

But when I wondered out loud some weeks ago if maybe, maybe, overzealous vaccination programs could have anything to do with the increase in peanut allergies, you’d have thought I had suggested a plot for Criminal Minds. It was speculation, people. I’m not the vaccine police.

I’m not sure quite how this binary, myopic perspective evolved and became so engrained, but it seems now that any questioning of standard medical dogma (“sugar is bad”) ends up as some version of t’is/t’isn’t, t’is/t’is NOT: all the subtle dynamics of a nursery school. Either you’re a feeble-minded dweeb who fell for the fraudulent, discredited Wakefield Lancet article linking vaccines with autism (actually with GI problems, not autism, but that’s lost in the mists of rhetoric) – or a sensible, right-thinking person who believes in science, good government and iPhones. (As it happens I now have a Blackberry Z10, which I think is far, far superior. Were we to pause for a commercial break.)

Science is a method. Science is fluid; it moves forward by asking questions and trying to find empirical evidence to test them. It is not dogmatic or static. It’s not perfect, but at this point it’s the best we’ve got. Then again, if you’re going to turn science into a religion it will end up that way.

Pity, since scientific inquiry was, to a large extent, what dragged us out of the Dark Ages.



Lyme Lies – Ticks me Off

Each season has its own medical threats, or so they tell us, so by rights I should be warning you about the flu – but I’ve already done that. Or I could warn you about carnivorous Christmas trees (sorry, old joke c/o the late Chuck Davis, who mocked a pamphlet referring to “deciduous and carnivorous trees”), but I promised you Lyme Disease and Lyme Disease it shall be. As it happens, to my way of thinking, Lyme and flu may well share an immunologic link: just as the flu virus is spoken of as though it were a rampaging army, with Lyme Disease it is the original tick bite that has gained iconic status – differences (biological, physiological, genetic) between people ironed out in the search for easy answers and someone to blame.


Lyme Disease, for anyone raised by wolves who’s missed the thousands of news items over the last 40 years, is a tick-borne disease that tends to cluster in areas such as New England where there are deer, the ticks’ principal hosts. Named after the town in Connecticut where it is said to have originated, Lyme has garnered increasing attention as some patients seem to develop vague but debilitating symptoms, usually years after the original infection; symptoms that experts tend to dismiss as psychosomatic and unrelated to Lyme (even as conspiracy theorists maintain these medical denials are a plot and There Be Skullduggery afoot). Maybe aliens are involved, who knows.


(I use the term “disease” here, by the bye, with some disquiet; there seems to be much overlap in descriptions and discussions of Lyme between disease and illness – illness usually being defined as the patient’s subjective experience, versus the more objective signs that are classified as disease.)


It all begins with a bull’s eye – usually, maybe, sometimes


Ticks, said Aristotle, clearly not a fan, are “disgusting and parasitic”. Ugly too. These tiny thumbtack creatures survive by boring into a host organism such as a mouse, deer or human and – à la Twilight – sucking its blood. They’re vampires, in other words. Once the tick has sunk its, er, fangs, some patients develop a rash resembling a target or bull’s eye and a bacterial infection that may or may not have symptoms. This, it is said, results from the tick passing on a rare type of bacterium called a spirochete, known as Borrelia burgdorferi (after Willy Burgdorfer, who painstakingly identified the spirochete in a tick in the early 1980s). A spirochete looks a bit like a coiled telephone cord, hence its name. I will not bore you with the intricacies of the different types of tick, or with the link to another disease, babesiosis, a malaria-like illness also found in New England, though I could. Believe it or not, parasitology is actually quite fascinating.


The problem, at least from a purely scientific perspective, is that the spirochete hypothesis came after the realization that, in most cases, Lyme Disease responded quickly and well to antibiotics. This led researchers to work backwards to find the culprit bacterium. In other words, as physician Robert Aronowitz writes in Making Sense of Illness (Cambridge University Press, 1998), “To say that the discovery of the Lyme spirochete led to rational treatment is to put the cart before the horse [and] owes more [to] the idealization of the relationship between basic science and therapeutics than to the actual chronology of investigation.” It is, Aronowitz suggests, more like a Texas bull’s-eye: you shoot the gun, then draw the bull’s eye around the bullet hole.


This is especially problematic since early antibiotic treatment means that any trace of the bacteria is usually wiped out, so their existence is more in the abstract than anything else.


If you’re a disease, at least be new, modern and famous


Nevertheless, the narrative that’s evolved around Lyme Disease is as follows – this quote from the recent New Yorker article that sparked my curmudgeonly instincts: “Lyme Disease was all but unknown until 1977 when Allen Steere, a Yale rheumatologist, produced the first definitive account of the infection.” Just one problem. It ain’t necessarily so.


If we want to nitpick (and you know I do), a disease called ECM (erythema chronicum migrans), uncannily similar to Lyme Disease, appears in European medical texts as far back as the late 19th century. Also characterized by a bull’s-eye rash (called erythema migrans, wouldn’t you know), ECM in some people also appeared to result in flu-like symptoms. It was never definitively demonstrated whether the cause was a tick (ticks are also common in northern Europe) or a virus, and since the majority of cases were mild and self-limiting, nobody paid that much attention.


Plus, ECM was described by a lowly branch of medicine, dermatology (think Lars, Phyllis’s husband on the Mary Tyler Moore Show, if you can remember that far back). Lyme Disease, though, was identified by the exalted ranks of a specialty with more nous, rheumatology, and then championed by a group of angry, well-off mothers in New England who were furious that their children seemed to be coming down with some kind of disease nobody knew much about; a disease, moreover, that seemed to mimic rheumatoid arthritis. Since the focus was children, the media immediately jumped on board (and the ringleader-mother, Polly Murray, appears to have been adept at channeling their interest). There may have been joint pain in the European ECM patients, but those patients were all adults, in whom joint pain may well have been considered more or less normal.


But in New England, well, there you had a veritable PR maelstrom: children being bitten by these vampiric arachnids (ticks, despite appearances, are not insects); distraught mothers and heroic scientists swooping in to figure out what this strange, dire new disease could be.


Why does this matter? It matters because new things, new diseases, are always more terrifying than old, known ones. Just as we all relax when we find out the potentially lethal symptoms keeping us up at night are actually shared by three quarters of the people in our office and are just what’s “going around”. But a new disease? Affecting children? With bizarre symptoms? That’s scary. And whenever a disease is described in the media, the incidence of that disease increases.


In the case of Lyme Disease, that interest hasn’t waned, with the end point always the same: a plea for more good science (not that bad kind of science people usually like to do).


Guidelines uber alles


The Infectious Diseases Society of America guidelines maintain that Lyme Disease is usually easy to treat and cure. A few weeks of antibiotics does the trick in most cases and relapses are rare. Patients and advocates, as well as some rather strange conspiracy sorts, disagree, and here’s where we run into one of my pet peeves: that objective/subjective, disease/illness demarcation that shouldn’t be a problem but all too often is.


Patients and their families and friends, at least in the fairly small number of Lyme sufferers who develop lingering and mysterious symptoms (ranging from unpleasant but benign ones like headache and insomnia to the weird and wonderful: “joints on fire”, “brain wrapped in a dense fog”), feel that the medical community has deserted them and is ignoring their very real pain, the very real fact that their lives have been horribly affected. As with chronic pain and other conditions that simply defy our reductionist explanations, the rhetoric descends into an either/or proposition. Either the disease exists as explained by the guidelines, or it does not. Either the tick bite leads to dreadful long-term symptoms in everyone – or it does not. Nothing in between.


Which is clearly nonsense.


Terms like “idiopathic” (of unknown origin) or “post” (post-viral, post-traumatic) have been coined to describe these symptoms, these patients, mostly because we simply don’t know what to do with them. And by “we” I mean everyone. Society. The culture at large. (I wrote about our issues around chronic pain in an earlier post.)



The biomedical model simply cannot explain the complexities of human experience, human disease, illness. Not only are there vast differences between individuals in their physical and physiological selves, there are social, cultural, dietary and a myriad of other differences. It is simply not feasible to “fix” every underlying “cause” to get rid of a “disease”. Even an infectious disease that we know is caused by a virus or bacterium does not affect everyone exposed to it. Necessary but not sufficient is the phrase. The TB (or any other) bacterium is necessary for TB but not sufficient. Other factors must be present.


So why is it so difficult to believe that in some people that tick bite, with or without the bull’s eye rash, might lead to long-term problems; problems amplified by the individual who also believes there is a problem that needs fixing and whose stress levels rise as a result? After all, if they feel so lousy it must be something terrible – cancer, maybe.


We believe in the magic of medicine, so when it fails us we are hurt, angry, disappointed. This explains why Lyme (or chronic fatigue etc.) activists so often sound like such loony tunes. Even as they decry the evils of the medical establishment they search for legitimacy from it, absolution, confirmation that what they are feeling is “real”. (Which will also translate into other institutions recognizing said condition, which then has other consequences, like disability benefits.) True, there is the odd hypochondriac, Munchausen’s, factitious patient. But there are also people who suffer from pains and disabilities that medicine cannot explain – and abandons, using the term “psychosomatic” like a cudgel. So what if it’s psychosomatic? All psychosomatic means is that the illness or symptoms originate in the mind, not the body (at least insofar as we can tell – our imaging and tests and so on not being exactly infallible). Who cares where the problem originates when people need help? Isn’t medicine about exactly that, doing no harm, helping people feel better, function better? It seems logical that some people have the type of immune system that reacts, over time, to some kind of toxic insult, tick-related or otherwise. These are the folks who develop rheumatic and other symptoms over time, the ones medicine refuses to countenance.


What I do not understand is why.  Why does not having a diagnosis, a label, mean you have to deny people even have a problem?  (Some Hon. Members: Shame, Shame.)






* they meant coniferous