Category Archives: Cranky Comments

Random Thoughts and Staircase Spirits

Time, said Auden, will say nothing but I told you so. Time also gives one the opportunity to brood – darkly – on so many of the idiocies out there in the ever-expanding world of health information. So here, in no particular order, is what’s been making me especially cranky:

Monster under the bed roams city streets  

Diabetes, the latest health scourge to hit the news, is now a City of Vancouver problem, at least according to a headline in a throw-away newspaper I threw away: “Vancouver to track and attack diabetes”. With what, one idly wonders. Bicycle spokes dropped on those bicycle lanes? Pointed sticks? Stern warnings? Nothing so mundane, it turns out. This, apparently, is part of some international initiative (a word that sets my teeth on edge) and crème de la crème cities like Houston, Mexico City, Copenhagen, Shanghai and Tianjin (where?) are on board, tracking “people at risk of diabetes” as part of a campaign to promote “healthier cities”. Curiouser and curiouser. Who knew cities were sentient and could get sick.

So the plan is – what? Skulk behind anyone leaving Starbucks with a large, frothy coffee? Tap anyone who seems a bit plump on the shoulder and read them the health riot act? (Honestly officer, it’s this outfit. Makes me look fat.)

Someone with the unlikely title of managing director of social policy at, one assumes, the City of Vancouver  will start “consultations” with Vancouver Coastal Health and – wait for it – Novo Nordisk, the sponsor of this demented plan.

Of course. Silly us, not to have realized a drug company had to be involved.

Must be diabetes lurking back there in them there bushes….

Novo Nordisk, a nominally Danish but effectively multinational drug company, almost exclusively manufactures diabetes drugs (oral hypoglycemics) as well as some types of insulin. (The old insulin, by the way – the non-patentable kind that came from animal pancreases and was easily tolerated – isn’t around any more, at least on this continent. Banting, bless him, donated his discovery to the people of the world; he didn’t believe anyone should benefit financially from diabetes. Unfortunately he had no way of knowing that by the late 20th century pretty much anything could be “property”: manufactured and sold, up to and including a person’s genome.)

This diabetes sneak attack has already started in Houston, where they “mapped” various areas (for what, one wonders) and went door to door to “educate” people about diabetes. Then, if their numbers don’t match some ideal level, no doubt they’ll need some of Novo Nordisk’s boffo drugs. (This class of drugs, by the bye, doesn’t tend to have a long shelf life, as they’re usually fairly toxic to the liver and quite a few of them have come and gone.) These hapless people will be told to get their fasting glucose and A1C* checked and down the rabbit hole they will go. We will all go.

These days, after all, it has nothing to do with the actual human being who may be in there somewhere; it’s all about the numbers. (There’s an American drug ad that doesn’t even pretend it’s about anything but “bringing your numbers down”.) I suppose racial profiling could play a part as well, given that, statistically, people of South Asian, Hispanic, Asian and First Nations background may be at greater “risk” – whatever that means.

What few people realize is that this ostensible epidemic of type 2 diabetes sweeping the world has much to do with the continual lowering of inclusion criteria. A few decades ago “normal” glucose levels were around ten (mmol/L); now they’re about half that. For people over 50 the latter number is especially problematic, as close to half of us, as we age, tend to have somewhat higher levels of glucose. And if you think about it, it simply makes no sense that a physiologic change affecting close to half the population in a particular demographic is a pathology. It’s what’s called, um, normal.
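
To make the cutoff point concrete, here’s a minimal sketch (the distribution and the thresholds are invented for illustration, not real epidemiology) of how simply lowering a diagnostic cutoff mints new “patients” without anyone’s physiology changing:

```python
# Illustrative only: invented numbers, not real epidemiology.
# The point: lower the cutoff and you manufacture "abnormal" people.
import random

random.seed(1)
# Pretend fasting glucose (mmol/L) in some older demographic is roughly
# normal around 5.8 with a spread of 1.0 -- an assumption, not data.
population = [random.gauss(5.8, 1.0) for _ in range(100_000)]

for cutoff in (10.0, 7.8, 7.0, 5.6):  # successively lower "abnormal" thresholds
    flagged = sum(g >= cutoff for g in population)
    print(f"cutoff {cutoff:4.1f} mmol/L -> {100 * flagged / len(population):4.1f}% flagged")
```

Same people, same blood; only the line moved.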

As for me, if anybody tries to corner me and talk to me about my diabetes risk, I plan to run shrieking into oncoming traffic. At least that’s a risk that makes sense.

Fight them on the Beaches

In that previous story what initially struck me was the term “attack”. As though a glucose level that could potentially be problematic were some kind of enemy – not a fluctuating number based on myriad factors ranging from weight to diet to sleep, one that moves up and down depending on the time of day and a host of other things.

Physiology is dynamic, not that you’d ever know it these days given how mesmerized we are with the numbers.

Oliver Sacks, RIP

Someone who understood the complexities of physiology – and stood up for clinical knowledge and patient narratives – was Oliver Sacks, who died last August.

Physician, author, eccentric and possessor of a host of oddball characteristics, Sacks wrote some amazing books (Migraine, The Man Who Mistook His Wife for a Hat, An Anthropologist on Mars and A Leg to Stand On are among the ones I enjoyed). Most important, his writing reminded us of the diversity and variation between us, not simply the similarities that clinical trials, statistical averages and guidelines exploit. Sick or well, we’re all different and, to paraphrase Hippocrates and Osler and other famous sorts, medically the person with the disease matters as much as the disease. Or ought to. Alas, the trajectory of modern medicine, whether it’s so-called preventive care, apps or genetics, has a tendency to iron out those differences and push us towards some mythical average or “normal” that few of us come close to.

Colourful, thoughtful clinicians like Sacks have become vanishingly rare. Perhaps it was Sacks’s own differences – Jewish, gay, former biker and user of psychoactive drugs, gefilte fish aficionado – that made him realize just how much one’s personal history and narrative play into one’s physiology. Or just how vital it is for clinicians to listen as well as talk.

Dem bones, dem bones

L’esprit de l’escalier is a French phrase referring to all the pithy remarks one ought to have made but which only come to mind some hours later. Usually as one’s interlocutor is long gone.

So, to the pleasant woman who came up to me after my CAIS (Canadian Association of Independent Scholars) talk last year to ask about vitamin supplements, more specifically calcium, what I omitted to mention was that calcium is not a vitamin; it’s a mineral. An element, if one wants to be pedantic: Ca, number 20 on the periodic table. Hence the “elemental calcium” you can buy in the drug store.

The notion that we all need to take calcium supplements for our bones rests on a somewhat simplistic premise, namely that simply ingesting this mineral will somehow magically increase bone density, which we are told we are losing at an alarming rate, especially if we are women over 50. Clever advertising ably preys on our fears of “weak” bones, metaphors being what they are.

Bone is an amazing substance. It is dynamic – the collagen demineralizes and then degrades even as other cells, in sync, remineralize the collagen that has just diminished, for want of a better word. It ebbs and flows (how else could a broken bone heal?) to achieve a balance; a balance that alters with age. When we are young and growing, bone builds to its apex, in our twenties. It then plateaus for a time; then, as we pass age 35 or thereabouts, we gradually lose bone density. This is what we used to recognize as normal development. And the bone in your body differs in form, hardness and elasticity depending on where it is and what it does – the vertebrae in your spine and the long bones in your limbs are of a different consistency, and respond to changes in pressure differently, than the ribs or the wrist.

The calcium/vitamin D directive has become so ingrained, however, that most people believe what they are doing is somehow maintaining or feeding their bones with supplementation.

But our endocrine system monitors the blood level of calcium and maintains it at our personal set point, one that is different for each person. This means that taking in more calcium is generally pointless, as it simply cannot be absorbed. To quote Nortin Hadler, an MD, in his book The Last Well Person: “If the blood calcium level trends down, vitamin D is converted to an active metabolite, which makes the intestinal absorption of calcium more efficient and vice versa”. More is not better; it’s useless. And potentially harmful, as calcium can deposit in joints and other bits. As for vitamin D, it too has a set point that differs in each person; overly large doses can build up and become toxic. So those generic amounts you’re advised to take may or may not apply to you. Probably don’t, in fact.
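
Since the set point is easier to see than to describe, here is a toy feedback loop in code (every number invented; a cartoon of the physiology at best) showing why pouring more of a regulated substance into the system barely moves its steady-state level:

```python
# A toy negative-feedback loop, in the spirit of the Hadler quote above.
# All numbers are invented for illustration; this is not physiology.
def steady_state(intake, setpoint=2.4, steps=3000):
    level, efficiency = setpoint, 0.3                 # blood level; fraction absorbed
    for _ in range(steps):
        level += intake * efficiency - 0.2 * level    # uptake minus losses
        efficiency -= 0.01 * (level - setpoint)       # "endocrine" adjustment
        efficiency = max(0.05, min(0.9, efficiency))  # absorption has limits
    return level

for intake in (1, 2, 4):  # arbitrary units: more and more supplementation
    print(f"intake {intake} -> blood level settles near {steady_state(intake):.2f}")
```

Quadruple the intake and the controller simply absorbs a smaller fraction; the level stays put. Which is, more or less, Hadler’s point.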

We tend to think of the supplements we take as a kind of top-up to diet, like adding oil to a car or salt to soup. Our bones rely on calcium, so we basically assume that bone density is improved by taking supplemental calcium: since our bones contain calcium, and as we get older our bones become less dense, we should “supplement”. It’s a mechanistic way of thinking about the body, one that took off after the Industrial Revolution when an “engineering mentality” took hold about physiology (in anthropologist Margaret Lock’s term). It certainly doesn’t hurt that the nice people at Bayer (who are taking over the world and now sell everything from vitamins to glucose meters) continually tell us we should. Alas, physiology is rarely so cut and dried, and our understanding of how bone (or anything else) works remains primitive.

The real advantage of dietary calcium comes when we are young and our bones are developing, in our teens. Unfortunately, short of building a time machine, there’s not much we can do to change the bone mass we accrued before our twenties.

So for now the basics of health remain the same as they were in decades past. Relax, eat well, exercise and stop stressing out about supplements. Most important: stop listening to all that bogus advice out there. If all we do is obsess about our health, our diets, our bodies – well, we won’t actually live any longer but it sure will seem that way.

*A1C (glycated hemoglobin) measures the proportion of hemoglobin in your red blood cells that has glucose stuck to it; it’s said to provide a “snapshot” of your glucose levels over the previous three months, roughly the lifespan of a red blood cell. It’s rather elegant but it’s still a correlation. A good one, to be sure, but correlation is not, as we all know, causation.
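
For the numerically inclined, the way an A1C number gets translated into an “average glucose” is itself just a regression line, the one from the ADAG study (Nathan et al., Diabetes Care, 2008), sketched below; the formula is standard, but as noted it rests on a correlation, not a direct measurement:

```python
# The standard ADAG regression mapping A1C (%) to estimated average
# glucose -- a population-level correlation, not an individual measurement.
def estimated_average_glucose(a1c_percent: float) -> float:
    """Return estimated average glucose in mmol/L for a given A1C (%)."""
    return 1.59 * a1c_percent - 2.59

for a1c in (5.0, 6.0, 7.0):
    print(f"A1C {a1c}% ~ {estimated_average_glucose(a1c):.1f} mmol/L average glucose")
```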

Civil Scientific Discourse RIP

It’s no secret that I am not fond of hot weather in general and summer in particular. Making me especially cranky at the moment is the hyperbole surrounding the science/non-science discourse, e.g., around childhood diseases like chicken pox or measles, mumps and rubella (the three dire conditions the MMR vaccine is supposed to prevent). The crux appears to be that you’re either one of those unscientific, Jenny McCarthy-quoting loons who believes vaccines cause autism – or you’re a normal, nice, sane person who believes in science. Paradoxically, science appears to have gained the status of a deity in this discourse.

No need to get hysterical about skepticism, Hume might say.

Case in point, a headline last year: “Shun anti-vaccine talk, SFU urged”. Some anti-vaccine conference was going to take place at an SFU campus and an angry group of critics was hopping mad lest this event “lend credibility” to this “dangerous quackery”. The, er, quackery was a symposium on how “families are facing increasingly intense pressure from the vaccine lobby and big government to comply with vaccine mandates”, organized by something calling itself the “Vaccine Resistance Movement”. Hardly saving the free world from tyranny but hey, the resistance carries on, large as life and flakier than thou.

The 18th century philosopher David Hume, the granddaddy of skepticism, would no doubt be turning in his grave at this hysterical, humourless assault.

BC’s Chief Medical Officer replied in his usual vein: “Vaccines, like any medicine, can have side effects, but the benefits … outweigh the risks.” Which is true. But in the abstract one can wonder whether suppressing all childhood diseases may have immune consequences – especially the trend towards vaccinating against diseases “such as chicken pox which cause only inconvenience rather than danger”, in the words of British sociologist and science and technology writer Trevor Pinch (in Dr. Golem: How to Think About Medicine, by Harry Collins and Trevor Pinch, University of Chicago Press, 2005), and especially given the sheer number of jabs (approximately 20, I think) that infants now get.

SFU president Andrew Petter apparently refused to cancel anything, merely saying universities stand for freedom of expression and, as far as I know, the conference went ahead. I have no idea what was discussed but I suspect it was a lot of nonsense. That’s not the point. What’s perturbing is the vitriol of the protesting group and the smug suggestion that if one dares to question the “science” or wonder out loud whether these vaccines might, just might, have adverse immune or other effects, one has no right to speak. Either you toe the party line or you’re a crazy person – one who should be run out of town on a rail, to coin a phrase. (I’ve never been sure why being run out on a rail – which to me implies a train – would be such a bad thing. Personally I am mega fond of trains.)

The photo of the conference protestor indicates that the group (“The Centre for Inquiry”) is just as obscure as the one they’re protesting. Maybe the whole thing was a publicity stunt or performance art, who knows.

Any child not vaccinated against the measles should not be allowed in school, someone firmly said to me last month. Measles can cause deafness and blindness, not to mention encephalitis, someone else said. I mildly agreed, merely pointing out that the numbers on these dire effects in the developed world are actually vanishingly small, at least based on the (admittedly limited) research I had done. Buried in the contradictory numbers, one small group of children was clearly at risk from measles, namely children undergoing cancer treatment.

Years ago, when I wrote a book on the immune system, I did a bit of desultory research on measles; there was some evidence that a natural bout of measles appears to reduce the incidence of allergies and asthma in later life. (Operative word appears – the data was correlational and based on medical records; there is no way to know for sure if this was cause and effect. Bearing in mind that many health recommendations, e.g., lowering cholesterol, are based on correlation.)

Immunologically, measles might have a modulating effect, in a way allowing the immune system to become less inappropriately reactive and reducing the incidence of asthma and allergies or other autoimmune conditions. Perhaps this struck a chord with me because in my own case a natural bout of German measles (rubella) cleared the bad eczema (also an autoimmune overreaction) I had suffered since I was two or three. Large, itchy welts covering my legs, arms and face, especially knees and elbows. Then poof, I get sick when I am nine or thereabouts, high fever and whatnot, and my eczema essentially clears. I still occasionally get eczema, usually in reaction to an allergen (like aloe). But, by and large, I’m fine. The research I did years later gave me a context for that (better than my grandmother’s “well, the high fever burned it off”, which made the eczema sound like a forest fire – though, come to think of it, that’s not the worst description).

But when I wondered out loud some weeks ago whether overzealous vaccination programs could have anything to do with the increase in peanut allergies, you’d have thought I had suggested a plot for Criminal Minds. It was speculation, people. I’m not the vaccine police.

I’m not sure quite how this binary, myopic perspective evolved and became so ingrained, but it seems now that any questioning of standard medical dogma (“sugar is bad”) ends up as some version of t’is/t’isn’t, t’is/t’is NOT: all the subtle dynamics of a nursery school. Either you’re a feeble-minded dweeb who fell for the fraudulent, discredited Wakefield Lancet article linking vaccines with autism (actually GI problems, not autism, but that’s lost in the mists of rhetoric) – or a sensible, right-thinking person who believes in science, good government and iPhones. (As it happens I now have a Blackberry Z10, which I think is far, far superior. Were we to pause for a commercial break.)

Science is a method. Science is fluid, moves forward asking questions and trying to find empirical evidence to back them up. It is not dogmatic or static. It’s not perfect but at this point it’s the best we’ve got. But I guess if you’re going to turn science into a religion then it will end up that way.

Pity, since scientific inquiry was, to a large extent, what dragged us out of the Dark Ages.

Lyme Lies – Ticks me Off

Each season has its own medical threats, or so they tell us, so by rights I should be warning you about the flu – but I’ve already done that. Or I could warn you about carnivorous Christmas trees (sorry, old joke c/o the late Chuck Davis, who mocked a pamphlet referring to “deciduous and carnivorous* trees”) but I promised you Lyme Disease and Lyme Disease it shall be. As it happens, to my way of thinking Lyme and flu may well share an immunologic link: as with the flu, where the virus is spoken of as though it were a rampaging army, with Lyme Disease it is the original tick bite that has gained iconic status – differences (biological, physiological, genetic) between people ironed out in the search for easy answers and someone to blame.

Lyme Disease, for anyone raised by wolves who’s missed the thousands of news items over the last 40 years, is a tick-borne disease that tends to cluster in areas such as New England where there are deer, the ticks’ preferred hosts. Named after the town in Connecticut where it is said to have originated, Lyme has garnered increasing attention as some patients seem to develop vague but debilitating symptoms, usually years after the original infection; symptoms that experts tend to dismiss as psychosomatic and unrelated to Lyme (even as conspiracy theorists maintain these medical denials are a plot and There Be Skullduggery afoot). Maybe aliens are involved, who knows.

(I use the term “disease” here, by the bye, with some disquiet; there seems to be much overlap, in descriptions and discussions of Lyme, between disease and illness – illness usually being defined as the patient’s subjective experience, versus the more objective signs that get classified as a disease.)

It all begins with a bull’s eye – usually, maybe, sometimes

Ticks, said Aristotle, clearly not a fan, are “disgusting and parasitic”. Ugly too. These tiny thumbtack creatures survive by boring into a host organism such as a mouse, deer or human and – à la Twilight – sucking its blood. They’re vampires, in other words. Once the tick has sunk its, er, fangs, some patients develop a rash resembling a target or bull’s-eye and a bacterial infection that may or may not have symptoms. This, it is said, results from the tick passing on a rare type of bacterium called a spirochete. Known as Borrelia burgdorferi (after Willy Burgdorfer, who painstakingly identified the spirochete in a tick in the early 1980s), a spirochete looks a bit like a coiled telephone cord, hence its name. I will not bore you with the intricacies of the different types of tick, or the link to another disease, babesiosis, a malaria-like illness also found in New England, though I could. Believe it or not, parasitology is actually quite fascinating.

The problem, at least from a purely scientific perspective, is that the spirochete hypothesis came after the realization that, in most cases, Lyme Disease responded quickly and well to antibiotics. This led researchers to work backwards to find the culprit bacterium. In other words, as physician Robert Aronowitz writes in Making Sense of Illness (Cambridge University Press, 1998), “To say that the discovery of the Lyme spirochete led to rational treatment is to put the cart before the horse [and] owes more to the idealization of the relationship between basic science and therapeutics than to the actual chronology of investigation.” It is, Aronowitz suggests, more like a Texas bull’s-eye: you shoot the gun, then draw the bull’s-eye around the bullet hole.

This is especially problematic since early antibiotic treatment means that any trace of the bacteria is usually wiped out, and their existence is more a matter of inference than anything else.

If you’re a disease, at least be new, modern and famous

Nevertheless, the narrative that’s evolved around Lyme Disease is as follows, this quote from the recent New Yorker article that sparked my curmudgeonly instincts: “Lyme Disease was all but unknown until 1977 when Allen Steere, a Yale rheumatologist, produced the first definitive account of the infection.” Just one problem. It ain’t necessarily so.

If we want to nitpick (and you know I do), a disease called ECM (erythema chronicum migrans), uncannily similar to Lyme Disease, appears in European medical texts as far back as the late 19th century. Also characterized by a bull’s-eye rash (called erythema migrans, wouldn’t you know), ECM in some people also appeared to produce flu-like symptoms. It was never definitively demonstrated whether the cause was a tick (ticks are also common in northern Europe) or a virus, and since the majority of cases were mild and self-limiting, nobody paid that much attention.

Plus, ECM was described by a lowly branch of medicine, dermatology (think Lars, Phyllis’s husband on the Mary Tyler Moore Show, if you can remember that far back). Lyme Disease, though, was identified through the exalted ranks of a specialty with more nous, rheumatology, and then championed by a group of angry, well-off mothers in New England who were furious that their children seemed to be coming down with some kind of disease nobody knew much about; a disease, moreover, that seemed to mimic rheumatoid arthritis. Since the focus was children, the media immediately jumped on board (and the ringleader-mother, Polly Murray, appears to have been adept at channelling their interest). There may have been joint pain in the European ECM patients, but those patients were all adults, in whom joint pain may well have been considered more or less normal.

But in New England, well, there you had a veritable PR maelstrom: children being bitten by these vampiric creatures; distraught mothers and heroic scientists swooping in to figure out what this strange, dire new disease could be.

Why does this matter? It matters because new diseases are always more terrifying than old, known ones. Just as we all relax when we find out the potentially lethal symptoms keeping us up at night are actually shared by three quarters of the people in our office and are just what’s “going around”. But a new disease? Affecting children? With bizarre symptoms? That’s scary. And whenever descriptions of a disease proliferate, the reported incidence of that disease increases.

In the case of Lyme Disease, that interest hasn’t waned, with the end point always the same: a plea for more good science (not that bad kind of science people usually like to do).

Guidelines über alles

The Infectious Diseases Society of America’s guidelines maintain that Lyme Disease is usually easy to treat and cure. A few weeks of antibiotics does the trick in most cases and relapses are rare. Patients and advocates, as well as some rather strange conspiracy sorts, disagree – and here’s where we run into one of my pet peeves, that objective/subjective, disease/illness demarcation that shouldn’t be a problem but all too often is.

Patients and their families and friends – at least in the fairly small number of Lyme sufferers who develop lingering and mysterious symptoms (ranging from unpleasant but benign ones like headache and insomnia to the weird and wonderful: “joints on fire”, “brain wrapped in a dense fog”) – feel that the medical community has deserted them and is ignoring their very real pain, the very real fact that their lives have been horribly affected. As with chronic pain and other conditions that simply defy our reductionist explanations, the rhetoric descends into an either/or proposition. Either the disease exists as explained by the guidelines, or it does not. Either the tick bite leads to dreadful long-term symptoms in everyone – or it does not. Nothing in between.

Which is clearly nonsense.

Terms like “idiopathic” (of unknown origin) or “post” (post-viral, post-traumatic) have been coined to describe these symptoms, these patients, mostly because we simply don’t know what to do with them. And by “we” I mean everyone. Society. The culture at large. (I wrote about our issues around chronic pain in an earlier post.)

The biomedical model simply cannot explain the complexities of human experience, human disease, illness. Not only are there vast differences between individuals in their physical and physiological selves, there are social and cultural and dietary differences and myriad others. It is simply not feasible to “fix” every underlying “cause” to get rid of a “disease”. Even an infectious disease that we know is caused by a virus or bacterium does not affect everyone. Necessary but not sufficient is the phrase: the TB (or any other) bacterium is necessary for TB but not sufficient. Other factors must be present.

So why is it so difficult to believe that in some people that tick bite, with or without the bull’s-eye rash, might lead to long-term problems – problems amplified by the individual, who also believes there is a problem that needs fixing and whose stress levels rise as a result? After all, if they feel so lousy it must be something terrible – cancer, maybe.

We believe in the magic of medicine, so when it fails us we are hurt, angry, disappointed. This explains why Lyme (or chronic fatigue, etc.) activists so often sound like such loony tunes. Even as they decry the evils of the medical establishment they search for legitimacy from it – absolution, confirmation that what they are feeling is “real”. (Which also translates into other institutions recognizing said condition, which then has other consequences, like disability benefits.) True, there is the odd hypochondriac, Munchausen’s or factitious patient. But there are also people who suffer from pains and disabilities that medicine cannot explain – and abandons, using the term “psychosomatic” like a cudgel. So what if it’s psychosomatic? All psychosomatic means is that the illness or symptoms originate in the mind, not the body (at least insofar as we can tell – our imaging and tests and so on not being exactly infallible). Who cares where the problem originates when people need help? Isn’t medicine about exactly that – doing no harm, helping people feel better, function better? It seems logical that some people have the type of immune system that reacts, over time, to some kind of toxic insult, tick-related or otherwise. These are the folks who develop rheumatic and other symptoms over time, the ones that medicine refuses to countenance.

What I do not understand is why. Why does not having a diagnosis, a label, mean you have to deny that people even have a problem? (Some Hon. Members: Shame, Shame.)

* they meant coniferous

Why cats make the worst patients (and the dog ate my homework)

Charlie stopping to smell the flowers in healthier times

Charlie, one of the cats, was seriously ill and Lyme Disease (which was the designated subject for this post) went clear out of my head. It shall return. Meanwhile, I’ve been nursing Charlie, aka Houdini cat (who will literally disappear into the towel you think you’ve wrapped around him securely), reminding myself that nursing is a noble, noble profession. (That’s what you call professions that are bloody hard and nobody appreciates.) I’ll say one thing, taking a cat to the animal hospital does give one a quick lesson in the perils of for-profit medicine (my Visa may never recover) – especially in our risk-obsessed age where tests and scans trump individual history, personality and symptoms (human or animal). It also reminded me that one must be vigilant when faced with the ponderousness of Expertise.

In Charlie’s case it began with a neurological condition called Horner’s, an irritation of a nerve running down one side of the face, past the eye, down the neck and into the chest – not a disease but a symptom. Naturally Expertise immediately rushed to the worst possible diagnosis: lymphoma or, in a pinch, brain tumour. (Do not pass “go”, just head for the hills.) I mildly posited inflammation or infection, probably ear-related, particularly since Charlie’s had those before. But noooo.

Critical Care, human or animal, is rife with Expertise: grave, gravel-toned and confident. Why? Because they have tech toys, that’s why. Cool devices and imaging technologies that purport to explain the mysteries of life. Even (ha ha) a cat scan. All of which push the patient into ever higher levels of care – because they can. Problem is, the patient often can’t.

I tried to hold my ground but it’s a slippery slope, that one; the surer they are, the more one caves, especially when they start to say, well, with cats an elevated white blood cell count could mean X. I mean, what do I know from cat physiology?

So the cash register ka-chinged and Charlie looked steadily worse. Of course nobody looks good in ICU, between the ugly fluorescent light and the tubes, but there’s something especially pathetic about a small furry creature sitting in a cage. And Charlie, well, that cat could have taught Stanislavsky a thing or two about looking sad.

I kept getting calls to tell me things I already knew (he has a heart murmur). The last time I snapped, “I know. I have one too. Big deal.” That didn’t, naturally, stop them from getting a cardiology consult. Bearing in mind that cats don’t hold still for much of this, so need to be anaesthetized.

Finally, after every possible dire diagnosis had been ruled out, we came round to my original hypothesis: ear infection.

Don’t get me wrong. I have enormous respect for veterinary physicians. They study long and hard (far longer than human doctors) and by and large they are great. They deal with a diverse patient population that’s uncooperative and uncommunicative – and when I say diverse I’m talking species. And they need to make a living; I get that.

What they, and most of us, do not get, however, is that they are part of the culture at large, and the culture at large is obsessed with the “science” of medicine, leaving the art further and further behind. Watching Charlie work his way through the system reminded me of just how much medical focus has shifted away from the patient and towards disease and technology; towards what tend to be called “objective” results (versus the messy subjective ones patients bring).

I see this on a human level every time I go to the retinologist with my mother (that, by the bye, is a sub-specialty of ophthalmology). First, they get her to read the letters on the chart and are all impressed at how well she sees. Then they take their pictures and look grave: how could she possibly see that well with those terrible ridges in her retina? (To me they just look like the Alps.) Then they look puzzled. The scan says you can’t, but you actually did see. What a gonzo dilemma. So they go with the scan and give her the medication. Objective trumps subjective.

Question is, should it? Does it make sense for the patient to get lost in this morass of ostensibly objective ‘data’?

Not to my way of thinking. “Normal” – blood pressure, lipid level, whatever – is a best-guess average based on population statistics and what some committee has deemed appropriate. If you’re truly sick it shows up. C-reactive protein in the clouds – well, there objective and subjective tend to match: your joints hurt, you have some kind of inflammatory condition and the test backs you up. It’s that grey zone that’s problematic. Levels fluctuate in every individual, and tests can be wrong (some more than others); error rates in some tests run as high as 75%. But we forget that.
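
To see why that grey zone matters, here’s a back-of-the-envelope example (every number invented for illustration) of the old base-rate problem: a reasonably accurate test, applied to a population in which the condition is rare, generates mostly false positives:

```python
# Toy base-rate arithmetic with invented numbers: even a decent test is
# wrong most of the time when the condition it screens for is rare.
prevalence = 0.01    # 1% of those tested actually have the condition
sensitivity = 0.90   # true-positive rate of the test
specificity = 0.90   # true-negative rate of the test

true_positives = prevalence * sensitivity
false_positives = (1 - prevalence) * (1 - specificity)
ppv = true_positives / (true_positives + false_positives)
print(f"Chance a positive result is real: {ppv:.0%}")  # roughly 8%
```

Eight percent or so; the rest of the positives land squarely in the grey zone, clutching requisitions for more tests.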

So, cat or human, we are lumped in with the many-too-many – and our individual narrative gets lost. In Charlie’s case nobody believed this pretty little cat, who had only been ailing for a week, could possibly “just” have a madly inflamed ear affecting his balance and appetite. An infection is no picnic. But it’s not a brain tumour. And of course Charlie’s Oscar-winning ability to look mournful didn’t help. This cat can look sad when he feels ignored; imagine how dreadful he looked when he was dizzy and queasy. It’s a gift. But it’s not diagnostic.

You need a proper history; the back story. The person with the disease is as important as the disease, said Hippocrates. Let’s say you end up in hospital with severe abdominal pain. It matters whether you’ve had this pain before, but less intense or of shorter duration. Sudden abdominal pain could be many dire things; a worsening of an existing problem is probably nothing that will kill you (otherwise you wouldn’t be in the ER in the first place). The clinical picture changes with the history. Someone has to factor it in.

Charlie’s doing better now. As for the rest of us – who knows. We may never survive the tech age.

Summer Reruns

Everyone may stream their entertainment on their teeny tiny phones but that’s just the tech; without a good story summer still means reruns.

So in the fine old tradition of reruns, I give you  a recap of the summertime blues.

(Coming soon, Lyme Disease. Stay tuned.)

Pity Pity Bang Bang

It’s depressing, how often these ‘incidents’ involving high-powered weapons seem to occur in the United States, where there are almost as many guns as people. Everyone cried about Sandy Hook but there have easily been four or five more that I’ve read references to in the paper.

I couldn’t believe it when I read that federal health agencies cannot comment on the public health consequences of guns; haven’t been able to do so for well over 20 years.

I wrote on this in January 2011 and frankly don’t see there’s much to add … except that, at least according to this piece in the New England Journal of Medicine, such fatalities are actually in the minority; the majority of gun deaths occur quietly, among family members and people one knows.

Hey, I get that. You get mad, you want to hit someone – if you have a gun, it’s damn easy to pick it up and shoot. But if all you have on hand is a stick or even a knife, well, you can do some damage but the death toll usually doesn’t rise to the double digits.

Just last week some fellow went postal in a downtown apartment building in Vancouver. I know the place, I used to live a block from there. Reports are mixed but there appears to have been a knife and a hammer involved. Several people were hurt but only one remains in critical condition in hospital. The rest were kept overnight and let go. They’re fine. Which they would not have been had the perpetrator carried an M16.

Guns don’t kill people, people do? Ah, no. People with guns kill people. People with sticks and stones, not so much.

I stress, Eustress

It’s become such a ubiquitous concept that it’s difficult to imagine how recent a term “stress” really is. When Hans Selye first proposed that all tension, all sources of anxiety, created the same kind of reaction within the body, it seemed ridiculous. And this was in the fifties, to the best of my recollection. Of course now we all know that too much stress is bad for us and that stress is a factor in disease.

[So so many things that we take for granted – cardiac risk factors, prevention, stress – are such recent concepts. But we think they’ve been around since the year dot.]

Stress is hard on the immune system, affects us hormonally and causes muscle tension and fatigue. It gets in the way of sleep, which causes its own set of problems ranging from poor concentration to anxiety. It also depresses normal pain signals – which is why soldiers and athletes often don’t feel the pain of a major injury, and it is only later that they realize they’ve damaged something.

Then there’s the good side of stress, or eustress. Without some stress we would have zero motivation, zero reason to excel or create. That’s why there’s that old graph showing how some stress is good before a major task, say an exam: with some stress, performance gets better. But if stress gets too high, performance suffers.

Which we all know from our own experience.

Wandering around Paris, what strikes me as well is the extent to which our actual, physical environment can create or reduce stress. When what is around us is beautiful, when we hear laughter, when the sun is shining – well, it’s hard to feel too unhappy or stressed. No accident that depressed areas inevitably are ugly.

It’s hard to be too stressed when one is a tourist in Paris – well, unless one tries too hard to make the French conform to one’s North American ideas of time, speed and interaction. Something I think is rather wonderful here is the very formal aspect of saying ‘bonjour’ whenever one walks into a place, any place. It is a way, I think, of humanizing the service person, the waiter, the person in the store. When one stops to say ‘bonjour madame’ or ‘bonjour monsieur’, one has to pause and look at the person and realize this person is not simply part of the scenery; they are an actual human being. It adds a touch of humanity to what is often a rather soulless encounter.

The French are currently pilloried for their dislike of capitalism, their failing economy, their rising youth unemployment. Several august bodies are miffed that, in spite of all of this, money markets still love France, which can borrow money at brilliantly low rates – suggesting the markets aren’t worried about France’s future. There’s a palpable sense of outrage about this on the part of business writers, The Economist, various commentators – usually Anglo-Saxon. Why? Why does everyone need to conform to the same ideas?

The French fought a revolution which had at its basis the value of the human being.  Extreme wealth, especially ostentatious wealth, is frowned on in France. I can think of worse things.

In any event, given the stress we all experience when all we focus on is money and making more of it, it seems to me that the French are on to something.

Beware the Bandersnatch, my son (aka the “link”)

If I read the word “link” one more time in some ostensibly serious health article I will – well, let’s just say that, like Dorothy Parker’s Tonstant Weader, I will fwow up.

Looks like a Bandersnatch to me …

Last week “scientists” apparently linked one’s gait as one aged to one’s likelihood of developing Alzheimer’s. Yet another observational study, casting about for some connection to something; naturally they eventually found some tenuous connection somewhere – at least one that they could write a press release about.

(As a researcher once described estrogen – “a drug in search of a disease”.)

No mention of whether this gait thing might have had something to do with other, perhaps undiagnosed, problems such as osteoarthritis or inner ear issues or what-have-you. No, one more thing for us to worry about as we get older – our damn gait.

That vile word “link” (plus variations like “linked”, “linking” and so on) always seems to turn up in the headline, which, of course, is all most people read. So we read that higher levels of vitamin D3 are linked to all manner of marvelous things, from not getting cancer and heart disease to staying young and sharp and simply mah-velous. Never mind that when you simply test people who are well, compare them to people who are not, measure their “level” of D3 (as though all of us have the same ideal level) and then say, “oh, look, high D means better health so why don’t we all take a supplement”, you have no way of knowing which came first, the good health or the D3. For all we know, various diseases deplete the body of D3 and the lack of the vitamin is not the cause of the problem but its consequence.

A number of more cautious researchers have been saying exactly this, to no avail. Various and sundry institutions from the Cancer Agency to the WHO have all decided to chime in with their recommendations that people take supplements.

This same kind of nonsense proliferated in the talk around estrogen for pretty much most of the 20th century. Researchers gushed that estrogen “replacement” therapy (later “hormone replacement therapy”, or HRT, after it was found that estrogen alone could cause endometrial cancer) kept women young and healthy and prevented heart disease and dementia and probably hives and hangnails.

Replacement is in quotes earlier, incidentally, because it makes no sense to consider the hormone level of a woman of 23 normal for a woman at all other stages of life, particularly midlife, when all women’s hormones naturally decline.

Observational study upon observational study found a correlation (“link”) between taking hormones and improved cardiac function, fewer heart attacks and strokes, better health, you-name-it. Well, except for the smidgeon of extra risk relating to breast cancer, which epidemiologists dismissed as irrelevant. Of course this was not irrelevant to women, who didn’t rush to take hormones in droves, much to the researchers’ dismay.

Then the other shoe dropped. The largest clinical trial in history, the Women’s Health Initiative, definitively showed that not only did estrogen not protect women from various and sundry age-related conditions, it actually could cause them. Cardiac disease was higher in women who took hormones and there was nothing “healthy” about HRT at all.

But hey, they had studies that “linked” estrogen use with health and who were we to argue?

A lot of people ask me about supplements – calcium and D3, this and that – largely, I think, because of those headlines linking this and that arcane nutrient with health. Which is where my problem with all of this lies.

You can print whatever nonsense you want, provided you don’t make it sound as though you know what you’re talking about. Especially in the headline. People actually change their behavior based on these things. People start taking things, adding things, subtracting things. Forgetting that health is multifactorial, complex and begins in the womb.

You won’t have strong bones as an adult if you were malnourished as a child. Wealth tends to lead to health. People are different. And the nutrients we ingest in food are in a balance and ratio that the body can absorb. Versus our best-guess estimate of what an ideal amount of D3 or B3 or T3* might be.

So beware the dreaded link as though it were the bandersnatch. On average, I think the latter is more benign.

*Tylenol 3

Voodoo Medical Science

Where to begin, where to begin.  I get busy with end-of-semester things and head out of town for a few weeks and poof! Bloody chaos.

Women’s reproductive rights suddenly back on the table in the U.S. and the legality of abortion tabled in the House of Commons here as a private member’s bill. Good grief. Was that plane I took the one in that Twilight Zone episode, the one that goes through the clouds and goes back in time? More idiocy in the Commons, with this ludicrous Omnibus bill, as they’re calling it. Long guns taken out of the registry, which means rifles and shotguns can more readily be sold in Canada. And of course zombie killers. (OK, that last one was ghoulishly interesting, I have to confess.)

And in health care news, as always some bright lights insisting they know what’s best – most recently a report from researchers at McGill (the term researchers usually being code for statisticians) expressing shock, shock I tell you, that drugs are used off-label when this lacks “scientific support”.

Um, OK. So what scientific support would that be? Drug-company-funded clinical trials – given that all other funding has been cut to the bone? Or do they actually mean data, which, I would remind you, does not equal knowledge and can be massaged, manipulated and moulded to fit the theory du jour?

One class of drugs these experts took exception to was anti-psychotics, used in situations where no clinical trials had been done. Years ago a physician friend of mine discovered that one of the anti-psychotics, quetiapine I think, seemed to help a patient with Huntington’s with some of her more onerous symptoms. But of course Pharmacare wouldn’t pay for it because – yup, you guessed it – there was no “evidence” that it worked for Huntington’s. And naturally we all know that everyone, especially drug companies, is lining up to do an expensive drug trial with a teensy subgroup of patients with a rare, fatal, genetic disease ….

Needless to say, there’s never going to be “scientific support” for this. A point these McGill researchers, who’ve clearly never had to deal with an actual patient, don’t appear to have twigged to.

Research and clinical trials are expensive, time-consuming and difficult to do. Who in their right mind is going to fund one for an old drug that’s no longer on patent and has been around forever – but that still helps a lot of people? Not going to happen.

The pendulum has swung so far, moreover, in favour of the stats and the algorithms and the “evidence” that everybody from Obama to your pharmacist to that nice young doctor in the clinic down the road honestly believes that medicine is a science – and that if we could just figure out the right questions to ask and do the right research (which angels, taking time out from their dancing-on-a-pin thing, would fund) then All Would Be Revealed and we would all live happily and healthily ever after. As if.

What few people realize, alas, is that the bedrock of “scientific” medicine, the clinical trial, is very recent – though to hear people ramble on about it you’d swear the dratted thing was on one of those tablets Moses brought down with him.

1948. That’s when the first official clinical trial was conducted, by the first medical statistician on record, Bradford Hill, who gave one group of patients with TB streptomycin (then a very new drug) and another group nothing. The idea took off, and before his death in the 1990s Hill’s book on medical stats (Principles of Medical Statistics) was in its 12th edition.

Hill was no dummy, though, and realized he’d created a monster. He backtracked. Where once he’d exhorted statisticians to “rise from their humble place” to help medicine become more scientific through the clever application of numbers, he later suggested we should “relax and reflect”; that such single-mindedness could easily lead to poor patient care: “cookbook medicine”. It would be better, Hill wrote, if clinical trials were designed to “promote rather than hinder the traditional method in medicine of acute observation … by the clinician at the bedside”. (All quotes from Richard Horton, the editor of The Lancet, writing in 2000 in the journal Statistics in Medicine: “Common Sense and Figures: the rhetoric of validity in medicine”, Vol. 19, pp. 3149–64.)

Probably what Hill had not appreciated in the early fifties as he began his little crusade was the extent to which post-war enthusiasms, technological advances and various social, political and institutional changes – ranging from the ascendancy of the United States to the shifts in finance, corporate influence and law – would transform his notions into a paint-by-numbers fiasco.  Biomarkers and surrogate end points (blood sugar, cholesterol, blood pressure, bone density) would reign supreme and you could feel perfectly fine but be told you had minutes to live.

Then it was EBM guru David Sackett who picked up where Hill left off. Ably assisted by the new profession of health economics – whose sole purpose was to help payers (like HMOs in the U.S. and the governments of countries with public health care) cut costs, and which realized this statistical, scientific rhetoric could aid their cause – the newly minted evidence-based medicine, or EBM, took off like wildfire, leading to the proliferation of guidelines and Hill’s cookbook medicine.

Though Sackett also backtracked, emphasizing that “the practice of evidence based medicine means integrating individual clinical expertise with the best available clinical evidence”, everybody pretty much ignored him. After all, who cared if patients were different and physiology difficult, as long as you had your bullet-form guidelines and decision trees.

Meanwhile, everybody forgets that evidence has serious limitations, not the least of which are human error, poor external validity (in other words, the people in the trial are not representative of the people in the community who actually take the drug or use the treatment) and conflicts of interest. At best, even the best-designed trials tend to encapsulate a narrow slice of life, which is not the reality of medical care – care that tends to centre on the elderly and those with chronic conditions. (Duh.)

The old and the sick, precisely the people who use medicine, are excluded from clinical trials; in fact, as Bradford Hill pointed out, the clinical trial “at best shows what can be accomplished with a medicine under careful observation and certain restricted conditions”. I won’t even mention the exclusion of women from trials until the NIH stepped in in the ’90s to enforce its own regulations, because the top of my head would blow off and that would create such a mess.

Ironically, where scientific medicine and epidemiology do excel is at giving us clues as to what doesn’t work, e.g., in common preventive measures such as mammography and PSA testing. But we don’t like those recommendations so we ignore them.

Turns out the “science” of medicine is like the Sasquatch. Often sighted and excitedly talked about but not actually real.

PowerPointless*

There’s a moment (usually) around week four or thereabouts of teaching when I begin to glimpse a teensy glimmer, perhaps even a glint, of comprehension. In cartoon lingo, one of those little light bulbs, though often it’s kind of dim and dusty, like the light at some tacky hotel you didn’t really want to stay at but you missed the train and it’s all you could find at that hour of night. And at least it seemed like it didn’t rent by the hour.

So interesting I can’t stay awake

I can’t take it for granted yet – not least because I’m never sure if it’s a real glimmer (or just gas) – and I have realized that critical thinking, even though it’s one of those catchy phrases always used to describe education and learning, is not the cornerstone of higher education. Heck, it probably isn’t even the balcony railing.

Of course it may never have been, whatever we oldies like to think of our own brilliant youth. Talking to a philosopher friend who taught undergrads some 25 years ago, I have the sense that his students weren’t much better – in fact he says he once just gave up; the blank, stolid looks unnerved him and he just up and left. Simply told them he was available in his office if anyone wanted to discuss the material.

It’s a great idea, except I don’t really have an office – as a sessional prof I have an ugly desk in a cubicle, one of many in a large, ugly, locked room that would make Dilbert weep. I just use it to store my coat on the days I teach. And if I actually expected any students to drop by I’d have to lurk by the door to let them in, since they can’t see me way off in the back and the door is locked. And that would be creepy. I gather it’s really not about the learning anyway, certainly not that undergraduate thing. It’s what Jane Jacobs called ‘credentialing’.

As nearly as I can make out, reductionist thinking, dull and linear, wanders the halls like the ghost of Sleepy Hollow – and expertise, white coats, science and anything else that smacks of authority are put on a pedestal so high they’re bound to fall off and hurt something. Then again, what does one expect when everything from ridiculous commercials for face cream to mattresses professes to have research (clinical trials, no less) backing up claims that their product improved people’s lives by 83%?

How one would know such things always fascinates me. Questionnaires? Surveys? PR thingies? You know the ones I mean, the little sheets of paper someone with a clipboard thrusts into your hands as you’re trying not to dislocate a joint finding some leg room in that airplane seat, or you’re racing from one thing to another trying to find your keys. Whereupon a painfully cheerful person asks if you’d mind answering some questions about that soggy sandwich you just ate or what you think of a new strip mall they’re thinking of building where your favorite dry cleaner now resides. Er, if I’d known there was going to be a quiz I’d have studied. As it stands I haven’t the foggiest. (And even if I did, would my opinion make a damn bit of difference? Likely story. It never has before. But I’m not bitter.) Numeric reasoning at present seems to take precedence over all else, including common sense.

I blame PowerPoint.

That’s right. The program we all love even if it’s made by that Darth Vader of software, Microsoft. (Apple has a variant as well I’m sure – it’s just that their ads are hipper and their numbers are smaller so it doesn’t face the brunt of our ire.)

PowerPoint’s given form to our function, our enchantment with linear thinking. And as a speaker or teacher you can even print up your cute little bullet points so nobody has to take notes. Or listen, for that matter.

What I teach doesn’t lend itself to bullet points or decision trees. When I leave the class my whiteboard looks like a hyperactive monkey was trying to write Macbeth: a mess of words that makes zero sense to anyone who wasn’t there to hear me talk about the interconnectedness of everything or to realize that those arrows actually mean something.

A/V loves me because I leave them alone. Students, well, that remains to be seen. But, sessional or no, I refuse to reduce the complexities of science and medicine to a series of bullet points. Call me crazy, but I still believe that even these texting, smartphone-addled students are capable of – and even glad to be asked to engage in – thinking. Critically. Creatively. Contextually.

They’re capable of rising to the occasion if we’d just raise our expectations of them a jot. After all, they’re our kids. Surely they’re smarter than we’ve been giving them credit for.

* I wish I could take credit for the term but it was a title from the online version of The Economist – so kudos to whoever thought it up.