Category Archives: Health Care

Masquerade

Chicken Little and Henny Penny were off on a quest: to terrify everyone and tell them the sky was falling. Close to a year later it would seem they succeeded and everyone is indeed scared witless. As the world moves in slow motion Christmas appears to have been cancelled, adding to the general gloom.

Never mind that 1.5 million people die each year of TB or that 300,000 people (mostly children) die of malaria. And some 55+ million more die annually of various and sundry causes which, on a planet with close to eight billion people, is par for the course. Honestly, you’d think that without covid nobody ever died.

For now the burning question is whether that godforsaken mask is the holy grail or a fiendish mind control plot.

A friend tells me that apparently it is illegal to enter a bank wearing a mask. The implication is that you’re a nefarious miscreant. (Come to think of it, this would be the perfect time for a heist; just melt into the masked crowd with your loot.)

Stop the world I want to get off

As always, health discourse has gone reductio ad absurdum, pared down to a basic, binary level – a topic I have whinged on about at length. In one corner the coronavirus, crowned headband and mean little eyes; in the other, the economy, pale, with quavery legs, propped up by government handouts.

Weirdly, nobody seems to notice or care that the state of the economy impacts heavily on health. It’s not a zero sum game. Stress, especially that of not knowing whether you’ll have enough money to feed your family, is especially bad for health, as is isolation, locking ourselves away from everything that makes life worth living: social contact. Seeing friends and family. Art. Music. Dance. Theatre. Travel.

Immunologically, the organism, the person matters, not that you’d know it these days. You don’t just get sick because you’re exposed to a pathogen, whether that’s covid or TB. You get sick when your immune system reacts to said pathogen. In the dry, wry words of an old epidemiology text, the virus (or bacterium) is necessary but not sufficient. In other words, the TB bacterium must be present if you have TB, but simply having the tuberculosis bacterium does not mean you have TB.

With any kind of testing that becomes obvious; all sorts of people harbour microbes they never knew they had. My mother never had chicken pox but clearly had been exposed, since later in life she developed shingles (when she was taking immune suppressants). Chicken pox’s revenge, as one nurse called it.

The dratted virus isn’t some poison perfume chalice, the mere whiff of which can fell anyone walking by. But hey, who cares about science when you can post scolding messages on Facebook?

The big news this week is the vaccine. And how it will be distributed. Not much analysis of the actual mRNA vaccine that frankly spooks the hell out of me. Makes sense in the abstract but real people are adaptive, dynamic physiologic systems: neurologic, immunologic, endocrine etc etc. Altering the genetics of a cell: well, no reset button there. I truly hope there are no negative consequences six months, two years, down the road. What I read, even when it tries to be reassuring, is awfully vague.

[And always, along the same binary lines – if you discuss vaccines in anything but glowing terms, well, clearly you’re one of them lunatic anti-vaxxers. No grey in this discourse.]

We are the Borg. Resistance is futile. 

What’s up doc?

I was asked an intriguing question a while back: if we infect others with a cold or flu through coughs and sneezes, how is it possible to transmit a virus when we are asymptomatic (or presymptomatic) and not coughing/sneezing? I had taken it on faith that one could transmit a virus before getting sick. So I tried to find out.

Turns out there’s a process called viral shedding.

When you have been exposed to a virus and it is attempting to take hold (but your immune system is pushing back), the virus works its way into a cell, often killing it (apoptosis). And minuscule bits of the virus can shed and get into your system. Possible, but highly improbable in these socially distanced times. Reminder: even the symptoms of a cold result from the immune system’s reaction to the virus, not from the virus itself.

So, if you have somehow been exposed to a virus and it has somehow entered your system, this shedding might get passed along if you get seriously close to another person and get these shreds into their eyes or mouth. Bear in mind that these specks of virus aren’t especially numerous. That’s a lot of ifs. This, incidentally, explains why genital herpes or HIV can be transmitted through, er, bodily fluids, even when the person is asymptomatic. It’s more complicated with a respiratory virus, particularly since we have this enormous immune organ called the skin protecting us.

But, the numbers are rising, I hear you cry. Surely that’s worrisome.

Maybe. Numbers alone can’t tell a story. For instance, what irks me immensely is the term “case”, as I can’t bloody tell whether we’re talking about people who have tested positive or people who are actually sick. (Different groups use the term differently.) Particularly since the majority of people who get sick recover. With more testing it stands to reason that more people will test positive, therefore becoming a “case”. (And do not get me started on test accuracy – no test is fully accurate.) But with language like “cases surging” we lose all nuance.
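
Since that test-accuracy aside is really an arithmetic point, here is a minimal sketch of it – in Python, with invented numbers; the sensitivity, specificity and prevalence below are assumptions for illustration, not figures for any actual covid test:

```python
# Illustrative only: how test accuracy interacts with how rare the
# infection is. Sensitivity, specificity and prevalence are invented
# numbers, not figures for any real test.

def positive_predictive_value(sensitivity: float, specificity: float,
                              prevalence: float) -> float:
    """Chance that a positive result reflects a true infection (Bayes' rule)."""
    true_positives = sensitivity * prevalence
    false_positives = (1 - specificity) * (1 - prevalence)
    return true_positives / (true_positives + false_positives)

# Suppose 1% of those tested are actually infected, and the test is
# 95% sensitive and 95% specific:
print(positive_predictive_value(0.95, 0.95, 0.01))  # ~0.16
```

With those made-up numbers, only about one positive in six reflects a genuine infection; the rarer the thing you’re testing for, the more the false positives swamp the true ones. The arithmetic, not the specific figures, is the point.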

It’s year end, furthermore, by which time statistics would, under normal circumstances, have recorded some 56+ million deaths globally from various and sundry causes. As with most things, most at risk are the poor, the dispossessed, the fragile, the elderly. Nothing new there.

So, yes, this appears to be a bad flu, but keeping one’s distance and one’s hands clean should keep the worst of it at bay. Have there been massive numbers of excess deaths? I’m not sure. I suspect not but honestly, life’s too short to try and dig up those statistics that nobody’s keen to publicize.

In any event, when it’s cold more people get colds and flu. We just don’t usually keep track of them with such eagle-eyed negativity.

Alas, what is on the rise, along with the freakout factor, is stress, depression, anxiety, gloom. Especially hard hit are those living alone, above all the elderly, cut off from human contact. Ah, just asking: did anyone actually ask these individuals what they wanted? Guess not.

Lost in the mist of misinformation is the dynamic nature of organisms, viral and human. Forgotten is immunologic adaptation. Things evolve. Long before there was a vaccine for polio or smallpox the incidence of those diseases was well on the wane. I hate using the term “herd immunity” as it sounds more like animal husbandry but it really is a thing. Places where there were outbreaks last spring are now virtually covid free (e.g., Lombardy in northern Italy). Immunity is not static.

Post viral stupid pills

Now I read that – gak! – post covid complications seem to happen in people who’ve recovered, particularly those who were in hospital. Several months later these individuals are not back to “normal”.

Really? A bad viral infection didn’t totally clear in a few months? Shocker. Um, ever hear the term “post viral fatigue” (or post viral syndrome)? There are always individual differences, but for many people recovery post infection is slow. This isn’t new and scary. We used to call it life.

Bodies aren’t high tech devices you can reboot. Homeostasis takes time.

Yup. Stupid pills. It took over six months for some genius to figure out that steroids might help those people with covid who had had an inflammatory over-reaction. We’ve known for decades that steroids reduce inflammation, but it didn’t occur to anyone to give dying patients prednisone, or something along those lines, to reduce the inflammation that was killing them? (See what I mean when I say the immune system appears to have disappeared from this virus discourse altogether.) Yes, I get that steroids aren’t good for an infection – but see “dying” line above. Still, best not to stress about all this. Stress is really bad for immunity – and wearing that damn mask is stressful enough.

What I will not do is let the buk-buk-buking of those Chicken Littles get me down. There have always been viruses, some worse than others, this too shall pass. In any event, I am bored with this particular fairy tale. So my new hero is Bugs Bunny and like him I plan to munch on carrots, sing my songs and bounce along being as silly as I can.  Just because the world’s gone nuts is no reason to follow suit. B-dda B-dda B-daa.

That’s all folks.


Coronavirus Blues

So here we are, essentially still in thrall to a coronavirus – with horrid terms like “social distancing” and “lockdown” having worked their way into the lexicon even as we are inundated with nauseating corporate platitudes along the lines of “we’re all in this together”. Then there’s “Zoom fatigue” and “brain fog”, giving us all an excuse to whinge about how tired we are. Which I, personally, enjoy a great deal as it gives me an excellent excuse not to do the things I don’t want to do.

Inevitably, as with most things health related, discourse has descended into a kind of bipolar idiocy: either you’re a virtuous human being who defers to “scientists” (who these people are or what their expertise is doesn’t seem to factor in), keeps their distance (leaping ostentatiously away if anyone gets anywhere close), dutifully wears a mask (even in the car, alone) – or you’re a conspiracy theory nutter roaming the streets and howling at the moon.

This, of course, isn’t new but it gained traction with social media some decade or so ago when anybody, anywhere, could find like-minded lunatics instantaneously. So discourse on anything even marginally ambiguous (which anything scientific inevitably is) now descends into the subtle dynamics of a nursery school spat: T’is! T’isn’t. T’is too! There are only two settings, right or wrong. (By the way, if you learned everything you needed to know in kindergarten, you’re an idiot. Get your GED.)

Trouble is, while binary codes seem to work OK in the digital world, they don’t work so well in the real one. And science, especially, doesn’t lend itself to this kind of thinking.

Computer language(s) rule?

Science isn’t some sort of “amorphous blob” (to quote British physicist Brian Cox, cited in the BMJ online). Science is dynamic, self-questioning and oftentimes wrong; it’s a method designed to grope for some measure of truth via empiricism.

With time, as the concepts become more refined, so do the answers. Which is how we moved from the notion of “miasmas” of bad air in sick rooms causing infectious disease to an understanding of microbes and germ theory. An understanding that continues to evolve. The development of better microscopes and laboratory techniques played a part as well, not to mention keen thinking by the likes of Pasteur and Koch. Later, a happy lab accident led to antibiotics and, as a result, the daft idea that there’s a pill for every ill.

Science is not dogma and a great many hypotheses fall by the wayside. Just check the sheer number of unreproducible results in, say, genetics, if you don’t believe me. (One biotech company, Amgen, hoping to develop a miracle cancer drug, tried to replicate 53 “landmark” genetic studies – of which they were only able to replicate six, even with the original researchers involved. But those other 47 trials undoubtedly made headlines in their day.)

The real strength of science is that it is self-correcting: it is not a religion or a deity or the final word.

By last March, alas, between the media shrieking blue bloody murder and epidemiologists insisting we were all going to die, it all built up to a crescendo of panic which has yet to subside. The zombie apocalypse would be upon us any minute. (Never mind that there aren’t a lot of good brains around these days for those poor zombies to eat.)

Now, months later, it seems to me, governments and policy wonks, epidemiologists and virologists, having scared the living daylights out of everyone, are a tad unclear as to what to do next. We’re tip-toeing into phase 3, with stores and such re-opening, but nobody’s all that clear about how close to get to that plexiglass or how essential those masks really are. I detect a hint of disquiet as the economy tanks and people are pushed into poverty all over the world, which every gibbon knows is what’s really bad for health.

Trouble is, once you’ve scared everyone half to death un-scaring them isn’t especially easy.

Freakout

For my part, I have wearily realized that this pandemic business was inevitable. The world was waiting for a pandemic, it was wanting a pandemic – and a pandemic is what it got. Now we’re stuck with the dregs and the aftermath.

There have been rumblings about the Next Great Plague for decades. There was SARS in 2003 (when fewer than a thousand people died), then H1N1 (but then there was a vaccine) and its variants some years later. Each time the so-called experts began their siren song, warning us that any second now the plague would be upon us.

It certainly didn’t help that popular culture – films, TV shows – has gone on incessantly about the dangers of microbes, be it bioterrorism, laboratory mismanagement or just bad luck. Every other day some actor strode about in some show, unrecognizable in a HazMat suit, looking suitably grave – even as virologists threatened us with the next “war” against some killer pathogen. (I recall a New Yorker profile of some chap “hunting” viruses, a la Indiana Jones, with your man chasing the virus that was going to kill us all.)

In short, we were primed.

The 1918 flu is always held up as a template – millions dead and pandemonium in the streets, and boy will you be sorry you didn’t listen. Never mind that in 1918 they barely understood antisepsis, never mind antibiotics. Not to mention that pesky war to end all wars.

This time it took. First off there was that term, “coronavirus”: so easy to pronounce. (Quick reminder: some common colds are caused by coronaviruses.) Plus, it originated in China, a country many of us have reservations about. Its genome was sequenced there, making the provenance of this “novel” virus suitably creepy. At first it may even have seemed exciting, in a horror-movie sort of way. People do love to be scared, after all. And the word “pandemic”. So scary. Boo.

After that it took on a life of its own – and nobody took a breath or stopped to consider how focusing solely on a virus, using the most primitive medical model, could have dire, lateral, contextual consequences. Contrary to the idiocies posted on Facebook and its ilk, the reality of health is that it is not an either/or, zero sum game. Either we all wear our masks and keep our distance or everybody dies. Dodge any other human within several metres or the virus “wins”. Military metaphors are sunk so deep within public health discourse and public consciousness that we don’t even see they’re there. The basic immunologic aspect is simply ignored: virus + organism could = disease. It’s not a given. It’s not just about the virus but the host. And not everyone exposed to a virus will react to said virus.

People live in the real world, with real lives, and this freakout has had real consequences for work, housing, education, and health – which is about far more than an immune response to a virus. It was as though we forgot, watching the numbers rising at the edge of the screen, that we all die, with or without a virus. In 2018, according to the World Economic Forum, some 56 million people died. No coronavirus then, just, well, life. And flu and TB and a myriad other things. All made worse by deprivation and poverty and hopelessness and stress and all those other things.

To paraphrase Norman Bethune, there are two kinds of tuberculosis. The rich man’s TB and the poor man’s TB. The rich man gets better, the poor one dies. As we saw with this virus. The worst afflicted – other than the few whose immune systems responded with that oft-mentioned cytokine storm – were the poor, the marginalized, the malnourished, the warehoused. Sorry, but camping out at your holiday cottage, complaining of Zoom fatigue as you do your work online, is not the same thing as losing the one (or three) poorly paid jobs that are all that’s keeping your family fed and housed. In the U.S. or Bangladesh.

Nobody likes to talk about the larger context; too complicated, so muddled, damn ambiguous, non-linear. Can’t be neatly graphed or displayed in bright colours. It isn’t something an infectious disease expert or virologist really knows much (or cares) about. It became about the virus; never mind the consequences of shutting it all down or how this will affect communities, cities, families, children.

Keeping it clean

Perhaps the one (semi) positive note I can see in all this is that some basics around hygiene we appear to have forgotten will get some attention. You know, things like keeping hospitals and schools and buses clean. Maybe we’ll remember that it’s a good idea to wash our hands more. Or hey, we could consider paying the people who take care of the elderly a living wage so they don’t need to work four jobs to make ends meet. Revolutionary concept.

Maybe we can revisit this notion that buildings need to be hermetically sealed, with no windows that open, giving some of us sick headaches. Or stop already with those horrid digital faucets that spurt water at random. That last one comes from personal experience (but I’m not bitter) from the times I tried to get my hands clean enough to take my contacts out on campus. But even though I brought a nail brush and soap from home, it never worked. I could never get enough water to scrub my hands properly. (Old joke: how do you know someone’s had to use a hand dryer in the washroom? Their trousers are wet.)

There’s a Danger Lurk!

There is simply no way to do away with risk. Risk is all around us. Nothing is 100% safe. Not crossing the street, or going to the dentist; no medical treatment, no drug, no surgery, nothing. It’s a risk to sit too much, stand too much; exercise too much, exercise too little. You get my drift. (Incidentally, just because everybody’s all worried about a virus doesn’t mean bacteria aren’t still out there too.)

So here we are, singing the coronavirus blues, trying to figure out who decides what this “new normal” is, and watching people pander to their OCD tendencies, get all worked up about masks and otherwise indulge their inner Stasi. Hoping, I guess, that virtue trumps risk.

Trouble is, as the French philosopher Bernard-Henri Levy, who was pilloried for writing about the plight of the Rohingya in the age of coronavirus, has rightly said, our response to this virus has been more pathological than the virus itself. Aptly, he quotes Virchow: “An epidemic is a social phenomenon comprised of some medical aspects.” Not the other way round.


Getting rid of junk (nutrition)

What more can one add?

Junk mail, junk science, junk food, junk DNA: The list of terms we can plunk “junk” in front of seems endless. Usually the point is to negate the meaning of the second word but all too often it refers to things we don’t understand (DNA), are trying to avoid (food) or have no other way of describing (science). But since the word’s out there I think we need a new term, one that describes all the outrageously stupid, yet ostensibly expert, advice out there with respect to food – let’s call it junk nutrition. Which, in the best tradition of junk anything, masquerades as solid scientific advice, uses pseudo scientific terminology and does its level best to terrify us into giving up all the real foods, like butter, that make life worth living.

Worse, in its attempt to confuse us with bafflegab, junk nutrition is far too easy to get wrong, which explains why, as I was desultorily skimming an article in The New Yorker about some person, Stephen A. Smith, who apparently “shapes the discourse of the sports world” (be still my beating heart), I read the following:

“I used to think almond milk was best, but then somebody told me – a trainer told me – there’s too much estrogen up in there. In the almond milk. That’s right. [And] you don’t want to walk around with man boobs if you don’t have to.” (beat, chuckle, rat-a-tat)

P-pardon? Almond “milk” will do what?

Now I absolve Mr. Smith as he appears to be quoting his trainer, but honestly, somebody, somewhere, needs to get their nether regions out of whatever black hole they have become stuck in – as they have clearly confused soy and almond “milk”. (In quotes, incidentally, to remind you that no matter what marketers have told you, neither soybeans nor almonds actually contain milk, which comes from female mammals.) Soy, er, juice does indeed contain small amounts of phytoestrogens that mimic estrogen and could, if ingested in sufficiently high quantities, cause said man boobs. Though you’d have to drink an awful lot of it. But how many people will read this and believe Mr. Smith’s trainer? More important, why didn’t the editor catch this?

Who knows, maybe it was one of those product placement things, paid for by the soy drink people.

Junk nutrition, in a nutshell. Eat/drink X so you’ll be healthy, avoid Y (salt, meat, sugar) so you’ll live longer. Trouble is, the stuff they tell you to use in place of real food like butter is often a pale imitation (margarine) cooked up by a chemist for a specific purpose. In the case of margarine, a French chemist who was trying to find a fat that could travel with the French army without going rancid. And therein lies the rub. The fake stuff is (1) manufactured, (2) additive-full, often containing genetically-modified whatsits (which they do not have to declare because it is not a genetically modified organism or GMO that’s grown/farmed) and (3) untested, so could easily cause some kind of weird side effect down the road. But there it is, large as life, dumb advice pretending it’s health news. This is how you convince otherwise reasonable people that a dark pink fake substance calling itself a hamburger is somehow … superior.

If you want a hamburger with a vegetarian spin, eat a felafel. At least you’ll know human hands were involved there somewhere, not a robotic arm in some factory. Plus it won’t contain synthetic additives, most of which I cannot pronounce and wouldn’t even try. I can see why the EU has rules about what can call itself sausage or cheese – the terms mean something. But of course nobody would eat the stuff if you called it what it is: fake mushed up chemicals mixed with a bit of beans and plant based something-or-t’others, with flavorists (yes, that’s a thing) adding some of that (flavour) at the end of the process. That latter bit, incidentally, is based on a molecular breakdown of what real food, like an orange or a carrot, tastes like.

Human beings cannot make certain necessary nutrients, vitamins, which we get from food. So if you don’t eat meat, milk, eggs, fish, cheese and so on, be careful to get enough B12 and niacin and various other trace vitamins that aren’t present in plants. Iron too, as it is difficult to absorb iron from plant-based sources. (Pair your plant-based protein with something containing vitamin C, like tomatoes or red peppers. It will aid iron absorption.)

In the late 19th century, when they also believed they had the nutrition thing down pat, an Estonian scientist, Nikolai Lunin, noticed something puzzling. When he fed mice all the known nutrients (fat, carbohydrates, protein, broken down chemically), the mice sickened and died. Then he gave them milk and they were just fine, thank you very much. Lunin didn’t know why; it was decades before anyone realized vitamins were essential to life. (Think pellagra and scurvy and other vitamin deficiency diseases.)

Salt of the Earth

The junk nutrition advice, of course, doesn’t stop there. Take salt. (No, really, have some salt. It will make your food taste a lot better. Just ask any chef.)

One of the few things Ronald Reagan said that I actually remember had to do with salt. When someone asked the Gipper about salt for some obscure reason he replied that yes, he did try not to oversalt things but honestly, eating something like a hard boiled egg without salt, well, only a raccoon could do it.

I don’t know where this notion that salt is evil came from since I can’t find any real research to back it up. But one day I realized that when I reached for the salt cellar some random stranger was glaring at me. And so it began.

It is true that if you have congestive heart failure (i.e., if your heart muscle is not especially effective, usually when you’re quite old or have had heart attacks), there can be a buildup of fluid in your system (edema) and eating too much salt can exacerbate that. Often, people with this disorder are prescribed diuretics, which makes cutting back on salt a good plan. And a small number of people with high blood pressure are sensitive to salt.

But normal healthy people sprinkling a bit of salt on their food, or cooking with salt? Not a problem.

Bodies are awfully good at maintaining balance. Homeostasis, as Walter Cannon called it about a century ago. So, if you go a little crazy with the salt, as I have been known to do with popcorn, what happens, genius? You get … thirsty. And you drink more water. Whereupon said salt is flushed out of your system.

Alas, the salt-is-bad-for-you talk has taken on a life of its own and like most truisms no longer needs to prove itself; it’s simply become true by dint of repetition.  It is so because everybody says it is so, even though nobody knows why.

A rather elegant Scandinavian study, in fact, demonstrated much of what I’ve said here. Currently my desk has eaten my hard copy and I can’t seem to find it on my hard drive, but the gist was this: In a large study (tens of thousands of people) researchers found that increasing salt intake resulted in people drinking more fluid and passing out the salt in their urine. (The study measured the amount of salt excreted.) Voila. Balance, courtesy of physiology.

In hospital, where I recently spent a fair bit of time with a family member, the food tray arrives with tasteless food accompanied by a teensy packet of pepper. No salt anywhere. And let me tell you, that food needs something to make it palatable. I took to ducking into every McDonald’s I passed, just to take a few salt packets to toss in my purse.

What I found especially galling with the hospital food and its nanny-ish refusal to provide even a teeny packet of salt was that almost everyone there was old, frail and needed to gain weight and strength. Needed food, in other words. But everybody was picking at their dinner, probably because it didn’t have a lot of taste. Ah, news flash, people. Making food tasteless means the patient is far less likely to eat it, thereby exacerbating said frailty. Recuperating after a long hospital stay without food, now that’s what is dangerous. Not that teeny packet of salt. Seems to me the not-starving-to-death thing trumps the possible (minor) risk to your heart down the road, since, let’s face it, for most people who are pushing 90, which a lot of people on this ward were, it’s not 20 years down the road you should be worrying about but next week.

[On a completely different note, I wonder if we’re going to start seeing the incidence of goiter increase as ever younger people decide no salt is the right amount of salt. Iodized salt, as you’ll recall, is what made an enlarged thyroid, aka goiter, a thing of the past.]

Aside from anything else, salt is vital to keeping your electrolytes/fluid in balance. In hot weather when you reach for a Gatorade, all you’re doing is replacing the sugar and salt you’ve lost. And, for some of us, whose heart valves do not function as well as the norm, extra salt is necessary to keep enough blood coursing through our system.

Salt was so precious it was used as currency at a certain point in time. But gosh, we know better now. We have Google.

As with much junk nutrition, the voices are loud and crabby and rude. But, to paraphrase a maxim attributed to Louis XIV, “Do not assess the justice of a claim by the vigour with which it is pressed.”

How Sweet It is

Then there’s sugar. Public enemy #1 (or #2, depending on your stance on salt). Meanwhile, inclusion criteria for Type 2 diabetes have been lowered (which means more of us are being subsumed into the “diseased” category), so more of us are being told we should check our glucose regularly (thereby making Bayer, the largest maker of those glucometers, rich and helping it take over the world), obsess over our diets and of course start taking the drugs. I have of course rambled on about this before but, oddly, my post does not appear to have changed the world.

Yes, too-high glucose is bad. It means that there is sugar circulating in our system that can’t be used as fuel – and our cells need that fuel to function. When this happens, our bodies will store fat – because frankly you’re just as heavy as you need to be in order to survive. If your cells can’t get the glucose they need from the food you’re metabolizing into glucose (which pretty much most food is), they will turn to stored fat for that glucose. At some point that tilts into pathology (diabetes). But our cells, all our cells, especially our brain cells, need glucose to function. Lower the levels too much and your brain won’t work. This is especially true for older people.

Metabolism is highly complex: a deft and sophisticated dance between intake and output. Unfortunately, the numbers we revere and try to adhere to (using our personal beepy machines) do not reflect that complexity at all; on the contrary they turn this nuanced, balanced system on its metaphorical ear.

For the elderly this lowering of blood sugar – with the so-called ideal being somewhere around 6, which is too damn low – can have dire consequences in terms of cognitive ability. I wonder sometimes how much of the increase in dementia we’re told we’re seeing has to do with the ostensible increase in the diagnosis of type 2 diabetes and the pushing of glucose levels lower than anyone over 60 or thereabouts can take.

A 25-year-old can fast, detox or do any one of a number of crazy things and usually be OK. (Of course in a very small number of cases this could uncover an underlying heart problem.) At that age bodies by and large function at peak efficiency. So if there’s no food available, glucose will be pulled from elsewhere – fat, muscle. With an older person, not so much. As we age our ability to keep that balance becomes less efficient, so if we don’t take in nutrients the first thing to go is the brain. We have large brains and they need fuel. Glucose. If they don’t get it, they falter. And I would remind you that once you hit your forties, immunologically and physiologically, you are on the downhill slope. In fact, to quote an article in the journal Progress in Cardiovascular Diseases (61 [2018] 10-19) entitled “In Defense of Sugar: A Critique of Diet-Centrism”, the author, a Dr. Edward Archer, bluntly states that “without sugar we die”. And no, he is not in the pocket of Big Sugar.

Dr. Archer adds that sugars are so “foundational” to biological life and so central to human health that the simple sugar glucose is “one of the World Health Organization’s Essential Medicines”. Furthermore, given that sugar has been part and parcel of our diet for a long, long time, it makes no sense to suddenly blame sugar for everything from obesity to metabolic diseases. No one substance can be responsible for all of society’s ills – poverty, packaged foods, fast food, sedentary lifestyles, not to mention time-strapped parents unable to make (more expensive) food from scratch after working two jobs just to pay the rent. Never mind drug companies keen to sell their drugs. This “diet-centrism”, writes Dr. Archer, is neither evidence-based nor scientific. And he has three long pages of references to prove it.

What concerned me most while I was in and out of the hospital “advocating” as they call it (such a bizarre notion, that patients need someone there to prevent harm when, presumably, everyone’s goal is to treat the patient and get them home in one piece) was watching the glee with which the glucose numbers were tested. No matter that in an elderly person a blood sugar level of 5.5 (ideal from the hospital’s perspective) is way too low. In fact, better too high than too low, since too low can kill you before next Tuesday, versus long-term high glucose, which will kill you eventually, years down the road. Don’t know about you, but I’d rather give up some mythical tomorrow for a today when my brain actually works.

Keeping up with the Numbers (game)

But numbers – like the ones the glucometer spits out – are easy. They give us an easy benchmark against which to compare ourselves to some “ideal”, and today the high-tech, beepy machines that can measure everything from body temperature and glucose to blood pressure and heart rate are everywhere, especially in acute care settings. These, alas, by reducing the complexities of physiology to a checklist, make us think we’re safe and on top of things.

Trouble is, we rarely if ever genuinely understand what those numbers mean, much less how they alter with time.

The other day I semi-watched as a man who had clearly googled “taking your blood pressure at the pharmacy” before heading out of the house explained to his female companion why lower was always better. I hovered for an instant, then moved on. Nothing to see here, folks.

It seemed wiser to keep going and not engage; not stop to explain to this gentleman that the blood pressure measure (even if the pharmacy apparatus was dead on, which it probably was not) reflects a dynamic element of life. Blood pressure rises and falls with exertion, with stress, after eating. One measurement means nothing – the only way to get a sense of the trend is to take your bp several times a day, several times a week, then average it out over time. (Personally, having a man loom over me telling me lower was always better as I took my blood pressure at a pharmacy would make my blood pressure skyrocket.)
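
To make the averaging idea concrete, here’s a minimal sketch in Python – the readings are invented for illustration, not anyone’s actual numbers:

```python
# A toy illustration of "trend, not snapshot": hypothetical blood
# pressure readings over three days (all numbers invented).

readings_mmHg = [
    (128, 82), (135, 88), (118, 76),  # day 1: morning, noon, evening
    (142, 90), (124, 80), (121, 78),  # day 2
    (130, 85), (126, 81), (119, 77),  # day 3
]

systolic = [s for s, _ in readings_mmHg]
diastolic = [d for _, d in readings_mmHg]

# A single reading (say the 142/90) looks alarming; the average does not.
print(f"average: {sum(systolic) / len(systolic):.0f}"
      f"/{sum(diastolic) / len(diastolic):.0f} mmHg")  # average: 127/82 mmHg
```

One scary-looking reading gets diluted by the trend, which is the whole point of measuring over days rather than once.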

And there definitely is such a thing as too-low blood pressure. Take postural hypotension, as it’s called: blood pressure plummeting when you suddenly get up. It can cause dizziness, queasiness, even a fall. At worst, a fracture. Pushing those numbers too low may look pretty but it sure as god made little green apples doesn’t translate into good health. True, persistently high blood pressure is indeed a risk for heart attack and stroke. But simply pushing bp down because lower is always better is idiotic. That heart, after all, resides within an actual person; a person who might want to feel well enough to actually have a life, which too-low blood pressure would not allow.

The trick, as with anything else, is balance. And understanding that individuals differ; what is right for one person may be totally wrong for another. Understanding that age needs to be factored in, not to mention our state of health. But in averaging out numbers and deciding how we compare to some mythical norm it’s all too easy to lose sight of the actual physiology of what’s going on.

I can understand the impulse to stay on top of things, control what happens. Trouble is we can’t. No amount of obsessing over diet and all these numbers will guarantee future health. Much as we’d like to believe in the crystal ball of the numbers game.

All we can do is take each day as it comes, do our best to eat well, get enough rest/sleep, reduce stress to the best of our ability and carry on. Life is complicated enough without our adding junk nutrition to the mix.

In the immortal words of Garfield the cat, what is diet, after all, but die with a “t”?


Random Thoughts and Staircase spirits

Time, said Auden, will say nothing but I told you so. Time also gives one the opportunity to brood – darkly – on so many of the idiocies out there in the ever-expanding world of health information. So here, in no particular order, is what’s been making me especially cranky:

Monster under the bed roams city streets  

Diabetes, the latest health scourge to hit the news, is now a City of Vancouver problem, at least according to a headline in a throw-away newspaper I threw away:

“Vancouver to track and attack diabetes”. With what, one idly wonders. Bicycle spokes dropped on those bicycle lanes? Pointed sticks? Stern warnings? Nothing so mundane, it turns out. This, apparently, is part of some international initiative (a word that sets my teeth on edge) and creme de la creme cities like Houston, Mexico City, Copenhagen, Shanghai and Tianjin (where?) are on board, tracking “people at risk of diabetes” as part of a campaign to promote “healthier cities”. Curiouser and curiouser. Who knew cities were sentient and could get sick?

So the plan is – what? Skulk behind anyone leaving Starbucks with a large, frothy coffee? Tap anyone who seems a bit plump on the shoulder and read them the health riot act? (Honestly officer, it’s this outfit. Makes me look fat.)

Someone with the unlikely title of managing director of social policy at, one assumes, the City of Vancouver will start “consultations” with Vancouver Coastal Health and – wait for it – Novo Nordisk, the sponsor of this demented plan.

Of course. Silly us, not to have realized a drug company had to be involved.

Must be diabetes lurking back there in them there bushes….


Novo Nordisk, a nominally Danish but probably multinational drug company, almost exclusively manufactures diabetes drugs (oral hypoglycemics) as well as some types of insulin. (The old insulin, by the way – the non-patentable kind that came from animal pancreases and was easily tolerated – isn’t around any more, at least on this continent. Banting, bless him, donated his discovery to the people of the world; he didn’t believe anyone should benefit financially from diabetes. Unfortunately he had no way of knowing that by the late 20th century pretty much anything could be “property”: manufactured and sold, up to and including a person’s genome.)

This diabetes sneak attack has already started up in Houston, where they “mapped” various areas (for what, one wonders) and went door to door to “educate” people about diabetes. Then, if their numbers don’t match some ideal level, no doubt they need some of Novo Nordisk’s boffo drugs. (This class of drugs, by the bye, doesn’t tend to have a long shelf life, as these drugs are usually fairly toxic to the liver and quite a few of them have come and gone.) These hapless people will be told to get their fasting glucose and A1C* checked, and down the rabbit hole they will go. We will all go.

These days, after all, it has nothing to do with the actual human being who may be in there somewhere; it’s all about the numbers. (There’s an American drug ad that doesn’t even pretend it’s about anything but “bringing your numbers down”.) I suppose racial profiling could play a part as well, given that, statistically, people of South Asian, Hispanic, Asian and First Nations background may be at greater “risk” – whatever that means.

What few people realize is that this ostensible epidemic of type 2 diabetes sweeping the world has much to do with the continual lowering of inclusion criteria. A few decades ago “normal” glucose levels were around ten. Now they’re about half that. For people over 50 the latter number is especially problematic, as close to half of us, as we age, tend to have somewhat higher levels of glucose. And if you think about it, it simply makes no sense to call a physiologic change that affects close to half the population in a particular demographic a pathology. It’s what’s called, um, normal.

As for me, if anybody tries to corner me and talk to me about my diabetes risk, I plan to run shrieking into oncoming traffic. At least that’s a risk that makes sense.

Fight them on the Beaches

In that previous story what initially struck me was the term “attack”. As though a glucose level that could potentially be problematic was some kind of enemy – not some fluctuating number based on a myriad of factors ranging from weight to diet to sleep. A number that moves up and down depending on the time of day and a host of other factors.

Physiology is dynamic, not that you’d ever know it these days given how mesmerized we are with the numbers.

Oliver Sacks, RIP

Someone who understood the complexities of physiology – and stood up for clinical knowledge and patient narratives – was Oliver Sacks, who died last August.

Physician, author, eccentric and possessor of a host of oddball characteristics, Sacks wrote some amazing books (Migraine, The Man who Mistook his Wife for a Hat, An Anthropologist from Mars, A Leg too Few are some of the ones I enjoyed reading. Apologies if I got the titles slightly wrong as I’m quoting from memory). Most important, his writing reminded us of the diversity and variation(s) between us, not simply the similarities that clinical trials, statistical averages and guidelines exploit. Sick or well, we’re all different and, to paraphrase Hippocrates and Osler and other famous sorts, medically the person with the disease matters as much as the disease. Or ought to. Alas, the trajectory of modern medicine, whether it’s so-called preventive care, apps or genetics, has a tendency to iron out those differences and push us towards some mythical average or “normal” that few of us come close to.

Colourful, thoughtful clinicians like Sacks have become vanishingly rare. Perhaps it was Sacks’s own differences – Jewish, gay, former biker and user of psychoactive drugs, gefilte fish aficionado – that made him realize just how much one’s personal history and narrative play into one’s physiology. Or just how vital it is for clinicians to listen as well as talk.

Dem bones, dem bones

L’esprit de l’escalier is a French phrase referring to all the pithy remarks one ought to have made but which only come to mind some hours later. Usually as one’s interlocutor is long gone.

So, to the pleasant woman who came up to me after my CAIS (Canadian Association of Independent Scholars) talk last year to ask about vitamin supplements, more specifically calcium: what I omitted to mention was that calcium is not a vitamin, it’s a mineral. An element, if one wants to be pedantic: Ca, number 20 on the Periodic Table. Hence the “elemental calcium” you can buy in the drug store.

The notion that we all need to take calcium supplements for our bones is based on a somewhat simplistic notion, namely that simply ingesting this mineral will somehow magically increase the bone density we are told we are losing at an alarming rate, especially if we are women over 50. Clever advertising ably preys on our fears of “weak” bones, metaphors being what they are.

Bone is an amazing substance. It is dynamic – the collagen demineralizes and then degrades even as other cells (in sync) remineralize the collagen that has just … diminished, for want of a better word. It ebbs and flows (how else could a broken bone heal?) to achieve a balance; a balance that alters with age. When we are young and growing, bone builds to its apex, in our twenties. It then plateaus for a time; then, as we pass age 35 or thereabouts, we gradually lose bone density. This is what we used to recognize as normal development. And the bone in your body differs in form, hardness and elasticity depending on where it is and what it does – the vertebrae in your spine and the long bones in your body are of a different consistency and respond to changes in pressure differently than the ribs or the wrist.

The calcium/Vit D directive has become so engrained, however, that most people believe what they are doing is somehow maintaining or feeding their bones with supplementation.

But our endocrine system monitors the blood level of calcium and maintains it at our personal set point, one that is different for each person. This means that taking in more calcium is generally pointless, as it simply cannot be absorbed. To quote Nortin Hadler, an MD, in his book The Last Well Person: “If the blood calcium level trends down, vitamin D is converted to an active metabolite, which makes the intestinal absorption of calcium more efficient and vice versa”. More is not better; it’s useless. And potentially harmful, as calcium can deposit in joints and other bits. As for vitamin D, it too has a set point that differs in each person; too-large doses can build up and become toxic. So those generic amounts you’re advised to take may or may not apply to you. Probably don’t, in fact.
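
For the curious, the set point Hadler describes is just a negative feedback loop, and a toy version fits in a few lines of code. Everything here – set point, gain, absorption fraction – is an invented illustration, not physiology:

```python
# A toy negative-feedback loop in the spirit of the Hadler quote: when
# the blood level drifts below the set point, absorption is dialled up;
# above it, dialled down. All numbers are invented for illustration.

SET_POINT = 2.4   # pretend blood calcium level
GAIN = 0.5        # how hard the system corrects per step

def daily_step(blood_ca: float, intake: float) -> float:
    """One day: feedback correction plus a small fraction of intake absorbed."""
    correction = GAIN * (SET_POINT - blood_ca)  # negative when above set point
    return blood_ca + correction + 0.01 * intake

level = SET_POINT
for _ in range(10):
    level = daily_step(level, intake=5.0)  # pile on the supplements...

print(round(level, 2))  # 2.5 -- parked near the set point, not 5x higher
```

However much you pour in, the loop keeps dragging the level back towards the set point instead of letting it climb in proportion to intake – which is the sense in which more is simply useless.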

We tend to think of the supplements we take as a kind of top-up to diet, like adding oil to a car or salt to soup. Our bones rely on calcium, so we basically assume that bone density is improved by taking supplemental calcium. And since our bones contain calcium, and as we get older our bones become less dense, we should “supplement”. It’s a mechanistic way of thinking about the body, one that took off after the Industrial Revolution when an “engineering mentality” took hold about physiology (in anthropologist Margaret Lock’s term). It certainly doesn’t hurt that the nice people at Bayer (who are taking over the world and now sell everything from vitamins to glucose meters) continually tell us we should. Alas, physiology is rarely so cut and dried, and our understanding of how bone (or anything else) works remains primitive.

The real advantage of dietary calcium comes when we are young and our bones are developing (in our teens). Unfortunately, short of building a time machine, there’s not much we can do to change the bone mass we accrued before our twenties.

So for now the basics of health remain the same as they were in decades past. Relax, eat well, exercise and stop stressing out about supplements. Most important: stop listening to all that bogus advice out there. If all we do is obsess about our health, our diets, our bodies – well, we won’t actually live any longer but it sure will seem that way.


*A1C is a measure of glycated hemoglobin in red blood cells that is said to provide a “snapshot” of your glucose levels over the previous three months. It’s rather elegant but it is still a correlation. A good one, to be sure, but correlation is not, as we all know, causation.


Civil Scientific Discourse RIP

It’s no secret that I am not fond of hot weather in general and summer in particular. Making me especially cranky at the moment is the hyperbole surrounding the science/non-science discourse, e.g., around childhood diseases like chicken pox or measles, mumps and rubella (the three dire conditions the MMR vaccine is supposed to prevent). The crux appears to be that either you’re one of those unscientific, Jenny McCarthy-quoting loons who believes vaccines cause autism – or you’re a normal, nice, sane person who believes in science. Paradoxically, science appears to have gained the status of a deity in this discourse.

No need to get hysterical about skepticism, Hume might say.

Case in point, a headline last year: “Shun anti-vaccine talk, SFU urged”. Some anti-vaccine conference was going to take place at an SFU campus and an angry group of critics was hopping mad lest this event “lend credibility” to this “dangerous quackery”. This, er, quackery was some symposium where the discussion was on how “families are facing increasingly intense pressure from the vaccine lobby and big government to comply with vaccine mandates”, organized by something calling itself the “Vaccine Resistance Movement”. Hardly saving the free world from tyranny, but hey, the resistance carries on, large as life and flakier than thou.

The 18th century philosopher David Hume, the granddaddy of skepticism, would no doubt be turning in his grave at this hysterical, humourless assault.

BC’s Chief Medical Officer replied in his usual vein: “Vaccines, like any medicine, can have side effects, but the benefits … outweigh the risks.” Which is true. But in the abstract one can wonder whether suppressing all childhood diseases may have immune consequences. Especially the trend towards vaccinations against diseases “such as chicken pox which cause only inconvenience rather than danger”, in the words of British sociologist and science and technology writer Trevor Pinch (in Dr. Golem: How to Think About Medicine by Harry Collins and Trevor Pinch, University of Chicago Press, 2005). Especially given the sheer number of jabs (approx. 20, I think) that infants now get.

SFU president Andrew Petter apparently refused to cancel anything, merely saying universities stand for freedom of expression and, as far as I know, the conference went ahead. I have no idea what was discussed but I suspect it was a lot of nonsense. That’s not the point. What’s perturbing is the vitriol of the protesting group and the smug suggestion that if one dares to question the “science”, or wonder out loud whether these might, just might, have adverse immune or other effects, one has no right to speak. Either you toe the party line or you’re a crazy person. One who should be run out of town on a rail, to coin a phrase. (I’ve never been sure why being run out on a rail – which to me implies a train – would be such a bad thing. Personally I am mega fond of trains.)

The photo of the conference protestor indicates that the group (“The Centre for Inquiry”) is just as obscure as the one they’re protesting. Maybe the whole thing was a publicity stunt or performance art, who knows.

Any child not vaccinated against the measles should not be allowed in school, someone firmly said to me last month. Measles can cause deafness and blindness, not to mention encephalitis, someone else said. I mildly agreed, merely pointing out that the numbers on these dire effects in the developed world were actually vanishingly small, at least based on the (admittedly limited) research I had done. Buried in the contradictory numbers, one small group of children was clearly at risk from measles, namely children undergoing cancer treatment.

Years ago, when I wrote a book on the immune system, I did a bit of desultory research on measles; there was some evidence that a natural bout of measles appears to reduce the incidence of allergies and asthma in later life. (Operative word appears – the data was correlational and based on medical records; there is no way to know for sure if this was cause and effect. Bearing in mind that many health recommendations, e.g., lowering cholesterol, are based on correlation.)

Immunologically, measles might have a modulating effect, in a way allowing the immune system to become less inappropriately reactive and reducing the incidence of asthma and allergies or other autoimmune conditions. Perhaps this struck a chord with me because in my own case a natural bout of German measles (rubella) cleared the bad eczema (also an autoimmune over-reaction) I had suffered since I was two or three. Large, itchy welts covering my legs, arms and face, especially knees and elbows. Then poof, I get sick when I am nine or thereabouts, high fever and whatnot, and my eczema essentially clears. I still occasionally get eczema, usually in reaction to an allergen (like aloe). But, by and large, I’m fine. The research I did years later gave me a context for that (better than my grandmother’s “well, the high fever burned it off”, which made the eczema sound like a forest fire – though, come to think of it, that’s not the worst description).

But when I wondered out loud some weeks ago if maybe, maybe, overzealous vaccination programs could have anything to do with the increase in peanut allergies, you’d have thought I had suggested a plot for Criminal Minds. It was speculation, people. I’m not the vaccine police.

I’m not sure quite how this binary, myopic perspective evolved and became so engrained, but it seems now that any questioning of standard medical dogma (“sugar is bad”) ends up as some version of t’is/t’isn’t, t’is/t’is NOT: all the subtle dynamics of a nursery school. Either you’re a feeble-minded dweeb who fell for the fraudulent, discredited Wakefield Lancet article linking vaccines with autism (actually GI problems, not autism, but that’s lost in the mist of rhetoric) – or a sensible, right-thinking person who believes in science, good government and iPhones. (As it happens I now have a Blackberry Z10, which I think is far, far superior. Were we to pause for a commercial break.)

Science is a method. Science is fluid; it moves forward by asking questions and trying to find empirical evidence to back them up. It is not dogmatic or static. It’s not perfect, but at this point it’s the best we’ve got. But I guess if you’re going to turn science into a religion then it will end up that way.

Pity, since scientific inquiry was, to a large extent, what dragged us out of the Dark Ages.


Why cats make the worst patients (and the dog ate my homework)

Charlie stopping to smell the flowers in healthier times

Charlie, one of the cats, was seriously ill and Lyme Disease (which was the designated subject for this post) went clear out of my head. It shall return. Meanwhile, I’ve been nursing Charlie, aka Houdini cat (who will literally disappear into the towel you think you’ve wrapped around him securely), reminding myself that nursing is a noble, noble profession. (That’s what you call professions that are bloody hard and nobody appreciates.) I’ll say one thing, taking a cat to the animal hospital does give one a quick lesson in the perils of for-profit medicine (my Visa may never recover) – especially in our risk-obsessed age where tests and scans trump individual history, personality and symptoms (human or animal). It also reminded me that one must be vigilant when faced with the ponderousness of Expertise.

In Charlie’s case it began with a neurological condition called Horner’s, an irritation of a nerve running down one side of the face and eye, down the neck and into the chest – not a disease but a symptom. Naturally, Expertise immediately rushed to the worst possible diagnosis: lymphoma or, in a pinch, brain tumour. (Do not pass ‘go’, just head for the hills.) I mildly posited inflammation or infection, probably ear related, particularly since Charlie’s had those before. But noooo.

Critical Care, human or animal, is rife with Expertise: grave, gravel toned and confident. Why? Because they have tech toys, that’s why. Cool devices and imaging technologies that purport to explain the mysteries of life. Even (ha ha) a cat scan. All of which push the patient into ever higher levels of care – because they can. Problem is, the patient often can’t.

I tried to hold my ground but it’s a slippery slope, that one; the surer they are the more one caves, especially when they start to say, well, with cats an elevated white blood cell count could mean X. I mean, what do I know from cat physiology?

So the cash register tinged and Charlie looked steadily worse. Of course nobody looks good in ICU between the ugly fluorescent light and the tubes but there’s something especially pathetic about a small furry creature sitting in a cage. And Charlie, well that cat could have taught Stanislavsky a thing or two about looking sad.

I kept getting calls to tell me things I already knew (he has a heart murmur). The last time I snapped, “I know. I have one too. Big deal.” That didn’t, naturally, stop them from getting a cardiology consult. Bearing in mind that cats don’t hold still for much of this, so they need to be anaesthetized.

Finally, after every possible dire diagnosis had been ruled out, we came round to my original hypothesis: ear infection.

Don’t get me wrong. I have enormous respect for veterinary physicians. They study long and hard (far longer than human doctors) and by and large they are great. They deal with a diverse patient population that’s uncooperative and uncommunicative. And when I say diverse I’m talking species. And they need to make a living; I get that.

What they, and most of us, do not get, however, is that they are part of the culture at large, and the culture at large is obsessed with the “science” of medicine, leaving the art further and further behind. Watching Charlie work his way through the system reminded me of just how much medical focus has shifted away from the patient and towards disease and technology; towards what tend to be called “objective” results (versus the messy subjective ones patients bring).

I see this on a human level every time I go to the retinologist with my mother (that, by the bye, is a sub-specialty of ophthalmology). First, they get her to read the letters on the chart and are all impressed at how well she sees. Then they take their pictures and look grave: how could she possibly see that well with those terrible ridges in her retina? (To me they just look like the Alps.) Then they look puzzled. The scan says you can’t see, but you actually did. What a gonzo dilemma. So they go with the scan and give her the medication. Objective trumps subjective.

Question is, should it? Does it make sense for the patient to get lost in this morass of ostensibly objective ‘data’?

Not to my way of thinking. “Normal” – blood pressure, lipid level, whatever – is a best-guess average based on population statistics and what some committee has deemed appropriate. If you’re truly sick it shows up. C-reactive protein in the clouds – well, there objective and subjective tend to match: your joints hurt, you have some kind of inflammatory condition, and the test backs you up. It’s the grey zone that’s problematic. Levels fluctuate in every individual, and tests can be wrong (some more than others) – error rates in some tests run as high as 75%. But we forget that.
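
Just how wrong? Here’s a toy calculation – every number in it is invented for illustration, not taken from any particular test – showing why a “positive” result in that grey zone can still, more often than not, be a false alarm:

```python
# Toy numbers, purely illustrative – none of these figures describe any real test.
prevalence = 0.01            # 1% of the people tested actually have the condition
sensitivity = 0.90           # the test catches 90% of true cases
false_positive_rate = 0.10   # 10% of healthy people test "positive" anyway

true_positives = prevalence * sensitivity                 # 0.9% of everyone tested
false_positives = (1 - prevalence) * false_positive_rate  # 9.9% of everyone tested

# Of all the positive results, what fraction are real?
ppv = true_positives / (true_positives + false_positives)
print(f"Chance a positive result is real: {ppv:.0%}")     # about 8%
```

With those made-up (but hardly outlandish) numbers, better than nine out of ten “positives” are false alarms. Which is exactly why the back story matters.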

So, cat or human, we are lumped in with the many-too-many – and our individual narrative gets lost. In Charlie’s case nobody believed this pretty little cat, who had only been ailing for a week, could possibly “just” have a madly inflamed ear affecting his balance and appetite. An infection is no picnic. But it’s not a brain tumour. And of course Charlie’s Oscar-winning ability to look mournful didn’t help. This cat can look sad when he feels ignored; imagine how dreadful he looked when he was dizzy and queasy. It’s a gift. But it’s not diagnostic.

You need a proper history; the back story. The person with the disease is as important as the disease, said Hippocrates. Let’s say you end up in hospital with severe abdominal pain. It matters whether you’ve had this pain before, but less intense or of shorter duration. Sudden abdominal pain could be many dire things; a worsening of an existing problem is probably nothing that will kill you (otherwise you wouldn’t be in the ER in the first place). The clinical picture changes with the history. Someone has to factor it in.

Charlie’s doing better now. As for the rest of us – who knows. We may never survive the tech age.

Is there an epidemiologist in the house? *

There’s an appalling advert for one of the adjunct health unions/associations/whatever in which someone collapses in a restaurant and the doctor starts to call for various “technologists” (x-ray, CT scan, etc.). I don’t doubt their word that health care today is complex; still, call me crazy, but if I collapsed somewhere I’d really rather have an actual health professional – a clinician, by which I mean a nurse or physician – at my side than someone who knows how to operate an ultrasound machine, thank you very much. Remember, machines don’t “know” whether something is a concern or not; it’s people, actual humans, who make that determination.

A determination that all too often relies on over-optimistic beliefs regarding the accuracy of “tests”.

(But as neurologist and author Oliver Sacks once sadly remarked: they don’t give Nobel prizes to clinicians, only medical researchers.)

Worse, everything from the images and data we get from those machines to the health information that’s flung at us from all sides is based on statistics. More accurately, on a statistical approximation of “normal”: the normal person, whatever or whomever that may be. Another term, whether one is being statistically accurate or not, is “average”.

Now I don’t know about you but I’ve never met an “average” person. Everyone I know is distinctive, sometimes eccentric, often interesting, funny and, well, different. People are a jumble of ethnicities and backgrounds, socio-economic and otherwise; their education and passions and hobbies and interests differ, as does everything from their diets to their bad habits. Er, risky behaviours, to the epidemiologists in the house.

So here we all are, contorting ourselves into bizarre shapes trying to fit the statistical moulds they’d have us fit, from the not-so-benign lipid levels and blood pressure (for which drugs are available should one not conform to the aforesaid norm) to clothing sizes and availability in everything from lipstick colours to food. Oh yeah, tell me you haven’t noticed that your favorite kind of frozen chips appealed to you and six other people, so it’s been discontinued.

From supplements to ideal weight, glucose and you-name-it, normal follows us around like some malevolent mosquito, buzzing in our ear and biting us in the you-know-what when we try to ignore it – whether it’s Dr. Google or the news items on everything from your phone to your TV.

But when we’re feeling off, or sick, or have had something bad happen, what we want and need is a clinician: someone who knows how to set that broken bone, do that tracheotomy or CPR, and knows just how much morphine to prescribe so we’ll keep breathing. Unfortunately, the spate of bad health news out there makes us all so nervous that all too often, when we do end up at the clinic or the ER, we’ve got nothing more terrifying than bronchitis or a particularly bad bout of cystitis. Not for us former generations’ stoicism; we race over ‘just in case’ for everything from a sore knee to a cough.

An American chap once disparagingly told me Canadian health care was simply dreadful. How did he know? Well, when he lived in Montreal he had a bad cold. One assumes in winter, when people in cold climates get those. Then, late one Saturday night, he decided to head over to Emerg because his cough was worrying him. Could he breathe? Yes. Was he running a temp? No. But he went anyway. And couldn’t figure out why the ER staff didn’t rush him off for tests and x-rays. Ah, d’you think you could have picked a less busy time? Of course not.

Where’s an epidemiologist when you need one?

* Not my line, though I wish it were. I read it in Gordon Clark’s column in The Vancouver Province on July 8. Laughed out loud as a matter of fact.

They got stones, I’ll give you that

I was going to call this post “nobody knows the trouble I seen”, except that it seems ludicrously self-indulgent to whine that one has been living through construction hell when the rest of the world has revolutions, civil wars, hurricanes and so on to contend with. (But, to paraphrase Will Rogers, everything is manageable provided it’s happening to someone else.) This isn’t to say my curmudgeonly instincts have been dormant. One particular item a while back had me seething.

“Sugary drinks are not so sweet” was the headline in the Health section of the Globe and Mail (24 May 2013). Apparently, drinking a sugar-sweetened drink a day not only rots your teeth and adds up to empty calories (with the added bonus that it makes New York’s Mayor Bloomberg crazy) but “may increase the risk of kidney stones”. Gasp. I had to pause to take a sip of my ginger ale* while that sank in.

I puzzed and I puzzed, to reference one of my favorite curmudgeons, the Grinch. Didn’t make sense. How on earth could fructose cause blobs of crystallized minerals to form in the kidney? (To reinforce the point that sugary drinks are Evil the accompanying photo was of a surgeon with a scalpel. Someone had fun with that.)

The research cited was from 2007, published in a journal called Kidney International (2008; 73: 207-212). The worthiest journal nobody’s ever heard of. My curiosity got the better of me, so I downloaded the article and read through the cringe-inducing prospective study: yet another data-mining expedition hoping to find a “link” between X and Y. (For more on my distaste for the term, see post.) The data? From – wait for it – the appalling Nurses’ Health Study, formerly used to “prove” that taking estrogen was just a boffo idea. Here, the research draws on some 19,000+ women, along with some 46,000 men from the Health Professionals Follow-up Study. Impressive numbers. Pity the hypothesis is so feeble.

Not, of course, to our heroes, researchers Taylor and Curhan, unspecified experts at a renal division/lab at Brigham and Women’s and Harvard, who engage in enough statistical jiggery-pokery to make the world go round. (Pity nobody blinks when data gets tortured.)

Just a few problems here. As I explained, in often far too exhaustive detail, in The Estrogen Errors, extrapolating to the general population from the Nurses’ Health Study is massively problematic. For starters, there’s its basic design – biennial self-reports – which are notoriously unreliable. We’re all prone to error, as any gibbon with half a brain knows: we forget, lie and generally get things wrong. Good grief, most of us stutter when asked how much we weigh when we get a new driver’s license. Plus, there’s the healthy user bias – people who respond to any questionnaire tend to be richer, smarter, better off, i.e., healthier than the average bear. Er, person. Often they’re white and frequently younger. All of which means they are not like the real at-risk population, which, by and large, tends to be poorer, less educated, older, more diverse, less health and diet conscious, more stressed and sicker. Face it, d’you think you’d have time to sit around reading some blog if you had to work two or more minimum wage jobs just to put food on the table and pay your rent? Could you even afford an iPad, or even high speed internet?

This is on top of the fact that professionals in general can’t stand in for “everyone”, and basing one’s conclusions on what these people do (or say) is what’s popularly known as being spectacularly wrong.

What really interested me, though, was what these researchers thought might be going on physiologically. In other words, how did we get from basic sugar metabolism to blobs of crystallized minerals in the kidney? Gremlins? Evil spirits? The authors do obligingly admit that the underlying mechanisms are “unknown” (ah, ya think?!) but postulate various processes, none of which make sense. Hence their masterful use of language:

“Fructose may also increase urinary excretion of oxalate, an important risk factor for calcium oxalate nephrolithiasis. Carbohydrates, along with amino acids, provide the majority of the carbon for glyoxylate and oxalate synthesis, and fructose may be an important dietary sugar influencing the production of oxalate.” (emphasis mine)

The authors concede backing for their hypotheses is “sparse”; personally I would have said nonexistent. Rats make up the bulk of the research subjects in this section, and the one study they cite using humans consists of eleven – yes, 11 – men whose pee was analyzed for calcium loss (versus calcium intake). Fructose intake made no difference in the calcium these men excreted, but the researchers still concluded that the reason fructose-laden drinks caused kidney stones “may be related to the effect of fructose intake on urine composition”. How they concluded this I have no idea. Maybe they were on a sugar high.

The only marginally plausible explanation had to do with uric acid metabolism and for a moment I thought, OK, this might make sense. Then I checked the reference and realized it only applied to people with gout, whose uric acid metabolism is already dysfunctional (that being the definition of gout). 

Kidney stones, by the bye, are hardly that common and rarely if ever life threatening. Even Wikipedia’s overblown, hyperventilating piece on the topic, which sounds as though it was written by a nephrologist who had just passed one, admits that the incidence, or number of new cases a year, is “0.5%”. (Of course it doesn’t specify 0.5% of what, which is rather an important point, but let’s not nitpick at this late point in the post.)

How did this 5-year-old study even make it into the health news section? Having spent some years as a medical writer and journalist, I can tell you exactly how. A group of people in an editorial meeting, drinking coffee – or pop – were bouncing around story ideas, and someone suggested a piece vilifying soft drinks, currently Public Enemy No. 1 (see NYC, Bloomberg). So they wrote the headline, then contacted the hapless writer, who cast about for some new and nifty problem that could be blamed on aforesaid sugary drinks. Everyone knowing full well that the majority of people only read the headline and the first paragraph; it’s only mutants such as myself who check the original research and parse the methods section.

If sugary drinks do give you kidney stones, these people didn’t prove it.

There are a lot of good reasons to consider soft drinks a treat, not a staple. They’re empty calories; they rot your teeth and many of them contain fairly high amounts of caffeine which can make you nervy and insomniac. But kidney stones? Really?! We think not. And it takes stones to say they do.

* oh get over it. It’s summer. There’s construction outside. Yes, I have the occasional ginger ale or Coke. Sometimes, when I’m especially cranky, two days in a row. Sue me.  

So they continue being a pain…

Painkillers increase risk of car crashes, proclaims the headline in today’s Globe and Mail. Apparently, researchers at the Toronto-based Institute for Clinical Evaluative Sciences have found a correlation between even low-dose, regular opioid use (two Tylenol 3’s three times a day) and an increased risk of car accidents.

Not a huge risk, the lead researcher, David Juurlink, hastens to add; certainly nowhere near as high as alcohol, but a risk nonetheless.

Wonderful. Two of my favorite things in the same piece – correlational studies and experts rambling on about opioids – with the blinkered experts continuing on their merry way, all pleased with themselves and sending out press releases. (Don’t kid yourself, that’s the only way a paper from something called the Institute for Clinical Evaluative Sciences, which nobody has ever heard of, would get a piece in the Health section of the Globe and Mail.)

Um, did it ever occur to these geniuses that the reason people take those drugs, namely pain, might have something to do with those slightly increased numbers of car crashes? I use the term slightly advisedly: the risk increases between 21 and 42%, according to the “scientists”. (Scientists in quotes because surely any scientist worth his salt knows that unless you know what you’re comparing something to, a percentage – a relative risk – is absolutely meaningless.)
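
To see what that might mean in absolute terms, here’s a back-of-the-envelope sketch. The baseline crash rate below is a number I made up purely for illustration – the article gives none – but it makes the point:

```python
# Illustration only: the baseline rate is invented, NOT from the study.
baseline_risk = 2 / 10_000        # hypothetical: 2 crashes per 10,000 drivers per year
relative_increase = 0.42          # the upper figure quoted in the article

risk_on_opioids = baseline_risk * (1 + relative_increase)
absolute_increase = risk_on_opioids - baseline_risk

print(f"Baseline risk:     {baseline_risk:.4%}")      # 0.0200%
print(f"Risk on opioids:   {risk_on_opioids:.4%}")    # 0.0284%
print(f"Absolute increase: {absolute_increase:.4%}")  # 0.0084 percentage points
```

On that (hypothetical) baseline, a scary-sounding “42% higher risk” works out to less than one extra crash per 10,000 drivers a year. Headline-worthy it is not.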

Surely pain – which means someone gets more easily fatigued and could become less alert – could have a thing or three to do with it?

Oh no, it’s the opioids.

Of course, by the same token, ice cream causes an increased number of drownings. Think about it. In the summer people eat more ice cream – and more people drown. QED.
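
For the statistically inclined, the ice cream fallacy takes about a dozen lines to reproduce. A toy simulation – all invented numbers – with summer heat as the lurking variable driving both:

```python
import random

# Toy simulation: temperature drives BOTH ice cream sales and drownings,
# so the two correlate strongly despite having no causal link whatsoever.
random.seed(42)
temps = [random.uniform(0, 35) for _ in range(365)]          # a year of daily temperatures
ice_cream = [2.0 * t + random.gauss(0, 5) for t in temps]    # sales rise with heat
drownings = [0.1 * t + random.gauss(0, 0.3) for t in temps]  # swimming rises with heat

def corr(x, y):
    """Pearson correlation, computed from scratch to keep this dependency-free."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y)) / n
    vx = sum((a - mx) ** 2 for a in x) / n
    vy = sum((b - my) ** 2 for b in y) / n
    return cov / (vx * vy) ** 0.5

print(f"ice cream vs drownings: r = {corr(ice_cream, drownings):.2f}")
```

Run it and the correlation comes out strongly positive (around 0.9) – a “link” every bit as real as the one between opioids and car crashes, until you control for the thing actually doing the work: heat there, pain here.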

Last March I wrote a post on OxyContin and made some disparaging noises (OK, loud, angry noises) about the ado being made about addiction and painkillers. Notably, a Fifth Estate episode that had me virtually apoplectic with rage. Using largely American stories, the CBC newsmagazine insisted that addiction to OxyContin was a massive problem that we should all get worked up about, especially when it came to First Nations communities in the north.

By contrast, a few weeks ago I happened to come across a BBC mini-documentary about the same topic, and the difference could not have been more marked. I missed the start of the piece but what I did watch was superb. It was a program called “Our World” and the journalist’s name was Linda Sills. (I hope that’s how one spells it.) She had travelled to several communities in northern Ontario, spoken to various tribal elders, artists and addicted individuals and – wonder of wonders – had actually done some research and thought about the subject.

Sills and the people she spoke to all agreed that the problem was not opioids (in the ’80s it was alcohol and in the ’90s glue sniffing) but the situation. The environment. The socio-economic conditions. When people are unhappy and hopeless they take solace in drugs – whatever is around, whatever they can get. Solutions are complex, multi-factorial and must emerge from the grassroots of the community itself. An artist who looked to be in his forties, addicted to OxyContin himself, talked of how his art was helping him reduce his drug intake (even though he genuinely looked as though he was still in pain, physical and psychic).

Opioids have been around for thousands of years. Officially they were discovered around the time of the Trojan War (war has always been excellent for medicine), but no doubt people knew of the pain-relieving properties of the poppy long before that. They are the single most effective agent in treating pain, and although we’ve tried to come up with synthetic variants (Demerol, Fentanyl) and alternatives (non-steroidal anti-inflammatories), there simply has never been a drug that works as well, as consistently.

Treating pain with opioids allows people who suffer from chronic pain to function. To have lives. To work, interact with families and friends and feel like they are part of the world. But in recent years, perhaps with the rise of right-wing moralizing in the U.S. and what some people call the rise of the nanny state, we have taken a sharp turn away from treating pain and towards calling individuals who need medication “addicts”.

Our reverence for numeric reasoning and bad statistics naturally hasn’t helped any; after all, what could be more qualitative and unmeasurable than pain, which, by definition, is whatever the person says it is?

Boundless enthusiasm for Overtreatment

Last week Slate ran a piece, sent along by my friend Maryse – whose blog, Frogheart, covering nanotechnology, art, technology and so on, is immensely popular (one tries very hard not to be too envious of her close-to-a-million daily visitors) – based on an update in the respected Cochrane Review: how treatment of mild hypertension is essentially useless.

What neither piece points out is that what we call “mild hypertension” today (systolic 140-159) was considered essentially normal a scant fifteen years ago. Well, 140 anyway. Or that thoughtful (often older) clinicians would not consider this hypertensive in older patients today.

Ah, it’s just a number, people. A number determined by a group of individuals – often cardiologists, but also other “experts” (many of whom have ties to the drug companies that make antihypertensive drugs) – as to what should be considered “normal”.

I’ve spent much of my research career debunking this notion of “normal”, particularly as it pertains to physiology, biology and humans, who, as we all know, tend to come in a variety of shapes and sizes, and whose health status is determined by many variables, not the least of which is how much money they have and how happy they are in their lives.

Women, of course, have long been outside this matrix – “normal” consisting essentially of the male body, without its circadian rhythms and cyclic hormonal elements, never mind pregnancy or menopause. The vast majority of clinical trials, the gold standard of evidence as it has been called, excluded women altogether, and even when researchers tried to bring them in, women themselves often wouldn’t play ball.

The reasons seem complex: social, domestic, personal, economic and psychological. Women generally have been socialized to be risk averse, which means that if they are told they have condition X then they want the damn treatment. They don’t have time to worry about whether or not they’re taking the placebo. Plus, large multi-centre trials require time, not to mention transportation, to get to those bi-weekly weigh-ins or tests or what-have-you, and women, particularly women over 40, tend to be overwhelmed with children and grandchildren and ageing parents and work and housework and life. “Who’s got the time to enter a trial?” most will ask. “I’ve barely got time to sit down, never mind volunteer my time at a clinical trial.”

No doubt there are other reasons but at this point I haven’t researched it. I just know that women are vastly underrepresented in what we optimistically consider evidence-based medicine.

I see something inherently male and American in this perspective, this enthusiasm for aggressive treatment. (As the cultural critic Lynn Payer remarked in her wonderful book Medicine and Culture, there has to be something culturally satisfying in the notion of ‘aggressive’, given how often the term is used in American medicine; even recommendations for gentler treatment of newborns were to be pursued aggressively.) Or overtreatment.

Cross-cultural studies have repeatedly shown that countries like Canada, which can’t afford as many cardiac surgeries and procedures as the U.S., as well as countries like Finland, which simply doesn’t believe in them, have the same outcomes as the U.S. In other words, Americans spend huge amounts of time and money doing things – cardiac bypass, cardiac catheterization, stents, etc. – but American cardiac patients are no healthier than those in countries that do half the number per capita. All that activity doesn’t result in better health or lower morbidity or mortality.

Less is often more in medicine. And bodies are fragile. Drugs, surgeries, procedures, tests: these are not benign. They exact a toll on the body. And all for what?

All because somebody somewhere decided they knew what was best and what magic number was “normal” blood pressure. Or what an artery “should” look like in a person with no symptoms.

The worst part is that as patients we are complicit in this, increasingly believing that more is better – rejecting the notion of watchful waiting and considering a physician who says, “just take it easy for a while, it’ll get better on its own”, a quack. So fewer and fewer physicians say such things. As a doctor once said at a conference I attended: it’s easier to just write the prescription than to take twenty minutes to explain to a patient (who’s not going to believe you anyway) why she or he doesn’t really need it.

But hey, we wouldn’t want to miss out on something that could be really terrific now, would we?!