Category Archives: confounds

Drunkard’s Fallacy


Two lads are making their way home, after some jars at the bar: “Seamus, would you give over circlin’ round that lamppost? You’re makin’ me head spin!” “Ahh, but Desmond, I’m tryin’ to find me feckin’ keys.” “Oh, now, Seamus, I t’ink I heard a ‘clink’ when we was coming t’rough de alley, back dere.” “Yeah, me an’ all, Desmond; but de light’s better under dis lamp.”

This shocking stereotype of Irish inebriation and false logic was offered to us in graduate school, in a course on research design, to illustrate the “Drunkard’s Fallacy” [the tendency for researchers to “search where the light is better,” and thereby overlook the “keys in the alley”]. Sir Francis Galton, for instance, believed that intelligence was highly correlated with head circumference [which is easily and cheaply measured]; and he tried to encourage fatheads to marry other fatheads, for the improvement of Mankind. Later, Dr. William Sheldon put forth the theory that one’s body type–fat, muscular, or thin [which is evident, even to the casual observer]–was highly correlated with 3 distinct sets of personality traits. All of which would be highly amusing, except that their “scientific evidence” has been used as the rationale for eugenics–most notoriously, but not exclusively, by The Third Reich.

These days in neuropsychological research, there is often a generous sponsor “paying the light bill,” who then–sometimes blatantly, but other times subtly–sets the agenda for “where to search.” Senator Charles Grassley has done Menschlich work, in my opinion, by doggedly insisting that medical researchers disclose the source of their funding, so that consumers can then take their “findings” with a grain of salt. But what if the funding source is Uncle Sam? Could there still be a tendency to “circle the lamppost,” rather than “go down the dark alley,” in search of scientific “truth”?

In 2005 the journal Nature Neuroscience published the results of an NIMH-funded study, which followed over 200 individuals from birth to 26 years, to assess their risk of becoming “depressed” by Stressful Life Events, and its correlation to which variant of “the Serotonin Transporter Gene (5-HTTLPR)” each individual carried in his or her DNA. [Everyone has the gene, mind; it was the “short” version of it that got fingered as the risk factor.] Don’t you just know, the researchers found the two factors–“depression” in response to bummer events, and the presence of that specific variant–to be highly correlated. Well, the media was all over it like a cheap suit. Cute articles about “Blue Genes” came out like a rash. And the always-only-sleeping-not-dead eugenics lobby began to bang on about genetic screening for 5-HTTLPR, rationalizing that the opposite of bumming out at bad news was Being Resilient; and who wouldn’t want to breed Resilient kids, in these troubled times?

Also–and here’s the Beauty Part, if you’re a government agency, trying to contain costs for Mental Health treatment–if bumming out is “all in your genes,” no need to wander down that dark [time-consuming] alley of “trying to understand what got up your nose, which made you angry, which then made you depressed.” That’s like fiddling while Rome burns. Like trying to understand why your body has become insulin-resistant, or why your arteries are clogged. What you need is a chemical–not an insight. Faster, cheaper, better for Mankind [and the bottom line].

So, here’s the thing. In this week’s issue of the Journal of the American Medical Association: a meta-analysis of all studies which could possibly have replicated the ballyhoo’d NIMH results found no correlation between the two factors. Bupkes, nowt, Nichts. [Incidentally, their n = 14,250, of whom 1,769 were classified as having “depression.”] Do you know what was significantly correlated with “depression” in all the studies these researchers meta-analyzed? Stressful Life Events.
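
For the statistically curious, here’s roughly what the meta-analysts were testing–sketched below in Python, with numbers I’ve invented [emphatically not the JAMA team’s data]. Simulate a world where Stressful Life Events drive “depression” and the gene variant does nothing; fit a logistic regression with a gene-by-environment interaction term; see which predictors light up.

```python
# Toy gene-by-environment test, with invented numbers: Stressful Life Events
# drive "depression"; the 5-HTTLPR-style variant contributes nothing at all.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 14_250                                   # same order of magnitude as the meta-analysis
stress = rng.poisson(1.5, n)                 # count of stressful life events
variant = rng.integers(0, 2, n)              # 1 = carries the "risk" variant
log_odds = -2.5 + 0.45 * stress + 0.0 * variant   # events matter; the variant doesn't
depressed = (rng.random(n) < 1 / (1 + np.exp(-log_odds))).astype(int)

df = pd.DataFrame({"stress": stress, "variant": variant, "depressed": depressed})
fit = smf.logit("depressed ~ stress * variant", data=df).fit(disp=False)
print(fit.pvalues)   # expect: stress tiny p; variant and stress:variant unremarkable
```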

Well, Seamus, I guess it’s back to the dark alley, if we ever want to find those feckin’ keys…


Filed under confounds, murky research, pro bono publico

"Why Keep a Dog and Bark, Yourself?"


Another Mancunian aphorism, meaning: “Why do the thing you pay (or feed) someone else to do for you?” We now launch into a world of pain & suffering, the “fourth irritant,” which has received short shrift in this blog, so far. Since I [Lieutenant Commander Kangaroo] am at the helm, we shall first make a short detour…to the Orient.

Have you heard about the “Bow-lingual” Dog Bark Translator? It purports to analyze your dog’s utterances, and assign each of them to one of 6 mood states: “happy, sad, frustrated, on-guard, assertive, or needy.” Since, of course, it comes from Japan, it offers the dog owner (“just for fun,” the distributors insist) Japanese “translations”: a phrase to capture each of the 6 canine moods. Since, by coincidence, we have trained Lili to respond to Japanese commands, wouldn’t it be fun to find out what she is saying back to us, in Japanese? [Not $213’s worth of fun, I feel.] Just within the past month, though, she has begun to make this new, yodeling sound on her way to the hearthrug “penalty box,” for over-the-top “barkitude” at intruder dogs on our property. My own translation of it is the adolescent’s plainsong chant of protest, “Mah-am!” It’s not exactly defiance–more of a minority report on the unfairness of the sanction. [It’s a struggle not to laugh when she does it.]

Now, back to pain & suffering. Three cheeky chappies @ the University of Keele, in the UK, wished to study the “point” of swearing, in response to pain. Their research design was so cute that it merits some attention. Undergraduates were recruited for a (supposed) study of “the degree of stress that various forms of language elicit during tense situations.” [That’s the dullest phrase in the NeuroReport article, I promise.] Each subject was asked to list “5 words you might use after hitting yourself on the thumb with a hammer,” as well as “5 words to describe a table.” Only those who listed at least one obscenity were included in the experiment. They ended up with 38 males and 29 females. [Already, we’re doing sociology & anthropology, no?] Each subject was tested twice, in randomly assigned sequence–once while repeating the first expletive they listed, and once while saying the ordinally corresponding “table” descriptor from their other list. The pain & suffering inflicted was having their hand submerged in a bucket of ice water [for a maximum of 5 minutes]. So, guess under which condition each & every subject endured pain significantly longer–expletive or furniture adjective? Not surprisingly, swearing out loud is “hypoalgesic.” [It lessens pain perception.]

But why? The boffins from Keele are a bit baffled. The best they can do is surmise that it has to do with amygdalar arousal. Therefore, we armchair researchers can feel free to kibitz. What if there had been a third test condition, besides obscenity & Ikea cataloguery? What if the contestants had also been asked to list 5 Emotive (but genteel) outcries [you know, like “Rats!” or “Crumbs!” or “Bother!”]? How do we know that such polite but vehement expressions of dismay are not equally hypoalgesic? [I don’t believe it for a minute, mind you; but it would have made the study’s findings more robust.]
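
Since we’re kibitzing, here’s how that three-condition, within-subject design might shake out–a minimal sketch with invented endurance times [the Keele team’s actual data are not reproduced here], using simple paired comparisons:

```python
# Invented endurance times (seconds of hand-in-ice-water, capped at 300) for a
# three-condition, within-subject version of the Keele design. The "genteel"
# arm is my hypothetical addition; as simulated here, only swearing helps.
import numpy as np
from scipy.stats import ttest_rel

rng = np.random.default_rng(1)
n = 67                                                     # 38 males + 29 females
neutral  = rng.normal(100, 25, n).clip(0, 300)             # furniture-adjective condition
genteel  = (neutral + rng.normal(0, 15, n)).clip(0, 300)   # "Crumbs!" condition
swearing = (neutral + rng.normal(40, 20, n)).clip(0, 300)  # expletive condition

print("swearing vs neutral:", ttest_rel(swearing, neutral))
print("genteel  vs neutral:", ttest_rel(genteel, neutral))
# If "genteel" ALSO beat "neutral" in real data, vehemence rather than obscenity
# would be the hypoalgesic ingredient -- exactly what the extra arm would test.
```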

Many dog training books assert that when a canine launches into a prolonged bout of “barkitude,” he is producing endorphins, and therefore rewarding himself with a powerful–some would say addictive–stimulus/response loop that soon has little to do with the original cue for barking. [The UPS truck is long gone, but the stoner dog is still “self-medicating,” man.]

Maybe that’s what a human yelp of obscenity evokes in the brain: a lovely hit of endorphin, which makes the icy water much easier to handle. If so, and if the yelp is prompted by amygdalar arousal in response to pain & suffering, then (shock, horror!) maybe the amygdala is Not All Bad. Maybe, like most things found in nature, it’s a double-edged sword. And so, too, [until someone does the “Crumbs!” variation on the Keele study] is the occasional, appropriate human bark of (insert expletive).


Filed under confounds, murky research, pain reduction, stifled wolf

"Who(m) Do You Trust?"


“Me, or your lying eyes?” [goes the old-but-new-again joke]. In the spirit of the Groucho Marx quiz show, You Bet Your Life, in 1957 Johnny Carson [soon joined by Ed McMahon] hosted a daytime game show where the backstories and chemistry between the contestants [3 sets of they-never-met-before “couples”] trumped correct answers. A quiz category was announced, and the man of the mixed-doubles team was asked: “Who do you trust [to answer a question on this topic–yourself, or this dame you just met]?” Cutting-edge battle-of-the-sexes TV! Chivalry vs. machismo! It was like hearing your parents “debate” who knew the faster way to get to the Saw Mill River Parkway in weekend traffic. Riveting stuff for the after-school crowd [the show’s target demographic].

[Cultural digression: My father, whose off-the-boat parents spoke Irish at home, acquired his English through grammar books, and was a stickler for correct–even archaic–usage. When, in this blog, I deviate from Fowler’s Usage into demotic speech, I am using Jakobson’s Poetic speech function, to make a point, innit?] Thus, when my sister and I were recapping that day’s episode for our father, we referred to the show as Whom Do You Trust?

Now, back to pain. And back to one of my favorite hobby horses–“Are you going to trust every so-called research finding, just because it was published in a peer-reviewed journal?” Okay, smokers, this one’s for you. Our old friend Dr. Malinoff has research evidence that “Nicotine stimulates an area of the brain right next to the area that processes pain; smokers’ pain scores routinely exceed the pain scores of non-smokers.”

Now, for you redheads. For decades anesthesiology students were taught to use more pain-killer on their red-haired patients, because their tolerance for pain was experimentally proven to be significantly lower than that of blondes & brunettes. Turns out that the ever-popular bucket of ice water was used to achieve these replicated research findings. Recent studies [using the only other ethically-approved method of inflicting pain for research purposes: electric shocks] have found that redheads are, indeed, more sensitive to cold, but they tolerate a jolt of voltage significantly better than the other groups. I could go on, but you get my skeptical [not to say cynical] point. “It ain’t necessarily so.”

Maybe the researchers are all just [metaphoric] “drunkards, circling the lamppost.” Why not let your own experience be your guide to the “truth” about your very own pain? Well, consider this finding, reported in the APA Monitor [January 07]. Using the “Cutaneous Rabbit Illusion” [where the subject’s arm is rapidly tapped, first near the wrist, then near the elbow, and soon the subject “feels” a phantom tapping sensation between the two spots–quaintly known as “the rabbit hop”], the same area of the subject’s brain lit up on the fMRI, whether the tapping sensation was “real” or only “phantom.” A similar thing happened with more painful stimuli [a rabbit-wearing-golf-shoes, sort of thing], only this time it was the dreaded S1 [primary somatosensory cortex…aka pain center] area of the brain that lit up.

Confused? Ah, then I have achieved my goal. Put your previous beliefs about what causes (and reduces) the sensation of pain “on ice” for a bit [unless you have red hair, in which case…just put them under wraps]. Maybe, some of the old-but-new-again ways of coping with pain have something to offer 21st Century sufferers.

So anyway, is that dark smudge, in the lower right quadrant of the door, the tail of a ravening wolf, or just the head of Napster, the black cat? One would be awful, the other just a little inconvenient. What if you could choose which one to experience? I think you can choose. But who ya gonna trust–me, or your [sometimes lyin’] eyes…arm…S1 pain area? Next stop, the enchanted forest. [What could it hurt?]


Filed under confounds, murky research, pain reduction

Bad Fairy at the Christening


Backstory to Sleeping Beauty: two Good Fairies offer upbeat predictions for baby Aurora; then a Bad Fairy [name of Maleficent] predicts that on the girl’s 16th birthday, she’ll prick her finger with a spindle and die. A 3rd Good Fairy softens the malediction from “die” to “fall asleep.” Then they put the baby into a witness protection program [changing her name to Briar Rose]. You remember the rest.

So here’s the malediction du jour from BMC Medicine 2009, 7:46: based on a decades-long study of 16,496 kids, all born in the UK, in the same week of April, 1970. When they were 10 years old, several tests & measurements were administered. Less subjectively, their Body Mass Index [as well as that of their parents] was obtained by “a qualified nurse.” The Social Class of their parents was calculated, based on Dad’s line of work [if any]. Their teacher filled out a “modified Rutter B” questionnaire [which assessed each kid for how “worried,” “miserable,” “tearful,” and/or “fussy” they were]. Hands up, if you ever were assigned Robert Rosenthal’s 1968 educational classic, Pygmalion in the Classroom. If so, you already know how this study is going to turn out; but don’t spoil the surprise for the others.

Then these UK 10-year-olds were given 3 read-it-yourself-and-fill-in-the-answers surveys. The so-called Self-Report test had just 2 items: “I worry a lot,” and “I am nervous,” to which the kid could answer “Not at all,” or “Sometimes,” or “Often/usually.” [Let’s cut to the chase on this one, and say that it predicted nowt, bupkes, nada.] Ah, but there followed the 12-item LAWSEQ [“yes,” “no,” “don’t know”] to assess Self Esteem; and the 16-item CAROLOC [“yes,” or “no/don’t know”] to assess External/Internal Locus of Control. The scoring on each test was like golf [not basketball]: lower was better. Did you ever study the “Yea-sayer Effect”? [As the name suggests, some folks Just Cain’t Say “No” on questionnaires. That’s why well-designed surveys throw in some “Yes, we have no bananas” type of questions, just to catch out the “yea-sayers.” Not these two tests, though.]
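
[For the curious: here’s roughly how a “Yes, we have no bananas” catch-question works–a toy sketch, with hypothetical respondents and item pairings of my own devising:]

```python
# Toy acquiescence ("yea-sayer") check, with hypothetical items: each
# normally-keyed item (e.g., "I worry a lot") sits next to a reverse-keyed
# twin (e.g., "I am a calm person"). Answering "yes" to both is a contradiction.
import numpy as np

rng = np.random.default_rng(2)
answers = rng.integers(0, 2, size=(5, 8))   # 5 respondents x 8 yes(1)/no(0) items
answers[0] = 1                              # respondent 0 Just Cain't Say "No"

normal, reverse = answers[:, 0::2], answers[:, 1::2]   # adjacent twin columns
yes_rate = answers.mean(axis=1)                        # raw acquiescence index
contradictions = (normal * reverse).mean(axis=1)       # "yes" to both twins
for i, (y, c) in enumerate(zip(yes_rate, contradictions)):
    flag = "  <- probable yea-sayer" if y > 0.9 and c > 0.5 else ""
    print(f"respondent {i}: yes-rate {y:.2f}, contradictions {c:.2f}{flag}")
```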

Okay, so fast-forward 20 years. Of the original cohort, less than half the 30-year-olds [mostly women] chose to contact the researchers, with their self-reported Body Mass Index. Now for the high-concept title of the article: “Childhood emotional problems and self-perceptions predict weight gain in a longitudinal regression model.” And now, for what the data actually show. “The strongest predictors of weight gain were BMI @ age 10 and parental BMI.” “[For women only] External Locus of Control and Low Self Esteem predicted weight gain on a par with Social Class.” “The Rutter B predicted increased BMI [for women].”

So–before we all start wringing our hands like the guests at Aurora’s Christening party, at the “Statistically Proven Fact” that highly-strung 10-year-old girls [or those who Just Cain’t Say No on questionnaires], whose teachers have already pigeon-holed them as Nervous Nellies, are doomed to become overweight 30-year-olds–let’s consider an unexplored bias in the data. As Rosenthal’s [much more robust] results have shown, a teacher’s subjective assessment of each student has a powerful effect–for good or evil–not only on the teacher’s predictions of that kid’s academic and social success, but on the kid’s actual success.

So, here’s my advice to concerned parents of young girls. Listen carefully at those parent-teacher conferences; and if you’re getting the vibe that the teacher has your kid in “negative halo” mode, either change the teacher’s attitude or change which teacher your kid has. I have no doubt that my father’s move-in-October Navy schedule fortuitously rescued me from some toxic negative halo situations [inasmuch as I was an Exceedingly Highly-Strung, ergo annoying, young pupil]. And twice, my parents insisted that I switch teachers, even when we weren’t blowing in or out of town.

Ya gotta be your kid’s Press Agent, and package ’em, like an Oscar nominee. Ya gotta win the Bad Fairies over, and get them to revise their own predictions of your kid’s prospects. Also, it couldn’t hurt to coach your kid to charm it up a little, no? And for those of you waiting for the Up Your Nose nexus here, say it with me: Childhood humiliation [at not being one of the teacher’s faves] leads to anger [often, directed against oneself] and to dumping cortisol, which leads to weight gain…along with other forms of pain & suffering.

But watch out for that 16th birthday, anyway. It’s a risky time for most girls.


Filed under attribution theory, body image, confounds, locus of control, murky research, stress and cortisol

"A nod is as good as a wink to a blind horse."


This old Cockney expression, first cited in 1794, means, “Do I have to spell out the obvious to you? You know what I mean.” [Lately, contracted to the Phatic, “Nar’mean?”] Well, here is my corollary: “A diss is as bad as a threat to a young man.”

Wataru Sato [& colleagues] of Kyoto University have made headlines this week with their research on 24 incarcerated juvenile delinquents, compared to 24 “control” subjects, whose average Verbal IQs were 28.4 points higher than those of their jailed brethren. [The Controls’ mean Verbal IQs were in the High Average range, whereas the JDs’ were in the Low Average range.] As the discussion portion of this breathlessly-hyped-in-the-media article points out, the IQ factor might account for all the difference between the two groups’ performance on the task. Meanwhile, let us consider the task, itself. Each subject was shown a series of photographs of faces “portraying” one of 6 emotions, which they had to identify correctly. [Wait. Remember the dog-bark-translator, also from Japan, which categorized canine utterances into one of 6 emotions? Hmm…] Anyway, the headline was that the 24 JDs kept “misrecognizing” facial expressions of disgust for anger. So, incidentally, did the Control subjects, but 17.2% less often [which the researchers, themselves, acknowledge is “not a large difference”]. And their conclusion? “One of the underpinnings of delinquency might be impaired recognition of emotional facial expressions, with a specific bias toward interpreting disgusted expressions as hostile angry expressions.” On the other hand, as has been empirically demonstrated for centuries, one of the underpinnings of delinquency might be lower verbal IQ. Nar’mean?
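
For anyone who’d like to see that confound in action, here’s a toy simulation [made-up numbers, not Sato’s data]: let recognition accuracy depend only on Verbal IQ, hand the two groups the 28-point IQ gap, and watch a “group effect” appear–then vanish, once IQ enters the model.

```python
# Toy confound demo, with made-up data: recognition accuracy depends ONLY on
# Verbal IQ, and the groups differ in IQ by ~28 points, as in the Kyoto sample.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n = 24
iq_jd  = rng.normal(85, 8, n)     # "Low Average" range
iq_ctl = rng.normal(113, 8, n)    # "High Average" range
accuracy = lambda iq: 0.40 + 0.004 * iq + rng.normal(0, 0.03, n)

df = pd.DataFrame({
    "group":    ["JD"] * n + ["control"] * n,
    "iq":       np.concatenate([iq_jd, iq_ctl]),
    "accuracy": np.concatenate([accuracy(iq_jd), accuracy(iq_ctl)]),
})
print(smf.ols("accuracy ~ group", data=df).fit().pvalues)        # group looks "real"...
print(smf.ols("accuracy ~ group + iq", data=df).fit().pvalues)   # ...until IQ enters
```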

But this is not Sato & Co.’s first foray into studies involving distressingly Photoshopped facial expressions. As reported in NeuroImage in 2004, 5 females and 5 males [mean age 24.4 years] volunteered for a [non-diagnostic] fMRI study, comparing their amygdalar responses to facial expressions described as angry or neutral, sometimes facing head-on, and sometimes slightly averted. Guess what they found. Head-on angry faces aroused an amygdalar response [in both men & women], whereas averted angry faces did not. Nor did neutral faces, no matter which way they were pointed. And these subjects weren’t even delinquents!

Apparently, even in Japan, researchers enjoy circling the lamppost, in order to discover that which is already known. Are you tellin’ me, the culture which introduced the Western World to the notion of seppuku [aka, hara-kiri] as a rational response to “loss of face” [aka, receiving a look of disgust, or a diss] doesn’t see the nexus between disgust and anger? Well, I do, and I’m Irish. Anyone who has ever read urban anthropology [or the newspapers] is aware that most youthful violence is triggered by one party giving the other party such a look [of disrespect], that honor demands a hostile response [usually towards the dissing party, but sometimes, turned inwards towards the dissed, himself, out of unbearable humiliation]. Nar’mean?

Street-savvy youth [and their elders] learn to avoid inadvertently giving such facially expressed offense by taking a leaf out of the Viennese [Dissed] Clever Dog’s book, and averting their gaze. Further, those of us using public transport in the wee hours learn to “keep our eyes in the boat” and/or to monitor our facial subtext for inadvertent expressions of disgust, and to verbally override them, with such remarks as, “Yuck! I think I may have food poisoning! Oh, well. Worse things happen at sea, right?” The only threat such a remark poses to fellow travelers is to their clothing, not to their self-worth. No diss, no hostilities. [Usually, not always.]

I was waiting on line at the clinic pharmacy today, where the TV had some inane talk show on, with a guest who may have been the younger brother of a Backstreet Boy; and the interviewer said to him, “What other people think of you is none of your business.” Well, the studio audience applauded. [And so would I have, except that I was at the clinic pharmacy.] What a wonderfully powerful antidote to the infuriating toxin of humiliation! If someone gives you a look of disgust, it’s none of your business. Avert your gaze and tell the wolf in your brain to pipe down, already. Nar’mean?


Filed under confounds, murky research, power subtext, semiotics

"It would have made a cat laugh…"


“or a dog; I’m bid to crave an audience for a frog!” This first citation of a common British idiom [meaning, “so ridiculous, it would coax a laugh out of an improbable source”] is from The Queen of the Frogs, the last of 176 plays written by James Robinson Planché, in 1879. Besides turning French fairy tales into satirical comedies for the London stage, he was the father of the English costume drama. [Helpful for 19th Century “Kangaroos,” don’t you know.]

Now, back to what makes a rat laugh [according to Jaak Panksepp and his merry pranksters]. Before I tell you what, I’ll tell you how he knows [that a rat is laughing]. He uses the Mini-3 Bat Detector [made by the Ultra Sound Advice company, of London]. Cue the Pied Piper, in historically accurate costume. I’m not making this up. Laughing rats [also cats, dogs, primates, and human children] emit ultrasonic vocalization patterns (USVs) at the frequency of 50 kHz, which Jaak calls “chirping.” [This is in contrast to “long-distress” USVs @ 22 kHz, which express negative emotions, such as fear, “social defeat,” or anticipation of pain & suffering.] So, how do you make a rat laugh? Tickle him [or let him self-administer cocaine]. Seriously. And how do you bum a rat out? Mix cat fur into his cage bedding [or take away his blow]. Whom shall we call first: the Nobel prize committee, or PETA?
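
While you ponder that, here’s a cartoon of the sorting problem [a toy of my own devising–not Panksepp’s actual pipeline, which segments individual calls and inspects spectrograms]: bin a recording as “chirp” or “distress” by its peak frequency.

```python
# Cartoon USV classifier: bin a recording by its peak spectral frequency.
# Entirely a toy -- real ultrasonic-vocalization analysis is far subtler.
import numpy as np

FS = 250_000                                  # sample rate (Hz); Nyquist > 50 kHz

def classify_usv(signal, fs=FS):
    spectrum = np.abs(np.fft.rfft(signal))
    peak_hz = np.fft.rfftfreq(len(signal), d=1 / fs)[spectrum.argmax()]
    if abs(peak_hz - 50_000) < 10_000:
        return f"{peak_hz / 1000:.0f} kHz chirp -- laughing"
    if abs(peak_hz - 22_000) < 5_000:
        return f"{peak_hz / 1000:.0f} kHz call -- distressed"
    return f"{peak_hz / 1000:.0f} kHz -- no idea [ask Panksepp]"

t = np.arange(0, 0.05, 1 / FS)                # 50 ms of synthetic "recording"
print(classify_usv(np.sin(2 * np.pi * 50_000 * t)))   # tickled rat
print(classify_usv(np.sin(2 * np.pi * 22_000 * t)))   # cat hair in the bedding
```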

While you’re pondering that, you should know that these rats have no personal experience of cats as predators; but even one cat hair in their cage freaks ’em out. Panksepp opines that lab researchers who own cats skew rat-study data all the time, due to this overlooked fear factor on their clothing or person.

But we humans have more degrees of freedom than lab rats, many of us. What other stimuli (besides tickling & coke) might make us laugh? The ancient Greek philosophers, such as Plato, thought they had the definitive answer: a feeling of superiority. According to this cynical lot, all human hilarity arises from Schadenfreude: delight at another’s humiliation. Hmm. Maybe for grown-ups. Not so much for human babies and other young mammals [who are suckers for the tickling]. Herodotus [484–425 BC] used historical vignettes to explain how tears of joy can so quickly turn into tears of sorrow. [How the USVs can drop from 50 kHz to 22 kHz, in the blink of an eye.] He tells, for instance, of Xerxes, who is kvelling over his fleet at a regatta at Abydos, then suddenly becomes all verklempt; and when his uncle asks him, “Boychick! Was ist los?” Xerxes says, “In 100 years, all these people will be dead, and no one will know how powerful I am!” Solipsistic, much?

In 1979 psychologists Efran & Spangler posited that all tears [whether of sorrow or joy] occur during the recovery phase of limbic arousal. “All tears are tears of relief.” Miss America cries because she was so afraid she would lose. Mourners cry [according to these guys] because they are so glad that the bells are not (pace John Donne) tolling for them.

Back to our putative laureate, Panksepp. He would assume that all tears [whatever the frequency of our USVs] contain cortisol: that the relief we are experiencing [whether we label ourselves “over-the-moon” or “down-in-the-dumps”] is, whatever else, neuro-chemical.

Personally, I’m saving up for a Mini-3 Bat Detector, to find out what makes a dog [like Lili] laugh. And meanwhile, I suggest we all take careful note of what makes us laugh and/or cry. I just know there are more triggers for mirth than tickling, blow & Schadenfreude. Tell you about some of them next time, yah?


Filed under catharsis, comic relief, confounds, murky research, stress and cortisol

Janus, the Gatekeeper


As the month named for the Roman god of “Shut it!” draws to a close, here’s a meditation on knowing when to say “Enough, already!” An article in the LA Times this week reports the results of an Australian study [through whose methodology one could drive a “ute,” but, oh, well] published in Circulation [as in cardiovascular, not newspaper] asserting, on the basis of subjects’ self-report of their hours spent watching telly in one week [What if it had been this week, and the Australian Open Tennis Tournament was on? Strewth!], that those who watched more than 4 hours per day were “18% more likely to die” than those who watched under 2 hours a day. So, what, if you have no access to telly, you’re going to live forever? Outta sight!
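
A quick back-of-envelope, before we go on: “18% more likely to die” is a relative risk of 1.18, and relative risks flatter headlines. With a baseline annual mortality that I’ve invented for illustration [the article didn’t supply one], the absolute difference looks rather less apocalyptic:

```python
# "18% more likely to die" is a *relative* risk of 1.18. Applied to an
# invented baseline annual mortality of 1%, the absolute bump is modest.
baseline = 0.010                   # hypothetical annual risk, <2 h/day viewers
heavy    = baseline * 1.18         # the headline multiplier applied

print(f"light viewers : {baseline:.3%} per year")
print(f"heavy viewers : {heavy:.3%} per year")
print(f"difference    : {heavy - baseline:.3%} per year (absolute)")
```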

Their point was meant to be that prolonged sitting leads to poor circulatory health. “Switch the bloody thing off and go walkabout!” Sound advice, even if not convincingly proven by their data. I have another theory, having to do with the content of the programmes [it was Oz, after all] watched. In the photo accompanying the news release in the LA Times, a guy was doing a vigorous workout at the gym, while viewing a widescreen telly tuned to a 24-hour news channel. Was this a wry editorial decision, on the part of the newspaper of record for the TV & movie capital of the world, to undercut the message that telly viewing precludes exercise? Pretty cute, if so. Also, it’s grist for the mill for my alternative theory of what’s hazardous to one’s health: 24-hour news channels. All that vicariously traumatizing news, infinitely looped, ineptly analyzed, spun, repeated [you should excuse the expression] ad nauseam: it’s a major producer of cortisol [which the researchers did measure in their 4-hour-plus subjects, and lo, it was sky high].

I’ll wager that 4 hours spent watching comedies, well-made dramas, or sporting events [including horse racing, which produces adrenaline, not cortisol] would be much less toxic than 4 hours of looped news. Wonder if the researchers asked their subjects to list shows by name, or even by genre. Some great data-mining to be had, in them thar hills…

Whenever my clients complain of insomnia, I advise them to reduce their intake, not of caffeine, but of TV news. It is designed to hook you, to instill Fear Of Missing Out in you, to compel you to keep watching. I suggest substituting a cooler medium [in the Marshall McLuhan sense], such as newspapers [or online news sites]. They are less “in your face.” They give you the option to skim, or even [gasp!] skip, cortisol-producing news items. To be the gatekeeper of your vicarious trauma. To say, “Enough, already!” and get back to your own, possibly less distressing and certainly more relevant, life challenges. I’m not saying you should care less about the calamities of your fellow earthlings. I’m saying you should watch less.

It’s not too late for a New Year’s resolution…


Filed under confounds, murky research, stress and cortisol, vicarious trauma

Rx: "Waldspaziergang" (A Walk in the Woods)


Another case of Pseudo-scientific Over-reach, brought to you by the BBC this week: “‘Green’ exercise quickly ‘boosts mental health.’” This was (loosely) based on a paper by Jo Barton & Jules Pretty of the University of Essex [published in Environmental Science & Technology, under the catchy title, “What is the Best Dose of Nature and Green Exercise for Improving Mental Health? A Multi-Study Analysis”]. The authors did a statistical meta-analysis of 10 completely unrelated studies involving people of various ages engaging in various outdoor activities, and answering questionnaires purporting to measure changes in their self-esteem and mood, at the intervals of 5 minutes into the exercise, 10 to 60 minutes, “half a day,” and/or “a whole day.”
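
[For readers wondering what a “statistical meta-analysis” actually involves: at bottom, an inverse-variance weighted average of per-study effect sizes. A minimal sketch, with ten invented numbers–the paper’s actual effects are not reproduced here:]

```python
# Fixed-effect pooling: an inverse-variance weighted average of per-study
# effect sizes. All ten effects and standard errors below are invented.
import numpy as np

effects = np.array([0.46, 0.30, 0.55, 0.12, 0.40, 0.25, 0.60, 0.18, 0.35, 0.50])
ses     = np.array([0.10, 0.15, 0.20, 0.12, 0.18, 0.09, 0.25, 0.14, 0.11, 0.16])

w = 1 / ses**2                                 # precise studies count for more
pooled = (w * effects).sum() / w.sum()
pooled_se = np.sqrt(1 / w.sum())
print(f"pooled effect = {pooled:.2f} +/- {1.96 * pooled_se:.2f} (95% CI)")
```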

The groups studied ranged in age from “youths” to “the elderly.” The activities they engaged in ranged from walking [apparently, not part of all 10 studies] to cycling, horse-riding, fishing, sailing, gardening, and “farming activities.” All the studies took place near Essex in England, at some time over the past 6 years; and the 1252 participants were “self-selecting using an opportunistic sampling method.” [I think that means, these were the ones who completed their questionnaires.]

Before we get to the “data,” let’s ponder how on earth one “completes” 2 questionnaires after 5 minutes of horse-riding. Is it like the Kentucky Derby, where a lady with a wireless microphone rides up beside you and interviews you? Is there a staggered start to the pony trek, so she can interview each participant exactly at their 5-minute mark? Wouldn’t it take longer than 5 minutes per participant, to ask & answer the 20 questions? How about the cyclists? Is it like the Tour de France, with an interviewer in a chase car? These intriguing logistical problems were not addressed in the “Materials and Methods” section of the paper.

Anyway, now for their “Results.” For both self-esteem and mood, the “greatest changes come from 5 minutes of activity, and thus suggest that these psychological measures are immediately increased by green exercise.” They go on to report that “the changes are lower for 10-60 min and half-day, but rise again after a whole day duration.” Looking at the many data charts in the article, unless the same chipper 5-min subjects bum out @ the 10-60 min and half-day point, and then perk up a bit after the whole day, it appears that each participant was assessed at only one point. There’s a clue in the “Discussion” section: “Whole-day activities are likely to be qualitatively different activities, involving in some cases camping overnight and in others significant conservation achievements.”

Hmm, wouldn’t it be useful to know just which Green Activities yielded “The 5-minute Fix”? I’m thinking, unless you’re a professional jockey, not horse-riding. Not fishing, either. Nor, indeed, sailing. I’m thinking, probably walking. So, why not try that first? Take yourself [and any handy companion, 2- or 4-footed] on a little walk among the trees, and just see if it doesn’t “boost [your] mental health.” That’s what the Austrians were doing to lift their spirits, decades before Freud had them lying on his couch: Waldspaziergang in the Vienna Woods. [I hear a waltz…]


Filed under confounds, murky research

What’s your point?


Lately I’ve been asking the “What’s up my nose?” question about an insidiously lovely song by Ed Sheeran [currently #3 on the BBC Radio 1 chart] called, innocuously enough, “The A Team.” As the [you should excuse the expression under the circumstances] “addictively” catchy lyrics clarify repeatedly, it is the “Class A team” to which the heroine/victim in the song belongs [meaning that she is fatally attracted to drugs classified in the UK as Class A, such as crack cocaine]. I badgered my visiting 20-something daughter about 2 aspects of this song. Why, when it seems to glamorize, without irony, lethal drug abuse, is it so popular? [Because it’s beautifully written, played & sung. Very few listeners downloading the song are thinking critically about its message.] And why, when such glamorization is as old as the opera La Bohème [and its current iteration Rent], does it make me so angry? As it happens, I was doing all this heavy “wolf-work” a week before Amy Winehouse’s untimely death.

Before I deconstruct my “issues” with Ed Sheeran, let me draw your attention to an editorial in yesterday’s NYTimes, entitled “Addictive Personality? You Might be a Leader,” by David J. Linden, “Professor of neuroscience @ Johns Hopkins University School of Medicine and the author of The Compass of Pleasure: How Our Brains Make Fatty Foods, Orgasm, Exercise, Marijuana, Generosity, Vodka, Learning, and Gambling Feel So Good.” [2 fun facts about the author & then my critique of his research: before joining the Johns Hopkins faculty, he worked for Big Pharma; and his father is a high-profile “shrink to the stars” in Santa Monica, CA.] The burden of his argument, taken from the animal & human research of others [some of it, decades old], is that “addicts want their pleasures more but like them less.” This he attributes to “blunted dopamine receptor variants” in these individuals.

Point of order. As its title suggests, this is a very informally written Pop Psych book [not a peer-reviewed journal article]. How large was his human sample size? In the NYTimes, he cites mostly anecdotal evidence concerning famous dead guys [such as Baudelaire, Aldous Huxley, Winston Churchill, and Otto von Bismarck]. How do we know that these “I can’t get no-o satisfaction” folks are actually getting less satisfaction from their “cocaine, heroin, nicotine or alcohol” than their peers are? Just guessing, here: he asked them? [Or the researchers who actually carried out the studies did.] And the addicts said [in a variant of the old Irish joke], “This blow is terrible, and there’s not enough of it!”

And don’t even get me started on the old Problem of Other Minds, which posits that we can never truly know another individual’s experience; so how can we possibly know that we liked the drug less than the Man on the Clapham Omnibus [British legal term of art for “the average guy”] did?

Is the circularity of Linden’s argument making you dizzy yet? If you are an addict, there’s something wrong with your dopamine receptors. [Not your fault, you poor victim.] To quote one of my favorite famous dead guys, the comic novelist Evelyn Waugh [who wrote brilliantly about alcoholism in Brideshead Revisited], “your brains is all anyhow.”


Is this supposed to mean that everyone with this genetic variant is doomed to substance addiction? Back in the 70s there was a controversial theory that sought to “explain” [excuse?] alcoholism as the result of a genetic variant that metabolizes ethanol in the [poor victim’s] brain more slowly than in your man on the Clapham omnibus’ brain, storing it as a morphine-like substance. [Thus, alcohol addiction was actually morphine addiction; and we all know how to “cure” that, right?] Studies suggested the prevalence of this gene variant in certain ethnic populations [such as my own, the Irish]. It’s not our fault! We’ve got a disease, innit? What? Like an allergy? Like a peanut allergy? Jeez! Well then, let’s just avoid peanuts. Or, mutatis mutandis, alcohol.

What’s my point? What’s up my nose, about Messrs. Sheeran & Linden? The fear, that by ceding locus of control over what we choose to ingest [by mouth, nose, or vein] to an “accident” of our brain physiology, we are condemned to fulfill the dark prophecy that “anatomy is destiny.” The humiliation, that we have no option but to follow our noses to the irresistible substances that we crave, even though they will [glamorously or sordidly] kill us.

As the Brits would say, “Blow that for a game of soldiers!”


Filed under confounds, gets right up my nose, locus of control, murky research