Category Archives: murky research

"Chotto matte!"


In 2007, the pre-frontal cortex emerged from the cinder-strewn hearth of neuro-anatomical and behavioral research, to the dazzling ballroom of the chattering classes–conveyed thither by a New Yorker cartoon. Parents, to their fashionably disheveled adolescent son: “Young man, go to your room and stay there until your cerebral cortex matures.”

Who knew? Turns out Disney was right. There’s a Jiminy Cricket area of the cerebral cortex that acts like a Hollywood producer–red-lighting or green-lighting various hare-brained scenarios the amygdala pitches to it: “Too risky! Have to run that by legal. Too controversial! We’ll alienate our demographic. Will it be banned in Boston? Condemned in Cleveland? Banished in Baltimore? Better lose some of the obscenity, but keep the gratuitous violence.”

One slight glitch. As the New Yorker cartoon implies, you can’t hurry “Good Taste, Good Judgement, and Self-Control.” [Anyone who recognizes that phrase went to Duke before 1968.] The pre-frontal cortex doesn’t reliably begin to exercise Executive Function until one’s mid-20s. [Results may vary, depending on Nature, Nurture, and Proximal Events.] For the purpose of mounting a legal defense, “Proximal Events” have been known to include temporary insanity, due to jealousy [le crime passionnel], circadian rhythm disruption, and/or Twinkie toxicity.

Incidentally, in the 1980s neuro-anatomical researchers were publishing articles tracing the pathways of the dog’s pre-frontal cortex to various other brain areas, thereby scandalizing the ethological community, who had insisted, a priori, that one had to be at least a primate, dear, to have impulse control. How old were the dogs in the research study, I wonder? Had they matured past that harum-scarum phase, where all rabbits are fair game…well, let’s face it, where all game is fair game? Here’s a fun fact: predator [carnivore] animals have a relatively larger cerebral cortex [and a relatively shorter gut] than prey [herbivore] animals. Not to get too anthropomorphic about it, it takes a lot more Executive Function [not to mention Therbligs] to find, select, and pounce on a sentient, mobile animal, than on a stationary plant [pace Prince Charles…who talks to plants…but hunts foxes…oh, never mind].

My favorite nemesis, Big Pharma, is even now working on a drug to hasten the growth of the pre-frontal cortex. There should be a contest to predict what the adverse side-effects [or unintended consequences] of such a drug will be. Those of us who studied psychology in the 60s recall that in the 1950s the panacea for all manner of “insanity” [ranging from schizophrenia, to mood disorders, to antisocial behavior] was the Pre-frontal Lobotomy, wherein all connections between the pre-frontal cortex and the rest of the brain were surgically severed. Those who studied Filmography, instead, will tell you that Frankenstein was the research scientist, not the monster.

Before that Brave New World comes to pass, the best course of action, for enhancing the self-control of dogs, young people, and those of us with Overactive Amygdala Syndrome, is the daily exercise of what pre-frontal cortex we do have. First of all, make sure it gets enough blood flow [by identifying what is getting Up the Nose of the amygdala].

For what is second of all, readers will have to “Chotto Matte” [Japanese for wait a little while], for the next post.


Filed under ethology, gets right up my nose, murky research

"Yoshi!"


In high school we put on a Sigmund Romberg operetta which included a cynical little ditty about Being Good: “Always do what people say you should. You never can be happy, child, unless you’re good. I did what I was told. I was as good as gold. And I know I shall be happy, cuz I am so good.” If sung sarcastically enough, it always brought the house down.

At the risk of sounding like a curmudgeon, I recommend Romberg to modern fMRI researchers on altruism. The most frequently cited study involves 19 graduate students [in good health], who voluntarily participated in a study in which a radio-active isotope was “introduced” into their bloodstream. They were given a starting “float” of $128 each, which they could opt to keep or to donate [some of] “anonymously” to various charitable causes. The headline finding of the study was that [gasp!] the same part of the brain lit up when a participant gave away money as when they received it. I call “Sampling Error!” I would like to see this result replicated, using 19 [or 190] randomly selected people, from all walks of life, including, oh, for instance, the have-nots. I put it to you that these volunteers were no more “a cross-section of humanity” than the folks who answer the phones for a PBS pledge drive. It takes a certain level of altruism to agree to go radio-active, not to diagnose or treat a serious health problem of one’s own, but simply to “further theoretical knowledge about in-born altruism.” Don’t you think?
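
For the statistically curious, here is a toy Python sketch of that sampling worry. It is purely illustrative [every number in it is invented, not drawn from the study], but it shows how an estimate built on self-selected volunteers can sit well above what a random cross-section of humanity would give you.

```python
import random

random.seed(0)

# Toy population: each person gets an "altruism" score between 0 and 1.
# Assume, purely for illustration, that the reward-center response to
# giving tracks this score. None of these numbers come from the study.
population = [random.random() for _ in range(100_000)]

def mean(xs):
    return sum(xs) / len(xs)

# 19 people chosen at random, from all walks of life.
random_sample = random.sample(population, 19)

# 19 self-selected volunteers: only people altruistic enough to go
# radio-active for theory's sake (say, the top quarter of the scale).
volunteers = [p for p in population if p > 0.75]
volunteer_sample = random.sample(volunteers, 19)

print(f"population mean altruism:    {mean(population):.2f}")
print(f"19 randomly selected people: {mean(random_sample):.2f}")
print(f"19 self-selected volunteers: {mean(volunteer_sample):.2f}")
# The volunteer group's average is inflated, so whatever "giving feels
# like getting" effect shows up in their scans may not generalize to
# the have-nots.
```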

More to the point, these volunteers were all 20-Somethings, already “in the zone” for having an up-and-running Pre-frontal cortex, as well as millions of stored memories associating Being Good with Getting a Reward. Why does Lili shut open doors? Out of save-the-earth’s-resources Green-ness? To make herself feel like a [very specialized] service dog? I believe she does it because I took the time [10 minutes] to lay down neural pathways in her brain between the command “Shimaru,” her shutting the door, the praise word “Yoshi!” (Good job!), and a small ort of dried lamb lung. Now, she need only be intermittently reinforced with the morsel of food [or even just with praise] to keep the behavior in her repertoire. Mostly, we mark and reinforce all kinds of pro-social behavior, simply by telling her “Yoshi!” And, folks, she’s just a dog, not a graduate student.

I’m saying, I think the 19 so-frequently-cited subjects [just listen for it, next pledge week] were all sub-vocalizing their personalized version of Romberg’s “So Good” song, right up there in their cerebral cortex; and the reason their Reward Center lit up when they were Being Good was a conditioned response. Yikes! That makes me sound like a Behaviorist! [Which I’m not. Well, only on the weekends.]

This stuff matters, because the ugly step-sister of fMRI research on Being Good is research on what used to be called psychopaths, suggesting that there are neuro-anatomical [perhaps even genetic] differences in their brains that predispose them to anti-social behavior. This is chillingly reminiscent of eugenics, if you ask me; and much of it is based on the same methodologically flawed research design as the 19 Altruists study.

I believe that experience is at least as important a determinant of behavior as DNA. If it isn’t, why even bother to lay down neural pathways rewarding Good Deeds? Why ever say “Yoshi”?


Filed under murky research, pro bono publico

Drunkard’s Fallacy


Two lads are making their way home, after some jars at the bar: “Seamus, would you give over circlin’ round that lamppost? You’re makin’ me head spin!” “Ahh, but Desmond, I’m tryin’ to find me feckin’ keys.” “Oh, now, Seamus, I t’ink I heard a ‘clink’ when we was coming t’rough de alley, back dere.” “Yeah, me an’ all, Desmond; but de light’s better under dis lamp.”

This shocking stereotype of Irish inebriation and false logic was offered to us in graduate school, in a course on research design, to illustrate the “Drunkard’s Fallacy” [the tendency for researchers to “search where the light is better,” and thereby overlook the “keys in the alley”]. Sir Francis Galton, for instance, believed that intelligence was highly correlated with head circumference [which is easily and cheaply measured]; and he tried to encourage fatheads to marry other fatheads, for the improvement of Mankind. Later, Dr. William Sheldon put forth the theory that one’s body type–fat, muscular, or thin [which is evident, even to the casual observer]–was highly correlated with 3 distinct sets of personality traits. All of which would be highly amusing, except that their “scientific evidence” has been used as the rationale for eugenics–most notoriously, but not exclusively, by The Third Reich.

These days in neuropsychological research, there is often a generous sponsor “paying the light bill,” who then–sometimes blatantly, but other times subtly–sets the agenda for “where to search.” Senator Charles Grassley has done Menschlich work, in my opinion, by doggedly insisting that medical researchers disclose the source of their funding, so that consumers can then take their “findings” with a grain of salt. But what if the funding source is Uncle Sam? Could there still be a tendency to “circle the lamppost,” rather than “go down the dark alley,” in search of scientific “truth”?

In 2005 the journal Nature Neuroscience published the results of an NIMH-funded study, which followed over 200 individuals from birth to 26 years, to assess their risk of becoming “depressed” by Stressful Life Events, and its correlation with the presence or absence of a particular variant of “the Serotonin Transporter Gene (5-HTTLPR)” in each individual’s DNA. Don’t you just know, the researchers found the two factors–“depression” in response to bummer events, and the presence of that specific gene variant–to be highly correlated. Well, the media was all over it like a cheap suit. Cute articles about “Blue Genes” came out like a rash. And the always-only-sleeping-not-dead eugenics lobby began to bang on about genetic screening for 5-HTTLPR, rationalizing that the opposite of bumming out at bad news was Being Resilient; and who wouldn’t want to breed Resilient kids, in these troubled times?

Also–and here’s the Beauty Part, if you’re a government agency, trying to contain costs for Mental Health treatment–if bumming out is “all in your genes,” no need to wander down that dark [time-consuming] alley of “trying to understand what got up your nose, which made you angry, which then made you depressed.” That’s like fiddling while Rome burns. Like trying to understand why your body has become insulin-resistant, or why your arteries are clogged. What you need is a chemical–not an insight. Faster, cheaper, better for Mankind [and the bottom line].

So, here’s the thing. In this week’s issue of the Journal of the American Medical Association: a meta-analysis of all studies which could possibly have replicated the ballyhoo’d NIMH results found no correlation between the two factors. Bupkes, nowt, Nichts. [Incidentally, their n = 14,250, of whom 1,769 were classified as having “depression.”] Do you know what was significantly correlated with “depression” in all the studies these researchers meta-analyzed? Stressful Life Events.
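
If you want to see what “no correlation” looks like in statistical terms, here is a minimal Python sketch of the kind of association test such a meta-analysis boils down to. Only the totals [n = 14,250, with 1,769 classified as “depressed”] are taken from the article; the split between genotype groups below is invented so that the example comes out null.

```python
from scipy.stats import chi2_contingency

# Hypothetical 2x2 table: rows = carries the "risk" variant / does not,
# columns = classified "depressed" / not. The genotype split is invented;
# only the overall n (14,250) and case count (1,769) match the article.
table = [
    [720, 5080],   # risk-variant carriers: depressed / not depressed
    [1049, 7401],  # non-carriers:          depressed / not depressed
]

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi-square = {chi2:.3f}, p = {p:.3f}")
# A p-value nowhere near 0.05 is the statistical shape of
# "bupkes, nowt, Nichts": genotype alone tells you next to nothing
# about who ends up depressed.
```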

Well, Seamus, I guess it’s back to the dark alley, if we ever want to find those feckin’ keys…


Filed under confounds, murky research, pro bono publico

"Why Keep a Dog and Bark, Yourself?"


Another Mancunian aphorism, meaning: “Why do the thing you pay (or feed) someone else to do for you?” We now launch into a world of pain & suffering, the “fourth irritant,” which has received short shrift in this blog, so far. Since I [Lieutenant Commander Kangaroo] am at the helm, we shall first make a short detour…to the Orient.

Have you heard about the “Bow-lingual” Dog Bark Translator? It purports to analyze your dog’s utterances, and assign each of them to one of 6 mood states: “happy, sad, frustrated, on-guard, assertive, or needy.” Since, of course, it comes from Japan, it offers the dog owner (“just for fun,” the distributors insist) Japanese “translations”: a phrase to capture each of the 6 canine moods. Since, by coincidence, we have trained Lili to respond to Japanese commands, wouldn’t it be fun to find out what she is saying back to us, in Japanese? [Not $213’s worth of fun, I feel.] Just within the past month, though, she has begun to make this new, yodeling sound on her way to the hearthrug “penalty box,” for over-the-top “barkitude” at intruder dogs on our property. My own translation of it is the adolescent’s plainsong chant of protest, “Mah-am!” It’s not exactly defiance–more of a minority report on the unfairness of the sanction. [It’s a struggle not to laugh when she does it.]

Now, back to pain & suffering. Three cheeky chappies @ the University of Keele, in the UK, wished to study the “point” of swearing, in response to pain. Their research design was so cute that it merits some attention. Undergraduates were recruited for a (supposed) study of “the degree of stress that various forms of language elicit during tense situations.” [That’s the dullest phrase in the NeuroReport article, I promise.] Each subject was asked to list “5 words you might use after hitting yourself on the thumb with a hammer,” as well as “5 words to describe a table.” Only those who listed at least one obscenity were included in the experiment. They ended up with 38 males and 29 females. [Already, we’re doing sociology & anthropology, no?] Each subject was tested twice, in randomly assigned sequence–once while repeating the first expletive they listed, and once while saying the ordinally corresponding “table” descriptor from their other list. The pain & suffering inflicted was [a maximum of 5 minutes of] their hand submerged in a bucket of ice water. So, guess under which condition subjects endured pain significantly longer–expletive or furniture adjective? Not surprisingly, swearing out loud is “hypoalgesic.” [It lessens pain perception.]
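
For the design nerds, here is a minimal sketch of how such paired, within-subject data would be analyzed. The numbers are invented, not the Keele results; the point is simply that each person serves as their own control.

```python
from scipy.stats import ttest_rel

# Hypothetical within-subject data: seconds of hand-in-ice-water
# endurance per participant, once while repeating their chosen expletive
# and once while repeating their "table" word. Invented numbers.
swearing = [142, 118, 95, 160, 130, 105, 151, 88, 123, 137]
neutral  = [110,  90, 80, 131, 112,  84, 120, 75,  98, 109]

t, p = ttest_rel(swearing, neutral)
print(f"paired t = {t:.2f}, p = {p:.4f}")
# Because every subject appears in both conditions (in randomly assigned
# order), stable individual differences in pain tolerance cancel out of
# the comparison; that is what makes the design so "cute".
```

Randomizing the order of the two conditions also keeps practice and fatigue effects from masquerading as a swearing effect.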

But why? The boffins from Keele are a bit baffled. The best they can do is surmise that it has to do with amygdalar arousal. Therefore, we armchair researchers can feel free to kibitz. What if there had been a third test condition, besides obscenity & Ikea cataloguery? What if the contestants had also been asked to list 5 Emotive (but genteel) outcries [you know, like “Rats!” or “Crumbs!” or “Bother!”]? How do we know that such polite but vehement expressions of dismay are not equally hypoalgesic? [I don’t believe it for a minute, mind you; but it would have made the study’s findings more robust.]

Many dog training books assert that when a canine launches into a prolonged bout of “barkitude,” he is producing endorphins, and therefore rewarding himself with a powerful–some would say addictive–stimulus/response loop that soon has little to do with the original cue for barking. [The UPS truck is long gone, but the stoner dog is still “self-medicating,” man.]

Maybe that’s what a human yelp of obscenity evokes in the brain: a lovely hit of endorphin, which makes the icy water much easier to handle. If so, and if the yelp is prompted by amygdalar arousal in response to pain & suffering, then (shock, horror!) maybe the amygdala is Not All Bad. Maybe, like most things found in nature, it’s a double-edged sword. And so, too, [until someone does the “Crumbs!” variation on the Keele study] is the occasional, appropriate human bark of (insert expletive).


Filed under confounds, murky research, pain reduction, stifled wolf

"Who(m) Do You Trust?"


“Me, or your lying eyes?” [goes the old-but-new-again joke]. In the spirit of the Groucho Marx quiz show, You Bet Your Life, in 1957 Johnny Carson [soon joined by Ed McMahon] hosted a daytime game show where the backstories and chemistry between the contestants [3 sets of they-never-met-before “couples”] trumped correct answers. A quiz category was announced, and the man of the mixed-doubles team was asked, “Who do you trust [to answer a question on this topic–yourself, or this dame you just met]?” Cutting-edge battle-of-the-sexes TV! Chivalry vs. machismo! It was like hearing your parents “debate” who knew the faster way to get to the Sawmill River Parkway in weekend traffic. Riveting stuff for the after-school crowd [the show’s target demographic].

[Cultural digression: My father, whose off-the-boat parents spoke Irish at home, acquired his English through grammar books, and was a stickler for correct–even archaic–usage. When, in this blog, I deviate from Fowler’s Usage into demotic speech, I am using Jakobson’s Poetic speech function, to make a point, innit?] Thus, when my sister and I were recapping that day’s episode for our father, we referred to the show as Whom Do You Trust?

Now, back to pain. And back to one of my favorite hobby horses–“Are you going to trust every so-called research finding, just because it was published in a peer-reviewed journal?” Okay, smokers, this one’s for you. Our old friend Dr. Malinoff has research evidence that “Nicotine stimulates an area of the brain right next to the area that processes pain; smokers’ pain scores routinely exceed the pain scores of non-smokers.”

Now, for you redheads. For decades anesthesiology students were taught to use more pain-killer on their red-haired patients, because their tolerance for pain was experimentally proven to be significantly lower than that of blondes & brunettes. Turns out that the ever-popular bucket of ice water was used to achieve these replicated research findings. Recent studies [using the only other ethically-approved method of inflicting pain for research purposes: electric shocks] have found that red-heads are, indeed, more sensitive to cold, but they tolerate a jolt of voltage significantly better than the other groups. I could go on, but you get my skeptical [not to say cynical] point. “It ain’t necessarily so.”

Maybe the researchers are all just [metaphoric] “drunkards, circling the lampost.” Why not let your own experience be your guide to the “truth” about your very own pain? Well, consider this finding, reported in the APA Monitor [January 07]. Using the “Cutaneous Rabbit Illusion” [where the subject’s arm is rapidly tapped, first near the wrist, then near the elbow, and soon the subject “feels” a phantom tapping sensation between the two spots–quaintly known as “the rabbit hop”], the same area of the subject’s brain lit up on the fMRI, whether the tapping sensation was “real” or only “phantom.” A similar thing happened with more painful stimuli [a rabbit-wearing-golf-shoes, sort of thing], only this time it was the dreaded S1 [primary somatosensory cortex…aka pain center] area of the brain that lit up.

Confused? Ah, then I have achieved my goal. Put your previous beliefs about what causes (and reduces) the sensation of pain “on ice” for a bit [unless you have red hair, in which case…just put them under wraps]. Maybe, some of the old-but-new-again ways of coping with pain have something to offer 21st Century sufferers.

So anyway, is that dark smudge, in the lower right quadrant of the door, the tail of a ravening wolf, or just the head of Napster, the black cat? One would be awful, the other just a little inconvenient. What if you could choose which one to experience? I think you can choose. But who ya gonna trust–me, or your [sometimes lyin’] eyes…arm…S1 pain area? Next stop, the enchanted forest. [What could it hurt?]


Filed under confounds, murky research, pain reduction

Who You Callin’ Field Dependent?


In the 1970s H. Witkin & colleagues took an interesting difference in human cognition (between those who tend to See the Big Picture & those who tend to Notice Details), and ran with it, turning it into an all-out, Kangaroos-vs.-Clydesdales smackdown. By 2002 here’s how The Dictionary of Psychology [ed. Ray Corsini] was talkin’ ’bout Field Dependence: “A tendency to uncritically rely on environmental cues, particularly deceptive ones, in tasks requiring the performance of simple actions or the identification of familiar elements in unfamiliar contexts. Passivity…is associated with field dependence.” And Field Independence? “The general capacity to orient the self correctly despite deceptive environmental cues (e.g. not being distracted by incidental elements in making a decision). Field independence is highly correlated with analytic ability, high achievement motivation, and an active coping style.”

Now let me tell you how physiologically field dependent [or do I mean feeble-minded] I am. You may recall my mentioning how frequently [and inconveniently] car-sick I was as a child. Know what cured me? A 1960 Mercedes Benz 190, which my father bought in the UK and–mercifully–shipped back with us upon our return to the USA, where it served as our one-and-only family car, until its debacle [rear-axle disintegration] in 1978. Aside from looking way cooler than our ’54 Buick or my grandparents’ endless succession of Caddies, it had a Very Stiff Suspension, so that a bump in the road was experienced as one short, sharp jolt [rather than a series of wallowing undulations]. What you saw was what you got. That’s what we F-D folk need, to avoid that nauseous feeling. The classic informal test for F-D involves something not everybody does anymore: sitting in a Northbound train at the station. When the Southbound train on the opposite track pulls out, does it feel as if your stationary train is moving forward? Welcome to my world.

But–talk about leaps of logic–how do we get from that kinesthetic phenomenon to Corsini’s & Witkin’s broad-brush character attributions, such as “requires externally defined goals and reinforcements”…“needs organization provided”…“avoid telling [an F-D] too many facts.” Can you hear my howling wolf cry “humiliation”? Compare that to their descriptions of F-InD folks: “Has self-defined goals & reinforcements”…“can self-structure situations”…“interested in new concepts for their own sake.” I’m going to go out on a limb, here, and deduce [which is what we F-D types do] that Witkin & Co. are/were [I can’t be bothered to check their bios, to find out who’s still with us] cognitive Clydesdales.

Lemme tell you some of the other descriptors they use for those oh-so-kinesthetically-savvy F-InD types, though: “impersonal orientation”…“learns social material only as an intentional task”…“motivated by grades, competition, by [being shown] how the task is valuable to them [not to other people].” Sounds a little…um…solipsistic. No? [Also sounds like the profile of the person Most Likely to Get Hired, in the current economic climate. Hence the Crazy Like a Fox remark, at the end of the previous post.]

So here’s my point. [Same old point, as ever.] There are not just two cognitive types of people; there is a continuum. Not every Analytical thinker [F-InD] is a brilliant scientist with no social skills; and not every Global thinker [F-D] is an intellectually lazy People Person…although I can think of a Prominent Politician who fit that description. All y’all Clydesdales need to climb off your high horse [as it were], and realize that you need us Big Picture Kangaroos, with our non-linear cognitive style, if only for comic relief. We all ought to see the value of both Flakes & Geeks, and to realize that every one of us is a hybrid of both.

Say, what’s that, hanging from a branch in that big old tree in this picture? Or didn’t you notice it?


Filed under crazy like a fox, murky research, sharks and jets

Bad Fairy at the Christening


Backstory to Sleeping Beauty: two Good Fairies offer upbeat predictions for baby Aurora; then a Bad Fairy [name of Maleficent] predicts that on the girl’s 16th birthday, she’ll prick her finger with a spindle and die. A 3rd Good Fairy softens the malediction from “die” to “fall asleep.” Then they put the baby into a witness protection program [changing her name to Briar Rose]. You remember the rest.

So here’s the malediction du jour from BMC Medicine 2009, 7:46: based on a decades-long study of 16,496 kids, all born in the UK, in the same week of April, 1970. When they were 10 years old, several tests & measurements were administered. Less subjectively, their Body Mass Index [as well as that of their parents] was obtained by “a qualified nurse.” The Social Class of their parents was calculated, based on Dad’s line of work [if any]. Their teacher filled out a “modified Rutter B” questionnaire [which assessed each kid for how “worried,” “miserable,” “tearful,” and/or “fussy” they were]. Hands up, if you ever were assigned Robert Rosenthal’s 1968 educational classic, Pygmalion in the Classroom. If so, you already know how this study is going to turn out; but don’t spoil the surprise for the others.

Then these UK 10-year-olds were given 3 read-it-yourself-and-fill-in-the-answers surveys. The so-called Self-Report test had just 2 items: “I worry a lot,” and “I am nervous,” to which the kid could answer “Not at all,” or “Sometimes,” or “Often/usually.” [Let’s cut to the chase on this one, and say that it predicted nowt, bupkes, nada.] Ah, but there followed the 12-item LAWSEQ [“yes,” “no,” “don’t know”] to assess Self Esteem; and the 16-item CAROLOC [“yes,” or “no/don’t know”] to assess External/Internal Locus of Control. The scoring on each test was like golf [not basketball]: lower was better. Did you ever study the “Yea-sayer Effect”? [As the name suggests, some folks Just Cain’t Say “No” on questionnaires. That’s why well-designed surveys throw in some “Yes, we have no bananas” type of questions, just to catch out the “yea-sayers.” Not these two tests, though.]
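
Purely for illustration [the items below are invented, not the actual LAWSEQ or CAROLOC content, and the scoring direction is simplified], here is a minimal Python sketch of how a reverse-keyed, “Yes, we have no bananas” item lets a scorer flag the yea-sayers:

```python
# Invented questionnaire items; not the LAWSEQ or CAROLOC.
ANSWER_SCORES = {"yes": 1, "no": 0, "don't know": 0}

ITEMS = [
    ("I worry a lot",                "normal"),
    ("I am nervous",                 "normal"),
    ("I feel calm most of the time", "reversed"),  # reverse-keyed "banana" item
    ("Nothing ever bothers me",      "reversed"),  # reverse-keyed "banana" item
]

def score(responses):
    """Return (anxiety score, count of 'yes' answers to reverse-keyed items)."""
    total, yea_flags = 0, 0
    for (_item, key), answer in zip(ITEMS, responses):
        raw = ANSWER_SCORES[answer]
        if key == "reversed":
            total += 1 - raw   # flip the scoring direction
            yea_flags += raw   # a "yes" here hints at acquiescence, not anxiety
        else:
            total += raw
    return total, yea_flags

# A kid who Just Cain't Say "No" answers yes to everything:
print(score(["yes", "yes", "yes", "yes"]))  # (2, 2): two red flags raised
# A genuinely worried kid answers the items consistently:
print(score(["yes", "yes", "no", "no"]))    # (4, 0): high score, no flags
```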

Okay, so fast-forward 20 years. Of the original cohort, fewer than half the 30-year-olds [mostly women] chose to contact the researchers with their self-reported Body Mass Index. Now for the high-concept title of the article: “Childhood emotional problems and self-perceptions predict weight gain in a longitudinal regression model.” And now, for what the data actually show. “The strongest predictors of weight gain were BMI @ age 10 and parental BMI.” “[For women only] External Locus of Control and Low Self Esteem predicted weight gain on a par with Social Class.” “The Rutter B predicted increased BMI [for women].”
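
To make “on a par with Social Class” concrete, here is a minimal regression sketch on synthetic data. Every variable name, coefficient, and sample size below is invented to mimic the reported pattern; none of it is the article’s data.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 4000  # invented follow-up sample size

# Invented predictors standing in for the study's measures at age 10.
bmi_age_10   = rng.normal(17.5, 2.5, n)
parental_bmi = rng.normal(25.0, 4.0, n)
social_class = rng.integers(1, 6, n).astype(float)
ext_locus    = rng.normal(0.0, 1.0, n)   # higher = more external
low_esteem   = rng.normal(0.0, 1.0, n)   # higher = lower self-esteem

# Simulate adult BMI so that childhood and parental BMI dominate, and the
# two psychological scores matter only about as much as social class.
adult_bmi = (5.0 + 0.9 * bmi_age_10 + 0.25 * parental_bmi
             + 0.15 * social_class + 0.15 * ext_locus + 0.15 * low_esteem
             + rng.normal(0.0, 2.5, n))

X = sm.add_constant(np.column_stack(
    [bmi_age_10, parental_bmi, social_class, ext_locus, low_esteem]))
names = ["const", "bmi_age_10", "parental_bmi",
         "social_class", "ext_locus", "low_esteem"]
fit = sm.OLS(adult_bmi, X).fit()
print(fit.summary(xname=names))
# The point of such a model is simply to compare each predictor's
# coefficient once the others are held constant; it says nothing about
# why the smaller predictors are correlated with weight gain at all.
```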

So–before we all start wringing our hands like the guests at Aurora’s Christening party, at the “Statistically Proven Fact” that highly-strung 10-year-old girls [or those who Just Cain’t Say No on questionnaires], whose teachers have already pigeon-holed them as Nervous Nellies, are doomed to become overweight 30-year-olds–let’s consider an unexplored bias in the data. As Rosenthal’s [much more robust] results have shown, a teacher’s subjective assessment of each student has a powerful effect–for good or evil–not only on the teacher’s predictions of that kid’s academic and social success, but on the kid’s actual success.

So, here’s my advice to concerned parents of young girls. Listen carefully at those parent-teacher conferences; and if you’re getting the vibe that the teacher has your kid in “negative halo” mode, either change the teacher’s attitude or change which teacher your kid has. I have no doubt that my father’s move-in-October Navy schedule fortuitously rescued me from some toxic negative halo situations [inasmuch as I was an Exceedingly Highly-Strung, ergo annoying, young pupil]. And twice, my parents insisted that I switch teachers, even when we weren’t blowing in or out of town.

Ya gotta be your kid’s Press Agent, and package ’em, like an Oscar nominee. Ya gotta win the Bad Fairies over, and get them to revise their own predictions of your kid’s prospects. Also, it couldn’t hurt to coach your kid to charm it up a little, no? And for those of you waiting for the Up Your Nose nexus here, say it with me: Childhood humiliation [at not being one of the teacher’s faves] leads to anger [often, directed against oneself] and to dumping cortisol, which leads to weight gain…along with other forms of pain & suffering.

But watch out for that 16th birthday, anyway. It’s a risky time for most girls.


Filed under attribution theory, body image, confounds, locus of control, murky research, stress and cortisol

"A nod is as good as a wink to a blind horse."


This old Cockney expression, first cited in 1794, means, “Do I have to spell out the obvious to you? You know what I mean.” [Lately, contracted to the Phatic, “Nar’mean?”] Well, here is my corollary: “A diss is as bad as a threat to a young man.”

Wataru Sato [& colleagues] of Kyoto University have made headlines this week with their research on 24 incarcerated juvenile delinquents, compared to 24 “control” subjects, whose average Verbal IQs were 28.4 points higher than those of their jailed brethren. [The Controls’ mean Verbal IQs were in the High Average range, whereas the JDs’ were in the Low Average range.] As the discussion portion of this breathlessly-hyped-in-the-media article points out, the IQ factor might account for all the difference between the two groups’ performance on the task. Meanwhile, let us consider the task, itself. Each subject was shown a series of photographs of faces “portraying” one of 6 emotions, which they had to identify correctly. [Wait. Remember the dog-bark-translator, also from Japan, which categorized canine utterances into one of 6 emotions? Hmm…] Anyway, the headline was that the 24 JDs kept “misrecognizing” facial expressions of disgust for anger. So, incidentally, did the Control subjects, but 17.2% less often [which the researchers, themselves, acknowledge is “not a large difference”]. And their conclusion? “One of the underpinnings of delinquency might be impaired recognition of emotional facial expressions, with a specific bias toward interpreting disgusted expressions as hostile angry expressions.” On the other hand, as has been empirically demonstrated for centuries, one of the underpinnings of delinquency might be lower verbal IQ. Nar’mean?

But this is not Sato & Co.’s first foray into studies involving distressingly Photo-Shopped facial expressions. As reported in NeuroImage in 2004, 5 females and 5 males [mean age 24.4 years] volunteered for a [non-diagnostic] fMRI study, comparing their amygdalar responses to facial expressions described as angry or neutral, sometimes facing head-on, and sometimes slightly averted. Guess what they found. Head-on angry faces aroused an amygdalar response [in both men & women], whereas averted angry faces did not. Nor did neutral faces, no matter which way they were pointed. And these subjects weren’t even delinquents!

Apparently, even in Japan, researchers enjoy circling the lamppost, in order to discover that which is already known. Are you tellin’ me, the culture which introduced the Western World to the notion of seppuku [aka, hara-kiri] as a rational response to “loss of face” [aka, receiving a look of disgust, or a diss] doesn’t see the nexus between disgust and anger? Well, I do, and I’m Irish. Anyone who has ever read urban anthropology [or the newspapers] is aware that most youthful violence is triggered by one party giving the other party such a look [of disrespect], that honor demands a hostile response [usually towards the dissing party, but sometimes, turned inwards towards the dissed, himself, out of unbearable humiliation]. Nar’mean?

Street savvy youth [and their elders] learn to avoid inadvertently giving such facially expressed offense by taking a leaf out of the Viennese [Dissed] Clever Dog’s book, and averting their gaze. Further, those of us using public transport in the wee hours, learn to “keep our eyes in the boat” and/or to monitor our facial subtext for inadvertent expressions of disgust, and to verbally override them, with such remarks as, “Yuck! I think I may have food poisoning! Oh, well. Worse things happen at sea, right?” The only threat such a remark poses to fellow travelers, is to their clothing, not to their self-worth. No diss, no hostilities. [Usually, not always.]

I was waiting on line at the clinic pharmacy today, where the TV had some inane talk show on, with a guest who may have been the younger brother of a Backstreet Boy; and the interviewer said to him, “What other people think of you is none of your business.” Well, the studio audience applauded. [And so would I have, except that I was at the clinic pharmacy.] What a wonderfully powerful antidote to the infuriating toxin of humiliation! If someone gives you a look of disgust, it’s none of your business. Avert your gaze and tell the wolf in your brain to pipe down, already. Nar’mean?


Filed under confounds, murky research, power subtext, semiotics

"Nana Window"


I just finished reading the cover story in this week’s NYTimes magazine, which I knew would get my amygdala aroused [and it did]. It’s about people whose amygdala gets aroused “too easily.” Oh, yeah? Says who? Jerome Kagan has been doing a longitudinal [Bad Fairy at the Christening] study at Harvard, starting with babies in 1989, whom he identified as either highly reactive [to novel stimuli], somewhere in the middle, or “low-reactive.” I’m going to let anyone interested look up the article; and instead I shall cut right to the chase. “Mary” was one of his “high-reactive” subjects, and he predicted that she would grow up to be a worrier. And, lo, she did. She’s worrying her way through Harvard as I write this. To which I respond, “Oh, come on! If that’s ‘bad outcome,’ whaddaya call ‘good outcome,’ Jerry?”

Many pages into this up-till-then uncritical review of Kagan’s findings, the NYTimes author cites a researcher with a quibble: Dr. Robert Plomin of King’s College, London, wonders if, perhaps, subjecting these kids to the daunting fMRI, itself, might not account for much of their amygdalar arousal. Nar’mean?

Towards the end of the article, other dissenting voices are quoted, wondering why all of the “high-reactives” haven’t developed clinically significant anxiety [as predicted by Dr. Kagan]. Turns out some of the subjects are schmizing themselves into interpreting their racing pulses and dilated pupils as “being jazzed,” which they describe as “vaguely exhilarating or exciting.” Others [T.S. Eliot is mentioned] somehow manage to channel their amygdalar arousal into creating works of art [for the amusement & edification of the more laid-back among you, apparently]. Yet, the Bad Fairy gets the final word: “In the longitudinal studies of anxiety, all you can say with confidence is that the high-reactive infants will not grow up to be exuberant, outgoing, bubbly or bold.”

If that weren’t such an obvious load of old cobblers, I [the Exemplar of “High-Reactive” infants] would find it humiliating. Anyway, for those of you who would like a low-tech coping strategy to deal with anxiety, go to YouTube and look up “Nana Window.” On 23 April 2009 [St. George’s Day in England], the usual gang on the Chris Moyles [BBC Radio 1] show were joking around with Carrie, who had said, “My Nan always puts one in her window on St. George’s day.” [Her grandmother displays the Cross of St. George flag, which is England’s (red-cross-on-a-white-field) part of the United Kingdom’s “Union Jack.”] Chris & Comedy Dave chose to find a double-entendre in her innocent remark, and immediately improvised a Reggae song with the following lyric: “Nana Nana window. Nana window.” If you can’t find it on YouTube and still want to sing it, it’s all on one note, except for the “dow,” which is a 5th higher. Commence singing at the first sign of anxiety and repeat until you feel better.

In scientific point of fact, singing almost any song will reduce most anxiety symptoms, for the following reasons. Singing regulates breathing [thereby countering hyperventilation]. The sillier the lyric, the more likely you are to laugh [thereby relieving muscle tension]. The louder you sing, the more adrenaline you expend [thereby restoring homeostasis to your body]. Cognitively, you are likely to distract yourself from the alarming stimulus for long enough to get some perspective on it. [Is the irritant really awful or just…you know the mantra by now.]

The lyric “Nana Window” is the latest in the long and worthy tradition of non-lexical vocables [such as “Hey nonny nonny” from Shakespeare’s Much Ado About Nothing, and more recently, “Ob-la-di-ob-la-da” from the Beatles’ White Album], which multitask by fulfilling [at least] two Speech Functions. They are Phatic [imparting no factual information, just keeping the listener listening] and/or Poetic [since they may, indeed, be a secret code for something else, such as “Carrie’s Nan is displaying something in her window”]; and they often are also Emotive [expressing a particular feeling]. [“Hey nonny nonny,” according to Shakespeare scholars, expresses dismay.]

Here is Lili, displaying herself in the window, while keeping [hypervigilant?] watch for intrusions. The other day, I was upstairs brushing my teeth, when I heard [evidence of] her aroused amygdala: barking. I planned to go down and assert my Pack Leader status over her, by telling her to “Yaka mashie. Asoko.” [“Be quiet. Go down to your room in the basement until you can compose yourself.”] But before I could even rinse my mouth out, there was silence. I discovered that Lili had piped down and taken herself downstairs, all on her own. Now, that’s what I’m talkin’ about! So, okay, our amygdala gets aroused easily; but we humans, too, can learn to tell it to “Yaka mashie. Asoko,” [perhaps by singing the “Nana Window” song], and thus stand ourselves down from our many alarums.


Filed under comic relief, limbic system, murky research, stress and cortisol

Turn On the Waterworks


What’s the good of crying? [That’s not a rhetorical question.] Sir Henry Maudsley (1859-1944), a neurologist and psychiatrist who took care of shell-shocked Australian soldiers during World War I, knew the answer: “Sorrows which find no vent in tears may soon make other organs weep.”

Ancient Greek dramatists knew it, too, staging tragedies so shockingly blood-thirsty [remind you of a current genre?] that audiences were guaranteed to have a good, cathartic cry. Today on the Beeb [BBC radio 1, that is], as True Blood makes its UK debut, a group of media mavens were asking each other, “What is this current fascination with vampires and such?” One pundit opined that “in times of economic distress, people need an outlet for their own misery and fear, so they give themselves a socially acceptable reason to weep and wail.”

Cue the Possibly-Mad-Scientists. My personal fave is Jaak (not-a-typo) Panksepp [originally from Estonia], who coined the term “affective [pertaining to the emotions] neuroscience.” He studies the vocalizations of animals, such as rats, and has found that they wail with distress and laugh with delight. [Today’s post is no laughing matter. Later for that.] So, guess what familiar substance is found, in significantly elevated levels, in the saliva of wailing rats (inasmuch as they do not shed tears of sorrow)? CORTISOL. It’s also found in the tears and saliva of crying humans, folks. Talk about catharsis!

So, when Lili makes that keening noise as she is sent [or, these days, sends herself] to the basement, for the misdemeanor of barking at the UPS guy, an analysis of her saliva would likely show a whole lot o’ cortisol, which she cleverly lets “Duck” [her comfort stuffed animal] absorb, as she holds him in her mouth. In a few moments, she regains her composure and is back upstairs, happy as Larry [an Australian idiom, meaning “very happy”]. Very few of Maudsley’s wartime patients were Happy as Larry, one gathers.

How lucky for Lili (and Jaak’s rats, and human children), that society permits them this low-tech method of ridding the body of toxic cortisol. How unfortunate, that when grown-ups (especially men, or women in non-traditional jobs, such as the military) weep, they are humiliated with labels such as “weak,” “manipulative,” or “suffering from a Mood Disorder.” Recent research purporting to demonstrate that weeping only makes men more distressed [especially studies using my least favorite research tool, the fMRI] has been critiqued as culturally biased. The subject’s (radio-active) brain is registering the anticipated, negative social consequences of crying, not a “hard-wired” neuro-chemical consequence. The brain of a male actor anticipating an Oscar nomination for his convincing on-screen crying [I hypothesize] would look very different in such a study from that of his brother, the Marine Corps Drill Sergeant.

Which reminds me of a harrowing but invaluable class in our acting school, in which male & female students alike had to produce real tears on cue, for a grade. In keeping with the school’s Method Acting approach, no artificial means of lacrimation [such as onion juice on one’s fingertips, or a tack in one’s shoe] were permitted. The actor must Prepare: conjure up a powerful, tear-jerking memory, and use it as the spigot, to Turn On the Waterworks. Just imagine the endorphin hit which follows the [male or female] acting student’s right-on-cue crying jag. Talk about tears of joy!

Which we will, in the next post.


Filed under catharsis, murky research, semiotics, stress and cortisol