Tuesday, January 5, 2016

Sweet Sweat: Part I - Liquid Gold

Wife:  My eyebrows are sweating.
Me:  What??
Wife:  It's so hot my eyebrows are sweating. They always do.
Me:  What?

Male readers who have been married for a while will recognize this is a time to shut up.  The correct response, if any, is to acknowledge the discomfort and then quickly move on.  However, I made not only one mistake but two.  First, I denied that it was particularly hot.  Then I questioned whether eyebrows were a normal place to sweat.

My penance was to do some research into the topic of perspiration and possible differences among people in where and when they sweat.  I found that there is a wealth of information about this topic, and to someone as warped as me it turned out to be very interesting. There was so much material that rather than share it in one long blog I've decided to torture you with two shorter installments.  Here goes.  Oh, and be patient -- we'll eventually get to that eyebrow thing....

"Horses Sweat, Men Perspire, 
                  But Ladies Merely Glow......"

This Victorian-era euphemism captured early-1900s attitudes toward perspiration -- a gross aspect of animal nature, found in restrained and diminished form in male humans and quite incompatible with the ethereal sensibilities of Victorian gentlewomen. This idea now seems rather quaint, given the more accurate perceptions available to us in coed fitness centers and athletic venues. Men and women sweat, not just horses.

Although we may have a somewhat more realistic view of perspiration these days, we are still quite ambivalent about it, as indicated by the lucrative $3 billion a year deodorant and antiperspirant industry in the U.S. (Euromonitor Marketing Research Report, 2014).  We know we sweat but we pay a lot of money not to do so, and we pay to make sure we don't stink even if we do.  The fact that perspiration and odor are big business should make us suspicious that to at least some degree our attitudes might be the result of Madison Avenue manipulation.  A recent article by Sarah Everets published in the venerable Smithsonian Magazine offers considerable evidence that this is correct:
"In the 1910s deodorants and antiperspirants were relatively new inventions. The first deodorant, which kills odor-producing bacteria, was called Mum and had been trademarked in 1888, while the first antiperspirant, which thwarts both sweat-production and bacterial growth, was called Everdry and launched in 1903.  But many people—if they had even heard of the anti-sweat toiletries—thought they were unnecessary, unhealthy or both." (Everets, 2012).
What do you do if you have a product that people don't perceive a need for, and even regard negatively?  The answer is that you create a market for the product by convincing people they really do need it, and that it is perfectly safe.  The first advertising campaign for antiperspirants began in 1912, designed by James Young, a copywriter for a New York advertising agency and former traveling Bible salesman.  Directed at women, the campaign promoted a product named Odorono, stressing its healthfulness and also suggesting that perspiration was a problem needing to be solved:
"Young’s early Odorono advertisements focused on trying to combat a commonly held belief that blocking perspiration was unhealthy. The copy pointed out that Odorono (occasionally written Odo-ro-no) had been developed by a doctor and it presented “excessive perspiration” as an embarrassing medical ailment in need of a remedy." (Everets, 2012)
The campaign worked -- sort of.  The sales of Odorono jumped initially but flattened out after a few years.  It seems that while the campaign led many women to be familiar with the product, two-thirds still didn't think there was a need for it.  Young switched to what has become a time-honored way for advertisers to manipulate perceived need -- focus on a fear of social embarrassment that the product can take away.  Here's a sample of Young's 1919 sales pitch for Odorono in Ladies Home Journal: "A woman’s arm! Poets have sung of it, great artists have painted its beauty. It should be the daintiest, sweetest thing in the world. And yet, unfortunately, it isn't always." The advertisement went on to explain that women may be stinky and offensive, and they might not even know it.  "The take-home message was clear: If you want to keep a man, you’d better not smell" (Everets, 2012).   Although the ad was considered offensive by many readers because it dealt with a socially taboo topic, Odorono sales jumped 112 percent by the next year.

Other companies copied the Odorono marketing approach and over the years the ads became much bolder.  A particularly blunt example is a 1937 advertisement for Mum (now Ban):
"You’re a pretty girl, Mary, and you’re smart about most things but you’re just a bit stupid about yourself. You love a good time—but you seldom have one. Evening after evening you sit at home alone. You’ve met several grand men who seemed interested at first. They took you out once—and that was that. There are so many pretty Marys in the world who never seem to sense the real reason for their aloneness. In this smart modern age, it’s against the code for a girl (or a man either) to carry the repellent odor of underarm perspiration on clothing and person. It’s a fault which never fails to carry its own punishment—unpopularity."  (Everets, 2012)
Campaigns to convince men that they needed these products began in 1935, with the introduction of the first deodorant for men, called Top-Flite.  These ads, too, focused on insecurities -- in this case, those of men trying to obtain and keep Depression-era jobs.  But why was there a 20-year delay in developing and pitching these products to men? Could it be that men don't sweat as much as women or that they stink less?  Doubtful. The more likely reason is that advertisers viewed women as more likely to adopt these products because our society had primed them to respond to a fear-based pitch that emphasized the possibility of social rejection.  The insecurities of the Great Depression changed men's attitudes and made them more susceptible to a fear-based appeal for a product that promised to make them more successful in white-collar jobs -- thus opening a huge new market for deodorant products (Everets, 2012). Ads stressed how a lack of personal grooming -- being unknowingly stinky at the office -- could ruin a career and threaten a man's role as successful family provider, as well as his general "macho" attractiveness to women.

Of course, the advertisers first had to go to great lengths to disassociate the male version of the product from the female version, even though the active ingredients and their strengths were exactly the same. Thus the name "Top-Flite," a clear reference to the game of golf, which at that time was seen as a "man's" game. Other strategies included using containers in the shape of whiskey jugs and blocky black bottles and incorporating scents like "leather," "pine," and "old spice." 

So, are the advertisers right -- are we humans naturally drippy, stinky creatures?  Is perspiration the nemesis of advanced civilization?  Do men sweat more than women? Do eyebrows really sweat?  Answers to these and other questions will be in Part 2:  "Don't Ever Let Them See You Sweat."

Saturday, December 5, 2015

My New Chip & Pin Card Works! (Well, Sort Of...)

I've blogged previously about the difficulty my wife and I had last year while traveling in Europe with our "Swipe & Sign" credit card (see American Travelers Abroad: The Chips Are Down). Briefly, the problem is that U.S. credit card technology is way behind most of the rest of the world, where the standard is the much more secure "Chip & PIN."  Transaction information is encrypted via the chip embedded in the card, and then, rather than a signature that anyone can fake, a personal identification number (PIN) is required to complete the transaction.  An American traveling abroad can still charge things because most card readers there do have a swipe slot and will generate a paper slip to be signed.  However, this assumes the transaction involves face-to-face interaction.  Many point-of-sale transactions in Europe are at unattended machines that (a) only accept chip cards and (b) require a PIN. These include toll booths, gas stations, parking garages, and ticket machines for public transportation -- in other words, many of the venues tourists are likely to encounter.

After our difficult experiences last year, my wife and I decided to see if we could get a Chip & PIN card for future travel.  I quickly found that several companies offered chip cards, but they were not true Chip & PIN cards because they still required a signature.  Indeed, this first type is what American credit card companies are now distributing in the wake of several high-profile data hacks, such as the Home Depot and Target debacles.  These cards, if used with a chip reader, are definitely more secure than the swipe cards they replace because they are harder to counterfeit and the transaction is more securely encrypted.  However, they may do you no good at all in the unattended purchase situations you are likely to encounter while traveling abroad.  Here is the description included with the new Chase chip card I recently received:
You may be asked for a PIN, rather than a signature, when using chip card readers abroad.  If this happens, you may be able to cancel the PIN prompt and complete the transaction.  Just in case, it's always a good idea to carry local currency for payments at unattended kiosks that may require a PIN.
News Flash, Chase:  Many of those unattended kiosks won't allow payment with cash!

The card my wife and I settled on was the Barclay Arrival + which was touted as having true PIN capabilities and no foreign transaction fees. We received the card and set up a PIN, but of course we had no opportunity to test it here in the U.S.  Our first complete test abroad came a short while ago on a trip to Scandinavia, the Baltic States, and Saint Petersburg, Russia.  Here's my report.

When the chip was inserted into card readers it worked flawlessly everywhere.  Not once did a merchant have to swipe the card.  So far so good.  However, I was disappointed to find that in all face-to-face transactions I was required to sign the charge slip, rather than enter my PIN.

The real test came when we encountered unattended machines.  This occurred when we landed in Stockholm and needed to buy tickets for the transfer bus from the airport to the downtown area.  In the arrivals area we found a collection of unattended machines selling train and bus tickets.  We stuck in our Barclay card and it asked for our PIN. I entered the number and ... voila! IT WORKED! Wow, we Americans had finally entered the 21st century in terms of banking technology!

Later we needed to buy metro tickets, which in Stockholm are available for sale in certain stores and from unattended machines located near the metro turnstiles.  I held my breath the first time we stuck in the card.  Bingo! -- it asked for our PIN and accepted it!  This happened several more times during the course of our stay.  In short, every time we encountered an unattended machine on this trip the PIN functioned perfectly.  What would have happened if we tried to use a Swipe & Sign or a Chip & Sign card?  Don't know, don't care.  I do know that last year in Europe we were unable to complete these transactions with our old credit card and it was a major pain in the butt.

Apparently the Barclay card's default is to require a signature, but if a PIN is absolutely required it will accept one.  This isn't as good as I had hoped, but it is certainly an improvement and probably the best we can do at the moment.  To the best of my knowledge, there is no true Chip & PIN card -- one whose default is the PIN -- available to Americans at this time (see Note 1 below).  If you know of one, please forward the information to me.

In preparing this blog I did some research on chip cards and immediately found that there is still a lot of confusion about them, especially the difference between Chip & Sign cards and Chip & PIN cards (with signature also, like my Barclay card).  I even found one forum in which a person with a Barclay Arrival + card claimed the PIN wasn't accepted abroad -- clearly false given my experience of a few weeks ago, as well as the reports of other travelers.  At any rate, here is the best and latest assessment of true Chip & PIN cards available to Americans I could find, dated August 1, 2015:  MileCards.Com, "11 Chip & PIN credit cards with no foreign fees."

Happy travels!!
_______________________________
Related Blogs & Notes
American Travelers Abroad: The Chips Are Down
One Way That Chip Credit Cards Aren't More Secure
Note 1 -- I've come across a few unofficial reports of Chip & PIN cards from some credit unions that will ask for a PIN when read by the new readers now being distributed in the U.S.
Note 2 -- I've also come across an unofficial report that foreign chip readers are being modified to accept Chip & Sign cards from the U.S.  This seems doubtful to me -- or at least a bad idea if true -- because it lowers the security of unattended transactions.

Friday, December 4, 2015

Bah Humbug! (Redux)

A few years ago I wrote a blog about my mixed feelings concerning the Christmas season (Bah, Humbug! (Sort Of), 12/12).  My attitudes haven't changed much, but in honor of the reflective spirit of the Holidays I want to expand a bit on this topic and offer some additional observations.

It's become clear to me that a major trigger for the beginning of my Christmas malaise is the spectacle of Black Friday.  This occurs the day after Thanksgiving, a holiday which seems to bring out the best in people, including many sincere displays of generosity and charity.  The very next day, however, there is a tidal wave of selfish acquisitiveness in which the motto seems to be "Push, Shove, Grab, Buy" as people fight for everything from TVs to toys.  These are most certainly not all intended as gifts; rather, they are often desirable material possessions priced so low that the result is the retail equivalent of a feeding frenzy.  If there really is a "war on Christmas," as some have argued, I suggest that it isn't liberal philosophy but rather over-amped commercialism that is the major source.  At any rate, this day marks the beginning of my Christmas season emotional doldrums.

I have been ambivalent about the holiday season for quite some time, and I think the seeds were planted in childhood.

As a kid I can remember being so excited that I was unable to sleep on Christmas Eve.  Everything was so special -- the decorations at our house and around the city, the presents under the tree, the Christmas music on the radio and in the shopping malls, the heartwarming holiday specials on television, the dozens of Christmas cards we sent and received.  Although my family wasn't devoutly religious, we usually attended midnight mass on Christmas Eve at our local Episcopalian church.  Christmas day was a hectic family affair that started with opening presents, followed by dinner in mid-afternoon with in-laws and relatives, more exchanging and opening of gifts, then socializing until 8 or 9 o'clock.  All in all this was a very intense and long day.

The next day was a big letdown.  I can remember getting together with neighborhood friends to compare gifts and to play with each other's stuff.  But the big thrill was over and it seemed anticlimactic.  Amazing what a difference 24 hours can make -- from heartfelt joy, eager anticipation, and warm fuzzy emotions to a kind of emptiness, deflation, and despondency.  And those presents I had wanted so badly almost never lived up to my expectations.

As an adult I have to fight a tendency to become a bit depressed during the holiday season.  It's not that I'm a Scrooge at heart -- I really would like to feel the holiday spirit and experience those warm fuzzies again.  But it is hard to do when retailers start their holiday push even before Halloween, Christmas carols are used to sell merchandise rather than express holiday sentiments, and buying gifts is evaluated in terms of contribution to GNP rather than as a gesture of caring.  It seems commercialized, shallow and insincere.

And of course it is hard to reconcile the messages of goodwill and peace with pervasive international conflict, with the exploitation, denigration and ruthless subjugation of large segments of the global population, and with politicians and even some religious leaders calling for policies that are at odds with compassion and loving kindness. If we could act like it was Christmas Eve throughout the year, these problems might disappear. But I fear we are more likely to act like it was the day after Christmas -- or even worse, Black Friday.

To end on a more positive note, and to illustrate my ambivalence, not just negativity toward the holidays, I'll offer this thought:  maybe capturing the spirit of the season shouldn't be easy.  Maybe the challenge of overcoming the obstacles, of seeing past the commercialism, conflict, and shallowness can lead to a more significant personal and social experience.  I think it's worth a try.  Maybe now more than ever.

Monday, November 9, 2015

How Breathing Fresh Air Can Be Electrifying

Sometimes very simple inventions can have life-changing positive impacts.  I've written about one of these before as part of my "Ray of Sunshine" series:  Using old plastic containers to illuminate the homes of the estimated 1.3 billion people in the world who cannot afford electric lights (Some Christmas Cheer: Liters of Light).  Despite the difficulty in finding other Rays of Sunshine amidst the 99.99% negative news these days, I recently came across one that is noteworthy because it is an example of the convergence of simple technology with a business model whose mission is to be financially successful while simultaneously improving the lives of millions of poor people around the world.

The next time you choose to fire up your barbecue to cook those juicy steaks, consider that the World Health Organization estimates that around 3 billion people worldwide are forced by poverty to cook and heat their homes using open fires and simple stoves burning wood, animal dung, crop waste and coal. Besides the environmental degradation that results from this, the health consequences are staggering:
  • Over 4 million people die prematurely from illness attributable to the household air pollution from cooking with solid fuels.
  • More than 50% of premature deaths among children under 5 are due to pneumonia caused by particulate matter (soot) inhaled from household air pollution.
  • 3.8 million premature deaths annually from noncommunicable diseases including stroke, ischaemic heart disease, chronic obstructive pulmonary disease (COPD) and lung cancer are attributed to exposure to household air pollution.  (WHO Factsheet)
Entrepreneurs Alec Drummond and Jonathan Cedar didn't start out to tackle this health problem. They were simply avid hikers who didn't want to carry fuel with them for their campstove and wound up inventing something called the Biolite Campstove.  This little beauty is a techie's dream and just the thing for those who like to hike and camp but don't want to give up all the comforts of home. Ignite some twigs, pine cones, or small branches in the stove's fire chamber and the heat activates a thermoelectric generator that powers a small interior fan, making the fuel burn more cleanly and efficiently than a traditional open stove.  As a result, the Biolite stove produces 90 percent less carbon monoxide and 94 percent less smoke than an open fire, and uses less fuel to produce the same amount of heat.

Biolite Campstove
But that's not all. The excess electricity produced by the thermoelectric generator is sent to a USB charging port that can recharge cellphones, cameras, LED lights, or any other device with USB recharging capability.  In other words, you can cook, stay warm, light your campsite, and charge your cellphone all at the same time with just a few twigs of firewood.  The appeal of being environmentally green and also comfy has turned out to be very strong. When Drummond and Cedar first began marketing their stove in 2012 it was an immediate success with the recreational camping market in the U.S.  It retails for about $130, and the rechargeable lights the company offers are about $100 more -- well within reason given the cost of other camping gear and the willingness of Americans to spend big bucks on this kind of equipment (a total of $1.5 billion per year, according to Statista.Com).

Early in the development of their product, Drummond and Cedar became aware of the worldwide health problem posed by open fires in developing countries and saw the potential of their stove for helping solve it.  And they also saw that bringing free electricity to those who need it most could potentially improve the quality of people's lives beyond the health benefits. The problem, of course, is that $250 is far beyond the reach of the people who could benefit the most from Biolite's stove and light system -- if they had that kind of money to spare they wouldn't be cooking over open fires in the dark. It also is beyond the financial ability of most charities to distribute large numbers of units that cost that much.

The solution that Drummond and Cedar came up with was to design a simpler and sturdier version -- the Biolite Homestove -- and to finance its distribution in rural developing countries like India, Ghana, and Uganda by lowering the cost in a unique way:

"They quickly dismissed relying on a charity, because there was not one large enough to fund stoves for 3 billion people. Instead, Cedar and Drummond decided to pair the two markets they were interested in: the recreational market in the developed world and the rural, third-world market.

The camping products subsidize the cost of operation -- and lower prices in the developing world. BioLite's stove for campers retails for $130 in the United States. A sturdier, more durable and larger version for cooking daily sells for the equivalent of $50 in India and Africa. Cedar calls the business model "parallel innovation."

Most of their revenue comes from selling the camp stoves and other products for recreational use in the U.S. and other Western countries. A smaller share of revenues is from selling camp stoves in the developing world; an even smaller slice of the company's revenue pie comes from charitable grants." (8/26/15, Naveena Sadasivam, insideclimatenews.org )
Although $50 may still seem like a lot for many people in developing countries, the cost can be spread out through charitable loan programs and lowered through reductions in duty and other taxes on imports of clean energy products. Other creative approaches are being championed by organizations such as the Global Alliance for Clean Cookstoves.

It is still too early to document the large-scale health impacts of the Biolite but studies are in fact being conducted.  In the meantime there is inspiring anecdotal evidence of what the HomeStove has meant to the financial status and self-development opportunities of individual people. One example is the case of Erinah, an enterprising woman living in a small town in Africa. Erinah makes her living by working at the local hospital and in her spare time running a small canteen in her village. Through a microloan program she was able to buy four HomeStoves -- one for her business, one for her mother, one for her aunt, and one for her grandfather. Because the HomeStoves use less fuel than open fires she was able to save money that she would have otherwise spent on charcoal or firewood and she paid the loans back in a year.  Her business is more prosperous and the lives of her relatives are easier and healthier (without relying on charity) thanks to an innovative, simple product.

Make no mistake:  Drummond and Cedar are no doubt enjoying the financial rewards of their invention and their marketing strategy, and they are working to make their business even more profitable.  But this is not a case of profiting by exploiting others. The Campstove makes a healthy recreational activity more enjoyable and more environmentally friendly, and the HomeStove greatly improves the healthfulness of people's home environments, reduces environmental degradation, and provides people greater financial security and opportunities for self-development in some of the poorest regions of the world.  It seems to me this is a case of entrepreneurism quite worthy of being a "Ray of Sunshine."

_________________
Sources and Resources:
World Health Organization FAQ on Household Air Pollution
Global Alliance for Clean Cookstoves
Biolite Campstove and Biolite Homestove Descriptions
Biolite Mission Statement
Brooklyn Startup Tackles Global Health with a Cleaner Stove | InsideClimate News
How Electricity-Generating Cook Stoves Increase Profit and Decrease Suffering | | Observer.Com
How BioLite Is Making The World A Better Place With Thermoelectricity - Earth911.Com 

Monday, October 12, 2015

Whose Mind Is This, Anyway?

"....we are not unitary individuals but superorganisms, built out of both human and nonhuman elements; it is their interaction that determines who we are."  (Kramer & Bressan, 2015)

One of the illusions most of us hold with great conviction is that we are separate and distinct from the rest of the world.  "I am here. You (and everything else) are there."  "I am this.  I am not that." This belief in separateness and in the essence of our self identity seems so clearly true that we tend not to question it.  However, once we begin to examine closely what is meant by "here" and "there," "I" and "you," "this" and "that," things start to get a bit fuzzy.

My somewhat deranged fascination with (a) microbes (aka "germs") and (b) excrement (aka "poop") has led me to discover that there is a lot of scientific evidence supporting the idea that our belief in separateness is simply not correct. I've written about some of this before (see Fabulous Synthetic Poop!, Microbes for Breakfast!, and How About A Fecal Transplant?).  Research has shown that each of us is host to more than 100 trillion microbes that live in, on, and around us. Some microbes have been with us since before birth, influencing our development in the womb, and others joined us as we traveled through the birth canal and when our mothers nursed us. The interdependency between their lives and ours is so complete that for the first year of life our immune systems are switched to low so that more microbes can colonize our bodies -- it seems that the evolutionary advantage of having beneficial strains of these critters become part of us is so strong that it outweighs the risk of early childhood infection from "bad" microbes or other pathogens. In fact, we are dependent on them to the point that we could not survive for long without them. They are essential in digesting food, mounting successful immune defenses against diseases, and synthesizing certain vitamins.

In adulthood there are 10 times more microbes in us than there are human cells. Together they are our "microbiome," a community of creatures that interact with each other and with us in complex ways throughout our lives.  The relative number of different strains of bacteria in our microbiomes is unique to each person -- a kind of microbial fingerprint. In fact, some recent preliminary research has shown that we leave microbial traces in our environments that are as identifiable as fingerprints even without touching anything (Meadow et al., 2015).  It seems each of us has a "cloud" of bacteria that surrounds us and which leaves our unique signature wherever we go. If we could make the cloud visible (a "microbiomic aura") it would be very difficult to discern where our microbes end and "we" begin.
 
"I" and "Me" most definitely do not refer to a single, unitary organism that exists separately from all other organisms.  As I suggested with the opening quote above, it is more appropriate to regard ourselves as "superorganisms" -- beings composed of many organisms whose lives are intimately and inextricably intertwined. This integration goes beyond just physical interactions, however.  There is now considerable evidence that even our thoughts, feelings, and behaviors are influenced by nonhuman elements within us -- our minds, in other words, may also be those of superorganisms.

The current state of our knowledge about humans as superorganisms was recently presented by Peter Kramer and Paola Bressan of the University of Padua in an excellent article published in the journal Perspectives on Psychological Science (Kramer & Bressan, 2015). Kramer and Bressan review the data on the microbiome and also research that has investigated other foreign components of our makeup, including viral DNA, cells from other human beings, and microbes that reside in the brain. Their conclusion is quite different from our usual self-view:
...our emotions, cognition, behavior, and mental health are influenced by a large number of entities that reside in our bodies while pursuing their own interests, which need not coincide with ours. Such selfish entities include microbes, viruses, foreign human cells, and imprinted genes regulated by viruslike elements ... we are not unitary individuals in control of ourselves but rather ... collections of human and nonhuman elements that are to varying degrees integrated and, in an incessant struggle, jointly define who we are.
I'll focus on just two examples to illustrate the psychological influences of our nonhuman residents: gut microbes and brain microbes.  If you want to explore other sources of influence, see Kramer & Bressan's paper, or a less technical partial treatment from BBC.Com, "Is There Another Human Being Living Inside You?"

Effects of Gut Microbes on Behavior, Personality & Mood

Certain strains of microbes in our microbiome have been shown to alter a number of neurotransmitter chemicals, for example by manufacturing and releasing GABA and other neuroactive substances, including noradrenaline, acetylcholine, serotonin, and dopamine. These chemicals are involved in mood regulation (e.g., euphoria, anxiety, and depression), risk-taking behavior, memory formation, sociability, and responsiveness to stress, and they likely play a role in certain mental disorders, such as schizophrenia and autism.

The link between specific patterns of gut microbes and behavior has been clearly shown in animal studies in which normally timid strains of mice become adventurous, and adventurous mice become timid, when colonized with the microbiome of the other strain through fecal transplants. In other studies, mice raised with minimal gut microbes showed lower levels of anxiety under calm conditions, but stronger than normal reactions when stressed.  These effects could be eliminated if the germ-free mice were given fecal implants from normal mice, but only up to a certain age: "Thus, early exposure to (healthy) gut flora is required for normal development of the stress response ...[and] neonatal infection with pathological bacteria may permanently alter such response, predisposing the individual to stress-related disorders later in life" (Kramer & Bressan, 2015).  Increasing certain microbe strains commonly found in yogurt reduced despair-like behaviors (e.g., passivity, not attempting to escape stressful stimuli) in rats, and feeding mice a microbe-laced broth improved their memory and reduced anxiety and depression-like behaviors.

Studying the psychological influences of microbes in humans is more challenging because our experiences can be unwittingly influenced by expectations and prior beliefs -- the so-called placebo effect, or just plain "wishful thinking."  Relying on self-reports of mood, for example, is not scientifically convincing, but several studies of the effects of altering concentrations of certain gut microbes have also included more objective measures.  For instance, in one study healthy individuals ingested daily doses of lactobacilli, the same microbe found in yogurt and other "probiotic" products.  After one month there was a significant reduction in self-reported anxiety and depression, but more importantly there was also a measurable reduction in stress-related cortisol levels, comparable in effectiveness to benzodiazepines (e.g., Valium). In another study these same bacteria modified healthy women’s brain activity in regions that control the processing of emotion and sensation, dampening reactions to facial expressions of anger and fear; these same regions are involved in anxiety disorders (Kramer & Bressan, 2015).

Microbes on My Brain

Most of us are hosts not only to gut microbes but also to strains of microbes that colonize our brains. Residing in the brain gives them the opportunity to directly manipulate neurotransmitters and to influence behavior. Particularly interesting, however, is evidence that they do not simply influence mental processes, they also manipulate the brain in ways that increase their own survival and genetic viability.

Unlike gut microbes, brain microbes are almost always parasites, in that they exploit us while simultaneously doing us harm. They are surprisingly (and disturbingly) common. We usually think of parasites as prevalent only in poorer, less developed countries, but in the case of certain brain microbes the rate of infection is uncorrelated with poverty or level of development. For instance, Toxoplasma gondii infects about 22% of the U.S. population (CDC data), 50% of those in the U.K. and continental Europe, and as high as 70-80% in some South American and African countries, but as low as 10% in parts of Asia (Hill & Dubey, 2002; Kramer & Bressan, 2015). It is generally believed that Toxoplasma evolved as an animal parasite and that humans are incidental hosts, infected in modern times because of close contact with certain animals.

Toxoplasma is a particularly good example of a microbe that manipulates the brain activity and behavior of its host. Toxoplasma eggs are usually found in the poop of cats and other felines that have eaten infected intermediate hosts of the microbe. Animals that ingest the poop become infected and the microbe is spread more widely.  A common example is when domestic cats eat infected mice or rats, then excrete poop containing Toxoplasma eggs. Rodents, not known to be picky about their food, eat the cat poop and complete the cycle. Humans who come into contact with infected cat feces (say, through gardening or cleaning a litter box) can also become hosts by unwittingly ingesting eggs they have accidentally transferred to their food. Another source of infection for humans is commercially available food, particularly undercooked meat or fish that has somehow been tainted with Toxoplasma eggs.

Now for the really interesting part. Rats and mice that are infected with Toxoplasma lose their fear of cats and even become sexually attracted to cat urine (see Berdoy, Webster, & Macdonald, 2000).  This, of course, is very bad for the rodents but very good for Toxoplasma, because only in the intestines of cats and other felines -- the parasite's definitive hosts -- can the microbes produce eggs.  "Toxoplasma manipulates the brain of the rat so as to increase the probability that its otherwise uncertain transfer to the cat's intestines actually takes place" (Kramer & Bressan, 2015).  The mechanism for this seems to be an increase of the neuroactive chemical dopamine, which in humans is known to be associated with recklessness, sensation-seeking behavior, and greater susceptibility to schizophrenia.  This may also explain the higher rates of workplace and traffic accidents among infected humans.  Aside from the neurological effects on behavior, Toxoplasma doesn't usually pose serious physical problems for its host unless the victim's immune system is weakened.  Humans with other health problems, young children, and the elderly are at risk, and in infants neurological damage can be quite severe.  For this reason pregnant women (who can be infected but asymptomatic) are often advised to avoid contact with cats in order to prevent passing the infection to their offspring.

Conclusion

Besides Toxoplasma there are a number of other microbes that often colonize the brain, and there are several non-microbial life forms that also influence behavior and cognition. The bottom line for me is that "knowing thyself" requires assessing the contributions of hundreds of trillions of other organisms to our sense of who we are.  The old adage, "we are not alone," applies not only to things that are external to us but also to things that are deeply embedded within our bodies. Perhaps, as Kramer & Bressan put it: "It is time to change the very concept we have of ourselves and to realize that one human individual is neither just human nor just one individual."
 ______________________________
Sources & Resources
Kramer, P., & Bressan, P. (2015). Humans as superorganisms: How microbes, viruses, imprinted genes, and other selfish entities shape our behavior. Perspectives on Psychological Science, 10(4), 464-481.
You're Surrounded by a Cloud of Bacteria as Unique as a Fingerprint: Washington Post, 9/22/15
CDC - Toxoplasmosis - Epidemiology & Risk Factors 
CDC - Toxoplasma gondii Infection in the United States
Berdoy, M., Webster, J. P., & Macdonald, D. W. (2000). Fatal attraction in rats infected with Toxoplasma gondii. Proc. R. Soc. Lond. B, 267, 1591-1594.
Is There Another Human Being Living Inside You?:  BBC.com, 9/18/15 


Friday, August 14, 2015

The Allure of Undoing Reality -- "If Only," "Coulda," "Woulda," "Shoulda"

Humans have a number of "interesting" qualities, some of which seem to be unique in the animal kingdom.  One of these is quite odd and puzzling when you first think about it:  We love to undo reality.  Given almost any event or state of affairs we are very likely to imagine alternatives to it --  a cognitive process called "counterfactual thinking."

As an example, consider the all-too-frequent news story of a gunman who mows down innocent people.  Take your pick of several recent actual cases of this, say the June shootings of 9 people in a Charleston, S.C. church (NPR, 7/10/15).  The facts of this tragedy are clear: a gunman with self-admitted racist motives opened fire after sitting through the church service and 9 people are dead.  At first news stories focused on the scope and details of what happened, then turned to analyses of the implications and possible causes, and finally to counterfactual assessments of how this terrible event could have turned out differently.  "If only" racism wasn't so prevalent.  Or, "if only" the background check of the alleged gunman wasn't flawed, it "shoulda" prevented him from buying a weapon.  Or, "if only" tighter security measures at the church had been in place (e.g., metal detectors, arming the pastor with his own gun), they "coulda" barred him from entering or at least reduced the number of people he killed.  Any or all of these imagined factors might have undone the reality of 9 dead people.

Events in our own lives are also often the focus of counterfactual thinking.  Negative events seem particularly likely to engage our cognitive efforts to imagine alternatives to reality.  An accident, a mistake, an illness, or other bad incident inevitably leads to assessments of counterfactual factors that might have led to a more positive state of reality.  What could we have done differently?  What should we have done?  If only we had done X, then the bad thing could have been prevented, or perhaps something positive would have happened instead.  Of course, counterfactuals may include factors over which we have no control -- genetic predisposition, undetectable environmental hazards, unexpected behavior of other people, etc. -- and their plausibility has the benefit of absolving us of responsibility for the event.  However, these alternatives are often discounted because they imply that we may not be able to control what happens to us -- a very uncomfortable idea for most of us to entertain (even if true).

Counterfactual thinking has been the focus of a good deal of research and theory in social psychology over the past thirty years, beginning with the insightful work of Kahneman & Tversky (1982).  The result is a fairly complete understanding of the nature of the phenomenon -- why we tend to undo reality, the circumstances that govern the likelihood we will do so, the determinants of the kinds of factors we select as the most plausible counterfactual alternatives, the cognitive and emotional impact of counterfactual thinking, the nature of individual differences in extent and style of counterfactual thought, etc.  In order to avoid having you engage in counterfactual thinking along the lines of "if only I hadn't clicked on the link to this blog, I could be doing something way more fun, like sorting my socks," let me just cut to the chase and give you a few highlights of what these efforts have produced.  For more thorough reviews, see the references at the end.

  • Undoing reality has emotional consequences.  As you probably noticed from the examples above, counterfactual thinking is nearly always associated with emotions in two ways.  First, the actual event or state of reality likely provokes positive or negative feelings.  Mass shootings of unarmed people are abhorrent to us. Personal accidents, failures to achieve goals, mistakes we make, losses of loved ones, illness, and economic misfortunes engender fear, sadness, and despondency. Second, the counterfactuals and the alternative reality they generate also evoke emotions; for instance, when we imagine that there was something we could have done to prevent something bad from happening, we feel regret, shame, or anger.  One way of alleviating negative emotions is to engage in what is called "downward" counterfactual thinking -- considering ways things could have been even worse: "at least" the plumbing leak didn't damage our new sofa, or "at least" the car still runs after I smashed it into that wall.
  • Emotional consequences aren't always rational.   Our tendency to take mental short-cuts when we think about events sometimes leads to emotional reactions that rely less on logic or facts than on things like the ease with which certain alternatives to reality come to mind rather than others, based on their recent salience, mutability, or personal relevance.  For instance, losing the lottery with a ticket that is just one number off provokes a much stronger emotion than losing with a ticket that has no matching numbers, even though the probabilities of the two losing numbers are exactly the same.  It is much easier to imagine that "if only" just one number had been different we would have won than to imagine all the numbers having been different.  Likewise, having a costly car accident the day after we forgot to renew the insurance is likely to make us feel worse than if the accident happened a month later, even though in both cases the payment error results in the same financial outcome.
  • Undoing reality is a good thing (usually).  Given the fantastical nature of counterfactual thinking and the angst it can produce, it may seem puzzling why we spend so much time doing it.  The answer comes from decades of research that points to the indisputably functional nature of undoing reality and the emotional response that results (see Epstude & Roese, 2008).  By considering alternative ways an event (particularly a negative one) might have occurred, along with factors that might have prevented it or produced a more positive outcome, we can adapt future behavior to be more effective or to avoid repeating past mistakes. The regret and remorse associated with considering counterfactuals, though uncomfortable, may motivate changes in behavior that are more adaptive in the long run.  Indeed, it could be argued that the human tendency for counterfactual cognitive activity is an evolutionary consequence of having nervous systems that aren't programmed with predominantly instinctual behaviors -- it is the necessary mechanism by which we adapt and change to new environmental demands.
  • But not always.  Like any powerful adaptive tool, counterfactual thinking can be misused or can be applied in ways that lead to personal or social difficulties.  In particular we have to be able to distinguish between counterfactual factors that are realistically controllable and those that are not, otherwise we might become engulfed in feelings of guilt, shame or remorse when it really isn't justified.  Likewise, we can blame others for not foreseeing or controlling things that they in truth could not have.
I guess the bottom line here is that human nature seems to involve considerable cognitive activity that entails imagining states other than those that have actually occurred -- undoing reality is something that makes us (uniquely) human.  Though there is ample scientific evidence that this characteristic is adaptive in an evolutionary sense, it might still be questioned whether focusing too much cognitive effort on alternatives to events distracts us from fully experiencing and appreciating the present.  As with other human tendencies, it might be beneficial at times to control and limit our inclinations, natural though they may be.
_______________________________________
Some Source References:

Counterfactual thinking - Wikipedia

What might have been:  The social psychology of counterfactual thinking. Neal J. Roese & James M. Olson (eds.) Psychology Press, 2014, 2nd edition.

Epstude, K.; Roese, N. J. (2008). "The functional theory of counterfactual thinking". Personality and Social Psychology Review 12 (2): 168–192.

Kahneman, D., & Tversky, A. (1982). "The simulation heuristic". In Kahneman, D. P. Slovic, and Tversky, A. (eds.). Judgment Under Uncertainty: Heuristics and Biases, pp. 201-208. New York: Cambridge University Press.




Sunday, July 5, 2015

"Geezer Grease:" My Missed Opportunity to Make Bazillions

“It's paradoxical that the idea of living a long life 
appeals to everyone, but the idea of getting old doesn't appeal to anyone.”

One of my least favorite parts of advancing into Geezerhood has been nature's insistence on making me pay for past indiscretions in the way I mistreated my body.  For instance, in younger years it was quite the thing for those of us with white skin to deliberately burn it and try to turn it into the ideal "tan."   We applied "sun tan lotion" not "sun screen" or "sun block" because the hope was that it would promote quicker, darker tanning not prevent it.  SPF 4 was about the highest I recall using. Plus, I grew up in Colorado, where sunny skies, low humidity and high elevations guaranteed a massive dose of UV rays.

Well, those years of exposure have now resulted in many nasty little pre-cancerous thingies on my forehead that require special treatments and regular visits to my dermatologist, whom I have on speed dial. I've undergone exotic-sounding procedures in an attempt to stave off worse developments, like "liquid nitrogen thermal destruction," "microdermal abrasion" and "photodynamic therapy," and I have an arsenal of lotions and potions that I apply daily.  And of course I don't set foot out of the house without SPF 50 and my broad-brimmed dufus hat (aka "Geezerware").*

My childhood solar epidermal abuse has also led to skin that is irritatingly delicate and prone to injury.  The most irksome form of this for me is that my forearms bruise so easily that often I can't recall the source.  And when I am aware of the cause, I have watched helplessly as bruising from even the smallest bump or scratch spreads like purple watercolor on wet paper and develops into a hideous, leprosy-like discoloration that lasts 10 days, minimum.  Unlike the precancerous thingies, the bruising and bleeding aren't life-threatening, but they do a real number on my vanity because they symbolically shout "OLD MAN WALKING HERE!!!!"  Along with "age spots" these bruises are almost guaranteed to get you the senior discount at your local retailer without even asking.

I've quizzed my dermatologist repeatedly to see if there is any preventative treatment for my susceptibility to bruising, or barring that some ameliorative cream, pill, or injection. The answer is always "No," delivered with a sympathetic but somewhat patronizing smile (she's a young'n, after all).  Indeed, the current scientific consensus seems to be that this condition is a common aspect of normal aging (assuming more serious causes have been ruled out). It comes from losing some of the protective fatty layer in the skin that cushions the blood vessels -- we literally become "thin-skinned" as we get older.  My dermatologist argues that this process is accelerated by sun exposure: those who abused their skin when young are likely to lose that fatty layer even faster.  Although there are many pseudoscientific-homeopathic-biodynamic-synergistic-astromagnetic-universallifeforceallaroundus remedies available for sale, there is very little hard evidence they do much except bleed your wallet.**

The authoritative Mayo Clinic puts it this way:
"Most bruises form when small blood vessels (capillaries) near the skin's surface are broken by the impact of a blow or injury — often on the arms or legs. When this happens, blood leaks out of the vessels and initially appears as a black-and-blue mark... As you get older, your skin also becomes thinner and loses some of the protective fatty layer that helps cushion your blood vessels from injury." [my emphasis]
And to make this even more depressing, the Mayo Clinic says that
 "Once a bruise has formed...not much can be done to treat it. Most bruises eventually disappear as your body reabsorbs the blood — although healing might take longer as you age [my emphasis]. It might help to elevate the affected area and apply ice. If the sight of a bruise bothers you, cover it with clothing or makeup."
Thanks, doc. I should put ice packs on my bruises and walk around with my arms held in the air, while wearing a long-sleeved shirt in Hawai'i?  That's all you got?  If we can send rockets to distant comets and develop nanobots that can deliver drugs to specific tissues in the body, surely we can come up with something better than that.

I was recently discussing this with some fellow geezers (well, one was a soon-to-be geezer still in denial).  We had just been hiking and had the cuts and bruises to prove it.  Over much-needed beer we engaged in some "competitive complaining" (see my blog "Geezer Olympics") about bruising and other skin problems but then started talking more productively about possible preventive measures.  Of course much of what came out was "alchological," meaning it makes much more sense when you are under the influence of alcohol. So have a couple of shots before reading on.

In particular, we concluded that what the world's geezers need most is a special transparent cream that contains (a) nanoparticles that form a thin, flexible, protective shield on the skin, maybe like the new kinds of body armor that rely on nanotechnology (see Discovery, 4/2/13), (b) a highly concentrated antioxidant of some kind, (c) a super sun block of at least SPF 100, (d) a broad-spectrum antibiotic just in case (a) doesn't work completely, and (e) moisturizers and various beautification agents (why not?).  Of course, for vanity's sake this wonder-cream would be completely invisible on the skin.  Lather up with it before your morning coffee and voila!  No more bruises!

We could market this stuff and make bazillions!!! ("How about another beer?")  A name. We need a name for our product.  Something that our intended market would immediately identify with and rush out to buy.  I've got it!  How about "GEEZER GREASE?" 

The next day the practicality of our idea seemed to have faded considerably. Still, it was a very appealing notion even if a bit fantastical.  The name in particular had a nice ring to it.  Then, just for the heck of it I Googled "geezer grease," not really expecting any results.

Wrong.

Mill Creek Catalog
Turns out someone has beaten us to the name and is already using it for a skin balm. Their grease doesn't have any nanoparticles, but it does include an interesting ingredient we hadn't thought of -- cannabis.  Yup, a pot-infused ointment to "cure" your skin troubles (or at least make it so you don't care as much).  The product is sold at a couple of outlets, including Mill Creek Natural Foods and Green Stop Cannibis.  Mill Creek is especially enticing in their description: "When you need serious natural skin care for dry, itchy or chapped hands, feet, elbows, or to help with minor scratches, excellent on small cuts. This rich blend will feel so smooth and soothing...just a tiny amount is all you need. Hand blended with extra virgin olive oil, cannabis, calendula blossoms, comfrey root, goldenseal, vitamin E, rosehips & beeswax." [my emphasis]

Well, we missed our opportunity to make bazillions of dollars by ourselves, but maybe we could join forces with the cannabis company and come up with a new product that combines both sets of ingredients.  We could call it "Super Geezer Whoopee Grease."

_________________
*My apologies to those of you who (a) look good in broad-brimmed hats, (b) think you do, or (c) don't but aren't vain like me and feel sun protection is more important than looking good.  I, however, am vain and know I look like a dufus in most hats, particularly those with a broad brim.

**There are a few products that may have some small degree of protective or ameliorative effect but have mainly anecdotal evidence or inconsistent scientific support.  Retinol, proven to stimulate collagen production and reduce fine facial wrinkles, might work on forearms by improving the supportive structure of the skin.  Alpha-hydroxy compounds, which promote exfoliation and new skin growth on the face, may also work on arms, but this has never been shown scientifically.  Arnica, a plant-derived extract, has weak and inconsistent data supporting its efficacy in speeding the healing of bruises.  Oh, and a method sure to work is the use of forearm guards, or chaps -- a real geezer fashion statement.
_________________