An Exploration of Gluttony

Jacques Callot, The Seven Deadly Sins - Gluttony (Photo credit: Wikipedia)

As I have done twice before (with Sloth and with Envy), today I plan to explore the Deadly Sin of Gluttony.  Partially this is because Gluttony is one of the few Deadly Sins I remember without having to look them up, and partially because this was the sin above all others (with the possible exception of the array of “sins” surrounding sex) about which my mother commented to me–at length.  So here we need to look at another sin of the flesh.  Let’s explore Gluttony.  Simply put, the sin of Gluttony is the overindulgence in food and drink and the obsession, the preoccupation, with sustenance as pleasure.  The wariness most human institutions display regarding pleasure of any kind is a topic for another day, but suffice it to say for this essay, it seems likely that all creatures that eat and drink feel pleasure when doing so.  It seems to be the body’s way of encouraging a creature to leave the safety of a nest or hiding place to go and find what’s needed for continued life.  At its most basic, life itself, continued existence, requires food and at least water, at more irregular intervals perhaps, but just as absolutely as it requires breathing (of some kind).  And all living bodies reward the finding and ingesting of food and water with pleasure.  And therein, in human cultures, lies the rub.

Now, any human need can and will be abused, can and will be twisted, can and will be taken too far in any possible direction.  Food can have an emotional component much more complex than the simple “food is necessary and therefore pleasurable” equation explored in the above paragraph.  For some people, food was made equivalent to love by a parent or, for some others, food was the only pleasure possible.  These motivations simply scratch the surface.  We all know that humans can become addicted to their pleasures, whether of alcohol, drugs, food, sex, gambling, even shopping according to some experts.  This has something to do with brain chemistry, something to do with filling up an empty space emotionally, something to do with early and inexplicable losses.  Now such things are, at least in the scientific community, a matter of the physical self, not a matter of will.  But to most of us, and most especially to the accepted dogma of our religions, addiction is still equated with lack of discipline, and all self-feeding behavior with a turning away from the hope of salvation.   Thus, sin.  And, with food and drink, the sin is Gluttony.

Gluttony will always be perceived as the pig in the trough, slopping through enormous amounts of food, with no real appreciation, in the smallest possible amount of time.  Which, regarding the pig, of course, is a bit of a canard.  In real life, rather than in legend and moral tales, pigs apparently eat neither that sloppily nor that hugely.  And, since their function for humans is to put on a lot of weight in order to provide a lot of meat and lard, the somewhat unpleasant concept of force-feeding enters the picture, thus making the pig hardly to blame for his, well, piggishness.

"Whatever it is, we'll eat it!" Youn...

"Whatever it is, we'll eat it!" Young pigs enclosed next to the footpath near Monkhall. (Photo credit: Wikipedia)

My own definition of Gluttony as either a sin or as a behavioral problem to resolve probably has to do with occasions when I might eat beyond fullness, when I, in essence, force-feed myself.  When the desire-reward cycle gets out of whack and all I want is more, more, more!  When all thought of sustenance, moderation, health, diet, anything, even pleasure, is lost in the perpetual motion machine my fork becomes.  Did I ever do this?  I remember younger times when I would go out in the evening after a full dinner and still order a hamburger and fries and a malt and eat every bit.  I remember Thanksgivings and buffets in hotels and all-you-can-eat pizza parties.  Is this sin?  Is this Gluttony with a capital “G”?  Remembering those times now, it’s not hard to think so.  And yet, such occasions were relatively rare.  Besides, I have never binged in the classic sense of eating an entire box of cookies or a gallon of ice cream all alone.  That would make me physically ill.  And while I (in spite of all the pounds I’ve lost) still have many more to lose, by and large those pounds were not gained from binging, from force-feeding, from Thanksgiving dinner and Vegas buffets.  The too, too solid flesh still clinging to me came from eating perhaps just a bit more than I should have and exercising a lot less than I should have (for my own benefit, not the desires or approval of others) over a lifetime, not from any specific banquet where Gail kept going back for more Beef Stroganoff, don’t hold the noodles.  It’s hard to find that sinful.  But then, as with many sins, it’s much easier to identify the Glutton, the Slothful, the Envious, the Vain, in others rather than one’s own self.

Edward Curtis photo of a Kwakwaka'wakw potlatch with dancers and singers, 1914. (Photo credit: Wikipedia)

And why is Gluttony a sin in any event?  As in my other essays on this overall topic, let’s look at the advantages to civilization if Gluttony is considered a sin.  First, in our hunter-gatherer days, Gluttony would have been the mark of selfishness.  To cram into one’s own mouth what should have been shared with the tribe for everybody’s well-being would have been considered a mark of wrongness from the very start of the human race.  Just as sharing all one has, especially all the food one has, is considered a mark of favor, a signal of a generous human soul, hoarding to keep all that necessary food to oneself is a very bad thing.  These concepts still operate today.  In times of disaster, hoarding items needed by the community is still considered wrong, even when the hoarder has merely been provident in advance of the potential disaster.  And generosity is considered such a good thing in human society that, for example, certain tribes of the First People of the Pacific Northwest created the Potlatch, in which a chief would give away all he had as a signal both of his wealth and of his goodness, hoping only that it would be incumbent upon those to whom he gave to then themselves invite him to a Potlatch in which he would, essentially, recoup all and more of what he gave away.  In my own Christian tradition, there is the concept of casting one’s bread upon the waters and having it returned a hundredfold.  (Often, the implication for individuals is that you cast your bread on the waters in order that it be returned a hundredfold, when that is not what was meant by the proverb.)  Obviously, this is a sophisticated concept, this idea of generosity with food; however, there is no complexity to the concept of hoarding, hiding, keeping it to oneself.  That is considered nasty the world over.

Cities of the ancient Near East (Photo credit: Wikipedia)

But the signal reason for Gluttony becoming a sin, in my judgment, was the beginning and continuation of agriculture and thus private property.  To me, this was still a huge mistake on the part of humanity, as I’ve noted before in my essays discussing Sloth and Envy.  In agricultural societies, the “Big Man” (this is a current theory in anthropology, which attempts to explain the eventual rise of kings) would be a person able to talk others into helping him sow, cultivate and harvest his crops.  In return for giving them a (small) percentage, he would store the rest against the bad times of year or against drought or disaster.  Thus he became wealthy and powerful.  And of course, his motivation would be, basically, Gluttony–the desire to have more than enough to eat and drink.

As time went on, the piling up of extra food through cultivation of crops required the invention of, well, infrastructure.  A man spending all his time cultivating crops or even supervising others doing so would not have time to hunt for meat, to grub out clay for pottery, to make mud bricks to build storage units or a home, would not have time or the resources in his own self to protect all he gleaned from those who would steal it.  And so had to be born in the world specialists who, in return for a percentage of his crop, would make the pottery, graze the goats or sheep, shear them, weave the fleece into cloth, build the mud brick buildings, create a military to protect the community, and, and this is by no means the least of the important specializations, build up temples manned (or womaned) by priests to please the gods.  All these things were paid for by a percentage of the crops.  A gluttonous Big Man, one who hoarded, who waxed in fatness, who ate too much himself, and allowed his family and his servants to eat too much, kept that percentage small and thus the priests, the king, the military men, even the goatherds and builders and potters and whatnot, all those who did other things than raise plants and animals, would not have enough to continue their tasks, let alone have any chance of indulging in gluttony themselves.  And so the civilization prospered because of a balance between the Gluttony required to build and the Gluttony that would destroy.  (The map to the right of the start of this paragraph shows one of the earliest of such civilizations, Mesopotamia.)

But while Gluttony that results in creation would be a good thing for the Big Men (and their families) and the resulting kings, priests, generals and whatnot, Gluttony could not be allowed to pervade all the people in the society.  It could not even be allowed to exist among even “important” people other than the king, priests, etc., because then such people would hoard the king’s share, the priest’s share.  And the king and the priest, just as the potter and the herder, did not sow, cultivate or reap their own food.  They subsisted entirely upon the labors of the farmers.  No, Gluttony had to become a sin, had to be punished, guarded against.  From the farmer who stored the surplus grain and paid out of it his taxes to the lowliest worker on his worst farm who would be punished, sometimes with death, for hiding any part of the harvest, everyone in the society needed to feel that such hoarding, over-consumption–Gluttony–was a great sin and would be horrifically punished in this world or the next.  Of course, in the meantime, the king, the general, the priest, the Big Men, did hoard food, did engage in banquets in which food was consumed to a point where rooms had to be set aside to allow for vomiting so that a guest could start all over again, did waste food and lay waste to the countryside in order to get it.  Anyway.

But then, a sin does not rise to the status of a Deadly Sin if it isn’t a popular failing.

Thus, naming Gluttony a sin is a good thing for the civilization, as it is normally constituted.  But is there anything that can be said in Gluttony’s favor, as we have done with Envy and Sloth?  Well, obviously, Gluttony underlies the will to store food against future disaster, which begins the whole cycle leading to diversification of labor and thus eventually to a civilization.  Plus, to be fair, the immense variety of foodstuffs, the variable flavors of food, would probably not be a part of our world without Gluttony.  Gluttony serves as the undoubted basis for the migration of human groups throughout the world, the underlying cause of human beings taking over the planet, the reason we cook and prepare food instead of simply gnawing at bones and leaves, and the impetus behind the development of domestic strains of grain, of other plants, and of animals.  Without human intervention, for example, there would be no cattle.  No, none at all.  We developed cattle from wild ruminants, but apart from yaks and water buffalo, what we know of as cattle are quite different from their wild kin and would not be able to survive without continued care (or exploitation) by humans.  Gluttony led to population growth.  Gluttony led to appreciation of the finer things in life, well-cooked and delicious food, wine, beer, ale, mead and liquor.  And, in a sense, Gluttony can be extended to the consumption not just of food and drink, but also of art, of fashion, of performance, of architecture, landscape, beauty.  Without Gluttony, life would be less interesting, less satisfying, less civilized in its more refined sense.

But all the things I’m listing in the above paragraph are in themselves an indictment of Gluttony.  Without it and what it creates, we would be a small population living with the Earth instead of exploiting and destroying it.  Without the wealth that resulted from Gluttony, there would be no civilization so attractive to outside human groups that war was invented and used (and still is) to overcome that civilization and get all that wealth for themselves.  Not that we would recognize such a world.  For most of us, probably, we would not love such a world, no matter how much healthier we and the world would be.

As for me, with my love of good food, of cooking, of creating intriguing new dishes from delicious ingredients, I am definitely part of the problem.  Because I have to breathe to live, I prefer my oxygen to come from pure mountain air instead of the stuff on a subway platform.  Because I must drink water to live, I prefer it to be unpolluted and fresh, rather than skimmed from a puddle.  And thus do I want my food to be good, fresh, healthy and delicious.  And plentiful.  I have to eat to live.  I try not to live to eat.  But, much like Lust, which I may explore in a future essay, Gluttony is never satisfied.  There is never enough.  If we indulge in all that we want, we will want more tomorrow and the day after that.  As with all the Deadly Sins, without feeling their drive we would not be human, would not have the world we know.  But their drive is not self-limiting.  Unless we find a way to rein in Gluttony, in particular, we will gnaw the planet bare.  And that’s not good for anybody.


Taste, Trends and Cowboy Boots

Painting "Herd Quitters"

Have you ever pondered the difference between what you are supposed to like and what you actually do like?  I’m not thinking, here, about the truly important stuff, such as sexual preference (which is almost certainly not a choice), or with whom you fall in love anyway (which is more like compulsion or madness).  This is the more surface stuff, more about still not liking tangerine even when it’s the “in” color (I say it’s orange and I say the hell with it) or (like Ed Wood of long-ago B-movie days) loving Angora shruggies whether they are fashionable or not (something I can’t wear whether I like them or not or whether I think I should like them or not, because Angora itches).

Or, even more simply, what we are taught by our mothers (usually), local style mavens (often), and the media (all too often) to think of as stylish, trendy, fashionable, cool or just in good taste may not be what we, in our heart of hearts, really find pretty, attractive and delightful.  I remember in high school thinking that the pep club uniforms we had (slightly-above-the-knee purple box-pleated skirts with German lederhosen-style straps worn over white button-down Oxford shirts and with white tennis shoes) were really good-looking.  I liked the quality of the wool flannel in the skirt, I liked the hidden stitching on the stitched-down portion of the box pleats, I liked the simplicity of the purple and white, the shirts and tennis shoes complementing the skirt.  I thought the tout ensemble of the whole (as a friend’s mother would put it) looked good on me.  And I did not dare say so.  All the comments I ever heard about this uniform were, ahem, uniformly negative.  It was considered clunky, even then (and, yes, this was a long time ago); it was considered dowdy and totally uncool.  Nobody liked it.  So I, in my 16-year-old wisdom, didn’t like it either.  But I really did.

This led to confusion over time, because I learned probably the opposite of what I should have learned.  I learned that I’d better trust others’ taste in preference to my own.  I learned that what I liked was kitschy, ordinary, dowdy (that word again) and that what I was supposed to like was all that was cool, trendy, attractive.  And so I tried to like it.

Black Western cowboy boots on a white background (Photo credit: Wikipedia)

For example:  I’ve lived virtually my whole life either in the West, in small towns or medium cities, or in Los Angeles or New York City.  And somewhere along the way, I fell in love with Western-style clothes.  Specifically, such items as cowboy boots, snap-buttoned shirts, and fringed leather jackets.  But for a long, long time I didn’t tell anybody that, because when I was growing up, to be a cowboy or to like such Western styles and ways of living was the most totally uncool thing you could do.  As I recall, the cool kids had truly unpleasant appellations for the cowboy kids, to which I will not give any credence by repeating them here.  And oh how I wanted to be a cool kid.  (I wasn’t, because I was an academic nerd, a term that had not been invented yet.  I liked most of my teachers and the challenge of learning stuff.  This is ALWAYS uncool in high school, at least in public high school.)  So I pretended to go along with the contempt (and it was true contempt, growing out of the bottomless pit of insecurity that a teenager lives with every day) for cowboys.

But way inside where I didn’t even look I really liked how they dressed.  And I couldn’t admit it.  Not even to myself.

A few years later, when I lived in Wyoming, where everybody was a cowboy (except for the cowgirls) and that was just background, not even a lifestyle choice (which is a term I don’t think anybody who lives in Wyoming understands or wants to understand), I went with a girlfriend to Cheyenne Frontier Days, one of the best rodeos going.  And that’s when I first really met up with, watched, and started to understand real working cowboys.  Rodeo cowboys, at least.  For them, wearing jeans bleached practically white, with a round white patch on the back pocket where the chewing tobacco rubbed against the material, wearing tight shirts with snaps instead of buttons, and wearing, of course, and most iconically, the hat and the boots, didn’t have anything to do with style, with cool, with any sort of John Travolta post-modern irony.  These were simply the clothes most practical for a physical, demanding way of life filled with hard work and not a lot of money.  You wore cowboy boots because if you rode, the pointed toes got your feet into the stirrups quickly without your having to look down, and the high heels kept your feet from slipping through the stirrups, so that you wouldn’t be caught and dragged if your horse threw you.  The hat?  Wide brim to keep off the brutal western sun, deep crown to use to water yourself or your horse.  Jeans because they don’t wear out and you don’t have enough money to buy lots of pants.  The tight shirt with snaps?  The tight part is to protect against brush and thorns that would catch on looser material.  I don’t actually have any guesses about the snaps.

But Western wear has always been stylin’, whether it was “in style” or not.  Snaps and complexly designed yokes and fringe and embroidery were a major part of the look of a Western shirt.  And, let’s face it, during the mid-years of last century, Western wear was one of the few ways a man was allowed to express his own taste for color, style, for actual pretty, in what he wore.  And still be the most macho dude around.

So, here were Pat (my friend) and I, wandering around “backstage” at Cheyenne Frontier Days.  And I mostly noticed that people who are very comfortable in their skins, in their choices, look like they belong in their cowboy clothes.  This is something that can be extended, of course, to any style of clothing.  Queen Elizabeth II looks quite comfortable in satin encrusted with embroidery and jewels, wearing her orders and sashes and necklaces and tiaras and crowns.  For her, it’s not a costume, it’s not “cool”.  It’s just her uniform for a certain part of her working life.  I also noticed that the real working cowboys, whether their work is ranching or rodeo, look so utterly, droolingly delicious in their jeans and boots and snap-buttoned shirts and hats that a mere female has a real hard time remembering that these men are not icons, they’re human beings, with undoubtedly human problems.  I’m not suggesting that a girl shouldn’t get interested in a cowboy (or vice versa), but somewhere between “they never say a word and they’re always hurt” and “my heroes have always been cowboys”, it’s probably best to find the cowboy who interests you more for his thoughts, his humor and his liking of you than because he can wear the hell out of a pair of tight jeans.  Just sayin’.

But they sure are fun to look at.

However, that congruence between what I really liked, what my taste genuinely was, and what was out there to like, what was okay to like, didn’t survive the end of the rodeo season.  For one thing, I moved away from Wyoming.  For another, it still wasn’t cool in Colorado to like cowboys.  Oh well.  Life went on.

Roy Rogers and Dale Evans. Photo taken at the 61st Academy Awards, 3/29/89. (Photo credit: Alan Light, via Wikipedia)

And eventually I moved to Los Angeles.  LA is not part of the West, just so you know.  It may have been once, when Roy Rogers lived in the San Fernando Valley, but in the eighties and nineties when I lived there, LA was just too cool and trendy, too center-of-the-world, to give house room to the real life of the West.  But even there, there came around, as it does every few years in LA and New York City (but apparently nowhere else except every place in Texas), a fad for the cowboy look.  Oh, not for being a cowboy, just for looking like one, in a sort of deconstructed way.  And people who always seem to know what the next big thing is would rent a vacant lot or a parking lot, put out thousands of used cowboy jeans or Western shirts and/or thousands of pairs of used cowboy boots, and people would buy them and buy them and buy them.  I did too.  I got a pair of black lizard Frye boots with really high heels and really pointed toes for some impossibly small amount of money and loved them to pieces, even if they were a bit narrow for my fat little baby feet.  I’m still mad at myself for getting rid of them.  One of the reasons those vacant lots filled with used boots were even possible is that you can’t kill a good pair of Frye boots no matter what you do to them; they’ll outlast you (or at least your ability to walk in boots with really pointy toes and really high heels).

Of course, I wore them the trendy, LA way, NOT with jeans, snap-buttoned shirts and fringe, but with long swirly skirts that were in style and so, of course, not Western.  And, heavens above, not with a cowboy hat.  After all, you had to have some standards.  And the cognitive dissonance went on.  Because I really liked those cowboy boots and what I wanted in my hidden self was to wear them right, with a fringed leather skirt or with chaps and jeans, and (even though I get the worst hat head you ever saw) with a cowboy hat.  And no matter how completely un-trendy it was (and it was), I wanted a fringed leather jacket and turquoise jewelry (none of which I could afford).  I really wanted them.  And I kept quiet about it, because just saying it out loud would brand me as some kind of nerd, geek or whatnot, with no style at all.

Finally I moved back here to the West.  Oh, not for that reason.  And not without a very large detour to New York City where I discovered that while what’s in style rules on 5th Avenue, you can wear what you want and like what you want in the Village (at least, you can so long as it’s black).  Which helped me, finally, realize that it was okay, it really was, for me to like things (cats, Georgette Heyer novels, Sherlock Holmes, Frye boots, fringed leather jackets, Tex-Mex food, Arts and Crafts furniture, Victoriana, Fiestaware, and the American West) because I liked them.  Whether somebody else did, whether it was cool or trendy, mattered not in the slightest.

I started buying turquoise jewelry.  Not the really good stuff, I still can’t afford it, but I have a couple of pieces I wear almost all the time.  I have a fringed, embroidered, suede Western jacket.  I just bought a pair of cowboy style ankle boots with conchos on them (I can hardly wait to wear them with the new jeans four sizes smaller than I’ve worn for years).  I go to the Rooftop Rodeo here in town.  And I’m starting to not care whether or not the Western-style pieces I’m looking at (rugs, cushions, even furniture) are cool or trendy anywhere but in my mind and heart.

Even more, I’m realizing that it’s okay for me not to like stuff that is cool or trendy.  No more apologies that I’m just not a minimalist when somebody tells me that the best furniture is Mies van der Rohe.  I know who he is, his stuff is lean and gorgeous and simple, and I couldn’t live with it for a minute.  I’m finally learning that stating for the record that I don’t like modern furniture is not going to get me drummed out of the human race; it’ll just keep me from being invited to a house where there wouldn’t be a comfortable place to sit anyway.  So now I can admit out loud, darn it, that I really liked those high school pep club uniforms and that I don’t care if tangerine is this year’s best color, because it’s orange and I hate orange and always did.

"The Cow Boy"

Manners

Although I’m not sure I want to reveal this to the world, I read etiquette books for entertainment.  I have a collection of them, the earliest published in the 1870s, the latest Miss Manners’ new revision published in 2005.  This is part of my interest in history, because etiquette books help me understand not just how people actually behaved, but how they thought they should behave.  And sometimes, these old books give some form of insight into why.  Furthermore, reading a series of such books over time lets us see which human behaviors have actually changed, and which haven’t.

Judith Martin (aka Miss Manners) upon receipt of the 2005 National Humanities Medal (Photo credit: Wikipedia)

If books about the proper use of forks, leaving of cards, and methods of introductions seem a strange source for any such insights, I can only suggest you try reading them.  Just as laws prohibiting something (running red lights, for example) are only promulgated if a lot of people are doing the prohibited action, rules of etiquette work the same way.  An etiquette book will only write that it is rude or boorish to use the tablecloth as a handkerchief if people are using tablecloths to wipe their noses (which is a really disgusting thought, isn’t it?).  And today’s etiquette books don’t even mention using the tablecloth in such a fashion because apparently the shame of it all finally changed the behavior.

On the other hand, every etiquette book I’ve ever read has long, long, long treatises on training children into the civilized pretence that they’re grateful you came to their birthday party, and that the present is a surprise and delighted in not because it’s the latest toy but because it was so thoughtfully given.  Apparently, human nature is not going to change that fundamentally.

Historically, the role of etiquette in human life apparently has been twofold:  the first, to make it possible for humans to live in social groups without decimating each other; and the second, to help us in the task of arranging our societies hierarchically (that is, to know who is on the same level as we are and to keep the arrivistes out).  But even more basic than that, humans do not have a built-in set of instincts or hard-wired behaviors to help us live in groups, as gazelles do, or dogs or even gorillas.  Moreover, we live in social groups much larger and more complicated than our DNA was designed to handle.  Even now, it is noticeably difficult for anthropologists to determine which social behaviors come “naturally” to humans, even those living in small groups.  So laws are necessary for us, and religious and moral systems, and etiquette.  All civilizations have systems of etiquette, just as they do laws and religions, and all are designed to, well, control human behavior.

So let’s look at the two reasons given above for the use of etiquette in our lives.  The first makes rules from the simplest (when walking up or down stairs, keep to the right) to the most complex (you leave one of your own calling cards and two of your husband’s when making morning calls) in order to make living in groups of people larger than one’s family, well, easier.  If we all more or less keep to the right when climbing or descending stairs, we get to the subway platform faster (which allows us more time to wait for the subway, but no system is perfect).  The calling card issue, while out of date in today’s world, and too complex even when it wasn’t out of date, does have a logical basis.  Married women (with their marriageable daughters) made the “morning” calls (always made in the afternoon).  They left one of their own cards (sometimes, if they had a daughter, with her name penciled on it) for the lady of the house to keep; they left one of their husband’s cards (it was assumed he had far more interesting, or at least less boring, things to do with his time than make morning calls) for the lady of the house; and they left one of their husband’s cards for the gentleman of the house (it was considered rude for a lady to leave a card for a gentleman, for the obvious but never overtly mentioned reason that the only possible relationship between a lady and a gentleman not of her own family was, ahem, romantic).  Thus, the husband was taking part in social life without being bothered (which was probably what he had in mind), and all the recipients had bits of pasteboard with names and addresses on them from which to make up their invitation lists.

In High-Change in Bond Street (1796), James Gillray caricatured the lack of courtesy on Bond Street, which was a grand fashionable milieu at the time. (Photo credit: Wikipedia)

All of this sounds so arcane and ridiculous to us, doesn’t it?  But it has its present-day analogues.  Morning calls became incredibly elaborate, but their original function was to stay in touch with those one wanted to remain friends with, become friends with, or social climb to be friends with.  Today, we use Facebook, Twitter, email, texting, even the incredibly outdated telephone call, to do the same thing.  All this kind of technology we use in our iPhones, iPads and computers today was first created in order to facilitate and ease the human need to stay connected with friends, acquaintances, and whatnot.  It is hard to even imagine today how isolated a family would be in its own house before the invention of the telephone.  What other way than getting out of one’s own house, walking or riding or taking a cart or coach to somebody else’s house, and then physically “calling” upon them would there be to maintain one’s friendships?  When “calling” first started, even writing a letter was a major issue (some very fine people couldn’t write, some postal systems were dreadful, and postage costs were very high–in eras when a penny bought a loaf of bread, to send a letter cost a penny or more).  And, by the way, it was called a “morning” call because until the 1820s or so, “morning” was all day from arising until dinner–people didn’t usually eat lunch, and “afternoon” as a concept didn’t really get started until the 19th Century.  (This is reflected even in our Bible, where in Genesis the narrator says “and the evening and the morning were the first day.”)

I’ve gotten off track.  My point (and, to quote Ellen DeGeneres, I do have one) is that, however simple or elaborate, however common-sensical or ridiculous, the system of etiquette in general and most of its rules in particular are designed to both ease and codify the way we humans behave in groups.  Etiquette is designed to supplement law and morality and to handle those small items of human contact that don’t rise to thou shalt not kill.  Rather, they remain on the level of one just simply does not spit on the sidewalk.  (This, by the way, is a rule that I wish was more honored in the observance than the breach.)

The second use of etiquette is or can be considerably less benign.  Humans, no matter how right or wrong each considers the concept, live in hierarchies.  Even in the most liberal and free of countries, there are hierarchies, some more or less codified, some simply feeling “natural.”  The hierarchies in some countries seem cruelly limiting and immovable to our eyes, those in others may seem so nearly invisible that the country approaches anarchy, but they are always there except in the simplest hunter-gatherer groups (where the principle of hierarchy is anathema and any attempts by any tribe member to behave exceptionally in order to get exceptional treatment are shamed).  Part of this makes sense, as it did to Samuel Johnson, who said (testily, as he said most things) that the idea that the highest ranking person went through a door first was not snobbish but merely practical, designed to get the show on the road (that is a very loose paraphrase of what he said, by the way).  He does have a point.  Although I would say most of us in the United States would not agree with it in principle, there does need to be an order to things.  In the past, in terms of etiquette, the order was often from the top down.  What made it unfair to our eyes was that the people who did the ordering were almost always those sitting at the top or at least those who could convince others they were sitting at the top.  Usually this was not done, at least originally, in any mannerly fashion but by simple force of arms.  After that, of course, the hierarchy was ordained by God and that was all there was to it.  Many of us in the modern era find this rather suspicious, especially given the words of our several religious heritages, most of which state that the humble are quite as important as the, well, important.

Be that as it may, humans were never very good at accepting the idea of a hierarchy unless they were at the top or could reach it, and so etiquette began to perform a double function.  First, the people on top elaborated their etiquette, as they elaborated their clothes, to distinguish themselves from the upstarts crowding in on them from underneath.  Second, the people underneath (those upstarts) began to copy the manners they perceived in their supposed social betters so they were less distinguishable from the ones on top.  This became quite a race starting in the late Middle Ages, when trade and the creation of wealth from means other than plunder got started again.  Its most amusing and appalling recrudescence from our point of view is probably that of sumptuary laws, which defined what each segment of society could actually wear.  Believe me, this was not much of an issue in the Dark Ages, because nobody had good-looking clothes.  But once it was possible to import fine wool or even silk, it became a major THING.  There were even laws in Parliament distinguishing what a middle-class tradesman’s wife could wear (boring black and dark colors with high necks and long sleeves) and what the Earl could wear (silks, velvets, ermine, furs, jewels).  This might seem to some to be as limiting for the Earl as for the tradesman’s wife, but it probably was more galling to the latter than the former, especially when the tradesman became the chief of his guild and had more money than the Earl down county whose castle was falling down.

Why were clothes so important?  Because how else do you determine if somebody is SOMEBODY or just folks?  As Russell Crowe expounded in a recent movie “Robin Hood”, what is the difference between a knight and one of his men-at-arms?  Well, primarily the horse, because nobody but knights could afford them, but also the fact that the knight wore chain mail and tunics in the colors of his heraldry, while the man-at-arms wore coarsest wool in dark colors.  So anybody, whether low or high on the hierarchy, could tell literally at a glance who was who and who was where simply by what they were wearing.  (This seems really odd to us because except at weddings and suchlike, most of us wear jeans and t-shirts (or would like to, even when we can’t), no matter what rung of the social ladder we’re clinging to at the moment.)

Knights of the Temple II (Photo credit: Wikipedia)

Oddly enough, this has repercussions to this day.  Certain professions have uniforms, sometimes explicitly so, sometimes simply by an unwritten compact.  Beat cops and traffic patrolmen wear uniforms, as do all members of the military on duty.  So do janitors, usually, and doctors and nurses.  Laboratory technicians wear lab coats, insignia of their profession, while Supreme Court justices and district judges wear judicial robes, insignia of theirs.  The members of church choirs wear robes, too, their uniform.  And we all know a lawyer or accountant when we see one, because they always dress (whatever their gender) in business suits.  There are many reasons for uniforms, but they all are based in the simple problem of recognition of a professional or social group by those not members of that group (or even by other members of the group).  Doctors wear scrubs because their own clothes are less sanitary, or at least less easy to keep sanitary, but the scrubs are relatively uniform in appearance so we can all tell, when we’re in the emergency room, which is the woman who’s actually going to stop the bleeding.  Some uniforms become amazingly complex and dazzling (look at a picture of a Marine Corps general one day and you’ll see what I mean), while others stay simple or become more simple through time (those judicial robes are the descendants in spirit of the elaborate churchly or noble robes of the Renaissance).  I suppose there could be a rather sniffy moral to be drawn about which uniforms get fancier and which don’t, but I’m reaching the end of this essay and I’d rather not be sniffy anyway.

Again, uniforms and sumptuary laws are an example of the use of etiquette as a means of organizing society vertically, as it were, just as rules like not touching the water fountain with your mouth help to organize social groups horizontally to make life simpler, easier, more elegant and more pleasant for everybody in an equal way.  Etiquette has gotten a bad name over the centuries for the vertical organization, because it is basically not fair or equal.  Unfortunately, for many of us, we’ve thrown the baby out with the bath water and decided all etiquette was wrong or limiting or constricting because some etiquette has been used to exclude.  Which results in a lot of spitting on the sidewalk, attempting to go up stairs which are filled side to side with people descending them, making ascent impossible, and such outrageous situations as no more morning calls.

Manual on Courtly Etiquette, Volume 10 (稿本北山抄, kōhon Hokuzanshō) (Photo credit: Wikipedia)

Spring Thoughts

Aspen trees near Aspen, Colorado

[NOTE:  I’m categorizing this post also as  “writing” because I am attempting to write a somewhat descriptive essay–creating a picture with words.  I would be most interested to know if I approach this goal, but then again, I’m putting in several images to help . . . . ]

Received an ecard today from a friend filled with budding flowers and trees and an Easter message, and I realized that spring did in fact, ahem, spring going on a month ago, in late March, as it always does.  Except in the high country in Colorado.  Here, I have always maintained, we have one day of spring in which the aspen bud (aspen is both singular and plural so imagine I mean “aspen trees bud”) and the lilacs bloom.  This happens some time in June, hopefully early June, hopefully after the last snow, and then we have approximately two and a half months of summer, if we’re lucky.

This early spring we’re having here in Colorado (completely apart from the lack of rain or snow and the resulting fire danger) is a little disconcerting.  Whether it’s a weather (ooh, clever use of words, there, Gail) anomaly or a symptom of climate change (a scary and controversial topic into which I’m not going), it’s not what normally takes place at high altitude.  Here, historically, we’re more likely in March, April and May to get heavy snows instead of snowdrops.  I’m trying to remember (using increasingly faulty equipment) when in past years we saw the first crocus, the first robin, the first bluebird, and it seems to me it was later in April than it happened this year.  I definitely remember, however, always seeing the first crocus peeking through the snow.

In any event, spring has a special feel to it, doesn’t it?  Freshness, balmy air with a few brisk winds for contrast, growing things.  I don’t think there’s a green as beautiful at any place or time as the green of new leaves with the sun shining through them.  All the animals start up their lives again after the winter’s rest, scurrying around finding food and nesting materials, making homes, getting ready for babies.  The birds chirp so cheerily and some of them dart around in such finery, their feathers so filled with color and life, they lift the heart.  And even while recognizing the practical reasons for flowers, oh they look so frivolous and bright, waving in the breeze on their stems.  Even here in Estes Park, where we don’t have much spring to speak of.

Now, New York is a place that understands spring!  They do the season right in that state.  Nature in New York starts with the forsythia, which is a kind of bush type of thing that in spring has delicate yellow flowers arrayed on more-or-less dark red new canes.  The rest of the year, these bushes are kind of background, but in spring they become sun-colored lace by the sides of the roads.  The forsythia is followed by daffodils, huge clumps of daffodils all blooming in a kind of yellow frenzy against the darker green of their leaves and stems.  Then the tulips pop out, bringing pink and purple (and, of course, more yellow) into the mix.  By this time, the trees have gotten the message and their new green leaves start to unfurl, making even an elderly dowager of a maple tree look like a girl again, quite giddy with the fun of dancing through the spring.  If I had lived closer to water (although in New York City, water is always closer than it is in Colorado, it seems), I would also have enjoyed the pussy willows (as we called them), the little paw-like catkins bursting out of the willow wands.  I saw them in the florists’ shops, though, and touching their softness was almost irresistible.

Lilacs (Photo credit: Wikipedia)

Soon after the robins and bluebirds arrived, other bushes and bulbs would spring forth, and the flowering trees would turn into sticks of cotton candy, cloudy with pink or white blooms.  Then, the most glorious of spring flowers would finish the show:  lilacs bloom earlier in New York than I ever remember from Colorado, and I love lilacs, their color and their scent, more than almost any other spring flower.  In my Bronx neighborhood, there were several older houses that had lilac bushes so huge they were more like trees, so filled with blossom that walking by them was a heady experience, the fragrance saturating my senses.  And so spring renewed a tired world, animals and people and flowers coming out of their winter funks, with even the spring rains feeling soft and warm and welcoming.

Here, it’s quieter, somehow.  The blooming plants seem to grow more closely to the ground and their blooms are not riotous in their color, at least not this time of year.  The mountains in Colorado have glorious wildflowers that array themselves in rich, paintbox colors, but those come later on, in June or July.  Now there’s the haze of green new growth that underlies last year’s dead stems, fuzzy buds on the aspen that will (hopefully after the last snow) break out into a green so delicate that even from a distance you can see the veins in the leaves, and there are the crocus (croci?) with their pale lavender and cream cups and soft green leaves.  Later, in early June, there will be the blue flag, a kind of native iris, which creates a haze of blue in the low-lying ground close to the reservoir and on the big meadows in the park (as I mentioned in an earlier post about how we in Estes Park talk, this means Rocky Mountain National Park, the best back yard in the world).

While all this greening and coloring is going on, the animals–and the people–start to put off winter coats and lethargy and begin making a big fuss about life again.  While I always love to watch the deer and elk (and, yes, even the bears from a safe distance and usually on the other side of a window), it is the tiny ones that fascinate, the chipmunks and ground squirrels.  Because they are fair game for predators (we are a wild place here in spite of all our cars and houses and electric lights), from bobcats to eagles, they move quick quick quick and then sit up and scan their surroundings as this one is doing:

RMNP rodent (Photo credit: Wikipedia)

Then, there are the birds darting through the air, building nests, finding new things to eat, flirting with the big folk.  Truly beautiful birds make Estes Park and the mountains their summer home.  While we may not have cardinals or orioles or purple martins as the East Coast does, we have Steller’s jays (blue shading into black, unlike the blue and white of the more standard jay), camp robbers (I can’t remember their actual name, this is what we call them up here, big birds in gray and white, utterly fearless), ravens and crows, chickadees, cedar waxwings, magpies, downy woodpeckers and our own wonderful blue, blue, bluebird, among many others.  They fill the air with song and their quick, darting flight.

And, later, in June, will come the flying jewels, the hummingbirds.  Almost everyone keeps bait around their houses, either the kinds of (usually red) flowers the hummingbirds adore or a hummingbird feeder.  They are enchanting to watch as they zip through the air or hover, with that distinctive sound they make, not quite the hum of their names, but not quite a buzz either.  They are quite territorial, and the battles between two of the tiny males are more furious and aerobatic than any other aerial combat.  They move so fast it is as if our eyes see where they were and not where they are.  Here in the mountains, they arrive at the very end of the spring renewal, and they delight us all summer long.

Finally, there are the big animals, the elk and deer that wander around all winter in scruffy coats, antlers lost, now sleeking up into their summer wear, growing new weapons covered in softest velvet, eating everything in sight.  And the bears come out of their dens in April (early this year, it seems), searching for food and frightening the populace (bears are not cuddly, not tame, and they are very dangerous).  While we see bobcats and coyotes all winter, the eagles and hawks seem to reappear in the spring, as do the Canada geese and the whistling swans.  They love our small lake here, a place to rest and find food during their travels.  So spring increases our populations of animals, and that burgeoning brings the tourists, another sign of spring.  If nothing else let us know it is nearly summer, the sudden inability to turn left would.  And so spring, bringing our senses back to life after our winter naps, leads into summer, the rich, fat season, filled with skies nearly purple in their blueness, leaves darkening into forest green, animals raising sleek babies, the joys of water and air and rocks, views and breeze and tiny, surprising lakes, rivers and summer thunderstorms.  And the memories of spring.

Spring’s pageant is ever new and ever the same.  It is, after all, the circle of life, and as necessary to our planet and our lives as the sun itself.  Perhaps it is intrinsic to spring that it be exhilarating, beautiful, warm, fuzzy, or perhaps that is just a bonus.  In any event, even here in our much shorter, quieter springtimes in the high mountains, our hearts and spirits lift with each chirp of a bird, each bursting forth of an aspen’s leaves, each bloom of a lilac.

Hummingbirds in Combat (Photo credit: Wikipedia)

A Mountain Bluebird

Monday, Monday

Rainy Day at a Dam in Australia. (Photo credit: Wikipedia)

Before the Flood (in my case, two of them, the Big Thompson Flood and the Lawn Lake Flood), I used to love a rock group known as “The Mamas and the Papas”.  They had wonderful voices, quite lyrical, and a rich style, unique for the time, sounding much more polished than many other contemporary groups, which, following “The Rolling Stones”, preferred a rougher edge.  One of The Mamas and the Papas’ earliest hits (after “California Dreamin’”) was a song entitled “Monday, Monday”.  It was a very ‘Monday’ song, about treachery and betrayal.  It pointed out that the day might leave, but the Monday feeling hung around.  I’m not sure why Mondays feel that way, but this one sure does.  I was confident that when I stopped working at the day job, Monday would once again be the bright start of the week, not its nadir.  But some Mondays just are nadirs, and that’s all there is to it.

Monday (Photo credit: Eric M Martin)

Sometimes nothing works except Tuesday, but there are a few things a person can do.  Running away to Australia (where it already is Tuesday) is probably not an option for most of us; it isn’t for me.  But writing is always an option.  For instance, I didn’t know when I started this post that it would end up being about writing, or actually about anything having to do with getting over a Monday.  I thought it was just going to be a complaint, about weather and not enough sleep and having to run errands and do chores and pay bills, to say nothing of political emails that I will truly say nothing of, but that I’m very tired of getting.  Instead, my thoughts turn to the psychological benefits, not to mention the artistic benefits, of writing out one’s less than stellar or chirpy moods.  That’s what I’m in the process of doing, after all, and it’s working.

Whether in a blog or simply a private journal, writing about what you’re (I’m) feeling helps in a number of ways:  First, for me at least, it helps me figure out what I am feeling, and often I’m not sure.  I may have just a case of the blahs, a kind of “itch,” as Connie Willis so beautifully puts it, a kind of existential angst that can afflict anyone.  But writing about it can often pinpoint what is really going on.  In my case, today, take not enough sleep, add a meeting I’m not sure I’m ready for, stir in just a bit of waiting for an email and then getting an email (not even remotely the one I’m waiting for) that seems to come from the bowels of political nastiness, and you’ve got that Monday feeling.  Or at least I do.  But there have been times when the bad mood went a lot deeper, and writing made it possible for me, eventually, to see what really was bugging me.

Second, at times writing about the problem can help you (me) find a solution.  Sometimes, of course, the solution is just to stop feeling sorry for yourself and get on with life.  At other times, when the problem goes deeper, the mind is searching underneath consciousness for a solution, and writing, especially the kind of writing you can do on a computer in a journal, can help you get out of your own way so you can see what you need to do or feel or be to resolve the issue.  Here, the trick is to simply write, without the little critic we all have living in our heads yammering away about the quality of what you’re doing (somehow the little critic never seems to think that the quality is good, darn it).  It’s hard to shut him or her up, but it is possible.  Just keep writing, let the words come out, no matter how silly or self-serving or mindless they seem.  Eventually, your mind will settle down to the hard work of letting you know what’s wrong.  It’s kind of like therapy, only using touch-typing instead of psychoanalysis.

Third (one of the smaller tricks of writing is to realize that the brain likes things in threes, so when you provide, in an essay, lists of options or whatnot, make sure there are at least three of them), when the time comes, and it inevitably does, that the solution does not appear right then and there, save what you’ve written and let it sit.  Put it away and come back to it on another day, preferably not a Monday.  Not only might you realize there is a resolution somewhere in that storm of words that you didn’t see before, but also you might have an essay, a blog, or a part of a greater work just sitting there waiting for you to refine it.  A double blessing.  And even if the solution isn’t forthcoming, you will probably come to realize that it was just Monday, after all, and things are better simply because life is change.  (Plus, you might still have a usable piece of writing!)

Like the song, “Monday, Monday.”  I have the feeling that composing that song took away the writer’s blues.  And even if it didn’t, he got a great song out of his dreary Monday.

The Mamas and the Papas Deliver (Photo credit: Wikipedia)