Memorial Day

Memorial Day Commemoration 2008

Tomorrow will be the day Memorial Day is celebrated in the United States of America.  The date, which moved around a bit while the holiday was known as Decoration Day, was fixed at May 30 for many years, but in the last couple of decades it was changed to the last Monday in May, thus allowing a three-day weekend for most working people.  More and more, it seems that the holiday marks the beginning of summer rather than doing what it was intended to do:  memorialize, remember and gratefully thank the more than a million men and women who have lost their lives serving in our country’s armed forces.

In these days, after more than ten solid years of war, more families have ties to the military, it seems.  And more, sadly, have lost relatives and friends and loves to war.  (I seem to be using the word “more” a great deal in this post, but that’s appropriate.  For all that the spiritual says “we ain’t gonna study war no more”, it seems that war will always be with us, a larger and larger specter on the horizon, a horrible biological marker we can’t seem to rid ourselves of.  As a character said in the movie Gladiator, “there will always be somebody left to fight.”)

Memorial Day got its start during and after the Civil War, when Southern families would decorate the graves of their fallen soldiers with flags and flowers, usually in May, although the actual date varied from state to state.  Later on, it became a country-wide holiday, marking not just the Civil War, but all wars in which United States military personnel have fought and died.  Unlike such holidays as Veterans Day (which is the current name for the holiday marking the end of the First World War and which now serves to mark the service of Armed Forces members in all wars, whether they were killed or not), Memorial Day is a day to thank those who made the ultimate sacrifice of their lives in service to their country.

American Civil War Graves (Photo credit: smilla4)

I was a young woman during the war of Viet Nam, a time during which military service got itself tarred with the brush of the political decisions made regarding that war.  My memories of those days are still bitter, with people greeting returning soldiers, alive and dead, with scorn and sometimes worse.  Memorial Day seemed rather to mock our nation’s blistering discord over that war that played itself out in so many ways against the soldiers who had very little choice about fighting.  It is probably not remembered now as much, but during those days, there was a draft, and soldiers went to Viet Nam because they had to, not because they volunteered.  Most of them served honorably, although it sometimes seemed our media deliberately chose to find stories of dishonor.  I wish to make this clear:  I did not approve of the reasons for our military involvement in Viet Nam and was opposed to its continuation.  However, both as the child of a man who had retired from the Army as a Lieutenant Colonel and as a young adult who could not figure out why anybody, let alone members of the same generation, should vilify soldiers conscripted into service, I was appalled at the treatment of veterans and the dismissal of those who died in that jungle.

Since then, how many wars have there been?  Police actions, some were called.  Other names were used for other military ventures, because there is a law that while the American President can deploy troops, only Congress can declare war.  And now, for the first time in how many years, we are fighting only (ONLY!) one war, the one in Afghanistan, a country that has so far absorbed and spit back every foreign military venture taking place within its borders, from Britain to Russia to us.  Only one war.  I don’t remember who said it, but I clearly remember some wise person saying that there is always a war somewhere.

Wars are the big things that history books love to talk about, the movement of troops, the decisions of generals, the pageantry of it all.  Meanwhile, there are individual human beings, not toy soldiers, on those wrecked fields, driving those caissons and tanks, getting killed or, as is so very common in today’s modern war, getting severely injured in either or both brain and body.  Individual human beings.  People who could have sweethearts, wives, husbands, girlfriends, boyfriends, sisters, brothers, mothers, fathers, children of their own.  Lives.  Once such people were openly called “cannon fodder.”  It was always an ironic, grim designation, because it was so awfully true.

Quartermaster Corps branch insignia (Photo credit: Wikipedia)

My father, as I said above, was in the Army.  He joined when he was very young.  Since he didn’t have a birth certificate, he was able to lie about his age.  He worked his way up to sergeant in the Quartermaster Corps.  This division of the United States Army handled supply and infrastructure, and my father started his military life as an army cook (or more probably he worked up to being an army cook by peeling potatoes like every other soldier).  When World War II broke out, he was stationed in Panama.  The Army had to vastly increase its size to meet the demands of WWII, and so required many more officers than it had needed since the Civil War.  It instituted an Officer Candidate School, to which my father applied and was accepted.  He graduated as a Second Lieutenant and remained in the Quartermaster Corps, achieving the rank of Major while on active duty in what was then Persia (now Iran and parts of Iraq), supervising the construction of the Red Ball Express across Persia to transport materiel to Russia, then our ally.  (Note:  the Wikipedia entry for “Red Ball Express” discusses only the convoys through France, but my father told me himself that the same designation was used for this highway through Iraq and Iran.)  He was wounded in the leg during an attack by Bedouins, thus managing to achieve the difficult task of getting wounded in furtherance of the war but not in battle.  He also managed to predate by some sixty-odd years the exact experience of our troops in today’s war in Iraq, only now the Bedouins are called insurgents.  He recovered, but his leg was never right again, and he was rotated back to the United States, where he served at Camp McCoy in Wisconsin and was retired in 1947, with the usual bump in rank and with full medical disability.  He died, partially as a consequence of that wound incurred in Persia, in 1960 and is buried, as was his fond wish, in Arlington National Cemetery.

My brother, David, served in the Korean War (ahem, police action), in the Navy.  He was deployed on a ship that patrolled the Atlantic Ocean from Cuba up to Nova Scotia, I believe, and was delighted that his service was routine, boring and without much incident.  He served three years, I believe, and then went to college and an eventual working life as a systems analyst, creating computer programs, sometimes for the United States military, thus coming full circle.  I have a picture, which I unfortunately cannot find or I would post it here, showing him grinning in his sailor uniform.  My brother enjoys good health and a good life in California and long may he do so.

My husband, when I married, had retired from a career in the United States Air Force, in which he served as a surgeon.  At one time, he was one of the doctors retrieving our astronauts (in the Gemini Program, I believe).  His second-to-last assignment was as Command Surgeon at the Air Force Academy.  He retired as a full Colonel and moved to our small town of Estes Park, where he acted as Chief Surgeon of the Estes Park Medical Center.  He and I divorced (while living in California) in 1985 and he returned to Louisiana, his home state.  He died there, the victim of esophageal cancer, in the late eighties.

I am so very fortunate that my loved ones, those I personally met and knew, did not die in battle.  I have ancestors, as have we all, who did die in one or another battle, in this country or another one, fighting some war or another that now exists only in history books.  I have a direct ancestor on my mother’s side who fought in the Revolutionary War as a member of a militia in Virginia.  (He survived it.)  It is that relative that allows me to claim that I am related to Davy Crockett.  I can only assume that some great-great grandfathers or uncles of mine fought in the Civil War and probably in all other wars we have fought as a country.  My stepfather served in the 22nd Engineers Corps in France in World War I (where he lost his hearing as a result of mustard gas, or so I was told).  I would imagine that nobody reading this could actually say that no family member, no ancestor, ever died in a war, ever fought in a war.  It is an ugly, continuing part of the human condition.  I would not suggest that wars are never justified.  I’m not a pacifist (I try not to be any type of “ist”, actually).  Certainly, in my reading, it seems very much the truth that World War II had to be fought, for reasons that became totally clear only after the war ended.  And it also seems that the Civil War could not have been avoided and, although it was not actually the reason it was fought, the end of slavery that resulted from the Civil War is an unreservedly good thing (although we didn’t handle the peace arising from the Civil War with any kind of grace).  Other wars seem to me to be fought for reasons that do not rise to the term “necessary” or “good”.  But then I am not as informed as I perhaps should be.  I will not pronounce on our current and immediately past wars, because they are too close for there to be any judgment, at least on my part.

But they are fought by individual human beings.  And on Memorial Day, we honor those who have died in battle, those who have put their own bodies between their beloved home and war.  Our troops have been (along with two very big oceans) a primary reason that these ugly wars have not been fought on American soil since the Civil War–at least until terrorism changed the face of war culminating (so far) in the attack on and destruction of the World Trade Center.  Those women and men who have died in battle have saved countless civilian lives as well as the lives of fellow soldiers.  It is too much to ask of any human being, but that we survive as a country today has much to do with their service.

In my small town of Estes Park, there is a lawn in front of the Public Library on Elkhorn Avenue.  Currently, in honor of Memorial Day, the library staff has “planted” hundreds of tiny American flags in this lawn.  It is beautiful and it brings tears to the eye.  I would rather see the boys and girls themselves, all alive and spiffed up in their dress uniforms, than flags, but seeing the flags helps me remember, helps remind me in the midst of the barbecue and the Memorial Day sales, to silently thank them all.

Sign posted along the Red Ball route (Photo credit: Wikipedia)

Rainy Sunday

Rain and mist.  My little mountain town looks as if it should be situated on a loch in Scotland.  All we need is a castle and some heather.  Instead we have high mountains we can’t see today, lots of pines, budding aspen and the start of a summer of wildflowers, small darting creatures and lots of elk.  Here’s a picture of a sleek gentleman in velvet wading in one of Estes Park’s rivers on a sunnier day.  The picture is courtesy of Roxy Whalley, and many more of her wonderful photographs can be found at Images By Roxy and A Picture A Day 2012.  Right now, and of course I can’t remember what I did with my camera, there are a number of elk lying down in the wet grass across the street, only their ears and growing antlers visible to me.  Sometimes I need to remember and express my gratitude for being able to live here, because it is a gift.

05-12-12 - Wapiti in the River

On such a rainy Sunday, while feeling grateful for my blessings and sending up (around? through? wherever!) my thanks to the Author, I’m reminded that among those blessings and thanks are the choices I am given to make and the results of the choices I have made.  Sometimes, I think we all–I know I do–can feel coerced into the lives we’re living, caught somehow by circumstance or fate or some kind of determinism.  Why am I here?  We ask this, and we’re not (as the joke would have it) trying to figure out why we’re standing in the laundry room (in this joke, we’re looking for our glasses).  But of course, what we’re really looking for is either purpose or at least an explanation.

There are many sources from religions to philosophies to governments to mothers to science to (probably the wisest) comedians to tell us what our purpose is, explaining why we’re here.  We all know what they are and each of us has already or needs soon to come to terms with how much those explanations personally resonate.  But in a (very) superficial survey, I would state that reasons given for the existence and ultimate purpose of human beings, of life on this planet, of this planet’s own existence, of the existence of the universe, range from (a) utter determinism and predestination to (z) (or maybe zzzzzzz) mere chance.  Somewhere in the middle of that vast spectrum you will find my own microscopic dot, I’m sure.

choose determinism (Photo credit: alyceobvious)

But today, I keep thinking that for every situation, place, mess, glory or whatnot in which I’ve been plopped down, there is a spectrum ranging between a deterministic explanation and a free-will explanation.  For example, why did I move back to Estes Park after a life in Los Angeles and New York City?  Why Estes Park?  Well, my mother and I moved here after my father died because they had stored their furniture in Colorado, Daddy had loved it here, and Mama found a house for cheaper rent in Estes Park than she could in any of the front range towns (small towns then; they’re small cities now).  How to parse that decision in terms of Choice, of Chance, of Determinism?  The Universe or God providing a path?  There is no real way to know.  Mama was too much under the survival and grief gun to ponder any of that.  She just wanted a roof, a job and a safe place for herself and her (sullen and hopefully temporarily unhappy) daughter.  So by default Estes Park became home, the place I knew, the refuge when things went bad, the place to escape from when the rest of the world (any part of it) looked better than a mountain valley with few jobs and no prospects.  No matter how beautiful it was.

But it isn’t just that Estes was and is home and I’ve always been homesick for the mountains.   True, when things went bad in my life (which happened a lot, but then that happens a lot to everybody), I’d think about Estes Park as home and want to go there to lick my wounds.  When things got better (which doesn’t inevitably happen for anybody, but which does take place more often than we notice, I think), Estes Park would once again become a nice place to visit.  Then, due to a weird confluence of strange events, I got older.  And due to an even weirder confluence of even stranger events, while I didn’t get rich, or even “comfortable”, as they say, I did manage to inherit, work for and save (saving being, alas, the least of it) enough not to fret over job prospects in a small mountain town.  Because while Estes Park is a hard place to live when you have to earn a living, if you can retire there on even a semi-pittance, Estes Park is a lovely place to live, filled with beauty, friends and important things to do.  So it became a choice once more open to me both in practical and in emotional terms.

But there were other forces.  Chance?  Determinism?  I don’t know.  When I moved to New York City, that choice was mine, but it was influenced by events in my life in Los Angeles that could very well be the universe nudging me toward a specific outcome, or which could have been pure chance onto which I imposed some kind of meaning.  This, by the way, is a very old human sport, engaged in because our brains are hard-wired to form patterns.  Scientists believe that this wiring came about to allow us to pick out the pretty fruit against the background of green leaves.  But now, the pattern-formation wiring in our heads also will form patterns of behavior, of activities in the world, in an attempt to find the fruit of meaning against the background of noise.  In any event, the patterns I saw I interpreted in terms of the choice I wanted to make and I moved to New York City.

New York City (Photo credit: kaysha)

And loved it.  And would be there still were it not for some new patterns forming against the noise.  Patterns of economic disaster for all, physical problems for me, and the combination of isolation and loneliness these patterns (and some iffy choices on my part) created.  (Friends in Manhattan moved to Jersey, I stopped working because of my health, my health kept me at home in my Bronx co-op which was very far from anywhere I wanted to be, etc., etc., etc.)  And I gradually came to the realization that I could no longer be there in my co-op in the Bronx.  Since Manhattan was financially out of the question, where was I to be?  And was it simply my choice to stick a pin in a map?  Or was there a pattern?

Chance?  Determinism?  Choice?

Looking back makes it a lot easier to see the combinations.  While we’re in a situation, it is very hard to distinguish what parts of the decisions we make are free choice, reaction to random chance, or possibly the influence and caring of a superior entity.  Do I see the pretty fruit because it just happens to be there?  Do the patterns in the foliage lead me to it?  Or is it all noise and background, and am I making up the pretty fruit I was trying to find?

But I came home, using as much single-minded effort to do so as I had used to move to New York.  And while I still miss Manhattan, I am glad I did.  Here is a very good place.  Whether I’m supposed to be here because some Force in the universe wills it and I am merely a pawn being moved, or whether I’m here because I am as much a maker of my life patterns as I am the one who discerns them, or whether I’m here purely out of rational choice and completely by chance, I don’t yet know.  Perhaps it is some unique combination of them all.

Estes Park in Rocky Mountains, Colorado. (Photo credit: Wikipedia)

Right now, I’m not on the downward spiral of a bad decision or a bad place where I’m hunting desperately for someone, something, more wise and powerful than I am, to tell me what to do and assure me that it will all come out all right.  On the other hand, I kind of miss those times in my life when my desire for an outcome, my determination to make something happen, would overcome all chance, all determinism in the world. All gates were open, all systems were go, all circumstances in the world seemed to coalesce, serendipitously, into a green light which would sustain me until the project was complete, or close enough to complete so that clenching my teeth and soldiering on would make it so.

Today, I’m pondering my choices and my chances, rather like the elk in the stream in Roxy’s picture.  That elk is there because the original species indigenous to Estes Park was wiped out by hunting and another species was brought down from Wyoming to repopulate the National Park.  So, to what extent can we look at this particular elk and see the determinism of the universe and of human beings to place his kind in Estes Park?  To what extent does that particular elk’s individual health and luck (the chance of his life) play a part in our seeing him in that river at that time and place and date?  And to what extent is he in that river because he just thought, what the hell, it’s easier to drink the water if I’m already in it?  Determinism.  Chance.  Choice.

What brand-new combination will come to me next, as it does to that elk?  What will move me on, whether metaphorically or (less likely at this point) actually, to another place, another goal, another purpose?  While I came back to this beautiful place, this genuine home, to retire, to be still, to do small things and perhaps finally do them a bit better, and I hope that continues, it seems I’m not done with dreaming or hoping, either.  Or wondering if the Author, as I mentioned above, just might have something more for me to do and in just what way that will manifest to me.  As a choice?  As a chance?  As a destiny?

Meanwhile, on this rainy Sunday, I plan to make a small destiny of looking outside at the lovely misty mountains, feeling the stroke of the rain on my skin, seeing if the elk have (entirely their choice, I hope) left the meadow below the road to find some other place to bed down this night, and opening myself to patterns, to the fruit against the leaves, the intricate winding dance of chance and choice and determinism, to see what that dance creates for me next.

An Exploration of Gluttony

Jacques Callot, The Seven Deadly Sins - Gluttony (Photo credit: Wikipedia)

As I have done twice before (with Sloth and with Envy), today I plan to explore the Deadly Sin of Gluttony.  Partially because this is one of the few Deadly Sins I remember without having to look them up, and partially because this was the sin above all others (with the possible exception of the array of “sins” surrounding sex) about which my mother commented to me–at length.  So here we need to look at another sin of the flesh.  Let’s explore Gluttony.  Simply put, the sin of Gluttony is overindulgence in food and drink and the obsession, the preoccupation, with sustenance as pleasure.  The wariness most human institutions display regarding pleasure of any kind is a topic for another day, but suffice it for this essay to say that it seems likely that all creatures that eat and drink feel pleasure when doing so.  It seems to be the body’s way of encouraging a creature to leave the safety of a nest or hiding place to go and find what’s needed for continued life.  At its most basic, life itself, continued existence, requires food and at least water, at more irregular intervals but just as absolutely as life requires breathing (of some kind).  And all living bodies reward the finding and ingesting of food and water with pleasure.  And therein, in human cultures, lies the rub.

Now, any human need can and will be abused, can and will be twisted, can and will be taken too far in any possible direction.  Food can have an emotional component much more complex than the simple “food is necessary and therefore pleasurable” equation explored in the above paragraph.  For some people, food was made equivalent to love by a parent or, for some others, food was the only pleasure possible.  These motivations simply scratch the surface.  We all know that humans can become addicted to their pleasures, whether of alcohol, drugs, food, sex, gambling, even shopping according to some experts.  This has something to do with brain chemistry, something to do with filling up an empty space emotionally, something to do with early and inexplicable losses.  Now such things are, at least in the scientific community, a matter of the physical self, not a matter of will.  But to most of us, and most especially to the accepted dogma of our religions, addiction is still equated with lack of discipline, and all self-feeding behavior with a turning away from the hope of salvation.   Thus, sin.  And, with food and drink, the sin is Gluttony.

Gluttony will always be perceived as the pig in the trough, slopping through enormous amounts of food with no real appreciation in the smallest possible amount of time.  Which, regarding the pig, of course, is a bit of a canard.  In real life, rather than in legend and moral tales, pigs apparently don’t eat either that sloppily or that hugely.  And, since their function for humans is to put on a lot of weight in order to provide a lot of meat and lard, the somewhat unpleasant concept of force-feeding enters the picture, thus making the pig not to blame for his, well, piggishness.

"Whatever it is, we'll eat it!" Young pigs enclosed next to the footpath near Monkhall. (Photo credit: Wikipedia)

My own definition of Gluttony as either a sin or as a behavioral problem to resolve probably has to do with occasions when I might eat beyond fullness, when I, in essence, force-feed myself.  When the desire-reward cycle gets out of whack and all I want is more, more, more!  When all thought of sustenance, moderation, health, diet, anything, even pleasure, is lost in the perpetual motion machine my fork becomes.  Did I ever do this?  I remember younger times when I would go out in the evening after a full dinner and still order a hamburger and fries and a malt and eat every bit.  I remember Thanksgivings and buffets in hotels and all-you-can-eat pizza parties.  Is this sin?  Is this Gluttony with a capital “G”?  Remembering those times now, it’s not hard to think so.  And yet, such occasions were relatively rare.  Besides, I have never binged in the classic sense of eating an entire box of cookies or a gallon of ice cream all alone.  That would make me physically ill.  And while I (in spite of all the pounds I’ve lost) still have many more to lose, by and large those pounds were not gained from binging, from force-feeding, from Thanksgiving dinner and Vegas buffets.  The too, too solid flesh still clinging to me came from eating perhaps just a bit more than I should have and exercising a lot less than I should have (for my own benefit, not the desires or approval of others) over a lifetime, not from any specific banquet where Gail kept going back for more Beef Stroganoff, don’t hold the noodles.  It’s hard to find that sinful.  But then, as with many sins, it’s much easier to identify the Glutton, the Slothful, the Envious, the Vain, in others rather than one’s own self.

Edward Curtis photo of a Kwakwaka'wakw potlatch with dancers and singers, 1914. (Photo credit: Wikipedia)

And why is Gluttony a sin in any event?  As in my other essays on this overall topic, let’s look at the advantages to civilization if Gluttony is considered a sin.  First, in our hunter-gatherer days, Gluttony would have been the mark of selfishness.  To cram into one’s own mouth what should have been shared with the tribe for everybody’s well-being would have been considered a mark of wrongness from the very start of the human race.  Just as sharing all one has, especially all the food one has, is considered a mark of favor, a signal of a generous human soul, hoarding to keep all that necessary food to oneself is a very bad thing.  These concepts still operate today.  In times of disaster, hoarding items needed by the community is still considered wrong, even when the hoarder has merely been provident in advance of the potential disaster.  And generosity is considered such a good thing in human society that, for example, certain tribes of the First People of the Pacific Northwest created the Potlatch, in which a chief would give away all he had as a signal both of his wealth and of his goodness, hoping only that it would be incumbent upon those to whom he gave to then themselves invite him to a Potlatch in which he would, essentially, recoup all and more of what he gave away.  In my own Christian tradition, there is the concept of casting the bread upon the waters and it will be returned a hundredfold.  (Often, the implication for individuals is that you cast your bread on the waters in order that it be returned a hundredfold, when that is not what was meant by the proverb.)  Obviously, this is a sophisticated concept, this idea of generosity with food; however, there is no complexity to the concept of hoarding, hiding, keeping it to oneself.  That is considered nasty the world over.

Cities of the ancient Near East (Photo credit: Wikipedia)

But the signal reason for Gluttony becoming a sin, in my judgment, was the beginning and continuation of agriculture and thus private property.  To me, this still was a huge mistake on the part of humanity, as I’ve noted before in my essays discussing Sloth and Envy.  In agricultural societies, the “Big Man” (this is a current theory in anthropology, which attempts to explain the eventual rise of kings) would be a person able to talk others into helping him sow, cultivate and harvest his crops.  In return for giving them a (small) percentage, he would store the rest against the bad times of year or against drouth or disaster.  Thus he became wealthy and powerful.  And of course, his motivation would be, basically, Gluttony–the desire to have more than enough to eat and drink.

As time went on, the piling up of extra food through cultivation of crops required the invention of, well, infrastructure.  A man spending all his time cultivating crops or even supervising others doing so would not have time to hunt for meat, to grub out clay for pottery, to make mud bricks to build storage units or a home, would not have time or the resources in his own self to protect all he gleaned from those who would steal it.  And so specialists had to be born into the world who, in return for a percentage of his crop, would make the pottery, graze the goats or sheep, shear them, weave the fleece into cloth, build the mud brick buildings, create a military to protect the community, and, and this is by no means the least of the important specializations, build up temples manned (or womaned) by priests to please the gods.  All these things were paid for by a percentage of the crops.  A gluttonous Big Man, one who hoarded, who waxed in fatness, who ate too much himself, and allowed his family and his servants to eat too much, kept that percentage small and thus the priests, the king, the military men, even the goatherds and builders and potters and whatnot, all those who did other things than raise plants and animals, would not have enough to continue their tasks, let alone have any chance of indulging in gluttony themselves.  And so the civilization prospered because of a balance between the Gluttony required to build and the Gluttony that would destroy.  (The map to the right of the start of this paragraph shows one of the earliest of such civilizations, Mesopotamia.)

But while Gluttony that results in creation would be a good thing for the Big Men (and their families) and the resulting kings, priests, generals and whatnot, Gluttony could not be allowed to pervade all the people in the society.  It could not be allowed to exist even among “important” people other than the king, priests, etc., because then such people would hoard the king’s share, the priest’s share.  And the king and the priest, just as the potter and the herder, did not sow, cultivate or reap their own food.  They subsisted entirely upon the labors of the farmers.  No, Gluttony had to become a sin, had to be punished, guarded against.  From the farmer who stored the surplus grain and paid his taxes out of it, to the lowliest worker on his worst farm who would be punished, sometimes with death, for hiding any part of the harvest, everyone in the society needed to feel that such hoarding, over-consumption–Gluttony–was a great sin and would be horrifically punished in this world or the next.  Of course, in the meantime, the king, the general, the priest, the Big Men, did hoard food, did engage in banquets in which food was consumed to a point where rooms had to be set aside to allow for vomiting so that a guest could start all over again, did waste food and lay waste to the countryside in order to get it.  Anyway.

But then, a sin does not rise to the status of a Deadly Sin if it isn’t a popular failing.

Thus, naming Gluttony a sin is a good thing for the civilization as it is normally constituted.  But is there anything that can be said in Gluttony’s favor, as we have done with Envy and Sloth?  Well, obviously, Gluttony underlies the will to store food against future disaster, which begins the whole cycle leading to diversification of labor and thus eventually to a civilization.  Plus, to be fair, the immense variety of foodstuffs, the variable flavors of food, would probably not be a part of our world without Gluttony.  Gluttony serves as the undoubted basis for the migration of human groups throughout the world, the underlying cause of human beings taking over the planet, the reason we cook and prepare food instead of simply gnawing at bones and leaves, the impetus behind the development of domestic strains of grain, other kinds of plants, and animals.  Without human intervention, for example, there would be no cattle.  No, none at all.  We developed cattle from wild ruminants, but apart from yaks and water buffalo, what we know of as cattle are quite different from their wild kin and would not be able to survive without continued care (or exploitation) by humans.  Gluttony led to population growth.  Gluttony led to appreciation of the finer things in life, well-cooked and delicious food, wine, beer, ale, mead and liquor.  And, in a sense, Gluttony can be extended to the consumption not just of food and drink, but also of art, of fashion, of performance, of architecture, landscape, beauty.  Without Gluttony, life would be less interesting, less satisfying, less civilized in its more refined sense.

But all the things I’m listing in the above paragraph are in themselves an indictment of Gluttony.  Without it and what it creates, we would be a small population living with the Earth instead of exploiting and destroying it.  Without the wealth that resulted from Gluttony, there would be no civilization so attractive to outside human groups that war was invented and used (and still is) to overcome that civilization and get all that wealth for themselves.  Not that we would recognize such a world.  For most of us, probably, we would not love such a world, no matter how much healthier we and the world would be.

As for me, with my love of good food, of cooking, of creating intriguing new dishes from delicious ingredients, I am definitely part of the problem.  Because I have to breathe to live, I prefer my oxygen to come from pure mountain air instead of the stuff on a subway platform.  Because I must drink water to live, I prefer it unpolluted and fresh, rather than skimmed from a puddle.  And thus do I want my food to be good, fresh, healthy and delicious.  And plentiful.  I have to eat to live.  I try not to live to eat.  But, much like Lust, which I may explore in a future essay, Gluttony is never satisfied.  There is never enough.  If we indulge in all that we want, we will want more tomorrow and the day after that.  Like all the Deadly Sins, without feeling their drive, we would not be human, would not have the world we know.  But their drive is not self-limiting.  Unless we find a way to rein in Gluttony, in particular, we will gnaw the planet bare.  And that’s not good for anybody.

Flanders, Netherlands

Taste, Trends and Cowboy Boots

Painting "Herd Quitters"

Have you ever pondered the difference between what you are supposed to like and what you actually do like?  I’m not thinking, here, about the truly important stuff, such as sexual preference (which is almost certainly not a choice), or with whom you fall in love anyway (which is more like compulsion or madness).  This is the more surface stuff, more about still not liking tangerine even when it’s the “in” color (I say it’s orange and I say the hell with it) or (like Ed Wood of long-ago B-movie days) loving Angora shruggies whether they are fashionable or not (something I can’t wear in any case, because Angora itches).

Or, even more simply, what we are taught by our mothers (usually), local style mavens (often), and the media (all too often) to think of as stylish, trendy, fashionable, cool or just in good taste may not be what we, in our heart of hearts, really find pretty, attractive and delightful.  I remember in high school thinking that the pep club uniforms we had (slightly-above-the-knee purple box-pleated skirts with German lederhosen-style straps, worn over white button-down Oxford shirts and with white tennis shoes) were really good-looking.  I liked the quality of the wool flannel in the skirt, I liked the hidden stitching on the stitched-down portion of the box pleats, I liked the simplicity of the purple and white, the shirts and tennis shoes complementing the skirt.  I thought the tout ensemble of the whole (as a friend’s mother would put it) looked good on me.  And I did not dare say so.  All the comments I ever heard about this uniform were, ahem, uniformly negative.  It was considered clunky, even then (and, yes, this was a long time ago); it was considered dowdy and totally uncool.  Nobody liked it.  So I, in my 16-year-old wisdom, didn’t like it either.  But I really did.

This led to confusion over time, because I learned probably the opposite of what I should have learned.  I learned that I’d better trust others’ taste in preference to my own.  I learned that what I liked was kitschy, ordinary, dowdy (that word again) and that what I was supposed to like was all that was cool, trendy, attractive.  And so I tried to like it.

Black Western cowboy boots on a white background (Photo credit: Wikipedia)

For example.  I’ve lived virtually my whole life either in small towns and medium-sized cities in the West, or in Los Angeles or New York City.  And somewhere along the way, I fell in love with Western-style clothes.  Specifically, such items as cowboy boots, snap-buttoned shirts, and fringed leather jackets.  But for a long, long time I didn’t tell anybody that, because when I was growing up, to be a cowboy or to like such Western styles and ways of living was the most totally uncool thing you could do.  As I recall, the cool kids had truly unpleasant appellations for the cowboy kids, to which I will not give any credence by repeating them here.  And oh how I wanted to be a cool kid.  (I wasn’t, because I was an academic nerd, a term that had not been invented yet.  I liked most of my teachers and the challenge of learning stuff.  This is ALWAYS uncool in high school, at least in public high school.)  So I pretended to go along with the contempt (and it was true contempt, growing out of the bottomless pit of insecurity that a teenager lives with every day) for cowboys.

But way inside where I didn’t even look I really liked how they dressed.  And I couldn’t admit it.  Not even to myself.

A few years later, when I lived in Wyoming, where everybody was a cowboy (except for the cowgirls) and that was just background, not even a lifestyle choice (a term I don’t think anybody who lives in Wyoming understands or wants to understand), I went with a girlfriend to Cheyenne Frontier Days, one of the best rodeos going.  And that’s when I first really met up with, watched, and started to understand real working cowboys.  Rodeo cowboys, at least.  For them, wearing jeans bleached practically white with a round white patch in the back pocket where the chewing tobacco rubbed against the material, wearing tight shirts with snaps instead of buttons, and wearing, of course, and most iconically, the hat and the boots, didn’t have anything to do with style, with cool, with any sort of John Travolta post-modern irony.  It was simply the clothes you wore that were most practical for a physical, demanding way of life filled with hard work and not a lot of money.  You wore cowboy boots because if you rode, the pointed toes got your feet into the stirrups quickly without your having to look down, and the high heels kept your feet from slipping through the stirrups, so that you wouldn’t be caught and dragged if your horse threw you.  The hat?  Wide brim to keep off the brutal western sun, deep crown you could use to water yourself or your horse.  Jeans because they don’t wear out and you don’t have enough money to buy lots of pants.  The tight shirt with snaps?  The tight part is to protect against brush and thorns that would catch on looser material.  I don’t actually have any guesses about the snaps.

But Western wear has always been stylin’, whether it was “in style” or not.  Snaps and complexly designed yokes and fringe and embroidery were a major part of the look of a Western shirt.  And, let’s face it, during the mid-years of last century, Western wear was one of the few ways a man was allowed to express his own taste for color, style, for actual pretty, in what he wore.  And still be the most macho dude around.

So, here were Pat (my friend) and I, wandering around “backstage” at Cheyenne Frontier Days.  And I mostly noticed that people who are very comfortable in their skins, in their choices, look like they belong in their cowboy clothes.  This is something that can be extended, of course, to any style of clothing.  Queen Elizabeth II looks quite comfortable in satin encrusted with embroidery and jewels, wearing her orders and sashes and necklaces and tiaras and crowns.  For her, it’s not a costume, it’s not “cool”.  It’s just her uniform for a certain part of her working life.  I also noticed that the real working cowboys, whether their work is ranching or rodeo, look so utterly, droolingly delicious in their jeans and boots and snap-buttoned shirts and hats that a mere female has a real hard time remembering that these men are not icons, they’re human beings, with undoubtedly human problems.  I’m not suggesting that a girl shouldn’t get interested in a cowboy (or vice versa), but somewhere between “they never say a word and they’re always hurt” and “my heroes have always been cowboys”, it’s probably best to find the cowboy who interests you more for his thoughts, his humor and his liking of you than because he can wear the hell out of a pair of tight jeans.  Just sayin’.

But they sure are fun to look at.

However, that congruence between what I really liked, what my taste genuinely was, and what was out there to like, what was okay to like, didn’t survive the end of the rodeo season.  For one thing, I moved away from Wyoming.  For another, it still wasn’t cool in Colorado to like cowboys.  Oh well.  Life went on.

Roy Rogers and Dale Evans at the 61st Academy Awards, March 29, 1989 (Photo credit: Alan Light, via Wikipedia)

And eventually I moved to Los Angeles.  LA is not part of the West, just so you know.  It may have been once, when Roy Rogers lived in the San Fernando Valley, but in the eighties and nineties when I lived there, LA was just too cool and trendy, too center-of-the-world, to give house room to the real life of the West.  But even there, there came around, as it does every few years in LA and New York City (but apparently nowhere else except every place in Texas), a fad for the cowboy look.  Oh, not for being a cowboy, just for looking like one, in a sort of deconstructed way.  And people who always seem to know what the next big thing is would rent a vacant lot or a parking lot and put out thousands of used cowboy jeans and Western shirts and/or thousands of pairs of used cowboy boots, and people would buy them and buy them and buy them.  I did too.  I got a pair of black lizard Frye boots with really high heels and really pointed toes for some impossibly small amount of money and loved them to pieces, even if they were a bit narrow for my fat little baby feet.  I’m still mad at myself for getting rid of them.  One of the reasons those vacant lots filled with used boots were even possible is that you can’t kill a good pair of Frye boots no matter what you do to them; they’ll outlast you (or at least your ability to walk in boots with really pointy toes and really high heels).

Of course, I wore them the trendy, LA way, NOT with jeans, snap-buttoned shirts and fringe, but with long swirly skirts that were in style and so, of course, not Western.  And, heavens above, not with a cowboy hat.  After all, you had to have some standards.  And the cognitive dissonance went on.  Because I really liked those cowboy boots and what I wanted in my hidden self was to wear them right, with a fringed leather skirt or with chaps and jeans, and (even though I get the worst hat head you ever saw) with a cowboy hat.  And no matter how completely un-trendy it was (and it was), I wanted a fringed leather jacket and turquoise jewelry (none of which I could afford).  I really wanted them.  And I kept quiet about it, because just saying it out loud would brand me as some kind of nerd, geek or whatnot, with no style at all.

Finally I moved back here to the West.  Oh, not for that reason.  And not without a very large detour to New York City where I discovered that while what’s in style rules on 5th Avenue, you can wear what you want and like what you want in the Village (at least, you can so long as it’s black).  Which helped me, finally, realize that it was okay, it really was, for me to like things (cats, Georgette Heyer novels, Sherlock Holmes, Frye boots, fringed leather jackets, Tex-Mex food, Arts and Crafts furniture, Victoriana, Fiestaware, and the American West) because I liked them.  Whether somebody else did, whether it was cool or trendy, mattered not in the slightest.

I started buying turquoise jewelry.  Not the really good stuff, I still can’t afford it, but I have a couple of pieces I wear almost all the time.  I have a fringed, embroidered, suede Western jacket.  I just bought a pair of cowboy style ankle boots with conchos on them (I can hardly wait to wear them with the new jeans four sizes smaller than I’ve worn for years).  I go to the Rooftop Rodeo here in town.  And I’m starting to not care whether or not the Western-style pieces I’m looking at (rugs, cushions, even furniture) are cool or trendy anywhere but in my mind and heart.

Even more, I’m realizing that it’s okay for me not to like stuff that is cool or trendy.  No more apologies that I’m just not a minimalist when somebody tells me that the best furniture is Mies van der Rohe.  I know who he is, his stuff is lean and gorgeous and simple, and I couldn’t live with it for a minute.  I’m finally learning that stating for the record that I don’t like modern furniture is not going to get me drummed out of the human race; it’ll just keep me from being invited to a house where there wouldn’t be a comfortable place to sit anyway.  So now I can admit out loud, darn it, that I really liked those high school pep club uniforms and that I don’t care if tangerine is this year’s best color, because it’s orange and I hate orange and always did.

"The Cow Boy"

Manners

Although I’m not sure I want to reveal this to the world, I read etiquette books for entertainment.  I have a collection of them, the earliest published in the 1870s, the latest Miss Manners’ new revision published in 2005.  This is part of my interest in history, because etiquette books help me understand not just how people actually behaved, but how they thought they should behave.  And sometimes, these old books give some form of insight into why.  Furthermore, reading a series of such books over time lets us know which human behaviors have actually changed, and which haven’t.

Judith Martin (aka Miss Manners) upon receipt of the 2005 National Humanities Medal (Photo credit: Wikipedia)

If books about the proper use of forks, the leaving of cards, and methods of introduction seem a strange source for any such insights, I can only suggest you try reading them.  Laws prohibiting something (running red lights, for example) are only promulgated if a lot of people are doing the prohibited action, and rules of etiquette behave the same way.  An etiquette book will only write that it is rude or boorish to use the tablecloth as a handkerchief if people are using tablecloths to wipe their noses (which is a really disgusting thought, isn’t it?).  And today’s etiquette books don’t even mention using the tablecloth in such a fashion because apparently the shame of it all finally changed the behavior.

On the other hand, every etiquette book I’ve ever read has long, long, long treatises on training children into the civilized pretense that they’re grateful you came to their birthday party, and that the present is a surprise, delighted in not because it’s the latest toy but because it was so thoughtfully given.  Apparently, human nature is not going to change that fundamentally.

Historically, the role of etiquette in human life apparently has been twofold:  the first, to make it possible for humans to live in social groups without decimating each other; and the second, to help us in the task of arranging our societies hierarchically (that is, to know who is on the same level as we are and to keep the arrivistes out).  But even more basic than that, humans do not have a built-in set of instincts or hard-wired behaviors to help us live in groups, as gazelles do, or dogs or even gorillas.  Moreover, we live in social groups much larger and more complicated than our DNA was designed to handle.  Even now, it is noticeably difficult for anthropologists to determine what social behaviors come “naturally” to humans, even those living in small groups.  So laws are necessary for us, and religious and moral systems, and etiquette.  All civilizations have systems of etiquette, just as they do laws and religions, and all are designed to, well, control human behavior.

So let’s look at the two reasons given above for the use of etiquette in our lives.  The first makes rules from the simplest (when walking up or down stairs, keep to the right) to the most complex (one leaves one of one’s own calling cards and two of one’s husband’s when making morning calls) in order to make living in groups of people larger than one’s family, well, easier.  If we all more or less keep to the right when climbing or descending stairs, we get to the subway platform faster (which allows us more time to wait for the subway, but no system is perfect).  The calling card issue, while out of date in today’s world, and too complex even when it wasn’t out of date, does have a logical basis.  Married women (with their marriageable daughters) made the “morning” calls (always made in the afternoon).  They left one of their own cards, sometimes with a daughter’s name penciled on it, for the lady of the house to keep; they left one of their husband’s cards (it was assumed he had far more interesting (or at least less boring) things to do with his time than make morning calls) for the lady of the house, and one of their husband’s cards for the gentleman of the house (it was considered rude for a lady to leave a card for a gentleman, for the obvious but never overtly mentioned reason that the only possible relationship between a lady and a gentleman not of her own family was, ahem, romantic).  Thus, the husband was taking part in social life without being bothered (which was what he probably had in mind) and all the recipients had bits of pasteboard with names and addresses on them from which to make up their invitation lists.

In High-Change in Bond Street (1796), James Gillray caricatured the lack of courtesy on Bond Street, which was a grand fashionable milieu at the time. (Photo credit: Wikipedia)

All of this sounds so arcane and ridiculous to us, doesn’t it?  But it has its present day analogues.  Morning calls became incredibly elaborate, but their original function was to stay in touch with those one wanted to remain friends with, become friends with, or social climb to be friends with.  Today, we use Facebook, Twitter, email, texting, even the incredibly outdated telephone call, to do the same thing.  All the technology we use in our iPhones, iPads and computers today was first created in order to facilitate and ease the human need to stay connected with friends, acquaintances, and whatnot.  It is hard to even imagine today how isolated a family would be in its own house before the invention of the telephone.  What other way than getting out of one’s own house, walking or riding or taking a cart or coach to somebody else’s house and then physically “calling” upon them would there be to maintain one’s friendships before the telephone?  When “calling” first started, even writing a letter was a major issue (some very fine people couldn’t write, some postal systems were dreadful, and postage costs were very high–in eras when a penny bought a loaf of bread, to send a letter cost a penny or more).  And, by the way, it was called a “morning” call because until the 1820s or so, “morning” was all day from arising until dinner–people didn’t usually eat lunch, and “afternoon” as a concept didn’t really get started until the 19th Century.  (This is reflected even in our Bible, where in Genesis the narrator says “and the evening and the morning were the first day.”)

I’ve gotten off track.  My point (and, to quote Ellen DeGeneres, I do have one) is that, however simple or elaborate, however common-sensical or ridiculous, the system of etiquette in general and most of its rules in particular are designed both to ease and to codify the way we humans behave in groups.  Etiquette is designed to supplement law and morality and to handle those small items of human contact that don’t rise to “thou shalt not kill.”  Rather, they remain on the level of “one just simply does not spit on the sidewalk.”  (This, by the way, is a rule that I wish were more honored in the observance than the breach.)

The second use of etiquette is or can be considerably less benign.  Humans, no matter how right or wrong each considers the concept, live in hierarchies.  Even in the most liberal and free of countries, there are hierarchies, some more or less codified, some simply feeling “natural.”  The hierarchies in some countries seem cruelly limiting and immovable to our eyes, those in others may seem so nearly invisible that the country approaches anarchy, but they are always there except in the simplest hunter-gatherer groups (where the principle of hierarchy is anathema and any attempt by any tribe member to behave exceptionally in order to get exceptional treatment is shamed).  Part of this makes sense, as it did to Samuel Johnson, who said (testily, as he said most things) that the idea that the highest ranking person went through a door first was not snobbish but merely practical, designed to get the show on the road (that is a very loose paraphrase of what he said, by the way).  He does have a point.  Although I would say most of us in the United States would not agree with it in principle, there does need to be an order to things.  In the past, in terms of etiquette, the order was often from the top down.  What made it unfair to our eyes was that the people who did the ordering were almost always those sitting at the top or at least those who could convince others they were sitting at the top.  Usually this was not done, at least originally, in any mannerly fashion but by simple force of arms.  After that, of course, the hierarchy was ordained by God and that was all there was to it.  Many of us in the modern era find this rather suspicious, especially given the words of our several religious heritages, most of which state that the humble are quite as important as the, well, important.

Be that as it may, humans were never very good at accepting the idea of a hierarchy unless that human was at the top or could reach it, and so etiquette began to perform a double function.  First, the people on top elaborated their etiquette, as they elaborated their clothes, to distinguish themselves from the upstarts crowding in on them from underneath.  Second, the people underneath (those upstarts) began to copy the manners they perceived in their supposed social betters so they were less distinguishable from the ones on top.  This became quite a race starting in the late Middle Ages when trade and the creation of wealth from other means than plunder got started again.  Its most amusing and appalling recrudescence from our point of view is probably that of sumptuary laws, which defined what each segment of society could actually wear.  Believe me, this was not much of an issue in the Dark Ages, because nobody had good-looking clothes.  But once it was possible to import fine wool or even silk, it became a major THING.  There were even laws in Parliament distinguishing what a middle-class tradesman’s wife could wear (boring black and dark colors with high necks and long sleeves) and what the Earl could wear (silks, velvets, ermine, furs, jewels).  This might seem to some to be as limiting for the Earl as for the tradesman’s wife, but it probably was more galling to the latter than the former, especially when the tradesman became the chief of his guild and had more money than the Earl down county whose castle was falling down.

Why were clothes so important?  Because how else do you determine if somebody is SOMEBODY or just folks?  As Russell Crowe expounded in the recent movie “Robin Hood”, what is the difference between a knight and one of his men-at-arms?  Well, primarily the horse, because nobody but knights could afford them, but also the fact that the knight wore chain mail and tunics in the colors of his heraldry, while the man-at-arms wore the coarsest wool in dark colors.  So anybody, whether low or high on the hierarchy, could tell literally at a glance who was who and who was where simply by what they were wearing.  (This seems really odd to us because except at weddings and suchlike, most of us wear jeans and t-shirts (or would like to, even when we can’t), no matter what rung of the social ladder we’re clinging to at the moment.)

Knights of the Temple II (Photo credit: Wikipedia)

Oddly enough, this has repercussions to this day.  Certain professions have uniforms, sometimes explicitly so, sometimes simply by an unwritten compact.  Beat cops and traffic patrolmen wear uniforms, as do all members of the military on duty.  So do janitors, usually, and doctors and nurses.  Laboratory technicians wear lab coats, insignia of their profession, while supreme court justices and district judges wear judicial robes, insignia of theirs.  The members of church choirs wear robes, too, their uniform.  And we all know a lawyer or accountant when we see one, because they always dress (whatever their gender) in business suits.  There are many reasons for uniforms, but they are all based in the simple problem of recognition of a professional or social group by those not members of that group (or even by other members of the group).  Doctors wear scrubs because their own clothes are less sanitary or at least less easy to keep sanitary, but the scrubs are relatively uniform in appearance so we can all tell, when we’re in the emergency room, which is the woman who’s actually going to stop the bleeding.  Some uniforms become amazingly complex and dazzling (look at a picture of a Marine Corps general one day and you’ll see what I mean), while others stay simple or become more simple through time (those judicial robes are the descendants in spirit of the elaborate churchly or noble robes of the Renaissance).  I suppose there could be a rather sniffy moral to be drawn about which uniforms get fancier and which don’t, but I’m reaching the end of this essay and I’d rather not be sniffy anyway.

Again, uniforms and sumptuary laws are examples of the use of etiquette as a means of organizing society vertically, as it were, just as rules like not touching the water fountain with your mouth help to organize social groups horizontally, to make life simpler, easier, more elegant and more pleasant for everybody in an equal way.  Etiquette has gotten a bad name over the centuries for the vertical organization, because it is basically not fair or equal.  Unfortunately, many of us have thrown the baby out with the bath water and decided all etiquette was wrong or limiting or constricting because some etiquette has been used to exclude.  Which results in a lot of spitting on the sidewalk, attempts to go up stairs filled side to side with people descending them, making ascent impossible, and such outrageous situations as no more morning calls.

Manual on Courtly Etiquette, Volume 10 (稿本北山抄, kōhon Hokuzanshō) (Photo credit: Wikipedia)

Spring Thoughts

Aspen trees near Aspen, Colorado

[NOTE:  I’m categorizing this post also as  “writing” because I am attempting to write a somewhat descriptive essay–creating a picture with words.  I would be most interested to know if I approach this goal, but then again, I’m putting in several images to help . . . . ]

Received an ecard today from a friend filled with budding flowers and trees and an Easter message, and I realized that spring did in fact, ahem, spring going on a month ago, in late March, as it always does.  Except in the high country in Colorado.  Here, I have always maintained, we have one day of spring in which the aspen bud (aspen is both singular and plural so imagine I mean “aspen trees bud”) and the lilacs bloom.  This happens some time in June, hopefully early June, hopefully after the last snow, and then we have approximately two and a half months of summer, if we’re lucky.

This early spring we’re having here in Colorado (completely apart from the lack of rain or snow and the resulting fire danger) is a little disconcerting.  Whether it’s a weather (ooh, clever use of words, there, Gail) anomaly or a symptom of climate change (a scary and controversial topic into which I’m not going), it’s not what normally takes place at high altitude.  Here, historically, we’re more likely in March, April and May to get heavy snows instead of snowdrops.  I’m trying to remember (using increasingly faulty equipment) when in past years we saw the first crocus, the first robin, the first bluebird, and it seems to me it was later in April than it has been this year.  I definitely remember, however, always seeing the first crocus peeking through the snow.

In any event, spring has a special feel to it, doesn’t it?  Freshness, balmy air with a few brisk winds for contrast, growing things.  I don’t think there’s a green as beautiful at any place or time as the green of new leaves with the sun shining through them.  All the animals start up their lives again after the winter’s rest, scurrying around finding food and nesting materials, making homes, getting ready for babies.  The birds chirp so cheerily and some of them dart around in such finery, their feathers so filled with color and life, they lift the heart.  And even while recognizing the practical reasons for flowers, oh they look so frivolous and bright, waving in the breeze on their stems.  Even here in Estes Park, where we don’t have much spring to speak of.

Now, New York is a place that understands spring!  They do the season right in that state.  Nature in New York starts with the forsythia, which is a kind of bush type of thing that in spring has delicate yellow flowers arrayed on more-or-less dark red new canes.  The rest of the year, these bushes are kind of background, but in spring they become sun-colored lace by the sides of the roads.  The forsythia is followed by daffodils, huge clumps of daffodils all blooming in a kind of yellow frenzy against the darker green of their leaves and stems.  Then the tulips pop out, bringing pink and purple (and, of course, more yellow) into the mix.  By this time, the trees have gotten the message and their new green leaves start to unfurl, making even an elderly dowager of a maple tree look like a girl again, quite giddy with the fun of dancing through the spring.  If I had lived closer to water (although in New York City, water is always closer than it is in Colorado, it seems), I would also have enjoyed the pussy willows (as we called them), the little paw-like catkins bursting out of the willow wands.  I saw them in the florists’ shops, though, and touching their softness was almost irresistible.


Lilacs (Photo credit: Wikipedia)

Soon after the robins and bluebirds arrived, other bushes and bulbs would spring forth, and the flowering trees would turn into sticks of cotton candy, cloudy with pink or white blooms.  Then, the most glorious of spring flowers would finish the show:  lilacs bloom earlier in New York than I ever remember from Colorado and I love lilacs, their color and their scent, more than almost any other spring flower.  In my Bronx neighborhood, there were several older houses that had lilac bushes so huge they were more like trees, so filled with blossom that walking by them was a heady experience, the fragrance saturating my senses.  And so spring renewed a tired world, animals and people and flowers coming out of their winter funks, with even the spring rains feeling soft and warm and welcoming.

Here, it’s quieter, somehow.  The blooming plants seem to grow more closely to the ground and their blooms are not riotous in their color, at least not this time of year.  The mountains in Colorado have glorious wildflowers that array themselves in rich, paintbox colors, but those come later on, in June or July.  Now there’s the haze of green new growth that underlies last year’s dead stems, fuzzy buds on the aspen that will (hopefully after the last snow) break out into a green so delicate even from a distance you can see the veins in the leaves, and there are the crocus (croci?) with their pale lavender and cream cups and soft green leaves.  Later, in early June, there will be the blue flag, a kind of native iris, which creates a haze of blue in the low-lying ground close to the reservoir and on the big meadows in the park (as I mentioned in an earlier post about how we in Estes Park talk, this means Rocky Mountain National Park, the best back yard in the world).

While all this greening and coloring is going on, the animals–and the people–start to shed winter coats and lethargy and begin making a big fuss about life again.  While I always love to watch the deer and elk (and, yes, even the bears from a safe distance and usually on the other side of a window), it is the tiny ones that fascinate, the chipmunks and ground squirrels.  Because they are fair game for predators (we are a wild place here in spite of all our cars and houses and electric lights), from bobcats to eagles, they move quick quick quick and then sit up and scan their surroundings as this one is doing:


RMNP rodent (Photo credit: Wikipedia)

Then, there are the birds darting through the air, building nests, finding new things to eat, flirting with the big folk.  Truly beautiful birds make Estes Park and the mountains their summer home.  While we may not have cardinals or orioles or purple martins as the East Coast does, we have Steller’s jays (blue shading into black, unlike the blue and white of the more standard jay), camp robbers (I can’t remember their actual name, this is what we call them up here, big birds in gray and white, utterly fearless), ravens and crows, chickadees, cedar waxwings, magpies, downy woodpeckers and our own wonderful blue, blue, bluebird, among many others.  They fill the air with song and their quick, darting flight.

And, later, in June, will come the flying jewels, the hummingbirds.  Almost everyone keeps bait around their houses, either the kinds of (usually red) flowers the hummingbirds adore or a hummingbird feeder.  They are enchanting to watch as they zip through the air or hover, with that distinctive sound they make, not quite the hum of their names, but not quite a buzz either.  They are quite territorial, and the battles between two of the tiny males are more furious and aerobatic than any other aerial combat.  They move so fast it is as if our eyes see where they were and not where they are.  Here in the mountains, they arrive at the very end of the spring renewal, and they delight us all summer long.

Finally, there are the big animals, the elk and deer that wander around all winter in scruffy coats and lost antlers, now sleeking up into their summer wear, growing new weapons covered in softest velvet, eating everything in sight.  And the bears come out of their dens in April (early this year, it seems), searching for food and frightening the populace (bears are not cuddly, not tame, and they are very dangerous).  While we see bobcats and coyotes all winter, the eagles and hawks seem to reappear in the spring, as do the Canada geese and the whistler swans.  They love our small lake here, a place to rest and find food during their travels.  So spring increases our populations of animals, and that burgeoning brings the tourists, another sign of spring.  If nothing else told us it was nearly summer, the sudden inability to turn left would.  And so spring, bringing our senses back to life after our winter naps, leads into summer, the rich, fat season, filled with skies nearly purple in their blueness, leaves darkening into forest green, animals raising sleek babies, the joys of water and air and rocks, views and breeze and tiny, surprising lakes, rivers and summer thunderstorms.  And the memories of spring.

Spring’s pageant is ever new and ever the same.  It is, after all, the circle of life, and as necessary to our planet and our lives as the sun itself.  Perhaps it is intrinsic to spring that it be exhilarating, beautiful, warm, fuzzy, or perhaps that is just a bonus.  In any event, even here in our much shorter, quieter springtimes in the high mountains, our hearts and spirits lift with each chirp of a bird, each bursting forth of an aspen’s leaves, each bloom of a lilac.


Hummingbirds in Combat (Photo credit: Wikipedia)

A Mountain Bluebird

Monday, Monday


Rainy Day at a Dam in Australia. (Photo credit: Wikipedia)

Before the Flood (in my case, two of them, the Big Thompson Flood and the Lawn Lake Flood), I used to love a rock group known as “The Mamas and the Papas”.  They had wonderful voices, quite lyrical, and a rich style, unique for the time, sounding much better produced than many other contemporary groups, which, following “The Rolling Stones”, preferred a rougher edge.  One of The Mamas and the Papas’ earliest hits (after “California Dreamin’”) was a song entitled “Monday, Monday”.  It was a very ‘Monday’ song, about treachery and betrayal.  It pointed out that the day might leave, but the Monday feeling hung around.  I’m not sure why Mondays feel that way, but this one sure does.  I was confident that when I stopped working at the day job, Monday would once again be the bright start of the week, not its nadir.  But some Mondays just are nadirs, and that’s all there is to it.


Monday (Photo credit: Eric M Martin)

Sometimes nothing works except Tuesday, but there are a few things a person can do.  Running away to Australia (where it already is Tuesday) is probably not an option for most of us; it isn’t for me.  But writing is always an option.  For instance, I didn’t know when I started this post that it would end up being about writing, or actually about anything having to do with getting over a Monday.  I thought it was just going to be a complaint, about weather and not enough sleep and having to run errands and do chores and pay bills, to say nothing of political emails that I will truly say nothing of, but that I’m very tired of getting.  Instead, my thoughts turn to the psychological benefits, let alone the artistic benefits, of writing out one’s less than stellar or chirpy moods.  That’s what I’m in the process of doing, after all, and it’s working.

Whether in a blog or simply a private journal, writing about what you’re (I’m) feeling helps in a number of ways:  First, for me at least, it helps me figure out what I am feeling, and often I’m not sure.  I may have just a case of the blahs, the “itch”, as Connie Willis so beautifully puts it, a kind of existential angst that can afflict anyone.  But writing about it can often pinpoint what is really going on.  In my case, today, combine not enough sleep with a meeting I’m not sure I’m ready for, then stir in just a bit of waiting for an email and then getting an email, not even remotely the one I’m waiting for, that seems to come from the bowels of political nastiness, and you’ve got that Monday feeling.  Or at least I do.  But there have been times when the bad mood went a lot deeper, and writing made it possible for me, eventually, to see what really was bugging me.

Second, at times writing about the problem can help you (me) find a solution.  Sometimes, of course, the solution is just to stop feeling sorry for yourself and get on with life.  At other times, when the problem goes deeper, the mind is searching underneath consciousness for a solution, and writing, especially the kind of writing you can do on a computer in a journal, can help you get out of your own way so you can see what you need to do or feel or be to resolve the issue.  Here, the trick is to simply write, without the little critic we all have living in our heads yammering away about the quality of what you’re doing (somehow the little critic never seems to think that the quality is good, darn it).  It’s hard to shut him or her up, but it is possible.  Just keep writing, let the words come out, no matter how silly or self-serving or mindless they seem.  Eventually, your mind will settle down to the hard work of letting you know what’s wrong.  It’s kind of like therapy, only using touch-typing instead of psychoanalysis.

Third (one of the smaller tricks of writing is to realize that the brain likes things in threes, so when you provide, in an essay, lists of options or whatnot, make sure there are at least three of them), when the time comes, and it inevitably does, that the solution does not appear right then and there, save what you’ve written and let it sit.  Put it away and come back to it on another day, preferably not a Monday.  Not only might you realize there is a resolution somewhere in that storm of words that you didn’t see before, but also you might have an essay, a blog, or a part of a greater work just sitting there waiting for you to refine it.  A double blessing.  And even if the solution isn’t forthcoming, you will probably come to realize that it was just Monday, after all, and things are better simply because life is change.  (Plus, you might still have a usable piece of writing!)

Like the song, “Monday, Monday.”  I have the feeling that composing that song took away the writer’s blues.  And even if it didn’t, he got a great song out of his dreary Monday.


The Mamas and the Papas Deliver (Photo credit: Wikipedia)

Driving Miss Tina


Nissan Murano (not mine, but similar) (Photo credit: Wikipedia)

As I may have mentioned, I christened my car “Tina” after I bought her.  It started as “Tiny” because she’s a big girl, but she didn’t like it, so now it’s Tina.  I have always loved Nissan cars–one literally saved my life in 2003 (that story I’ll blog about at some point, trust me)–and when I moved back to Colorado, with the prospect of snowy mountain roads, I bought a one-year-old Nissan Murano, silver gray with black interior.  She has many talents, my new (still feels new to me) big girl of a car.

Interesting (to me) digression:  While some complex mechanisms remain resolutely neuter, neutral and completely without individuality, others come equipped with personality, gender and, definitely, opinions of their own.  When I was in college, the elevator in my dorm hated me.  It simply did, that’s all there was to it.  My first car’s name was “Prudence Duvernoy” (from a character I had played in Tennessee Williams’ “Camino Real”), and that car was madly in love with another student’s big old Chevy and always found a way to park next to him.  My second computer seemed a bit miffed that somebody so clueless could possibly be in charge of it, and I spent more on repairs and tech support than I had for the computer.  I think most people would, if absolutely pressed to the wall about it, admit that some machine in their life seemed to have distinct preferences and likes or dislikes.  And acted upon them.

In any event, back to today’s topic.  My car has many talents, chief among which is being the easiest to drive and the safest-feeling car I’ve ever owned.  As I said, she’s a big girl, and in my part of Colorado, which gets a lot of wind, it’s a delight to have this big solid vehicle around me as traffic lights wave around like banners and flags get ripped off flagpoles and construction signs have to have holes in them to protect them from becoming lethal flying weapons.  Tina also has the ability to find a parking space within reasonable distance of my destination virtually every time.  Even in Estes in the summertime.  That’s a very good talent for a car to have.  And, in spite of her size, she doesn’t guzzle gas, but sips it instead.  Very useful in the coming years.  She’s also comfortable and not cramped.  I’m glad she has cloth seats, because leather seats can be sticky in summer and cold in winter.  She has quite a bit of cargo space, and her rear seats fold down nice and flat.  So, yes, I’m very fond of Tina and she seems quite fond of me.

And where does she spend most of her time?  In my garage.  I’m sure she’s glad it’s there (she’s the first car I’ve ever had that didn’t live outside all the time like a husky).  I know I am, because I’m lucky enough to have an attached garage, which is a great luxury in a cold climate.  But Tina doesn’t spend a lot of time out on the road both because of my California gas crisis background (“is this trip necessary?  how much time do I want to spend in line at the gas station?  and I really shouldn’t be using so much gas anyway”  and so forth), and because I’m spending much more time these days at home, writing.  All good things.  But it turns out I miss driving.  Really a lot.

On Monday, when I went to buy my new toy (see previous post), one of the things I noticed about the whole trip was how much I enjoyed it.  Not just driving Tina because she’s a good, drivable car, but simply driving.  When I first learned to drive, my greatest (non-romantic) pleasure was to drive, simply to drive, not to go anywhere in particular, but to go!  I remember when I was a little girl, Daddy would sometimes say, particularly after dinner on a summer evening, “hey, want to go for a drive?”  And we all piled in, thrilled at the idea.  Daddy, Mama, Gail (that’s me) and Velvet (that’s the dog).  Of course, no summer evening drive engineered and guided by my father would ever come home without stopping at A&W Root Beer, so we had a hidden agenda, but so much of the joy was the drive itself.  This was a while ago, so our car didn’t have air conditioning (nothing had air conditioning except the movie theater, certainly not a car) and Greeley, Colorado, while it did cool off after dark in the summer, was HOT.  My mother would bring beach towels so we could actually sit on the seats (which weren’t leather, but the particularly stiff and staunch plastic they had for car seats in the fifties) and Daddy would say a few Army words (as Mama called them while she shot a very dirty look at him) until the steering wheel cooled down enough to touch, and of course we’d have all the windows open.  So off we’d set, no seat belts, of course, not back then, and Velvet’s head out the side back window, ears flopping (she was a cocker spaniel), and me with the dog mostly in my lap, talking to Daddy at the top of my lungs.  The best time.  Ever.  (Especially with the soft ice cream cones we’d always get on the way back home, “we” in this case including the dog, who loved ice cream.)

Obviously, I grew up with the idea that one of the great things to do is get in the car and go for a ride.  And I think that feeling has always been there, even when I didn’t have a car, the time or a full tank.  When I moved to California, after my divorce and before I got so poor I couldn’t afford the gas (let alone trying to be a good person ecologically), I would get in my car and drive on a Sunday or late at night when the world just got to be too much with me and my life was otherwise out of control.  I remember late nights driving up the freeway to Palmdale and letting the car out, with much the feeling that I’m sure a horseback rider has, and driving as fast as I could on those straight empty highways in the high desert.  (For any possible California Highway Patrol person reading this, I think the statute of limitations has run.  I hope.)  I remember trying to pretend I was a famous star incognito driving a convertible (when actually I was a word processor in a Sentra that didn’t even have a sunroof) tooling up and down the Pacific Coast Highway on the way to or from Malibu,  just too cool for school.


Big Sur, California (Photo credit: the_tahoe_guy)

Once I took a vacation and drove up the Pacific Coast Highway practically to Oregon, which included driving the utterly glorious (and terrifying) Highway One to and past Big Sur.  It’s perhaps better to be a passenger on such a road trip, because as a driver, you can’t really take your eyes off the twisty turny narrow heartstopping road long enough to look out at the unbelievable heartstopping (for another reason) view.  But there are lots of turnouts, so I’d stop and stare at the Pacific and get back in my little Sentra and twist around the switchbacks some more.  Anybody who loves to drive simply has to drive that road between San Simeon and Carmel someday.

The only time I didn’t enjoy driving was, of course, the daily commute to work.  Even then, there were times it had its compensations.  After all, if I was in my car getting to work, I wasn’t AT work, drudging away, so that was still a plus.  And there is nothing quite like the feeling of driving home after work.  The relief of it.  Except, of course, in southern California when it rained.  Just as Colorado drivers forget how, each and every year, to drive in the snow, Los Angeles drivers forget how to drive in the rain.  And a year’s worth of oil and muck on the roads gets as slick as snot (I know it’s a disgusting image, but it’s the only one that really says it) when the rains first come.  One night, when I worked downtown, I remember that it took me over two hours to get from my office to my apartment during a rainstorm.  At that time I drove a stick shift, and by the time I arrived home, I thought my leg was permanently damaged from the constant shifting into and out of first gear, trying to get ten more feet down the pavement.

Until I got Tina, I was also frightened of driving in snow, for the very good reasons of the stark terror I’d felt over the years commuting to work in Denver in the blizzards, and a bad accident (I’ve talked about it on this blog) in Wyoming during a blizzard.  But now, Tina does very well with her all-wheel drive and her big all-season tires and her weight.  She’s only slid around once or twice and that was in my neighborhood, so I may be getting a little too sanguine about what is really more dangerous than standard driving.

But last Monday, even with the high winds, driving was just a sheer pleasure.  Going down the canyon (that’s how Estes residents, or “locals” (see my post on Estes definitions) talk about driving down to the “valley” (ditto)) with little traffic was a pleasure, looking out at the trees and the sky and beauty.  I had lunch at a great place in Lyons called “Oskar Blues” and then set off to Boulder for my shopping.  I found parking places easily (okay, Tina found them), and I had the delight I just talked about in my previous post of purchasing my new iPad.  Then I went to Whole Foods, which is another terrific shopping experience, especially for someone like me whose only alternative in her home town is a pretty standard Safeway.  There I bought produce and strawberries that smelled so richly of strawberry that my mouth was watering right there in the store, and other good things to enjoy.  And then I drove home, up the canyon, out of the worst of the wind.

And I loved it.  It reminded me of being young and taking off on a California highway just for the sheer joy of it.  I know it’s frivolous and ecologically unsound and I do try to minimize my driving for the most part, both for reasons of carbon footprint and pollution, but oh how I love to drive Miss Tina!


Colorado Sky--One of the Delights of Driving (Photo credit: Let Ideas Compete)

New Toy

It’s all Steve Jobs’ fault.


Apple iPad Event (Photo credit: Wikipedia)

Yes, I have succumbed.  Yesterday, at the Boulder Apple Store, I lost all common sense and self-control and bought an iPad.  The new one.  The gorgeous shiny, pretty thing I’ve wanted since the first one came out of the mind of Jobs, the design gurus at Apple, and, sadly, the factories of China.  And, what’s worse, I’m not even sorry.

Not only is the gadget thrilling, the experience of shopping at an Apple store is amazing.  Walk in and no matter how busy they are (and they are always busy), within a minute an employee will have approached you and within another minute, the person who will guide you through the purchase has arrived.  You never stand in line, the wonderful widgets are brought to you and, with small hand-held devices, the employees “ring up” your purchase right then and there.  After which, if you like, they set up the gadget for you and answer all your questions.  And the “wow” factor remains.  Even the packaging is magnificent:  sturdy, attractive, of a quality designed to underscore the quality of what is packaged.  (Yes, packaging is evil.  If the trivial way in which we chop up the planet just for ephemeral things isn’t bad enough, the layers of plastic and cardboard in which we surround them will be.)  But Apple’s packaging becomes part of the experience of buying.

So, I discovered that the merest touch and gesture would, more easily and elegantly than on my iPhone, move me from screen to screen, app to app.  I found out how brilliant all images are (unfortunately, this also included my own face, which lately I have enjoyed seeing, shall we say, as if through a bit more mist).  I stayed up late (nonsense; early, for me) reading a novel on the delicious sharp screen.  Earlier, I synchronized my new toy with all my other Apple toys (iPhone and iPod).  I surfed apps, and I turned it on and off so often that I actually had to recharge it on its first night.  And I’m still enamored.  Although the guilt level is higher today.

You see, I don’t really need it.  No, let’s state it more forcefully.  I do not need an iPad.  I have an iPhone I still do not really know how to use to its fullest capacity, I have two computers, one a Mac, one a Dell and, as I said above, an iPod.  Obviously, I have long since drunk the Kool-Aid.  But if there’s anything Steve Jobs knew how to do, it was to create desire for those shiny, pretty things–desire that immediately becomes need.  Of course, unlike so many other shiny, pretty things, once a person has an Apple gadget, the delight has a tendency to stick around.  Unlike the toys of my childhood, which barely kept my interest past New Year’s Day after being so wanted, so desperately wanted, prior to Christmas, my iPod, my iPhone, and, I’m sure, my iPad (MY! iPAD!) will be used and happily so for a long time to come (at least, they will if I can figure out all their options and mechanisms).

The term “shiny, pretty things” is not mine.  It comes from the antic and gadfly mind of Mark Morford, a truly sane voice howling in our current cultural wilderness.  He was pointing out in his weekly column for the San Francisco Chronicle that our monkey-desire for these shiny, pretty things is gnawing our planet bare.  And it is.  And I’m guilty.  But as he also pointed out in the same column, all of us want ours before the chance is gone.  After all, they only had to make one more iPad so I could have one.  Only one more.   Sort of how I feel about Colorado–glad I moved back here and NOW they can close the gates and throw away the key.

All of which does have a tendency to take a little of the shine off my new toy.  But I’m still glad I got it.  Thanks, Mr. Jobs.

Although, upon thought, I really should have gotten the white one.


Little Niggling Things About the Movies

Last night, I saw “The Help” for the first time.  I could wax lyrical about the performances, the story, the many ways they got it right, but this is going to be a blog about the little niggling things that movies get wrong, at least for me.  So here’s one little niggling thing in all that wonderful rightness in “The Help”: Skeeter’s hair.  I don’t mean to brag about my longevity, but I was alive during that era and I remember hair.  Skeeter’s hair was, I think, supposed to be a mass of undisciplined curls, designed to show that she wasn’t caught in the feminine mystique of that era.  And they got her hair right, too, in the scenes where it was straightened and smooth and in the wonderful scene of her first date when she arrives after driving in an open truck in all that Mississippi humidity — her hair suddenly looking as if it was filled with static electricity, frizzing up beautifully.  But the perfect corkscrew curls trailing down her face, well, they’re just impossible.  Not just for the styles of the era, but for curly hair in high humidity.  Without tons of what hairdressers call “product,” there are no such things as perfect corkscrew curls; there is only frizz, wild curls that don’t drape lovingly down the side of a pretty face.  The whole point was that she didn’t do all the stuff women did then to make their hair smooth and perfect (well, perfect for the time).  And the other point is that the movie stylists got the costumes, the makeup, the way women looked so right otherwise, but Skeeter’s hairstyle would never have been a style and it was too pretty, symmetrical and cared for to be the non-style the character really required.  It didn’t spoil my enjoyment of a truly good movie, but it’s one of the things I remember about it.

And that got me to thinking about other movies where one niggling little thing (or a few) ends up being more memorable than all the things the cast and crew got right.  I don’t mean to make of this an “ah ha, I’ve got you” kind of thing.  Movie websites are filled with those, after all.  And I don’t mean to denigrate the work and care and artistry that go into making a movie.  I have worked on them myself, mostly in film school, and the quality achieved is often amazing, considering the constraints of time and money everybody works with.  And it is impossible to get everything right, anyway.  Even Steven Spielberg is human (although his level of artistic creation sometimes makes me wonder about that).  I also am not talking about deliberate stuff, where the artistic point may be pastiche, or parody or a stylized version of some kind of reality.  But some of these niggling little things do make me wonder.

For example, in “Titanic,” James Cameron’s vision is not only remarkable, it’s quite specific.  And in a movie of his, it truly is his vision we’re experiencing when we watch and listen.  He did his research.  And yet, Rose wears makeup (and it’s made clear it is not simply the actress wearing it, but the character when, during a scene, her tears have made the mascara run), a VERY red and black evening gown, and is apparently living with her fiance.  I’m not suggesting those things didn’t happen, even during that era, but NOT with a supposedly virtuous unmarried girl of good family.  This is pre-WWI, after all, and not only was the whole function of virtuous girls of good family to get married, they achieved that ambition usually using the weapons of ignorance and innocence (imposed by that good family), even if they faked it.  (Old Hollywood joke, attributed to Sam Goldwyn and which I am going to mangle:  “Sincerity is the most important thing and once you learn to fake it, you’ve got it made.”)  Trust me, no Philadelphia debutante would have appeared, PRIOR to her wedding day, wearing visible makeup and a red and black evening gown.  And, even with her mother’s chaperonage, she would NOT have taken passage on even the richest ship in the same cabin/suite as her fiance.  Another bit I didn’t get.  At one point, Rose’s mother talks to her friends at tea and says, “The whole purpose of going to college is to find a fiance.  Rose has already done that.”  Rather than being pre-WWI, that attitude is post-WWII.  Prior to the fifties, most women did not go to college, and if they did, it was in defiance of the current mores and to get an education.  Rose would have gone to a boarding school, perhaps, or a “country day school,” and then possibly a finishing school, but if her mother’s only goal for her daughter was marriage, certainly she would not have taken the chance of having Rose be thought of as a bluestocking by going to college.


"Titanic" DVD Cover (Amazon)

Whew!  I’m glad I got that out.  I love the movie “Titanic,” but those bits of it always bothered me.  Because Cameron’s (and his crew’s) research was so otherwise impeccable, those must have been artistic choices, and I simply do not know why they were made as they were.

One minor item:  In the movie “Pride and Prejudice” (not the BBC miniseries starring Colin Firth which simply makes me drool and which was very well researched and designed indeed), Elizabeth’s father is played by Donald Sutherland.  He did a fine job of the part.  But for some reason, to me, his teeth looked like chiclets, big white chiclets, far too large for his mouth, and completely unlikely given the period and the dental care available.  I’m not suggesting that everybody in a period movie should go around with brown teeth (in fact, that is truly off-putting, because we, watching the film, would find that so disgusting), but Mr. Sutherland’s teeth outraged credibility.

So did Clive Owen’s teeth in the otherwise good film (I loved it) “King Arthur”.  Not as badly, because Mr. Owen was portraying a great knight and future king and a member of the Roman Empire, which, while it did not have modern dentistry, did have some dentistry.  But perfect, utterly perfect white teeth seem actually a bit wrong for even films set in the modern era.  I don’t know if the gentleman has caps, which have a tendency to appear to be too big for any mouth they’re in, or if he simply has really terrific teeth, but they bothered me in that film.  Especially since the other actors (while they had cared-for, attractive teeth) seemed to have, well, fewer of them than Mr. Owen.


Schindler's List (Photo credit: Wikipedia)

And, finally, another artistic choice I didn’t understand, and in one of the most superb films I think ever made:  “Schindler’s List.”  Mr. Spielberg had a running motif of a little girl in a red coat walking through the frame in an otherwise purely black and white picture.  For some reason I’d love to hear him explain, he chose to photograph those scenes in color film and de-colorize all but the little girl’s red coat.  But color film de-colorized looks blue, not the rich silvers and grays of the black-and-white stock he used for the rest of the film.  It jarred me.  And technically, it wasn’t necessary, because the little girl’s coat could be, or so I was told at film school, colorized on the black-and-white internegative.  In my view, that would have been the better choice.  But of course I’m not Steven Spielberg by several hundred decimal points, and I’m absolutely sure he had an extremely cogent reason for his choice.  I just wish I knew what it was.

And for today, that’s enough about niggling little things, especially teeth, which seemed to figure as prominently in this essay as they did (to me) in the actors’ mouths.  I think, since TV tonight is abysmal, I’ll pick out the BBC version of “Pride and Prejudice” and spend the evening with Mr. Darcy as exquisitely and perfectly played, without one niggling little thing, by Colin Firth.  Yum!