Archive for the ‘movies’ Category

Solitaire and Christmas films

©2017 By Bob Litton. All Rights Reserved.

You have my permission to skip this post. Just realize all the while that you probably will have missed something that someday might have helped you significantly.

Where’s the queen of hearts?

I have a confession to make in my cyberspatial confessional: I’m addicted to the Internet game “Yukon Solitaire.” It could be worse, I guess, if I had a smartphone. I saw on the Internet today that many Americans are addicted to that device, which I don’t have; I just have a cheap old flip-phone. I tried a smartphone a year ago, but it didn’t respond to my fingers accurately enough, had a bunch of apps that I couldn’t afford to use, and ran out of juice too quickly.
But back to the solitaire. I know what many well-meaning folks will say: “Be happy! Playing solitaire can keep your brain rejuvenated! Keep you from becoming senile.”
That well may be, but I view playing the stupid game as a major waste of time. I could be writing the “Great American Novel” or drawing masterpieces. Instead, I gaze at my monitor’s screen and try to determine if there is some magic strategy for attaining the “perfect win”. And that’s what I actually call it: “the perfect win”. It’s when I can get all the cards in their proper columns and complete them down to at least the number “3” cards. Of course it is quite possible (and usual) to win when I’ve had to move several lower cards up to the top, but that’s just a “win”, not a “perfect win”.
I must admit that, besides the supposed benefit of keeping my brain active, playing “Yukon Solitaire” has revealed to me some interesting facts of life and facets of my personality. Probably the profoundest fact is that losing is as important an element of playing Yukon Solitaire — or, for that matter, any game — as winning. If I won every game or even several games in a row, boredom would quickly descend upon me. Of course, the opposite is also true: whenever I lose too many games sequentially I become frustrated and irritated and I resolve (for a day) to give up the game. But then that old lust to play returns and there I am before the computer again.
A year or so ago, I heard on one of the NPR talk shows a woman who had written a book (or maybe it was just an article) about how people can learn much about their own psyches from playing “Scrabble®”. I played that game only once, many years ago, and it bored me so much I never ventured into it again, so I didn’t listen very long to the radio conversation. However, I did attend enough to gather that it must be possible, indeed, to discover a lot about one’s personality and perhaps even improve it by playing Scrabble® and other such games.
Another thing I learned about the Yukon Solitaire game is that the outcome is not as much a matter of chance as in the original solitaire game. The player can calculate the odds of moving certain cards as opposed to moving others at times when mutually exclusive options exist. Also, one can begin to gauge which rows demand more attention because, if too neglected, they retain too many unrevealed cards near the game’s end. Naturally, those rows tend to be the last three. Yet another insight is noticing that one’s odds of winning are proportional to the balance of red and black cards at the opening.
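For any fellow tinkerer who wants to test that last claim instead of taking my word for it, here is a little back-of-the-envelope sketch in Python — my own toy, I should stress, not anything official. It assumes the standard Yukon layout (one face-up card in the first column; in each of the other six, from one to six face-down cards topped by five face-up ones), and “nearly even” is my own definition: a 15/16 color split among the 31 face-up cards.

    import random

    def opening_face_up_balance(trials=20000):
        # 26 red and 26 black cards; ranks don't matter for this question
        deck = ["red"] * 26 + ["black"] * 26
        nearly_even = 0
        for _ in range(trials):
            random.shuffle(deck)
            pos = 0
            face_up = [deck[pos]]              # column 1: one face-up card
            pos += 1
            for down in range(1, 7):           # columns 2 through 7
                pos += down                    # skip the face-down cards
                face_up += deck[pos:pos + 5]   # take the five face-up cards
                pos += 5
            reds = face_up.count("red")
            # 31 face-up cards can never split exactly evenly,
            # so a 15/16 division counts as "nearly even"
            if abs(reds - (len(face_up) - reds)) <= 1:
                nearly_even += 1
        return nearly_even / trials

    print(f"nearly even openings: {opening_face_up_balance():.1%}")

Whether a nearly even opening really does predict a win is exactly the sort of question an addict could squander another month on.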
I could go on with my insights, but I don’t want to tempt my readers to try the game; for it truly is addictive, and I don’t want to be responsible for your fall.

* * * * * *

O Merry…Merry…something or other

While I’m still in the confessional, I guess I might as well admit to having spent a bunch of hours over several weeks in November and December watching Hallmark Channel’s massive array of Christmas romance movies. Even beyond the twelve days of Christmas.
It was all part of my attempt — only slightly successful — to escape the pall of gloom that fell over me and millions of my fellow citizens following the November 8 election. I was trying to avoid the news programs, which, in my case, is very difficult because I am something of a news and political junkie. I’m only a nominal Christian: a fellow who no longer attends a church and does not adhere to the Apostles’ Creed. Nor have I paid much attention to Christmas in decades. But this time I wanted to escape into some kind of cheery mythical world. And I found a bunch of that in several of those movies. Of course some were rather saccharine, but others were worth the viewing.
When one watches a series of films all pretty much about the same motif, one picks up on common elements. Two of the most common themes in the Hallmark Christmas movies are (1) the Scrooge theme, and (2) the real Santa theme. If you have seen the 1947 film “Miracle on 34th Street”, you might recall that it contained both themes.
I am using “the Scrooge theme” rather broadly here, meaning that the storyline presents a case of a person who loved Christmas as a child but, due to some unfortunate experience in the past, now either denigrates or ignores it. The protagonist is not a “Scrooge” in the sense of being selfish or inhumane, although some might be business executives more intent on making money than on sharing cheerful hours with others. One, for instance, was the story of a developer who wanted to convert a building that, on one floor, had housed a music therapy center. In another, rather preposterous story — even by fictional standards — the reindeer Dancer is too ill to fly on Christmas Eve — so Mrs. Claus sends the North Pole’s handler incognito to buy a replacement at a reindeer farm; when the farm’s owner declines to sell, she orders the handler to steal a reindeer. (Don’t be concerned: Mrs. Claus finally recognizes her fault and the whole situation is resolved to everyone’s satisfaction.) In yet another, a Christmas tree farmer is about to lose his place because, due to bad weather, his crops have not sold well during the past two years, and the banker is set to foreclose on him; but he is saved by the story’s heroine, a marketing executive from New York who creates a “brand” campaign for his trees and drafts the farmer’s daughter and his friends to promote them countywide.
By far the most fascinating of the stories, however, is the fantasy tale of a nurse in 1945 who has not heard from her soldier husband. She worries that he is possibly a war fatality. After a few early scenes in which she reveals her charitable good nature, the nurse drives home during a blizzard and runs off the road into a ditch. After she crawls out of the ditch she stumbles through the snow to a storage building, climbs through a window, and falls asleep. In the morning she goes to a local police station for help, but on the way she doesn’t recognize any of the vehicles on the road. During her interview with the police, they suspect that she has suffered some brain damage. Eventually, she comes to realize that she is in the 21st century, not the 20th. The police chief takes her home to spend Christmas with him and his family, and to further examine her to see if she is mentally off or perhaps is playing a confidence game. Through some ingenious detective work, the policeman concludes that she really has time-traveled; and the problem now is how to get her back to 1945.
I won’t take up the necessary time or space to explain it all, but the nurse’s situation involves a comet that passed by Earth in December 1945 and is scheduled to also pass it this December. So that policeman and the community — which has come to appreciate her because she has reminded them of their long-forgotten customs of caroling and hanging Christmas lights on the town gazebo — accompany her to the storage building. She goes inside; and, after the crowd watches the comet pass overhead, they open the door to find she is no longer there. The last scene in the movie is of her shoveling the packed snow from in front of her car and her husband, in uniform and with a duffel bag over his shoulder, showing up to help her.
Yeah, pretty far out but still heart-warming.
And now I, too, am back in the real world. Alas!

Finis

A Drama of Self: The Tipping Point

PHOTO CREDIT: MS WORD CLIP ART

©2016 By Bob Litton. All Rights Reserved.

I’m curious: Do you see yourself as a character — in particular, the protagonist — in a screenplay? Ever reflect on the plotline, its beginning and all scenes since then, trying to figure out the other characters’ parts and the probable denouement? Or am I the only one so deeply solipsistic as to be constantly gazing on the internal screen? No, that can’t be the case, else the word “solipsistic” would never have been coined; they don’t make up adjectives applicable to only one person. Still, I find it difficult to imagine other people’s dramas, whether they be adventurous epics, tragedies or comedies, except as they tangentially affect my drama.

Many of us bloggers, I believe, use our blogs as candid diaries — electronic volumes open to the cosmic universe instead of little books hidden away in secret drawers. We can use them as depositories of our thoughts and feelings (mostly feelings), pretending that they are locked up in our computers, at first only peripherally aware that they are actually scattered across the planet and beyond. But then another part of us wonders how invisible and generally non-responsive readers perceive our outpourings. Mostly, all we can glimpse are their national flags. We are, then, self-analyzing split personalities.

So, desiring to be more honest than I have been during most of my life, I intend to relate the story of how I believe my solipsism became the major theme of an imaginary biopic; if one cannot repress a congenital tendency, then perhaps he at least can relieve the pressure by allowing it full expression, like steam from a teapot.

Going back to childhood meditations and actions, though I truly believe the habit really began that long ago, is beyond my capacity; the images are too fractured and vague. A clearer scene is more available in my nineteenth year, while I was in the air force and stationed on Okinawa, largest of the Ryukyu Islands. That was when I began to read very serious books for the first time; when, under the influence of the late British philosopher Bertrand Russell, I developed a longing to resolve all paradoxes; when I began to question my beliefs and especially every action’s motive. As a psychiatrist two years later put it, “You look at both sides of the coin and the edge too.”

An anecdote that quite well illustrates my message here concerns a book discussion group that one of the chaplains on the base initiated. As I recall, there were about a dozen of us airmen and civilians sitting in a circle at the first meeting, when the chaplain reviewed some nonfiction book and invited the rest of us to offer our comments. Then the chaplain explained that his performance was essentially a pattern he wanted us to follow when reviewing our own reading choices in future meetings. I, the eager fool, volunteered to present a review at the next meeting, a week later.

I had already been reading two books alternately: Arthur Koestler’s Reflections on Hanging, a critique of capital punishment; and some book whose title I cannot recall, a collection of historical narratives about various heinous crimes committed in England. While reading them I became aware of the dichotomy in my reactions to the books’ subjects: when reading Koestler my feelings reacted against capital punishment; when reading the other book my revulsion could be so strong in some cases that I believed no type of punishment could be harsh enough for the perpetrators: they were all hanged. That experience got me to musing over how susceptible I was to weirdly and quickly varying attitudes, how my values could shift radically in just a short time, between the setting down of one book and the opening of the other. Was my value system really that fragile and unstable? I wondered if this phenomenon was true of others, so I decided to try an experiment.

I do not recall the details of my mode of presentation, only that I alternated between summarizing various parts of each book and interpolating quotes here and there. I didn’t realize how long it was. I guess the chaplain felt the room was getting stuffy, for while I was reading he got up, went to a window and raised it. Shortly afterwards, one man, only a few years older than I was, interrupted me by asking, “Are we going to get a chance to discuss this? It sounds like a bunch of morbidity to me.” Another fellow murmured something about people who “should have gone to college”. I don’t remember how I responded or even whether I did; I felt deflated and defeated; my lack of response was way too predictive of future encounters; I probably just said, “I’m sorry you feel that way.” The whole episode might have turned out better if I had begun the presentation with an explanation that I was conducting a psychological experiment; but, on the other hand, to have done so would probably have compromised the validity of the result.

When no succeeding review was announced, I went to the chaplain and asked him what was up. He replied that he had discontinued the book review sessions because too few people were participating.

During all my life since then I have from time to time pondered how we can act decisively in murky situations and dilemmas when our ideas and feelings react against each other. Just what is the “tipping point”, as it has come to be nominated?

Finis

For more commentary on this topic, see my Dec. 15, 2013, post “To Be Or…Catastrophe!”

Of Errol Flynn…Friendships and Dreams

ADVENTURES OF ROBIN HOOD (1938) Errol Flynn as Robin Hood takes aim at one of the king’s deer while Will Scarlet (Patric Knowles) looks on. Flynn was too well-suited to such swashbuckling roles to readily escape type-casting, although he did obtain other roles in several combat movies during WWII, as well as the part of 19th century boxer “Gentleman Jim” Corbett (1942) and Mike Campbell in the film adaptation of Ernest Hemingway’s “The Sun Also Rises” (1957), two years before his death at age 50.

PHOTO CREDIT: Microsoft Word Clip Art

TEXT: © 1985, 2011, 2015 By Bob Litton. All Rights Reserved.

I saw the filmed autobiography of Errol Flynn, My Wicked, Wicked Ways, on CBS a few weeks ago.  Although it had an abundance of true-life comedy and adventure, it was generally a sad story, as I think most biographies of interesting people are bound to be.

Other than Gene Autry, Errol Flynn is the only hero I can remember having as a child.  Well, actually my hero-worship of Flynn was a composite of him and Robin Hood, the part with which I most identified him.  But, Robin Hood was long ago and Flynn was of the present.

As years passed, I was vaguely aware of Flynn’s risky love affairs.  I wasn’t a reader of movie magazines, but such incidents were also reported elsewhere.  They tarnished his image a bit for me, but I silently rooted for him and hoped that all would turn out well.

And Robin Hood?  Robin Hood was printed forever (?) on celluloid and in Howard Pyle’s book of the legends.  I used to play the role in hours of transcendent imagination, which frequently overwhelm susceptible children.  With imaginary bow and arrow I would win the imaginary archery shoot and take the golden arrow.  Armed with a yard stick, which imagination miraculously transformed into a broadsword, I dealt Sir Guy of Gisborne a death blow.  (Never, however, did I let imagination carry me so far away as to jump on and off my mother’s furniture.)

But back to Flynn.  What struck me most about the television biography were two things: (1) the tenuousness and fragility of relationships and (2) the ease with which one can be deflected from even the simplest life goals.

If we are to believe the film, Flynn married his first wife, Lili Damita, because she threatened to throw herself off a window ledge if he didn’t.  And after they were married she couldn’t tolerate the adoration Flynn’s female fans showered upon him.  She apparently became ever more neurotically possessive and finally “took him to the cleaners” in divorce court.

Then there was Flynn’s own idol―John Barrymore.  Flynn practically made a rest home out of some rooms in his house for the aged, alcoholic actor.  Barrymore died, and another of Flynn’s closest friends, a stunt man, was killed while doing a stunt―both deaths occurring in the same week.  After spending an evening trying to drink his grief away, Flynn came home, turned on the light in his living room and screamed with horror at the sight of Barrymore, neatly groomed and dressed, sitting upright in a chair.

When Flynn started to run out of his house, three of his friends popped out of the shadowy foyer and stopped him, explaining that they had bribed the mortician to let them bring Barrymore home for a last round of drinks with his chums.  With a hand still shaking from the fright, Flynn filled some glasses, and they all raised them toward Barrymore in the manner of a toast.

Friends!  I wondered which was the case: That they were good friends because they had gone to an extreme length to arrange a final drink between buddies; or that they were bad friends because, in their foolish desire to make it a surprise, they had scared a friend almost out of his wits?

The other point ― how easy it is to be deflected from even simple goals ― was exemplified by the hold Jack Warner and Hollywood generally had over Flynn.  Judging by the film, I gathered that all Flynn wanted to do was play in serious roles (such as Rhett Butler in “Gone with the Wind”) and earn enough money to buy a boat large enough to sail around the world.  Because of his various relationships, however, he couldn’t seem to break away from Hollywood or from his roles as a swashbuckler.  (Nice irony there!)

Given the contrast of Flynn’s personal situation (a $2,500 a week salary) with that of the American public as a whole when he was making his most memorable films ― between 1935 and 1940 ― one is perhaps justified in accusing him of being just a bit self-indulgent.  On the other hand, it can reasonably be argued that by providing escapist films for a suffering world he was doing more good for his fellow humans than he ever could have done aboard a boat sailing the seas.

Apparently, there wasn’t much he could do about obtaining more serious roles; in those days, once you were a success at the box office in one particular part, you were type-cast.  Still, to sail to the Fijis and Malaysia in his own ship was Flynn’s personal dream―regardless of how simple or even simple-minded it was from our point of view―and perhaps he did himself great personal harm we cannot imagine by not carrying through with his dream.

“My Wicked, Wicked Ways: The Legend of Errol Flynn” was the title of the 1985 CBS Television biography I watched in Monahans. The TV bio’s title was taken, in part, from the title of Flynn’s ghost-written autobiography published in 1959, the year he died.

— The Monahans News, February 7, 1985

By the way, yesterday Chris Ruggia and I discussed my January 19th blog post, “Bob’s Current Preferences”, and he explained his intent in the third panel of the comic there. Consequently, I added a few sentences to the end of the post; so any of you readers who have already read that post and are curious about the mysterious “brush” can click on the listing at the side of this page, under “Recent Posts”, and return to the cartoon and my “explication” of it.

NOTE TO NON-BLOGGER READERS: WordPress has its program set up so that only WP bloggers can register “likes” and “comments” on this page. However, if you are a non-blogger, I would be glad to read any comments or helpful criticisms you might wish to share and, therefore, have left my email address on the “About” page above the title of this post. Please, no “snarky” comments, or I will have to delete them.
Thank you for reading.
BL

Santa Clauses and Darth Vaders

© 1982, 2011, 2014 By Bob Litton. All Rights Reserved.

NOTE TO READERS: Merry Christmas, everybody, and may the Force be with you…the light side of it, I mean. Well, the holiday season crept up on us even earlier this year; pretty soon “Black Friday” will come right after the Fourth of July. (Wonder how they will gerrymander that one.)

But people are beginning to “push back” on the merchants’ greedy ambition of extended holidays. We almost saw a revolution concerning the phenomenon this year, with employees complaining about the notion of opening early on Thanksgiving Day.

Years ago, I started holing away in my hovel during early December, not turning on the radio or the TV and reducing the number of trips to the grocery store so I could avoid the incessant clamor of bells and chipmunks. I believe that the Christmas season would be much more enjoyable if they would not decorate or display Christmas items until two weeks before Christmas: Let’s see, that would be December 11, according to my calendar. I know I would probably regain the pleasure from Christmas that I enjoyed as a small boy.

Since I do not have any fresh ideas concerning December 25th, I thought it would possibly be a pleasant entertainment for you if I republished a column I wrote back in 1982 for the Monahans News. I also included it in my CD-ROM, A West Texas Journalist, in 2011. I have added a couple of sentences (italicized) in the fourth paragraph, relating more of the experience: the additional content is factual; it just was left out of my original composition in the newspaper.
   Enjoy!!!
   — BL

* * * * * *

While over at Gensler Elementary a couple of weeks ago to photograph the main characters in the Christmas play, I couldn’t help but recall the time I played Santa Claus in the second grade.

At the time I thought it was a “bit part” because I didn’t have many lines and because I was so covered up in cotton beard and stuffed red suit nobody would be able to recognize me.

I recall only two other players—a boy and a girl playing precisely what they were.  Surely there must have been other roles, although in those days the teachers didn’t feel it was essential that everybody in the grade have a part in the Christmas play.

Only two children for whom to leave gifts and, wouldn’t you know it, whoever filled my sack had put in an odd number of presents. I discovered that discomfiting fact during our rehearsal; and, departing from the script, I stuck one arm in the bag to search for the missing present, then turned the bag upside down, frustrated. The teacher loved that bit of unintentional pantomime and told me to repeat it during the actual performance.  Ever since then I’ve had an intense dislike for long division, especially when there’s a remainder.

But the worst of it was, I botched up my only lines—the last words spoken in the play.  In fact, the curtain was to be closing as I uttered them.

There I was, down center stage with the curtains swishing shut behind me and all those adult faces in the auditorium waiting for me to say something.

I thought real hard and then I bellowed: “A good Christmas to all and to all a merry night!”

Only recently I’ve noticed that Santa Claus is not really a “bit part” at all, but is usually at the center of most Christmas plays.  Any boy who gets selected for the role should be proud of the fact and cherish the memory of it.

As a matter of fact, perhaps only children should play the role of Santa, since he is supposed to be an elf, isn’t he?  After all, a man cannot slip down a chimney, much less be towed through the sky by “tiny reindeer”—even eight of them.

*  *  *  *  *  *

This year, with Darth Vader in town, I’ve had occasion to observe that he is as popular among the little ones as Santa Claus.  Some of those kids huddling around him had been babes in arms when “Star Wars” was released.

This has to be the “Age of Favorite Villains”!  For adults, it’s J.R. Ewing; for the kids, it’s Darth Vader.  Their primary attraction, I suppose, is the flair and the absoluteness of their villainy.  You don’t have to worry about whether you’ll be called upon to understand them or to take sides with them.  They’re fun to hate.

I’m rather glad to see a return to the depiction of Evil as absolute.  Back when I was a kid, Walt Disney did a good job of it with his wicked queen in “Snow White and the Seven Dwarfs”.  Somewhere along the way, however, he and those who followed after his death became rather silly with their fumbling crooks and flying Volkswagens.

The best absolute rendering of Evil of late was in the TV mini-series production of “East of Eden”.  The female lead in that show, Jane Seymour, could really spit venom!

Of course, “Star Wars” and its sequels have a moral underpinning, too.  Luke Skywalker is supposed to get his moral training from the Jedi, but he hasn’t the stamina or the patience for it.  We got the impression from “The Empire Strikes Back” that Luke’s days on the side of virtue are numbered.

The only trouble with movies like “Star Wars” is that the audience doesn’t go to see a resolution of the conflict between Good and Evil.  Rather, they go to see the weird characters and the special effects.  The evil in Darth Vader is camouflaged by the absence of humanness in him; it’s as easy to adapt to him as it is to a video game cassette.

The paradox of Evil—that it is at once a separate “force” and yet inherent in humans—has perplexed philosophers of art at least since Plato, who believed plays should be banned because they accustomed the audience to an acceptance of the unreal.

For myself, I prefer to have the knowledge of Good and Evil writ large, especially in dramatic productions, so that “he who runs may read”.

                                                               — The Monahans News, December 23 & 25, 1982

Finis

O, Here Comes Teacher, Wagging His Finger

By Bob Litton

As a “man of letters” I read as much as my drowsy eyelids allow me. However, I do not progress very rapidly in my perusal of a book, regardless of whether the subject is deep or light, improving or escapist. One reason for my slowness, of course, is my inherent turtle’s metabolism in all its manifestations, mental or physical; another is my ancient age: my mind wanders.

A more laudable cause, though, is that I read as much for detection of the author’s style and grammatical quirks as for the content — perhaps more. This attribute has contributed plenty to the development of my own style. That is the positive side of the attribute; the negative is that I become annoyed, sometimes even angry, when I note faults in the diction or grammar of other writers. Just this week, for instance, I started reading Thomas Hardy’s Jude the Obscure, his last novel. In Part First, Episode III (or “Section III” — I cannot quite comprehend Hardy’s division style), I came upon this paragraph:

The boy (Jude Fawley, the protagonist) had never strayed so far north as this from the hamlet in which he had been deposited by the carrier from a railway station southward, one dark evening some few months earlier, and till now he had had no suspicion that such a wide, flat, low-lying country lay so near at hand, under the very verge of his upland world. The whole northern semicircle between east and west, to a distance of forty or fifty miles, spread itself before him; a bluer, moister atmosphere, evidently, than that he had breathed up here.

It might be a small matter to my readers, but it seems to me that an omniscient author should be more exact in his measurements. The word “some”, in particular, irritates me when it is used by either an indifferent or a lazy writer. What does “some” mean here? It could be four months or fourteen months: the word “few” would have been quite adequate if allowed to stand alone, but instead Hardy wants to make us guess. (The word “somewhat” is often used by other writers in much the same way: I think “somewhat” should be banned from the English dictionary.) The same criticism is pertinent to the other phrase, “to a distance of forty or fifty miles”. If two characters were discussing the area described here, I could accept such a remark from them; but Hardy, the novelist, is supposed to be the omniscient observer, and I expect a bit more decisiveness from him even if, as a real human being, he does not know the precise radius of the semicircle: just let him lie and let us get on with the story.

Now, do not take me wrong and believe that I am judging Hardy harshly or that I do not appreciate his writing; I like his style generally and I especially like his irony. I am just using a sample of his writing to prove a point about the value of clarity (even if it is a fabrication) in getting the pace of a story moving. I see this kind of vagueness almost every day in newspaper reports. It is both funny and aggravating to read a news account in which the reporter has written something like this: There were only about two or three people in the room. My God! Cannot the knucklehead count to three?

* * * * * *

I had a brief but engrossing debate with a friend of mine this morning over how a married woman should be titled. I am as much…well, almost as much…a feminism supporter as he, but I am definitely more conservative in maintaining the conventional standards of our language than he. During our conversation I mentioned that I would like to keep as many of our language conventions as possible, but I realize that language is a live thing in that it evolves just as animals and plants do.

“There just isn’t much that can be done about it,” I said. “Take ‘want to’, for instance. The new form — ‘wanna’ — is already practically standard; and I can’t really argue against it, ugly though I view it, because that basically is what we pronounce. In my youth there was a hue and cry over the contraction ‘ain’t’, but have you ever tried to pronounce the logical contraction ‘amn’t’?”

Then we started in on my objection to the frequent introduction of some woman as “Mrs. Mary Jones”.

“It should be ‘Mrs. John Jones’!” I declared.

That is when my friend sat bolt upright in his chair.

“And I most strongly disagree with that!” he said. “The woman is not ‘John Jones’! That is her husband’s name.”

“True, but ‘Mrs.’ denotes a marital connection. Her husband’s name is not ‘Mary’. And, even if you dispense with the husband’s entire name, she still is saddled with a male’s last name: her father’s.”

Our conversation drifted off into a discussion of patriarchal and matriarchal societies — at least to the small degree we were acquainted with them. We never really settled on an answer to the marital name question, although we did agree that it will probably end up in a lengthy title involving several hyphens.

* * * * * *

A more dramatic — and therefore more fascinating to me — bump in the road to good, clear written communication is our use of the word “only”. Ponder these sentences for a few minutes:

Only Miss Muffet sat on the tuffet.
Miss Muffet only sat on the tuffet.
Miss Muffet sat only on the tuffet.
Miss Muffet sat on the only tuffet.

Now, Gentle Reader, surely you will agree that each of those sentences says something different from the others, even though the same words, and only those words, are used in each. The example above illustrates how our language is virtually totally dependent upon word order. Such has not always been the case. English has evolved from a state in which each word had several forms, each form indicating how it was to be understood in the sentence. Students of ancient Greek and Latin will immediately grasp my point, for those languages (being dead) retain their various forms. In other words, they are highly inflected. Other readers should grasp some degree of my import when I suggest they consider the few remnants of our older language as apparent in our pronouns: I, me, he, him, she, her, they, them, theirs, who, whom, etc. In Standard English those terms still retain their unique meanings; but, sadly, one can witness every day the decline in observance of those distinctions. Even in movies and TV shows, one will hear the characters say things like “Him and I went to the ball game” or “Who are you talking to?” I was startled — almost shocked — one day a couple of years ago when I first viewed “The Thin Man” film (1934) and heard Nora Charles’ old aunt, discerning a knock at the door, say, “That must be they now!” It was a shocking utterance because it was grammatically correct, something you would not hear in a modern remake of the film.

But I have digressed from the main point I want to make about the four brief sentences above. That is, I constantly see instances in the papers where either a speaker or the reporter places the word “only” in a position which detracts from the clarity of a sentence or may even totally misrepresent it. Sorry, I do not have any real-life examples at hand, but you can find one if you keep a lookout for it. When you notice it, just try mentally relocating that word to a spot closer to the word it is meant to modify.

* * * * * *

Another of my pet peeves is the “feminine so”. The word “so” has multiple uses, but the use with which I am concerned here is that of an intensive. The word can logically be used that way, but only if it has an apparent referent or consequent. The problem is that sometimes it is employed as what we might call an “absolute intensive”, without any referent. This often occurs in conversations among high-strung teenage girls, as when they exclaim something like “I am soooo depressed!” So what? If the girl had said, “I am so depressed that I am going to get drunk!” then we would have something by which to gauge the intensity of her condition. An alternative intensive is available: “very”. The word “very” does not call out for a referent or consequent; we can simply accept its simple measure of degree above normal. I used to teach that a “that clause” should follow the “so” intensive; but I had been overstating the situation. Further study led me to conclude that the referent might be included in part of the paragraph preceding the sentence using “so” and no “that clause” be needed. For instance, somebody might say, “The earth is cracking around here and our cattle are dying from thirst. I’m going to have to give up this farm. This drought is so devastating.”

* * * * * *

Finally, I would like to reiterate the perennially pointed-to problem of confusing the meanings of “effect” and “affect”. Both words can be used as nouns and both can be used as verbs. However, the meanings are different in every instance. I will not bother to take up space explaining those differences here. I will only suggest that those of my readers who wish to render their own writings grammatically logical should go to a large dictionary and compare/contrast “effect” and “affect”. (I saw one term used where the other should have been used in The New York Times either yesterday or the day before, so even the masters can goof occasionally.)

I have a similar issue with the terms “attain” and “obtain”. I see them as having entirely different meanings, but the Internet dictionary I use allows for some degree of crossover. What can I say? I am a purist!

Finis

TO BE OR…CATASTROPHE!

Unidentified Catastrophe Model Type from Google Images

“It is an intriguing thought that the same mathematics may underlie not only the way the genetic code causes the embryo to unfold but also the way the printed word causes our imaginations to unfold.”

                                                 — E.C. Zeeman, “Catastrophe Theory”,
                                                     Scientific American (April 1976), page 83

©2013 By Bob Litton

Most of us Earthlings consider “catastrophe” a synonym for “disaster”. Not so do the mathematicians, in whose abstract realm a catastrophe is just a way of describing an abrupt change in the world of things and events. For, they say, discontinuous or divergent events are simply another aspect of continuous and stable events.

The way these theories manage to come about is apropos. The system for describing continuous phenomena mathematically — called “differential calculus” — was invented concurrently by an Englishman and a German in the 17th century. It took a Frenchman, René Thom, to come up with a way of describing divergences in 1968. The suitableness arises out of the common notion that the British and the Germans are staid, while the French are spontaneous. The general tendency of differential calculus was to lop off imaginative bypaths, to make the universe into a safely circumscribed, predictable clock.

And yet, catastrophe theory is, in a fundamental sense, as deterministic as differential calculus. Using it, one supposedly could predict the moment a dog will bite (by measuring the space between his lips) or when he will tuck in his tail and run (by measuring the angle between his ears and skull). In another and just as basic sense, however, catastrophe theory does relate positively to the imagination: It not only serves practical ends when it is applied through continually multiplying ramifications to diverse aspects of Nature and human society, but it also may serve as a model to explain, as catastrophist E.C. Zeeman says in the article cited in the epigraph above, how our imagination itself works. Let us speculate as to what implications catastrophe theory might have for at least two of the humanities: dramatic criticism and ethics.

Thom devised seven basic mathematical formulas with accompanying diagrams — some quite visually pleasing* — to explain his theory; and he gave them colorful names such as “butterfly” and “swallowtail” due to their resemblance to natural phenomena. Thom’s “laws” have supposedly become diffused throughout educated society by now; at least, that would seem to be the case, since a term, “tipping point”, an essential element in the catastrophe theory, has appeared in news reports on a variety of subjects during the past decade. Most of those reports have concerned the weather and economic predictions, but I would not be surprised to see the theory affect dramatic criticism; for, one of the basic facts of psychological catastrophes is that a person cannot be neutralized by two contrary emotions of equal intensity. “…(T)wo controlling factors are then in direct conflict,” says Zeeman. “Simple models that cannot accommodate discontinuity might predict that the two stimuli would cancel each other, leading again to neutral behavior. That prediction merely reveals the shortcomings of such simplistic models, since neutrality is in fact the least likely behavior.”[1] It is conceivable then that a theater critic of the catastrophe school could chop Hamlet all to pieces.
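
For the reader who wants at least a glimpse of the machinery — and what follows is the standard textbook rendering of the simplest two-control form, the cusp, not a formula drawn from Zeeman’s article — the whole apparatus fits in three lines:

    V(x) = \tfrac{1}{4}x^{4} + \tfrac{1}{2}ax^{2} + bx    % the potential; a and b are the two controlling factors
    x^{3} + ax + b = 0                                    % its equilibria: the folded "behavior surface"
    4a^{3} + 27b^{2} = 0                                  % the fold curve, where the catastrophic jump occurs

When the conflict factor a is negative, the surface defined by the middle equation folds into two stable sheets with no stable neutrality between them — which is precisely why Zeeman calls neutral behavior the least likely outcome.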

Interpretations of Hamlet the character are multiple (e.g., straightforward: his search for certainty before committing a distasteful act; influence of the Reformation: contemporary debate about the existence of Purgatory; Freudian: Oedipus complex; Mirror: other characters’ interpretations of Hamlet’s motives and actions as concentrated on their selves, and the audience’s interpretations). Heretofore, the variety of the interpretations has been held up as a sign of the superiority of Shakespeare’s psychological perspicacity. The “straightforward” interpretation — the one most generally adopted — maintains that the prince represents the ineffective intellectual (of reason divided against itself). Perhaps because of Shakespeare’s literary stature, the nearly flawless classical structure of this play, and the poetic quality of the lines the playwright puts into Hamlet’s mouth, such inaction has been considered a weakness in the character and not of the characterization. But what if someone eventually analyzes Hamlet the character through the lens of catastrophe theory and discovers that the prince is really just a poked bag full of contrary ideas and not of contrary emotions?

What would be necessary to accomplish this, of course, would be measurable parameters, something that could be graphed so that, say, a “butterfly” or a “swallowtail” catastrophe would develop. Each stage in Hamlet’s psychological development would have to be given some numerical value. However, it should not be more difficult to “measure” Hamlet’s speeches than it was to measure the lip-span of an enraged dog. Something of the sort has already been done with self-pity, which Zeeman declares can be measured directly. Self-pity, he says,

is a defensive attitude commonly adopted by children, and it often seems that sympathy is powerless to alleviate it. A sarcastic remark, on the other hand, may provoke a sudden loss of temper and, by releasing tension, may open a pathway back to a less emotional state. It is unfortunate that sarcasm should succeed where sympathy fails, but the cause of that irony is apparent in the model. The sarcasm brings an increase in frustration, and as a result the point representing mood travels across the behavior surface as far as the fold curve; having reached the extremity of the bottom sheet, it is forced to make a catastrophic jump to the top sheet, and self-pity is transformed into anger.[2]
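
Zeeman’s pathway can even be mimicked in a few lines of toy code — my own illustrative sketch, I hasten to add, not a model taken from his article; the numbers and the mood labels are arbitrary. Hold the conflict factor a deep inside the cusp, sweep the frustration b up and then back down, and the “mood” refuses to pass smoothly through neutrality:

    import numpy as np

    def settle(x, a, b, steps=4000, dt=0.01):
        # slide downhill on the cusp potential V(x) = x**4/4 + a*x**2/2 - b*x
        # (b's sign flipped from the textbook form so that more frustration
        # pushes toward the angry state)
        for _ in range(steps):
            x -= dt * (x**3 + a * x - b)
        return x

    a = -2.0     # deep inside the cusp: two stable moods coexist
    x = -1.7     # begin in self-pity, on the negative sheet
    sweep = list(np.linspace(-1.5, 1.5, 13)) + list(np.linspace(1.5, -1.5, 13))
    for b in sweep:
        x = settle(x, a, b)   # follow the current sheet until it folds away
        print(f"frustration {b:+.2f} -> mood {x:+.2f}",
              "(anger)" if x > 0 else "(self-pity)")

Run it and the printout shows exactly the irony Zeeman describes: the mood clings to its sheet well past neutral frustration, snaps all at once to the other sheet, and snaps back only at a quite different threshold.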

Once the parameters have been worked out as scales, the catastrophic analysis should be applicable to the protagonist in any play — at least in any play purporting to have a protagonist who undergoes some sort of “recognition of self” event. If the protagonist does not measure up to the criteria (i.e., if his change is not adequately justified by the cumulative causes) then he might be declared an “incomplete protagonist”; and the play, a “flop”.

Many people — of the anti-behaviorist sort — would hate to see such a development, for the same approach could be applied to living beings. In fact, this is already being done in England, where doctors have used catastrophe theory in conjunction with trance therapy to cure girls of anorexia nervosa. It is one thing, of course, to say that a certain behavior has had sufficient precedent cause and quite another thing to create behavior because one knows what will be sufficient cause. The latter action is what essentially terrifies the anti-behaviorists, although they waste no time in transferring their distaste to the former.

The problem is at least as old as Socrates, who advised “know thyself”, and his fellow Athenians, who killed him because self-knowledge was exactly what they did not want. In the ethical sense, however, the unpleasant prospect is not that individuals might be manipulated by some scientist who knows how to “cue” them through catastrophic determinants, but rather that proof irrefutable might evolve out of such research that individuals have no free will — not any more than has a beaker of mercury.

Now, let us lower the level of our discussion from the abstract to the concrete and the particular, where I, at least, feel more comfortable. Take the instance of a young man in the military during the early 1960s — at the height of the Cold War: the Cuban missile crisis. This young man, who had previously accepted the world pretty much as he found it, began to read philosophy books for lack of anything better to do on a small island. Because of his reading, he started to question the beliefs he had taken for granted. Particularly, he began to see his country’s role in the world in a different, less idealistic light. All of a sudden he came to consider that the world might come to an end — and he with it — without his ever having had a chance to do something creative. He felt a strong impulsion to act; but he could not determine the right, the highest mode, of action.

It was not only that the problems were multifarious and overwhelming, but also that he did not feel secure about the sincerity of his own motives. He wondered how he could be certain he was not simply reacting to his environment, which was rather desolate and lonely, after all. He had had no problem joining the service; he felt that the four years he had signed up for was a small price to pay for all the past and potential opportunities he did and would enjoy. Nor did he feel strong animus against any of the other servicemen, not even those who outranked him, not any more, anyway, than he had felt toward civilians; they were serving their country as well as he, only with what he conceived to be a blindered dedication.

He wanted to be truer to himself than that: if only he could determine what his true self was. What if he committed some irreversible act on the basis of a strong but transitory faith and on the morrow became convinced that he had acted selfishly and foolishly? The poor fellow got so strung out that he began to analyze the motives behind his own thoughts.  He constructed little schematic problems for himself to solve, hoping in that way to abstract the issue to such a point that he could logically answer it in only one way.

For instance, he considered the possibility of a man who regularly craves candy bars discovering one day that he is a diabetic. The imaginary man tries to quit eating candy bars, and as long as he is active doing something else he does not even think about them. Occasionally, however, when he is alone and a candy bar is within easy reach, he is strongly tempted to eat it. The instant he thinks about how the candy bar would taste, that can be the only image occupying his mind. When he thinks about the diabetes, on the other hand, then that disease is the only subject uppermost in his mind. He knows, however, that he will have to act, for he cannot sit there forever, mentally hopping back and forth between desire and fear.

Therefore, the young man wondered, how can we say this fellow had any choice, since in the one second in which he has to act there can be only one thought in his mind? It seems simply fortuitous which thought happens to be there when he makes his move; and yet, whichever one it is, is the one that will “dictate” his choice. What “mechanism” is it that causes one thought to displace the other? Where does that displaced thought “go”? And what is the engine of the final, over-riding impulse that propels action?

The result was a nervous breakdown.

This seems to me to be at least partly how catastrophe theory relates to our artistic and ethical lives.

________

*(The reader can find more detailed information and illustrations on the seven elemental catastrophe graph forms on the Internet if he/she wishes to delve that deeply into the subject.  My purpose in this essay is not to discuss the mathematical aspects of catastrophe theory, but to explore the theory’s potential applications to dramatic plots and to ethical quandaries.)

[1]E.C. Zeeman, “Catastrophe Theory”, Scientific American (April 1976), page 65.
[2]Ibid., page 69.

 Finis

Probable Impossibilities in Films

By Bob Litton

The poet should prefer probable impossibilities to improbable impossibilities.
      — Aristotle, Poetics

It’s kinda fun to do the impossible.
      — Walt Disney

I feel slightly handicapped as I write this essay, because I started out under a false assumption: I had thought Walt Disney was the first to broach the subject of “probable impossibilities”, yet I just learned that many famous people at least mentioned the concept before Disney, beginning, I suppose, with Aristotle. And I had wanted to hold off on mentioning Aristotle’s Poetics until I write a future post on poetic theories.

My first hearing of the phrase occurred while I was watching one of Disney’s early television shows; it was, I believe, one of those ABC network specials just prior to the opening of Disneyland. I was in my early teens at the time and much enamored of all the Disney productions to that date. The scene I am recollecting is of Walt, his backside leaning against a desk in what apparently was his office, speaking directly to his TV audience: It was an educative moment.

The topic was our human ability to suspend, for the moment, rational thinking and accept — just for pure entertainment’s sake — the depiction of illogical events. Walt mentioned by way of example the instances where cartoon characters are seen running so rapidly and blindly that they venture several yards beyond a cliff’s edge until, finally realizing their predicament, they fall to the ground far below. To our open-minded imaginations it does seem plausible that we could continue a few yards — or feet at least — before we lose our buoyancy. The ever-popular Roadrunner cartoons, where Wile E. Coyote continually encounters what would normally be considered death-dealing mishaps while chasing his eternal prey, Roadrunner, are prime examples of how we can find enjoyment in episodes depicting nothing but “probable impossibilities”. Those films push the concept’s envelope to such an extreme degree, in fact, as to practically render them “improbable impossibilities” — the very mode Aristotle disdained.

Now, to get to the real substance of what this essay is supposed to be about. I have had several informal discussions with an artist friend in which our critiquing of movies and TV dramas involved frequent mention of implausible sequences. The most common of these, certainly, are those interminable special-effects renderings of the heroes diving through glass window panes (without incurring even a scratch) just before a structure blows up, or running ahead of automatic weapons fire (the bullets causing a series of dirt puffs behind them), usually without being wounded. I would like to think that most of my countrymen are weary of that kind of redundant plotting, but the continual production of such shows warns me that it is a pipe dream.

A much more subtle test of our credulity resides in two films by Nora Ephron: Sleepless in Seattle and You’ve Got Mail.  Now, don’t get me wrong; I love both of these films; I credit them with alleviating my depression for several months. And, as for the implausibility issue, my main interest here is the question: Why doesn’t it bother me? How can I imagine that real people might have progressed through these story lines?

At risk of insulting my readers’ intelligence, I believe I should outline the implausible elements, some of them at least, in each of the Ephron films.

Let’s take Sleepless first, since it was in fact the first produced. If you have never believed in fairy tales or even in “love at first sight”, then you will struggle to sit through the entirety of this movie. Imagine a woman falling in love with a man whom she has never seen, only heard on a radio talk show. Imagine a man, a grieving widower with an eight-year-old son to raise, falling in love the old-fashioned “at first sight” way while he sees the heroine walking through an airport terminal. Imagine the son and an eleven-year-old girl playmate finagling an airline ticket, getting the boy to the Seattle airport and on a New York-bound plane without anyone noticing. (Actually, something similar to that happened just recently, only the boy was nine and the route, Minneapolis to Las Vegas.) But, there’s more to it: mystical dialogue. At regular points during the story, two phrases get repeated by various characters; the phrases are “It is (or “was”) magic!” and “It’s a sign!”

New Yorker magazine movie critic Anthony Lane semi-panned the film partly because of the implausibility factor and because, consequently, it gave too little opportunity for Meg Ryan and Tom Hanks to bounce their chemistry off each other.

In You’ve Got Mail, Ryan and Hanks have plenty of opportunity to play off each other’s personalities, but the implausibility factor assumes a more comical and, at the same time, sinister face. The story line is this: Joe Fox (Tom Hanks) is the youngest of a three-generation male line who open a large chain bookstore in New York that threatens the survival of a much smaller (children’s) bookstore owned by Kathleen Kelly (Meg Ryan). A media war between them develops. As contrast, a backstory line has the two business foes involved in an email relationship, one in which they withhold personal identifying information but which gets so personal otherwise that they agree to meet. The rest of the story is too long even to summarize here, but I have given enough, I believe, to picture its convolution.

However, there are a few relatively minor elements in the film which I need to mention, for they all bear much on my topic. One is the obvious, huge improbability of a bookstore owner who has lost her cherished store not only forgiving her destructive competitor but falling in love with him. A related but even huger improbability is the likelihood of their being, at the same time, confidants through email exchanges. The funniest line in the film, to me, was Joe saying to Kathleen, near the film’s climax, “How can you forgive that guy for standing you up and not forgive me for this tiny little thing of putting you out of business?” That is Nora Ephron’s way of saying to her audience, “Look, I know this movie is absurd, just accept it as such and enjoy it anyway!”

A third, more comical, improbability is the Fox family makeup and their suitability as Kathleen’s in-laws. Firstly, Joe, who has just recently turned thirty, has an “aunt” (the daughter of his octogenarian grandfather) about six or seven years old, and a “brother” three to four years old: a highly unlikely makeup, although not impossible. But the subterranean improbability is Kathleen’s being able to accept her future father-in-law — a man who is the most hard-nosed competitor among the three generations and a cynic when it comes to business and relationships with women. The grandfather’s profile is not so harsh, especially when he relates that he had known — even exchanged letters with and possibly dated — Kathleen’s mother.

Now, the basic question is: How many of these improbabilities are we willing to tolerate in order to enjoy the cartoons and the romantic comedies?

I don’t feel qualified to assert that the infractions on our credulity should not impede our enjoyment or even to claim that most people will be able to enjoy all of them. However, as for the Roadrunner cartoons, I believe it is acknowledged that children and most adults will tolerate them well enough, if for no other reason than that the audience fully anticipates such absurdities from the beginning. It is, in fact, the absurdities themselves that are entertaining. It is only when we have seen such cartoons repeatedly that the continual mishaps, like chewing gum, lose their flavor.

The romantic comedies are a more difficult matter. We anticipate at the start of them that the representations of characters and events will closely resemble those in our everyday world, even though the description “romantic comedy” should have clued us that the story is going to be a step above reality, will be funny and will have a happy ending. Still, we can emotionally accommodate that stretch, even happily welcome it as a bit of relief from our own either humdrum or tragic lives. Also, there were the key phrases in Sleepless mentioned above (“It’s magic!” and “It’s a sign!”) that tell us we are watching a modern fairy tale and should accept it as such. Of course, not everybody will be able or willing to do that much accepting; we all have semi-conscious parameters for allowance, and it is usually only when those parameters have been breached that we realize what our limits are.

As for the suitability of the in-laws in You’ve Got Mail, that is something I could not have accepted had the movie progressed to the “meet-the-in-laws” stage; but it didn’t, so my anticipation of what the future holds for the lovers is as thin as gossamer and just as impermanent.

Finis
