A Drama of Self: The Tipping Point

PHOTO CREDIT: MS WORD CLIP ART

©2016 By Bob Litton. All Rights Reserved.

I’m curious: Do you see yourself as a character — in particular, the protagonist — in a screenplay? Ever reflect on the plotline, its beginning and all scenes since then, trying to figure out the other characters’ parts and the probable denouement? Or am I the only one so deeply solipsistic as to be constantly gazing on the internal screen? No, that can’t be the case, else the word “solipsistic” would never have been coined; they don’t make up adjectives applicable to only one person. Still, I find it difficult to imagine other people’s dramas, whether they be adventurous epics, tragedies or comedies, except as they tangentially affect my drama.

Many of us bloggers, I believe, use our blogs as candid diaries — electronic volumes open to the cosmic universe instead of little books hidden away in secret drawers. We can use them as depositories of our thoughts and feelings (mostly feelings), pretending that they are locked up in our computers, at first only peripherally aware that they are actually scattered across the planet and beyond. But then another part of us wonders how invisible and generally non-responsive readers perceive our outpourings. Mostly, all we can glimpse are their national flags. We are, then, self-analyzing split personalities.

So, desiring to be more honest than I have been during most of my life, I intend to relate the story of how I believe my solipsism became the major theme of an imaginary biopic; if one cannot repress a congenital tendency, then perhaps he at least can relieve the pressure by allowing it full expression, like steam from a teapot.

Going back to childhood meditations and actions, though I truly believe the habit really began that long ago, is beyond my capacity; the images are too fractured and vague. A clearer scene is available from my nineteenth year, while I was in the air force and stationed on Okinawa, largest of the Ryukyu Islands. That was when I began to read very serious books for the first time; when, under the influence of the late British philosopher Bertrand Russell, I developed a longing to resolve all paradoxes; when I began to question my beliefs and especially every action’s motive. As a psychiatrist two years later put it, “You look at both sides of the coin and the edge too.”

An anecdote that quite well illustrates my message here concerns a book discussion group that one of the chaplains on the base initiated. As I recall, there were about a dozen of us airmen and civilians sitting in a circle at the first meeting, when the chaplain reviewed some nonfiction book and invited the rest of us to offer our comments. Then the chaplain explained that his performance was essentially a pattern he wanted us to follow when reviewing our own reading choices in future meetings. I, the eager fool, volunteered to present a review at the next meeting, a week later.

I had already been reading two books alternately: Arthur Koestler’s Reflections on Hanging, a critique of capital punishment; and some book whose title I cannot recall, a collection of historical narratives about various heinous crimes committed in England. While reading them I became aware of a dichotomy in my reactions to the books’ subjects: when reading Koestler, my feelings turned against capital punishment; when reading the other book, my revulsion could be so strong in some cases that I believed no punishment could be harsh enough for the perpetrators: they were all hanged. That experience got me to musing over how susceptible I was to weird, rapid swings in attitude, how my values could shift radically in the brief interval between setting one book down and opening another. Was my value system really that fragile and unstable? I wondered whether this phenomenon was true of others, so I decided to try an experiment.

I do not recall the details of my mode of presentation, only that I alternated between summarizing various parts of each book and interpolating quotes here and there. I didn’t realize how long it was. I guess the chaplain felt the room was getting stuffy, for while I was reading he got up, went to a window and raised it. Shortly afterwards, one man, only a few years older than I was, interrupted me by asking, “Are we going to get a chance to discuss this? It sounds like a bunch of morbidity to me.” Another fellow murmured something about people who “should have gone to college”. I don’t remember how I responded or even that I did; I felt deflated and defeated; my lack of response was way too predictive of future encounters; I probably just said, “I’m sorry you feel that way.” The whole episode might have turned out better if I had begun the presentation with an explanation that I was conducting a psychological experiment; but, on the other hand, to have done so would probably have compromised the validity of the result.

When no succeeding review was announced, I went to the chaplain and asked him what was up. He replied that he had discontinued the book review sessions because too few people were participating.

During all my life since then I have from time to time pondered how we can act decisively in murky situations and dilemmas, when our ideas and feelings react against each other. Just what is the “tipping point”, as it has come to be called?

Finis

For more commentary on this topic, see my Dec. 15, 2013, post “To Be Or…Catastrophe!”

 

Shop Talk: Our Changing Language

© 2016 By Bob Litton. All Rights Reserved (except for quoted passages).

All right, I admit: I am consistent only in my inconsistency. That might explain why I am back into my blog, at least for this post about my native language. I consider it to be that important.

This morning I listened to WBUR.org’s Tom Ashbrook — the regular host of the weekday “On Point” program — interview linguist John McWhorter, of Columbia University, about how the English language is constantly “morphing” (not “evolving”) and how we should accept the sometimes disconcerting changes as natural. I tried about half a dozen times to phone in and offer my input but each time got a busy signal, so I gave up. As an alternative approach, I am resuming the chair in front of my dormant blog.

As a former working journalist and sometime teacher of English composition, I have feelings about English grammar and expression just as fervid as my feelings about democracy. Discussions about either one cause me to grab my sword and buckler, figuratively speaking.

One of the offerings I had in store for Ashbrook and McWhorter was this: while it is true that, as McWhorter said, our grammar and spelling became crystallized in the 18th Century through such efforts as Samuel Johnson’s Dictionary, most of the subsequent deviations from the “rules” were actually sensible adaptations, attributable to easier pronunciation and reading comprehension. Former crimes like the use of “ain’t” as a contraction for “am not” (have you ever tried to say “amn’t”?), the beginning of sentences with “But”, and the ending of sentences with a preposition have now become established as acceptable in general communication, although they are still considered Nonstandard (U.S.) or Informal (U.K.) in academic, professional and business papers.

Another influence on the development of our language was the employment of Latin grammatical structure and definitions by our pioneering grammarians. Unlike McWhorter and his cohorts, I appreciate the historical efforts to maintain the rules of Latin grammar, even though those rules are not entirely congenial to English. In its early centuries, English was just as inflected as Latin: the uses of its words were determined by their inflectional endings, not by position in the sentence as they are now. The result of our language’s pupation is that now we have to be taught explicitly not only the case names but how and where the cases are to be used in sentences.

Most of us literary types, regardless of how tolerant we might consider ourselves, still retain, I believe, prejudices against grammatical infractions. One fault which McWhorter dwelt upon that particularly irritates me, but which he finds perfectly acceptable, is the confusion of cases in pronouns. His example was the use of “me” (the objective case) as the subject of a sentence, a role normally reserved for nominative case pronouns, in this instance “I”. A typical erroneous sentence would be “Me and her went to the show.” For anybody who doesn’t see the problem, that should be “She and I went to the show.” Actually there are two problems here: one is the grammatical issue already noted; the other is a matter of etiquette — politeness dictates that we mention other people prior to ourselves. When “I” and “me” become legitimized as identical twins, then our language will indeed become chaotic.

Another modern infraction which McWhorter and Ashbrook discussed was the term “like”, used principally by teenagers as a meaningless interpolation during their jabbering, as, for example, “So my mom was, like, going ballistic because I didn’t get home before eleven last night!” I would add to that grievously ubiquitous error the phrase “you know”, which I constantly hear even educated guests repeating on Ashbrook’s show (and elsewhere); it seems to serve as a substitute for “uh”, the old-timey pause syllable many of us utter when we haven’t quite got our phrasing organized in the brain. Those terms wouldn’t be so annoying if they were used less, but many people employ them repeatedly within a single comment.

One caller, a teacher, astutely remarked that we need to try and inculcate Standard English into children’s minds if they are to cope well in society and business. McWhorter acknowledged as much but maintained that children are very capable of handling two and even more languages adeptly; they can readily use Formal English in their school papers and Informal, even slang, at home and among their friends on the street.

Another aspect of our changing tongue which McWhorter mentioned, and which always fascinates me, is how glaring the differences are between the English of the Beowulf saga, Chaucer’s Tales, and Shakespeare’s plays; we need defining footnotes — in the cases of the first two, even facing-page “translations” — to comprehend those works now.

I might add that we can include much 19th Century literature among the works that require footnote definitions or good guessing. Among these latter I can list George Eliot’s 1860 novel The Mill on the Floss, which I have almost finished reading. In particular, there are some terms the less-educated characters frequently use which in my first encounters I had to read twice to glean what was meant. The heroine’s father, Edward Tulliver, for instance, has a habit of proclaiming his confusion with life, as in Chapter IX of Book III, where he says to his employee, Luke:

 ‘The old mill ’ud miss me, I think, Luke. There’s a story as when the mill changes hands, the river’s angry — I’ve heard my father say it many a time. There’s no telling whether there mayn’t be summat in the story, for this is a puzzling world, and Old Harry’s got a finger in it — it’s been too many for me, I know.’

The “’ud” and the “summat” most of us readers would easily enough interpret as “would” and “something”. Also, nowadays even an uneducated character would say “many times” instead of “many a time”, but we get it. One might suppose that the “as” is a typo, but really it is an antique way of saying “that”. A reader unacquainted with English folklore might wonder who “Old Harry” is, but the rest of us would recognize him as the Devil. The phrase that really caused me to pause, however, was that last one, “too many”: what I finally discerned Tulliver to be saying is “too much for me”.

Further on in their conversation, Luke says to Tulliver:

‘Ay, sir, you’d be a deal better here nor in some new place….’

In our age, we would say “a good deal” or “a great deal”, but Luke’s meaning there is clear enough. The term that confused me (and it actually occurs several times earlier in the novel) was “nor”; after a little head-scratching, I deduced that it stands for “than”. Wow, I said to myself, I wonder how that came about!

Finally, and on the same page again, Tulliver says:

‘But I doubt, Luke, they’ll be for getting rid o’ Ben, and making you do with a lad — and I must help a bit with the mill. You’ll have a worse place.’

Now, this one really stumped me! Nonetheless, I figured it out. Old Tulliver and other characters in the novel are actually using “doubt” for “believe”! You explain that one to me!

In spite of its confusing and frustrating aspects, my native tongue — and other languages, too (I’m studying classical Greek right now) — fascinates me. Maybe I should have been a philologist.

Finis

Thanks

Thank you for visiting my blog, which I am dropping for art and health’s sake. I will leave it in cyberspace for anyone who might want to browse through the 43 months of archives.

Goodbye.

BL

Sports and Games

A foot race at the Olympic games in classical Greece (Bing Images)

© 2016 By Bob Litton

Back in my pub-hopping days one of the denizens at a Dallas bar described me as “that heavy dude”. He wasn’t referring to my physical frame but rather to my tendency to limit my conversations to serious topics. He had a fairly broad notion of what counts as “serious”.

The immediate spur to his remark was my lament that all the local “watering-holes” seemed to be turning into sports bars, with multiple TV sets scattered throughout. One bar I used to frequent now has five TV sets on the walls of its two rooms, but the only time more than one person watches them is on Sunday afternoons in the fall, when the gladiator contests known as American football are being flushed through the cables. And the TVs are attention-hogging during the endless days of the national championship events. Of course, baseball’s World Series and the triple-crown horse races draw a small bevy of viewers also. But mostly the TVs are there for “ambience” and to make you feel comfy when you’re the only lounger in the place.

I quit going to bars recently — even gave up beer — but primarily for reasons other than the distraction created by those constantly blinking images on TV screens. Still, the sporting events were a significant part of my withdrawal. I’m just not a fan of sports, particularly of the contact sports such as football and hockey, and especially the “extreme” sports that look to me like glamorized street-brawling.

Funny thing, though, is that, when the only stool available at the pub’s bar is directly in front of a TV, I get drawn in. If it’s a contact event I instantly begin silently rooting for the “underdog”. If it’s a solo event, such as golf, I become hypnotized by the ball’s behavior. The ball takes front-and-center status only when it’s either in a sand trap or on the green, especially on the edge of the green: that’s when the drama starts; it is rescue-and-putting time. I quietly gasp in awe as the seemingly self-determined white globe rolls serenely past the hole, then does a U-turn, returns, and plops in (I’ve actually seen that). Now, that’s entertainment! But it’s all provided by the ball; I don’t give a hoot about the golfer; I don’t even pay attention to his name.

To me, golf is not a sport. True, there is calculation, concentration and occasional slight exertion (at tee-off) involved, but no strenuous action by the body. No, golf is strictly a game.

This is where it gets complicated: What is the difference between a game and a sport? The terms are often used interchangeably by the players and the commentators. Both are contests, but in my view a game is a contest between calculations (aka “strategies”), while a sport is a contest of skill and endurance. There is a degree of calculation in sports, I acknowledge, but it is not what draws the fans and it is not the primary element in winning, while in games it is all that matters.

I read that over the past few years players of chess, poker and bridge have petitioned the International Olympics Committee to include those games in the quadrennial show. What’s next, tiddlywinks?

I have also read that the field sports people want to be included. I don’t know what the IOC has against their inclusion. Could it be that they require too much space? Or perhaps it is because those sports are not sufficiently universal. I am glad that auto-racing is not an Olympic event; it is really just a contest of mechanics’ skills, and it draws viewers who basically come only to see wrecks and perhaps be treated to the sight of a body being toted off the track on a gurney.

To me, the truest sports are those such as boxing, wrestling, soccer, tennis, swimming, and tumbling, where the human body is fully tested for strength and vigor, and the brain for strategy and constant calculation. Also, in Olympic boxing and wrestling the “violence” is minimal and no harm to the opponent is intended; the victor wins on points, not on knockouts.

I recognize that the Olympics as I view them differ from the original events in ancient Greece. The Greek city-states put a lot of pressure on their athletes: if a contestant dared come home without a laurel wreath, he was shamed; if he came home a victor, on the other hand, he was treated royally and became a celebrity. Also, the boxing events were bare-knuckled and bloody, pretty much like our modern “extreme sport” boxing.

I wonder if the IOC will ever include TV sports-watching in its lineup of events. I might try out for that, although I have no doubt that I wouldn’t even survive the preliminary trials.

Finis

Class War: A Perspective on Wealth

Wealth distribution in the U.S.A.

Source: Bing Images

©2016 By Bob Litton

WEALTH
1a. An abundance of valuable material possessions or resources, riches
1b. The state of being rich, affluence
2. Goods and resources having value in terms of exchange or use
3. A great amount, a profusion
The Free Dictionary (online)

* * * * * * * *

As long as I have been aware of societal divisions into classes I have hated the whole idea of a caste system, and not strictly because of any income gaps. No, I am repelled by the notion that somebody could believe that he or she is irreversibly superior to me by divine right or some other source, such as a congressional appointment. I got my first taste of the totem-pole culture while in the air force, when I learned that I was expected always to be the first to salute and that I must always surrender place when even the spouse of an officer picks up mail at the postal window. That very much offended me, and I still bridle a little when memories of those incidents come to mind.

Nor do I have any appreciation for the terms “upper class”, “middle class” and “working class”. I guess we are supposed to be thankful that the words “peasant” and “slave” are no longer generally descriptive of people in the United States and most other countries, but that is not enough: the whole class system must be erased entirely. (Sadly enough, slavery — or “involuntary servitude” — is still irritatingly present, although illegally, in my fatherland.) In the United States, class distinctions are not generally based on bloodlines, as they have been in Europe and in Asia, but on wealth, although family connections were more noticeably determinative up through, perhaps, the mid-20th century.

During the last couple of decades the topic of “income inequality” has often appeared in newspaper and magazine articles and columns. Reportedly, a very small portion of the U.S. population — the so-called “one-percenters”, those whom Thorstein Veblen called “the leisure class” — has accumulated more than a third of the nation’s wealth, while the next 19 percent possess more than 50 percent, leaving the remaining 80 percent of our citizens with only 15 percent of our national treasure. How did that happen?

Now the conservatives like to argue that the super-affluent obtained their riches through hard work, thrift, and prudent investments. To a limited degree that is true for some of the rich but not, I believe, for all of them. With the exception of “prudent investments”, those attributes cannot logically account for the vast wealth gained by the one-percenters. A person would have to enjoy an extremely high hourly wage to get wealthy through “hard work” (a phrase I handily contemn). Of course, most of us are aware of the ridiculously high “salaries” and bonuses lavished on corporate executives, even when their companies are losing money and are letting the CEOs go with “golden parachutes”.

In an April 24, 2014, column, Princeton economist and regular New York Times columnist Paul Krugman wrote that the primary route to riches for most of the one-percenters is not by way of “hard work”. Krugman applauded a recently published book by the French economist Thomas Piketty, Capital in the Twenty-First Century, in which the author asserts that the affluent don’t get rich from enterprise but from assets gained mostly through inheritance. Piketty calls for “progressive taxation as a way to limit the concentration of wealth”, wrote Krugman. Conservative critics have responded with ad hominem attacks, calling Piketty a “communist”, Krugman noted, because they cannot come up with any substantively valid arguments to refute him.

Some roads to riches, however, do involve initiative and energetic endeavor—along with considerable native intelligence. Two of the richest men in the U.S., for instance, started their eventual capitalistic enterprises while still in school, with assistance from classmates. Microsoft co-founder Bill Gates wrote his first computer program at age 13 while in prep school and went on to refine his geek skills, with college classmates, to a point where he could start up Microsoft. Mark Zuckerberg launched Facebook working with four college classmates in their dorm rooms.

Many young people of our time, though, seek to win fame along with fortune in either entertainment or sports. The most worrisome thing about this trend, for me, is that only a very small number attain the stature and earnings they had hoped for. And the “earnings” of those who do seem as ridiculously out of proportion as those of the corporate executives.

Harrison Ford, for example, reportedly received from $10 million to $25 million (different sources cite different amounts) upfront for his final appearance as Han Solo in The Force Awakens (2015), along with 0.5 percent of the film’s gross earnings. Contrast that with the $500,000 his contracted base pay was for Return of the Jedi (1983), the $100,000 for The Empire Strikes Back (1980), and the $10,000 for Star Wars (1977). The 0.5 percent of gross sales of course significantly augments those figures.
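To get a feel for how much that back-end share adds, here is a rough sketch in Python; the $2 billion worldwide gross is a hypothetical round figure chosen for illustration, not a reported number:

```python
# Hypothetical illustration of a back-end deal: 0.5 percent of gross.
# The $2 billion gross is an assumed round figure, not a reported one.
ASSUMED_GROSS = 2_000_000_000   # dollars, hypothetical worldwide gross

# 0.5 percent is exactly 1/200, so integer division keeps the math exact.
back_end_pay = ASSUMED_GROSS // 200

print(f"0.5% of a ${ASSUMED_GROSS:,} gross = ${back_end_pay:,}")
```

On that assumption, the percentage alone would roughly match the reported upfront payment.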

Another major Hollywood figure, Carole Lombard, was the highest paid cinema star of 1937, during the Great Depression. Of course her earnings that year ($485,000) did not come anywhere near Harrison Ford’s, but we must allow for inflation. She did earn $150,000 for each picture, definitely exceeding the amounts Ford initially received for his first two Star Wars films. The main reason I mention Lombard here, though, is an interesting tidbit I picked up from an August 25, 1938, article in The Mercury. She paid four-fifths of her 1937 earnings in taxes; after that amount plus various incidental expenses, such as her press agent’s fee, her net income was about $20,000, according to the Mercury. ‘But I have no kicks,’ she [said]. ‘I am pretty happy about the whole thing, and 20,000 dollars a year is plenty.’ She added that she was glad the government was spending the rest on public improvements….

Then there is the music industry. I recall viewing the film The Glenn Miller Story in 1954. It starred Jimmy Stewart as Miller and June Allyson as his wife Martha. I now can recall only three scenes from it, and even those only vaguely. The scene related below is the only one pertinent to this essay. (I transcribed the dialogue from the film as I viewed it recently on YouTube):

Miller’s parents come to visit him, Martha and their infant child in their new home — a mansion for the times. While Glenn and Martha lead his parents up the wide stairway, his father inquires about how his son has managed to pay for the house on a musician’s earnings:
Pop: “Paid for, is it?”
Glenn: “O yeah, yeah, all paid for.”
Pop: “Must be doing pretty well.”
Mom: “O yes, he’s doing pretty well. Don’t we hear him on the radio every night?”
Pop: “That’s only 15 minutes. Don’t suppose they pay very much for that.”
Mom: “Well, there’s the records, and he’s playing at the Hotel Pennsylvania.”
Pop: “How much do they pay for playing on one of those records, son?”
Glenn: “We get three cents a record.”
Pop: “Three cents, huh? Have to sell a heap of records to make it worthwhile, don’t you?”
Mom: “But they do, dear.”
Pop: “How many copies of a record do they sell, son?”
Glenn: “O, of ‘Moonlight Serenade’ we sold about 800,000.”
Pop: “Did you say 800,000?”
Glenn: “That’s right.”
Pop: “O! Heh, heh, heh.”
Now, three cents a record on 800,000 copies comes to $24,000 gross on that one record; if you suppose that Glenn Miller’s orchestra spent a full 8-hour day producing “Moonlight Serenade”, that was quite a day’s work. Of course not all of that went to Glenn: there were the members of his band, and presumably a studio rental and sound technicians to cover, not to mention some income tax. Still, that was just for one record; when you consider similar days for a whole work week, Glenn still came out pretty well.
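A reader inclined to check the arithmetic could do it in a few lines of Python, using only the figures given in the dialogue:

```python
# Figures from the dialogue above: 3 cents per record, 800,000 copies sold.
ROYALTY_CENTS = 3
COPIES_SOLD = 800_000

# Work in integer cents to avoid floating-point rounding, then convert.
gross_dollars = (ROYALTY_CENTS * COPIES_SOLD) / 100

print(f"Gross royalties on one record: ${gross_dollars:,.2f}")
```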

The above scene brings to the fore an important question. We may be amazed and even disgusted at the huge amounts recording artists make from their records and live performances; but when we look at it a little more objectively, three cents is a really paltry amount on a single record. It is only when we multiply it by the 800,000 purchasers that the earnings jump significantly. And the world’s consumer market has increased tremendously since the 1930s, when Miller was just starting out. How can we begrudge some musician three cents on a single record?

As far as record royalties go, the rate hasn’t increased all that much since Miller’s time. From what I have been able to gather online, the most popular musicians of today—the “rock stars”—receive only 75 cents to a dollar on an album. No, most of their wealth comes from live performances and T-shirts. An average box office “take” for a live performance was noted as between $150,000 and $200,000; but the fee for venue rental can be as high as $50,000, and there are the truck drivers’ and “roadies’” wages to pay. Still, one source claims that each member of the rock band Metallica has banked from $5 million to $10 million between their start-up and the present.

The earnings of major sports figures have become similarly ludicrous. In December 2015, basketball star LeBron James signed a lifetime contract with Nike to act as their brand-enhancer. The exact amount was not revealed, but ESPN reporter Darren Rovell estimated it could be worth $1 billion. In 2014, James joined the Cleveland Cavaliers for $22 million a year. And, Rovell wrote, James “ranks as sixth on Forbes 2015 list of highest paid athletes.” James’s total earnings for the year—from endorsements and business ventures as well as from playing basketball—have been estimated at $64.8 million.

During the same month (December 2015) that James contracted with Nike, pitcher David Price finalized a $217 million, 7-year deal with the Boston Red Sox. According to Jimmy Golden of the Associated Press, the terms of the contract are that Price be paid $30 million a year for 2016-2018, $31 million in 2019 and $32 million in each of the final three years.

Professional sports teams that used to be filled by white men only are now predominately black. I don’t know what happened to the white guys: Are they physically unable to compete anymore or are they too racist to engage in the try-outs? As for the blacks, professional sports teams have become the equivalent of the 1840s gold mines; they apparently dream from childhood on of becoming sports heroes; sports has become their pathway to success and financial security. And the team owners are playing this longing to the hilt: I read an article not long ago that related how scouts have been venturing to a certain country in Africa (which one I don’t recall) to recruit youngsters to come to the USA and show their stuff; unfortunately, a large percentage of youths who venture westward cannot make the grade, not because they aren’t talented enough but because there aren’t that many positions open.

Another source of over-the-top income is gambling, either in the stock market or in the lotteries.

Oprah Winfrey, for instance, bought 6.4 million shares of Weight Watchers stock in October 2015 for $43 million. Almost as soon as this newsy item was out, Weight Watchers stock skyrocketed by 90 percent, according to ABC News. Winfrey said she invested so heavily in the company—which was “struggling with declining sales and a looming debt of $144 million” (ABC)—because Weight Watchers had helped her and millions of others with their weight issues. (The prestige derived from her joining the board of directors very likely helped them, too.) In February 2016, however, Weight Watchers stock declined 29 percent, according to USA Today (Feb. 26, 2016), and as of that date Oprah had lost about $29 million on her investment.

Then there is the much less admirable mode of gambling in which a hell of a lot of poor people engage: the lotteries. As far as I am concerned, this is a national sin. Yet the fact that lottery winnings are so absurdly astronomical testifies to the willingness of many of my compatriots to be gulls. I recall a news story from 1987 about a New York City janitor who won $5 million in a lottery, went to work the next day and, as he was about to climb a ladder to screw in a light bulb, another bulb lit up (in his head). “What am I doing this for?” he wondered, “I’m a millionaire.” Fifteen months later, according to a 1992 New York Times story by Alessandra Stanley, the lottery winner died in an automobile accident; and, since he left no will, his family had to spend a lot of time and money on lawyers and accountants plus income and estate taxes before they could extricate themselves with an estimated $400,000. Many other lottery winners’ stories have reportedly ended in similar Dickensian tragedies, according to what I have seen on the Internet.

But the most ridiculous money-grubbing story I ever read about concerns that silly little ditty known as “The Birthday Song”. If you live in the English-speaking regions, you probably know the words to the song. (I won’t dignify them by calling them “lyrics”, as some people do.) And if you don’t know the words, they are absolutely simple to learn; no memorizing necessary. It goes like this: “Happy birthday to you/Happy birthday to you/Happy birthday dear Nancy (or whoever)/Happy birthday to you.” It is usually sung at birthday parties; Marilyn Monroe sang it to John F. Kennedy at a celebration for him, but it is a universal tradition for anybody’s birthday in my country.

The ditty, you will note, contains only four words with an additional two stuck in as the addressee. And when it was supposedly composed by two sisters in Kentucky in 1893 no thought was taken as to copyright. The two ladies, Patty and Mildred J. Hill, used the ditty simply as a tool for teaching young children to sing. It reportedly first appeared in print in 1912, still without credits or copyright notices. Then, in 1935, the Summy Company registered a copyright. That company was bought by Warner/Chappell Music in 1988, when “Happy Birthday” had an estimated value of $5 million. Groups larger than small gatherings of relatives and friends had to pay royalties to the company for the opportunity to chirp the nonsense. For one such opportunity, in February 2010, the royalties reportedly amounted to $700. According to the Wikipedia article where I read up on this farce, “the song is the highest-earning single song in history, with estimated earnings since its creation of $50 million.” In addition, legal battles over the copyright issue went on for decades until February 8, 2016, when Warner/Chappell accepted a final judgment declaring the ditty to be in the public domain. For an entertaining summary of the lurid history of “Happy Birthday” I refer you to the Wikipedia article.

So, what can we do to “level the playing field” in economic, not sports, terms? Not a whole lot, I’m afraid, for greed and thievery will always be part of the human makeup. There are some proposals and movements, though, that seem promising to a small extent.

One is the old one FDR applied during the Great Depression, to which I referred when discussing Carole Lombard’s patriotic attitude: raising tax rates on the rich. That is not going to happen, however, as long as the Republicans dominate Congress. Anyway, it seems to me a Sisyphean solution: it attacks the symptom, so to speak, rather than the problem, and it would certainly aggravate the tensions between rich and poor. But it might stabilize the income gap until a more satisfactory solution can be instituted.

A sort of obverse to that approach is what has been termed a universal basic income (U.B.I.), which New Yorker staff writer James Surowiecki wrote about in his June 20, 2016, column. The tactic here is to pay every U.S. adult a stipend of, say, $10,000 a year (children would receive a smaller amount). An experiment on this idea was tried in Dauphin, Manitoba, Canada, in the mid-nineteen-seventies, and, although a conservative government buried it quietly in 1979, later research indicated that while the guaranteed basic income was in force, hospitalization rates had fallen, more teenagers had stayed in school, and work rates had only barely dropped.

New experiments on U.B.I. are currently underway or planned in Finland and in Oakland, California, Surowiecki reports. He writes: “In the U.S., the new interest in the U.B.I. is driven in part by how automation will affect workers. Bhaskar Sunkara, the publisher of the socialist magazine Jacobin, told me, ‘People are fearful of becoming redundant, and there’s this sense that the economy can’t be built to provide jobs for everyone.’”

I’m all in favor of a U.B.I., but even it might leave a certain discontent in people’s minds: the yearning to be useful and creative. I am too cynical to believe that every adult in the U.S. has enough imagination and energy to discover and develop a creative purpose or function or vocation on his or her own just to preserve mental health. I can only hope I am wrong.

As long as we have businesses and industries that still employ people and that hope to retain their work force for a long period, another approach might fit: Employee Stock Ownership Plans (ESOPs). Actually, ESOPs have been around for years now, having become popular in the mid-nineteen-seventies. According to the National Center for Employee Ownership (NCEO), by 2014 seven thousand companies had ESOPs covering 13.5 million workers. I will let the NCEO describe the system itself, for it can do so more clearly than I can:

“Similar to Profit-sharing plans, the ESOP is a trust fund into which the company contributes new shares of its own stock or cash to buy existing shares….Shares in the trust are allocated to individual employee accounts. Although there are some exceptions, generally all full-time employees over 21 participate in the plan. Allocations are made either on the basis of relative pay or some more equal formula. As employees accumulate seniority in the company, they acquire an increasing right to the shares in their account, a process known as vesting. Employees must be 100% vested within three to six years, depending on whether vesting is all at once (cliff vesting) or gradual.

“When employees leave the company, they receive their stock, which the company must buy back from them at its fair market value (unless there is a public market for the shares). Private companies must have an outside valuation to determine the price of their shares. In private companies, employees must be able to vote their allocated shares on major issues, such as closing or relocating, but the company can choose to pass through voting rights (such as for the board of directors) on other issues. In public companies, employees must be able to vote on all issues.”
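The cliff-versus-graded distinction in that description can be made concrete with a short sketch. This is only a toy illustration, not the NCEO’s or any plan’s actual formula: the three-year cliff and the two-to-six-year graded schedule below are common U.S. defaults I am assuming for the example.

```python
def vested_percent(years_of_service: int, schedule: str = "cliff") -> int:
    """Return the percentage of an employee's ESOP account that is vested.

    Assumed schedules (illustrative defaults, not from any specific plan):
      - "cliff": nothing vests until year 3, then 100% all at once.
      - "graded": 20% after two years, another 20% each year, 100% at six.
    """
    if schedule == "cliff":
        return 100 if years_of_service >= 3 else 0
    if schedule == "graded":
        if years_of_service < 2:
            return 0
        return min(100, 20 * (years_of_service - 1))
    raise ValueError(f"unknown schedule: {schedule!r}")

# A quick look at how the two schedules diverge over an employee's tenure:
for years in range(1, 7):
    print(years, vested_percent(years, "cliff"), vested_percent(years, "graded"))
```

Under either schedule the employee is fully vested within the three-to-six-year window the NCEO describes; the difference is whether the right accrues all at once or a step at a time.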

There is more and important information in the NCEO statement that might interest you, but I will have to refer you to NCEO’s website (www.nceo.org) to read it, for my essay is already too long and I have a bit more to write.

All the media coverage over the huge disparity between the incomes of the super-rich and the rest of society apparently has had some impact: The Giving Pledge. According to its Wikipedia article, the Giving Pledge’s goal “is to inspire the wealthy people of the world to contribute the majority of their net worth to philanthropic causes, either during their lifetime or upon their death. The Pledge is a moral commitment, not a legal contract.” In June 2010, billionaires Bill Gates and Warren Buffett formally announced “the Giving Pledge campaign” and began recruiting members. By August, forty people had pledged $125 billion. As of March 2016, one hundred forty-two individuals or couples had pledged an aggregate total of $731,860,000,000.

A year or two ago, before I had even heard of The Giving Pledge, I read a comment by Melinda Gates (Bill’s wife) in some news article to the effect that she didn’t need a billion dollars to live on and was planning to give some of her wealth away. I have long been suspicious of Bill because of his viciously aggressive business tactics, but I was also pleased by his reported charitableness: he reportedly has donated many, many computers to children in Africa. I realize that could be a subtle business tactic, too, since it might lead to future purchases of his Microsoft products, but why “look a gift horse in the mouth”? (Come to think of it, the Trojans might have done well to have done just that!)

As for Warren Buffett, he has been one of my favorite people for several years now—ever since he urged Congress to make his tax rate higher than his secretary’s. If he approves of Bill Gates enough to associate with him in this Giving Pledge organization, then I guess I’ll have to accept Gates as okay, too.

The top five donors on The Giving Pledge roster are Bill and Melinda Gates ($77.3B), Warren Buffett ($66.7B), Larry Ellison ($49.3B), Michael Bloomberg ($37.2B), and Mark Zuckerberg and Priscilla Chan ($35.7B).

According to the Wikipedia article, “The pledge does not involve pooling money or supporting a particular set of causes or organizations. The pledge asks only that the individual give the majority of their wealth to philanthropic causes or charitable organizations either during their lifetime or in their will….The pledge encourages signatories to find their own unique ways to give that inspire them personally and benefit society.”

I don’t know whether my curiosity derives from good old-fashioned journalistic instinct or from dirty old cynicism, but I wonder what these people’s motives are. Could they be reacting to the threat of a possible new revolution of the French sort? (You might recall that one year later the “Occupy Wall Street” movement began in New York City.) Could they be honestly sensitive to the inequity of the wealth disparity? Could they have concluded that a hyper-tax is looming ahead and want to determine for themselves where and how their contributions are to be spent? I can’t answer those questions, and I don’t think it is necessary that I do so. Although the Giving Pledge is not likely to benefit me individually or directly, if it reduces the number of solicitations for contributions that show up in my mail box each December, then I will be pleased.

Finis
Meaningfulness

© 2016 By Bob Litton. All Rights Reserved.

Peter: Jesus, you are my Ground of Being!
Paul: Lord, you are my Ultimate Concern!
Jesus: Whaaattt?

This past Friday, my friend Chris and I met in my humble lodging for our regular bi-weekly, two-hour conversation and coffee-sipping. Over the past two months, we have been viewing DVD lectures by the late philosophy professor Robert Solomon, a specialist on Friedrich Nietzsche (N.). Solomon’s wife, Professor Kathleen Higgins, also a Nietzsche scholar, participates in the series. The lectures are about N. — his life, personality, and philosophy — of course; but interspersed among all of them are some comments on previous philosophers who had positively influenced Nietzsche, such as Arthur Schopenhauer (S.), and those who had negatively affected him, such as Socrates. This essay is partly my own take on S.’s and N.’s views concerning the meaning of life. The latter part is my own view of purpose and meaningfulness — what the philosophers call teleology.

I have read very little of S. the pessimist, for I don’t need to read anything that will make me more depressed than I already am. Besides, everyone who is literate in Western philosophy, even in the most minor degree, has read or heard that S. considered life essentially “suffering and death”, and that, given the choice of whether to live or die, the better option would be to die, but that an even better option would be not to have been born.

What I did not know, however, and what is one of the more interesting notions in Schopenhauer’s weltanschauung, is that, according to Prof. Solomon, S. eschewed Immanuel Kant’s view that one could justify life and find meaning through rationalism, and progress through rationalism to the Christian faith. A more visceral response, particularly through an aesthetic appreciation of music, was more effective, S. believed. The benefit of music S. attributed to its abstractness, as contrasted with the representational character of pre-20th-century visual arts. Listening to, and contemplating, music, he held, would lift the suffering human out of his or her pointless individuality into a consciousness of a larger Reality, or “life as a whole”. But, as I mentioned in one of my early poems, that lift can last only as long as the music lasts.

I have read a few of Friedrich Nietzsche’s works but, unfortunately, not the one which is most pertinent on this topic, The Birth of Tragedy. So, I will have to rely again on Prof. Solomon’s — and Prof. Higgins’ — interpretations. They say that, while N. agreed with most of what S. had to say about life being almost totally a matter of suffering and death, he differed with S. on finding it pointless. Where S. postulated that humans proceed from desire or hunger to satisfaction and back to desire/hunger, always longing for complete satisfaction or contentment (picture a “couch potato”) and never finding it, N. believed that absolute and permanent contentment is not really any human’s desire at all. Rather, N. theorized, meaning is to be found in the passions; i.e., dedication to a person, to a project, or to an art can give meaning to life. Here, again, arises the question of how long that passion can last.

One of the best though tardiest lessons I ever learned was that regularly setting and reviewing goals is very important. I recall reading, while a senior at the university, an article that related how frequently college seniors commit suicide. Of course, several reasons can cause young people to kill themselves; the later teens and early twenties are emotionally tumultuous years; but what struck me about this article was that it was specifically about college seniors who were soon to graduate. Either the article stated or I inferred (I can’t recall which) that the most likely cause for many of those deaths was that the students had not set any goals beyond college; campus life was all that had mattered to them, and they could not see anything meaningful beyond it.

It is indeed an interesting contrast between S. and N. that while the first sought respite, the latter sought strife (not strife against other people but a continual struggle within the self to make one’s self better). N.’s view is very much in keeping with that of the ancient Athenians. Consider the following passage from Thucydides’ The Peloponnesian War, in which a Corinthian ambassador, while urging the Spartans to aid them in their conflict with Athens, criticizes them for their lackadaisical attitude:

“The Athenians are revolutionary, and their designs are characterized by swiftness alike in conception and execution; you have a genius for keeping what you have got, accompanied by a total want of invention, and when forced to act you never go far enough. They are adventurous beyond their power, and daring beyond their judgment, and in danger they are sanguine; your way is to attempt less than your power justifies, to mistrust even what your judgment sanctions, and to think that there will be no end to your dangers….So they toil on in trouble and danger all the days of their life, with little opportunity for enjoying, ever engaged in getting: their only idea of a holiday is to do what the occasion demands, and to them laborious occupation is less of a misfortune than inaction and rest. In a word, one might truly say that they were born into the world to take no rest themselves and to give none to others.”[1]

As for myself, I believe that the most contented people are also the most active people. To that extent I certainly agree with N. But I also believe that there is a Reality — a spiritual Reality that surrounds us and yet is far beyond our capacity to understand. Each of us must search and discover it on his/her own without over-reaching.

A recent NOVA episode on PBS hosted by astrophysicist Brian Greene reveals how the latest frontier of cosmology has forced scientists into theories they are sometimes embarrassed to present. One of them is that our universe is actually two-dimensional, with an edge to it comparable to a hologram. Also, they say that space is nowhere empty, neither outer space nor molecular space, but that in every part of it “things” are constantly moving, from particles to planets; and that space is not like a vapor but more like a piece of pliable material that can bend and be stretched. Even more counterintuitive: there is no past, present or future; there is only NOW.

I do not mean to imply that all of this new scientific theory-developing is an argument for a higher being: most of the scientists, I think, would deny that absolutely. All I am saying is that, as S. and N. should have done, we should refrain from placing absolute designs on “the real world/universe” until a good deal more evidence is in, probably beyond my own remaining lifespan.

In the meantime, we can each discover our own Higher Power (I read that there must be 7.4 billion of them about now), purposes and life-meanings. Let’s just not try to impose them on others.

Bob Litton, March 1967, reading An American Tragedy in Wesley-PCF office

Bob Litton at Southern Methodist University in 1967.

[1] Thucydides, The Peloponnesian War, trans. Richard Crawley, ed. Sir Richard Livingstone, (Oxford University Press:1943), Book I, ¶70.

Finis

NOTE TO READERS: For some reason I don’t know, WordPress.com (WP) does not allow non-WP bloggers to register “Likes” on my or other WP bloggers’ posts. However, anyone can enter a comment in the “Comment” box and it will be published, after I have “moderated” it. I am inviting non-WP bloggers to comment, even if it is just to say “Like” or “Don’t Like”. And, although I prefer positive comments, disagreeing or critical remarks are fine, too, especially if they might help me improve my writing; but no snarking, please: that’s rude!
— BL

Another Twist of the Kaleidoscope[1]

Greek Amphora

An ancient Grecian amphora: Image Source > Bing Images

© 2016 By Bob Litton. All Rights Reserved.

I am in a strange position right now. On the one hand, I have three topics in my noggin, each deserving extended composition. On the other hand, they all require more research than I have devoted to them thus far, if they are to be “done up” right. Yet it has been eleven days since I published my last post, and my ego is supposing that some regular — but non-“Following” — readers are getting a bit antsy after returning often to my blog site and finding nothing fresh. So, my only recourse is to compose a potpourri of short opinions/insights. (Well, actually there are a couple of other options, but I don’t want to go down that “rabbit trail” right now.)

I

About twenty years ago, in Dallas, I bought a set of classical Greek language texts published by Cambridge University Press. I purchased them because I had been reading translations of the early Greek tragedies and Thucydides’ History of the Peloponnesian War and wanted to read them in the original language. I had noted some editors’ comments that the playwright Euripides, the historian Thucydides, and the philosopher Plato, were superb stylists. I had been a good student of Spanish, French, Chinese, and Old English (Anglo-Saxon), so I did not anticipate much difficulty with Greek, although I figured that the Greeks’ odd-ball alphabet would annoy me for a while. By Zeus, was I wrong! All the diacritical marks, the dizzily varying declensions and conjugations, and the swamping mass of vocabulary to learn frustrated me. I got as far as Section VII (out of XIX), laid my books aside, and went on to other interests. Twice over the next two decades I started the Greek again — at Section I. (I got that one down pat, by the way!)

A couple of months ago, I dove back into the translation of Thucydides and was freshly astonished by the parallels with current events. If you read the Greek statesman Pericles’ oration at the memorial service for the first Athenian warriors killed during the Peloponnesian War, you too, I believe, will be struck by the similarity of Pericles’ claims for Athens’ “exceptionalism” to American politicians’ claims for our homeland’s superior qualities. Thucydides also lays out in bold yet unbiased descriptions the virtues and faults not only of Athens but of Sparta, Corinth, Thebes, Corcyra and other city-states as well. He also analyzes the principal figures through their actions and their motives. The people as a whole are scrutinized with equal clarity. The acts of heroism and of treachery are rendered vividly.

I possess the first two (of four) volumes of Harvard University Press’ Thucydides, with Greek printed on the left-hand pages and English on the right. However, I have delved into the first volume only as far as the first 70 pages. The version I read all the way through, years ago, and am perusing for the second time is the 1874 translation by Richard Crawley, heavily abridged by Sir Richard Livingstone for the Oxford University Press in 1943, during the hottest period of World War II. It is only 388 pages long (not counting two maps and an index) with the pages measuring 9×15 cm. Still, condensed though it is, Livingstone’s offering provides a full sense of the flavor and drama of that conflict — the “world war” of its time. Especially perspicacious is Thucydides’ analysis of the class warfare between the aristocrats and the democrats, which led into the general war. I have excerpted the sentences below from his commentary:

Revolution brought on the cities of Greece many calamities, such as exist and always will exist till human nature changes, varying in intensity and character with changing circumstances. In peace and prosperity states and individuals are governed by higher ideals because they are not involved in necessities beyond their control, but war deprives them of their very existence and is a rough teacher that brings most men’s dispositions down to the level of their circumstances. So civil war broke out in the cities; and the later revolutionaries, with previous examples before their eyes, devised new ideas which went far beyond earlier ones, so elaborate were their enterprises, so novel their revenges. Words changed their ordinary meanings and were construed in new senses. Reckless daring passed for the courage of a loyal partisan, far-sighted hesitation was the excuse of a coward, moderation was the pretext of the unmanly, the power to see all sides of a question was complete inability to act….

The cause of all these evils was love of power due to ambition and greed, which led to rivalries from which party spirit sprung. The leaders of both sides used specious phrases, championing a moderate aristocracy or political equality for the masses. They professed to study public interests but made them their prize, and in the struggle to get the better of each other by any means committed terrible excesses and went to still greater extremes in revenge. Neither justice nor the needs of the state restrained them, their only limit was the caprice of the hour, and they were prepared to satisfy a momentary rivalry by the unjust condemnation of the opponent or by a forcible seizure of power….[2]

Sound familiar? Of course, history does not repeat itself in a symmetrically balanced manner; there are some differences between that situation in ancient Greece and today’s world; but I believe there are more analogous than non-analogous elements, both in our Congress and in the world entire. In fact, I am so enamored of Thucydides’ work that I believe our senators and representatives should be required to take a month-long course with this book as their text, and attain a passing grade, before they assume office, or perhaps even before they run for office.

II

 Do you not know that your body is a temple of the Holy Spirit within you, which you have from God, and that you are not your own? For you were bought for a price; therefore glorify God in your body.
                                                                                                — I Corinthians 6:19-20

If there are any anti-spiritual types out there in Cyberland, I beg your pardon, but I feel a calling to preach a bit here. Oh, don’t worry overmuch; it’s not a fire and brimstone message; really more of an extended pet peeve with an ounce of theology sprinkled on to give it some authority. Although I matured in the Methodist Church and even considered a few times becoming a minister, I argued myself out of it by pointing at the Apostles’ Creed and grunting at the several elements I could not honestly adhere to. But that is all fodder for some later blog post; not now.

The above passage from Paul of Tarsus, however, resonates with me for two reasons. Firstly, it brings forward the image of my favorite pastor during those young years, Clark Calvert: he was my mentor, even a sort of father figure for me, and he used that verse to counsel me. Secondly, I appreciate the image conjured by the verse itself: my body as the eternal residence of the Holy Spirit. To be perfectly frank with you, dear reader, the Holy Spirit is the only Person of the Trinity I feel that I can comprehend and be comfortable with. God the Father is too abstract and paradoxical, especially when I consider the old conundrum about Evil; and Jesus of the New Testament — “The Son” — has too many faces and does and says self-contradictory things, like some protagonist in a Jacobean tragedy. The Holy Spirit, on the other hand, is definitely comprehensible to me; he has a definite, singular role to play: to act as our guide, comforter, and advocate. And I believe He/She/It has done all that for me many times. Naturally, I don’t always respond positively to the nudges, but I recognize my responsibility when I recalcitrantly plunge ahead at the suggestion of my impulses.

But let’s return to the image of the body as the temple of the Holy Spirit. Lately, like within the past couple of years, I have become inordinately conscious of my appearance and, even worse, of the appearance of others. Of course I realize that, aging as I have, I would become more aware of the changes in my body, particularly in my face; giving up three molars during the past twelve months certainly highlighted those changes! I really do not take good enough care of myself, and I cannot fathom why. Is it just laziness or perhaps a self-contempt expressing itself physically?

But it is my view of others that really bothers me. I judge people constantly, especially young people, who, to my way of thinking, have an almost moral obligation to keep themselves in shape and definitely to avoid tarnishing their features with rings in their noses and lips, and with tattoos all over their bodies. What are they going to do, I wonder, when they get older and suddenly realize how tacky they look? One can erase only so much. Enough people are ill-favored, even downright ugly, and I look on them with pity, thinking that Nature has been too unkind to them; but, ironically, many of them found mates, while I remained single.

Then there is the obesity epidemic, which is affecting all generations. I am overweight myself but am gradually losing some of it; I can now get into half a dozen pairs of pants that wouldn’t fit six months ago. However, I can’t see myself as readily as I can others; and the external scene is downright shocking. Especially ridiculous is the sight of the many fat nurses — people whose jobs are to help other people get well and stay healthy. And now, in our small town at least, we have a number of peace officers and criminal justice students who look like balloons. Those people are supposed to be able to chase malefactors, aren’t they? Our modern mode of working is the central villain here: most of our jobs involve a lot of sitting. When I went into the county tax office recently to renew my license tag, I was at once both shocked and amused at the sight of a dozen female clerks who looked like walruses on a beach.

I feel guilty judging others as the above remarks evidence. I can’t change the world to fit my aesthetic and moral values; yet the impulse to judge is almost constant. Sometimes I wish I were blind.

— BL

Postscript: Parenthetically speaking, Paul of Tarsus was not commenting on the Corinthians’ appearance. He was chastising them… actually even condemning some… for the immoral physical actions, such as fornication, of which they were guilty. I think Paul was a bit harsh with the Corinthians, when you consider what he confessed to the Romans:
I don’t really understand myself, for I want to do what is right, but I don’t do it. Instead, I do what I hate.
                                                                                                                    — Romans 7:15

Finis

[1] If you are interested in my first “kaleidoscope” post, look in the archives for “Off My Head”, July 29, 2015.

[2] Thucydides, The Peloponnesian War, trans. Richard Crawley, ed. Sir Richard Livingstone, (Oxford University Press:1943), Book III, ¶83.

