
Class War: A Perspective on Wealth

Wealth distribution in the U.S.A.

Source: Bing Images

©2016 By Bob Litton

wealth, n.
1a. An abundance of valuable material possessions or resources; riches
1b. The state of being rich; affluence
2. Goods and resources having value in terms of exchange or use
3. A great amount; a profusion
— The Free Dictionary (online)

* * * * * * * *

For as long as I have been aware of society’s division into classes, I have hated the whole idea of a caste system, and not strictly because of any income gaps. No, I am repelled by the notion that somebody could believe that he or she is irreversibly superior to me by divine right or some other source, such as a congressional appointment. I got my first taste of the totem-pole culture while in the Air Force, when I learned that I was expected always to be the first to salute and that I must surrender my place in line when even an officer’s spouse came to the postal window. That offended me very much, and I still bridle a little when memories of those incidents come to mind.

Nor do I have any appreciation for the terms “upper class”, “middle class” and “working class”. I guess we are supposed to be thankful that the words “peasant” and “slave” are no longer generally descriptive of people in the United States and most other countries, but that is not enough: the whole class system must be erased entirely. (Sadly enough, slavery — or “involuntary servitude” — is still irritatingly present, although illegal, in my fatherland.) In the United States, class distinctions are generally based not on bloodlines, as they have been in Europe and Asia, but on wealth, although family connections were more noticeably determinative up through, perhaps, the mid-20th century.

During the last couple of decades the topic of “income inequality” has often appeared in newspaper and magazine articles and columns. Repeatedly we read that a very small portion of the U.S. population—the so-called “one-percenters”, those whom Thorstein Veblen called “the leisure class”—has accumulated more than a third of the nation’s wealth, that the next 19 percent possess more than 50 percent, and that the remaining 80 percent of our citizens are left with only about 15 percent of our national treasure. How did that happen?

Now, conservatives like to argue that the super-affluent obtained their riches through hard work, thrift, and prudent investments. To a limited degree that is true for some of the rich, but not, I believe, for all of them. With the exception of “prudent investments”, those attributes cannot logically account for the vast wealth gained by the one-percenters: a person would have to enjoy an extremely high hourly wage to get wealthy through “hard work” (a phrase I heartily contemn). Of course, most of us are aware of the ridiculously high “salaries” and bonuses lavished on corporate executives, even when their companies are losing money and are letting the CEOs go with “golden parachutes”.

In an April 24, 2014, column, Princeton economist and regular New York Times columnist Paul Krugman wrote that the primary route to riches for most of the one-percenters is not “hard work”. Krugman applauded a recently published book by French economist Thomas Piketty, Capital in the Twenty-First Century, in which the author asserts that the affluent get rich not from enterprise but from assets, gained mostly through inheritance. Piketty calls for “progressive taxation as a way to limit the concentration of wealth”, wrote Krugman. Conservative critics have responded with ad hominem attacks, calling Piketty a “communist”, Krugman noted, because they cannot come up with any substantively valid arguments to refute him.

Some roads to riches, however, do involve initiative and energetic endeavor—along with considerable native intelligence. Two of the richest men in the U.S., for instance, started their eventual capitalist enterprises while still in school, with assistance from classmates. Microsoft co-founder Bill Gates wrote his first computer program at age 13 while in prep school and went on to refine his geek skills with college classmates to the point where he could start up Microsoft. Mark Zuckerberg launched Facebook with four college classmates in their dorm rooms. More about those two later.

Many young people of our time, though, seek to win fame along with fortune in either entertainment or sports. The most worrisome thing about this trend, for me, is that only a very small number attain the stature and earnings they had hoped for. And the “earnings” of those who do seem as ridiculously out of proportion as those of the corporate executives.

Harrison Ford, for example, reportedly received from $10 million to $25 million (different sources cite different amounts) upfront for his final appearance as Han Solo in The Force Awakens (2015), along with 0.5 percent of the film’s gross earnings. Contrast that with his contracted base pay of $500,000 for Return of the Jedi (1983), $100,000 for The Empire Strikes Back (1980), and $10,000 for Star Wars (1977). The 0.5 percent of gross sales of course significantly augments those figures.
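To put that half-percent in perspective, consider a hypothetical worldwide gross of $2 billion; that round figure is my assumption for scale, not a number cited above. On such a gross, the backend alone would match the low end of Ford’s reported upfront fee:

```python
# Hypothetical illustration of what "0.5 percent of gross" is worth.
# The $2 billion gross is an assumed round figure, not a cited number.
upfront_range = (10_000_000, 25_000_000)  # Ford's reported upfront fee, in dollars
gross_points = 0.005                      # 0.5 percent of the film's gross
assumed_gross = 2_000_000_000

backend = gross_points * assumed_gross
print(f"Backend on a ${assumed_gross:,} gross: ${backend:,.0f}")
# Backend on a $2,000,000,000 gross: $10,000,000
```

At that scale the percentage points are worth as much as the upfront money, which is why participation in the gross is so prized in Hollywood contracts.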

Another major Hollywood figure, Carole Lombard, was the highest-paid cinema star of 1937, during the Great Depression. Of course her earnings that year ($485,000) did not come anywhere near Harrison Ford’s, but we must allow for inflation. She did earn $150,000 per picture, definitely exceeding the amounts Ford initially received for his first two Star Wars films. The main reason I mention Lombard here, though, is an interesting tidbit I picked up from an August 25, 1938, article in The Mercury. She paid four-fifths of her 1937 earnings in taxes; after that amount plus various incidental expenses, such as her press agent’s fee, her net income was about $20,000, according to the Mercury. “But I have no kicks,” she said. “I am pretty happy about the whole thing, and 20,000 dollars a year is plenty.” She added that she was glad the government was spending the rest on public improvements.

Then there is the music industry. I recall viewing the film The Glenn Miller Story in 1954. It starred Jimmy Stewart as Miller and June Allyson as his wife Martha. I now can recall only three scenes from it, and even those only vaguely. The scene related below is the only one pertinent to this essay. (I transcribed the dialogue from the film as I viewed it recently on YouTube):

Miller’s parents come to visit him, Martha and their infant child in their new home — a mansion for the times. While Glenn and Martha lead his parents up the wide stairway, his father inquires about how his son has managed to pay for the house on a musician’s earnings:
Pop: “Paid for, is it?”
Glenn: “O yeah, yeah, all paid for.”
Pop: “Must be doing pretty well.”
Mom: “O yes, he’s doing pretty well. Don’t we hear him on the radio every night?”
Pop: “That’s only 15 minutes. Don’t suppose they pay very much for that.”
Mom: “Well, there’s the records, and he’s playing at the Hotel Pennsylvania.”
Pop: “How much do they pay for playing on one of those records, son?”
Glenn: “We get three cents a record.”
Pop: “Three cents, huh? Have to sell a heap of records to make it worthwhile, don’t you?”
Mom: “But they do, dear.”
Pop: “How many copies of a record do they sell, son?”
Glenn: “O, of ‘Moonlight Serenade’ we sold about 800,000.”
Pop: “Did you say 800,000?”
Glenn: “That’s right.”
Pop: “O! Heh, heh, heh.”
Now, at three cents a record, those 800,000 copies mean the band grossed $24,000 on that one recording, which may have taken no more than a full eight-hour day in the studio to produce. Of course not all of that went to Glenn: there were the members of his band, and presumably studio rental and sound technicians, to cover, not to mention some income tax. Still, that was just one record; when you consider similar days over a whole work week, Glenn came out pretty well.
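The arithmetic behind that figure is simple enough to check: three cents a record times 800,000 copies.

```python
# The essay's arithmetic: a three-cent royalty on 800,000 copies sold.
royalty_cents = 3
copies_sold = 800_000
gross_dollars = royalty_cents * copies_sold / 100
print(f"${gross_dollars:,.0f}")  # $24,000
```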

The above scene brings to the fore an important question. We may be amazed and even disgusted at the huge amounts recording artists make from their records and live performances; but when we look at it a little more objectively, three cents is a really paltry amount on a single record. It is only when we multiply it by the 800,000 purchasers that the earnings jump significantly. And the world’s consumer market has increased tremendously since the 1930s, when Miller was just starting out. How can we begrudge some musician three cents on a single record?

As far as record sales go, the per-record gain hasn’t increased all that much since Miller’s time. From what I have been able to gather online, the most popular musicians of today—the “rock stars”—receive only 75 cents to a dollar on an album; most of their wealth comes instead from live performances and T-shirts. An average box office “take” for a live performance was cited as between $150,000 and $200,000; but the fee for venue rental can run as high as $50,000, and there are the wages of the truck drivers and “roadies” to pay. Still, one source claims that each member of the rock band Metallica has banked from $5 million to $10 million over the band’s career.

The earnings of major sports figures have become similarly ludicrous. In December 2015, basketball star LeBron James signed a lifetime contract with Nike to act as their brand-enhancer. The exact amount was not revealed, but ESPN reporter Darren Rovell estimated it could be worth $1 billion. In 2014, James had rejoined the Cleveland Cavaliers for $22 million a year. And, Rovell wrote, James “ranks as sixth on Forbes 2015 list of highest paid athletes.” James’ earnings for that year—from endorsements and business ventures as well as from playing basketball—were estimated at $64.8 million.

During the same month (December 2015) that James contracted with Nike, pitcher David Price finalized a $217 million, 7-year deal with the Boston Red Sox. According to Jimmy Golden of the Associated Press, the terms of the contract are that Price be paid $30 million a year for 2016-2018, $31 million in 2019 and $32 million in each of the final three years.
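As a quick sanity check, the year-by-year figures Golden reports do add up to the headline number:

```python
# David Price's reported payout schedule, in millions of dollars:
# $30M a year for 2016-2018, $31M in 2019, $32M in each of 2020-2022.
schedule = [30, 30, 30, 31, 32, 32, 32]
total = sum(schedule)
print(f"${total} million over {len(schedule)} years")  # $217 million over 7 years
```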

Professional sports teams that used to be filled by white men only are now predominantly black. I don’t know what happened to the white guys: Are they physically unable to compete anymore, or are they too racist to engage in the try-outs? As for the blacks, professional sports have become the equivalent of the 1840s gold mines: many apparently dream from childhood of becoming sports heroes, for sports has become their pathway to success and financial security. And the team owners are playing this longing to the hilt: I read an article not long ago that related how scouts have been venturing to a certain country in Africa (which one I don’t recall) to recruit youngsters to come to the USA and show their stuff; unfortunately, a large percentage of the youths who venture westward cannot make the grade, not because they aren’t talented enough but because there aren’t that many positions open.

Another source of over-the-top income is gambling, either in the stock market or in the lotteries.

Oprah Winfrey, for instance, bought 6.4 million shares of Weight Watchers stock in October 2015 for $43 million. Almost as soon as this news item was out, Weight Watchers stock skyrocketed by 90 percent, according to ABC News. Winfrey said she invested so heavily in the company—which was “struggling with declining sales and a looming debt of $144 million” (ABC)—because Weight Watchers had helped her and millions of others with their weight issues. (The prestige derived from her joining the board of directors very likely helped the company, too.) In February 2016, however, Weight Watchers stock declined 29 percent, according to USA Today (Feb. 26, 2016), and as of that date Oprah had lost about $29 million on her investment.

Then there is the much less admirable mode of gambling in which a hell of a lot of poor people engage: the lotteries. As far as I am concerned, this is a national sin. Yet the fact that lottery winnings are so absurdly astronomical testifies to the willingness of many of my compatriots to be gulls. I recall a news story from 1987 about a New York City janitor who won $5 million in a lottery, went to work the next day and, as he was about to climb a ladder to screw in a light bulb, another bulb lit up (in his head). “What am I doing this for?” he wondered. “I’m a millionaire.” Fifteen months later, according to a 1992 New York Times story by Alessandra Stanley, the lottery winner died in an automobile accident; and, since he left no will, his family had to spend a lot of time and money on lawyers, accountants, and income and estate taxes before they could extricate themselves with an estimated $400,000. Many other lottery winners’ stories have reportedly ended in similar Dickensian tragedies, according to what I have seen on the Internet.

But the most ridiculous money-grubbing story I ever read about concerns that silly little ditty known as “The Birthday Song”. If you live in the English-speaking regions, you probably know the words to the song. (I won’t dignify them by calling them “lyrics”, as some people do.) And if you don’t know the words, they are absolutely simple to learn; no memorizing necessary. It goes like this: “Happy birthday to you/Happy birthday to you/Happy birthday dear Nancy (or whoever)/Happy birthday to you.” It is usually sung at birthday parties; Marilyn Monroe sang it to John F. Kennedy at a celebration for him, but it is a universal tradition for anybody’s birthday in my country.

The ditty, you will note, contains only four words with an additional two stuck in as the addressee. And when it was supposedly composed by two sisters in Kentucky in 1893 no thought was taken as to copyright. The two ladies, Patty and Mildred J. Hill, used the ditty simply as a tool for teaching young children to sing. It reportedly first appeared in print in 1912, still without credits or copyright notices. Then, in 1935, the Summy Company registered a copyright. That company was bought by Warner/Chappell Music in 1988, when “Happy Birthday” had an estimated value of $5 million. Groups larger than small gatherings of relatives and friends had to pay royalties to the company for the opportunity to chirp the nonsense. For one such opportunity, in February 2010, the royalties reportedly amounted to $700. According to the Wikipedia article where I read up on this farce, “the song is the highest-earning single song in history, with estimated earnings since its creation of $50 million.” In addition, legal battles over the copyright issue went on for decades until February 8, 2016, when Warner/Chappell accepted a final judgment declaring the ditty to be in the public domain. For an entertaining summary of the lurid history of “Happy Birthday” I refer you to the Wikipedia article.

So, what can we do to “level the playing field” in economic, not sports, terms? Not a whole lot, I’m afraid, for greed and thievery will always be part of the human makeup. There are some proposals and movements, though, that seem promising to a small extent.

One is that old approach FDR applied during the Great Depression, to which I referred when discussing Carole Lombard’s patriotic attitude: raising tax rates on the rich. That is not going to happen, however, as long as the Republicans dominate Congress. Anyway, to me it seems a Sisyphean solution: it attacks the symptom, so to speak, rather than the problem, and it would certainly aggravate the tensions between rich and poor. But it might stabilize the income gap until a more satisfactory solution can be instituted.

A sort of obverse to that approach is what has been termed a universal basic income (U.B.I.), which New Yorker staff writer James Surowiecki wrote about in his June 20, 2016, column. The tactic here is to pay every U.S. adult a stipend of, say, $10,000 a year (children would receive a smaller amount). An experiment on this idea was tried in Dauphin, Manitoba, Canada, in the mid-nineteen-seventies, and, although a conservative government buried it quietly in 1979, later research indicated that while the guaranteed basic income was in force hospitalization rates had fallen, more teenagers had stayed in school, and work rates had only barely dropped.

New experiments on U.B.I. are currently underway or planned in Finland and in Oakland, California, Surowiecki reports. He writes: “In the U.S., the new interest in the U.B.I. is driven in part by how automation will affect workers. Bhaskar Sunkara, the publisher of the socialist magazine Jacobin, told me, ‘People are fearful of becoming redundant, and there’s this sense that the economy can’t be built to provide jobs for everyone.’”

I’m all in favor of a U.B.I., but even it might leave a certain discontent in people’s minds—the yearning to be useful and creative. I am too cynical to believe that every adult in the U.S. has enough imagination and energy to discover and develop a creative purpose or vocation on his or her own, just to preserve his or her mental health. I can only hope I am wrong.

As long as we have businesses and industries that still employ people and that hope to retain their work force for a long period, another approach might fit: Employee Stock Ownership Plans (ESOPs). Actually, ESOPs have been around for years now, having become popular in the mid-nineteen-seventies. According to the National Center for Employee Ownership (NCEO), by 2014 seven thousand companies had ESOPs covering 13.5 million workers. I will let the NCEO describe the system, for they can do it more clearly than I can:

“Similar to Profit-sharing plans, the ESOP is a trust fund into which the company contributes new shares of its own stock or cash to buy existing shares….Shares in the trust are allocated to individual employee accounts. Although there are some exceptions, generally all full-time employees over 21 participate in the plan. Allocations are made either on the basis of relative pay or some more equal formula. As employees accumulate seniority in the company, they acquire an increasing right to the shares in their account, a process known as vesting. Employees must be 100% vested within three to six years, depending on whether vesting is all at once (cliff vesting) or gradual.

“When employees leave the company, they receive their stock, which the company must buy back from them at its fair market value (unless there is a public market for the shares). Private companies must have an outside valuation to determine the price of their shares. In private companies, employees must be able to vote their allocated shares on major issues, such as closing or relocating, but the company can choose to pass through voting rights (such as for the board of directors) on other issues. In public companies, employees must be able to vote on all issues.”
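The cliff-versus-gradual vesting distinction in the passage above can be sketched as a small function. The year-by-year percentages below are illustrative assumptions (a common graded schedule), not numbers prescribed by the NCEO:

```python
# Sketch of ESOP vesting: "cliff" vesting grants nothing until year 3,
# then 100%; "graded" vesting accrues gradually, reaching 100% by year 6.
# The 20%-per-year graded rate is an assumed example, not NCEO's figure.

def vested_fraction(years_of_service: int, style: str = "cliff") -> float:
    if style == "cliff":
        return 1.0 if years_of_service >= 3 else 0.0
    # graded: 20% per year starting after the first year, capped at 100%
    return min(max(years_of_service - 1, 0) * 0.20, 1.0)

for years in (1, 3, 6):
    print(years, vested_fraction(years, "cliff"), vested_fraction(years, "graded"))
```

Either way, an employee who stays six years walks away fully vested; the schedules differ only in how abruptly the ownership right arrives.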

There is more, and important, information in the NCEO statement that might interest you, but I will have to refer you to NCEO’s website to read it, for my essay is already too long and I have a bit more to write.

All the media coverage of the huge disparity between the incomes of the super-rich and the rest of society apparently has had some impact: The Giving Pledge. According to its Wikipedia article, the Giving Pledge’s goal “is to inspire the wealthy people of the world to contribute the majority of their net worth to philanthropic causes, either during their lifetime or upon their death. The Pledge is a moral commitment, not a legal contract.” In June 2010, billionaires Bill Gates and Warren Buffett formally announced “the Giving Pledge campaign” and began recruiting members. By August, forty people had pledged $125 billion. As of March 2016, one hundred forty-two individuals or couples had pledged an aggregate total of $731.86 billion.

A year or two ago, before I had even heard of The Giving Pledge, I read a comment by Melinda Gates (Bill’s wife) in some news article to the effect that she didn’t need a billion dollars to live on and was planning to give some of her wealth away. I have long been suspicious of Bill because of his viciously aggressive business tactics, but I was also pleased by his charitableness: he reportedly has donated many, many computers to children in Africa. I realize that could be a subtle business tactic, too, since it might lead to future purchases of his Microsoft products, but why “look a gift horse in the mouth”? (Come to think of it, the Trojans might have done well to do just that!)

As for Warren Buffett, he has been one of my favorite people for several years now—ever since he urged Congress to make his tax rate higher than his secretary’s. If he approves of Bill Gates enough to associate with him in this Giving Pledge organization, then I guess I’ll have to accept Gates as okay, too.

The top five donors on The Giving Pledge roster are Bill and Melinda Gates ($77.3B), Warren Buffett ($66.7B), Larry Ellison ($49.3B), Michael Bloomberg ($37.2B), and Mark Zuckerberg and Priscilla Chan ($35.7B).

According to the Wikipedia article, “The pledge does not involve pooling money or supporting a particular set of causes or organizations. The pledge asks only that the individual give the majority of their wealth to philanthropic causes or charitable organizations either during their lifetime or in their will….The pledge encourages signatories to find their own unique ways to give that inspire them personally and benefit society.”

I don’t know whether my curiosity derives from good old-fashioned journalistic instinct or from dirty old cynicism, but I wonder what these people’s motives are. Could they be reacting to the threat of a possible new revolution of the French sort? (You might recall that, about a year after the 2010 announcement, the “Occupy Wall Street” movement began in New York City.) Could they be honestly sensitive to the inequity of the wealth disparity? Could they have concluded that a hyper-tax is looming ahead and want to determine for themselves where and how their contributions are to be spent? I can’t answer those questions, and I don’t think it is necessary that I do so. Although the Giving Pledge is not likely to benefit me individually or directly, if it reduces the number of solicitations for contributions that show up in my mail box each December, then I will be pleased.




Bob’s Apology to the Children of the World

© 2016 By Bob Litton. All Rights Reserved.

O little children, how I regret the need to write this letter to you. If we big people had done our duty for many years now, this apology would not have been necessary. You might not be able to read or comprehend by yourselves what I shall say here, so you perhaps should wait until you are a little older and have learned more and bigger words. (I will try to rein in my tendency to use complicated words, but that is very hard to do.) Or your parents might sit down with you and reduce the content to your level of understanding. I doubt that they will, because it could be too embarrassing for them.

I don’t have any children of my own, but there was a time when I deeply wanted a baby. However, I was already past the age when being a good daddy was practicable; and, anyway, I didn’t have a wife. A mommy is just as important in a child’s development as a daddy, usually more so. But my being childless is not really important: I am still just as responsible for our troubles as any parent.

But, let’s get on with the basic message I want to share with you.

The world is in a sad situation right now, both in an environmental way and in a social way. Perhaps the primary cause of that sad situation… (Let me introduce a new word to you here: dire. I would rather use that word than “sad” because, although it contains much the same meaning, it also means more. You see, a situation can be “sad” and yet limited; it might affect only one person or just a few people, and it might be just a temporary mood. “Dire”, however, adds more meaning — the element of threat. If something is a threat then it is neither tied to a mood nor likely to be temporary; it could mean the end of all life, even all things.)

One current threat is Climate Change. The Earth’s temperature is increasing; at least that is what about 300 of the world’s scientists have told us. And many things that we can see, if we look at them, appear to back up the scientists’ claims: the Arctic ice is melting, threatening the habitat of the polar bears and the Eskimos; the coral reefs, on which many sea creatures depend for food, are receding; the schedules and flight patterns of migratory birds are changing; and, perhaps the simplest test of all, the readings on temperature gauges are inching upward year by year. And those are just a few of the observable changes.

Now, a sizable minority of the world’s population refuses to acknowledge these changes or to attribute them to Man’s use of energy sources that come out of the earth, such as coal and oil. And other people, who might recognize Man’s guilt in all this mess, don’t have the political will to do anything about the problem. What hinders them is that taking the urgent actions needed to reverse, or at least moderate, the disaster would require eliminating some industries, such as coal-mining and oil/gas-drilling, which have employed many people — perhaps your daddy or mommy — for a long time. You can understand, can’t you, why your parents, if they work in one of those industries, would fight to keep their jobs? They want to be able to feed and clothe you just as they have always done. And when the cost of a solution closely affects a person’s family, his or her range of vision becomes severely narrowed.

Another threatening element in our world’s scene is tribalism. If you are American, you probably think that only the Native Americans (formerly known as “the Indians”) live in tribes. Actually, though, we are all members of tribes, in that our facial features, skin colors, cultural attitudes, political arrangements, and even spiritual beliefs are shared by varying fractions of the world’s population. Throughout the centuries, tribes have often been in conflict with one another; that is very noticeably the case today in the Middle East, Africa and South Asia. But it is also an issue in Europe and the United States, where mass migrations of peoples who are fleeing oppression and poverty in their homelands continue. When large numbers of them move to any one country, they tend to congregate in the same area so that they can live among others of their own culture and language; thus we have neighborhoods that become known as “China Town” or “Little Mexico”. Large influxes of peoples bringing with them their traditions, religions and other cultural habits appear threatening to native peoples, who want to protect their own cultural norms from alteration. Now, some of the native people — particularly the farmers — often welcome the foreigners, because those refugees are willing to do work that some natives do not want to do. That causes quarrels between the farmers and their urban neighbors.

There are also, naturally, more practical problems that come with mass migrations: how to house, feed, clothe, educate, and provide medical care for the newcomers. The governments in Europe, the United States and some African countries are wrestling with those problems right now. A subtle and dangerous aspect of this social turmoil is the element of racism and religious bigotry involved. Ethnic jealousy and political partisanship are also part of this poisonous mixture. Such a seemingly small matter as whether a Muslim woman should be allowed to wear her religion-prescribed head scarf in some places has engendered debates in parliaments and the media.

Religion itself is a major element in the world’s general conflict. In the Middle East, one branch of Islam attacks another branch over the question of who was the rightful successor to Mahomet as leader of their religion. In China, the government is again trying to extinguish Christianity. And here in the U.S., one political party is working hard to infuse the Christian religion more deeply into our political system; they want to establish Christianity as the official religion of the U.S. In all our conflicts, a primary element is the “us versus them” mentality, and that is especially true of the religious divisions.

Then there is the question of how you children are going to earn a living when you grow up. Robotics and mechanization are already reducing the number of humans who are needed for many types of jobs. In Japan, I read recently, they are already using robots to staff the reservation counters at airports. A batch of sociological studies indicates that many more positions will be taken over by robots during the next 25 years, including those of lawyers, doctors, and news reporters. So, what will you do? How will you spend all your “free time”? How will your food and shelter be paid for? Don’t expect the owners of factories and other businesses or the political officials to care: they want to eliminate the need for human employees because doing so will save them money. Why should they spend that savings on your needs?

Now, I should give credit to those grown-ups who are trying to solve some of the problems I have too briefly described above. There are many individuals, companies and even governments that are altering their practices regarding gaseous emissions from factories and vehicles, which are a major cause of the Climate Change problem. There are also some statesmen who are trying to tamp down the social strife caused by religious and cultural differences.

And there are your parents, who had enough faith in humanity to bring you into the world. I feel some mental and emotional conflict within myself at this point because, on the one hand, I wonder at their wanting to bring children into a world full of direful and daunting difficulties; while, on the other hand, I admire them for their faith and for providing us with you. The solutions will require people — intelligent, energetic and loving people — to discover and put them into practice.

Thus I leave you, Children of the World, with my most heart-felt apology for the messes we have left for you to clean up, and with my earnest hope and encouragement for your success.

Bless you,

Bob Litton

Fragile Civilisation

© 2016 By Bob Litton. All Rights Reserved.

I have been viewing again some DVDs about the course of Western Europe’s cultural history since the fall of the Roman Empire; I bought the set a year ago. They constitute the 13-part documentary titled Civilisation: A Personal View by Kenneth Clark, which was produced by BBC-2 back in 1969. While still a graduate student at SMU, I enjoyed the series when it came to the U.S. the following year. The personable, humorous and brilliant Kenneth Clark immediately became my newest hero.

My description of Kenneth Clark (1903-1983) contains the adjective “humorous”, but I don’t mean by that that he was a comedian, or even that the primary tone of Civilisation is light-hearted. It is in fact often melancholy, even at times somberly prophetic, for the theme of the narrative is that the course of civilisation in Europe has not been an unswerving upward climb: it has declined several times since 476 C.E. (the generally accepted date of Rome’s conquest by the Germanic chieftain Odoacer) and has even slipped into darkness once for several hundred years. Nonetheless, Clark’s comments frequently are interlarded with understated wit, a quality which has characterized many British intellectuals over the centuries.

But wry wit is not my theme: rather, I want to align myself with Clark’s emotional concern about the impending fate of the West—Europe’s of course, but America’s as well. During at least two of his presentations (or lectures, if you prefer), Clark alludes to the very possible extinction of what he chose to call today’s “civilisation”. (This spelling, by the way, is not a typographical error; the British spell “civilisation” with an “s” while we in the United States spell it with a “z”; I have elected to employ the British spelling throughout this essay.) Without being specific, Clark points to recent events as portents of another dip in humanity’s cultural development. I still don’t know what he was referring to: the Cold War? modern art? mechanization? materialism? political corruption? Here and there in the episodes he mentions all those and other fault-lines, as well as the constant, congenital “fragility of civilisation”. But if there is any single danger to current civilisation that he considers our immediate nemesis, I am not certain which it is.

Early in the first episode of Civilisation, Clark conceded that he couldn’t define civilisation…“yet”. Then, playing on the cliché about the philistine who at first demurs when asked what to him is “fine art”, Clark adds “…but I know it when I see it.” He later makes the same remark about “barbarism”. Soon thereafter, however, he lists several attributes of his subject: “intellectual energy, freedom of mind, a sense of beauty and a craving for immortality”. Still further on in the series, Clark adds stability, confidence, prosperity, order, and broad participation in society. And even further on, Clark describes a civilised society as “intelligent, creative, orderly, and compassionate”; but these latter qualities are not simply what create civilisation, they are also what are necessary to sustain it. Nomadic peoples, such as the Vikings for instance, although supremely confident and adventuresome, could not develop a civilisation, according to Clark’s definition, because they were unstable and saw no value in maintaining anything other than their tools for survival: in the case of the Vikings, their ingenious ships. And the highly cultivated society of 17th century France could not last because the portion of the population which participated in it was too small.

I perhaps should mention “light”, since Clark asserted that light “can be seen as the symbol of civilisation.” He is referring to the light of reason, education and accumulated knowledge as well as to the light that was so typical of Dutch painting during the 17th century and to the light studies in 19th century French Impressionism. His appreciation of light is almost mystical.

Although Clark does not name any singular major threat that confronted mid-20th century Western Europe, he does specify what caused the luster of previous cultures to fade: fear of war, plague and the supernatural; boredom; exhaustion; and insularity.

At the end of Clark’s cultural tour he confesses himself to be a “stick-in-the-mud”, by which he means that he holds onto several values and beliefs which have been abandoned by some other modern intellectuals. Peace, he says, is preferable to violence, and knowledge is preferable to ignorance. He adds that he cherishes courtesy and compassion. And above all he advocates for the recognition that we humans are a part of Nature’s big picture, not separate from it, and that we should view other animals as our brothers and sisters, much as Saint Francis of Assisi did.

Now, to the present. I have my own personal issues with which to cope, issues that no one other than I can resolve. But I also share in many, and in some ways starker, issues that confront Americans as a whole and others that are faced by everyone on this planet, whether they are aware of them or not. What makes these problems seem especially intractable is that they are typified by paradoxes and dilemmas.

Recently, for instance, I heard an interview on National Public Radio in which the interviewee was the author of a book about the psychological disturbances that afflict many military service people when they return home from places like Viet Nam and the Middle East. These disturbances we have classified as “post-traumatic stress disorder” (PTSD). The author, who is himself a veteran of the Iraq conflict, claimed, however, that that classification is inaccurate, at least in his case. He said that the problem evolved not from having been in a combat situation but from leaving it. Coming home to a “stable” environment had made him feel marooned, so to speak. On the battlefield he had been in the company of men who depended on each other every second for their survival; when he got home, he felt isolated because of the separateness and indifference he saw all around him. In another NPR interview, a woman who had survived the horrors of the ethnic war in Bosnia during the 1990s said she was ashamed to admit it, but she now yearns for those days because people cared for each other at a very deep level. During that same interview, mention was made of how the murder and suicide rates in New York City steeply declined immediately after 9/11.

I cannot accept the notion that the cohesion of society—of civilisation—depends upon war and other calamities.

For any of you who are interested, you can view Kenneth Clark’s Civilisation: A Personal View documentary on YouTube…at least as of May 29, 2016.




Yesterday, Today, Tomorrow

© 2016 By Bob Litton. All Rights Reserved.
When a person reaches his 76th year he can develop the notion that, because he has lived through — even studied — much history, he has accumulated a dense patina of knowledge in his brain; yet he also feels afflicted by the suspicion that he does not know how to apply it. I recall in my youth enduring various puzzling illnesses and mechanical problems; after healing or correcting them by learning the causes and applying the proper treatments or techniques, I would say to myself, “There now, in the future when I come across this situation again, I will know what to do!” The only problem with that assumption is that the illness or mechanical failure seems never to repeat itself. There is always a new puzzle to ponder. Because of a few such episodes in my recent past, the idea of composing this essay flowered in my brain.


Einstein said that time and space are the same. I take that remark to mean that if I get up from this chair and walk over to my bookcase, about fifteen feet away, I will be walking into the future; and that if I turn around and walk back to my chair, I will be walking into the past, because I am going the same distance, over the same area, over the same period of time, only in reverse — just like the “rewind” device on my VCR. But I don’t feel that to be the case, for I have aged infinitesimally during both transits. (I wonder, to render this example valid, would it be necessary for me to retrace my steps backward rather than doing an about-face and proceeding forward again but in the opposite direction?)

I’m a very time-sensitive person, and the only place I feel that I am delving into the past is in the memory sections of the brain (the pre-frontal lobe [short-term] and the hippocampus [long-term]). Of course there are extant, exterior entities, such as an old photo or a “golden oldie” sound recording, even a scent, that can stir and augment memories.

A strange aspect of some memories is that they have made me imagine that the events which they relate still exist. Those particularly vivid memories, though very transient, are so palpable as to make their events’ extinction seem improbable. When I had such a memory unfold in my mind one day recently, I wondered where I would have to search to recover the event itself; but I quickly shook off that notion after realizing that every event has preceding and subsequent events, and I could not bring back that singular, desirable scene without also summoning its past and future. That enterprise would require a time machine.

Before you summon the guys in white coats, consider a few sentences from an article in last January’s Harper’s magazine. Titled “What Came Before the Big Bang?”, the essay was written by MIT physicist and novelist (what a combination) Alan Lightman. Actually, in the sentences I will quote here, Lightman is referring not to his own cosmological theory but to one being investigated by another MIT scientist, Alan Guth, and California Institute of Technology physicist Sean Carroll. Their hypothesis, known as the “two-headed time” theory, according to Lightman, proposes that the order of our universe, then much smaller than an atom, “was at a maximum at the Big Bang; disorder increased both before and after…. (T)he forward direction of time is determined by the movement of order to disorder. Thus the future points away from the Big Bang in two directions. A person living in the contracting phase of the universe sees the Big Bang in her past, just as we do. When she dies, the universe is larger than when she was born, just as it will be for us. ‘When I came to understand that the reason I can remember the past but not the future is ultimately related to conditions at the Big Bang, that was a startling epiphany,’ said Carroll.”

Lightman compares the expansion and contraction phases of the universe to that children’s toy, the “Slinky”, which, as he points out, “reaches maximum compression on impact, and then bounces back to larger dimensions. Because of the unavoidable fluctuations required by quantum physics, the contracting universe would not be an exact mirror image of the expanding universe; a physicist named Alan Guth probably did not exist in the contracting phase of our universe.” Still, there is always that wiggle room left by “probably”.

Lightman describes a few other theories of the “origin” of the universe, none of which allow for the notion of time and therefore do not consider “before” and “after” and therefore are outside the province of my essay here. However, I do want to bring in one more analogy that Lightman uses to characterize the expanding/contracting phases of the universe: a movie of a glass dropped on a tile floor, shattering, then recombining and flying back up to the table top from which it fell. If I think of the glass shards as events in my life, and of the possibility that they are scattered now out there in the vastness of space/time, and that they might someday in the far-off future recombine to become those events again, then my dream, as I related above, of summoning memories is not so absurd as you readers might have judged earlier. Heh?


We are frequently advised by gurus of various varieties to “live in the moment” in order to be happy. Who am I to argue with that formula? Only it doesn’t work for me. Why?

Well, I think it’s partly a function of Fate: I don’t have any choice but to live in the moment, yet the present seldom smiles on me, definitely not for more than a few hours. The present, in fact, seems like the target on a dart board at which missiles are continually hurled. I keep looking for that day when I can proceed from arising to retiring without some encounter that is, at the least, irksome or, at the most, catastrophic. I can’t recall the last time I gamboled through such a day, although I feel certain there have been some, quite likely many such. They were just too long ago. (And here, I see, I can’t even write about the present without bringing in the past and the future: depending on one’s definition of “the present”, it seems impossible to separate it from those periods. Is the present this day, this experience, or really just this “moment”?)

Another problem with the “live in the moment” prescription, not just for me but for every adult, I believe, is that even in our most positive moments we have to consider future events: college, career, possibly marriage, elections, and retirement funding. A host of other, smaller concerns requiring decisions are scattered through our lives. As one old humorist expressed it, “Why does any man examine the teeth of a horse he is thinking of buying and yet forgo checking out his prospective bride’s teeth?”

Laying all that aside, just how do I confront the present? That is too big a question. I mean, in this time I cannot ignore the fact that many of the problems I have to face also stand before almost everybody else: crazy politicians rattling their sabers, oncoming weird weather disasters and famines, fanatical gun toters, out-of-control medical and housing prices, etc. I can’t limit all those problems to myself. The conundrum, then, for me is: How can I separate out what affects only me from what affects everybody else? I cannot totally and sensibly demarcate those boundaries. Yes, there are a few somewhat private health issues which I have, but even they, as types, also afflict at least some small portions of the population; how I weary of hearing a “comforting” friend utter, “Oh, that’s just part of getting old!” or “Yeah, that stuff has been going around lately!” Why cannot my current problem be mine…individual…alone?


My, how the calendar has shrunk! It used to be the case that when someone reminded me that some event took place last year, I could imagine an expanse of time with body to it. Now “last year” seems like what we once-upon-a-time called “last month”. I’m not sure whether this change is due to aging in me or to a more encompassing phenomenon which Alvin Toffler described as the perception of “too much change in too short a period of time” in his book Future Shock back in 1970. If the latter, then it is really weird how external events can cause one’s notion of a calendar period to shrink. There is now a whole “scientific” field of people — called “futurists” — who gather data from a large array of sources to predict what the future holds. Simple crystal balls and astronomical charts are passé.

When one reaches an age as advanced as my own, he or she is confronted with the reality that their options have greatly shrunk. There is no point in our seeking another academic field or degree, although we might have fun and benefit from taking a “continuing education” course occasionally. And we might look at our overloaded bookshelves, count the books we haven’t read, and resolve for the nth time never again to enter that bookstore a few blocks away. I swear! I must have bought all those books just because they are so decorative! Oh me, oh my!

Nor are we to get married…not sensibly anyway. Oh, if we are very wealthy and are seduced by a beautiful, young “honey-pot” into being her “sugar-daddy”, we might find ourselves wandering down that church aisle or into that Las Vegas drive-thru wedding chapel. Or, if we are much less affluent, we might marry someone nearer our own age just because we each anticipate the other will at least nurse us through our final days. The latter case is a little less contaminated by folly or predatoriness but does retain something of the strategic about it: no hot romantic blood there, certainly.

But there are other issues that affect not just me and my ilk but many, many other Americans. The news media daily remind us that we have several potential, horrifying fates lying in wait for us — climate change disasters; the threat of religious and political radicalisms; dissolution of welfare programs, including Social Security; Alzheimer’s disease; and Donald Trump as president. And those are just the severest ones. We should not let them overwhelm us, to the extent that we can prevent them from doing so; yet we still have to pay attention to them in order to prevent them, or at least to defend ourselves against them. So, in that very deep sense we are attached to the future.




Goodbye, Tooth Fairy!


©2004, 2016 By Bob Litton. All Rights Reserved.

NOTE TO READERS: The article below was written back in 2004. At that time, I submitted it to one of our local weeklies. The publisher/editor never printed it. I did not ask him why, but I supposed, with substantial grounds, that his reason was that it was “soft news”; i.e., material that had no immediate relevance for the populace but was a rather small matter that yet could in fact disturb them: they might avoid their local barbers and dentists. Also, while he puts out the best chronicle in the three-county area in the sense that his reporters cover “hard news” (governmental, political and social events) more fully and accurately, he doesn’t have much appreciation for feature articles or what used to be called “familiar essays” (a common element in the “Talk of the Town” section of The New Yorker), which are my forte. So, this article has been stuck in my files all those years, yet I believe it still makes engrossing matter for the intellectually curious reader.

     I have altered the names somewhat, reducing them to initials, because I did not have permission from the subjects to include their full names, although they knew I would publish the article sometime, somewhere. Also, the barber retired half a dozen years ago.


◊  ◊  ◊  ◊  ◊  ◊

There’s new news and there’s old news—but they are not always so simply distinguishable.

Take for example a recent trip to the barbershop for my monthly trim. I went to K. N.’s barbershop. Usually, one of the other barbers cuts my hair, but on this day I had the honor of K. N. himself shearing my mane. After he had done the basic work, he pressed out a palmful of lather and smeared it on my neck. It had been I don’t know how many years since anyone had done that.

“Since when did you start shaving the neck?” I asked. “I thought shaving was out ever since the AIDS scare happened.”

“Oh, we’ve got these stainless steel razors now,” he said. “I used to use Solingen steel blades from Germany. Other barbers used Sheffield steel from England. But they both had pores in them that retained blood. Now I use stainless steel. And I use it only one time.” The stainless steel blades, we discovered after looking at a box, are made in the U.S.

The State of Texas Barber Board, K.N. told me, sent out new regulations about ten years ago ordering barbers to quit using the porous razor blades. They also had to get rid of their strops and hones.

K.N. said he doesn’t offer shaves, even though they would be allowable with the stainless steel blade. He quit shaving years ago, he said, “because people have skin blemishes—like moles. And when you lather a customer up you can’t see the moles.”

About ten years ago was also when the Centers for Disease Control and Prevention, or CDC, sent out regulations telling dentists to modify their practices in the interest of reducing the potential for transmission of HIV, according to local dentist J.F.

The regulations were a response to the case of a dentist in Florida who a decade ago allegedly infected five patients with the AIDS virus, J.F. told me. However, he said, all five cases involved different strains of the virus.

My conversation with the doctor about AIDS developed when I went to see him about tender gums. As I sat in the chair I noticed that the ordinary chairside spittoon was missing.

“Where’s the spittoon?” I asked the dentist’s hygienist as she was sticking a tube in my mouth.

“Oh, we can’t use those anymore,” she said, “because of AIDS.”

What she had stuck in my mouth, J.F. later told me, is called a high-speed suction tube. It removes all that saliva and blood we used to spit into the spittoon. Also, J.F. said he has a line separator in his alley so that there is no possibility of backflow.

The doctor told me the body of regulations controlling dental practice these days is voluminous. And some of them are ridiculous, he added. “The virus lives only minutes—some people say less than a minute—out of its moist environment,” he said. “But the regulations are so stringent; we can’t even give a kid his tooth to leave for the Tooth Fairy*. That tooth has to be treated as ‘medical waste’.”

While I was still there, J.F. called up the CDC to get a more definite fix on how long HIV can live outside its fluid environment. However, they refused to give him a specific time period and said only that when the virus dries out it dies. They added that the hepatitis A virus could live several weeks in the open air before dying. (They obviously didn’t want to give the Tooth Fairy any wiggle room.)

So you see how a news story that began back in the early 1980s, when Ronald Reagan was president, continues to ripple into the 21st century. And how our daily lives are continually and probably forever changed in the minutest of ways by the event that created the story.

*Fairy: I don’t know how widely the folklore of the “Tooth Fairy” extends, so perhaps I should relate it briefly here. Children’s “baby teeth” begin to drop out at about age six. Generous, loving parents sometimes tell their child to put a dropped tooth under their pillow so that the Tooth Fairy can remove it and replace it with a small coin, such as a dime.


The Doldrums

Dear Reader,
I don’t know if this message is relevant to you or not. If you are a WordPress blogger who has boarded my blog as a “Follower”, then it most likely does not relate because you receive an email flag every time I publish a new post. And, if you are a new visitor to my site and do not expect to return, then it does not relate in your case either. However, if you visit me occasionally and intend to return for more reading pleasure but have not signed on as a “Follower”, then this message is definitely addressed to you.

All that wind is the preface to my admitting to a lack of wind in my sails right now, so I have been merely tossing on the waves here on the pond of “The Vanity Mirror”. I will continue to be idle at least for a little while longer, for I am in “the doldrums”. I looked that word up in an online dictionary to be sure I was using it appropriately, and I certainly was:
(dōl′drəmz′, dôl′-, dŏl′-)
pl.n. (used with a sing. or pl. verb)
1a. A period of stagnation or slump.
1b. A period of depression or unhappy listlessness.
2a. A region of the ocean near the equator, characterized by calms, light winds, or squalls.
2b. The weather conditions characteristic of these regions of the ocean.

All the above definitions are at least poetically pertinent, but the one describing “a period of depression or unhappy listlessness” fits my present condition best. I have had to contend with several enervating problems lately: a plumbing problem that resulted in a squishy carpet in my living room, tugs of war with so-called medical providers, and a next-door neighbor who is dying of lung cancer. (As a former journalist, I have seen several dead bodies; and I visited my mother in the ICU ward when she was dying from heart failure; but this is the first time I have seen the outward evidences of a body attacked in various ways by a vicious, ugly disease.)

I know I have published posts in the past that I composed as I wrote — “off the top of my head”, as the image goes —; and the problem is not that I do not have topics at hand (even a significant amount of research done on a couple); but I don’t have the emotional energy right now to produce. Hopefully, by the end of this month I will be ready to move forward.

The above sentences might strike a few of you as arrogant, as if I were imagining a line of people waiting at the ticket window and being told that the show has been cancelled tonight. Well, although that is not really the case, it isn’t far from the truth either. I always look at my “stats page”, and I regularly see small flags from certain countries which I have begun to perceive as frequent visitors who, for whatever reasons, do not wish to click the “Follow” button. So, you see, this post is essentially my apology and explanation to them. Please be patient and come back. In the meantime, there are many prior posts in the “archives” to peruse — three years’ worth — and they don’t die just because the ink is dry.

Best regards and thanks to all of you!

The Dismal Science: A.K.A. Economics .001, Part I

© 2015 By Bob Litton. All rights reserved.

Dear readers, I originally had intended to make this essay a single exposition. However, as I got into it, the ideas, emotions and resultant words multiplied like the sorcerer’s apprentice’s brooms, so I decided I had better break it into at least two, and possibly more, essays. I wanted to get the rocks out of my gut. As I write this preamble, I am nearly through with Part I. The composing procedure I have adopted is thus far slightly rambling; as you know, I have a tendency to write “off the top of my head”. But rest assured the substance is not fluff: I have been observing and pondering the economic scene in my homeland for many years now. O God, please help her!

I wrote in one of my newspaper columns decades ago that I have an aversion to economics, which has been dubbed “the dismal science”. But, like nearly everyone else on this globe, I cannot escape its various impacts on my life. I was thinking of excluding among the victims the Inuit in the Arctic and the forest dwellers in the Amazon jungle, but we have read recently of how climate change is melting away the tundra ice and thereby eliminating the surface on which some Inuit have their homes, and how some greedy gold, oil and farmland seekers are invading the Amazonians’ habitat; so, even those reclusive tribes are not excludable from the modern economic seine.

Over the past few years I have tried to organize in my mind the little I know (or think I know) about our capitalist economic system. I tried back in my early manhood to read a textbook designed for Economics 101 courses. I don’t think I got past the first chapter; any academic field that employs graphs and mathematical symbols is over my head. I just checked and saw that Wikisource has produced an apparently accessible English translation of Karl Marx’s Das Kapital, which I will probably at least scan while working on this blog post, but I don’t want such reading to interfere much with my lifetime’s impressions.

That connotation-rich appositive above, “the dismal science”, evokes in my mind two images: (1) the candle-lit laboratory of the mad scientist or wizard, and (2) an above-ground intellectual field so dense and counter-intuitive as to be depressively hindering to the mathematically challenged person (me). I really believe that both meanings are appropriate, even though most people would say that the second is the one intended.

The usual wizard of folklore favors “black magic” that is intended to harm others, and he often is secretive, hiding either in some secret cave or in a vacant castle, in its dungeon or in its tower. According to a PBS program I saw decades ago, Leonardo da Vinci, probably the historical human being who epitomizes such a scientist, reportedly built, in secret, a very large telescope in a tower. Of course, he did not build it to harm anyone; he built it to satisfy his insatiable curiosity about one of his many interests: the stars and planets. The Church, however, did not see it that way; they preferred to view the whole enterprise (when they found out about it) as diabolical. Curiosity has for centuries been a bugbear. (The fact that Leonardo was left-handed did not help either.) A much more homely wizard, the king of Id’s magician, quite often makes mistakes that temporarily harm or at least embarrass him. And the ancient Chinese sorcerers experimented with many plants in their search for the elixir that would spark an eternal — or at least an extended — lifespan; their experiments yielded some poisons, for which some paid with their lives, but they also contributed a good deal to modern pharmacology. So, I see all that as indicating there is no way to avoid the bad in any science, no matter how good the intention and the final result might be. (Examine all the side effects on your medicine labels.)

And now some few words on the second image. There are many outspoken economists these days, some of whom regularly contribute op-ed articles or columns to our magazines and newspapers. Years ago, I used to enjoy watching John Kenneth Galbraith argue with William F. Buckley Jr. on the latter’s PBS show, “Firing Line”, although I did not completely follow what either said. The two economists I follow now, whenever I happen to notice their columns, are James Surowiecki in The New Yorker and Paul Krugman in The New York Times. Both are, I believe, liberals; Krugman has acknowledged as much. Although I do not grasp the thought processes that lead them to their conclusions, I agree with both men most of the time. (I cannot recall any instance when I balked at one of their conclusions, although there might have been such an occasion.) They are pretty good at speaking to my level of comprehension. Some of that stuff is bound to have sunk in.

Still, there are some economic facts which I have never understood and have been disinclined, until now, to research. The oldest such matter concerns the old, hot debate regarding the “gold or silver standard” of the 1890s. From my easy chair vantage point, I could not fathom what difference it made which standard was adopted; and, since it had been settled long before I was born, I did not really give a damn. Yet, a related question continued to nag at me: Why do we marry our currency to some pretty but basically lifeless metal anyway? A barter system — what Marx called the “exchange value” of products and services — although complicated to institute, would be much more natural and true to life.

Now I would like to present my overall view of how economics rules our lives.

The world is divided into two classes of people: capitalists and workers. The former invest their money in some enterprise, either a start-up company or an existing company. Usually, the money is bet on an existing company that has already proven it can float; however, venture capital placed in an experimental or innovative effort is not unheard of, and, since those investors are “getting in on the ground floor”, when the cost of “shares” is lowest, they have the best chance of “earning” a large “return” on their investments.

Just as with the wizard’s experiments, any number of hazards can cause problems for the capitalists: the company might be part of a Ponzi scheme, a CEO’s errors in judgment can diminish or even destroy the company, the company could be swallowed in a “hostile takeover”, faulty or inferior products might cause the company to lose out on desired governmental contracts or have to recall products, an extended employee strike could squeeze productivity and thus profits, and so on. Some capitalists these days, e.g. Donald Trump, Warren Buffett and Mark Zuckerberg, have gained small fortunes in a single day’s stock market activity and can afford to lose the same amount without missing a step. But for many other investors — the much smaller ones — a comparable loss can eliminate their life savings. Capitalists, large and small, are the people who define themselves as “the creators of jobs”.

So, who assumes the jobs? Who actually gets the work done? That’s the workers. What I hate most about this separation into “capitalists” and “workers” is that it has been extended from the situation at the factory to that in social spaces: homes, restaurants, theaters and even churches. Of course, social class distinctions in the latter places have been moderated somewhat during the past century, but they still exist in hardly less apparent ways, what with the current movement toward “gentrification” of neighborhoods, the robotization of an increasing number of occupations, and the constant media attention on celebrities — the vast majority of whom are wealthy entertainers, athletes, foreign royalty, and entrepreneurial billionaires. Now we are debating the fairness of income inequity and slicing up social classes into one-percenters, (vanishing) middle class, and working class. Yet the working class and those even lower on the social ladder are reluctant to revolt against this pernicious system because they view the elites as models of what they might become, if only they can get that football or basketball “scholarship” or get an agent to notice them on “open mic nite” at the local bistro or if one of the lottery tickets they are buying today with half their paltry paycheck will just vault them over the rung where hangs that middle-class person. They don’t want to destroy those privileged positions, because they want to attain them.


NEXT (I hope): A more detailed look at the two major classes.
