Monday, January 31, 2011
If April is the cruellest month, as T.S. Eliot has said,
Then January is the hardest, its nights the gloomy darkest.
Snow does ensconce us all around, and it is hard to leave the bed.
If April stands for waste, although dull roots it stirs and lilacs breeds,
This month only weaves its icy laces and forms hard burrs in snow.
January has its magic days when bright sun reflects from white.
It is a month of resolutions when we decide to bend our
Hearts toward the light—wherever we might actually find it.
But as this month has only hours left to go, something in me stirs—
My spirit. Good-by, Cold Number One. It’s Number Two tomorrow.
Sunday, January 30, 2011
Happy 50th Anniversary
In another context entirely, I happened across a fact worth celebrating. This January marks the fiftieth anniversary of a cartoon feature in Mad Magazine, Spy vs. Spy. The cartoon appeared for the first time in issue number 60 of the magazine, in January of 1961. It is still there, still making us chuckle wryly. I’m sure I’m not the only one for whom this cartoon, second only to the Alfred E. Neuman image (“What, me worry?”), became emblematic of that great magazine. The creator of the cartoon was Antonio Prohías, a Cuban. Prohías fled Cuba as Castro rose to power there. He left the country, Wikipedia here tells me, just three days before Fidel took firm hold of the free press on that blessed isle. Cuba’s loss was a gain for collective madness. History might well view Mad Magazine as the defining publication of the twentieth century. And Prohías, in my mind, belongs to that most venerable of lineages: the jesters of their time who, always with a wink, tell it like it really is.
Labels:
Humor,
Mad Magazine,
Prohias Antonio,
Spy vs Spy
Friday, January 28, 2011
The Downside of Abstraction
Many years ago—and I mean many—I was actively involved in supporting a project to import liquefied natural gas (LNG) from Arzew in Algeria to the American East Coast. My employer then, an engineering firm, had pioneered a novel way of containing huge masses of LNG in excavations in the ground, employing cryogenic equipment to keep suitable clay frozen as it held this strange but valuable substance. Back in those days, in the oil fields of Algeria, natural gas was a waste product, simply flared to the skies in order to get rid of it—and this project intended to capture that waste, concentrate it by freezing and compression, and deliver it to a market where it was in demand. A story on the New York Times business page today therefore drew my attention. This is a case of an entrepreneur hoping to export natural gas from the United States to Europe and to Asia—where prices for the gas are twice what they are here.
My mind then, as I pondered this situation in the light of recent economic and resource events, produced an observation. “The downside of abstraction”—I found myself spontaneously thinking. It took me a moment to unwrap this thought. Gas and energy were obviously in the background—but so was China’s recent curtailment of its exports of rare earths, vital as these are for the coming post-petroleum age. Also present behind this incomplete slogan was President Obama’s promotion of exports in the State of the Union—and the fact that the U.S. Department of Energy had already approved the entrepreneur’s intention to export natural gas from these shores. And then I’d finally parsed the thought. What my unconscious had produced was an insight. In this country we’ve become so accustomed to valuing the abstraction of money that we’ve become blind to the concrete character and value of natural resources. Therefore the Department of Energy, which should be principally concerned with our energy future but which, like every other institution, reflexively worships abstraction, had approved shipping away resources we need right here to see us through the gigantic transformation into a fossil-free age.
Money is certainly a valuable tool—but its ultimate meaning resides in what it denominates concretely. If we don’t maintain this distinction—as the Chinese certainly do—we’re bound to make mistakes we’ll later regret.
On the same page of the paper Brigitte also highlighted a story about investor millions rushing into funding social networking sites. This story talked about LinkedIn hoping to go public. “All this eager investment in ephemera?” she jotted in the margin. And then, using a huge curved arrow, she pointed to the natural gas story by way of contrast. I saw the paper after she had read it. Hence my spontaneous production of that headline contains within it the linkage between these two stories—and shows you that socially networking with your mate will produce blog posts. But nobody is sending me any money.
Thursday, January 27, 2011
You’ve Come a Ways, Henrietta
In looking up how distance is measured in astronomy, I came across something that tells about change in our times. The person who discovered that Cepheid variables can be used for measuring distance was Henrietta Swan Leavitt (1868-1921). I could not find an article on her in my (new) 1956 Encyclopedia Britannica but did find one in my 1989 World Book Encyclopedia. I recently got my EB from the Magees as a Christmas present. Some Christmases back, John also gave me a very fine biography of Rosalind Franklin (1920-1958). Does the name ring a bell? With James Watson and Francis Crick she was a co-discoverer of the structure of DNA; they got the Nobel and she did not. Okay. Nobel Prize rules insist that the award can be given only to living persons, and Rosalind had died by 1962…
Variable stars pulse in luminosity, and Leavitt discovered that those with longer periods between bright and dimmer phases are more luminous than those with shorter pulses. The earliest of these were discovered close enough to us that we could measure their distances using simple geometry—the parallax method. How that’s accomplished is shown here. We could therefore correlate changes in measurable brightness, short and long pulses, and known distances. Later, by extrapolation, just a reading of luminosity and period enabled us to use Cepheids for distance measurement even when they were too far away to measure by parallax.
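For readers who want to see the arithmetic behind that extrapolation, here is a minimal sketch in Python. The calibration constants below are one published approximation of the Cepheid period-luminosity relation, used purely for illustration, and the calculation ignores real-world complications such as interstellar extinction.

import math

def cepheid_distance_parsecs(period_days, apparent_mag):
    """Rough distance estimate from a Cepheid's pulsation period and mean apparent magnitude."""
    # Leavitt's period-luminosity relation (illustrative calibration):
    # the longer the period, the more luminous the star.
    absolute_mag = -2.43 * (math.log10(period_days) - 1.0) - 4.05
    # Distance modulus: m - M = 5*log10(d) - 5, with d in parsecs.
    return 10 ** ((apparent_mag - absolute_mag + 5.0) / 5.0)

# Delta Cephei itself: period about 5.37 days, mean visual magnitude about 3.9.
# This sketch yields roughly 290 parsecs; the accepted distance is near 270.
print(round(cepheid_distance_parsecs(5.37, 3.9)))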
Franklin captured DNA’s structure in an X-ray diffraction image—and that image then greatly helped Crick and Watson to zero in on the structure. The image, known as Photo 51, is available here.
Labels:
Astronomy,
DNA,
Franklin Rosalind,
Leavitt
Wednesday, January 26, 2011
Bananagrams
One of Monique and John’s Christmas gifts to us was Bananagrams, a Scrabble-like word game—and in selecting this item they once more proved that it’s often the small gifts (this one around $15) that bring the most lasting pleasures to the recipients. Ever since the 1960s, our whole family has been much devoted to My Word, a word-guessing game played on paper. Over the decades we’ve expanded, indeed much improved, the game. We have versions where one can guess eleven-letter words. Brigitte and I also go through crossword puzzle phases—printing out three at a time from various sources and then solving them, usually in the summer, out back in the shade of the umbrellas. We’ve also made crossword puzzles—of which the most famous was a New York Times-style Sunday crossword puzzle prepared for the occasion of my Mother’s eightieth birthday. Bananagrams fits right into this matrix. The game offers a puzzle, a creative challenge—and you can easily play a round over lunch, for instance, without taking too much time.
The game consists of 144 lettered tiles. Fifty-four are vowels, E having the greatest frequency (18). J, K, Q, X, and Z are minorities; each appears only twice. The object of the game is to use all the letters a player receives in a crossword-style, Scrabble-like formation.
We’ve devised our own version with our own rules for measuring excellence. We divide the tiles in half, and each of us then uses 72 tiles to make a puzzle. The result is judged by the density of the structure—the smallest possible square or rectangle—and by the average length of the words: the higher the average, the better the score. Here is the winner from yesterday’s game, laid by Brigitte.
This layout turns out to fit into an 11 x 11 square, thus 72 tiles occupying an area of 121. Seventy-two, of course, is the virtually unreachable perfection—no blanks at all. Next we calculate the average word length by counting the total words (22) and the tiles they use (107) and dividing tiles by words. Here the result is 4.864 (best to use three decimals). Believe me, that’s a pretty high score. This puzzle has one 10, two 9s, one 8, one 7, and three 6-letter words. That’s as far as we’ve come in developing a scoring system.
We permit any formation spelled correctly, including abbreviations that might appear in crossword puzzles. You might take exception to YA in the lower right, but a crossword creator would clue that as: “Fully grown person but not yet middle-aged (abbr.)” And yes, I noticed: troglodyte is misspelled. But that could be fixed. Challenge to the reader: how would you do it? By the way, it can be done while still staying within the 11 x 11 square; the word count would remain the same, and the unsightly doubling of AXIS would also be fixed.
Abraham Nathanson, a graphic designer, created Bananagrams and introduced it in 2006—thus virtually yesterday. Nathanson passed away at age 80 in 2010. Thanks, Abraham. Nice contribution to the peace of the world.
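For anyone who wants to try our scoring at home, here is a minimal sketch in Python of the calculation just described. The breakdown of the shorter words is hypothetical, since the post lists only the long ones; for the score, only the word lengths, the number of tiles placed, and the size of the bounding square matter.

def banana_score(word_lengths, rows, cols, tiles_placed=72):
    # Average word length: a tile is counted once for every word it appears in,
    # so crossing letters count twice; the higher the average, the better.
    avg_word_length = sum(word_lengths) / len(word_lengths)
    # Density: how much of the bounding rectangle the placed tiles actually fill;
    # 1.0 would be the unreachable perfection of no blanks at all.
    density = tiles_placed / (rows * cols)
    return round(avg_word_length, 3), round(density, 3)

# Brigitte's winner: one 10, two 9s, one 8, one 7, three 6s, plus (hypothetically)
# four 4-letter and ten 3-letter words: 22 words using 107 tiles in an 11 x 11 square.
lengths = [10, 9, 9, 8, 7, 6, 6, 6] + [4] * 4 + [3] * 10
print(banana_score(lengths, 11, 11))  # prints (4.864, 0.595)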
Labels:
Crosswords,
Games,
Recreation
Tuesday, January 25, 2011
Ideology Is Not The Cause
In the current climate of fear, anxiety, and what appears to be intercultural conflict, it is especially important to remember that any and all structures of human thought are easily exploited to create massive death and havoc. It’s certainly fashionable in this country to single out Islam—while forgetting that, mutatis mutandis, every religious and secular structure of ideas has at one time or another (conditions “cooperating”) been twisted all out of shape and used as the authority to justify killing “the other.”
All widespread human ideologies have developed, over time, to achieve the universal good—however flawed these structures might be in detail. The sequence of events is not as depicted in the usual bashing approaches. We’re led to believe that ideology comes first and/or is peculiarly suited to sanction violence. The actual sequence is that vast social pressures arise, almost all grounded in economics and the abuse of power. These are experienced as injustice. Those reacting to the pressure then look around to justify what they want to do—which is to hit back. To hit back is the motive, not the Qur’an in this case. And whatever the ideology that prevails in those places where the pressure is most felt—it will be that ideology that gets deformed to give authority for doing what should not be done.
The now prevailing terrorism—suicide bombings and the like—arises because of general underdevelopment superintended by the wealthy and exploitive regimes so common in Islamic countries. Those rulers are thought to be conspiring with, and aligned with, the much more powerful West. The West, therefore, becomes the target. Secularism is not immune from this sort of thing either, and modern quasi-scientific ideas have also been twisted into justifications.
Indeed it is always best to ignore the ideological justifications of low-level conflicts and simply look at what is actually happening to the populations that breed the anarchists, terrorists, you name it.
Labels:
Ideologies,
Islam,
Terrorism
Monday, January 24, 2011
Four Seasons
Now that winter has arrived, I am able to complete a series I began last April. The winter picture turned out to be the most difficult—not to take the photos but to choose the one to include. I opted for the cheer of winter sunshine—although an overcast image with snow falling tempted me as well. In a compromise, I am showing the darker image down below. They were taken one day apart, both in natural light.
Labels:
Seasons
This Building Obeys the “Next” Command
In my last dream this morning I was standing on a street in front of a big yellow two- or three-storey, very run-down building. I faced a big shop window. Part of its glass had been shattered, and I could have reached in to touch the merchandise. That merchandise was old, used, worn, ratty clothing displayed indifferently on hangers. The outer wall of the building next to the window had been severely damaged. Two young women, next to me, arm in arm, looked at the same display. They clung like that, perhaps, because the scene was not a nice one. In irritation I looked up and sort of mentally touched the upper left corner of the building—touched it with my eyes. The building at once responded with a rapid, rumbling motion. The broken wall magically healed itself, the dreary shop window moved off to the right and disappeared, and a tidy office structure, but still yellow, had replaced the old one.
I turned to the two women who were still standing there. “This building obeys the ‘Next’ command,” I said to them, half amazed and half amused. They were surprised too and kind of shook their heads. This sort of thought, of course—the kind in which awareness lights up the mind—rapidly terminates all dreams, and so I lay there in the dark this morning chuckling to myself.
Labels:
Dreams
Sunday, January 23, 2011
Remembering Ned Ludd
Brigitte and I read an article in the Atlantic titled “The Rise of the New Global Elite.” It is reachable here—and thanks, Monique, for pointing it out to us. We’re now in a period when the Atlantic isn’t reaching us. The article caused in me the spontaneous rise of the ghost of Ned Ludd—and not just because Brigitte likes to call herself a “born-again Luddite” when reading about contexts such as those the author of the article, Chrystia Freeland, evokes. Brigitte’s self-description is amusing in that she comes from a family one side of which operated textile factories and the other textile machinery factories.
In my mind, anyway, the rise of a new global elite is the almost inevitable consequence of the rise of the machine—and the rise of the machine the consequence of our exploitation of fossil fuels. These two technological phenomena, combined with what the nuns taught me was Original Sin, were bound to produce both democratic forms of governance and the worship of the Market. I comment on that subject on LaMarotte today under the heading of “It Takes Fewer People” here. But I thought I'd mark the day on Ghulf Genes by summarizing the very interesting Luddite phenomenon, originating in the early nineteenth century. As once, so in the future. My sources here are strictly Wikipedia.
What everybody knows is that the Luddites destroyed machinery to save jobs. But the details are very interesting and telling. This movement, dating to 1811 or the year after, coincides with the introduction of the first mechanized looms in the textile industry in England, thus the first time that steam was deployed to drive production machinery: the marriage of machines and fossil fuels. The mechanized looms incorporated, in their very design, by mechanical arrangements, capacities until then only possessed by skilled workers. Human skills had thus been transferred to machines—and cheaper, less skillful workers could be employed, expensive weavers furloughed. Workers attacked mills, broke weaving frames with hammers, and burned factories. Later riots erupted over unemployment, and the “break the machines” fervor also spread to agriculture when threshing machines were destroyed. Threshing machines separate cereal grains from chaff.
The Luddite movement coincided with the Napoleonic Wars in which England was engaged (1803-1814). Suppression of Luddite uprisings at one time employed more members of the British army than the war did, and it led to at least one spectacular trial in 1813, after England passed the Frame-Breaking Act (1812), which made industrial sabotage a capital crime. In 1813 seventeen men were executed for this crime, and many others were “transported” to Australia to serve as prisoners and future fathers of the Australian population.
History has its amusing aspects, not least the naming of movements. The Luddite movement had multiple leaders, but none was actually named Ned Ludd. How did the name come to cleave to the movement? Here is how Wikipedia tells the story (here):
Ned Lud [Lud is a variant spelling of the name] was a weaver, believed to be from Anstey, who in 1779, by some accounts, either after being whipped for idleness, or after being taunted by local youths, smashed two knitting frames in what was described as a “fit of passion.” Other accounts offer the less dramatic explanation that Lud was told by his father, who was a framework-knitter, to ‘square his needles’; Lud took a hammer and “beat them into a heap.” News of the incident spread, and after a time, whenever frames were sabotaged, people would jokingly say that “Ned Lud did it.”
Labels:
Luddites
Friday, January 21, 2011
Cheer, Whining, and Realism
It’s difficult to tell jokes expansively in an air raid shelter when bombs are falling. Brigitte and I are old enough and came from far enough away actually to have experienced the mood in such places directly. Indeed we’ve also lived in Germany, both of us briefly, at the very end of World War II and saw the consequences of super-inflation with our own eyes—the second round, not that of the 1920s. In my own case it was a kind of Kodak moment. Mother had dispatched me to the bakery (no, I’m not actually kidding) with half of a cigarette; the baker immediately put it into his mouth and lit it. Then he gave me three freshly baked rolls—and we, the children, ate them for breakfast that morning. On the way to the bakery, which was just half a block away, I saw a cardboard box filled with Reichsmarks. The wind was lifting them out of the box and blowing them down the street.
Mind you, no particular virtue attaches to having lived through interesting times, but it does leave a mark. It makes you grow up faster and reinforces a certain stance toward life. In that environment I actually knew of families that sent their children out to collect cigarette ends the Americans left behind after Patton’s Army arrived. Not my parents. But humanity is extraordinarily adaptive—and also quick to sweep such things under the rug and to forget them the moment pressures ease up. In times like those, cultivated public behaviors like “positive attitude,” “stiff upper lip,” “cautious optimism,” “grabbing the gusto,” etc. were most noticeably absent. These had been replaced by a kind of straightforward realism. In people who had acquired and actually internalized cultural values, those values still restrained and shaped behavior; those who had not internalized them acted without shame or restraint.
I mention these experiences because quite visible cracks have now appeared in our society and continue to spread. This heralds difficult times—and in times like that it’s also time to become realistic. The actual trigger for this post is a story in the New York Times suggesting that movements are now afoot to enable state governments to declare bankruptcy, if not literally then at least functionally. The result of such a move would be to deprive retired state workers of their pensions and state bond holders of the value of their bonds. The Times puts it delicately thus: “Still, discussions about something as far-reaching as bankruptcy could give governors and others more leverage in bargaining with unionized public workers.” Precisely this sort of thing, if widespread enough and combined with other shocks like massive unemployment or runaway inflation or deflation, produces draconian changes in government, like the Nazi regime in Germany.
Fifty years of economic and technological expansion, an ideology of progress, feeble-minded notions that reality is being radically transformed, have left enough of a residue so that the prevailing notion is that this too shall pass and—furthermore—without extraordinary or unusual change or sacrifice. The problem-solution pairing has taken too much of a hold. This suggests that any situation whatsoever is capable of resolution by changing the mechanics through legislation or opinion-molding. By feeble-minded notions above I have in mind ridiculous ideas blabbed by dot.com gurus and entrepreneurs in the late 1990s that needing to make profits was no longer a real requirement; daughter Monique used to collect news clippings like that because, in the future, she thought, nobody would believe that this sort of thing had actually been said. Another Internet myth—“The New Price is ‘Free’”—belongs in that category as well—as does the notion that a nation can fight wars without constraining consumption and raising taxes.
It really is important to heed George Santayana’s quip to the effect that “Those who cannot remember the past are condemned to repeat it.” He said that, I’m told, in Volume 1 of The Life of Reason. Most people alive today are too young to have such memories of the past; they are still alive in ours. Hence it is actually a positive act to dig them out for a bit of display.
Labels:
Culture,
World War II
Wednesday, January 19, 2011
Smart Money
This phrase, for me, has over the decades come to be an all-purpose abbreviation for amoral attitudes toward reality—not because intelligence, and therefore the intelligent deployment of resources, not least money, is repugnant as such, but because the phrase has always implied to me special knowledge applied for personal gain. The implication is that a zero-sum game is in progress and that, therefore, the gain of the smart fellow is someone else’s loss. Exploitive. The phrase also implies a relatively short time span, always. Thus it’s a now sort of thing—not the slow, laborious accumulation of acorns to get through a long, long winter.
The phrase arose this morning as Brigitte and I contemplated a pie chart showing the indebtedness of the United States, and “smart money” popped into my mind spontaneously. I went to look up the origins of the phrase. The Online Etymology Dictionary finds it first used in 1926 to mean “money bet by those in the know”—thus a kind of “insider trading.” Another aspect of it, not referred to in the dictionary—but I encountered it throughout my years in business—is the use of “other people’s money” rather than your own, which people even abbreviated as OPM, always with a bit of a sly grin. Those huge U.S. debts to foreign sources, amounting to $25 trillion, probably triggered the abbreviation OPM and, in turn, “smart money.”
What I hadn’t realized is that this same phrase had been used to mean something quite different in the seventeenth century, namely “money paid to sailors, soldiers, workers, etc., who have been disabled while on the job” (1690s). In that usage, the word “smart” comes from the verb, namely to hurt, to smart. Money for suffering, in other words. This got me interested in the origins of to smart. It turns out that the verb derives from the West Germanic root smert-. Knowing German well, I thought immediately of the word schmerzen, to hurt.
Now my mind put two things together. The overlaid hurt and the witty meanings of smart—and the short-term nature of smart money dealings. And it occurred to me that smart money activities, when viewed in the long term, ultimately always produce—pain.
Labels:
Smart Money,
Words
Tuesday, January 18, 2011
Vanity Plate Poetry
Some purchasers of vanity license plates rise to veritable heights of poetic ambiguity. I encountered an example of this yesterday, a plate with the following “request”: PREY4ME.
Labels:
Vanity Plates
Monday, January 17, 2011
Further Notes on “Climates”
Prelude to the French Revolution
From “Social Aspects in Interpreting the French Revolution, 1789” by Abha Trivedi, University of Lucknow, available here from the Annual Proceedings of the Florida Conference of Historians, Volume 15, February 2008. The emphasis is mine.
After a long period of prosperity, France witnessed serious economic depression in the 1770s and 1780s which caused resentment and bitterness as all classes faced a decline in their status. Production declined, unemployment increased and the recession soon reached to the agricultural sector. Severe drought in 1785 worsened the situation as the yields were short. Furthermore, in 1788 the harvest was ruined by an abnormally wet summer and conditions became even worse in 1789. Alexis de Tocqueville disagreed with Carl Marx’s contention that worsening conditions create a situation favorable to revolution. Tocqueville noted: ‘It is not always by going from bad to worse that a country falls into a revolution and that the French Revolution broke out when conditions were improving.’ He further noted, ‘the state of things destroyed by a revolution is almost always somewhat better than that which immediately precedes it.’ [1] In 1962 J. C. Davies supported Tocqueville as he observed, revolutions are most likely to occur when a prolonged period of objective economic and social development is followed by a short period of sharp reversal.[2] This seems to be borne out by the general economic trends of the eighteenth century.
[1] Alexis de Tocqueville, The Ancien Regime (Fontana ed., 1966)
[2] J. C. Davies, “Toward a Theory of Revolution,” in J. C. Davies (ed.), When Men Revolt and Why.
Prelude to the Nazi Era in Germany
From “German Economy in the 1920s” by Daniel Castillo (here); Castillo was a junior in college studying business economics at the time of writing. The text has been edited a tiny bit.
At first Germany tried to recover from the war [World War I] by way of social spending. Germany began creating transportation projects and modernized power plants and gas works. These projects were all used to battle the increasing unemployment rate. Social spending was rising at an unbelievable rate. In 1913 the government was spending approximately 20.5 marks per resident; by 1925 spending had risen to almost 65 marks per resident; in 1929 it reached more than 100 marks per resident. The rising amounts of money used for social spending combined with plummeting revenues that caused continuing deficits. Eventually municipal finances collapsed in 1930. Although it seemed as if the collapse was due to debt, in actuality ordinary budgets were the reason for the initial collapse. Municipal officials and politicians were unable to restore order to the budgets. Further adding to Germany's economic problems, the revenue from income tax began to fall. In 1913, more than 53% of all tax revenues came from income, but in 1925, it had dropped to 28%. As the returns from income tax decreased, the government began to depend much more on state trade and property taxes. The government also became highly dependent on the profits made from municipal utilities, such as electric power plants.
Even with all of Germany’s economic shortcomings, it could have still been possible to make reparation payments if foreign countries had not placed protective tariffs on Germany’s goods. With the income Germany could have gained by selling goods in foreign countries, for relatively low prices, reparation payments could have become feasible. The protective tariffs made this idea impossible and further depressed the German economy. Faced with reparation payments they could not afford, Germany began printing exaggerated amounts of money. This threw Germany into a state of super inflation. Inflation reached the point where millions of marks were worthless. Cartoons of the time depicted people with wheelbarrows full of money who could not buy a loaf of bread. “With the approach of world crisis foreign lenders withdrew capital and markets further closed against German imports” (Sweezy 8 [not in source]). The United States was an extremely significant example of this. When the U.S. was hit by the great depression they immediately sought to get the loans, which they had made to Germany, paid back. This, in addition to all of Germany’s other problems, practically caused the German economy to collapse.
To this I would add a number or two. In April of 1919, the exchange rate was 17 marks for the dollar. In April 1923 the rate was 24,000. In December of that same year, the rate was 4,200,000,000,000!
Is it not difficult, in retrospect, to imagine the prevalent climate of opinion in France in the 1770s and 1780s and in Germany in the 1920s and the 1930s…
Labels:
Climates of Opinion,
French Revolution,
Nazi Era
Sunday, January 16, 2011
The Anatomy of Climates
I’ve commented once before on climates of opinion here, a subject now under intense examination. An article in the New York Times this morning, for example, wonders “Is the Anger Gone?”—and harks back to the sudden collapse of McCarthyism. The day-old Tunisian upheaval tells me indirectly what kind of climate of opinion must have been seething there. A few days ago I had occasion to review my memories of a visit to East Germany—and the feel of that place came back: so very different from the aggressive progressivism that then ruled the west.
It’s natural to represent such climates as in some ways mysterious. An unsentimental dissection of the thing suggests that climates of opinion simply sum up what most people observe and/or experience and the manner in which observations and experiences either are (or in the case of Tunisia were not) accurately reflected in the public media. The sheer cumulation of events, small and large, and the sheer repetition of the same words everywhere, in the media and then echoed back from the media at the water cooler, produce a climate, a kind of consensus describing not only how things are “in general” but also the trend that “things” are following.
I find it easy to explain the collapse of McCarthyism. In the early 1950s America was on the verge of the most potent economic expansion in its history, about to explode in a veritable storm of physical inventiveness that would culminate in the electronic age of the 1990s. Communism was not a viscerally felt threat on Main Street. The trauma of World War II was receding, not increasing. McCarthy’s crusade was, well McCarthy’s—supported by relatively small interest groups whose reach into the general public was not extensive. The times were relatively good, hence the poison of projected hate did not reach many. The public was immune.
The current climate, let’s call it the Climate of Confrontation, has much more to do with deep, pervasive economic phenomena (to steal a Marxist explanation) than anything else. The great success of that half-century—from McCarthyism to the dot.com bust—could not be perpetually maintained, not when the god of that era, Technology, was undermining jobs, globalism was shifting occupations overseas, and excess wealth sought innovative ways of taking the risk out of monstrous gambles—with hedge funds and the like. The terrorist attack in 2001 was, for the governing elite, a godsend, you might say. It introduced a wonderful distraction from a very major looming problem—namely the need to create a new economy for the U.S. population, one not dependent on constant, aggressive growth that, in its achievement, simply erodes the jobs of those who are supposed to maintain it. We’ve embraced the war on terror as a new mission—now that the god of Progress is trampling out the vintage of its living worshippers. The period since 9/11 has been a vast, monstrous over-reaction, and in fighting multiple wars and quasi-wars (Yemen, etc.), we’ve failed to transform the economy to support those conflicts but, instead, have been urged to continue valiantly consuming.
This sort of incoherence was absolutely bound to produce the climate of confrontation—because these moves are contradictory. World War II at least restored employment. The War on Terror is still eroding it. The world war had a clearly visible vector. The War on Terror appears to have no end. After losing eight million jobs in 2008 and 2009, we’ve managed to recover one million in 2010. Civility—however desirable and laudable—is not a solution to the very major problem of incoherence in our culture. And we will not solve it by media “confrontations” in which, no matter what the issue, two spokespeople must be faced off to argue the matter as heatedly as possible to maintain the ratings of Personalities.
The anger will disappear, I propose, when we start getting word of people in our extended circles getting jobs and rescuing their threatened mortgages—when fire departments start to hire rather than lay off—when taxes start rising—when our infrastructure starts being fixed, and something is done to tone down the obscene violence in our television shows and in the computer games our children play.
Labels:
Climates of Opinion,
McCarthyism
Saturday, January 15, 2011
Still Repeated in the 23rd Century
In one of the Star Trek films or episodes, one of the characters (Lt. Saavik, I think it was) explains the word sabotage saying that in ancient times people wore wooden shoes, sabots. And when labor disputes arose, they used to throw their sabots into the machinery to stop it—hence the word sabotage. Delightful!
That word, sabot, is a thirteenth century French word, by the way. And, by the way, this etymology of the word sabotage is not accepted by the experts—although they do confirm that the story has legs. Of course it does! Great story—and still told in the twenty-third century…
Evidently the real origins come from careless or clumsy behavior—not least making too much noise when walking, which you would do if you walked about in wooden shoes. The word came from saboter, to bungle, to handle clumsily. It is also found in contexts of playing music badly. Perhaps those who could not afford leather shoes were in a lower class? And lacked refinement? And managed to mess things up?
Like the Star Trek crowd, I for one prefer the vivid picture of angry laborers taking off their clogs and stopping the damned machines in an upsurge of sawdust.
Thursday, January 13, 2011
The Military Order of the Purple Heart
This morning came an ordinary telephone call, the voice of an ordinary middle-aged woman, a voice like a neighbor’s, not mellifluous like soliciting voices tend to be: “This is Purple Heart calling. Would you have something to contribute? We’re coming by in two weeks.” The answer, in this house, is always “Yes.” “Fine. I’ll give you a reminder call a couple of days before.”
George Washington himself devised the Purple Heart as a decoration for soldiers in 1782. It was to be “the figure of a heart in purple cloth or silk edged with narrow lace or binding”—a little heart-shaped piece of cloth for merit. Why did Washington come up with this? Just before he published the general order instituting this reward, Washington had received orders from the Continental Congress to cease promoting soldiers for merit or granting them commissions. And why did the Congress so order? It had run out of money. But merit should be recognized—or so George Washington thought. A little piece of cloth would symbolize it. And the Purple Heart was born.
The Military Order of the Purple Heart dates to 1932. It is an organization composed solely of the recipients of the decoration, and its mission was “to foster an environment of goodwill and camaraderie among combat wounded veterans, promote patriotism, support necessary legislative initiatives, and most importantly, provide service to all veterans and their families.”
The Order operates a good many programs—of which the one my caller represents is the Service Program. This program consumes $6 million of the Order’s total budget of $9 million annually. It operates 70 offices, has a staff of 100 people, and collects clothing, shoes, books, appliances, and other items of value offered for sale in its shops. Unassuming, professional, barely noticed—and serving the needy well, well beyond people who have received this military distinction. The Order also has educational, memorial, scholarship, youth, and first responder programs, the last serving the families of law enforcement officers and fire fighters who have died in the line of duty. So many, many aspects of the core values of America are like that, like Purple Heart: quiet, but vigorously active.
Labels:
Purple Heart,
Washington George
Wednesday, January 12, 2011
Notes: Social Networking
“Searching on Google is good, but having your friends help you find what you’re looking for is better.”
You will find those words here, as part of an article titled “Google’s View: Three trends in social networking.” The author is Rafe Needleman, an editor of CNET’s Webware. Needleman is summarizing the gist of a talk given by Joe Kraus, Director of Product Management at Google. Kraus does not actually utter those words in his talk, which is also accessible from the same address to which I link above.
Kraus’ talk, while a bit choppy—chopped up by anecdotes, each illustrated by pics of web pages (which happen not to be easy to see), and short slogans briefly flashed up—is worth a look if you wish to get a feel for how Google—and indeed every big web company, these days—is thinking.
Kraus talks about the web “evolving” in the direction of a giga-global social web. Other interpretations of what is actually happening, however, suggest themselves. One is that, in a relentless drive to find yet other venues for advertising, web companies have discovered a vast new market and are now busy transforming the Internet solely to serve it. One corollary is that an electronic utility now serving Information Seekers on the one hand and Interaction Seekers on the other will be skewed powerfully to favor the second group, thus making it more difficult to get facts, history, literature, and the like without their arriving heavily weighed down by the attached or linked opinions of every one of your “friends.”
Labels:
Google,
Internet,
Social Networks
Monday, January 10, 2011
I Saw It on Cable
I saw it on cable and therefore it’s true
Pictures of soldiers? It must be a coup!
Odd marks in the mud? That’s surely a gnu!
A crawling of letters—what markets now do.
Fast moving ads flash attempting to woo
Valued and precious yours truly—not you.
I turn up the sound while I go to the loo,
Not missing a moment of cry and the hue:
A new attack, Palestinian on Jew?
The CDC lady updating on flu?
So busy observing this newsworthy brew
Busy while chewing and stirring the stew
My mind gets distracted. Forgot what I knew.
Sunday, January 9, 2011
How Perception Shapes Our Sense of Reality
We feel somber on cloudy, foggy mornings. When the sun’s bright over snow, our bodies reflect the sunshine. What a beautiful day! We’re somewhat less aware of the fact that overcast weather is usually matched by the presence of low pressure and contrariwise a blue sky usually means high pressure—and that this affects our bodies too.
We also tend altogether to ignore that which repeats and is therefore unremarkable. Once, decades ago, I was driving to the airport in Washington, DC, at around five in the morning, in the dark, taking the Beltway (495 to 395)—the freeway thick, thick, thick with traffic. Suddenly it struck me forcefully that these hundreds, these thousands of cars and trucks, roaring together at very high speeds, were all performing perfectly! Hundreds of thousands of tires were holding the air, millions of ball-bearings were smoothly rolling, nicely lubricated in clean oil; similar numbers of pistons translated millions of explosions per second all around me into motion; headlights flawlessly lit our way… And an intuition burst upon me of the vast and entirely ignored mass of human care and virtue that produced this harmonious, high-speed process moving these thousands of people to their destinations.
We notice what changes—the weather. We notice what goes wrong. That which works, and smoothly—that we ignore. It’s a biological arrangement, to be sure. Ever alert for danger. Ever prepared for adaptation. But we black-box whatever smoothly works. How many of us sit down with a paper and take even two minutes gratefully to admire the flawless operations of our digestive system? But if indigestion should happen to strike. Whoa! All bets are off in our rush for the Tums.
A troubled youth shoots down a member of Congress, kills a judge, a child, others; wounds yet other innocents in a Safeway parking lot. An artificial weather system—the electronic media—brings the news instantly. It’s all over the airwaves. The cable channels have cleared the decks. Intense focus—except for the supposedly startling discontinuity of ads—but the odd thing is that we’re so used to ads we don’t find these commercial pauses odd at all. The shooting in Arizona has a much greater impact on our sense of reality than the fact that yesterday—what? some 6.9 billion people?—yes, some 6.9 billion people had perfectly ordinary days, days on which, like the day before, the toilets flushed, the refrigerator kept the cold, the doctor was there for the appointment, the car started, the bucket held water, the streetcars ran, the light switch worked. Curious contrasts. The world’s going to hell in a handbasket? No, not quite yet. For that, presumably, we’ll have to wait until December 2012 when, if the History Channel has read Nostradamus correctly, Armageddon will finally arrive.
Labels:
Media,
Newspapers,
Nostradamus,
Perception,
Tucson Shooting
Saturday, January 8, 2011
A Novel Detective Series
No, I am not about to name one. I want to specify a new series where the relationships are not too, too, too predictable before I’ve even seen Episode 1.
- If a coroner is included at all, the coroner would not be female; if female, she would not be nubile, and there would be neither sexual nor mother-son tension between her and the detective she works with. The detective would never ask the coroner how soon he/she would know the time of death.
- The detective’s higher-up would be a political innocent; the detective, instead, would spend all of his time trying to get good press coverage, keep an album of clippings about his own successful cases, and utterly fail to attract journalistic attention.
- The aggressive journalist would be neither an obnoxious male nor a sexy female; there wouldn’t be one. Why? Because papers can’t afford to take an interest in crime today. All real reporting comes by way of the AP and the UPI wire.
- The detective would absolutely hate all victims because they cause him all the hassles; he would have a grudging liking for criminals—because there are too many useless victims in the first place, and the criminals get rid of them. He would avoid victims’ families as if they were lepers and, failing to avoid them, pretend that he is the aggressive journalist. If the detective has a family, his children would have plenty of his quality time because he would be a mighty shirker of hard work and always home helping them with teenage problems. He would be happily married and plotting with his wife to start a flower shop.
- No case would ever be solved—only shelved. Each exciting episode would end by giving up the case as attention is diverted to the new case—to be handled in the next episode.
- There would be no female detective who is genuinely superior to all the males but under-appreciated; if there is a female, she would be unattractively corpulent, sit behind a desk, and solve every case without doing a thing—but nobody would recognize that she had done so, and the case would be shelved anyway.
- The second in rank would lack all ambition and spend his time trying to interest the local paper in hiring him as the aggressive journalist.
- The junior detective would be an old man passed over innumerable times, would definitely not be black or Asian or provide any humorous light moments at all.
- The setting would not be a grimy big city—nor would it be in Ireland, Scotland, Wales, Australia, a small town, a ritzy suburb with offensive rich people, the English countryside with an offensive lord of the manor, a truck stop, an airport, or a holiday resort. It might be in an alternative universe, however—where all detective series are actually placed.
Labels:
Detective Series
Friday, January 7, 2011
Une foi, une loi, un roi
Reading the essays of Montaigne causes me to focus on sixteenth-century France and the French religious wars (the Catholic League battling the Huguenots, a 36-year conflict that marred French life between 1562 and 1598), and reminds me that living in interesting times means living in times when faiths, laws, and kings compete. And, of course, it is always much worse, meaning much more interesting, when rulers are relatively weak, when ideas are hopelessly confused, and it is difficult to tell which way the trend is actually running.
The real message in that old French slogan is that, in the absence of an overweening unity at a high enough level to matter, social life will be rent, uncertainty will rule, power will continue to shift, and economic life will invariably suffer. Nobilities in tension with kings, bourgeoisie in conflict with the nobles, and tradition hanging on hard-pressed by reforming change. It would all be resolved, ultimately, in the French Revolution which, in a way, swept the messy playing board clean. During this warring period an interesting early form of secularism arose; it was called politique, or the rule of pragmatism, practiced by moderates of both sides. Never mind faith, belief, or ideology. Let’s get the (then still non-existent) trains to run on time. Our times here, it seems to me, might benefit from a bit of politique.
Montaigne lived in Aquitaine, the southwestern corner of France. That region was under Bourbon influence, which leaned in the Huguenot (Protestant) direction. He was nominally Catholic, temperamentally Stoic, and his detached and reasonable attitude well suited his times in turmoil. When trends are chaotic—and you have the means—by all means retreat to the tower of solitude where, sitting in relative peace, you write your blogs.
Labels:
Huguenots,
Montaigne,
Religious Wars
Thursday, January 6, 2011
Mysterious Communities
Communities, when I think about it, are mysterious entities—and present only in shattered fragments in modern life. My image of a real community is of people working together, but in such a way that the commonality of the effort is reflected in the horizontal as well as in the vertical dimensions: they are working together in the same space, more or less, and the labor has the same inner and outer purpose, a purpose that all those who support it spontaneously understand. The manner in which individual efforts are meaningfully linked ought also to be quite visible: the people to whom I sell groceries teach my children, sell me clothes, repair my plumbing, attend my church... The activities we undertake together should have a purpose we actually share.
In modern life the closest thing we actually encounter is the spatial unity we share with others who work in the same enterprise. But the vertical and inner dimensions may both already be quite vague. The aims of large corporations are up there in the sky and abstract—profit and growth. And my relation even to this partial community may be rather tentative. I’m easily replaceable; my income is a necessary cost for the collective (but a benefit for my community). But it is not the community’s hand that holds the hammer; my employer may reduce me (cost that I am), and gladly, if at all possible. My employer may itself be (at least thinkably) altogether unnecessary—thus throwing my work identity up into the air as well. This is especially true if it competes—and doubly so if it must compete fiercely with others; the more embattled it is, the less necessary it appears. But then—what about me? That’s the thought I would be thinking if I worked for General Motors, an entity recently resurrected from the dead by the artificial interventions of Big Brother; to be sure, that resurrection was motivated by communitarian considerations, but not those of any genuine or local community; rather those of an abstract and vast beast, itself up in the sky.
My thought next turns to hospitals. When we arrived here in 1989, virtually none advertised. Now they all do. Our own St. John, founded by a handful of nuns because a hospital was needed here, now maintains, a mere block from where it sits, a vast billboard on the corner of Mack and Moross touting its services to the physically troubled in heart. I’d feel oddly marginalized if I worked for pushy Beaumont, still expanding by acquisitions. What has happened to that most honorable of all institutions, the hospital, that, these days, it must scramble like mad for its so-called “customers”? Has it been invaded by machines? Is it the clamor of the machines for return on investment that I hear shouted from the billboards?
Looking for the right word, I find that “organic” comes to mind. Why? Because the alien atmosphere of modern life tells me that our institutions are no longer really anchored in organic social reality. They have detached from the soil. They’ve begun to levitate. Most communities, in fact, are communities in name alone. They’ve devolved. They have been transformed into bedroom communities or vast, layered, sleeping slabs. Their inhabitants shower and then travel all over the vast Detroit metroplex to work, to study, to worship (if they do so at all), to shop, to seek health services—everywhere. Virtually all communities today have become virtual; they’re communities of interest. People no longer share anything at all with those who live close by them—except incidentally. Annual block parties trumpet this fact in small talk. The neighbors are strangers unless, by happenstance, small children are present and the still-local elementary school binds their mothers (rarely their fathers) into a common purpose they genuinely share. The vast majority—not least some of those mothers—work for institutions whose purposes are expressible in numbers; and what they actually do is only incidentally related to the corporate objectives that their labor serves.
Real communities, of course, tend to restrict, to constrain the individual. The young dream of escaping to the lights of the great city, the excitements and adventures of freedom in the distance. Little do they know… There is an ideal point of equilibrium somewhere, but the laws of this dimension are such that we rarely and only briefly experience genuine community life (if we are lucky enough to do so at all); memories of that time, to be sure, remain in the heart as a kind of memory of paradise, even when, in detail, it wasn’t—and we remember that perfectly well too. Very odd, indeed, but that’s the way things are.
Labels:
Collectives,
Communities
Wednesday, January 5, 2011
Human Dignity
The notion of human dignity is rooted in the belief that humans have immortal souls and have been created by God. Thus they are something other than chickens; after all, we can eat chickens without feeling any guilt. When beliefs in the transcendental origins of the human (at minimum) are abandoned, a kind of no-holds-barred tribalism becomes altogether possible—indeed laudable if it makes money and generates power. I was somewhat startled to see a book at my library, among the new books, highlighted all by itself on the top shelf. It is David Limbaugh’s Crimes Against Liberty: An Indictment of President Barack Obama (August 2010), a kind of companion to Michelle Malkin’s Culture of Corruption: Obama and His Team of Tax Cheats, Crooks, and Cronies (which had appeared earlier). This sent me to the computerized catalog. I was wondering if my library also perhaps had a copy of Julien Benda’s 1928 book entitled The Treason of the Intellectuals (originally La Trahison des Clercs). No. But it’s worth reading—as an introduction to the other two above.
Labels:
Benda Julien,
Dignity,
Immortal Soul,
Tribalism
Monday, January 3, 2011
Die Da!
The title is in German and simply means “Those there”; it’s pronounced “deed ahh.” We’ll get there. Long ago now we made a visit to the German Democratic Republic, communist East Germany. We had family there. Tensions had eased enough to make the trip without concern. We went as a family, driving a little Opel we’d rented in the West, first to West and then to East Berlin; we crossed at the famous Checkpoint Charlie and, once over there, navigated the horrendous bureaucratic wall (much worse than The Wall) to get permission to travel further inland. Finally we had the go-ahead and zoomed off on a virtually empty autobahn into the East. It was the closest thing to time travel I’d ever engaged in: From booming West Germany in the 1970s we were suddenly transported to the 1940s. Time had simply frozen.
Some time after reuniting with the family, late one morning on a weekend, Jonas, my brother-in-law (a psychiatrist by profession), headed out to buy the paper. Eager to see the papers myself, I tagged along. We stopped at what looked like a tiny but well-stocked kiosk. I asked Jonas to point out the best paper to buy. He already had one in his hand. “Don’t bother,” he said. “There’s nothing in the papers.” Then, catching my pointed look at the paper he held, he laughed, dryly, and said: “It’s a sports paper.” His hand swept over a large display. “All of these are. Sports papers. That’s what we read here.” Intent on penetrating communism, I bought three of the ordinary papers for myself and later discovered the strangely fictional world projected by the authorities of the Deutsche Demokratische Republik—unbelievably heroic figures of men and women who, in transports of joyous labor, climbed the steepest quota mountains and outperformed their eagerly following comrades in the production of plywood, steel, and even in the milking of cows.
Jonas took me on a walk and commented wryly on the strange realities on display everywhere in this frozen time. One of our stops was beside a damaged little car, a Trabant, the VW of the East. He showed me—and made me touch the surface of—its ripped fender so that I could experience a car the body of which was made of cardboard. We wandered long enough so that it was time for lunch, and on an impulse Jonas took me to a restaurant. It overflowed with masses of people, every table in crowded use. Some eight or nine people, two or three families, waited for a table. It looked like it was going to be a long wait. We were looking at each other, kind of questioning this decision, when three short, energetic, well-dressed men in suits entered the restaurant. Their leader immediately engaged the maître d’ in conversation. A fair amount of hand-waving went on; the leader spoke in a kind of stilted, too-formal German. The maître d’ looked in deep pain; he seemed determined to prevail but, quite rapidly, folded. After a tiny shrug, he nodded and led the men inside to seat them somewhere—where it was impossible to see.
Jonas now looked at me. “Die da!” he said. Then he turned, and out of the restaurant we went. Communist party members in the DDR invariably wore a tiny round emblem on their lapels, the regime’s coat of arms. To indicate the identity of these worthies, East Germans had a habit of pointing a finger at the left-hand lapel and saying, Die da, the eternal them.
This memory suddenly surfaced the other day when I was reading a story about Iraq’s painfully forming democratic government. The DDR lasted from 1949 to 1990. How long would the glorious new democracy reign in Iraq? And it struck me as odd that in that realm we were and are… Die da!
Sunday, January 2, 2011
Hidden Dimension
Death notices of entertainment figures (here I have Billy Taylor in mind) remind me forcibly that I’ve lived my life as if a vast sector of modern culture did not exist at all. I do believe I know far more about Tibet, ancient and modern, than I know about pop music, no matter the genre. Music is an incredibly vast universe of experience, but I know next to nothing about it. My acculturation was in classical music (my Mother), but even attending classical concerts has in my case been strictly in the role of an Accompanying Person, never at my initiative. Radio—oh, radio—and the initiatives of family members have exposed me to a tiny selection of pop music, of which a few pieces I love with the irrational and total dedication of the true primitive. An arduous, and I mean exhaustive, survey of my memories brings to mind a single attendance at a pop musical event: listening to Leo Kottke play his guitar compositions in a bar in Minnesota, the links of which are once more traceable by radio to the Prairie Home Companion—one of whose evenings I also once attended in the body, in St. Paul, and found by experience entirely to lack the magic produced by hearing that show entirely disembodied, sitting, say, on a dark porch on summer evenings and listening to the radio. Through the air that surrounds me vibrates, invisibly, inaudibly, the labor of countless musicians—a continent as unknown to me as the far side of the moon. Billy Taylor, whoever you were, R.I.P.
Labels:
Country Music,
Kottke Leo,
Prairie Home Companion
Needed That
Studies in France have virtually no other end than the making of money—said Michel de Montaigne in his essay “On schoolmasters’ learning,” in the second half of the sixteenth century. I found that a refreshing corrective to the narrowing of mental arteries always present, especially in those of some age, a symptom of which is to imagine that our times are more saturated with decadence than others.
Saturday, January 1, 2011
Through a Glass Brightly
In a real sense the medium is the message, not in any erudite, sociological, McLuhanesque sense but merely in the kind of content that it brings—and how it brings it. In 1911, we had neither radio nor television. News from far away reached us more rapidly then than in 1811; the telegraph had speeded up communications immensely. But it reached us each personally as print on paper and thus had a predominantly conceptual, abstract, symbolic form. In a meaningful sense then the Chinese saying still held: The sky is high and the emperor is far away. Radio began to spread in the 1920s and had reached world-wide extent by World War II. Radio rides on sound and for that reason, possibly, still retains a certain mysterious quality (for me anyway). Sound whispers to us from the dark, and we’re quite skilled at parsing multiple sounds simultaneously, spontaneously. The image of women, their cheeks pressed into their palms, anxiously staring at a monstrous trumpet that topped my grandmother’s radio during the war in Europe has become for me an icon, preserved like an old sepia print, of the Greater World under deep clouds of advancing doom. Television dawned in the 1950s and rapidly became the eye by means of which we watch Big Brother, a very flexible fellow who was once Ed Sullivan in grey and who remains, to this day, all the kings, the kings’ men, the kings’ clowns, in bright flickering colors.
The deceptive aspect of TV is that it seems to be the best source of news because it is the most visual of media. It’s vivid, now, and the camera moves, like our own eye, its focus seemingly drawn by what’s important. But what the bright eye shows is less, much less, than is actually there. We see what we, in fact, can’t see. From Detroit I cannot really see what happens in Afghanistan, in Washington, or in Los Angeles. If I think I do, I am deceived. If I were there, I wouldn’t see that. But I’d see a whole lot else—and that would make me think quite differently about the images I’m fed. We’re accustomed to hearing what we cannot see; therefore radio is, oddly, more natural somehow. But to see what we cannot—not without days or weeks of arduous travel—gives us a peculiar experience for which, in effect, millions of years of evolution have failed to prepare us. The images are sharp, vivid, dynamic, accompanied by matching sound. But they’re not happening to us. Not really. They are far away. And we can’t move that eye. We seem to—but others really move the view. A small, tiny square looks here, looks there. But when I move my gaze—why then all I see is the window of my living room and a gauze curtain still carrying its Christmas lights.
As through a glass, brightly—but certainly not face to face. The vast sociological mirage evolves from this radically diminished perspective. It deceives by its vivid sensory message; it makes the effort to acquire knowledge through symbols seem too costly and inefficient; we know less even as we feel overwhelmed by information. And that information is shrunken and deformed into something primitive, made on purpose by many teams, even as they compete with one another, to shape that deceptive something we call the public will.
Labels:
Media,
Perception,
Radio,
Television