Goethe once used Elective Affinities as the title of a novel (Die Wahlverwandtschaften). The German literally means “relationships of choice,” but Goethe’s indirect reference was to an old way of describing why certain chemicals combined, and in English that phrase had been “elective affinities.”
I got to thinking about relationships in the context of culture, that maddening and fascinating, ubiquitous yet ephemeral phenomenon. And my thought ran thus: You have no choice in the culture where you’re born; and later, when choices do appear, they are greatly hemmed in by your total experience, so much so that leaving one culture for another is often a climactic experience—as exemplified especially by religious conversions. Goethe’s title came to me as I was walking along—inevitably, because my mind produces associations in several languages. In Goethe my obedient brain found the two concepts I was thinking about, association and choice, neatly combined into a single word.
This set me off on the sort of thing that can easily fill an hour’s ambling exercise. Language. The German tongue sticks to its roots—whereas English has been seduced by Rome, though indirectly, by way of France: the Norman Invasion. Wahl and Verwandtschaft are both good old Germanic words; both elective and affinities have Latin roots. In Old English both relationship and affinity were expressed by the word gesibnes (in German, to this day, Sippe means blood-relationship and may come from the same root). But where does Verwandtschaft come from? It comes from the root within it, Wand, or wall; those related occupy the same walled enclosure. For folks like me an Old English dictionary is a godsend. The one I use is here. It tells me that wall is used in the same way in Old English. The word was weall. Thucydides strongly suggests that walls and wealth helped produce each other—but weall has no link to weal, so far as I can discover. The Old English for commonwealth, however, was Genæwela, which holds echoes of the current German word for community, Gemeinde. I wish I had a dictionary for Althochdeutsch (Old High German) as well…
Goethe’s aim in using the phrase was to ponder sexual and love relations in a context both chemical and transcending that level—and we talk of “chemistry” between two people, a phrase that, to the best of my knowledge, arose quite recently. Goethe’s contemporary, Swedenborg, saw this pattern in the heavenly reaches beyond. He said that souls up there gathered into communities by affinity—as did the damned in hell; both did so of inner choice. And we recognize the same thing in nature when we say that birds of a feather flock together. As above, so below. Swedenborg was 61 when Goethe was born (in 1749); Goethe was 22 when Swedenborg died (in 1772).
Marching along—it was a gorgeous day today, cool, sunny, a light wind—I was thinking: Culture is another instance of being “walled in together,” as it were, but the choice is left to our genes. But within a culture, later, when our choice expands, we select our associations by elective affinities. And if Swedenborg really did converse with angels, we are likely to do a great deal more of that after we shuffle off this mortal coil. Did I just say coil? Well, let’s not go there beyond saying that it also comes from the French...
Sunday, May 31, 2009
Saturday, May 30, 2009
Bell, Book, and Kindle
Google’s Library Project, an initiative to capture digitally the contents of major libraries across the world, has made it possible for someone like me, far from the centers of academia, to look at old books at my leisure. Google’s Book Search will discover and display very old objects; if the books are in the public domain, I can read them in the original—if I don’t mind staring at a screen or paying for the paper and the toner. An example of such a book is the sampling of a 1758 biography you can see here. I could have lifted the page into this blog as well, because Google also lets you “quote” it electronically; I didn’t because it is too big.
Speaking of paper and toner, if my expenditures on these, used strictly to print books from the web, begin to approach $200, I would be well advised to spring for a digital reader; the cheapest of these are now available at around that price point. In addition to Amazon’s Kindle, I encounter BEBOOK and Plastic Logic as competent competitors—and my guess is that electronic giants (Sony comes to mind) will magically morph their handheld phones into almost paper-thin but pleasingly rigid book-readers with roughly pocketbook-sized displays easy to slip into a briefcase.
Dime novels surfaced about midway through the nineteenth century, but these were a far cry from serious books. As I recall my youth in Europe, books were expensive but highly valued products in family cultures like my own. Their value—and here I mean both price and content—turned generations into obsessive collectors (and rescuers) of books. And every time we move, we re-live the curse of Marley’s ghost, condemned to drag along behind him, where’er he goes, a chain of heavy ledgers; Marley had it easy compared to me. But these are burdens we gladly bear; and people in my generation—and, as I’m observing, those in the next one too—rank dumping books into the trash as just short of infanticide.
The incredible power of symbolic compression that electronic storage has put in our hands permits us to store 1,500 books on a Kindle 2—and the weight of those volumes in no way increases the weight of the reader itself. That weight? 10.2 ounces. But the Kindle is possibly not even a Model T yet in this category; storage will increase. As people are wont to say, these days, looking at devices of this type—habituated to relentless, domineering, all-flattening, steam-roller Change—“That’s the future.”
Full Stop.
Bell, book, and kindle? The Encyclopedia Britannica informs me that the parent of that phrase refers to the ceremony of excommunication or anathema in Catholicism. The bell signified the public nature of the act, summoning the community of all believers; the book represented the Church’s authority; and the candle was meant to symbolize that the anathematized individual might still repent. The thirteen clergymen administering the ceremony (a bishop and twelve priests)—and I like that number, always lucky for me—also carried candles as they entered the presumably always dark chamber in what is invariably called the Dark Ages.
Now thinking of this I had one of those attacks the Germans call Galgenhumor, meaning “gallows humor.” What if, I thought, Bell, Book, and Kindle (here we could think of Bell as the telephone) were an analogous ceremony? What if it symbolized the anathematizing of Modernity? Alas, such thoughts occur. But I’m just trying to be my age; old men are supposed to be grumpy. On a more sober level, the thought runs along these lines. Okay, assume that Kindle and its kind do indeed get cheaper, faster, slimmer, handier—indeed better and better and better. And suppose that books begin to fade away, as newspapers already seem intent on doing. Suppose that real books made of paper, glue, and board become cult objects discussed by tiny groups of aficionados. By analogy, yachting lives on, but transport by sailing ships is “The Past.” Let’s assume that all this happens. Could this curve really continue to go up, up, up and never down? Is that the way curves really work? I doubt it.
To change the subject—but yet still to stay on course—what is the future of fossil fuels? My favorite image is that humanity has been travelling on the surface of a huge bubble of oil. Imagine a soap bubble, if you please. We’re a vast, invisible bacterial culture on its surface. Now I have another image. It is that of the last spastic techie purchasing the last Kindle which contains every letter now contained in the Library of Congress—instantly accessible, perhaps, by a brain-implanted chip. And at that very moment, the bubble of oil bursts. I leave you with that image. I wonder if a drastically slashed per capita energy consumption obtained at much, much higher cost (see further here) will support what today we piously anticipate as The Future.
P.S. Brigitte after reading this (puns are contagious): "Could the grandchildren of our grandchildren go to bed with a candle instead of a Kindle?"
Labels:
Books,
Catholicism,
Technology
Thursday, May 28, 2009
Classification Exercise
Clicking on the image should make it show up in a separate window. You might wish to do that before you begin to read.
When I go off on my walks, my mind sometimes takes over and produces this and that. This chart was the consequence of a stray thought about the Hindu caste system—and my dissatisfactions with it. I began to "revise" it, and one thing led to another. Getting home I made a quick sketch, and Brigitte and I then fell to talking about it. It was an interesting discussion—especially as we thought of different kinds of occupations or callings that we hadn't as yet accounted for, nor could we find a neighboring label that might be thought to fit them. I thought that I'd reproduce the schematic as we ended with it after handing the sheet of paper back and forth for a while. It's useful as an exercise in personal reflection and orientation. To use a pie chart as the illustration was Brigitte's idea when I began grousing about the difficulties of getting this scheme coded in HTML. The pie chart saved me hours of useless work.
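(For the technically inclined: Brigitte's shortcut takes only a few lines of Python with matplotlib. The sketch below is merely illustrative—only four of the eight headings, DO, TEND, THINK, and CREATE, are named in this post, so the other four labels are placeholders, and the equal slices are my assumption, not a feature of our chart.)

```python
# A minimal, illustrative pie chart of the eight-way scheme. Only DO, TEND,
# THINK, and CREATE are named in the post; the remaining labels are placeholders.
import matplotlib.pyplot as plt

labels = ["DO", "TEND", "THINK", "CREATE",
          "CATEGORY 5", "CATEGORY 6", "CATEGORY 7", "CATEGORY 8"]
shares = [1] * len(labels)  # equal slices, assumed for simplicity

fig, ax = plt.subplots(figsize=(6, 6))
ax.pie(shares, labels=labels, startangle=90)
ax.set_title("Classification Exercise (schematic)")
plt.savefig("classification_pie.png", dpi=150, bbox_inches="tight")
```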
The kinds of questions we posed as we discussed this scheme were: Where did we, as individuals, spend most of our time—or, as the question soon developed, where did we spend our time at different periods of our lives? How do these categories relate? If we take the scheme as presented—thus assuming that it represents major groupings differentiated by (1) an end or purpose, (2) a kind of activity, (3) an occupational category, and (4) possibly also a talent or inclination—are the groupings genuinely distinguishable, and thus a good classification, or are they too arbitrary?
Let me give an example of my reasoning, using the DO category. I've placed in it occupations and callings that require the entire person acting in an individual capacity, usually body and mind (although being a Judge or an Executive might be a weak instance). Interestingly, soldiers, actors, and dancers thus come to share one of the eight great clusters. Another example is provided by TEND. Here we have all the "caring" occupations—farming, forestry, fisheries, childcare, religious pastoring, medicine, and maintenance—all the way to the final disposition of the body after death. Maintenance activities are put here because they are necessary in caring for or tending to people and to the environment.
Another exercise we enjoyed was discussing how some of these activities shade off into others. Thus Teaching, in THINK, shades off into TEND, in that teaching is a kind of nurturing. THINK also shades off into, or is nourished by, CREATE—where theories are formed or vision illuminates thought. The two categories are kept separate, however, by way of acknowledging what we so often observe: some people are decisively more thinkers than creators, or the other way about. Creators often lack what only those called to the rigors of thought can accomplish; thinkers are often so embroiled in the intellectual that the lightning of inspiration goes unnoticed. Fortunate indeed the people whose temperament extends over the entire octet presented here....
We hope that you have some fun with this!
Labels:
Classification
Adjudication with Representation
She had a compelling life story, Ivy League credentials and a track record on the bench. She was a Latina. She was a woman. She checked “each of the grids,” as Mr. Obama’s team later put it. And by the time the opportunity arrived, it became her nomination to lose. [New York Times, “Using Past Battles to Avert Pitfalls,” May 28, 2009.]
Media eruptions in the wake of Sonia Sotomayor’s nomination to the Supreme Court reminded me again of the very curious conception of humanity—what it is to be a human being—that currently dominates our public culture. What appears on the surface to be a deeper commitment to the cause of justice is rooted, in actuality, in a rather deformed idea of human intelligence and conscience. The root of the idea seems to be determinism. It says: “You can’t possibly understand any group in the society unless you are a member of it.” Conclusion? We must strive to have a member of every group—well, a member of the large groups, anyway—in every major institution. Better yet, each group should be represented in the same ratio it holds in the population. An alternative rooting might be belief in the ruthless (or helpless) selfishness of people. On this view no one will conscientiously consider the rights and needs of a group of which he or she is not a member. Whites can only ever legislate or adjudicate for whites; they will always discriminate in their favor. Consequence? We must balance white power with other colors in our Rainbow Coalition.
Among the oddities in this nomination is that I’ve seen Sotomayor described as a “non-white Hispanic” in one political blog—on the very strange understanding, perhaps, that a Latina is of some other race? Or did the blogger happen to behold Sonia when she was deeply tanned? Or because the benighted Census Bureau gives us choices to call ourselves White or Hispanic—whereas elsewhere the Bureau reports, in sub-sub classifications, some Hispanics as White-Hispanics and others as Black-Hispanics… Another oddity is that a highly restrictive classification, “Ivy League credentials,” should be right there in a list of things indicative of diversity. Should we not, by the same logic, strive mightily for judicial appointments from no-name colleges and never-heard-of universities too?
I find all of this exceedingly strange. And if I were a genuinely modern man, I would despair. I note that the 2000 Census discovered only 92,000 people resident in the country who were born in Hungary. Let’s multiply that number by four to represent children and grandchildren of these people; in that case the Hungarian-American “ethnicity,” to borrow a modern word, would amount to a slender 368,000, thus barely more than a tenth of 1 percent of the U.S. population. In the modern mindset, if we take it seriously—which, fortunately, none of us really does—I could never ever hope to be represented on the Supreme Court by a member of my own unique ethnicity. I’m judicially dis-appointed.
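(The arithmetic, for anyone who cares to check it—the total population figure is the 2000 Census count of roughly 281 million:)

```python
# Back-of-the-envelope check of the Hungarian-American figure above.
hungarian_born = 92_000                  # 2000 Census count of Hungarian-born residents, from the text
with_descendants = hungarian_born * 4    # the text's rough multiplier for children and grandchildren
us_population_2000 = 281_400_000         # approximate 2000 Census total population

print(with_descendants)                                 # 368000
print(f"{with_descendants / us_population_2000:.2%}")   # about 0.13%
```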
I know. I know. Valid historical explanations can be spun here pointing at prejudicial practices well enough documented to make me admit, if I ever doubted it, that people are tainted by original sin. But these very explanations must be assessed in light of another fact, to the modern mind’s perhaps great astonishment. It is that people appointed to the bench who have “checked each of the grids” have subsequently, astonishingly, voted in cases based on their unique judicial take on the issues and on their own conscience—not at all as they were supposed to. And not surprisingly. Appointments to the High Court are for life—and the officials who serve there are genuinely freed of political pressure; they never need to brave the process of confirmation again. All this is well known. And reassuring. Why then are we going through the same mad St. Vitus dance every time a new judge is appointed? And why are the beady eyes of the Media never focused on any issue except this one?
Labels:
Culture
Sunday, May 24, 2009
María Amelia López Soliño
The oldest blogger on the Internet has died at age 97. María began blogging on December 23, 2006, her 95th birthday. “My grandson, who is very stingy, gave me a blog,” she reported. The website is here—and it’s a marvel. The Google Blogger site acquired 1,056 followers and, since its inception, has drawn 1.7 million visitors from five continents. I would suggest that you look at her very first entry, found here, where the words that I’ve just quoted appear in Spanish, and the second, here; both show what she looked like as a young lady. The entries also hint at the reasons why her site became so popular.
Is it true that hi-tech pursuits are the domain of youth? Yes in a sense, no in another. They’re certainly the domain of the young in spirit. The old in body sometimes need a nudge, María by her grandson, for instance; in my case it was John Magee with his amusing Patioboat and his wife (and my daughter) Monique, who drew my attention to it and, with a hint or two, urged that this project be launched.
One final note. The blogosphere provides an absolutely amazing and heretofore totally inaccessible view of wonderful people all over the world seen, as it were, from the inside. Bill Vallicella makes much the same comment on Maverick Philosopher, where I first learned of María’s existence—and passing.
Labels:
Culture
Saturday, May 23, 2009
Mma Ramotswe
If that name makes you smile instantly in anticipation of something pleasant, indeed tempts you perhaps to fetch a cup of something to enjoy it more, don’t. This is short. (Well, as usual, it got long instead. So go ahead.) But if, instead, you find yourself puzzled, be prepared to get instructions on how to enjoy your leisure just a little more.
Years ago now I abandoned the habit of urging so-called high literature on people, having discovered that, to some extent, we understand high literature to be that which we enjoyed in youth, a part of what was then The Canon. What we did not personally absorb, we neither canonize nor recommend. Meanwhile, I’ve discovered that many of my friends of old have quietly slipped out the backdoor of the Canon as the modern party heated up; they’re now relaxing in Walhalla or, more likely, still creating great things in some distant reaches of eternity.
But instead of praising, say, Thomas Mann’s Joseph and His Brothers, Sigrid Undset’s Kristin Lavransdatter, Hermann Hesse’s Magister Ludi, Tsao Hsueh-Chin’s Dream of the Red Chamber, Murasaki Shikibu’s The Tale of Genji, or the collected works of Jane Austen (the bulb is out in the attic, so I couldn’t immediately see other treasures of mine)… these days I recommend literature that is, perhaps, not of the highest rank but, in some ways, of the highest value: wonderful storytellers whose inner vision resonates with mine. And one of those is Alexander McCall Smith, creator of Mma Ramotswe, the sole and therefore No. 1 Lady Detective of Botswana (in fiction, of course). And, yes, one of the series is coming to a Moving Picture Theater near you SOON!
There are at least ten novels in this series. The most recent appeared days ago in hardback—and I saw it at COSTCO, therefore it must be good too. New readers should begin at the beginning with The No. 1 Ladies’ Detective Agency. In some ways it is the most painful of them in that it brings views of the darker side of Africa, especially its diamond mining enterprise. These serve to place Mma Precious Ramotswe, a comfortable sort of lady described as “traditionally built,” into the context of Botswana, with her own family history. Botswana? You will learn to appreciate the country the more you hear about it. Most people reading this blog in any detail will, I guarantee, be at least mildly pleased and well instructed by the experience.
This much for the general reader. The rest is for those of you who already know Mma but have wondered, as have I, what that title means. Mma is used to address women, Rra is used to address men. These titles are nowhere explained. And as I discovered today, the Internet is literally awash in blogs, discussion groups, and speculation—not least unhelpful words from Alexander McCall Smith not quite explaining the words. Ghulf Genes to the rescue.
First we deal with the pronunciation. Mma is voiced as “Ma,” Rra is voiced as “Ra” but with a rolled R. Please apply to native Spanish speakers on how to roll that R. Some swear that the roll of the R should follow the Ra, thus “Rar”; others dogmatically assert that there is only one R to roll, the first one. Make your own choice. As for how to roll the R silently as you read, on that subject I’m still searching for counsel.
The words are not abbreviations of longer words—but they do have plurals. Many Mmas, if I may put it that way, are called Bomma in the native language most Botswanans speak, Setswana (sometimes rendered Tswana). Many Rras are called Borra. The author of the series says that these are honorifics. The less informed say that they mean Madam and Sir respectively. I’ve searched high and low and finally managed to discover a book, entitled Culture and Customs of Botswana, by James Raymond Denbow and Phenyo C. Thebe, Greenwood Publishing Group, 2006. This book has 244 pages whereas the average wisdom available from the web is under 45 words. In this book, on page 167, which you can look at here, the authors finally draw the veil from the mystery and reveal that Mma derives from “Mother” and Rra from “Father.” The Lord be praised. What a surprise!
Brigitte and I got to talking, she remembering a German and I a Hungarian childhood. And we both remember that it was customary in our youth for people to address all ladies who were older as “aunt” or “auntie” and all males as “uncle”—and this with perfect strangers. We think there is a parallel here. Monique may chime in with parallels from Bolivia in Spanish. So there’s the final word on the matter. Get busy, Google, and bring this good message to the entire Internet. SOON.
Labels:
Fiction,
Literature,
Mma Ramotswe,
Ramotswe Mma,
Smith Alexander McCall
Friday, May 22, 2009
Paradigm Shifts - II
One of the more fascinating aspects of this subject—world views, paradigms—is that we apparently know much more about the cosmos than we know about our own backyard. Studies of the big bang knowledgeably discuss events said to have transpired one or two millionths of a second after the big bang began, but our very best theories of the solar system’s origins are speculative at best. I will document this presently but want to draw the conclusion first. The closer we are to actual realities—and therefore the more ample our data and the better our measurements—the more obvious become the paradoxical features of reality and the more clearly we see that our explanations are groping, flailing, and often rather arbitrary. For this reason astronomers are much more realistic and tentative in discussing the solar system than they are in theorizing about the universe. And apparently far more people spend far more time on the cosmos than on the messy solar system. But let’s look at that system.
Today’s dominant theory of the solar system’s formation presumes that it is the condensation of a cloud of dust and gases, a nebula. This cloud eventually collapsed into our sun when matter at its center aggregated, initially by random motions. Under the force of the increasing gravity, the rest of the cloud flattened into a disk rotating in a counter-clockwise direction as seen from above the sun’s north pole. Within that disk clumps formed. They are called planetesimals—planetary seeds. Planetesimals aggregated yet more material and became planets. By secondary processes, these bodies in turn, also rotating, caused moons to form around them. The planets in the aggregate account for about 1 percent of all the matter in the original cloud, the sun for 99 percent. This suggests that the sun itself should have the greatest angular momentum in the system in that it has the greatest mass. The inner planets are small, dense, and heavy; the outer are large and gaseous—and this distributional effect is attributed to sorting by density under the gravitational influence of the sun. All this makes sense, of course: one thinks of a blender, with the rod of the blender creating a little hole in the middle representing the pull of gravity.
Indeed the picture, and the theory, would be indisputable if only the solar system—and stellar clouds, for that matter—obliged us by behaving properly. For starters there is the fundamental problem, explained away by various stratagems, that vast clouds of dust just don’t clump up but, instead, have the tendency to spread. But never mind that. Assuming that the process starts somehow, proper behavior would be indicated if (1) the sun had nearly all of the angular momentum of the system, (2) all of the planets rotated around their axes in the same direction as the sun, i.e., counter-clockwise, (3) their axes were oriented in the same direction, i.e., parallel with the sun’s, and (4) their moons also obligingly circled them in the same direction and with the same spin direction as the planets themselves exhibit when circling the sun. The composition of planets and moons should also be in line with their location. Thus the moon should be as dense as the Earth and the moons of Jupiter as gaseous as that planet.
The real facts are otherwise. The sun, with 99 percent of the mass, has only about 3 percent of the solar system’s angular momentum—Jupiter has some 60 percent. Venus, Uranus, and Pluto rotate clockwise. The axis of the Earth is tilted at an angle to the sun’s. The axes of Uranus and Pluto point at the sun. Some of Jupiter’s moons circle it counter-clockwise, others clockwise. And the moon is of much lower density than the Earth, so that its origin cannot be—and is not—attributed to condensation. We are thus faced with a sequence of questions: How did the sun lose its momentum, how did the rebellious planets acquire their contrary rotations, how did the axes of some planets get their tilt? Where does the moon originate and when did we get her? In addition to such questions we have yet other strange anomalies: the asteroid belt is one, the rings of Saturn another, and the comets (they come in two categories) yet a third. Pluto presents us with the interesting fact that it has been demoted from planetary status in our lifetime; it also has a rather eccentric orbit.
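(For the numerically inclined, the Jupiter figure is easy to check with a back-of-the-envelope calculation; the sun’s share is more sensitive to assumptions, but it comes out tiny on any of them. Here is a rough sketch in Python, using standard published masses, orbital radii, and speeds, and an assumed moment-of-inertia factor for the sun.)

```python
# Rough order-of-magnitude check: orbital angular momentum L = m * v * r for each
# planet (circular-orbit approximation), spin angular momentum for the sun treated
# as a rigid rotator. The planetary values are standard published figures; the
# sun's moment-of-inertia factor (0.07 here) is an assumption, and published
# estimates of the sun's spin vary, so its share comes out anywhere from well
# under 1 percent to a few percent -- tiny either way. Jupiter's share is robust.
import math

# planet: (mass in kg, mean orbital radius in m, mean orbital speed in m/s)
planets = {
    "Mercury": (3.30e23, 5.79e10, 4.74e4),
    "Venus":   (4.87e24, 1.08e11, 3.50e4),
    "Earth":   (5.97e24, 1.50e11, 2.98e4),
    "Mars":    (6.42e23, 2.28e11, 2.41e4),
    "Jupiter": (1.90e27, 7.78e11, 1.31e4),
    "Saturn":  (5.68e26, 1.43e12, 9.70e3),
    "Uranus":  (8.68e25, 2.87e12, 6.80e3),
    "Neptune": (1.02e26, 4.50e12, 5.40e3),
}
orbital = {name: m * v * r for name, (m, r, v) in planets.items()}

k, M_sun, R_sun = 0.07, 1.99e30, 6.96e8     # k is the assumed moment-of-inertia factor
omega = 2 * math.pi / (25.4 * 86400)        # ~25.4-day mean rotation period
L_sun = k * M_sun * R_sun**2 * omega

total = L_sun + sum(orbital.values())
print(f"Sun's share:     {L_sun / total:.1%}")               # well under 1% with these assumptions
print(f"Jupiter's share: {orbital['Jupiter'] / total:.1%}")  # roughly 60%
```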
Notice the elegance of the original theory and the messy details of actual behavior. In all such cases science has a tendency to introduce ad hoc explanations which, while plausible in human experience (“Shit happens,” as we say), are difficult to reconcile and to embed neatly in a coherent and unified theory.
The nebular theory, which goes way back to Descartes (who imagined a gigantic whirlpool in a cosmic fluid), later developed a “catastrophic” competitor. Under this scenario, the sun encountered another sun wandering the cosmos. The intruder, interacting with the sun, ripped a huge tide of matter from the sun which, condensing out, formed the planets. This theory had the benefit of at least potentially explaining the sun’s low angular momentum—by hypothesizing that the visiting body gave up some of its angular energy to the ripped-out tide. The action itself—being, in a sense, wide-open to the imagination—could at least vaguely explain the strange rotations of Venus, Uranus, and Pluto. But the catastrophic theory lost momentum. It had its origins with Buffon in 1745, thus in the period leading up to the French Revolution. All things “revolutionary” were strongly resisted in the next century—hence the success of Darwin’s gradualistic theory of evolution based on Lyell’s gradualistic geology, which Lyell in part formed to counter Georges Cuvier’s catastrophic vision; Cuvier, of course, had studied actual geology and had reached his startling conclusions by looking. The realistic critique of the “tidal” theory is that such a violent encounter would more likely result in the dispersal rather than in the condensation of matter.
Despite the rejection of catastrophic origins, that theory still plays a major role in today’s consensus opinion, in yet more ad hoc formulations. Thus the contrary rotations of three planets (or two if you don’t include Pluto) are explained by collisions and interactions between planetesimals; but how these bodies got into erratic motion in the first place is not really developed. Similarly, the asteroid belt is explained either as the shattering of a planet or as the result of Jupiter’s inhibiting influence preventing aggregation. G.P. Kuiper (known for predicting the Kuiper Belt on the fringes of the solar system) speculated in 1951 that the sun’s planetary system is the consequence of a failure. “It almost looks as though the solar system is a degenerate double star,” he wrote, “in which the second mass did not condense into a single star but was spread out—and formed the planets and comets.” Jupiter’s distance from the sun and the mass of the planets as a whole are about right (based on observation of binary systems) to support this hypothesis. But this notion also fails to explain the existing anomalies. Theories for the moon’s origin are based either on catastrophic interactions, e.g., the moon having been ripped from the relatively light outer mantle of the Earth by a passing body (a tidal theory in miniature), or on erratic wandering bodies, with the moon being captured by the Earth. Comets, the strange orbits of which do not fit the nebular theory—and there are short-period as well as long-period comets—are often the deus ex machina invoked to explain catastrophic events. To be sure, they don’t properly fit any elegant picture of the solar system’s orderly formation.
The closer we are to our subject, the more variables appear to be present, and therefore the harder the mathematical modeling of systems becomes. In cosmology, for instance, a few simple subatomic particles plus gravity, pressure, mass, and heat are the manageably few variables. Even so, the big bang theory failed to predict the formation of suns and galaxies until Alan Guth offered a rather arbitrary scenario in 1980. In its earliest stages, Guth said, the universe temporarily accelerated its expansion, a process known as “inflation.” In this process evidently gravity had to work in reverse (!!). There is no way to verify inflation, but it is accepted. Without this “kick” or “leap” of faith, the theory would have died an early death. What kept it alive—and the reason why Guth found support—was that the overall expansion of the universe, discerned from the red shift, had to be explained somehow. But the point here is that even a simple model with just a few variables had, and continues to have, problems. How much more so the solar system—and never mind DNA.
The reason mathematical models are so important is that they seem to bring coherence to what at first appear as irreducibly chaotic phenomena. Modeling imposes an order on chaos, but usually at the cost of simplifying the inputs by leaving out much or averaging what seem to be minor influences. Models then produce predictions which can be tested by observation. The difficulties and limitations of modeling, however, are neatly illustrated by the cleverness of pre-Galilean modelers of the solar system. Their model accurately predicted the future positions of planets although it was constructed on the assumption that the earth stood still and all else moved. It also became very complex as ever new solutions had to be devised to accommodate ever new observations that failed to fit properly. The Ptolemaic model was eventually replaced when better instruments emerged. Today’s big bang theory is accumulating ever more hypotheses and ever more work-arounds, as the Ptolemaic model did in the past. The big bang is under challenge from all sides. It requires Guth’s strange inflation; the cosmos displays large structures for the formation of which insufficient time has passed—one of these is the Great Wall, the first such discovered, and others have been noted as well; a picture of it is here. In recent decades, furthermore, gravitational anomalies have been observed, countered by positing undetectable dark matter and dark energy which together constitute the overwhelming mass of the cosmos (!??). Indeed, doubters within science itself have identified twenty or more anomalies the theory fails to explain adequately—and plasma physics, which emerged in the mid-twentieth century in sophisticated form, offers alternative avenues of explanation. Cosmology is therefore gradually sliding into crisis.
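(A footnote on the Ptolemaic point above, for the technically inclined: a single deferent-plus-epicycle construction, both circles traversed uniformly, already reproduces the retrograde loops an observer on a “stationary” earth sees—which is why the scheme could predict positions so well for so long. The sketch below is purely illustrative; the radii and periods are arbitrary numbers, not a reconstruction of Ptolemy’s actual parameters.)

```python
# Toy deferent-plus-epicycle model: the planet rides a small circle (epicycle)
# whose center rides a large circle (deferent) around a fixed earth. With the
# epicycle turning faster than the deferent, the apparent longitude periodically
# runs backward -- retrograde motion. All numbers are illustrative, not Ptolemy's.
import math

def epicycle_position(t, R_deferent=10.0, T_deferent=12.0, r_epicycle=4.0, T_epicycle=1.0):
    """Geocentric (x, y) of a planet on an epicycle whose center moves along the deferent."""
    a = 2 * math.pi * t / T_deferent   # angle of the epicycle's center on the deferent
    b = 2 * math.pi * t / T_epicycle   # angle of the planet on the epicycle
    return (R_deferent * math.cos(a) + r_epicycle * math.cos(b),
            R_deferent * math.sin(a) + r_epicycle * math.sin(b))

# Sample the apparent longitude over time; watch it periodically decrease (retrograde).
for step in range(25):
    t = step * 0.5
    x, y = epicycle_position(t)
    print(f"t={t:4.1f}  apparent longitude = {math.degrees(math.atan2(y, x)):7.1f} deg")
```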
Einstein’s theory of relativity is also eroding as discoveries in quantum physics have produced physical evidence for faster-than-light communications between subatomic particles—to name just one instance. Quantum physics is itself the earliest child of a physics largely developed in the changing—and indeed darkening—times of what seems a new era to me. It dissolves Einstein’s reality into energy, suggests that the predictabilities of physics are confined to macroscopic aggregates of energy, i.e., matter as we see it, and that beneath it is a boundless indeterminacy which may require an observer to become visible at all. The universe is thus a vast cloud of probability; observation causes probability waves to collapse; the observer becomes integral to the universe and, in a sense, creates reality.
Arguably if the observer really has a role in quantum physics, that observer should be explained as well. The positivistic explanations of intelligence seem inadequate to the purpose and may foreshadow changes in the area most resistant to change, the naturalistic theories of life.
Finally, in psychology, a hydra-headed cluster of genuinely scientific findings, both positive and negative, is weakening the positivist consensus that mind is an epiphenomenon. Interesting approaches come from the angle of the paranormal—including psychic powers in animals and hard research into reincarnation. Negative data show the persistent failure of naturalistic science to locate memory in the brain itself.
In sum, the paradigms, they are a-changing. Will the new theories, as they emerge and take hold, exhaustively describe the cosmos, life, and man? I doubt it.
The last word belongs to Heraclitus, who claimed that there is a tendency in nature for all things to become their opposites. He labeled this enantiodromia, meaning “counter-running.” One might say that every such transformation produces a new vision of reality but never one that is complete. Round and round she goes, but where she stops nobody knows.
Labels:
Nebular Theory,
Paradigms,
Science,
Solar System,
Tidal Theory
Wednesday, May 20, 2009
Titles. Publishers, Please Listen!
Way back when, I wrote a novella entitled The Chained Karma of the Plutonium Priest—and if that doesn’t draw you, you’re not a science fiction fan. The story was published in Galaxy magazine, but the editor there titled it Plutonium. Yes. Unfortunately, Galaxy had earlier serialized one of my novels under the title of Helium, so this was, as it were, in the ballpark. Sort of. Helium, by the way, first saw the light as A Hostage for the Hinterland, my own title for it. No room for a long title? Column-inch-challenged? I shake my head. Plutonium did its magic anyway despite being a nickname. I was nominated science fiction writer of the year; I didn’t get the coveted prize, but I was up there on the stage with two other contenders. Later yet this same novella, genuinely expanded into a novel, not merely bulked up—in effect I wrote a brand new novel with the same plot—saw hardback publication by St. Martin’s Press. But they called it The Karma Affair, hoping to position it as a “cross-over” novel; they put a so-so but mysterious-looking dust jacket on the thing. Now, to be sure, I never wrote anything in science fiction which wasn’t a cross-over book. I’m just not the kind who cleanly fits a genre! Anyway, that book did rather well despite its undistinguished title, enough so that it got translated into a number of languages, including Italian. The Italians, finally, gave it a good title: La Fisica del Karma. This was close to being descriptive of one aspect of the book, as neither Plutonium nor The Karma Affair really was. My own title, the best of them all, went to waste. But publishers have this right, you see. They buy the property but reserve the right to butcher the title.
Several incarnations later, now working in reference publishing, I originated a title called Manufacturing U.S.A. Here I got lucky. The deciding personality was Deidra Bryfonski at Gale Research. She overruled more junior editors who wished to change the name. No. She went with mine, and MUSA, as it came to be known, went out into the world with the handle that I gave it. MUSA promptly won Best Reference Book of the Year in its category and became the foundation of our company, Editorial Code and Data, Inc. The book is still appearing at two-year intervals, albeit its title now is Manufacturing and Distribution U.S.A., the scope having expanded and also its size: three volumes now. Its current editor is Joyce P. Simpkin, a young woman but a veteran editor of ECDI. That company name, incidentally, I formed in respectful imitation of what I’d always thought was a great name; I was using only its rhythm and its honest and direct statement of what the company did: Industrial Light and Magic. ECDI subsequently won several other Best Reference awards as well, and, generally, under Gale, we were and are edited by pros.
Title-inflation is very often a problem nowadays as Chiefs of This and That (Toxic Asset Disposal perhaps) proliferate—but occasionally, if the creator has a knack, title-deflation is something to be avoided. Are you listening out there?
Labels:
Fiction,
Publishing
Paradigm Shifts: I
This interesting phrase originated with Thomas S. Kuhn in a book titled The Structure of Scientific Revolutions (1962), available here. I came across it in the Econ Division’s library at one of my almae matres, Midwest Research Institute, early during my time there. I had no idea, lounging there and reading the book over a number of lunch hours, that its publication was a landmark event: a genuine cultural approach to science had been born. The piously accepted view until Kuhn was that science changed incrementally as theories were falsified. He proposed shifts of mood more profound and basic, in which a theoretical framework grown too overburdened by bad results and clever workarounds is finally thrown over, replaced by a competing view. This notion seemed natural to me because I’d grown up with ideas about culture gathered from my cyclic historians. Thus cultural views cycle radically when the benefits of a dominant world view have been exhausted. In the case of cultural forms, “inwardly” oriented cultures give way to those “outwardly” oriented. The shifts between eras are marked by “renaissances.” It took me most of my life to realize? suspect? that nineteenth-century Romanticism was actually a renaissance-like period—but one of those that stand between an “outward” and an “inward” era: those tend to be less splendid than the reawakenings that follow the “dark ages” of inwardness.
Paradigm shifts in science have some of the same flavor and are themselves powerfully influenced by cultural tendencies, but scientists themselves, at least in our times, have tended not to notice that, which is not really surprising in that science is supposed to be concerned with the “real” rather than the slippery, fuzzy, ambiguous, and maddening aspects of the typically human. The great sweep of contemporary science, now effectively visible to an energetic surveyor in some detail, reveals what must be disconcerting to the scientific mind—namely that physical reality itself is at least as maddeningly murky as the human.
It seems to me that we’re likely to see several very major shifts in paradigm in the course of the current century. The century itself appears to be the first or second step into the dark woods of an inward age—with all the chaos and trauma of transition still largely ahead. Thus the shifts in science will reflect the flavor of the age to come. Transformations are signaled in all major sciences: In biology Darwinism will be revised; in physics Einstein’s theories will be honorably retired; in quantum physics the Copenhagen interpretation will be jettisoned; in cosmology the big bang will fizz out; and in psychology the Freudian consensus, which began to quaver almost as soon as formulated, will be entirely replaced. Brave words—but only to the historically challenged; to others almost self-evident.
Such predictions are rather easier to make at certain times in history. All systems decay slowly over time. Falsifications, which are supposed to cause science to react rapidly, do not actually have that consequence because institutionalized priesthoods resist the piecemeal overturn of systems; but falsifications take place steadily and cumulate—and this accumulation is observable. More dramatically, heretics appear quite early in the process. They do not succeed initially but serve to synthesize and dramatize the conflict. They lead small groups to the fault lines of modern theories and stand there pointing at the strange vapors that rise up from the ground. Heretics are derided, marginalized, and suffer a range of ignominies. Nevertheless they do their job.
Not surprisingly many of these phenomena are also closely linked—so that weaknesses in physics show up in astronomy, in geology, in biology, and, once in biology, also in psychology.
The current paradigm is progressive in flavor. Roughly 14 billion years ago the universe burst into being in a grand explosion from a peapod-sized but almost immeasurably concentrated quantity of matter. The explosion created space and time as the matter expanded at colossal speed, spread, and settled into clouds, suns, and galaxies. The expansion is still going on. Space and time are functions of matter. The universe is ruled by four forces: gravity, electromagnetism, the strong force (it holds the atomic nucleus together), and the weak force (it explains nuclear decay and hence radiation). We’ve managed to integrate all but gravity into a single model, but a “unified theory” that includes gravity still eludes us. In addition to these forces and the motions they produce or inhibit, chance rules the universe. Life is a complex interaction of matter and energy (the two being fundamentally equivalent). Once life emerges it evolves, driven by a survival urge aided by changes brought on by chance; but since life appears to aim for greater complexity, matter may have a tropism toward complexity on the one hand and toward entropy on the other. Consciousness is a secondary phenomenon, a shadow of the underlying neuronal activity, not important in itself (an epiphenomenon); in other words, consciously or not, we would still do the same things. The universe may continue to expand until it achieves heat death or, depending on how much matter is present, it may begin contracting again and end, as it began, in a singularity.
Under this paradigm we cannot explain: (1) how the big bang ignited, (2) how galaxies formed without the presence of immense quantities of dark matter and dark energy that we cannot find, (3) how life originated, (4) why it tenaciously clings to forms defined by genetics, and (5) what any of this means.
The dominant cosmological paradigm is justified by Einstein’s theory that gravity is the deformation of space-time by the mass of matter and by observations that light from very distant galaxies is shifted to the red end of the spectrum; this shift is interpreted to mean that every galaxy is moving farther away from every other. If we imagine this explosion in reverse, it appears as if the big bang took place 14 billion years in the past. Interestingly enough, the theory assumes that space itself expands rather than that the galaxies are flying apart in a preexistent space—because the theories of Einstein teach that mass itself somehow “creates” space and time. A few have offered alternative explanations for the red shift, but the big bang theory has become fiercely defended orthodoxy; alternatives need not apply.
Just to make a distinction, I note here that in the Newtonian cosmology both space and time had absolute meanings and were thus not functions of matter. In that time only one of the four “forces,” gravity, was even recognized to exist. Newton didn’t know what it was and said “I do not frame hypotheses.” His cosmology was strictly descriptive.
[To be continued.]
Tuesday, May 19, 2009
Precision Mail
I was told by a reliable source that you were an unabashed conservative. Is that true? I sure hope so… Because the fight to TAKE BACK our country begins NOW, Ms Brigitte Darnay!

She read this message to me off the face of a direct mail envelope. I laughed spontaneously. Out in the sunroom we were, drenched in May light. Some of you know her well—and you will have also laughed. It’s truly funny. Oh Great Database, oh Holy Computer. How wise and true your aim, how marvelous your underlying logic. Oh, yes, Brigitte’s tendrils touch every interest; curiosity and wonder draw her on; nothing human is alien to her; her range extends from the lunatic fringe to wisdom’s Himalayas. And, to be sure, nothing propinques like propinquity, and if it’s in this barrel, surely it must be a fish.
I understand from news accounts that right wing groups are mounting big drives to defeat the next nominee to the highest court—whoever she is. They don’t think they can defeat the nomination, but the fight might open the purses of all those unabashed conservatives the Holy Computer has fished from the barrel. Money will flood in. Not from my unabashed conservative companion, but then the world today plays by the probabilities. Most of the names in the barrel are fish, and some can even write checks, and if the ratio of postage to yield is good enough, who cares, really, if precision mail occasionally hits the wrong target.
Monday, May 18, 2009
"Liberal" and Other Changing Words
Some words carry such a heavy cargo of meanings that controversies can boil up around them almost spontaneously. “Liberal” is one such word, here taken in its adjectival sense. It carries positive and negative connotations. Liberal describes a kind of education. It can mean generosity or amplitude (“a liberal helping”), also loose or free (“liberal translation”). One of its meanings is broadminded, thus “not bound by authoritarianism, orthodoxy, or traditional forms.” An obsolete meaning of it (obsolete according to Merriam-Webster, but I’m not so sure myself) is licentious, thus lacking moral restraint. The word’s root is from the Latin (liber, free), thus free concerning often restrictive customs and free with means often tightly held. In politics it once meant favoring free trade—what today is usually labeled conservative, but the older meaning still survives in Europe.
Curiously a liberal education, strictly speaking, is the medieval curriculum called the trivium and the quadrivium. Brigitte once gave me a wonderful book on this for Christmas, The Trivium, by Sister Miriam Joseph (Paul Dry Books, 2002), something of a classic on the subject. Let’s look at how “liberal” that curriculum sounds: the trivium itself is logic, grammar, and rhetoric; the quadrivium, according to Sister Miriam’s description, is “the four arts of quantity pertaining to matter,” thus arithmetic, music, geometry, and astronomy. Liberal once, obsolete now. “Music?!” a modern would say. Words gain meanings and lose them. Courtesy once had a vastly greater meaning than it has at present; I have in mind Castiglione’s The Courtier; rhetoric once had status enough to label a whole curriculum; now people have trouble spelling the word.
“Stereotyping” is another word that has undergone extensive redefinition over time; it once meant a metal plate made from a pattern—to be used for printing. It used to mean making many true copies of a single mold; it has come to mean imposing a single meaning on highly variable and changing situations. Functionally, I think, it is a word meaning prejudice, but precisely because prejudice has become extremely pejorative in the wake of the civil rights movement, another word needed stamping out, as it were. Prejudice itself is a word that has undergone great transformations. Edmund Burke (1729-1797) knew it, and used it, to mean pre-judgment, consensus, custom, tradition—thus in a positive sense*. Each of those words, in turn, resembles a stock on Wall Street if we take a long enough view. Up they go and down, and around and round they go…
All this in shy reaction to a small controversy on another site which intrigued me enough to engage in this totally liberal, meaning free, examination of words—which never fail to fascinate me, especially when some minor explosion causes their energetic ricochet to draw my attention from a safe distance.
----------
*Extensively discussed in Russell Kirk’s The Conservative Mind on p. 38 and elsewhere.
Saturday, May 16, 2009
Battle of Clerics
I came across some notes of mine from May 2005 titled “A Battle of Clerics.” My new look then at the subject of intelligent design (ID) must have been occasioned by some uproar. Indeed, checking back, it appears that around that time began a case later known as Tammy Kitzmiller, et al. v. Dover Area School District, et al. in which the U.S. District Court held that teaching intelligent design in public school biology classes violated the Establishment Clause of the U.S. Constitution. Intelligent design was held not to be science; the court also saw it as hopelessly entangled in a creationist belief.
I actually agreed with the U.S. District Court up to a point. I think that ID is a philosophical position—as is Darwinism. Neither is a “hard science.” Biology must remain, by its very nature, a soft or a “descriptive” science. The biological realm staunchly resists experimental approaches able to demonstrate how species may have been formed. The whole area is saturated with assumptions, hypotheses, theories, assertions, denials, and the rest—not one of which is capable of demonstration or falsification. A person may be persuaded of this or that, but the proofs are not compelling. We’re in the realm of natural philosophy here. Biology is no more a science than are cosmology, history, psychology, anthropology, or sociology. Science strictly construed is a term applicable, alas, only to physics and to chemistry. In the biological realm anatomy and medicine qualify up to a point, but even in medicine there are spooky areas that don’t fit the narrow gauge of experimental science (psychiatry, placebo effect, etc.).
What we see in the case of Kitzmiller v. Dover is indeed a battle between clerics, thus between dogmatists. The matter cannot be resolved on merits. This is the sort of problem that E.F. Schumacher (A Guide for the Perplexed) labeled a “divergent” problem: the more you debate it, the greater the heat and the confusion. (“Convergent” problems, by contrast, produce solutions.) The matter therefore must be lifted to a higher level where another philosophical system, using its own definitions of value and its own transmission of holy writ, will decide who is right. Not in reality, to be sure, but in the social context that prevails.
The problem with ID, I think, is three-fold. First, it expends a lot of effort on showing something that is rather obvious to any open-minded observer. It is the teleological pattern that entirely soaks the biological realm. Purpose is present. You don’t need intricate researches to make it plain. I reached that conclusion very rapidly after studying biology for about three months in my adult years. I saw “technology” all over the place. I loved looking at the details of cells. It seemed to me self-evident—and if there was anything new there, it was that “As in the macrocosm, so in the microcosm.” Purpose is present.
Second, ID positions itself as science. This is in part visible in William A. Dembski’s work; Dembski relies heavily on mathematical approaches which, to me, are a clear indication of the clerical character of this battle in biology: modern science believes it offers irrefutable evidence when it offers it in the form of equations. And the public, which gets a real scare when it sees Greek letters and square-root symbols, is properly awed. Mind you, I have read Dembski—a lot of Dembski. Carefully. It’s not the work itself. It is its positioning and rhetorical uses that are the problem. The nature of biological evidence does not change just because we approach it with a transcendentalist assumption. What you see in biology is mechanism. That which is the real puzzle we can't access at all by any methodology that uses chemistry, dissection, equations, or fossils. The facts, as they appear to me, are that the life sciences must be approached by philosophical, transcendental means. They are inaccessible to science understood as a genuine explanatory methodology. Biology cannot be explained that way. Give me an equation for purpose…
Third, ID proponents take an adversarial position to conventional, neo-Darwinist approaches. This flows from the second error, the assumption that biology is a hard science where—if only the workers in the field were to use the right method—new answers would be possible. I don’t think so. At the same time, however, there is ample room for debating naturalism as an ideology. But that’s an entirely different enterprise. One doesn’t need the bacterial flagellum for that. Two thousand years’ worth of philosophical tooling is on hand and ready to be used in the usual way.
When clerics battle, don’t look for charm or inspiration. It’s the Pied Piper who attracts the children to follow him out of the city. The poets and the visionaries might lead us if inspiration strikes them. The clerics will just keep shouting—and nobody the wiser.
Labels:
Aquinas,
Aristotle,
Dembski,
Evolution,
Intelligent Design,
Schumacher
Friday, May 15, 2009
Nomology?
It must be the new hot word among intellectuals. It means theory or science of laws, from the Greek nomos meaning law or convention. I came across it the other day when trying to understand what supervenience means and consulted the online version of the Stanford Encyclopedia of Philosophy. Thus followed a search-for-a-word within a search-for-a-word—because I’d never consciously seen that word before. The use of the term, as I saw it in context, was insider talk among philosophers, people who routinely use a word like iff [sic] meaning “if and only if” and fully expect their readers instantly to get it. Now that one I got—but only because I have computers in my background and have been known to curl up, in pleasant anticipation, to enjoy books like Bjarne Stroustrup’s The Design and Evolution of C++. But as for nomology, which I took in stride once I understood it, I did not expect to encounter it ever again unless it was in one of those “if you don’t understand it you don’t belong here” contexts of academic discourse. Well! Three days later here’s the same word in a political magazine we rather enjoy around here, The American Conservative.
This immediately brought to mind a brief exchange we had with Michelle on the French word, la marotte, meaning “the fool’s scepter,” “fad,” “idée fixe,” “hobby.” She’d never heard the word and at first refused to believe that it was French. At last her dictionary cowed her. Then she said: “I swear I’ll probably encounter it four or five times in the newspapers over the next couple of weeks. That happens all the time, doesn’t it?”
Yes, it does. At least it happens to all the rest of us in Ghulfdom, and I expect that it’s a universal experience the root of which is probably banal, but I prefer to think of it as “significant” somehow. That’s how you separate the inwardly living from the dead. The former are romantics. I expect to see nomology next in the New York Times. And six months out Bill Safire, in the New York Times Magazine, will give us the lowdown. These words start going around when they come around. Those who use them use them as badges of distinction, however dubious.
Labels:
Language,
Philosophy,
Safire
Things, Things, Things
I’ve always admired people able ruthlessly to rid themselves of trash. But I always have to curb that thought when once I have it. Another rapidly follows: “Why do they buy so much? That end-table there: it’s as good as new. That huge, ugly plastic toy: it looks like no child ever touched it.”
A family joke when going shopping: “Let’s be patriotic.”
Many years ago (at least thirty) I read a sci-fi short in which consumption is mandatory and under-shopping subject to incarceration. I wish I’d noted the author’s name. I’ve cited that story a hundred times in conversation.
At the moment I’m guilty of rescuing a pristine edition of a World Book Encyclopedia from someone’s trash. The story is told here. Do I need it? Good heavens no! Something’s out of joint: in me and also those who bought and tossed those never-opened thirty volumes.
We’re planning a big move after twenty years in the same house. Just as I begin the first shocked inspection of our clogged spaces here, I haul back thirty more volumes when I ought to stage about a thousand or more for disposition. And that’s just books. Madness.
Now ours is not by any measure a high-consumption household. We’re both children of World War II and of its half-starved aftermath in Europe. Yet things accrue. Indeed they accrue because, though useless now, they might come in handy someday. Tough times ahead. I’ll have to become the people I admire. Woe is me.
Things, things, things.
Wednesday, May 13, 2009
Time Horizons
This may be cheerless reading—hence you may want to skip it. My subject is time horizons, but in a special sense. What’s usually meant is that some people customarily look far ahead in time whereas, at the other extreme, some people live day-to-day. Those who have long time horizons are usually more successful; they plan ahead and also draw their experience from the distant as well as the immediate past.
The sense in which I want to use this phrase today is to signify ample time to concentrate on necessary tasks (long or adequate horizon) and short bits of time bounded by interruptions that come at frequent but random intervals. To achieve longer-term goals, we must engage in a flow of effort we can sustain. Arbitrary interruptions chop this time into segments that are too small. If they cannot be avoided, they produce short, inadequate horizons.
My thesis is that people who’ve passed 70 will find that their time horizons, in my sense, are shortening, and this because medical conditions increasingly interrupt ordinary life or limit our use of bodies and of minds. To be sure, fewer years are left to live as well, but that’s not what I mean. In the usual sense I’ve always had a very long time horizon; indeed I still do. I’m not talking about the approaching end but something much more practical. Arbitrary and unwelcome events that must be minded chop time apart and suck up psychic energy. One consequence is that initiative weakens. You hesitate to start something anticipating another shock. You become gun-shy. After what we call a “medical week” around here, especially two in a row, it’s difficult even to remember the sustained effort we suspended while playing “patient”—much less picking it up again—if, as often happens, ten days out yet another anxiety-producing “procedure” looms ahead. Ten days? Seems like a long time. But it has both a backward and a forward shadow of at least three days in each direction. The tendency in cases like this is to live more in the moment, to enjoy the momentary freedom from “all that.” Time and time again I hear myself saying: “Well, at least I have about three days.” Rarely am I tempted to use those three days in ways that may require effort and may produce stress—even though the goal is of my own choosing. I use the days to build up energy for getting past “the next one,” whatever it is, with the stiff upper lip and a sort of chipper humor trying to convince myself and others that I can handle it. I’m all right, Jack!
Pondering this sourly during a walk on Monday—sourly because two medical events, both very unpleasant, one for B., one for me, were scheduled for Tuesday and Wednesday—it occurred to me that poor people, living on the edge of economic disaster, have short time horizons in both senses because they too are subject to harsh interruptions that cannot be anticipated or dealt with summarily: failing brakes, a sick child, a breakdown in plumbing, a toothache, the arrest of a teenage child, a bus strike, a hike in rent—small things and large are magnified by poverty. The human being copes as best he or she is able. The slow breakdown of bodies or insufficient economic nourishment reaching a family both have the same effect. The highest arts of medicine in one case, or of government in the other, fail to do much more than mitigate. Then if you combine the two, aging and poverty, “vale of tears” is not a bad description. Mine, fortunately, is just a complaint. I can’t get things done as once I used to. Botheration. And another procedure is ahead. “In about two weeks,” the doctor said. “We’ll arrange it. We’ll call you.”
Sunday, May 10, 2009
Mind-Sphere
I produced a short verse the other day concerning the noosphere and, based on the reaction to it, I feel a need to say a little more. Here’s the attempt.
The last century produced a veritable rash of bottom-up theories that I studied avidly as a young man but, as time passed, I came gradually to regard them as strangely juvenile—the more so as my juvenile years passed. Since the 1950s, when I encountered these theories, the world has also changed, of course, and not visibly for the better; hence my perception of that change had something to do with my growing doubts about modernity. In this period I also studied cyclic theories of history; that choice was justified; the broad development of events has generally confirmed what I had learned from the historians. At the same time, thus still in the 1950s, I had at least minimum exposure to ancient and medieval philosophical traditions in college. The exposure proved helpful. As my dissatisfactions with modernism grew, I came to value the older thought more and more. You might say that mine’s an average pattern: the rash enthusiasm of youth leads to sober acceptance of wisdom as experience hammers away.
The theories that I’ve found wanting all have their roots in the idea of progress, thus in the notion that change for the better takes place spontaneously over time. It is tempting to classify them as descendants of evolutionary theory, but their origins predate Darwin and point back to the Enlightenment—indeed the post-Renaissance period. The theory of evolution, certainly in modern form, is itself a product of the same impulse—an impulse that burst on the scene in a politically visible form as the French Revolution, an event that capped the eighteenth century.
The best explanation I’ve ever found for vast collective changes of this kind—e.g., the appearance of the idea of progress—I found in the cyclic theories of history. And the reason why such explanations are plausible to me is because they’re grounded in observations of human behavior—thus in conscious agencies—rather than in nature or in matter.
One example of such a theory is the emergence of the noosphere, first proposed by Vladimir Vernadsky, a Russian mineralogist. The picture we’re supposed to form is that of successive spheres, one above the other; thus we have the geosphere, the hydrosphere, the atmosphere. The biosphere developed along the way and came to be inserted, as it were, on top of the rock and into the water. The final sphere is that of mind. Pierre Teilhard de Chardin developed this idea further. He was a paleoanthropologist and simultaneously a Jesuit priest; he wrote about evolution; the Church frowned on his ideas and impeded their publication. What more do you need in the way of a perfect profile for a modern culture hero?
De Chardin’s writings are obscure enough—especially when he diverges from biology—so that you can understand them in more than one way. One is that biological evolution is producing a sphere of mind that, in due time, “at the omega point,” will create a collective consciousness that will be God in the world. Another possible reading is that evolution is producing a collective mind striving toward Omega, that last word a new name for the divinity. In any case we have here a bottom-up theory, the usually advanced modern thing, producing God from matter in a wooly sort of sense—or is it God leaning over as matter rises up? Je ne sais pas.
Close relatives of the noosphere are Jung’s Collective Unconscious, Bergson’s Élan Vital, the Gaia hypothesis, Whitehead’s Process Cosmology—and there are others, to be sure. In popular culture these same ideas enliven or underpin many episodes of Star Trek, to name just one modern series. Behind these newer forms are older versions of the same, including Hegel’s evolving God, difficult to distinguish from the state, and the closely related dialectical materialism of Marx, which returns us to the workers’ paradise. For me the ultimate problem with all of these downside-up theories always turned out to be my inability to detect an agency responsible for such energetic teleological upward striving. I note parenthetically here that the big question isn’t whether or not God exists but whether agency exists. Once we determine agency at any level, God becomes necessary.
Now concerning what I thought was a tongue-in-cheek little verse suggesting that de Chardin had prophetically foreseen the Internet in general and the blogosphere (another sphere?!!) in particular—and that he saw this as the sphere of mind. Well. Yes. This is the hammer of experience beating down. We do have a noosphere now and, with the right instruments, we can even detect it filling the electromagnetic spectrum with invisible waves filled with meaning. But what is that meaning like? It’s just the same-old, same-old. Sheer mass, however dense, does not appear magically to transmute the ho-hum into something higher. That this was possible was very clearly an idea de Chardin actually held. He thought that huge masses of matter, piled one on top of another, produce what physics calls a singularity (e.g., a black hole); that the extremely tiny suddenly exhibits unimaginable properties, such as particle-wave duality; and, finally, that complexity, if raised to the nth degree, suddenly bursts into mind. Watching the noosphere on CNN I no longer wonder; the tempting melody sounds flat. Or am I just getting old?
Brigitte's Zen Mistress reaction to hearing this post was: "Did you notice that noosphere and blogosphere both have two Os?"
Labels:
De Chardin,
Jung,
Noosphere,
Whitehead
Friday, May 8, 2009
One Size is Fine
One-size-fits-all big-government takeover of health care. —John McCain and others.

In what way, pray tell, does shoe-size really relate to health-care plans? Suppose we had a national health plan that provided medical services for actual ailments, preventive care, catastrophic care, psychiatric, dental, and optical care, hospitalization, sports as well as accidental injuries, pharmaceuticals, other medications, physical therapy, pregnancy, long-term care, acute and chronic conditions—in short for any and all matters that normally fit under the category of health care? I wave my hand. Here it is. Here is the program. It covers everybody in the same way: embryos to the oldest still alive, with or without preexisting conditions, whether acquired carelessly or innocently. Get it? Everything. Now what aspect of this program—provided something wonderful like that would be available—wouldn’t fit someone? That size should fit one and all. And if, by some innocent omission, I missed something in the above, the person who has it can go with my blessing to Mexico, Mali, Madras to get it taken care of there. I’ll chip in to help buy the ticket.
Controlling an irrational seizure, I write myself a personal note and underline it six times in heavy marker pen: Thou shalt not watch congressional hearings on health care reform. It may be injurious to your health.
Labels:
Health Care
Wednesday, May 6, 2009
Shades of Mordor
We were talking long-distance with Michelle in Paris. These conversations go on for an hour or so. Along the way we talked about her problems in commenting on blogs. Brigitte then let slip the fact that I operate three blogs, not just one, and that one of those has a French name, La Marotte. Michelle’s reaction to that name deserves a special posting by itself, but today I’ll stick with another. In the course of this conversation I felt obliged to explain the reasons for these blogs, and I did so, pointing to the main thematics of the other two. But Ghulf Genes, I said, is kind of different because it was the first and no restrictions of subject matter apply. Michelle said: “Oh, I see. One blog to rule them all.” This produced a good deal of laughter, and I thought I’d share her spontaneous reaction. Her mind works like mine. Pictures flash up...
And, of course, there is the fact that, still in her teens, Michelle persuaded me to read Lord of the Rings, which I resisted for the longest time—only to become a devoted follower of Tolkien’s myths. Our daughters have been active in introducing us to the new and splendid in music and in literature. Monique introduced me to Gabriel García Márquez (One Hundred Years of Solitude) and to Richard Neuhaus (The Naked Public Square). Michelle is now working on me (a decade and counting?) to read the Harry Potter series, or at least to accept J.K. Rowling. I understand that The Tales of Beedle the Bard will be making it across the Atlantic for me to read, and I agreed to do so. Is that the sound of someone undermining the massive stone walls of my castle?
In the meantime I’ve been enriched by the thought that three blogs are like three magic rings. And there is one that rules them all. How many more blogs must I start to reach fullness of measure? Let me grab a slip of paper and calculate this out. Let’s see, 9 - 3 = 6. Well. Miles to go before I sleep.
Labels:
Fiction,
Ghulf Genes,
Márquez,
Neuhaus,
Tolkien
Tuesday, May 5, 2009
Day of the Midwife
Here in America we are aware today that it is the Cinco de Mayo south of the border. Within Ghulfdom we are aware that the fifth of May is International Day of the Midwife. One of us is a member of the profession, our youngest, Michelle. Here she is photographed at the airport last year—and the passport in her hand announces her international credentials. Odd that members of my immediate family were all born in Europe—except Michelle; she saw the light of day in Kansas City at St. Luke’s, just north of the lovely Plaza. But we’re a contrarian clan. The only member of our family who is a permanent resident of Europe is also Michelle. Paris is the richer for her. It was over there, across the Atlantic, that she had her own four children—Max, Stella, Henry and Malcolm; the last two are twins and their names are therefore listed in alphabetical order. Thierry, the father, is an actor, a flutist, and on the side rather a good chef; and Michelle works at Bluets, in Paris, one of the renowned maternity hospitals that continue the tradition of the famed Dr. Fernand Lamaze. Michelle began her formal studies after she had first brought four babies into the world. Thus she experienced before she studied deeply. As it should be. It took four grueling years while she raised a young brood. Here she was, the sole American, in the exacting French medical education system, and she graduated second in a class of more than forty. When word of that finally came, there was wild dancing in America around the telephone.
This day of commemoration is sponsored by the International Confederation of Midwives, an organization formed in 1919 and housed today in The Hague, in the Netherlands. Some 80 countries are members. The website is here. Knowing something about this subject through Michelle, and following the gradual evolution of this field, we are heart and soul behind the ICM in its efforts to promote this little-noted profession—no doubt much older than any. The French call their midwives sages-femmes, which always gives me a kind of shiver and proud joy. So fitting, it seems to me. As a 16th-century midwifery text has it, “A midwife has a lady’s hand, a hawk’s eye, a lion’s heart.” Michelle would say: “Easier said than done, Dad!” Yes, that’s our Michelle. And yes, the words fit her. Happy Day of the Midwife to you!
Labels:
Midwifery
Monday, May 4, 2009
A Well-Worn Phrase
I used one yesterday to round out the rhythm of a sentence: “Be in the world but not of it.” The phrase has the smooth, comfortable feel of a very well-worn tool obtained from one of the two main sources for such things in English: the Bible or Shakespeare. It comes from neither. Its logical structure is ambiguous at best, but the meaning is accessible. It illustrates wonderfully how far we really are from mere collocations of atoms. The phrase suggests that we may have a choice: in the world or out. We don’t. And if we opt to be in, it suggests that we may in some sense be of it, in another sense not. But what do these options mean? I’d hate to be the one assigned to teach a computer, destined to serve in Artificial Intelligence, exactly how to handle this string of words, how to parse it, how to apply it in actual life situations. People have no such problems. What a piece of work is man…
Labels:
Language,
Meaning,
Shakespeare
Sunday, May 3, 2009
Veni, Vidi, Mori
The other day a huge debate ensued in this household when, thanks to something on TV or in the New York Times, I protested that the speaker or the author had translated Caesar’s Veni, vidi, vici as “I came, I saw, I conquered.” — “No way!” I cried. “I went, I saw, I conquered.” Brigitte begged to differ. And we were off. The truth is that two things temporarily disabled me. One was that Caesar had gone to Gaul, not come to the territory. The other was my very faulty memory according to which the French venir meant go rather than come. Off I stormed to marshal witnesses in print. Unfortunately I have two places of work, one upstairs and one in the basement, both chock-full of reference volumes that, as life would have it, often change places as I drag them back and forth. Thus I had the devil of a time discovering even one of my three Latin dictionaries. One is Latin-German, another Latin-French. Four trips up and down. Cursing in Hungarian—a language whose powers of sexual sacrilege are second to none. At last I found the Latin-English, alas also the smallest, but it had the word. I looked it up. Thus, in a way, I went, I saw—and I died. Arriving in the kitchen, I muttered in tones barely audible: “Well, looks like I was off the beam on that one.” Brigitte gracefully continued chopping carrots and never even gave me that look. But this episode once more proves that in any competition with her on the spelling or meaning of words, withdrawal is the best defense.
Labels:
Language
The Third Stance
One of the intriguing patterns Arnold Toynbee thought he had discovered in the late stages of a civilization was that its elite elements polarize into archaists and futurists, thus into those who seek solutions in the past and those who seek them in a radical break with the past. But this sort of view was not particularly original even in the mid-1930s, when the first volume of Toynbee’s A Study of History appeared: the left and right were clearly visible and painted in appropriate colors already. Toynbee’s discernment showed itself in his suggestion that neither of these orientations had much of a future. Futurists and archaists would engage in a more or less meaningless battle until disorder finally triumphed. But Toynbee also saw emerging at late stages of mature civilization a third element: neither of the left nor of the right—neither of the West nor of the East, to echo the Koran’s Niche of Lights (24:35)—but an elite that looked up, as it were, detached itself and turned its sight toward the transcendental, rose above the battle while remaining in the midst of life, in but not of the world, and, from this curiously immaterial yet fixed point, began the process from which a new culture can eventually arise. Toynbee called this third position “transfiguration” and “etherialization.” He saw it as a detachment from the macrocosm and a return to the microcosm—put very simply: “Stop fixing the world and start improving you.”
This was not merely an abstract codification of historical fact. Toynbee wasn’t merely looking back, discovering Christianity in the rotting womb of a decadent Rome, and rendering this picture as a general historical rule. But he did see, beginning around the turn from B.C. to A.D., the rise of multiple original religious impulses. These included Gnosticism (before it became Christianized), the revived cults of Isis and Cybele, and, most notably, Mithraism. Indeed those times were very similar to ours beginning around the early 1960s. All manner of old religions were invading the Roman realm in new guises, including Eleusinian, Dionysian, Orphic, and neo-Platonic cults and groups, and—among them—a new Jewish religion. One element—that third element—of the society supported them all. Christianity eventually emerged as the victor and, in the process, also absorbed features of the others. Rome became Christendom, a new culture.
This is a divergence from my theme but interesting in the context. Toynbee also noted the behavior of two proletariats, as he called them: the masses. The internal proletariat, thus within the limits of the civilization, tends in the late stage to become alienated from the dominant (read governing) minority—the turn-off. And the external proletariat (read third world) becomes antagonistic to the Imperium and, rather than striving to be included in it, begins to fight it.
I’ve often wondered what it might really have been like in old Rome—knowing full well that big abstractions almost completely lose the meaning that they try to synthesize. What does "archaist" actually mean in daily life? The answer, I find, is three-fold: study history, look around, and look inside. At my birth in 1936 the zillion cults of this latter day, not least weird religions from abroad, were not even faintly visible either in Hungary or in the United States. Hare, hare, hare! There was neither culture war at home nor yet its equivalent that we call the War on Terror now. And my own personal history has been a gradual detachment from the macrocosm, if you like. With the passing of the decades, I’ve come to see more and more clearly that the big battles aren’t worth fighting. You win today, but in the see-saw the other side will triumph tomorrow and think that a new epoch has begun. And so it shall be until a new dispensation eventually gives meaning to the whole again. That will take place when we are once more properly oriented to the greater reality.
Labels:
Civilization,
Cyclic History,
Toynbee
Friday, May 1, 2009
Welcome, Brigitte
I celebrate the fact that on this May the First I can welcome as a new follower no less a person than Brigitte, my life’s companion. To be sure, doing all the dirty work of giving her a blogger profile fell to me, but then that’s only normal when it comes to computers and such. She wanted to comment on Patioboat and couldn’t, and thus we now have, in this community, one of the most avid readers of blogs ever. She was doing that back when I couldn’t even spell the word and, indeed, professed a certain contempt for the activity. It’s not difficult for me to find occasions for penitence. But never mind that. Welcome, Brigitte! May your comments proliferate throughout the blogosphere...!
Of Brains, Cars, Romans, and Phoenicians
Today’s headline in the Detroit Free Press was A NEW LEASE ON LIFE—a curious way of marking the fact that Chrysler has fallen into bankruptcy.
Almost instantly my brain presented an association that immediately made sense, but not quite in a linear and therefore obvious way.
I remembered a rather impressive scene that Gibbon described—but it might have been Mommsen. It’s the account of an event during the second Punic War when Hannibal, in an attempt to relieve pressure on his forces elsewhere, moved on the city of Rome itself. According to this account, the Romans arrayed a large force in front of the city in splendid formation, standards lifted high, sunlight reflecting from shining armor, shields, etc. The Romans also engaged in what we’d call psychological warfare today. They managed to leak word to Hannibal’s forces that a large piece of land the Carthaginian army was then occupying had just changed hands in Rome, and at a very decent price… Stiff upper lip! In your face! We shall overcome!
To be sure, Rome was a formidable obstacle Hannibal couldn’t have taken in any case except, perhaps, at catastrophic cost. So far as I know, the only people to have overrun Rome up to that time were some early Celts (take note, John); but even they failed to take the last fortified tower. And being barbarians, they didn’t know what to do with the city and moved out again. Hannibal, in any case, was merely engaging in scare tactics, trying to move public opinion, hoping, perhaps, to produce hysteria he could exploit. The stand-off continued until Hannibal finally marched off to the south, “no shot having been fired,” as it were.
Why this image as a mental echo of the day’s headline? Because deep down I reacted to the headline with a feeling: I wanted a more dignified, a more elevated, and a less servile and crawling sort of response to the grim news of fate that had reached Motown yesterday. And my brain immediately and obligingly fetched from my memory the nearest available and most vivid example of a dignified public response to a major threat: Rome marshalling its spirits in response to Hannibal who, by that time, had savaged Spain, marched north through Gaul, crossed the Alps, beaten Rome in two stiff battles (as I recall), and now stood before the gates. The real state of mind inside the walls of Rome was actually fear and trembling—but those in charge rallied and made a great show. And that’s how I really wanted Detroit to take the news. With dignity. But that response was still a coil of feeling inside me. It hadn’t unpacked its meaning yet when the picture was already there.
Chrysler almost went bankrupt thirty years ago. The government and the leadership of Lee Iacocca (a Ford guy) saved the company then. Next a German automaker (Daimler) made a stab at it. Then venture capitalists thought they’d pretend to love the cars in order to grasp Chrysler’s financial organ. Now it is the government again, and an Italian firm (Fiat), who are the saviors. Look. This is just a corporate entity. Let’s help Chrysler’s labor element find new ways of making a living. But enough of the danse macabre. A new lease on life? A better iron lung? — As you can see, my brain obligingly serves up images as my inner vision surveys this scene of mayhem.
* * *
A note or two for the younger reader. Edward Gibbon wrote The History of the Decline and Fall of the Roman Empire. Theodor Mommsen was a German scholar, Nobel Prize winner, and one of the best-known historians of Rome. He wrote History of Rome, which covers the period from the origins of the city to the time of Caesar in three volumes. The fourth, about the imperial period, never got done. — The wars of Rome with Carthage were called the Punic wars because the Carthaginians were supposedly Phoenicians, and the Romans referred to them as Punici; hence Punic. The Phoenicians were traders originating in what is now Lebanon. They colonized an area across the Mediterranean in what is now Tunisia. That’s where Carthage was. But the name Phoenician had followed them to Africa. Hannibal’s stand before Rome took place in 211 B.C.: a long time indeed for that one image to have lasted, now to inform a modern man about the loss of a relatively minor battle in the War of Cars.