In college I did rather poorly in required science courses, owing perhaps to the disconnect I felt between an abstract and an experiential level of knowledge. In these courses the two were separated and acquired side by side—if at all. We got the conceptual material in lectures, the experience in lab work. Try as I might, I couldn’t connect the two, especially not in chemistry.
In the sciences generally, as taught at universities (at least in my day—perhaps the methods have improved), the actual experience of discovery is too much compressed. The messy lab experiments did not, for me, actually illustrate the elegant formulae of chemical transformation that I’d jotted down in notes. The lab time was too short; the stuff I dribbled from jars into tubes was alien and messy; things never seemed to work out; the smells, the fumes…. Those who had originally discovered all the formulae had spent months and years in what must have been a fascinating process. We were supposed to absorb all that in two-hour stints—without training or time enough to hone the many skills involved. It didn’t work out for me.
Passing grades could be achieved by memorizing formulae and abstract tokens in chemistry and Latin names for species and body parts in biology. Rubber stamp on forehead. Passed required course. Keep the line moving.
Later on I really did learn a good deal about chemistry by working in industry. There I saw problems in a brand new and active context. I had to understand what really transpired and why. I got my knowledge by interaction with lots of people and by spending time looking at things in refineries and chemical plants. The formulae became a language by which to talk about the bigger, the real thing. In college the lab was an interlude of chaos. In working life it was the center of events, the real thing, Kant’s Ding an sich—even if not in the way Kant meant it. Behind the abstract thought now loomed this vast, real, imposing and concrete reality. The thought was thin, the thing was thick.
Yet later—decades later—I learned a lot about biology as well. Surprisingly my greatest help in comprehending that mysterious world came from years of experience I’d had of human institutions and technology. That gave me what amounted to an almost reflexive recognition of the Hermetic slogan: As above, so below. Human experience in its complexity mirrors the arrangements of organic nature at another level, enlarges it to ordinary view. The parallel reaches way down, down to the behavior, characteristics, and arrangements of sub-cellular organelles and their constituents in turn.
I came to believe, as a consequence of this experience, that education should be inverted. First, after high school, let us say, should come ten to twenty-five years of work in the world’s economy. Then college! Only a fraction of the effort would be required, and it would yield ten times the benefit that college would have offered the same person at eighteen or nineteen. Education could thus be seen as completing the human being, not as preparing him or her for work. Real preparation could still be provided by apprenticeship programs under patient masters.
Now, to be sure, I had more native inclination in my youth to understand literature, history, and philosophy—the humanities generally—than the sciences. But the same rules, it seems to me, also apply to these fields. Experience is the real teacher. Before we learn the names we ought to have experience of das Ding an sich. If high schools were properly organized—and here and there they are—they would suffice for minimal preparation, not least in ethics and philosophy. At higher levels these subjects really require a rich experience of life, for only after such experience do the problems they deal with fully surface and, therefore, only then do the raw materials for understanding become available.
My own experience is that the really educated are self-taught—yes, even if, nominally, they already have advanced degrees when this real learning begins. It is thought based on experience that educates. By thought I don’t mean simple musing—not merely the storing of knowledge into cubicles provided by a classification system ready-made by public opinion. I mean genuine, ordered reflection.
Here, perhaps, it may be well to reflect for a moment on that word itself—reflection. It rests on abstractions, concepts, immaterial memories, phantasms, but it carries the richer connotation of the inner life that I’m seeking to portray here. Abstraction is a technical sort of word—not something we experience. What we experience is reflection. And it is a rich sort of thing, not the dry, shrunken head of something that once lived—rather something still vital and alive but on another level…
Plato has Socrates say that “the life which is unexamined is not worth living” (in Apology, the text is here). The statement is odd. The person who fails to examine his life wouldn’t know, would he, having never looked; thus the opinion is that of a third party. The phrasing is poor, but the meaning is clear enough. It signals that experience, as such, however valuable and indispensable it is to genuine knowledge, does not really reach the authentically human level until we view it separated from itself, thus in reflection. The experience, needless to say, may be mental, emotional, or spiritual. It doesn’t have to be sensory. But it should have been perceived.
At the same time, reflection without experience, which seems so common in many philosophical contexts, is next to useless. It reminds me of a ping-pong game without a ball. Such a game would feature many wild gestures at two ends of a table, many loud outcries no doubt, many loud arguments about who just scored a point, with denials and passionate affirmations—but really nothing there to prove the facts one way or the other.
Wednesday, April 29, 2009
Ether
In something like 500 years we’ve replaced a theological with a mathematical scholasticism. The old one could be understood by a reasonably educated and diligent person; the new one requires specialized skills few have the patience to master. Furthermore, it has become difficult to check the new theology independently because the experiments that gave rise to the math are difficult (or very expensive) to reconstruct.
I often think that if I had access to the physical facts, I’d reach conclusions quite at variance with those embraced by science. Another thought is that physics posits tangible facts not because they really exist but because the equations come out a certain way. I’ve yet to read a popular book on physics in which the words persuade me. Elementary particles are supposed to be simultaneously waves and objects. A wave requires a medium like water. The water goes up and down. The cause of this motion is an invisible force. On a beach the force drives the water against the land. The damage, if any, or the rearrangement of the sand at least, is directly caused by the medium, not by the force. When a photon arrives at a screen, something tangibly lands.
My Dictionary of Physics (Penguin, 1977, p. 518) defines the wave much as I’ve defined it above, namely as the disturbance of a medium. For elementary particles, the medium itself is “space” generally and its magnetic or electrical properties particularly, the latter registering the disturbance caused by what I call a “force” and my dictionary simply labels as a “quantity.”
Based on this definition, “space” is filled with “properties” of an electromagnetic character. Now it strikes me as peculiar to dismiss the old idea of the “ether” as nonexistent and yet to assert that “space” is filled with a “property” which has inertial behavior and is subject to ever-so-faint disturbances by light. The mysterious “field” of modern physics thus turns out to be the ether after all. The last shall be the first, the first last. The Michelson-Morley team (the pair that tested for the ether in 1887 and found it not) may someday be found right on whereas, in the not too distant future, Einstein’s relativity may be honorably retired. That sort of thing wouldn’t surprise me.
Tuesday, April 28, 2009
Slips on the Desk
My desk accumulates stacks of paper that I tend to sort roughly by size, from tiny stick-it notes, through small slips torn from pads (usually filled with numbers), on up to regular 8.5x11s with stuff printed from the web. I came across one of these, one ripped from a steno-pad of the sort Brigitte uses for her version of a diary; we have stacks of pads all filled up, ready, and waiting for historians—and brand new ones still tightly held in shrink-wrap straight from Staples waiting for her pen. My torn sheet bears, along with numbers, such annotations as “Oh, if only…” and “1848-1931,” no doubt the birth and death years of someone notable, and additions like 10 + 10 + 7 = 27, indicating that I must have been really tired; finally, at the top, the words “La Nuestra Señora.”
I remember jotting that phrase down in delight and wonderment, having come across it somehow, somewhere. I also remembered bookmarking a page from the Internet where I went later to confirm the name and to learn more. That page is here. It reveals that the name of the settlement that mushroomed (pancaked?) into Los Angeles was originally La Nuestra Señora la Reina de los Angeles de Porciúncula—Our Lady the Queen of the Angels of the Little Portion—that last phrase coming from the Italian for a “very small parcel of land” (porziuncola). The very long name for the settlement came to be shortened to El Pueblo de la Reina de Los Angeles, later to Los Angeles, finally to L.A.
Back when I made the note a wonderment came over me—names and lengths, very small things, very small places, and grandiose hopes. At my birth I was christened Szentmártoni Darnay Arzén Farkas Gyula Mária. Some thirty years after that, by then in Kansas City, we knew a family where the lady of the house, called Dorothy, was a tall, thin, strange woman who had a memorable laugh and a lively way of speaking in machine-gun rhythm without ever stopping. Her mode of speech inclined her to shorten all names. Brigitte thus became Bridge. And I became Ars. Now whether she intended that word to end in an e or not, of that I can’t be sure. But she certainly got rid of a lot of complexity in the process.
The longer the name the smaller and younger its object. As the object’s size and complexity increase and become difficult even to encompass, the name shrinks in proportion until it has been eroded by endless use to a mere syllable or two.
Labels:
Language,
Los Angeles
Monday, April 27, 2009
Words
Polonius: What do you read, my lord?
Hamlet: Words, words, words.
Many years have passed now since my first encounter with the word “epiphenomenon.” I came across the term in studying modern theories of mind. To be sure I’d seen it earlier too, but I’d carelessly assumed that those who used the word, especially in reference to mind, meant “above the phenomenal,” thus separate from, indeed superior to, the latter. That meaning seemed legitimate enough because the Greek prefix may be taken as indicating upon, besides, among, on the outside, above, over, and anterior. But later I discovered that the word, as used in the scientific literature, seemed to be understood in quite a different way. Eventually this sent me off to a succession of dictionaries. And I discovered that epiphenomenon was actually used—by everybody, that is, except me—to indicate the very opposite of what I’d understood the word to signify. They meant that mind was a secondary phenomenon—not superior but, to the contrary, inferior, menial, a servant, a non-entity. Indeed, the word meant that mind was entirely the product of an underlying physical phenomenon which caused it. Deeper excursions confirmed this. With that confirmation I placed the word on my personal Index, as it were, and whenever I encountered it used even with vague approval, I would mentally make the sign of the cross, reach for my rabbit’s foot, cross my fingers, and mutter incantations.
Now the other day, perusing one of my favorite blogs, Siris, I came across this passage:
In Toronto I once brought a group of my fellow graduate students almost to tears of laughter by commenting offhand that I thought that anyone who talked much about supervenience ought to be drug out in the street and shot.
That word, supervenience, only served Brandon as a take-off point for an amusing discourse on the way Texans express their frustrations or their pride. And there was more of anthropological interest to those of us beyond the borders of the Lone Star State. After I read the posting, a strange feeling caused me to return to its beginning. I found the word again, stared at it hard, and had the sensation of once more being face-to-face with a small but malevolent something disguised, this time, in Roman garb. I went to Merriam-Webster to find confirmation, and indeed I did. Here was a close relative, at minimum, of my much-loathed epiphenomenon. The definition is “Coming or occurring as something additional, extraneous, or unexpected.” This definition seemed to me, however, to deserve horse-whipping at worst, little more, so I went on to see what the Wise, in this case the Stanford Encyclopedia of Philosophy (online), had to say on the subject. The first sentence of its article indeed made me regret that I do not own a gun:
A set of properties A supervenes upon another set B just in case no two things can differ with respect to A-properties without also differing with respect to their B-properties. In slogan form, “there cannot be an A-difference without a B-difference”.
This may be familiar ground for some (say two out of twenty million), but in a humble descendant of a man ennobled by an archduke in the current region of Baden-Württemberg circa 1475 for having been a court-jester of note (I kid not), this sort of thing sets up an urge to crack jokes. But I’ve deteriorated from that lofty status and actually practiced computer coding and such, the menial way of thinking mathematically; thus I plowed on and learned that the vernacular (dictionary) meaning is not to be accepted. I learned that supervenience is a technical term, that it is “proprietary,” meaning that it relates to the properties of things. I went beyond and eventually discovered the following paragraph and its nutritious morsel of a quote:
But regardless of how long the notion of supervenience has been around, or who first used the term ‘supervenience’ in its philosophical sense, it is indisputable that Donald Davidson played a key role in bringing the idea to center stage. He introduced the term ‘supervenience’ into contemporary philosophy of mind in the following passage:
Mental characteristics are in some sense dependent, or supervenient, on physical characteristics. Such supervenience might be taken to mean that there cannot be two events exactly alike in all physical respects but differing in some mental respects, or that an object cannot alter in some mental respects without altering in some physical respects (1970, 214).
At last I’d traced the family lineage of this word to my own favorite bête noire. In the process I’d also discovered its home of origin, analytical philosophy, that realm of sentence parsers who, for lack of any meanings to discuss, cavort with grammar instead. Oh, how wonderful it is to discover that those who don’t believe that minds exist except as chemical traces will use so many tricky, tricky words to say what the chemicals are really saying....
Sukie sits back down politely and…here is hoping that…you know, cause I don’t have a clue.
Oh words, words, words, I can never find the words, words, words...
I can never find the words.
[But when I do, Sukie, then I weep.]
Maria Friedman for The Witches of Eastwick
Labels:
Language,
Mind,
Philosophy
Saturday, April 25, 2009
Other Transitional Figures
Emerging as I did from a Jesuit education—which I value, incidentally—one of the amusing paradoxes I encountered was that some famous authors of my youth were labeled heretics by both sides—my teachers as well as the Pundocracy of Unbelief. I use that ponderous term in order to signal a difference between science and scientism. Practitioners of science—physics, chemistry, astronomy, biology—are too busy to be ideological. They tend to be confident—but only too aware of their own and of their craft’s limitations. To have a dogmatic spirit must be a special cross to bear. It is a temptation to exercise power rather than a yearning for truth. I’d rather suffer from the latter. But the point is that any system of belief, religious or atheistic, will have those who yield to the temptation and thus morph into irritating busybodies.
The authors I have in mind, those who didn’t fit the orthodoxies on offer, I call transitional figures, believing (maybe with an excess of optimism) that what comes around goes around, hence no curve ever just goes up. There will be seasons. And if rationalism (however much I value it) goes too far, it will be corrected by intuition. And vice versa.
I’ve mentioned two of these figures recently—Carl Jung and Teilhard de Chardin. The first began as a Freudian and then reintroduced the West to God in the form of the Collective Unconscious. The other began in Darwin’s neighborhood and produced an evolutionary theory which gradually forms a sphere of mind straining towards Omega. Both of these terms are nicely ambiguous. Both thinkers clung to naturalism when you get right down to it, but of an antsy sort. It’s a naturalism clawing away at veils and trying to break free. Messy, too. Over against it a stolid, Stoic atheism appears downright noble—but, alas, the Stoic must eventually confess that pop culture is its lineal descendant, and it ain’t pretty, much less noble. Slimy wiggle-things: are they new birth or corruption?
Other figures I think of as transitional are William James, Pierre Lecomte du Noüy, and David Bohm. They were a psychologist, a doctor, and a physicist. James influenced me greatly with his book entitled The Varieties of Religious Experience; du Noüy wrote on evolution. David Bohm strayed from physics enough in his late years to commit cosmology.
James was one of the first of those who almost touched the transcendental in his writings—and certainly did so in his actual life. He keenly felt the constraining limits of naturalistic science but was a modern man rooted in empirical soil and resistant to the older practice of “natural philosophy.”
Du Noüy’s book, Human Destiny, lay waiting for me a-mouldering on the shelves of the Grosse Pointe Library; written in 1947, it must have been enormously popular to make it to these shelves. Grosse Pointe is an auto-baron’s sleeping quarters (although I live, so to say, in the servant’s wing of it). The Library here is the worst I’ve ever had to call my own. By the time I was fully grown, du Noüy’s name had disappeared, but de Chardin, who had a similar take on evolution, was all the rage. De Chardin saw evolution in cosmic terms, du Noüy under a moral sway. He saw the triumph of life as “conscience” and pictured morality as the future path of evolution. Conscience will somehow lead to another leap forward in some remote future time. He offered very meager hopes for individuals in that he seemed to doubt that “souls” exist and thus continue beyond the grave. His focus of admiration was “human dignity.” He also embraced the piety that seems to have come down to us from the nineteenth century—held by scientifically inclined but well-meaning people (like my Mother)—that we “live on” through our deeds and contributions. Dulce et decorum est pro patria mori, etc. The man must have been a moral paragon to think that such a view could possibly inspire humanity to heroic self-denial; at bottom his vision offers no hope for the individual. But du Noüy was clearly not an intuitive, poetic type and therefore missed a whole dimension of his theme. He thought that the miracle—at least the last miracle—was the human brain—which tells me that he never pondered the paradoxes of consciousness or the mysteries of inner life. The book has a distinctly nineteenth century flavor. The New Age was still aborning…
David Bohm wrote the definitive textbook on quantum theory, under that very title. It is still used in teaching the subject—and a copy of it is on sale at my local Barnes & Noble. His claim to fame, however, is an alternative formulation of quantum theory; when applied it explains the same phenomena and gives the same physical results, but the underlying theory requires a new cosmological view. The theory is laid out in a book, written with B.J. Hiley, titled The Undivided Universe: An ontological interpretation of quantum theory. His take on intelligence—as the manifestation of what he describes as an “unconditioned order” in the cosmos—is developed in a much more accessible work (almost no equations), Wholeness and the Implicate Order; it is available in trade paperback from Amazon.
One is tempted to include in this listing Henri Bergson as well, but he does not fit my criteria. He was not a scientist by avocation or profession. His poetic élan vital was the product of a philosophical intuition, not a diversion from a narrow scientific path forced on him by intuition. And ultimately the famed élan is the assertion of an “it is what it is,” thus “life is life.”
There are probably other important figures I just failed to come across, notice, or remember. But these five individuals have been important in forming my conviction that we are in a transitional period. Thought always leads. What I see around me contradicts that which I feel, but then the present is always a drag on the future. And the Renaissance dreamers were also surrounded by the mediaeval inertia, and its corruptions, visible all around them.
Labels:
Evolution,
Philosophy
Friday, April 24, 2009
Ends, Means
It is almost a truism that “the end does not justify the means.” For that very reason, the rule or doctrine contained in that phrase invites a closer look. If it is true, why is it violated in so many small and, indeed, shocking ways? One of the most visible and outrageous examples is the sanctioning of torture by artful interpretation of Section 2340 of 18 U.S.C. The United States Code actually prohibits torture; it defines it as follows:
[A]n act committed by a person acting under the color of law specifically intended to inflict severe physical or mental pain or suffering (other than pain or suffering incidental to lawful sanctions) upon another person within his custody or physical control.
One is tempted to ask Jay Bybee: What part of “no” don’t you understand? Is it the “n” or is it the “o”? Bybee was an Assistant Attorney General of the Justice Department, the man who gave the CIA the green light to use waterboarding and several other nasty techniques. The artful way of escaping this dilemma, adopted by Mr. Bybee, was (1) to find that the CIA did not intend to inflict severe physical or mental pain or suffering; it had other, more lawful intentions; and (2) to parse the concepts of pain and of suffering so carefully, indeed artfully, that, miraculously, the actual effects of waterboarding (it makes you feel as if you’re drowning) don’t quite measure up to the definitions of 18 U.S.C.—or so at least Bybee argues. If you wish to read the memorandum yourself to check on my veracity, you can begin on this site. The item to download is labeled as “A 18-page memo, dated August 1, 2002, from Jay Bybee, Assistant Attorney General, OLC, to John A. Rizzo, General Counsel CIA.” Finally, a footnote to the definition above: the parenthetical phrase refers to such things as capital punishment.
Having laid this out, I’m not going to say too much more about this business except to point out that the Administration’s acrobatics, turning itself into a pretzel to get around the law, were performed for a “worthy cause.” That cause is the security of the United States and the prevention of another terrorist attack. That is the “end” for which “torture” is the means. And it’s that relationship I want to examine.
The more I pondered that relationship—and why it is misunderstood and misstated in Internet discussion forums and ignored in legal tightrope walking—the more puzzled I became—until the actual solution to this problem hit me. But before I go there, I want to point out what I stumbled upon on the way. One thing is that the phrase is understood by young people in a reverse formulation: “the end justifies the means.” I found a site where this very concept was being stoutly defended, although, to be sure, by what sounded like teenagers. In slightly more adult discussions, the emphasis of the argument is on the nature and grandeur of the end. The glorious end is then allowed to shed its radiance backward, haloing whatever kind of action is sincerely undertaken for the noble cause. It does not surprise me, therefore, that senior senators will actually use the same argument, not quite as bluntly of course, but in circumlocutions. For a sample of such a presentation, I would suggest you check out this interview on PBS’ NewsHour aired April 22, 2009 and available in transcript here.
Now for my insight. The reason it didn’t come to me at once is that I hold, as a matter of habit, an old-fashioned concept of truth. And the insight, really, is that many people must not. The fundamental issue that illuminates the end/means argument is that there is such a thing as absolute truth. Some things are simply wrong, and they are wrong in an absolute and not merely in a relative sense. In the modern usage, including the way senators and teenagers approach this subject, an act, a means, is neither right nor wrong, not on the surface, not out of the box. You have to ask what the circumstances were. If the means serves some noble end, its character undergoes a change. If it serves some low or indifferent end, then it may seem loathsome. This is moral relativism, and it seems to be widespread—not because the public is malevolent but merely because it’s ignorant.
The final point that I think must be made is that any kind of absolute truth also implies a hierarchical order in the entire universe, hence demands that there be a God. If the prevailing worldview is very fuzzy about such a being, the public's morality will eventually become quite blurred. The public will instinctively adopt a stand-in for the highest value, typically choosing the political collective. This view will be interestingly self-centered, so that waterboarding captured Muslims will be far more acceptable than waterboarding CEOs of too-big-to-fail financial institutions. But I’d better curb my lower instincts here and stop.
Labels:
Ethics,
Philosophy
Tuesday, April 21, 2009
Sensate Reflex
A while back I read again certain portions of Carl G. Jung's book entitled Synchronicity. The word is of Jung's own coinage and refers to the phenomenon we ordinarily call "meaningful coincidences" or, with a slightly more romantic toning, serendipity—that word, in turn, derived from a Persian fairy tale, The Three Princes of Serendip. Synchronicity, meanwhile, has taken pretty deep roots in the circles that concern themselves with borderline phenomena.
Jung's very choice of a neologism to label a relatively common, usually pleasant, but also mildly wonder-rousing experience that people have reflects what I call a kind of sensate bias, a word to which I'll return in a moment. The essence of the experience is its meaningful quality; in Jung's treatment of it, however, the emphasis shifts to the coincidence of something, presumably X and Y, in time. Meaning is introduced, but almost with a kind of timidity. Quite a few people, straining to reach once more a holistic stance toward reality, but wishing to be thought orthodox by Science capitalized, exhibit sensate bias. Jung illustrates this tendency in the present context rather neatly. Now to the substance.
The part I was rereading is the account of a very interesting and, indeed, classical near death experience (NDE) reported by one of his lady patients. She had the experience in the wake of giving birth under difficult circumstances. What struck me in reading the account this time—and reading beyond the case itself—was the curious bias that seems to have held back even a figure like Carl Jung from reaching the logical conclusions that the case that he reported clearly demanded.
The very essence of Jung's book is to show that synchronicity is evidence for meaning in the universe. In making room for meaning, Jung created a quaternity of principles supposedly underlying the cosmic design. He arranged these, initially, in the form of a cross as follows:
                         Space
Causality                                   Synchronicity
                         Time
Later, in collaboration with the physicist Wolfgang Pauli, he modified this scheme further to yield the following:
                         Indestructible Energy
Constant Connection                          Inconstant Connection
Through Effect                               Through Contingence, Equivalence,
(Causality)                                  or "Meaning" (Synchronicity)
                         Space/Time Continuum
Note that the word meaning here receives a set of quote marks around it, almost as if to signal that we must not take the word too literally. Jung is striving for a scientific flavor here even though what he is saying transcends the category of science as usually understood.
I can’t help but see this as what might be called a reflex by all those reared in the atmosphere of a sensate culture. That term comes from Pitirim Sorokin, the sociologist, who classified ages such as ours as totally focused on sensory experience whereas religious ages focus on internal, spiritual processes and look "upward" as it were, rather than "outward" at the world. Members of the sensate culture live in flatland. They cannot quite internalize the fact that the moment meaning is seriously contemplated to be part of the cosmos—wherever it may be found—the picture is immediately transformed. It becomes hierarchical.
Jung’s whole career had a transitional flavor. I myself strongly suspect that part of the nineteenth and all of the twentieth century were times of another renaissance, thus periods of transition between ideational/spiritual and sensate/materialistic ages—or the other way around. The period we call the Renaissance was the transition between Christian spiritualism and the great era of materialism that is now crumbling all around us. Such periods are outwardly oriented. The reverse motion is inwardly directed. The chaotic times that always follow when materialism collapses under its own weight are dramatic in another way and do not leave the same kind of grandiose impression. Throughout his life Jung kept reintroducing his audiences to the ineffable dimensions of spirit and mind, but he tried so hard to remain part of the scientific caste that his creations were highly ambiguous at best, an example of which is the Collective Unconscious. It has a strong flavoring of divinity. His archetypes, which live in the Collective Unconscious, are reminiscent of Plato's eternal ideas. And, finally, his synchronicity reintroduces “meaning,” albeit in quotes.
I’m glad that Jung labored as he did. For many members of my generation, he was among the most important guides leading us out of the wilderness of materialism toward the new world that approaches, as it were, both from the future and the past. For many of us the old order was decadent beyond retrieval. At the same time, in my generation (although I may only be speaking for myself) the sensate bias that caused Jung to hesitate did not prove to be much of a barrier. I myself wave it off as mere hesitation to make the break clean.
Labels:
Jung,
Meaning,
Synchronicity
Monday, April 20, 2009
Ahead of His Time
If Teilhard de Chardin could now observe
This vast electric universe converse
Across the continents and oceans to
And fro unceasingly by light and night,
He’d surely think that he’d foreseen before
It could cohere the realm that he had named
The Noosphere.
Labels:
De Chardin,
Noosphere,
Poems
Saturday, April 18, 2009
Exit Torture
In politics they call it “lowering expectations.” During my years in Kansas, some of my colleagues used to do that by saying, in a drawl, “Shucks, I’m just a simple country boy.” Whoops! Time to check your wallet. Beware of Greeks bearing the gift of humility. But this sort of thing doesn't bother me much. The savvy see through the maneuvers; the innocent learn from them when conned.
More painful are expectations arising from a long tradition of self-admiration reinforced in every way, not least in the way history is taught. It’s not a uniquely American trait. In Hungary they taught us geography using maps of the country quite oddly drawn. Hungary itself was in the center, relatively small. Drawn around it was another border making the country look a good deal bigger; one leg of it actually touched the Adriatic. The inscription on the map was:
Kis Magyarország nem ország
Nagy Magyarország mennyország!
The lines rhyme, have the same meter, and can be recited like a slogan. They translate as “little Hungary is not a country, big Hungary is paradise.” Guess which was the actual country we were living in…. The system was bending those little minds of ours into the pretzel of good citizenship.
The papers today are full of outrage over revelations that a recent Attorney General, his knees no doubt buckling under the majestic pressure coming from the White House, found official grounds for permitting torture. First of all, this news is stale. We’ve known the fact for quite a while, but now we have another occasion to tear our clothes and hair and roll on the ground in outrage. Come on! What this calls for is repentance and confession of shame—always a good deal more useful if sincere.
This shining city on a hill was born at a time when people owned people and could whip them bloody without any court’s intervention. During the Civil War we suspended habeas corpus. In World War II we interned Japanese-American citizens in camps, violating our own constitution, in spirit certainly. During and after World War II we actively recruited and brought to this country Nazi operatives. During the McCarthy years we persecuted and marginalized people with offending political leanings. We manipulated other governments all over the world and made “regime change” a legitimate government activity. Iran? If they resent us, there’s good reason. The aftermath of 9/11 is simply another repeat of patterns we’ve seen before. In a crisis we pack up our morals and ethics and put them in the closet high up there, covered in plastic to be sure, but covered.
Enough of this self-pleasing outrage. It suggests that we are sinless and a few bad actors are to blame. The proper way to look at this is to confess to oneself that, quite possibly, under sufficient pressure, my own knees might buckle too. And I pray to God they won’t.
Labels:
Culture
Thursday, April 16, 2009
Of Microbes and Men
The headline in the New York Times, dateline April 13, was UNITED MILITANTS THREATEN PAKISTAN'S POPULOUS HEART. The story dealt with combinations of Taliban and local militants in several areas, and cited as examples the attack on the Sri Lankan cricket team in Lahore and the bombing of a Marriott Hotel in Islamabad, both done by lash-ups of locals and external groups. The article emphasized that intrusive activities by the United States have stimulated combinations in Pakistan of groups that have not always worked together. Brigitte read this story and saw analogies to biological phenomena—namely the body’s mobilization against invasion. I thought the analogy very neat indeed, especially in the context of cultures that, having an organic character, behave in the manner of an organism.
Pakistan, India. These two, of course, are parts of a decadent empire, more of a huge seedbed of cultures than a living organism, and within them new cultures have formed and compete against each other trying to win dominance. Our conventional view from here is obviously not sophisticated enough. We’re able to see only states and so-called “failed states.” We fail to see that Muslim fundamentalism in Pakistan and capitalistic India are both contenders in a battle of cultures. We assume that modernizing India is the legitimate future and chastise it for not bringing rural India up to tractored and fully-irrigated snuff fast enough. We imagine that Pakistan ought to buckle down and bring its western regions under proper control, unable to see that its regions bordering Afghanistan (and those of Afghanistan bordering it) are culturally cohesive and fiercely resistant to so-called higher civilization. Our Western concept of progress is like a blinder. We can’t imagine any other future for any large population in the world that does not feature mass democracy and free-market economics dominated by capital.
But this is not Brigitte’s insight so much as a way of saying that we can’t see organically enough. Brigitte’s point is that excessive intrusions produce exactly the opposite of the desired result. Modest doses of antibiotics may help a body rid itself of harmful elements, but too much (read: a lot of collateral damage) rallies the body to an energetic rejection of the intruder, no matter how beneficial the gifts that it bears. Elements accustomed to operate competitively unite against the common enemy. Why can’t we see that? Brigitte asks.
I would answer that civilizations are rootless and therefore unaware of forces that maintain societies over the longer haul. If elements in Pakistan resist us, it is because they know that our intention is pure national interest. Our real attitude towards Muslim culture is contempt. We want access to Afghanistan to protect us against Al Qaida. We couldn’t care less about the strange cultural process taking place down on that far-off map. We are the user—and elements down there will be God-damned if they’ll be used. Special Ambassador Richard Holbrooke is talking to the wrong people. He is communing with Pakistan’s rootless elements who, in turn, are really just as helpless against the organic formations within their geography as we are. Folly. Folly.
Labels: Civilization, Culture
Tuesday, April 14, 2009
Culture and Civilization
When we use the word culture more or less unthinkingly to mean “the things going on around us” and “the general arrangement of things,” we are very close to the original and still used secondary meaning of the word. That origin lies in “cultivation,” thus the arrangement and working of the soil, garden, field, and forest land done to sustain us. The modern word ecology carries a very similar meaning but with an unstated stress on the mutual linkage of everything.
Oswald Spengler, one of my cyclic historians, proposed (in Decline of the West) a distinction between culture and civilization. I found this view most apt and informative even when I first learned it at around nineteen or twenty years of age. Kultur for Spengler was the early and still organic phase of a distinctive society; Civilization was the late phase. The chief characteristic of civilization is deracination (still citing Spengler), a neat derivative from French which euphemistically makes the blunt Anglo-Saxon “rootless” sound sophisticated. You might say that the word itself describes civilization. We move from rootless to déraciné and in the process begin to float in the air, from Conan the Barbarian riding a great steed to Master of the Universe riding clouds of default swaps and derivatives. I realize that the analogy is inexhaustible: the homestead is culture, the mortgage-based derivative is civilization. One last pairing, tradition and public opinion: one is rooted in long-established and time-tested habits; the other is the unstable and flighty flutter of moods in the winds of the media.
We can escape neither culture nor civilization. It pervades, it penetrates, it’s everywhere. If you’re a person of culture, whether primitively so or highly cultivated, you will not, repeat not, feel at home in a civilization. The cultivated person will in some real ways be related, will belong to the same category, ultimately, as the fundamentalist.
More pairings will make this plain. Rationalism belongs to culture, intellectualism and sophistry to civilization. Culture cultivates the will, civilization the appetite. Sports and contest (whether of teams or spellers) are on one side, spectator-sports and soccer riots on the other.
The chief differences are that cultures are coherent, if often carrying an irrational element; they are also hierarchical, perhaps because they are coherent. Civilization is incoherent and ultimately flat. Its incoherence is in fact caused by its flatness. Power in civilizations derives from mass opinion. It is a single value although differently named: wealth, celebrity, popularity (as in politics).
Multiculturalism is an example of incoherence. All cultures are equal. Give us a break! I wouldn’t want to be a woman in a Muslim or a Hindu culture. No thanks. But let’s look at the ultimate meaning of that term. When everything is equal, nothing has value. Multiculturalism is the de facto rejection of culture dressed in a patronizing tolerance.
I’ve had the good fortune to experience real culture before the last wave of civilization submerged most of it in Europe. I was born in Hungary and therefore in the provinces of Christendom where modernism had had somewhat muted influence. In my early childhood I lived, moreover, on the very edges of Hungary, thus even more removed into spaces still dominated by the past. And, in the last year of World War II, and for a time thereafter, I lived in a backwater of Bavaria, in Tirschenreuth, where Christendom was still in full flower—more by the nature of its location and national neglect, I think, than any other reason. This background formed me. I’ve always been grateful—even if living in civilization gives me little comfort. Living in modernity, however, cannot be avoided—no matter where we first saw the light of day. My coming to America had nothing to do with it. Here, too, many have lived in pockets of resistance, but the times keep marching on, the blight spreads, indeed will do so until the season once more changes. I have a sense that, beneath the undifferentiated devastation modern civilization represents, new shoots are sprouting. In time a new culture will be born of these green things as the old order finally breaks down and its rubble is carried off as raw material to build something new. Signs of that are also evident.
Labels: Autobiographical, Civilization, Culture
Thursday, April 9, 2009
Turbotaxonomy
Doing taxes is the royal pain it is for reasons more domestic than federally bureaucratic. The really big job, in other words, is to assemble all of the necessary slips of paper that actually prove that we exist. In Taxland, that is. Those pieces of paper rarely matter, but when they do, they really do. I remember the shock I experienced some years ago when, applying for Social Security on or around my reaching ultimate ripeness, the SS Administration turned up its nose at all manner of proofs that I exist—my immigration certificate, my citizenship certificate, my honorable discharge from the Army, and lesser documents yet. No! They wanted my birth certificate! But here I was, age sixty-six, and I’d never actually seen it—only a typed translation from the Hungarian into German, I think, that my father had procured by some odd stratagem in the late 1940s. I was born in Budapest just before that city was laid under siege by the armies of the U.S.S.R. during World War II. The city was badly bombed and shelled. Could I possibly get anyone now, two-score-years-and-counting later, to come up with that piece of paper from over there? So far as the SSA was concerned I hadn’t been born; I was the mirage of an elderly man; I existed physically, for there I stood—and in the United States of America I had the necessary papers to prove myself real—but not in SSA Land. I began the process by painstakingly assembling old Hungarian words to address the Hungarian Embassy in New York (on paper, of course), the folk bureaucratically indicated as the keepers of the gate. Amazingly it all went very smoothly. Thanks to humanity’s widespread passion for hoarding old records, the appropriate place in Budapest still knew me! Heaven be thanked! They hadn’t been bombed or burned! Now came another surprise. SSA accepted the certificate as it was, in Hungarian, and scoffed at my offer to provide them a translation. “We’ll take care of that,” they told me with dark knowingness. And now that I existed, I could approach the table and sit down.
Even in those days now half forgotten when at tax time you praised the Lord for electrical calculators and sharp pencils, I’d come to the conclusion that actually doing the taxes, which amounted to filling in blanks as instructed, was not really beyond the powers of a person able to read. Indeed, back then, I used to joke that I made my living by the power to read alone; many had learned the art but had decided not to exercise it once out of school. But the trauma of tax-time was then no different. Find the little pieces of paper.
TurboTax, my well-paid slave (this year’s package cost me $60), automates the fill-out-the-form process, thus shaving a very thin sliver of effort from the entire process. You still have to enter the data on the keyboard, but all the numbers will end up in the right place, the slave does all the look-ups, does all the calculations—and dispenses good advice during the process. But this begins after you’ve assembled that ominously named documentation.
This year, horror of horrors, I discovered at the end of the longish day that we owed a huge amount in taxes, over against last year, despite the fact that our income was just a thumb-width greater than in 2007. Why? Why? Why? With the new 2008 return displayed on the screen, I compared it to the printed version of the 2007 return—and discovered that, somehow, we were missing (shriek, cackle!—as the comics put it)—two pieces of paper!!!
We were missing, to be specific, two form 1099-Rs, those that recorded our RMDs from our IRAs, one from Brigitte’s, one from mine. I’d filled in the actual RMD amounts from other data; but without the forms themselves, I could not discover whether we’d paid income taxes on those sums or, worse, had not. After these experiences, RMDs have etched themselves into my memory: the first set of letters stands for “required minimum distribution,” the second for “individual retirement account.”
As always on these momentous days, I descended from the fumes and nasty fires of my working desk upstairs to find Brigitte serenely living her life, watering plants in our bright sunroom. Brigitte is Keeper of Records and CFO of this unit; nowadays letters like that must be appended to underline the person’s importance. Her face took on a rigid form. The search for 1099-Rs began. It took a while. The tension mounted. At last we found a soiled copy of mine, abraded somewhat by having slipped off a tax-pile on a desk. We already had the number, not the paper! At last. The second one required, finally, a telephone call to Merrill Lynch, the spelling of whose name (two Rs, two Ls) tax time always re-teaches me. Now we are in the clear, at last, and the turbotaxonomy of our lives is ordered properly. I can breathe easier now and therefore have decided to postpone e-mailing our Federal return until tomorrow.
Labels: Taxes
Monday, April 6, 2009
Meddling
Among the consequences of narrow materialism are certain interventions by experimenters into the medical arts. Today the New York Times brings us a story of scientists able to erase memories selectively in mice and rats using a drug called Zip, suggesting that people’s traumatic memories, drug addictions, or anything else currently thought unpalatable may in the future be removed from the behavioral menu by zip-zapping their memories. I pity misbehaving school boys in the future; come to think of it, I already pity them; they’re already targets of Big Pharma. George Orwell’s imagination evidently failed to disclose the sophisticated means whereby Big Brother in the future will show how much he loves us. By coincidence—a meaningful coincidence, perhaps—just yesterday I finished reading an Agatha Christie novel quite unusual in its subject (Passenger to Frankfurt). It projects a crisis in the world (resurgence of anarchy and Nazism) dealt with by the deployment of a wonder-chemical, the invention of Project Benvo, which compels those who’re dosed to display benevolence for the rest of their lives. The unexpected consequences of modern discoveries are another sub-theme of this novel.
Now glancing at my own reaction I’m reminded of the story of the two old ladies in church listening with great approval as the fiery preacher in turn thunders against greedy merchants, seductively dressed women, and adultery. Each time he reaches a crescendo, the old ladies nod and say “Amen, Amen.” But when the preacher next turns to excoriate those who indulge in the consumption of snuff—then the two ladies look at each other. And they say: “Now he’s meddling!”
Consuming, as I do, seven different medications, I’m not exactly abstemious when it comes to pharmaceuticals and hence only, at best, passively resist such “interventions.” Indeed, without interventions, I probably would have said my farewells and adieus in 1995. But there’s a limit.
When it comes to mind and memory, we know too little. When it comes to medicating people to control behavior—done to minors mercilessly, their parents yielding to Big Brother (or Sister) like shy primitives—I draw the line. We know too little. And our governing model is not at all persuasive to me. What is that model? It is that pure chance, Bertrand Russell’s “accidental collocation of atoms,” produced all living things, including brains. There is neither meaning nor purpose in this great process and therefore intervention is permitted by definition if only you can get “the parents,” as it were—the public—to agree. Whoa! I for one would like to zap the Zip before it reaches adulthood and it becomes politically correct to get a memory tune-up, especially, perhaps, after adulteries or divorces.
Labels: Christie Agatha, Pharmaceuticals, Russell
Thursday, April 2, 2009
A House Gone South
There is a house on Charlevoix, a choice estate high on a mound
And bound around its ample girth stand firs and bushes dense and green—
Stood once stand still, but in the past a flag cracked high atop a mast
And nights on summer walks I saw warm lights from windows large and small
And in the Fall at times red lights and movement, music, cheer announced
A party underway to celebrate the harvest on Wall Street,
Halloween, or was it just the summer’s end? The house had two
Perhaps three children, too, and in the day you’d see abandoned on
The drive the boy’s turned-over trike and girls’ forgotten dolls and toys.
But that was then.
One fine day three years ago a For Sale sign appeared, discreet and
Tucked away, almost as if to say we want to sell but yet we
May still change our mind. Bushes slowly hid the sign behind
Their leaves, and then time seemed to stop as on a dime because the life
Inside the place began to flow more thinly now, then slow…
Although a single car (once there were three) still stood out in the snow.
The auto vanished too, replaced there by a box placed there by PODS
The mobile storage folk, to hold, perhaps, the last few residues
Of once rich harvesters of higher revenues now rent apart
Who knows, nasty divorce? or the worst auto sales since before
The day that Ford began to build the Model-T in old Detroit?
And that was later.
The other day, the sale sign gone, the vegetation taking charge,
The pod departed and no trace of life, I dared to stray beyond
The fence into the mansion’s unkempt grounds whence no longer
Veiled by bush or tree, I now could see the house abandoned to the
Birds, the mouse. And there I glimpsed through broken glass interiors left
To the past—torn bits of carpeting, a half-demolished desk-like
Thing, a fallen plastic jar that, rolling, had disgorged a wreath
Of sorts onto the naked floor. Another look revealed yet more
Unsightly scars, not least the plywood sheets that had replaced the
Windows raiding boys, presumably, had earlier destroyed and,
Running off, had left behind. “Alas, this house’s gone south,” I thought and
Shook my head, troubled in mind but homeward bound to eat my daily bread.
The Middle Class
Several years ago both Monique and I participated in the production of a four-volume work entitled Social Trends & Indicators USA, a project that we’d envisioned ourselves and which then appeared under the copyright of ECDI† in 2003. The concept behind SIUSA (as we abbreviated the project in-house) was to illuminate trends in society by the use of government statistics. Each entry in these volumes consisted of a graphic followed by a page or two of succinct commentary. We placed the actual statistics we’d used to make the graphs at the back of each volume in numerical format—following the general rule that in socio-economic studies others should be able to examine your logic with all numbers disclosed. Six of us labored on this project virtually day and night to bring it home by the deadline. It was a memorable year filled with valuable discoveries. The highlight of each week was a meeting in the course of which each of us presented the entries we’d produced that week in summary—and we discussed them all. Perhaps the most general of our discoveries was that demographic structure drives everything else, all things being equal—but often also when they’re not. But all this is just a preamble to something else.
The other day Monique sent me a link with the laconic subject: “When you have a spare hour…” The link turned out to be the video of the 2007 Jefferson Memorial Lecture presented at UC Berkeley by Elizabeth Warren, the Leo Gottlieb Professor of Law at Harvard Law School. (Yes, I do! I bow to those who fund these chairs. Many people don’t share their wealth, and those who do deserve the bow.) The lecture’s title was “The Coming Collapse of the Middle Class.” You can see the lecture here. Yes, it takes an hour. And yes. It’s worth it!
In this living version of the Ghulf clan there are quite a few of us with a keen interest in the cultural and cosmic climates. Tracking and documenting pieces of it that become visible in some way—through statistics, literature, comic books, pop music, esoteric culture, however we can get at it—is occupation as well as preoccupation. SIUSA and the other USAs we have produced were and are part of that effort. So is our interest in Peak Oil and my own in cyclic history. Well, Elizabeth Warren undertook a very careful study, in methodology identical to that which we followed in SIUSA. She compares a representative (median) family of four as it existed in the early 1970s to one as it was in the early 2000s. She too presents graphics and then makes succinct but penetrating comments—and the conclusion that emerges is dramatic. Listening to her one feels as if watching (helplessly because we are so tiny) the slow-motion evolution of a major disaster that’s coming with what seems to be a kind of inevitability.
But enough said. The lecture is worth seeing, the conclusions worth pondering. It’s better always to face the future knowing than not—and we may even be enabled to change the course of the raging river despite only holding a little cup in the hand.
__________
†Editorial Code and Data, Inc.
Labels: Economics, Middle Class, Trends