Thursday, July 11, 2013

Give up the myth of human progress?

“Clearly science and technology have put extraordinary knowledge and power at the command of beings who come into the world with the same brains and mental faculties as humans born 5,000 years ago. Any victory over our species’ destructive tendencies will likewise have to come from institutional and cultural development. We know what humans are capable of: in the wrong circumstances and with the wrong formation, they can behave monstrously. The hope for progress can consist only in the belief that there is some form of collective human life in which the capacity for barbarism will rarely find expression, and in which humans’ creative and cooperative potential can be realized without hindrance. [John] Gray regards such hope as utopian, but it can be supported both by experience and by reflection. Moral and political progress is inevitably more difficult than scientific progress, since it cannot occur in the minds of a few experts but must be realized in the collective lives of millions; but it does happen. Experience shows that some societies are much more decent than others, and that in fits and starts, cruelty, oppression and discrimination have become on balance less acceptable over time.” Thomas Nagel. NYT. 7/5/13.

In this quote, Thomas Nagel defends the myth of human progress. And I wonder, reading it, whether the Holocaust is not made more horrible by its deep challenge to this myth. The German people were (and still are) shining examples of the best of Western civilization--and evil still took and used them without interference to accomplish in reality the things that exist only in our unspoken nightmares.

I've been thinking off and on about this myth of human progress. I love this myth. Believing it gives my life and my own civilization meaning and purpose. I am at the crest of thousands of years of progress. I am better than my forebears. And those who come after will be better, happier, and healthier than me. Their children will be geniuses next to my age. And yet, let's face it, this is a thoroughly secular eschatology built on a highly selective reading of the facts.

In short, the Judeo-Christian worldview should have none of it. It is a kind of antichrist. One of those lies that hold down the truth.

On the other hand, can someone just give up the myth of human progress? Can someone simply rewire the operating system they imbibed with their mother's milk?

Thursday, April 18, 2013

Hacking Language Acquisition: Some Thoughts.

What is language? How are languages constructed? And what are the best ways to quickly learn a language? These and similar questions have been rolling around in my mind over the last few months.

It all started because I have been teaching Greek to some dedicated friends. Now if you know anything about classical languages, you know that many people every year take these languages, such as Greek, Hebrew, and Latin, in graduate school as prerequisites toward further study. And you may also know that the post-graduation rate of retention and enjoyment of these languages is not good. My friends and I are putting a lot of effort into learning Greek. I don't want them to become a statistic--and, frankly, I don't want to become one either. So I ask the question, "What does it take to escape the attrition trap and break through into enjoying, and so sustaining and even growing in, a language?"

I work a job, so I don't have a lot of time for deep reading and research about this. But here are a few things I've learned.

Immanuel Kant was a Linguist

According to Immanuel Kant, human beings construct the world along two planes: extension and change, in other words, along space and time.

  1. The noun system captures space
  2. The tense system captures time

Both employ the same method to do so.

Beginning with a root, prefixes and suffixes are added or removed to fix that root within a matrix that assigns it jobs--being a direct object or a subject, for example.

We memorize a lexical form (lemma) of a word, but understand that the lemma really exists as a root that can manifest anywhere along a matrix.
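The root-in-a-matrix idea can be made concrete with a toy example. The second-declension endings below are standard Greek, but the code itself (the names, the structure, and the omission of accents) is only my illustration:

```python
# A toy "matrix" for a Greek second-declension noun built on the root
# λογ- (as in λόγος). Accents are omitted to keep the sketch simple.

ROOT = "λογ"

# (case, number) -> ending: the matrix that assigns the root its jobs
ENDINGS = {
    ("nom", "sg"): "ος",  ("gen", "sg"): "ου",
    ("dat", "sg"): "ῳ",   ("acc", "sg"): "ον",
    ("nom", "pl"): "οι",  ("gen", "pl"): "ων",
    ("dat", "pl"): "οις", ("acc", "pl"): "ους",
}

def inflect(case: str, number: str) -> str:
    """Fix the root within the matrix by attaching the ending for its job."""
    return ROOT + ENDINGS[(case, number)]

print(inflect("nom", "sg"))  # λογος
print(inflect("acc", "sg"))  # λογον
```

Swap in a different root and the same matrix yields a new noun's whole paradigm; that repetition is the "strategic recurrence" described below.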

Understood in this way, there is a deep repetition, a strategic recurrence, between the verbal and noun systems. Furthermore, taken together they answer Kant's qualifications for building the kind of world that human beings experience (an appropriately phenomenological world).

With this in mind, one can begin to interrogate a language and ask why this matrix was chosen and not another. Why five or seven cases and not thirteen or twenty-one?

Open- and Closed-Class Words

Linguists parse the words of a language into two categories: open-class and closed-class. Open-class words are the nouns, adjectives, adverbs, and verbs that we usually associate with language. They are also called content words or lexical words. These are the words that carry the meaning in sentences, and it is interesting that new words entering a language are always open-class words (thus the "open"). Closed-class words, also called function or grammatical words, are things like determiners, qualifiers, prepositions, conjunctions, and intensifiers. They serve a variety of functions, as their names suggest. They do not themselves carry meaning in the way that content words do; instead, they grammatically connect the open-class words.

There are far fewer closed-class words in a language than open-class words--fewer by orders of magnitude. And, here's another kicker: unlike open-class words, whose number may grow by coinage, invention, or wholesale borrowing from another language, closed-class words are stubbornly fixed. Languages have all the function words they need, and it is nearly impossible to add to or delete from their number, even when it would be useful to do so. Closed-class function words, then, are the inner skeleton upon which the open-class content of the language hangs.

In short, if you are going to learn a new language, get to know its function words well, and become adept at watching for them when reading. "Once the framework of grammar has been transferred to long-term memory," says author Tim Ferriss, "acquiring vocabulary is a simple process of proper spaced repetition." I've read that an ESL teacher, for example, should aim at the perception of the structure of a text before the individual words. By zeroing in on function words, one will learn more about a target language than by memorizing hundreds of content words.
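The "proper spaced repetition" Ferriss mentions can be sketched in a few lines. This is a toy scheduler of my own, not his method or any particular published algorithm (real systems such as SM-2 also track a per-card ease factor): the review interval doubles on a successful recall and resets to one day on a failure.

```python
# Minimal spaced-repetition interval rule: a sketch, not a real system.

def next_interval(days: int, recalled: bool) -> int:
    """Double the review interval on success; reset to one day on failure."""
    return days * 2 if recalled else 1

# A card recalled five times in a row comes back at widening intervals:
interval = 1
schedule = []
for _ in range(5):
    interval = next_interval(interval, recalled=True)
    schedule.append(interval)

print(schedule)  # [2, 4, 8, 16, 32]
print(next_interval(8, recalled=False))  # 1 -- a lapse starts over
```

The point of the widening gaps is that each review lands just as the memory is fading, which is where grammar frameworks and function words earn their keep.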

". . . the task facing the child is not to learn how language works, starting from scratch. Instead, since children are born with an implicit knowledge of languages in general, they have to figure out how the particular language (or languages) they hear functions. For example, all languages have something like prepositions, words that show relationships among things (The book is on the table). In languages like English, these words that show position come in front of the noun, so they are called prepositions. In other languages, these words follow the noun, so in those languages, a child would encounter sentences with this pattern: The book is the table on. In such languages, these words are called postpositions because they come after (post), not before (pre)." [1]

An Interesting Note from a Translator

"A translation is not just turning one language into another. It’s also about opening up a foreign mindset . . . to hear the text and experience it absolutely as intensely as I can, allowing myself to fall into its way of thinking about things. A good translator has to be an interested sponge when it comes to the idiom and cultural setting of the language he or she is translating from [--] fascinated by the picayune details of language. Every complex translation would be somewhat different if we had done it a month before, or a month later, or even an hour."[2]

Benny the Irish Polyglot Says to Ditch the English

One of the biggest lessons Benny learned in his transformation into a polyglot was to give up his English as quickly as possible. Be stupid, he says. Make mistakes. But leave your English behind and accept the frustrating, bewildering, but far faster path of working without your comfort language.[3]

Now, where dead languages are concerned, you can't just avoid your English-speaking expat friends and hang out with the locals. What you can do, however, is to up your exposure and go cold-turkey on sections of text. For example, one could go native on the Gospel of Mark or on a chapter of the Gallic Wars or something.

Tangentially, get away from code-thinking as soon as possible. A new language is not just a code for your old one. Stop occasionally and forget your native tongue while holding the new language in stasis like wine held on the tongue. Enjoy and parse out the sensation before swallowing.

A Large Bit on the Importance of Reading

"Studies of vocabulary development through reading give further support to the claim that most vocabulary is acquired. Anderson and Nagy carried out a series of studies on how children acquire words during reading. They found, for example, that there is about a one in twenty chance that a student will acquire a new word from seeing it in context. . . . If students see a word more often, they are more likely to acquire the word. Anderson and Nagy report that the average fifth grader reads for about twenty-five minutes a day. They comment, "This number is certainly lower than would be desired, but it translates into about a million words of text covered in a year." If even 2 percent of the words were unfamiliar, students would encounter twenty thousand new words in a year. If they acquired one out of every twenty, they would acquire at least one thousand words a year.

"These authors go on to say, "An avid reader might spend an hour or two a day reading, and thus cover four or more times as much text. The rate of learning from context for self-selected text is likely to be closer to one unfamiliar word in ten than one in twenty. For children who do a fair amount of independent reading, then, natural learning could easily lead to the acquisition of five to ten thousand words a year, and thus account for the bulk of their annual vocabulary growth."

. . . .

"A study carried out with adult speakers of English and students learning ESL showed that both groups were able to define many new vocabulary words just from reading a novel.

. . . . .

"[Another researcher] found that picking up words from reading is ten times faster than learning words through intensive vocabulary instruction. However, they also suggest that some vocabulary study can be useful. They encourage teachers to develop a sense of what they call "word consciousness." "We believe that the goal of instruction should be to develop what one lexiphile has termed word consciousness."

. . . .

"One of the benefits of acquiring vocabulary through reading is that students develop a more complete understanding than the superficial knowledge gained by memorizing a definition. . . . When students see and hear a word in different contexts, they build a subconscious understanding of that word. Extensive reading is the best way for students to build a rich vocabulary." [4]
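Anderson and Nagy's arithmetic, as quoted above, is easy to check:

```python
# Checking the Anderson and Nagy vocabulary arithmetic quoted above.

words_per_year = 1_000_000   # ~25 minutes of reading a day
unfamiliar_rate = 0.02       # 2 percent of running words are unfamiliar
acquisition_rate = 1 / 20    # one word acquired per twenty encounters

new_words_met = words_per_year * unfamiliar_rate
acquired = new_words_met * acquisition_rate
print(int(new_words_met), int(acquired))  # 20000 1000

# The avid reader: four times the text and a one-in-ten acquisition rate.
avid_acquired = (4 * words_per_year) * unfamiliar_rate * (1 / 10)
print(int(avid_acquired))  # 8000, inside the five-to-ten-thousand range
```

Which is the whole case for extensive reading in one multiplication: the exposure term dwarfs anything a vocabulary drill can add.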

Lend Language Your Ears

Human beings use air, their vocal cords, and the contraction and expansion of their oral and nasal cavities to express or suppress sounds. The Phoenicians were among the first to write down sounds phonetically rather than resorting to picture language. Writing in every language that does this--which is most languages--is therefore a kind of shorthand for the contraction or relaxation of organic sound production. Writing isn't where meaning lives. Writing tells you how to move your body to make the kinds of sounds that produce meaning. Generally speaking, language is an oral thing. When you look at a page of English or French or Latin or German, you are seeing instructions for producing the sorts of sounds that community of speakers agreed on. What this means is that when learning a new language, you need to keep your mouth and ears involved. Never read silently, and listen to as much as possible.

Set Short-Term Goals

One thing that Tim Ferriss and Benny the Irish Polyglot talk about is not dying at the hands of language perfection. Language courses tend to teach against an ideal of perfection, and this, they say, kills motivation. Language is the way minds connect with minds. It is the tool for making connections between people and cultures. So learn what you need to get to the point where you can start connecting, and let the rest take care of itself. That's what they would say.

This is less applicable where classical languages are concerned, but it is still helpful. If the point is to make connections, then read for meaning first before you read for parsed perfection.

Ferriss and Benny talk about setting smaller short-term goals: to carry on a two-minute conversation, to order coffee, to read a weather report, to read a chapter of Plato without reference to a lexicon.

And finally, I'm thinking about the idea of micro-grammars within languages right now. It is one thing to learn how to say hello or to construct a simple sentence, but the conversation dies quickly thereafter for lack of a micro-grammar. A micro-grammar is a term I use for spheres of language: the weather, sports, the office, family life, religion, what's new in politics. Native speakers move from micro-grammar to micro-grammar as easily as moving from room to room, but even in one's native language there are times when one must acquire new words for a new environment (the micro-grammar of your city or neighborhood). I'm not yet sure how best to incorporate this into language acquisition, but when I look at various children's books I can see that they recognize the existence of such grammars and take steps to teach words accordingly.

The Two-Face Technique

From the beginning, I have thought about language acquisition using the metaphor of climbing Mount Everest. There are a number of reasons for this. The effort and focus that climbers display even years before their attempt. The social, material, and physical expense and exertion, if not pain, required to successfully summit the mountain. The way that climbing Everest has become a well-understood and apportioned process of having such-and-such gear and moving up the mountain through established camps. The fact that hundreds if not thousands of people make an attempt every year (you aren't any different from them).

Now one element of the metaphor that has become very useful is the difference between Everest's south and north faces. The south face is the (qualified) easier of the two. The south face was the way Hillary and Norgay made the summit in 1953. The north face, however, is a beast.

For my purposes, each face represents a technique of language acquisition. The north face represents what we usually think of when we think of learning a language: rote memorization and paradigms. The raw violence of making our minds sink new synapses into new patterns unattached to any other familiar information. The south face represents the way native speakers learned their language. South face techniques are reading aloud and reading a lot. They are fun and easy, and in my experience they charge a session with energy and life. North face exercises feel like work. South face is just fun.

The trouble, of course, is that south face takes a good long while and a lot of exposure--far more than we'd achieve even through a course of immersion. (Non-native speakers tend to achieve a homeostasis of "good enough," which is why I say even immersion is not sufficient.) Few people have the time or patience for such an approach. On the other hand, north face is not so great either. I already mentioned the abysmal rates of attrition by graduates who have taken even years of a language in formal instruction. So what to do?

My hypothesis is that good language work needs both faces delivered in appropriate amounts. Overall, south face activities are best, but north face activities should be used to turn the dial faster. You swallow a paradigm or construction quickly, via north face, and that reinforces your south face work and makes it more capable. The resulting success pumps endorphins into the whole and keeps the arrow of acquisition moving forward.

__________


[1] David E. Freeman and Yvonne S. Freeman, Essential Linguistics (Portsmouth, NH: Heinemann, 2004), 14.
[2] Dennis Abrams, "The Art of Translation: Something New, Something Old," Publishing Perspectives, http://publishingperspectives.com/2013/04/the-art-of-translation-something-new-something-old/, accessed April 2, 2013.
[3] Watch his interesting TEDx talk at http://youtu.be/HZqUeWshwMs
[4] Freeman and Freeman, Essential Linguistics, 2004.

Friday, November 02, 2012

The Three Options

Bluntly stated, there seem to be three options available to the publicly thinking Christian today. And by publicly thinking I mean the Christian who is interested in thinking outside of the ecclesiastical circle, the Christian interested in speaking in the public square.

The public square is always tended by a gatekeeper, and as I've said before, the gatekeeper in our day is Enlightenment secularism manifested culturally in the materialism of the hard sciences. This is to say that the public square and all conversation allowed in the public square is implicitly in dialogue or in open agreement with the philosophical assumptions of science.

So, then, the public Christian has three ways of speaking. First, he may fully adopt materialist presuppositions and compartmentalize his mind, so that he is a Christian in some spheres and a secularist in the others. A subset of this is that he may refuse to compartmentalize the two and live buffeted by the tension. Second, he may retreat from the metaphysics of modernity into a neoclassical metaphysics. My hypothesis is that the Intelligent Design community is pursuing this option, and that this option fuels the success of that effort to revive the argument from design. Third and finally, he may adopt the very new metaphysics of process or emergence.

I believe that an orthodox confession can be made within any of these three strategies--and, for me, this is a very new idea! Using the metaphor of a car, where orthodox confession is the body, the underlying metaphysic is its engine. One can swap out an engine and retain the integrity of the body (to lesser and greater degrees). And, stepping further out, my guess is that the publicly thinking Christian cannot remain neutral forever but will, by steps known or unknown, inevitably choose one of the above three options.

Final caveat: in the course of time more options may develop. These are the ones I'm aware of presently. The Radical Orthodoxy movement of ten years ago (to the present?) was, to my mind, a variant of option two. And it is demonstrable that process or emergence has yet to produce what could be called thick orthodoxy, or something that the Nicene Fathers might recognize and agree with. This, however, does not mean it cannot, but only that it has not at this point in its development.

Tuesday, October 23, 2012

Revival or the saints?

On Friday, September 17, 2010, Pope Benedict XVI addressed students throughout the United Kingdom during his apostolic visit. He could have spoken apologetically or enumerated the evils of modernity. He could have sounded a prophetic warning, or he could have just said something politically winning and nice. He certainly started that way: "First of all, I want to say how glad I am to be here with you today." No doubt not a few expected the rest to follow in the same manner--but it didn't. The Pope had scarcely finished thanking everyone when, to my astonishment, he began talking about saints. "I hope that among those of you listening to me today there are some of the future saints of the twenty-first century." My astonishment was (and is) not that he talked about saints, but that he was inviting the students who heard him to become saints. He wasn't talking about St. Catherine of Siena (1347-1380), daughter of an Italian cloth dyer, but appealing to a would-be St. Ashley Butler of Leeds (1992-20??), marketing major, soccer enthusiast, and daughter of a network administrator.

"Perhaps some of you have never thought about this before," he continued. "Perhaps some of you think being a saint is not for you. . . . When I invite you to become saints, I am asking you not to be content with second best. I am asking you not to pursue one limited goal and ignore all the others. [I am asking you to put those other goals in the context of true happiness, which ] is to be found in God.

"We need to have the courage to place our deepest hopes in God alone, not in money, in a career, in worldly success, or in our relationships with others, but in God. Only he can satisfy the deepest needs of our hearts. . . . God wants your friendship. And once you enter into friendship with God, everything in your life begins to change. As you come to know him better, you find you want to reflect something of his infinite goodness in your own life. You are attracted to the practice of virtue. You begin to see greed and selfishness and all the other sins for what they really are, destructive and dangerous tendencies that cause deep suffering and do great damage, and you want to avoid falling into that trap yourselves. You begin to feel compassion for people in difficulties and you are eager to do something to help them. You want to come to the aid of the poor and the hungry, you want to comfort the sorrowful, you want to be kind and generous. And once these things begin to matter to you, you are well on the way to becoming saints."

It is well known that Benedict XVI has dedicated his pontificate to combating the secularism of the West and the spiritual indifference of its people. His life has been a long exposure to modernity in all its forms. As a young man, he was well acquainted with Nazism, having been conscripted into the Hitler Youth and later drafted into the military. After the war, he became a student at the Ludwig Maximilian University in Munich. There he studied Fyodor Dostoyevsky, Martin Heidegger, and Karl Jaspers and became a fan of the Roman Catholic theologian Karl Rahner. He witnessed the social turmoil and youth uprisings of the 1960s as a professor, teaching at Freising College and later at the universities of Bonn, Tübingen, and Regensburg.

So given his exposure to modernity and the overtly stated agenda of his pontificate, Benedict XVI chose to talk to the youth of the West about sainthood.

Contrast this with another doctrine aimed at spiritual indifference and even secularism. Contrast it with the doctrine of spontaneous, regional, and supernatural revival developed best in the United States by the Reformed theologian Jonathan Edwards. A New England Puritan, Edwards witnessed the revival of 1734-35 from his post as minister in Northampton, Massachusetts. The First Great Awakening extended from Great Britain to the American Colonies during the 1730s and 40s. It was the result of dynamic preaching that urged hearers to repentance and personal renewal. It turned the face of frontier Protestant Christianity inward, away from corporate liturgy and doctrine and toward private emotion and personal response. People were encouraged to examine their lives, to repent of sin, and to commit themselves to moral living. Edwards wrote a book based on his experience of the Awakening, A Treatise Concerning Religious Affections, published in 1746.

In Religious Affections, Edwards constructs a method of self-examination for separating true religious experience from its counterfeit. New faith, he says, must be tested to see if it is genuine. Good works in themselves mean little. Edwards sets out twelve signs of truly gracious affections, and of them, love is the chief: "Love is the chief of the affections, and as it were the fountain of them." And, he continues, only spiritual men, that is, human beings awakened by the Holy Spirit, exhibit its fruits.

Since the First Great Awakening, evangelical churches have prayed for revival in the face of growing secular indifference. With Edwards they have understood revival to be a changing of the mind and emotions of each human being toward repentance and amendment of life. With Edwards they have understood this to be completely a work of the Holy Spirit. And, keeping the First and subsequent Awakenings in mind, they have understood the phenomenon of Spirit revival to be pneumatic, unexpected, and in some way geographically located. The revival tradition has taken on other characteristics since Edwards as well, especially under the influences of charismatic preachers like Billy Sunday and Billy Graham and the pentecostal movements of the twentieth century.

My experience of revival as it is practiced in evangelical churches today is pedestrian. Revival announced in an American church today means a jejune week of long evening services. There will be jeremiads against the evils of the culture and full-throated appeals for repentance--hear the echoes of the tent-meetings of Methodist perfectionism, of "praying through" to the "second blessing," of tongue-speaking and people slain in the Spirit. There will be prayer, fervent and pious prayer. And there will be appeals for revival made to God--where revival is like a spiritual thunderhead and the praying congregation Elijah kneeling on the dry mountain.

This post has gone much longer than expected, but here are two responses to the challenge posed by Western secularism: popular Protestant revivalism and Roman Catholic sainthood. There is room here, I think, for a compare-and-contrast that would lead to interesting conclusions; I hope I have laid a very little of the groundwork, general as it has been. For myself, I think the revival tradition makes little sense in its expectation of a "latter rain" falling upon the citizens of [insert town, city, state, or country]. But there are real commonalities between Benedict's challenge and the challenge made by Edwards and picked up in the evening preaching from revival pulpits--even if Protestants recoil from the papist grammar of "sainthood" while they (we) pray for private sanctification and public rejuvenation.

Tuesday, October 02, 2012

The argument from neglect: a good paragraph

Breaking my usual essay-only obsession to post a good paragraph from Philip Clayton and Steven Knapp, The Predicament of Belief: Science, Philosophy, and Faith (Oxford: Oxford University Press, 2012).

"To the extent we think of God as a personal, active being, we inevitably apply [human moral] standards. Frankly, and I say this with the utmost reverence, the personal God does not pass the test of parental moral responsibility. If God is really personal in this way, then we must conclude that God has a morally abysmal record of inaction or ineffective action. This I shall call the argument from neglect . . . To meet this objection, a defender of personalistic theism has to do two things: first, show that there may be a good reason why a personal and active God, if there is one, either cannot or chooses not to perform the acts we would expect a benevolent God to perform; second, avoid what is in effect the reductio ad absurdum of constraining divine action so extensively that it becomes pointless or irrelevant. This chapter is devoted to addressing these two challenges." [Ooooh, that last sentence is like a spooky orchestra hit for theology nerds!] (56)

Thursday, May 10, 2012

Armchair Linguistics

There’s been a lot of talk about language on the web recently with the publication of cultural anthropologist and linguist Daniel Everett’s book Language: The Cultural Tool (Pantheon, 2012). In it, Everett argues that culture is the mother of language. It is a direct challenge to the dominance of Noam Chomsky's view that we are hard-wired for language.

The debate between Everett and Chomsky is an important and interesting one. Their two poles, culture and biology, mark out a linguistic field that serves my purpose. There are aspects of both views that appeal to me. I agree with Chomsky that language is irrevocably grounded in biology. And I agree with Everett that language is best seen as a tool. Language is a tool we use strategically to carry the freight of meaning from ourselves to others.

Over the years I have come up with my own linguistic rules of thumb. Untutored and unlettered, and lacking the sophistication of even an undergraduate in the field, I nevertheless publish them in the hope of future development, and for that reason I welcome your comments.

1. People mess things up through use

Languages slowly devolve toward grunts and clicks over time. Prepositions, for example, that used to denote fine shades of difference come to say pretty much the same thing. Similarly, verbs for actions that we do a lot, such as "bring" or "go" or "give" or "enter/exit", become linguistically complicated. The rococo nuances available to a language during an Elizabethan high point erode into simpler modes--shorter sentences, more practical applications, in short, toward the Strunk and White.

2. Language--this is my middle-way between Everett and Chomsky--is about feel and sound

To me Chomsky is right to a degree, in that we have only a finite number of sounds--even if, practically speaking, that finitude is nearly infinite. But Everett is also right, in that a culture will choose a subset from that near-infinite set and create all its words from the chosen subset. Languages could then, I suppose, be classified by subset, and dialects by their agreement or disagreement with the governing subset.

In Greek, for example, we discover a set of sounds (and human sounds are made by how the mouth feels, so feeling may enter, too, into how a culture chooses or dismisses a sound) chosen by that culture. And what culture? Oh, the culture that figured out musical scales and poetical meter. Plato has a section of The Republic devoted to poetic meter. Aristotle also lectured on it. People took it seriously. They listened. Therefore, we are invited to listen as well. At its deepest level, a language is a way of playing jazz on the most important instrument in the universe--the human voice. To learn a language is to be invited to step into the set and learn to jam, and we, like musicians, must learn to listen as well as to play. Playing is good. Listening is better. (A corollary to this is that language is a craft, not a sum.)

3. The observable world makes the rules

There are men, women, and things. There is us and those around us. There is the thing we throw and the thing thrown at us. The world presents problems to every culture, to every speaker, that he or she must solve. Some might also say that one's vocabulary determines what you see in the world. It is a romantic idea. IMHO, the jury is still out on that one.

4. Vowel sounds are the silly putty of language. Consonants, not so much

If you want to send a message to people 100 years from now, don't hand it to the vowels. Leave it with the consonants. The effects of this difference are mischievous and pervasive.

5. There are two strategies

To my mind, there are two basic strategies for encoding meaning into the extended musical riff we call a sentence: word order and inflection. Your language may borrow something from the other team, but it always emphasizes one or the other.

English is a word-order strategy. Our subjects come before our objects. Greek is an inflected language. Prefixes and suffixes are added or removed from a word to make it perform in different ways.

The gulf between these two cannot be overestimated. They are not only two strategies, they are two ways of hearing. I'm just starting to realize how much this is true.

Saturday, March 24, 2012

Dinnerstein and Downes on finding your project

One thing I’ve said before, and it deserves repeating, is that academic theology is not interested in discussing the how-to of the trade. There is, of course, a practical reason for this. There are no jobs, and so only the most brilliant or lucky should survive. Therefore, the absence of deep pedagogy is a mercy killing; and who can argue with that, coming out of this Great Recession?

Nevertheless, it is a truth to my mind that theology requires the pouring out of deep pedagogy. (I hesitate to use the word mentoring, since the forces of management have largely taken that over.) Theology contemplates the pouring-out God, and should make pouring-out people. To one with zero experience at the front of a classroom, perhaps that relationship seems to suffice. But I can tell you from the student side that it does not. It is like watching virtuosos play the piano but never getting a lesson. The student interested by nature in theological progress is left to their own devices.

I have found that listening to artists in other artistic disciplines is a great help in filling in the pieces. Artists aren’t reticent but eager to talk about how they do the work. Time and again, the way a painter or sculptor approaches the canvas or clay, or the way an author breaks down a plot, carries great lessons for the would-be theologian. Take the following exchange between pianists Lara Downes and Simone Dinnerstein.[1] The topic is Dinnerstein’s best-selling 2007 recording of J. S. Bach’s Goldberg Variations. Dinnerstein was pregnant with her first child while preparing for the recording. In this section of their discussion, they touch on finding and being confident in your own voice, on respecting great voices without being dominated by them, and on being taught and (briefly) teaching.

Lara: You’ve referred to the notion of "playing for yourself" having acquired new dimension when you were pregnant, and I really related to that. When I was pregnant, I felt both more connected to myself, and more independent, than at any other time before, and I remember feeling very strong in my musical choices at that time.

Simone: You feel like you are a little world of your own.

Lara: Yes, really. Can you articulate some of the truths about the Goldbergs, in particular, that emerged in that little world, for you?

Simone: I don't know how much of my musical decisions were directly related to being pregnant. But I did feel that I needed to really listen to what the music was telling me and not think about any received wisdom about how I should interpret the music. I had of course listened to many recordings of the Goldbergs over the years, but I was trying to approach the score with a clear head. In particular, I started to think a lot about what constituted pulse and rhythmic expression, about the shapes of the phrases and how they would be sung, or breathed.

Lara: I think that's what I meant about the freedom of choice I experienced during my pregnancy. It was something about letting go of preconceptions, both others’ and my own, and exploring freely. Before that time, what had been your strongest influence with the Goldbergs? What was your relationship to the Gould recordings?

Simone: I really loved the 1981 recording, and that had been my first introduction to the Goldbergs. Hearing him play the aria was one of those moments like an epiphany that you always remember. I listened to it obsessively over the years, as well as all of his other recordings. It made me feel quite intimidated, like everything that needed to be said had been said. But then in my late twenties I started to listen to other pianists, and I had another epiphany when I heard Jacques Loussier's recording of the Goldbergs. It opened up a completely different world to me.

Lara: Well, I would absolutely have bet that you came first to the 1981 recording! Even though your own interpretation is completely different/unique, somehow if I had to guess... My first was the 1955 version, and it makes a tremendous difference which you hear first, doesn't it? It's like an imprinting.

Simone: Yes - it's funny how that happens.

Lara: Let's talk about all the ancestors. When we're very young, we're so apt to copy, and so warned not to, and then I think we go through this long process of establishing independence. But then I feel like all the ghosts kind of come back into the room, and we can allow them to speak to us and share their contributions. After all they've made us: the generations of musicians who have come before.

Simone: That's beautifully put. My first teacher, Solomon Mikowsky, used to play many recordings for me of the works I was studying. We'd listen together and talk about the different musical choices that were made. He did this from when I was around ten. Then my second teacher, Maria Curcio, was quite different. She had an extremely specific way she wanted me to play, and the way I learned from her was by becoming her. For a while I lost myself in her, but I learned so much. After that I studied with Peter Serkin, who was a very searching musician and wouldn't give me any answers, just looked and explored the music with me. So after all of that, it took a while for me to get the sounds of my teachers out of my head and feel strong enough to make my own decisions without any guilt.

Lara: Yes. And whenever I work with young pianists, I'm aware of how difficult it is as a teacher to do anything but "show the way". It's a tremendous challenge to resist that easy communication!

Simone: Personally I think that is the best way to teach.

Here are two other examples of listening to the liberal arts. In the first, Peter Sellars talks about democracy and performance. In the second, Samuel Nigro gets under the skin of how modernity feels.

[1] Lara Downes, “Looking at the Goldbergs Part II -- Simone Dinnerstein,” On the Bench: Conversations with Other Pianists, Feb. 12, 2012, http://onthebenchconvos.blogspot.com/2012/02/looking-at-goldbergs-part-ii-simone.html (accessed March 22, 2012).

Wednesday, February 29, 2012

Alan Lightman sounds the alarm

Sound the alarm! Science’s priestly reign over the public square may soon be overthrown! The fortress of doubt could be breached. Already, the foundations of theoretical physics are straining and cracking. The barbarians are at the gates. They will torch the manicured gardens of reason. Who then will keep order? What of the state? How will the West survive?

To understand how we have reached this precipice, one need only look at developments in theoretical physics over the last few decades. Theoretical physics is the purest expression of science. Exploring the universe with sophisticated and occult mathematics, it searches for the deepest and most explanatory properties of nature. In the name of Isaac Newton, its faithful hunt natural laws as unapologetic Platonists. Their holy grail is a master principle that will explain everything.

According to cosmologist Alan Guth, “Back in the 1970s and 1980s, the feeling was that we were so smart; we had everything figured out.” It was true: theoretical physicists had come an amazing distance. They had accurately modeled three of the fundamental forces of nature: the strong and weak forces and electromagnetism. No one doubted that the remaining fourth force, gravity, would soon be wedded to quantum physics, with the result that a final theory--a theory of everything--would emerge. In the light of the theory of everything, the universe would no longer be a mystery, but a necessity. Enter the multiverse.

So much has been made of the multiverse on television and in the movies that it seems silly to explain it. Nevertheless, the multiverse is a cosmos fecund with an infinitude of universes, each with an unpredictable and unique set of physical properties. Most would be stillborn wastes of dead rock or awash in the violent spray of hyper radiation. But the tiniest fraction of a fraction of these might contain complex organisms or, rarest of all, intelligent life.

What makes the multiverse idea so necessary to cosmologists is a characteristic of the one universe we do know about--our own. As it turns out, our universe is stunningly, amazingly, fantastically, and completely fine-tuned to support life. This characteristic has only grown more miraculous as physicists have better understood how delicate and complex it all is. I imagine that somewhere in the first quarter of the twentieth century this fine-tuning was ignored in public and rarely discussed in private. Back then, Einstein’s general relativity was upsetting the comfortable givenness of the steady-state model of the universe. But as our models have become more complex, the evidence of fine-tuning has grown to an acuity that no one can ignore.

Such fine-tuning forced working physicists into a conundrum. They could roll away the stone and resurrect the argument from design, much to the smug satisfaction of the Intelligent Design community. Indeed, many theists and polytheists argue that the fine-tuning of the universe suggests a transcendent designer. Francis Collins, for example, at the 2011 Christian Scholars’ Conference said, “To get our universe, with all of its potential for complexities or any kind of potential for any kind of life-form, everything has to be precisely defined on this knife edge of improbability. . . . [Y]ou have to see the hands of a creator.” But religion is not an option for science, even though many scientists hold religious beliefs. Science as science cannot embrace unfalsifiable and unrepeatable hypotheses. If it should do this, it instantly becomes another propagandist in a thoroughly political universe, opening the way to the naked power of fascism, the hive mind of socialism, or the cultic and bloody mysteries of theocracy. Here be barbarians.

As it stands, physicists have two options: string theory and the multiverse. String theory has been around for decades. It suggests that the smallest bits of stuff that exist are vibrating, tiny, one-dimensional loops or strings of energy. The differences in their vibrations give rise to the fundamental forces and particles familiar to physics. Many hoped string theory would be able to unify gravity with quantum physics. And if string theorists could pull off this unification, they would realize the Platonic ideal of a fully explicable cosmos. But there remains a problem.

At its inception, string theory required a number of extra dimensions: seven at the beginning, with each way of folding those dimensions corresponding to a different universe. Now, however, the number of possible universes has grown to 10 to the 500th. It may as well be an infinity, explaining everything and so explaining nothing. Never mind that, as of this writing, string theory has not been supported by a single experimental result, nor has it suggested demonstrable areas of further investigation. Its failure leaves only the multiverse.

Lightman tries his best to assert that a multiverse is at least suggested by modern physics. He points out that eternal inflation suggests it, and cites Alan Guth’s original inflation theory, which Andrei Linde, Paul Steinhardt, and Alex Vilenkin developed into eternal inflation some twenty years ago. But eternal inflation says that the universe is expanding upon a field of dark energy that has different properties at different points in space--the same energy of which he admits “no one knows what it is.” He goes on to admit that physicists “give a fantastically large range for the theoretically possible amounts of dark energy” (emphasis his). He then abandons eternal inflation and resorts to a pathetic argument from authority, writing, “Some of the world’s leading physicists have devoted their careers to the study of these two theories.” Eventually, however, he has to admit that “neither eternal inflation nor string theory has anywhere near the experimental support of many previous theories in physics, such as special relativity or quantum electrodynamics.” By this he means that the latter two have been independently verified by a number of experiments over the last half of the previous century and have suggested further avenues of research, whereas the former are nifty math gymnastics for the initiated. In other words, the multiverse is not the elegant explanation physicists expected. They went looking for a universe of light and form, but wound up with something dark and formless.

Keep in mind that the multiverse idea is no friend to theoretical physics. Lightman admits that “if the multiverse idea is correct, then the historic mission of physics to explain all the properties of our universe in terms of fundamental principles--to explain why the properties of our universe must necessarily be what they are--is futile, a beautiful philosophical dream that simply isn’t true.” If the multiverse idea is true, he continues, then "there is no hope of ever explaining our universe's features in terms of fundamental causes and principles."

Therefore, because of our universe’s demonstrable fine-tuning for life, theoretical physicists have oh so quietly abandoned empirical science for faith. “Some [physicists] feel relieved,” Lightman says. “Some feel like their lifelong ruminations have been pointless. And some remain deeply concerned, because there is no way they can prove [the multiverse].”

Appealing evangelistically to his scientific peers, Lightman says, "Not only must we accept that basic properties of our universe are accidental and uncalculable. In addition, we must believe in the existence of many other universes. But we have no conceivable way of observing these other universes and cannot prove their existence. . . . We must believe in what we cannot prove.” And so the multiverse, though a perennial boon to science fiction, is as whimsical a figure as the Flying Spaghetti Monster.

What a horrible state of affairs! For without the despotic threat of militant empiricism, the barbarians will most surely come. They will burn libraries in an inferno of anti-intellectualism. They will invoke and totemize the fine tuning of the universe to summon legions of theosophic spiritualisms. Eros will seduce reason, and governments will descend into a night of long knives. Heaven help us! The priesthood is forfeit. The public square lies open. Oh, Alan Lightman, how will they let you live?

Epilogue

Who would have expected it, but the so-called war between religion and science has been but a cordial tete-a-tete all this time. Kept under the watchful eye of white-cloaked science, the churches could relax. All those threats about secularism did but thin the ranks of the Elmer Gantrys, allowing ecclesial powers to pay more attention to the faithful. Who needs the hard and divisive labor of doctrine, discipline, and exegetical homiletics when one can employ the far more friendly and quantitative techniques of psychology and business management? Church discipline, private rebuke, and public apologetics are not necessary when only the faithful attend. Science too has benefited. In public, religion has been a noteworthy and engaging sparring partner: good for putting scientists on best-seller lists and magazine covers; good for TED talks, speaker's fees, innumerable conference sessions, and humorous anecdotes (and the benefits flow both ways). In private, scientists haunted by the specter of Oppenheimer have been glad to have an ethical stopgap to keep the whole thing human.

Read part one of this article, Science's Crisis of Faith, or read the whole thing as a document.

Tuesday, February 28, 2012

David Hume. An Enquiry concerning Human Understanding

An Enquiry concerning Human Understanding is the distilled presentation of a thinker who was at once a scientist, psychologist, metaphysician, and skeptic in a manner that continues to fascinate contemporary minds. The product of both youthful fire and mature consideration, the Enquiry “contain[s] everything of Consequence relating to the understanding.” In the face of skepticism, the Enquiry offered progress based on experience. In a time of dogmatism, the Enquiry dissected the bases of religious faith and delivered a still-powerful critique. Its attempt was nothing less than the construction of an anatomy of human nature.

Its author, David Hume, was born on April 26, 1711. He grew up in Ninewells and Edinburgh, Scotland. His widowed mother educated her “uncommonly wake-minded” son until his enrollment at the University of Edinburgh at age eleven, where he initially considered a career in law. Yet, in a decision that must have weighed heavily against limited means, the fifteen-year-old left the university to answer inner questions of theology and metaphysics. “I could think of no other way of pushing my Fortune in the World, but that of a Scholar & Philosopher.” Residing either in France or England, Hume served as tutor to the Marquis of Annandale, as a librarian of the Advocates Library in Edinburgh, and as secretary to the Earl of Hertford. He was best known to contemporaries as a historian due to the pro-Tory History of England, even though time judged his philosophy more influential. Hume befriended many notables, including Jean-Jacques Rousseau, Adam Smith, and James Boswell. After his death, others admitted admiration, including Auguste Comte, Charles Darwin, and Thomas Henry Huxley.

The Hume family were Calvinists and faithful members of the Church of Scotland. One of Hume’s uncles was a minister in the same. The young David took religion very seriously during an era characterized by religious indifference. He confessed, for example, to vanity for thinking himself smarter than his peers. He also applied himself to a moral code modeled upon the pedantic The Whole Duty of Man. Yet, after leaving the university, he began questioning religious dogmas, and especially proofs for God’s existence. He wrote, “It began with an anxious Search after Arguments, to confirm the common Opinion: Doubts stole in, dissipated, return’d, were again dissipated, return’d again.”

David Hume’s ascent to prominence in Europe’s literati had steep beginnings. At sixteen Hume had begun the labor that would, by twenty-seven, become the Treatise of Human Nature (1739). It was a failure. “Never literary attempt was more unfortunate than my Treatise of Human Nature,” he wrote. “It fell dead-born from the press, without reaching such distinction, as even to excite a murmur among the zealots.” Still, Hume continued writing. His Essays Moral and Political (1741-2) were very successful. Hume began courting the essay form popular in the Eighteenth Century, and considering himself a man of letters. He also began rewriting sections of the Treatise.

I had always entertained a notion, that my want of success in publishing the Treatise of Human Nature, had proceeded more from the manner than the matter, and that I had been guilty of a very unusual indiscretion, in going to the press too early. I, therefore, cast the first part of that work anew in the Enquiry concerning Human Understanding.

The Enquiry, first published anonymously as Philosophical Essays Concerning Human Understanding (1748), would prove to be the perfect vehicle for introducing Hume’s science of human nature. Its success when compared against its progenitor is unsurprising. Hume learned the benefits of rhetoric from English essayist Joseph Addison. It is to Addison that the Enquiry owes its informality and narrative style. Furthermore, the Treatise is three or four times the length of the Enquiry. Lastly, the former rambles in soporific repetition, while the latter’s essays continue only as long as a bourgeois attention-span might allow.

There were some difficulties along the way to the Enquiry’s success. Some felt it should never be published. Hume initially circulated the Enquiry for private comment. Henry Home, one of Hume’s close friends, advised against its publication. Home feared the consequences that might follow exposure of Hume’s skeptical treatment of religion. Hume had omitted a section in the Treatise on miracles for that very reason. This time, however, Hume was indifferent to scandal. Further, it looked initially like the Enquiry would emulate the failure of the Treatise. Hume, returning to England in 1748, wrote, “I had the mortification to find all England in a ferment, on account of Dr. Middleton’s Free Enquiry, while my performance was entirely overlooked and neglected.” Within a few years, however, one of the essays, “Of Miracles,” evoked responses which gained notoriety for the whole. The Enquiry, bundled with other works, was reissued in ten editions during his lifetime.

David Hume’s life fell within the hundred years we call the Enlightenment. The religious conflicts and scientific advancements of the previous decades bequeathed new confidence in human abilities to understand and manipulate the world. It was a confidence to doubt and to see anew. Men like Francis Bacon and Galileo Galilei no longer trusted the received Aristotelianism of the Schools. Rather, beginning with doubt, they made up the difference with discoveries obtained by induction from personal observations. Supreme among them was Isaac Newton. Newton’s physical and mathematical insights overturned centuries of Ptolemaic astronomy and revealed the limitless universe in ways wholly new and unexpected. His success was a powerful recommendation for doubt and observation as the best method for obtaining knowledge. It attracted many admirers, not least John Locke and David Hume.

Descartes and Locke sought to do for the inner world of human nature what the new science was doing for the outer world. Knowledge, to be certain, must be placed upon a proper foundation. In his Discourse on the Method (1637), Descartes applied radical doubt even to his own existence in order to fix some point of absolute certainty. He wrote, “I had to raze everything to the ground and begin again from the original foundations, if I wanted to establish anything firm and lasting in the sciences.” Descartes’s method was much imitated, not least by John Locke in his Essay Concerning Human Understanding (1690). In the prologue to the Essay, Locke explained that he and several friends had been unsuccessfully discussing “principles of morality and revealed religion” in the winter of 1670. He continued, “it came into my thoughts that we took a wrong course; and that before we set ourselves upon inquiries of that nature, it was necessary to examine our own abilities, and see what objects our understandings were, or were not, fitted to deal with.” Both men, using the doubt and observation of the new science, sought to discover the basis of human knowledge by an examination of the self. Yet there were differences in their assumptions wide enough to split those who came after them into two rival movements: Continental Rationalism, which followed Descartes, and British Empiricism, which followed Locke.

The differences between these two movements, as outlined by Thomas Reid in his An Inquiry into the Human Mind (1764), were twofold. First, Rationalists believed that there was a certain cluster of permanent and innate ideas or concepts: ideas of the self, causation, and infinite perfection. These innate ideas were forever intuitively known to reason. Empiricists disagreed, saying that every idea may be traced back to sensory experience or emotion. Second, Rationalists believed that truth may be properly and mathematically deduced from innate ideas along the same lines as a geometric proof. Induction from observation was the favored method of the Empiricists. The principal philosophers associated with Continental Rationalism were René Descartes, Baruch Spinoza, and G. W. Leibniz. British Empiricism claimed John Locke, George Berkeley, and David Hume.

Hume loved empiricism and hated rationalism. He, like Locke, followed the methodology of doubt espoused by Newton and Descartes. Yet, by the time he began his philosophical studies, doubt had begun to answer the question of certainty with ever-deepening skepticism. Locke taught that all ideas arise from experience, but left open the nature of the underlying substance that causes sensation. Thus, “sensitive knowledge” has no underlying definition. George Berkeley, seeing the weakness in Locke’s epistemology, or theory of knowledge, posited a solipsistic idealism in which the sensate world is but a field of mental impressions existing only in the individual mind. Philosophers Nicholas Malebranche and Pierre Bayle agreed; truth about the world is not as obvious as commonly supposed. Reason proved uncertain whereas skepticism proved reasonable.

Until recently, Hume was included in the ranks of the skeptics. Philosophy so demanded an answer to the “skeptical challenge” that portions of the Treatise were over-emphasized, destructive passages highlighted. Hume’s was then a negative voice, if not the negative voice. Twentieth-century scholarship has begun to redeem this image.

Hume is now understood as the first post-skeptical philosopher of the early modern period. He took for granted the doctrine of the skeptics rather than promulgating skepticism of his own. His is an attempt to find a way forward for philosophy given the skeptical situation. Certainly he forever retained, if not purified, the hallmarks of Locke’s empiricism. Yet, reform was needed. The fixed point of human nature, the mind, should be studied scientifically, just as Newton had done with natural phenomena. The result would be a “science of man,” a “solid foundation laid on experience and observation” which could then be extended to all other sciences.

Hume’s skeptical challenge may be understood as a change from ontology to psychology, or from a worldview based on hierarchies of being to one centered around pure anthropology. Descartes’s cogito, “I think, therefore I am,” beginning with the pronoun “I,” had already begun a turn to the subject. Yet, Descartes still relied on theistic arguments in order to complete his system. Hume did not assume religious categories, nor did he want to. They had already been addressed by the skeptics. His task, rather, was to understand how the mind works. Not what the mind is, but, instead, how it does what it does, always with an ear to the bar of common sense. It is this project which occupies the Enquiry.

Two main themes develop from the Enquiry as it pursues this goal. The first of these is positive, the second negative. The first exalts imagination and the passions; the second condemns religious reasoning, whether natural or revealed. The first is an examination of causation, the second a condemnation of religious excess.

Hume’s importance in the history of philosophy derives in some respect from his exegesis of causation. With it, Hume scored a victory for British Empiricism over the Continental Rationalists like Malebranche. The Rationalist, rightly equating both self-knowledge and a knowledge of the surrounding world with the necessity of causality, deemed it an innate idea on par with mathematical absoluteness. Hume, however, using empirical induction, saw only one sensation following upon another and called the relationship between them a psychological habit born of instinctive imagination.

When one particular species of event has always, in all instances, been conjoined with another, we make no longer any scruple of foretelling one upon the appearance of the other. . . . We then call that one object, Cause; the other, Effect. We suppose that there is some connection between them; some power in the one, by which it infallibly produces the other, and operates with the greatest certainty and strongest necessity.

What is left between isolated events is but probability; and of human nature, not a self but merely bundles of impressions. “[There is] nothing but a species of instinct or mechanical power that acts in us unknown to ourselves.”

Hume’s views on causation met a critical reception. Even close associates such as Henry Home and John Stewart disagreed with his diagnosis of the original quality of human nature. The most famous of his critics, however, is Immanuel Kant who, troubled into life by Hume, carefully corrected the Empiricist in his Critique of Pure Reason (1781) and Prolegomena (1783), writings that also had the effect of raising German interest in Hume’s theories.

The second theme of the Enquiry concerns Hume’s attack on both natural and revealed religion, specifically in the essays “Of Miracles” (contra revelation) and “Of a Particular Providence and of a Future State” (contra design; neither Hume nor his contemporaries were aware of the ontological argument). Hume disliked any form of reasoning that went beyond the limits of perception. His charge against religious dogma was that it was unreasonable. “Religious belief,” he wrote, “is a form of make-believe which … leads by degrees to dissimulation, fraud and falsehood.” It was therefore a subject which he could not ignore. By hobbling design with causation and revelation with an appeal to evidential common sense, Hume sought to show that “the cause or causes of order in the universe probably bear some remote analogy to the human mind.”

Hume’s hostility (often hidden by rhetorical device) garnered some of the earliest responses to his philosophy in close criticisms penned by clergymen both Anglican and otherwise. Significant among these were William Adams (1752), John Leland (1755), and George Campbell (1762). Hume was branded an atheist. The label cost him appointment to a chair of moral philosophy at the University of Edinburgh in 1745. Nevertheless, his religious critique earned him many accolades. His influence is discerned in the theologies of Immanuel Kant, Friedrich Schleiermacher, and Ludwig Feuerbach. It also earned him a permanent place in the philosophy of religion.

David Hume died on August 25, 1776, a few months after finishing the autobiographical “My Own Life” (a nod to friend Benjamin Franklin’s own autobiography, which Hume had read in manuscript). He was buried at Calton Burial Ground. In the last decade of his life, many philosophers engaged Hume’s ideas, including Thomas Reid and James Beattie (Essay on the Nature and Immutability of Truth (1770)). Despite the fact that he once wrote, “I cou’d cover the Floor of a large Room with Books and Pamphlets wrote against me,” his work is respected today both for its commitment to the tenets of empiricism and for its religious critiques. There is a growing dialogue, too, with Hume’s insights in the cognitive sciences, specifically in the work of W. V. O. Quine (Word and Object, 1960) and Richard Rorty.


Bibliography

Aiken, Henry D. Introduction. Dialogues Concerning Natural Religion. By David Hume. New York: Hafner Publishing Co., 1969.
Fieser, James. “David Hume (1711-1776): Writings on Religion.” The Internet Encyclopedia of Philosophy. 1 Aug. 2003.
---. “David Hume (1711-1776): Metaphysical and Epistemological Theories.” The Internet Encyclopedia of Philosophy. 1 Aug. 2003.
Hume, David. An Enquiry Concerning Human Understanding. Ed. Tom L. Beauchamp. Oxford: Oxford University Press, 1999.
Humer, James M. “Hume.” A Companion to the Philosophers. Ed. Robert L. Arrington. Oxford: Blackwell Publishers, Ltd., 1999. 309-318.
Kenny, Anthony. “British Philosophy in the Eighteenth Century.” A Brief History of Western Philosophy. Oxford: Blackwell Publishers, Ltd., 1998. 230-243.
Morris, William Edward. “David Hume.” 2003. Stanford Encyclopedia of Philosophy. 15 July 2003. Stanford University.
Mossner, E. The Life of David Hume. Oxford: Oxford University Press, 1980.
Norton, David Fate, ed. The Cambridge Companion to Hume. Cambridge: Cambridge University Press, 1993.
Noxon, J. Hume’s Philosophical Development. Oxford: Oxford University Press, 1973.
Smith, T.V. and Marjorie Grene. Philosophers Speak For Themselves: From Descartes to Locke. Phoenix Books P17. Chicago: University of Chicago Press, 1967.

(c) 2004 Thom Chittom
Barnes and Noble Classics

Friday, February 17, 2012

G. W. Leibniz. Discourse on Metaphysics

G. W. Leibniz's Discourse on Metaphysics is the olive branch of a mind at once seizing and being seized by the churning intellectualism of the seventeenth century. Gottfried Wilhelm Leibniz (1646-1716), forty at the time he penned this first mature outline of a universal metaphysic, was a man standing between the ideological armies of Cartesian modernism and Aristotelian conservatism. His Discourse sounds above the crush of these two ideologies and paves another way. In doing so, it has long outgrown its origins as a piece of private correspondence to inform thinkers across the disciplines of the present day.

The roots of Leibniz’s intellectual modernism trace to his youth. He was born July 1, 1646, at Leipzig, the son of a professor of moral philosophy. His mother also had university ties; her father was a professor of law. The violence of the Thirty Years’ War would subside in another two years’ time, yet Leibniz’s own intellectual ferment grew with age. Upon his father’s death, the six-year-old was allowed access to an expansive library. He taught himself enough Greek and Latin to “immerse myself in the historians and poets.” He absorbed the ideas of Plato, Herodotus, Aristotle, Cicero, Augustine, Aquinas, and many others, all without intellectual prejudice, valuing each author for his own contribution. It was a habit of synthesis he would carry with him throughout life.

As a young man, Leibniz’s democratic intelligence had great effect upon his career. He studied law and philosophy at the universities of Leipzig and Altdorf, yet declined a university post in favor of the broader opportunities of public service. (No doubt the cultural stagnation attending a protracted war made a professorship unappealing to the autodidact.) His employer was the Elector of Mainz. Leibniz’s flair for law, drawing diplomatically on both scholastic canon and scientific analysis, brought him success and opportunities for travel. One diplomatic mission took him to Paris in the spring of 1672.

Paris in the seventeenth century was the intellectual center of the world. Aristotelian scholasticism, traditionally dominant in the schools, was losing influence before the demonstrable results of the modern philosophers: Galileo, Torricelli, Cavalieri, Descartes, Pascal, and Hobbes. The substantial forms of the schoolmen were being poured into the new wineskins of geometry, mechanics, and motion. New solutions opened new problems to scrutiny: In a world of atomic flux, how can there be any continuity of matter? What is the relationship between contingency and necessity? Can we obtain a clear and distinct idea of things as they are in themselves by observing things as they appear to be?

In order to participate in the Parisian milieu, Leibniz had much to learn. His education had been Aristotelian, and he was a humanist in the Renaissance tradition; the modernism of Paris was entirely new to him. He worked hard to understand and embrace the truths of its observational methodology. He negotiated meetings with the philosophers Antoine Arnauld and Nicolas Malebranche (he would also visit Baruch Spinoza in Holland), and he read unpublished manuscripts by Pascal and Descartes. Yet it was the teaching of Christiaan Huygens that equipped him to practice modern philosophy.

Huygens’s mathematical instruction found fertile soil; Leibniz’s progress was quick. He was elected to the Royal Society of London in 1673, the same year his famous calculating machine was built. He suggested improvements in barometry, timekeeping, and the calculation of distance, as well as in mechanical devices such as pumps, carriages, gears, and lenses. By the time Leibniz returned to Hanover in 1676, his mastery of mathematics was so thorough that in 1675, at twenty-nine years of age, he had discovered the infinitesimal, or differential, calculus. Its dynamism formed the basis of all of Leibniz’s subsequent philosophical reasoning until his death on November 14, 1716.

Until 1675, there existed a type of curve, called a mechanical or transcendental curve, which defied geometrical analysis. The mechanical curve resembles the shape made by a sail in the wind. Cartesians judged the curve forever inexplicable to mathematics because its explanation would seem to require knowledge of all of its infinitely many points, a knowledge in extenso. Leibniz disagreed. The entire shape of the curve does not need to be given, he reasoned, only its geometrical generation. If the difference between each of its infinitesimally small points could be expressed geometrically, then the rule governing those differences would carry the whole curve. Expressing the generation of the curve geometrically, its “construction principle,” is tantamount to expressing the curve itself, without the requirement of in extenso understanding. All that is needed to plot an infinitely complex shape is a foundational measurement and a dynamic principle of continuity based on that measurement.
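Leibniz’s insight can be restated in modern notation (an anachronistic illustration, not his own formulation): a rule for the curve’s infinitesimal differences, together with a single starting value, determines the entire curve without enumerating its points.

```latex
% A curve need not be known point-by-point (in extenso).
% Suppose the slope at each point obeys a local rule:
\frac{dy}{dx} = f(x)
% Then the whole curve is recovered from one initial value:
y(x) = y(x_0) + \int_{x_0}^{x} f(t)\,dt
```

Here the local rule $f$ plays the role of the “construction principle,” the dynamic principle of continuity, while $y(x_0)$ is the foundational measurement.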

Leibniz’s mathematical insight was, in reality, an application of his rationalist understanding of knowledge. Throughout his educational life, he envisioned a universal characteristic language, an algebraic mathematics of logic (ars characteristica) by which all knowledge could not only be explained and compared, but ultimately constructed. Descartes had considered such a thing, but dismissed it since such a language would depend upon a true and absolute, if not omniscient, understanding. Leibniz, again, disagreed. Mathematics, he said, works by the dynamic manipulation of a handful of algebraic symbols. The sum of all equations is not needed to make mathematical advances. It was the same solution he proposed for the problem of the transcendental curve.

Yet this application of thought traced far larger horizons. This was a worldview, a cosmology. The universal characteristic was a logos, a ratio of the universe, a logic so large that it could only be called a metaphysic. Leibniz’s universal characteristic required a metaphysical basis which would explain not only the change and extension of substances (if not a re-definition of substance itself), but also the contingency and necessity of God and of humanity. It would be a unique synthesis of truths from both Aristotelian scholasticism and Cartesian modernism. He appended descriptive abstracts of his metaphysic to his correspondence with Arnauld in 1686 (these are today incorporated as sectional summaries). The whole was also prepared for publication, though it was not printed in his lifetime. It is known today as the Discourse on Metaphysics.

There are three principal aspects to the Discourse, all of which serve to make it a metaphysic in the style of his universal characteristic: first, existence; then, substance; and, lastly, proportion, or how these substances work together.

“God is an absolutely perfect being,” he writes, “whence it follows that God . . . acts in the most perfect manner.” Leibniz was not known for his religiosity, but he followed his age in grounding his universal metaphysic in the nature of God. Descartes had done the same in the Meditations on First Philosophy (1641), as had the medieval scholastics. Substances must have a reason for existing in the infinite variety he conceived. Therefore, God must exist and must have a reason for making the world as it is. Leibniz found the answer in the perfection of God, which results in a world made as perfectly as possible. He expressed perfection through a variety/simplicity criterion, in which the best solution finds geometric expression in the balance between the complexity of a figure and the simplicity of its informing equation: “God . . . has chosen the most perfect [method of creation], that is to say the one which is at the same time the simplest in hypothesis and the richest in phenomena.” Voltaire later ridiculed this belief that ours is the best of all possible worlds through the character of Dr. Pangloss in Candide (1759), yet Leibniz’s variety/simplicity criterion is understandable given the cosmological philosophy of the time.

The perfection of God makes simple substances a metaphysical, and not just philosophical, assertion. God, from innumerable possible worlds, actualizes the substances that together express the most perfect world. There are no atoms, no vacuum, only an infinity of simple substances. These substances, also called “ideas” and “souls,” had not yet achieved the true idealism that would characterize their expression in Leibniz’s more mature Monadology (1714). Yet the conceptual foundation was already there in the concept of expression and the identity of indiscernibles.

To define these terms, it is helpful to reconsider the mechanical curve which Leibniz explained using the differential calculus. Each of the curve’s infinite number of points virtually contains the entire curve, though the curve itself stretches infinitely into space. This point-by-point containment corresponds to the concept of expression. Each substance in Leibniz’s metaphysic is entirely complete in itself. It contains in itself all that defines it: past, present, and future. Its relationships to other substances are also contained in its concept. Every substance uniquely reflects the universe from its own position in relation to the whole, a windowless world unto itself.

Such uniqueness is what is meant by the identity of indiscernibles. Leibniz’s epistemology defined a true expression as one whose subject contains all that may be predicated of it. A subject is therefore absolutely complete in itself; it contains all of its possible predications. This means that no two substances are completely identical. The identity of indiscernibles was informed both by the pre-formationist biology of the day and by the recent invention of the microscope.

Having laid the foundation of his metaphysics, Leibniz now had only to put the machine in motion, a process he understood as a function of God’s harmonious interaction with the universe. Leibniz called the interaction of substances a pre-established harmony. The system that results is, as he had hoped, one of immense difference and organic dependency.

That is not to say there are no internal problems created by the system. His effort to defend his metaphysic against Arnauld’s charge of determinism attests to that. Indeed, Leibniz’s effort to make peace between necessity and contingency within the same metaphysic makes the Discourse a classic text on the subject. Scholars still debate whether freedom is possible in his system, constrained as it is by the completeness and resulting necessity of each substance.

On the other hand, scholars are finding much to be learned from Leibniz’s philosophy. Many of his ideas, never properly assembled and scattered through thousands of pages of correspondence (luckily, Leibniz, who served as chief librarian in the court of Hanover and was once offered a position at the Vatican, collected his correspondence), are today being published for the first time. Three volumes of the Prussian Academy’s forty-volume critical edition are available, and Leibniz’s contribution to the history of ideas is being reassessed. The organic and dynamic relationship he saw in all things informs many disciplines, including the modern metaphysics of modality, phenomenology, symbolic logic, the desire of physics for its own universal characteristic, and even Chinese studies (Leibniz made a careful analysis of the I Ching and of Chinese political science). He has also been called the father of supercomputers, relay switches, and virtual reality. It is no wonder that Alfred North Whitehead counted Leibniz among the thinkers of his “century of genius.”

So then, G. W. Leibniz’s Discourse on Metaphysics is an olive branch arguing for the related dependency of all ideas upon each other, regardless of their source. To dismiss his system is to dismiss the consistency of the human mind and the possibility of intellectual progress. Reading the Discourse is therefore both a tour and a possibility: a tour because it introduces its reader to the basic problems and ideas that informed the early Enlightenment; a possibility because those same ideas ultimately create and critique the philosophical story of today.

Bibliography

Writings

Leibniz, G. W. Sämtliche Schriften und Briefe. Ed. German Academy of Sciences. 40 vols. Darmstadt and Berlin: Akademie Verlag, 1923- .
---. Die Philosophischen Schriften von G. W. Leibniz. 7 vols. Ed. C. I. Gerhardt. Berlin: Weidmann, 1875-90.
---. Philosophical Papers and Letters, 2nd ed. Ed. and trans. L. E. Loemker. Dordrecht: Reidel, 1969.
---. Leibniz: Philosophical Writings. Ed. and trans. G. H. R. Parkinson. London: Dent, 1973.
---. New Essays on Human Understanding. Ed. and trans. P. Remnant and J. Bennett. Cambridge: Cambridge University Press, 1981.
---. Theodicy. Trans. E. M. Huggard. LaSalle, IL: Open Court, 1985.
---. G. W. Leibniz: Philosophical Essays. Ed. and trans. R. Ariew and D. Garber. Indianapolis: Hackett, 1989.
---. Leibniz: Monadology and Other Philosophical Essays. Trans. Paul Schrecker and Anne Martin Schrecker. Library of Liberal Arts. Indianapolis: Bobbs-Merrill, 1965.
---. Leibniz: Selections. Ed. Philip P. Wiener. New York: Charles Scribner's Sons, 1951.

Further Reading

Bobro, Marc, and Kenneth Clatterbaugh. “Unpacking the Monad: Leibniz’s Theory of Causality.” Monist 79.3 (1996): 408.
Broad, C. D. Leibniz: an Introduction. Cambridge: Cambridge University Press, 1975.
Brown, Stuart. “Leibniz and the Classical Tradition.” International Journal of the Classical Tradition 2.1 (1995): 68.
Hooker, M., ed. Leibniz: Critical and Interpretive Essays. Minneapolis: University of Minnesota Press, 1982.
Jolley, Nicholas, ed. The Cambridge Companion to Leibniz. Cambridge: Cambridge University Press, 1995.
Jolley, Nicholas. “Leibniz.” A Companion to the Philosophers. Ed. Robert L. Arrington. Malden, MA: Blackwell, 1999. 360-366.
Mates, B. The Philosophy of Leibniz: Metaphysics and Language. New York: Oxford University Press, 1986.
Riley, Patrick. Leibniz’ Universal Jurisprudence: Justice as the Charity of the Wise. Cambridge, MA: Harvard University Press, 1996.
Sesonske, Alexander. “Pre-Established Harmony and Other Comic Strategies.” Journal of Aesthetics & Art Criticism 55.3 (1997): 253.
Steinhart, Eric. “Leibniz’s Palace of the Fates: A Seventeenth-Century Virtual Reality System.” Presence: Teleoperators & Virtual Environments 6.1 (1997): 133.
The Monist 81.4 (1998) is entirely devoted to various aspects of Leibniz’s metaphysics.

Web Sites

http://www.uh.edu/~gbrown/start.html Gregory Brown’s Leibniz homepage. Brown’s website is an expansive portal to all things Leibniz. It includes links to the Leibniz Listserv, various societies and journals as well as information in the form of journal and encyclopedia articles.

http://www.maths.tcd.ie/pub/HistMath/People/Leibniz/RouseBall/RB_Leibnitz.html An article on Leibniz’s mathematical ideas, part of a collection of mathematical biographies made available online.

http://mally.stanford.edu/leibniz.html This Stanford University website is Leibniz-at-a-glance, including dates of authorship for his principal works, events in his life, and a short bibliography for further reading which focuses specifically on his conceptual philosophy.

http://www.peirce.org/writings/p119.html An online publication of the article “How to Make Our Ideas Clear” by Charles S. Peirce, reprinted from Popular Science Monthly 12 (Jan. 1878): 286-302. The article is concerned with logic in general and the contributions made by Descartes and Leibniz in particular.

http://etext.leeds.ac.uk/leibniz/leibniz.htm An online publication of George MacDonald Ross’s book, Leibniz (Oxford University Press (Past Masters) 1984) made available through the University of Leeds Electronic Text Centre. A comprehensive and understandable study of Leibniz’s thought, with special attention given to the mathematical basis of his insights in philosophy and logic.

http://www.tulane.edu/~plodge/lzlinks/ Leibniz Links: A selection of web pages devoted to the philosophy of G. W. Leibniz compiled by Paul Lodge.

http://www.philosopher.org.uk/ Philosophy Since the Enlightenment by Roger Jones. This website is very helpful for seeing Leibniz’s philosophy against the philosophical spectrum of present and past.
(c) 2002 Thom Chittom
Printed by permission, Barnes & Noble World Digital Library

Putting first things first

Dear Theophilus,

Thank you for writing, and for sharing your struggles and doubts so openly. Let’s be honest: being a Christian is an ongoing negotiation. To confess is to wrestle, to discover as we live: questions, crosses, and, thanks be to God, worship. To confess is to wrestle and not turn away.

As you've talked here with me and with others, I see that you are wrestling and that you are not turning away from your questions--and yours are good questions. You say you feel foolish: "my ignorance must shine out to those who are better educated." Actually, what shines out is courage and tenacity, and to that I respond that "no temptation has come upon you that is not common to all," whether formally educated or not. Remember that some come to a question early in the Christian life which may only confront others later, or not at all. A PhD in theology may, because of her own story, find herself wrestling in an area that a teenage parishioner has already come to terms with. And daily life may cause that faithful teen to revisit her question in greater depth later--questions moving up and over and around each other in a giant ascending corkscrew whose steps go from one glory to the next.

Now before we address some of the questions you mention below, there is something you should well consider. This formation or corkscrew or ladder or path we find ourselves on, this is a theological path, not a philosophical one. The difference is important.

Philosophy is the effort human beings make to inquire about the world using all available skills--mental and physical. It is a discipline that in our era is a pursuit of reason and the mind. Theology, on the other hand, has an element of life and death to it, in that we are personally, ethically, socially, morally, physically involved in its questions and its answers because theology is done coram Deo, in the presence of God.

The theologian can do philosophy, but the philosopher cannot do theology. A philosopher can ask so-called theological questions, but only as a scholastic exercise. Theology bleeds. Said another way, a philosopher can play with theology but doesn't fear for his life. A theologian (and all Christians are theologians, because all wrestle) cannot afford such a luxury. "The fear of the Lord is the beginning of wisdom," says the proverb. And as Martin Luther said in his Heidelberg Disputation, "Arrogance cannot be avoided or true hope be present unless the judgment of condemnation is feared in every work."

The philosopher cannot ascend to heaven. Human reason cannot of itself and upon its own presuppositions arrive at worship of the triune God. "Where is the wise man? Where is the scholar? Where is the philosopher of this age? Has not God made foolish the wisdom of the world" (1 Cor 1.20)? And Luther again: "Man must utterly despair of his own ability before he is prepared to receive the grace of Christ." Luther had a wonderful phrase for philosophers who attempted through their dialectic to grasp at theological knowledge. He called them theologians of glory and said they were trying to climb up and get a peek at the naked God.

Over against them, Luther taught a theology of the cross. "He deserves to be called a theologian," he wrote, "who comprehends the visible and manifest things of God seen through suffering and the cross." Jesus crucified is who God is, he maintained. Jesus is the beginning of theology and the very true proclamation of God's truth. But it is a truth whose truthfulness is confessed rather than arrived at.

Yet how can someone confess what he or she is unsure of? One may as well ask, "How can a man be born when he is old? Can he enter a second time into his mother's womb?" And the answer is the same: "That which is born of Spirit is spirit." Our minds, our selves, were at one point chaotic, dark, and demonic. But then the Spirit came, hovering upon us, overshadowing us, and we came alive and confessed. Now we see by resurrection light, and no longer with the sputtering candle of philosophy. Light is now ours, the true light that gives light to everyone. "For God, who said, 'Let light shine out of darkness,' has shone in our hearts to give the light of the knowledge of the glory of God in the face of Jesus Christ" (2 Cor 4.6).

Now the point I'm getting at is this. Our friend is correct who said that the historicity of Scripture is an important part of the confession of the Christian, giving him confidence and an answer to cultured despisers. But historicity is not the foundation. The foundation is that by the will of the Father and the quickening illumination of the Spirit one has beheld Jesus, and having beheld him, nothing else will ever truly matter again. "The [way, the truth, and the] life was made manifest, and we have seen it, and testify to it and proclaim to you the eternal life, which was with the Father and was made manifest to us" (1 Jn 1.2). And now "the Spirit himself bears witness with our spirit that we are children of God" (Rm 8.16).

This “bearing witness with our spirits that we are children of God” means that, beholding the cross, we are called beyond the doubt and skepticism of the philosophers to embrace a holy inquiry. This holy inquiry is called by St. Anselm fides quaerens intellectum, or faith seeking understanding. Augustine put it this way: "Understanding is the reward of faith. Therefore do not seek to understand in order to believe, but believe so that you may understand." He then quotes John 6:29: "This is the work of God, that you believe in him whom he has sent.”

So, Theophilus my friend, if you long to know without doubt that there is an incredibly loving, interactive God who created you, cares for you, understands you, and encourages you to know him, then examine your heart. And if the Spirit does not testify within you so that you cry "Abba, Father!" then call upon the name of the Lord. He will give you a new heart, and put a new spirit within you. He removes the heart of stone and gives back again a heart of flesh. It is the good pleasure of the Father in heaven to give the Holy Spirit to those who ask him. This is the Spirit of him who raised Jesus from the dead, the Spirit of wisdom and of revelation. As you seek, remember that God is light; in him is no darkness at all. His presence will go with you, his wisdom instruct you, and he will give you rest.

There is no confession apart from his rest, and what theology exists is only a theology of glory which calls evil good and good evil, adding sin to sin so that one becomes doubly guilty. Therefore the struggle with which we began is not the struggle of a slave, but of an heir. And it is not a struggle of the solitary, but the awestruck, happy liturgy of the redeemed people of God ascending as incense from every tribe, tongue, time, and nation to the throne of God.

This, then, is the proper foundation to begin asking about the Scriptures and their interpretation. Without it, we fall under the condemnation of the Pharisees to whom Jesus said, “You search the Scriptures because you think that in them you have eternal life; and it is they that bear witness about me, yet you refuse to come to me that you may have life” (Jn 5.39-40).

I hope you do not feel I am dismissing your questions by asking you first this most important of all questions. That is not the case. Instead, by asking I face you open-handed as seriously and as soberly as I can.

Warmest regards,