Friday, September 05, 2014

What Jody is After

So jumping right in for the initiated: What is Jody after? What is his point? (And I may have to lump my cousin in as well.)

Examining his statements carefully, here's my restatement. He'll have to tell me if I'm off the mark or not.

Jody cares because he wants liturgy to matter.

He says that the liturgy matters because we are material beings who worship. Worship requires a grammar because it communicates meaning. This grammar is not meaningless, but communicates the values of its culture--the church, the family of God. The church is the family that God has called into being, he says, and it is the subsequent sacrament of the first sacrament, Christ.

This grammar isn't concerned with what things are; it is concerned with what things are for. It is a passive sentence that tells the story of an active God. In the performance of its grammar the community confesses God's acts and interprets what they mean for itself. Misconfessing this at the very least spoils the message and at most is blasphemous!

The words and prayers of the church are traditions that effect real change. Blessing is a way the grammar sets something apart to be what it is for. Ordination is a way the grammar sets someone apart to play a role in the corporate performance. And there is also a reordering of the thing's identity, where identity is ontologically dependent upon use.

In baptism and ordination, this ontological change is called an indelible mark, which changes the individual to reorient them toward Christ. It doesn't mean they are super-people, but it is a testimony to the activity of the Holy Spirit.

The Spirit and the incarnation make all of this more than a (Wittgensteinian) parlor game. The incarnation, because matter is rightfully part of the divine plan: it is useful and necessary. The Spirit, because it effects the creation of world and word and so gives direction and structure to the community itself. (This last paragraph is a muddled mess of theological gobbledygook.)

At this point, Jody gestures toward "the power of language to reorient and redefine." We're both boxing at Wittgenstein's speech-acts now.

At any rate, I think he’s hearing me say that liturgy does not matter. That’s not what I’m saying. I’m saying that the liturgy is a family custom, and like all family customs, it occasionally does things that are a little crazy--but that’s okay.

I'll add more to this post--this is just half-written. It is a draft to get something out there.

Thursday, July 11, 2013

Give up the myth of human progress?

“Clearly science and technology have put extraordinary knowledge and power at the command of beings who come into the world with the same brains and mental faculties as humans born 5,000 years ago. Any victory over our species’ destructive tendencies will likewise have to come from institutional and cultural development. We know what humans are capable of: in the wrong circumstances and with the wrong formation, they can behave monstrously. The hope for progress can consist only in the belief that there is some form of collective human life in which the capacity for barbarism will rarely find expression, and in which humans’ creative and cooperative potential can be realized without hindrance. [John] Gray regards such hope as utopian, but it can be supported both by experience and by reflection. Moral and political progress is inevitably more difficult than scientific progress, since it cannot occur in the minds of a few experts but must be realized in the collective lives of millions; but it does happen. Experience shows that some societies are much more decent than others, and that in fits and starts, cruelty, oppression and discrimination have become on balance less acceptable over time.” Thomas Nagel. NYT. 7/5/13.

In this quote, Thomas Nagel defends the myth of human progress. And I wonder, reading it, whether the Holocaust is not made more horrible by its deep challenge to this myth. The German people were (and still are) shining examples of the best of Western civilization--and evil still took and used them, without interference, to accomplish in reality the things that otherwise exist only in our unspoken nightmares.

I've been thinking off and on about this myth of human progress. I love this myth. Believing it gives my life and my own civilization meaning and purpose. I am at the crest of thousands of years of progress. I am better than my forebears. And those who come after will be better, happier, and healthier than me. Their children would be geniuses beside my age. And yet, let's face it, this is a thoroughly secular eschatology built on a highly selective reading of the facts.

In short, the Judeo-Christian worldview should have none of it. It is a kind of antichrist. One of those lies that hold down the truth.

On the other hand, can someone just give up the myth of human progress? Can someone simply rewire the operating system they imbibed with their mother's milk?

Thursday, April 18, 2013

Hacking Language Acquisition: Some Thoughts.

What is language? How are languages constructed? And what are the best ways to quickly learn a language? These and similar questions have been rolling around in my mind over the last few months.

It all started because I have been teaching Greek to some dedicated friends. Now if you know anything about classical languages, you know that many people every year take these languages, such as Greek, Hebrew, and Latin, in graduate school as prerequisites for further study. And you may also know that after graduation the rate at which people retain, let alone enjoy, these languages is not good. My friends and I are putting a lot of effort into learning Greek. I don't want them to become a statistic--and, frankly, I don't want to become one either. So I ask the question, "What does it take to escape the attrition trap and break through into enjoying, and so sustaining and even growing in, a language?"

I work a job, so I don't have a lot of time for deep reading and research about this. But here are a few things I've learned.

Immanuel Kant was a Linguist

According to Immanuel Kant, human beings construct the world along two planes: extension and change, in other words, along space and time.

  1. The noun system captures space
  2. The tense system captures time

Both employ the same method to do so.

Beginning with a root, the language adds or removes prefixes and suffixes to fix that root within a matrix that assigns it jobs--being a direct object or a subject, for example.

We memorize a lexical form (lemma) of a word, but understand that the lemma really exists as a root that can manifest anywhere along a matrix.

Understood in this way, there is a deep repetition, a strategic recurrence, between the verbal and noun systems. Furthermore, taken together they satisfy Kant's requirements for building the kind of world that human beings experience (an appropriately phenomenological world).
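
To make the matrix idea concrete, here is a toy sketch in Python. The root and endings are a simplified sample from the Greek second declension (log-, as in logos, "word"), not a full paradigm; take it as an illustration of "root plus matrix" and nothing more.

    # A toy of the "root within a matrix" idea. Endings and labels are
    # an illustrative sample, not a complete paradigm.
    CASE_ENDINGS = {
        ("nominative", "singular"): "os",   # the subject slot
        ("genitive",   "singular"): "ou",   # possession, "of"
        ("dative",     "singular"): "ō",    # indirect object
        ("accusative", "singular"): "on",   # direct object
        ("nominative", "plural"):   "oi",
        ("accusative", "plural"):   "ous",
    }

    def inflect(root: str, case: str, number: str) -> str:
        """Fix a bare root at one point in the case/number matrix."""
        return root + CASE_ENDINGS[(case, number)]

    for (case, number) in CASE_ENDINGS:
        print(f"{case:<10} {number:<8} -> {inflect('log', case, number)}")

One lemma, six surface forms: the matrix, not the dictionary entry, is doing the work.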

With this in mind, one can begin to interrogate a language and ask why this matrix was chosen and not another. Why five or seven cases and not thirteen or twenty-one?

Open- and Closed-Class Words

Linguists parse the words of a language into two categories: open-class and closed-class. Open-class words are the nouns, adjectives, adverbs, and verbs that we usually associate with language. Open-class words are also called content words or lexical words. These are the words that carry the meaning in sentences, and it is interesting that new words coming into a language are always open-class words (thus the "open").

Closed-class words, also called function or grammatical words, are things like determiners, qualifiers, prepositions, conjunctions, and intensifiers. They serve a variety of functions, as their names suggest. They do not, themselves, carry meaning in the way that content words do. Instead, they serve to grammatically connect the open-class words. There are far fewer closed-class words in a language than open-class words--by orders of magnitude. And, here's another kicker: unlike open-class words, whose ranks may be swelled by coinage, invention, or wholesale borrowing from another language, closed-class words are stubbornly fixed. Languages have all the function words that they need, and it is nearly impossible to delete or add to their number, even when it would be useful to do so.

Closed-class function words, then, are the inner skeleton upon which the open-class content of the language is attached. In short, if you are going to learn a new language, get to know the function words well, and become adept at watching for them when reading. "Once the framework of grammar has been transferred to long-term memory," says author Tim Ferriss, "acquiring vocabulary is a simple process of proper spaced repetition." I've read that an ESL teacher, for example, should aim at the perception of the structure of a text before the individual words. So by zeroing in on function words, one will learn more about a target language than one would through memorizing hundreds of content words.
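
Here is a minimal sketch, in Python, of what zeroing in on function words might look like. The stoplist is a tiny invented sample, not a real inventory of English closed-class words.

    # Separate a sentence into its grammatical skeleton (function words)
    # and the content that hangs on it. The stoplist is illustrative only.
    FUNCTION_WORDS = {
        "the", "a", "an", "of", "in", "on", "to", "and", "but",
        "is", "are", "was", "were", "that", "this", "with", "for",
    }

    def split_classes(sentence: str):
        words = sentence.lower().replace(",", "").replace(".", "").split()
        skeleton = [w for w in words if w in FUNCTION_WORDS]
        content = [w for w in words if w not in FUNCTION_WORDS]
        return skeleton, content

    skeleton, content = split_classes("The book is on the table.")
    print("function words:", skeleton)  # ['the', 'is', 'on', 'the']
    print("content words: ", content)   # ['book', 'table']

Even this crude split shows the proportions: four closed-class words carrying the structure, two open-class words carrying the freight.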

". . . the task facing the child is not to learn how language works, starting from scratch. Instead, since children are born with an implicit knowledge of languages in general, they have to figure out how the particular language (or languages) they hear functions. For example, all languages have something like prepositions, words that show relationships among things (The book is on the table). In languages like English, these words that show position come in front of the noun, so they are called prepositions. In other languages, these words follow the noun, so in those languages, a child would encounter sentences with this pattern: The book is the table on.In such languages,these words are called postpositions because they come after (post), not before (pre)." [1]

An Interesting Note from a Translator

"A translation is not just turning one language into another. It’s also about opening up a foreign mindset . . . to hear the text and experience it absolutely as intensely as I can, allowing myself to fall into its way of thinking about things. A good translator has to be an interested sponge when it comes to the idiom and cultural setting of the language he or she is translating from [--] fascinated by the picayune details of language. Every complex translation would be somewhat different if we had done it a month before, or a month later, or even an hour."[2]

Benny the Irish Polyglot Says to Ditch the English

One of the biggest lessons Benny learned in his transformation into a polyglot was to give up his English as quickly as possible. Be stupid, he says. Make mistakes. Leave your comfort language behind and accept the frustrating, bewildering, but acquisition-fast life beyond it.[3]

Now, where dead languages are concerned, you can't just avoid your English-speaking expat friends and hang out with the locals. What you can do, however, is to up your exposure and go cold-turkey on sections of text. For example, one could go native on the Gospel of Mark or on a chapter of the Gallic Wars or something.

Tangentially, get away from code-thinking as soon as possible. A new language is not just a code for your old one. Stop occasionally and forget your native tongue while holding the new language in stasis like wine held on the tongue. Enjoy and parse out the sensation before swallowing.

A Large Bit on the Importance of Reading

"Studies of vocabulary development through reading give further support to the claim that most vocabulary is acquired. Anderson and Nagy carried out a series of studies on how children acquire words during reading. They found, for example, that there is about a one in twenty chance that a student will acquire a new word from seeing it in context. . . . If students see a word more often, they are more likely to acquire the word.Anderson and Nagy report that the average fifth grader reads for about twenty-five minutes a day. They comment "This number is certainly lower than would be desired, but it translates into about a million words of text covered in a year." If even 2 percent of the words were unfamiliar, students would encounter twenty thousand new words in a year. If they acquired one out of every twenty, they would acquire at least one thousand words a year.

"These authors go on to say, "An avid reader might spend an hour or two a day reading, and thus cover four or more times as much text. The rate of learning from context for self-selected text is likely to be closer to one unfamiliar word in ten than one in twenty. For children who do a fair amount of independent reading, then, natural learning could easily lead to the acquisition of five to ten thousand words a year, and thus account for the bulk of their annual vocabulary growth."

. . . .

"A study carried out with adult speakers of English and students learning ESL showed that both groups were able to define many new vocabulary words just from reading a novel.

. . . . .

"[Another researcher] found that picking up words from reading is ten times faster than learning words through intensive vocabulary instruction. However, they also suggest that some vocabulary study can be useful. They encourage teachers to develop a sense of what they call "word consciousness." "We believe that the goal of instruction should be to develop what one lexiphile has termed word consciousness."

. . . .

"One of the benefits of acquiring vocabulary through reading is that students develop a more complete understanding than the superficial knowledge gained by memorizing a definition. . . . When students see and hear a word in different contexts, they build a subconscious understanding of that word. Extensive reading is the best way for students to build a rich vocabulary." [4]

Lend Language Your Ears

Human beings use air, their vocal cords, and the contraction and expansion of their oral and nasal cavities to express or suppress sounds. The Phoenicians were the first to begin writing down sounds phonetically rather than resorting to picture language. For every language written this way, which is most of them, writing is therefore a kind of shorthand for the contraction or relaxation of organic sound production. Writing isn't where meaning lives. Writing tells you how to move your body to make the kind of sounds that produce meaning. Generally speaking: language is an oral thing. When you look at a page of English or French or Latin or German, you are seeing instructions for producing the sort of sounds that a community of speakers agreed on. What this means is that when learning a new language, you need to keep your mouth and ears involved. Never read silently, and listen to as much as possible.

Set Short-Term Goals

One thing that Tim Ferriss and Benny the Irish Polyglot talk about is not dying at the hands of language perfection. Language courses tend to teach against an ideal of perfection, and this, they say, kills motivation. Language is the way that minds connect with minds. It is the tool for making connections between people and cultures. So learn what you must to get to the point where you can start connecting, and let the rest take care of itself. That's what they would say.

This is not as applicable where classical languages are concerned, but it is still helpful. If the point is to make connection, then read for meaning first before you read for parsed perfection.

Ferriss and Benny talk about setting smaller short-term goals: to carry on a two-minute conversation, to order coffee, to read a weather report, to read a chapter of Plato without reference to a lexicon.

And finally, I'm thinking about the idea of micro-grammars within languages right now. It is one thing to learn how to say hello or to construct a simple sentence, but the conversation dies quickly thereafter for lack of a micro-grammar. Micro-grammar is a term I use to talk about spheres of language: the weather, sports, the office, family life, religion, what's new in politics. Native speakers move from micro-grammar to micro-grammar as easily as moving from room to room, but even in a native language there are times when one must acquire new words for a new environment (the micro-grammar of your city or neighborhood). I'm not sure yet how best to incorporate this into language acquisition, but when I look at various children's books I can see that they recognize the existence of such grammars and take steps to teach words accordingly.
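
One speculative way to make the idea operational--this is only a sketch, and the spheres and word lists are invented--is to treat each micro-grammar as a word set and score a text by how much of each sphere it touches.

    # Score a text against a few toy micro-grammars. Word lists invented.
    MICRO_GRAMMARS = {
        "weather": {"rain", "cloud", "sun", "cold", "forecast", "wind"},
        "office":  {"meeting", "deadline", "report", "email", "boss"},
        "family":  {"mother", "father", "sister", "dinner", "home"},
    }

    def coverage(text: str) -> dict:
        """Fraction of each sphere's vocabulary appearing in the text."""
        words = set(text.lower().replace(",", "").replace(".", "").split())
        return {sphere: len(words & vocab) / len(vocab)
                for sphere, vocab in MICRO_GRAMMARS.items()}

    print(coverage("The forecast says rain and wind, so dinner at home."))
    # {'weather': 0.5, 'office': 0.0, 'family': 0.4}

A learner's reading could then be steered toward whichever room of the house she has not yet furnished.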

The Two-Face Technique

From the beginning, I have thought about language acquisition using the metaphor of climbing Mount Everest. There are a number of reasons for this. The effort and focus that climbers display even years before their attempt. The social, material, and physical expense and exertion, if not pain, required to successfully summit the mountain. The way that climbing Everest has become a well-understood and apportioned process of having such-and-such gear and moving up the mountain through established camps. The fact that hundreds if not thousands of people make an attempt every year (you aren't any different from them).

Now one element of the metaphor that has become very useful is the difference between Everest's south and north faces. The south face is (with qualifications) the easier of the two; it was the way Hillary and Norgay made the summit in 1953. The north face, however, is a beast.

For my purposes, each face represents a technique of language acquisition. The north face represents what we usually think of when we think of learning a language: rote memorization and paradigms. The raw violence of making our minds sink new synapses into new patterns unattached to any other familiar information. The south face represents the way native speakers learned their language. South face techniques are reading aloud and reading a lot. They are fun and easy, and in my experience they charge a session with energy and life. North face exercises feel like work. South face is just fun.

The trouble, of course, is that south face takes a good long while and a lot of exposure--far more than we'd achieve even through a course of immersion. (Non-native speakers tend to achieve a homeostasis of "good enough," which is why I say even immersion is not sufficient.) Few people have the time or patience for such an approach. On the other hand, north face is not so great either. I already mentioned the abysmal rates of attrition by graduates who have taken even years of a language in formal instruction. So what to do?

My hypothesis is that good language work needs both faces, delivered in appropriate amounts. Overall, south face activities are best, but north face activities should be used to turn the dial faster. You swallow a paradigm or construction quickly, via north face, and that reinforces your south face work and makes it more capable. The resulting success pumps endorphins into the whole and keeps the arrow of acquisition moving forward.
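
What might a north face drill look like in practice? Here is a minimal sketch of a Leitner-box scheduler in Python, one hypothetical way to swallow a paradigm quickly; the review schedule (box n comes due every 2**n sessions) is an assumption for illustration, not a recommendation.

    # A minimal Leitner box: right answers promote an item to a less
    # frequent box; misses send it back to box 0.
    class LeitnerDeck:
        def __init__(self, items, boxes=3):
            self.box = {item: 0 for item in items}  # all start in box 0
            self.boxes = boxes

        def due(self, session: int):
            """Box n comes due every 2**n sessions (an assumed schedule)."""
            return [i for i, b in self.box.items() if session % (2 ** b) == 0]

        def answer(self, item, correct: bool):
            self.box[item] = min(self.box[item] + 1, self.boxes - 1) if correct else 0

    deck = LeitnerDeck(["logos: word", "anthrōpos: man", "theos: God"])
    deck.answer("logos: word", correct=True)  # one success moves it up
    print(deck.due(session=1))                # ['anthrōpos: man', 'theos: God']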

Linearization or Discourse Up and Down

Dr. Steven Runge talks about something called the linearization problem. "Linearization describes the fact that we can only produce one word at a time, one sentence at a time," he says. That means that "the reader/hearer can only take in one word at a time, one sentence at a time." So the hearer or reader has to construct the architecture of meaning that's coming at her, and she gets one shot at it. How does she do it?

Runge says that she does it through two methods. First, she uses deictic or textual markers that help her structure the stream of information she is hearing or reading. These tell her what is more or less important, and whether the subject has changed or is going in a new direction. Citing the work of psychologist Walter Kintsch, Runge calls this method construction.

As it turns out, there is a lot of debate about construction. After all, if meaning is really found in the big structures of language, then why are so few of those handy markers present at that level, and why are so many found at the level of sentences? Runge says that it is a matter of debate whether meaning is top-down or bottom-up, but he thinks it is both.

What he's really fascinated by is how textual markers on the sentence level go on to build those big, discourse-level structures our reader uses for understanding. "Call me silly," he says, "but it would seem that if one has properly understood how a device operates in simplex context at the lower-levels, then one will be in a much better position to adequately describe its much more complex interaction with other features at the higher-levels of discourse processing, i.e. the integration stage." Discourse Analysis needs its more humble, lower-level cousin, discourse grammar. "Lower-level structures are the keystone to understanding higher-level structures."

The second method the hearer or reader uses to extract meaning is more contextual. Runge calls it integration.

The newly forming mental representation of a text doesn’t exist in an isolated silo of our brain. Instead Kintsch has demonstrated that we integrate the new one into our existing, larger mental representation. This integration is not simply with the earlier portion of what we’ve read or even other books we’ve read, but with the sum of our knowledge about the world and how it operates based on our prior learning and experiences. . . . Differences in background knowledge, goals, and presuppositions all play a role in how we process a text. We don’t just read a text, we also integrate it with what we already know.

Integration is why two people can read the same text and get completely different answers. They may both be doing the construction side at nearly the same level, but their integration is widely different, as are, no doubt, their life experiences. "Our own mental representation of the world . . . plays a huge role in how we process new texts or communication."

__________


[1] David E. Freeman and Yvonne S. Freeman, Essential Linguistics (Portsmouth, NH: Heinemann, 2004), 14.
[2] Dennis Abrams, "The Art of Translation: Something New, Something Old," Publishing Perspectives, http://publishingperspectives.com/2013/04/the-art-of-translation-something-new-something-old/, accessed April 2, 2013.
[3] Watch his interesting TEDx talk at http://youtu.be/HZqUeWshwMs
[4] Freeman and Freeman, Essential Linguistics.

Friday, November 02, 2012

The Three Options

Bluntly stated, there seem to be three options available to the publicly thinking Christian today. And by publicly thinking I mean the Christian who is interested in thinking outside of the ecclesiastical circle, the Christian interested in speaking in the public square.

The public square is always tended by a gatekeeper, and as I've said before, the gatekeeper in our day is Enlightenment secularism manifested culturally in the materialism of the hard sciences. This is to say that the public square and all conversation allowed in the public square is implicitly in dialogue or in open agreement with the philosophical assumptions of science.

So, then, the public Christian has three ways of speaking. First, he may fully adopt materialist presuppositions and compartmentalize his mind, so that he is a Christian in some spheres and a secularist in the rest. (A subset of this option: he may refuse to compartmentalize the two and live buffeted by the tension.) Second, he may retreat from the metaphysics of modernity into a neoclassical metaphysics. My hypothesis is that the Intelligent Design community is pursuing this option, and that this option fuels the success of its effort to revive the argument from design. Third and finally, he may adopt the very new metaphysics of process or emergence.

I believe that an orthodox confession can be made within any of these three strategies--and, for me, this is a very new idea! Using the metaphor of a car, where the orthodox confession is the body, the underlying metaphysic is its engine. One can swap out an engine and retain the integrity of the body (to greater and lesser degrees). And, stepping further out, my guess is that the publicly thinking Christian cannot remain neutral forever but will, by steps known or unknown, inevitably choose one of the above three options.

Final caveat: in the course of time more options may develop. These are the ones I'm aware of presently. The Radical Orthodoxy movement of ten years ago (to the present?) was, to my mind, a variant of option two. And it is demonstrable that process or emergence has yet to produce what could be called thick orthodoxy, or something that the Nicene Fathers might recognize and agree with. This, however, does not mean it cannot, but only that it has not at this point in its development.

Tuesday, October 23, 2012

Revival or the saints?

On Friday, September 17, 2010, Pope Benedict XVI addressed students throughout the United Kingdom during his apostolic visit. He could have spoken apologetically or enumerated the evils of modernity. He could have sounded a prophetic warning, or he could have just said something politically winning and nice. He certainly started that way: "First of all, I want to say how glad I am to be here with you today." No doubt not a few expected the rest to follow along in the same manner--but it didn't. The Pope had scarcely finished thanking everyone when, to my astonishment, he began talking about saints. "I hope that among those of you listening to me today there are some of the future saints of the twenty-first century." My astonishment was (and is) not that he talked about saints, but that he was inviting the students who heard him to become saints. He wasn't talking about St. Catherine of Siena (1347-1380), daughter of an Italian cloth dyer; he was appealing to a would-be St. Ashley Butler of Leeds (1992-20??), marketing major, soccer enthusiast, and daughter of a network administrator.

"Perhaps some of you have never thought about this before," he continued. "Perhaps some of you think being a saint is not for you. . . . When I invite you to become saints, I am asking you not to be content with second best. I am asking you not to pursue one limited goal and ignore all the others. [I am asking you to put those other goals in the context of true happiness, which ] is to be found in God.

"We need to have the courage to place our deepest hopes in God alone, not in money, in a career, in worldly success, or in our relationships with others, but in God. Only he can satisfy the deepest needs of our hearts. . . . God wants your friendship. And once you enter into friendship with God, everything in your life begins to change. As you come to know him better, you find you want to reflect something of his infinite goodness in your own life. You are attracted to the practice of virtue. You begin to see greed and selfishness and all the other sins for what they really are, destructive and dangerous tendencies that cause deep suffering and do great damage, and you want to avoid falling into that trap yourselves. You begin to feel compassion for people in difficulties and you are eager to do something to help them. You want to come to the aid of the poor and the hungry, you want to comfort the sorrowful, you want to be kind and generous. And once these things begin to matter to you, you are well on the way to becoming saints."

It is well known that Benedict XVI has dedicated his pontificate to combating the secularism of the West and the spiritual indifference of its people. His life has been a long exposure to modernity in all its forms. As a young man, he was well acquainted with Nazism, having nearly been a Hitler Youth before being drafted into the military. After the war, he became a student at the Ludwig-Maximilian University in Munich. There he studied Fyodor Dostoyevsky, Martin Heidegger, and Karl Jaspers and became a fan of the Roman Catholic neo-Kantian theologian Karl Rahner. He taught at Freising College and then at the universities of Bonn, Tübingen, and Regensburg, witnessing the social turmoil and youth uprisings of the 1960s from his professorship at Tübingen.

So given his exposure to modernity and the overtly stated agenda of his pontificate, Benedict XVI chose to talk to the youth of the West about sainthood.

Contrast this with another doctrine aimed at spiritual indifference and even secularism. Contrast it with the doctrine of spontaneous, regional, and supernatural revival developed best in the United States by the Reformed theologian Jonathan Edwards. A New England Puritan, Edwards witnessed the revival of 1734-35 from his post as minister in Northampton, Massachusetts. The First Great Awakening extended from Great Britain to the American Colonies during the 1730s and 40s. It was the result of dynamic preaching that urged hearers to repentance and personal renewal. It turned the face of frontier Protestant Christianity inward, away from corporate liturgy and doctrine and toward private emotion and personal response. People were encouraged to examine their lives, to repent from sin, and to commit themselves to moral living. Edwards wrote a book based on his experience of the Awakening, A Treatise Concerning Religious Affections, published in 1746.

In Religious Affections, Edwards constructs a method of self-examination for separating true religious experience from its counterfeit. New faith, he says, must be tested to see if it is genuine. Good works in themselves mean little. Edwards identifies twelve signs of true affection, and of them, love is the chief. "Love is the chief of the affections, and as it were the fountain of them." And, he continues, only spiritual men, that is, human beings awakened by the Holy Spirit, exhibit its fruits.

Since the First Great Awakening, evangelical churches have prayed for revival in the face of growing secular indifference. With Edwards they have understood revival to be a changing of the mind and emotions of each human being toward repentance and amendment of life. With Edwards they have understood this to be completely a work of the Holy Spirit. And, keeping the First and subsequent Awakenings in mind, they have understood the phenomenon of Spirit revival to be pneumatic, unexpected, and in some way geographically located. The revival tradition has taken on other characteristics since Edwards as well, especially under the influences of charismatic preachers like Billy Sunday and Billy Graham and the pentecostal movements of the twentieth century.

My experience of revival as it is practiced in evangelical churches today is pedestrian. Revival announced in an American church today means a jejune week of long evening services. There will be jeremiads against the evils of the culture and full-throated appeals toward repentance--hear the echoes of the tent meetings of Methodist perfectionism, of "praying through" to the "second blessing," of tongue-speaking and people slain in the Spirit. There will be prayer, fervent and pious prayer. And there will be appeals for revival made to God--where revival is like a spiritual thunderhead and the praying congregation Elijah kneeling on the dry mountain.

This post has gone much longer than expected, but here are two responses to the challenge posed by Western secularism: popular Protestant revivalism and Roman Catholic sainthood. There is room here, I think, for a compare and contrast that would lead to interesting conclusions. I hope I have laid a very little of the groundwork here--and a general work it has been. For myself, I think that the revival tradition makes little sense in its expectation of a falling of "latter rain" upon the citizens of [insert town, city, state, or country]. But there are commonalities between Benedict's challenge and the challenge made by Edwards and picked up in the evening preaching from revival pulpits--even if Protestants recoil from the papist grammar of "sainthood" while they (we) pray for private sanctification and public rejuvenation.

Tuesday, October 02, 2012

The argument from neglect: a good paragraph

Breaking my usual essay-only obsession to post a good paragraph from Philip Clayton and Steven Knapp, The Predicament of Belief: Science, Philosophy, and Faith (OUP, 2012).

"To the extent we think of God as a personal, active being, we inevitably apply [human moral] standards. Frankly, and I say this with the utmost reverence, the personal God does not pass the test of parental moral responsibility. If God is really personal in this way, then we must conclude that God has a morally abysmal record of inaction or ineffective action. This I shall call the argument from neglect . . . To meet this objection, a defender of personalistic theism has to do two things: first, show that there may be a good reason why a personal and active God, if there is one, either cannot or chooses not to perform the acts we would expect a benevolent God to perform; second, avoid what is in effect the reductio ad absurdum of constraining divine action so extensively that it becomes pointless or irrelevant. This chapter is devoted to addressing these two challenges." [Ooooh, that last sentence is like a spooky orchestra hit for theology nerds!] (56)

Thursday, May 10, 2012

Armchair Linguistics

There's been a lot of talk about language on the web recently with the publication of cultural anthropologist and linguist Daniel Everett's book Language: The Cultural Tool (Pantheon, 2012). In it, Everett argues that culture is the mother of language. It is a direct challenge to the dominance of Noam Chomsky's view that we are hard-wired for language.

The debate between Everett and Chomsky is an important and interesting one. Their two poles, culture and biology, mark out a linguistic field that serves my purpose. There are aspects of both views that appeal to me. I agree with Chomsky that language is irrevocably grounded in biology. And I agree with Everett that language is best seen as a tool. Language is a tool we use strategically to carry the freight of meaning from ourselves to others.

Over the years I have come up with my own linguistic rules of thumb. Untutored and unlettered, and lacking the sophistication of even an undergraduate in the field, I nevertheless publish them in the hope of future development, and for that reason I welcome your comments.

1. People mess things up through use

Languages slowly devolve toward grunts and clicks over time. Prepositions, for example, that used to denote fine shades of difference come to say pretty much the same thing. Similarly, verbs for actions that we do a lot, such as "bring" or "go" or "give" or "enter/exit", become linguistically complicated. The rococo nuances available to a language during an Elizabethan high point erode into simpler modes--shorter sentences, more practical applications, in short, toward the Strunk and White.

2. Language--this is my middle-way between Everett and Chomsky--is about feel and sound

To me Chomsky is right to a degree, in that we have only a finite number of possible sounds--even if, practically, that finitude approaches infinity. But Everett is also right, in that a culture chooses a subset from that near-infinite set and creates all its words from the chosen subset. Languages could then, I suppose, be classified by subset, and dialects by their agreement or disagreement with the governing subset.

In Greek, for example, we discover a set of sounds (and human sounds are made by how the mouth feels, so feeling may enter, too, into how a culture chooses or dismisses a sound) chosen by that culture. And what culture? Oh, the culture that figured out musical scales and poetical meter. Plato has a section of The Republic devoted to poetic meter. Aristotle also lectured on it. People took it seriously. They listened. Therefore, we are invited to listen as well. At its deepest level, a language is a way of playing jazz on the most important instrument in the universe--the human voice. To learn a language is to be invited to step into the set and learn to jam, and we, like musicians, must learn to listen as well as to play. Playing is good. Listening is better. (A corollary to this is that language is a craft, not a sum.)

3. The observable world makes the rules

There are men, women, and things. There is us and those around us. There is the thing we throw and the thing thrown at us. The world presents problems to every culture, to every speaker, that he or she must solve. Some might also say that your vocabulary determines what you see in the world. It is a romantic idea. IMHO, the jury is still out on that one.

4. Vowel sounds are the silly putty of language. Consonants, not so much

If you want to send a message to people 100 years from now, don't hand it to the vowels. Leave it with the consonants. The effects of this are hobgoblin and pervasive.
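
A quick Python sketch makes the point: strip each class of letters out of a message and see which remainder stays legible.

    # Keep one letter class at a time and compare legibility.
    VOWELS = set("aeiouAEIOU")

    def consonants_only(text: str) -> str:
        return "".join(c for c in text if c not in VOWELS)

    def vowels_only(text: str) -> str:
        return "".join(c for c in text if c in VOWELS or not c.isalpha())

    message = "Meet me at the old bridge at noon"
    print(consonants_only(message))  # Mt m t th ld brdg t nn  -- readable
    print(vowels_only(message))      # ee e a e o ie a oo      -- hopeless

The abjads of Hebrew and Arabic, which write the consonants and leave most vowels to the reader, make the same bet.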

5. There are two strategies

To my mind, there are two basic strategies for encoding meaning into the extended musical riff we call a sentence: word order and inflection. Your language may borrow something from the other team, but it always emphasizes one or the other.

English is a word-order strategy. Our subjects come before our objects. Greek is an inflected language. Prefixes and suffixes are added to or removed from a word to make it perform in different ways.
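
To see the gulf in miniature, here is a toy contrast in Python. The Greek-ish endings (-os for subject, -on for object) are simplified for illustration; the point is only that one parser reads roles off position while the other reads them off endings, which frees the word order.

    # Two toy "parsers" for a three-word sentence.
    def roles_by_position(words):
        # word-order strategy: position assigns the job
        return {"subject": words[0], "verb": words[1], "object": words[2]}

    def roles_by_ending(words):
        # inflection strategy: the ending assigns the job
        roles = {}
        for w in words:
            if w.endswith("os"):
                roles["subject"] = w
            elif w.endswith("on"):
                roles["object"] = w
            else:
                roles["verb"] = w
        return roles

    print(roles_by_position(["God", "loves", "world"]))
    print(roles_by_ending(["theos", "agapā", "kosmon"]))  # subject first
    print(roles_by_ending(["kosmon", "agapā", "theos"]))  # scrambled, same roles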

The gulf between these two can hardly be overstated. They are not only two strategies; they are two ways of hearing. I'm just starting to realize how much this is true.

Saturday, March 24, 2012

Dinnerstein and Downes on finding your project

One thing I've said before, and it deserves repeating, is that academic theology is not interested in discussing the how-to of the trade. There is, of course, a practical reason for this. There are no jobs, and so only the most brilliant or lucky should survive. The absence of deep pedagogy is therefore a mercy killing; and who can argue with that, coming out of this Great Recession?

Nevertheless, it is a truth to my mind that theology requires the pouring out of deep pedagogy. (I hesitate to use the word mentoring, since the forces of management have largely taken that over.) Theology contemplates the pouring-out God, and should make pouring-out people. To one with zero experience in the classroom, perhaps the present arrangement seems sufficient. But I can tell you from the student side that it does not suffice. It is like watching virtuosos play the piano but never getting a lesson. The student interested by nature in theological progress is left to their own devices.

I have found that listening to artists in other disciplines is a great help in filling in the pieces. Artists aren't reticent; they are eager to talk about how they do the work. Time and again, the way a sculptor or painter approaches the canvas or clay, or the way an author breaks down a plot, carries great lessons for the would-be theologian. Take the following exchange between pianists Lara Downes and Simone Dinnerstein.[1] The topic is Dinnerstein's best-selling 2007 recording of J. S. Bach's Goldberg Variations. Dinnerstein was pregnant with her first child while preparing for the recording. In this section of their discussion, they touch on finding and being confident in your own voice, on respecting great voices without being dominated by them, and on being taught and (briefly) teaching.

Lara: You’ve referred to the notion of "playing for yourself" having acquired new dimension when you were pregnant, and I really related to that. When I was pregnant, I felt both more connected to myself, and more independent, than at any other time before, and I remember feeling very strong in my musical choices at that time.

Simone: You feel like you are a little world of your own.

Lara: Yes, really. Can you articulate some of the truths about the Goldbergs, in particular, that emerged in that little world, for you?

Simone: I don't know how much of my musical decisions were directly related to being pregnant. But I did feel that I needed to really listen to what the music was telling me and not think about any received wisdom about how I should interpret the music. I had of course listened to many recordings of the Goldbergs over the years, but I was trying to approach the score with a clear head. In particular, I started to think a lot about what constituted pulse and rhythmic expression, about the shapes of the phrases and how they would be sung, or breathed.

Lara: I think that's what I meant about the freedom of choice I experienced during my pregnancy. It was something about letting go of preconceptions, both others’ and my own, and exploring freely. Before that time, what had been your strongest influence with the Goldbergs? What was your relationship to the Gould recordings?

Simone: I really loved the 1981 recording and that had been my first introduction to the Goldbergs. Hearing him play the aria was one of those moments like an epiphany that you always remember. I listened to it obsessively over the years, as well as all of his other recordings. It made me feel quite intimidated, like everything that needed to be said had been said. But then in my late twenties I started to listen to other pianists, and I had another epiphany when I heard Jacques Loussier's recording of the Goldbergs. It opened up a completely different world to me.

Lara: Well, I would absolutely have bet that you came first to the 1981 recording! Even though your own interpretation is completely different/unique, somehow if I had to guess... My first was the 1955 version, and it makes a tremendous difference which you hear first, doesn't it? It's like an imprinting.

Simone: Yes - it's funny how that happens.

Lara: Let's talk about all the ancestors. When we're very young, we're so apt to copy, and so warned not to, and then I think we go through this long process of establishing independence. But then I feel like all the ghosts kind of come back into the room, and we can allow them to speak to us and share their contributions. After all they've made us: the generations of musicians who have come before.

Simone: That's beautifully put. My first teacher, Solomon Mikowsky, used to play many recordings for me of the works I was studying. We'd listen together and talk about the different musical choices that were made. He did this from when I was around ten. Then my second teacher, Maria Curcio, was quite different. She had an extremely specific way she wanted me to play, and the way I learned from her was by becoming her. For a while I lost myself in her, but I learned so much. After that I studied with Peter Serkin, who was a very searching musician and wouldn't give me any answers, just looked and explored the music with me. So after all of that, it took a while for me to get the sounds of my teachers out of my head and feel strong enough to make my own decisions without any guilt.

Lara: Yes. And whenever I work with young pianists, I'm aware of how difficult it is as a teacher to do anything but "show the way". It's a tremendous challenge to resist that easy communication!

Simone: Personally I think that is the best way to teach.

Here are two other examples of listening to the liberal arts. In the first, Peter Sellars talks about democracy and performance. In the second, Samuel Nigro gets under the skin of how modernity feels.

[1] Lara Downes, "Looking at the Goldbergs Part II -- Simone Dinnerstein," On the Bench: Conversations with Other Pianists, February 12, 2012, http://onthebenchconvos.blogspot.com/2012/02/looking-at-goldbergs-part-ii-simone.html, accessed March 22, 2012.

Wednesday, February 29, 2012

Alan Lightman sounds the alarm

Sound the alarm! Science’s priestly reign over the public square may soon be overthrown! The fortress of doubt could be breached. Already, the foundations of theoretical physics are straining and cracking. The barbarians are at the gates. They will torch the manicured gardens of reason. Who then will keep order? What of the state? How will the West survive?

To understand how we have reached this precipice, one need only look at developments in theoretical physics over the last few decades. Theoretical physics is the purest expression of science. Exploring the universe with sophisticated and occult mathematics, it searches for the deepest and most explanatory properties of nature. In the name of Isaac Newton, its faithful hunt natural laws as unapologetic Platonists. Their holy grail is a master principle that will explain everything.

According to cosmologist Alan Guth, "Back in the 1970s and 1980s, the feeling was that we were so smart; we had everything figured out." It was true: theoretical physicists had come an amazing distance. They had accurately modeled three of the fundamental forces of nature: the strong and weak forces and electromagnetism. No one doubted that the remaining fourth force, gravity, would soon be wedded to quantum physics, with the result that a final theory--a theory of everything--would emerge. In the light of the theory of everything, the universe would no longer be a mystery, but a necessity. Enter the multiverse.

So much has been made of the multiverse on television and in the movies that it seems silly to explain it. Nevertheless, the multiverse is a cosmos fecund with an infinitude of universes, each with an unpredictable and unique set of physical properties. Most would be stillborn wastes of dead rock or awash in the violent spray of hyper radiation. But the tiniest fraction of a fraction of these might contain complex organisms or, rarest of all, intelligent life.

What makes the multiverse idea so necessary to cosmologists is a characteristic of the one universe we do know about--our own. As it turns out, our universe is stunningly, amazingly, fantastically, and completely fine-tuned to support life. This characteristic has only grown more miraculous as physicists have better understood how delicate and complex it all is. I imagine that somewhere in the first quarter of the twentieth century this fine tuning was ignored in public and rarely discussed in private. Back then, Einstein's general relativity was upsetting the comfortable givenness of the static model of the universe. But as our models have become more complex, the evidence of fine tuning has grown to an acuity that no one can ignore.

Such fine tuning forced working physicists into a conundrum. They could roll away the stone and resurrect the argument from design, much to the smug satisfaction of the Intelligent Design community. Indeed, many theists and polytheists argue that the fine-tuning of the universe suggests a transcendent designer. Francis Collins, for example, at the 2011 Christian Scholars' Conference said, "To get our universe, with all of its potential for complexities or any kind of potential for any kind of life-form, everything has to be precisely defined on this knife edge of improbability. . . . [Y]ou have to see the hands of a creator." But religion is not an option for science, even though many scientists hold religious beliefs. Science as science cannot embrace untestable and unrepeatable hypotheses. If it should do this, it instantly becomes another propagandist in a thoroughly political universe, opening the way to the naked power of fascism, the hive mind of socialism, or the cultic and bloody mysteries of theocracy. Here be barbarians.

As it stands, physicists have two options: string theory and the multiverse. String theory has been around for decades. It suggests that the smallest bits of stuff that exist are vibrating, tiny, one-dimensional loops or strings of energy. The differences in their vibrations give rise to the fundamental forces and particles familiar to physics. Many hoped string theory would be able to unify gravity with quantum physics. And if string theorists could pull off this unification, they would realize the Platonic ideal of a fully explicable cosmos. But there remains a problem.

At its inception, string theory required a number of extra dimensions--seven at the beginning--and each way of folding those dimensions corresponds to a different possible universe. Now, however, the number of possible foldings has grown to 10 to the 500th, each a possible universe. It may as well be an infinity, explaining everything and so explaining nothing. Never mind that, as of this writing, string theory has not been supported by a single experimental result, nor has it suggested demonstrable areas of further investigation. Its failure leaves only the multiverse.

Lightman tries his best to assert that a multiverse is at least suggested by modern physics. He points out that eternal inflation suggests it, and cites Alan Guth's original inflation theory, which Andrei Linde, Paul Steinhardt, and Alex Vilenkin developed further some twenty years ago. But eternal inflation says that the universe is expanding upon a field of dark energy that has different properties at different points in space--the same energy of which he admits "no one knows what it is." He goes on to admit that physicists "give a fantastically large range for the theoretically possible amounts of dark energy" (emphasis his). He then abandons eternal inflation and resorts to a pathetic argument from authority, writing, "Some of the world's leading physicists have devoted their careers to the study of these two theories." Eventually, however, he has to admit that "neither eternal inflation nor string theory has anywhere near the experimental support of many previous theories in physics, such as special relativity or quantum electrodynamics." By this he means that the latter two have been independently verified by a number of experiments over the last half of the previous century and have suggested further avenues of research, whereas the former are nifty math gymnastics for the initiated. In other words, the multiverse is not the elegant explanation physicists expected. They went looking for a universe of light and form, but wound up with something dark and formless.

Keep in mind that the multiverse idea is no friend to theoretical physics. Lightman admits that “if the multiverse idea is correct, then the historic mission of physics to explain all the properties of our universe in terms of fundamental principles--to explain why the properties of our universe must necessarily be what they are--is futile, a beautiful philosophical dream that simply isn’t true.” If the multiverse idea is true, he continues, then "there is no hope of ever explaining our universe's features in terms of fundamental causes and principles."

Therefore, because of our universe's demonstrable fine tuning for life, theoretical physicists have oh so quietly abandoned empirical science for faith. "Some [physicists] feel relieved," Lightman says. "Some feel like their lifelong ruminations have been pointless. And some remain deeply concerned, because there is no way they can prove [the multiverse]."

Appealing evangelistically to his scientific peers, Lightman says, "Not only must we accept that basic properties of our universe are accidental and uncalculable. In addition, we must believe in the existence of many other universes. But we have no conceivable way of observing these other universes and cannot prove their existence. . . . We must believe in what we cannot prove.” And so the multiverse, though a perennial boon to science fiction, is as whimsical a figure as the flying spaghetti monster.

What a horrible state of affairs! For without the despotic threat of militant empiricism, the barbarians will most surely come. They will burn libraries in an inferno of anti-intellectualism. They will invoke and totemize the fine tuning of the universe to summon legions of theosophic spiritualisms. Eros will seduce reason, and governments will descend into a night of long knives. Heaven help us! The priesthood is forfeit. The public square lies open. Oh, Alan Lightman, how will they let you live?

Epilogue

Who would have expected it, but the so-called war between religion and science has been but a cordial tête-à-tête all this time. Kept under the watchful eye of white-cloaked science, the churches could relax. All those threats about secularism did but thin the ranks of the Elmer Gantrys, allowing ecclesial powers to pay more attention to the faithful. Who needs the hard and divisive labor of doctrine, discipline, and exegetical homiletics when one can employ the far more friendly and quantitative techniques of psychology and business management? Church discipline, private rebuke, and public apologetics are not necessary when only the faithful attend. Science too has benefited. In public, religion has been a noteworthy and engaging sparring partner: good for putting scientists on best-seller lists and magazine covers; good for TED talks, speakers' fees, innumerable conference sessions, and humorous anecdotes (and the benefits flow both ways). In private, scientists haunted by the specter of Oppenheimer have been glad to have an ethical stopgap to keep the whole thing human.

Read part one of this article, Science's Crisis of Faith, or read the whole thing as a document.

Tuesday, February 28, 2012

David Hume. An Enquiry concerning Human Understanding

An Enquiry concerning Human Understanding is the distilled presentation of a thinker who was at once scientist, psychologist, metaphysician, and skeptic, in a manner that continues to fascinate contemporary minds. The product of both youthful fire and mature consideration, the Enquiry "contain[s] everything of Consequence relating to the understanding." In the face of skepticism, the Enquiry offered progress based on experience. In a time of dogmatism, the Enquiry dissected the bases of religious faith and delivered a still-powerful critique. Its attempt was nothing less than the construction of an anatomy of human nature.

Its author, David Hume, was born on April 26, 1711. He grew up in Ninewells and Edinburgh, Scotland. His widowed mother educated her "uncommonly wake-minded" son until his enrollment at the University of Edinburgh at age eleven, where he initially considered a career in law. Yet, in a decision that must have weighed heavily against limited means, the fifteen-year-old left the university to answer inner questions of theology and metaphysics. "I could think of no other way of pushing my Fortune in the World, but that of a Scholar & Philosopher." Residing either in France or England, Hume served as tutor to the Marquis of Annandale, as librarian of the Advocates Library in Edinburgh, and as secretary to the Earl of Hertford. He was best known to contemporaries as a historian due to his pro-Tory History of England, even though time has judged his philosophy more influential. Hume befriended many notables, including Jean-Jacques Rousseau, Adam Smith, and James Boswell. After his death, others admitted admiration, including Auguste Comte, Charles Darwin, and Thomas Henry Huxley.

The Hume family were Calvinists and faithful members of the Church of Scotland; one of Hume's uncles was a bishop in the same. The young David took religion very seriously during an era characterized by religious indifference. He confessed, for example, to vanity for thinking himself smarter than his peers. He also applied himself to a moral code modeled upon the pedantic The Whole Duty of Man. Yet, after leaving the university, he began questioning religious dogmas, and especially proofs for God's existence. He wrote, "It began with an anxious Search after Arguments, to confirm the common Opinion: Doubts stole in, dissipated, return'd, were again dissipated, return'd again."

David Hume's ascent to prominence among Europe's literati had steep beginnings. At sixteen Hume had begun the labor that would, by twenty-seven, become the Treatise of Human Nature (1739). It was a failure. "Never literary attempt was more unfortunate than my Treatise of Human Nature," he wrote. "It fell dead-born from the press, without reaching such distinction, as even to excite a murmur among the zealots." Still, Hume continued writing. His Essays Moral and Political (1741-2) were very successful. Hume began cultivating the essay form popular in the eighteenth century and considering himself a man of letters. He also began rewriting sections of the Treatise.

I had always entertained a notion, that my want of success in publishing the Treatise of Human Nature, had proceeded more from the manner than the matter, and that I had been guilty of a very unusual indiscretion, in going to the press too early. I, therefore, cast the first part of that work anew in the Enquiry concerning Human Understanding.

The Enquiry, first published anonymously as Philosophical Essays Concerning Human Understanding (1748), would prove to be the perfect vehicle for introducing Hume’s science of human nature. Its success when compared against its progenitor is unsurprising. Hume learned the benefits of rhetoric from English essayist Joseph Addison. It is to Addison that the Enquiry owes its informality and narrative style. Furthermore, the Treatise is three or four times the length of the Enquiry. Lastly, the former rambles in soporific repetition, while the latter’s essays continue only as long as a bourgeois attention-span might allow.

There were difficulties along the way to the Enquiry’s success; some felt it should never be published at all. Hume initially circulated the Enquiry for private comment. Henry Home, one of Hume’s close friends, advised against publication, fearing the consequences that might follow exposure of Hume’s skeptical treatment of religion. Hume had omitted a section on miracles from the Treatise for that very reason. This time, however, Hume was indifferent to scandal. Further, it looked at first as though the Enquiry would repeat the failure of the Treatise. Hume, returning to England in 1748, wrote, “I had the mortification to find all England in a ferment, on account of Dr. Middleton’s Free Enquiry, while my performance was entirely overlooked and neglected.” Within a few years, however, one of the essays, “Of Miracles,” evoked responses that gained notoriety for the whole. The Enquiry, bundled with other works, was reissued in ten editions during Hume’s lifetime.

David Hume’s life falls within the hundred years we now call the Enlightenment. The religious conflicts and scientific advancements of the previous decades had bequeathed a new confidence in human abilities to understand and manipulate the world. It was a confidence to doubt and to see anew. Men like Francis Bacon and Galileo Galilei no longer trusted the received Aristotelianism of the Schools. Rather, beginning with doubt, they made up the difference with discoveries obtained by induction from personal observation. Supreme among them was Isaac Newton. Newton’s physical and mathematical insights overturned what remained of Ptolemaic astronomy and revealed the universe in ways wholly new and unexpected. His success was a powerful recommendation for doubt and observation as the best method for obtaining knowledge, and it attracted many admirers, not least John Locke and David Hume.

Descartes and Locke sought to do for the inner world of human nature what the new science was doing for the outer world: knowledge, to be certain, must be placed upon a proper foundation. In his Meditations on First Philosophy (1641), Descartes applied radical doubt even to the fact of his own existence in order to fix some point of absolute certainty. He wrote, “I had to raze everything to the ground and begin again from the original foundations, if I wanted to establish anything firm and lasting in the sciences.” Descartes’s method was, in turn, much imitated, not least by John Locke in his Essay Concerning Human Understanding (1690). In the prefatory “Epistle to the Reader,” Locke explained that he and several friends had been unsuccessfully discussing “principles of morality and revealed religion” in the winter of 1670. He continued, “it came into my thoughts that we took a wrong course; and that before we set ourselves upon inquiries of that nature, it was necessary to examine our own abilities, and see what objects our understandings were, or were not, fitted to deal with.” Both men, using the doubt and observation of the new science, sought to discover the basis of human knowledge by an examination of the self. Yet the differences in their assumptions were wide enough to split those who came after them into two rival movements: Continental Rationalism, which followed Descartes, and British Empiricism, which followed Locke.

The differences between these two movements, as outlined by Thomas Reid in his An Inquiry into the Human Mind (1764), were twofold. First, Rationalists believed that there was a certain cluster of permanent and innate ideas or concepts: ideas of the self, of causation, and of infinite perfection. These innate ideas were intuitively known to reason. Empiricists disagreed, holding that every idea may be traced back to sensory experience or emotion. Second, Rationalists believed that truth may be properly deduced from innate ideas along the lines of a geometric proof, while induction from observation was the favored method of the Empiricists. The principal philosophers associated with Continental Rationalism were René Descartes, Baruch Spinoza, and G. W. Leibniz; British Empiricism claimed John Locke, George Berkeley, and David Hume.

Hume loved empiricism and hated rationalism. Like Locke, he followed the methodology of doubt espoused by Descartes and Newton. Yet, by the time he began his philosophical studies, doubt had begun to answer the question of certainty with ever-deepening skepticism. Locke taught that all ideas arise from experience but left open the nature of the underlying substance that causes sensation; his “sensitive knowledge” thus rests on something undefined. George Berkeley, seeing this weakness in Locke’s epistemology, or theory of knowledge, posited an immaterialist idealism in which the sensible world is but a field of mental impressions existing only in perceiving minds. The philosophers Nicolas Malebranche and Pierre Bayle agreed that truth about the world is not as obvious as commonly supposed. Reason proved uncertain, whereas skepticism proved reasonable.

Until recently, Hume was counted among the skeptics. Philosophy so demanded an answer to the “skeptical challenge” that portions of the Treatise were over-emphasized and its most destructive passages highlighted. Hume’s was then a negative voice, if not the negative voice. Twentieth-century scholarship has begun to redeem this image.

Hume is now understood as the first post-skeptical philosopher of the early modern period. He took the doctrines of the skeptics for granted rather than promulgating a skepticism of his own; his was an attempt to find a way forward for philosophy given the skeptical situation. Certainly he retained, if not purified, the hallmarks of Locke’s empiricism. Yet reform was needed. The fixed point of human nature, the mind, should be studied scientifically, just as Newton had studied natural phenomena. The result would be a “science of man,” a “solid foundation laid on experience and observation” which could then be extended to all the other sciences.

Hume’s answer to the skeptical challenge may be understood as a change from ontology to psychology, from a worldview based on hierarchies of being to one centered on anthropology. Descartes’s cogito, “I think, therefore I am,” beginning with the pronoun “I,” had already begun the turn to the subject. Yet Descartes still relied on theistic arguments to complete his system. Hume did not assume religious categories, nor did he want to; they had already been addressed by the skeptics. His task, rather, was to understand how the mind works: not what the mind is, but how it does what it does, always answerable to the bar of common sense. It is this project which occupies the Enquiry.

Two main themes develop from the Enquiry as it pursues this goal, the first positive and the second negative. The first, an examination of causation, exalts imagination and the passions; the second condemns religious reasoning, whether natural or revealed.

Hume’s importance in the history of philosophy derives in large part from his analysis of causation. With it, Hume scored a victory for British Empiricism over Continental Rationalists like Malebranche. The Rationalist, rightly seeing that knowledge of self and world alike depends on causality, deemed it an innate idea on par with mathematical certainty. Hume, however, using empirical induction, saw only one sensation following upon another, and called the relationship between them a psychological habit born of instinctive imagination.

When one particular species of event has always, in all instances, been conjoined with another, we make no longer any scruple of foretelling one upon the appearance of the other...We then call that one object, Cause; the other, Effect. We suppose that there is some connection between them; some power in the one, by which it infallibly produces the other, and operates with the greatest certainty and strongest necessity.

What is left between isolated events is mere probability; and of human nature there remains not a self but merely a bundle of impressions. “[There is] nothing but a species of instinct or mechanical power that acts in us unknown to ourselves.”

Hume’s views on causation met a critical reception. Even close associates such as Henry Home and John Stewart disagreed with his diagnosis of this original quality of human nature. The most famous of his critics, however, is Immanuel Kant who, roused by Hume from his “dogmatic slumber,” carefully corrected the Empiricist in the Critique of Pure Reason (1781) and the Prolegomena (1783), works which also had the effect of raising German interest in Hume’s theories.

The second theme of the Enquiry concerns Hume’s attack on both natural and revealed religion, specifically in the essays “Of Miracles” (contra revelation) and “Of a Particular Providence and of a Future State” (contra design). Hume disliked any form of reasoning that went beyond the limits of perception, and his charge against religious dogma was that it was unreasonable. “Religious belief,” he wrote, “is a form of make-believe which … leads by degrees to dissimulation, fraud and falsehood.” It was therefore a subject he could not ignore. By hobbling design with his account of causation and revelation with an appeal to evidential common sense, Hume sought to show that religion can claim at most that “the cause or causes of order in the universe probably bear some remote analogy to the human mind.”

Hume’s hostility, often hidden by rhetorical device, garnered some of the earliest responses to his philosophy in close criticisms penned by clergymen, Anglican and otherwise. Significant among these were William Adams (1752), John Leland (1755), and George Campbell (1762). Hume was branded an atheist, a label that cost him appointment to a chair of moral philosophy at the University of Edinburgh in 1745. Nevertheless, his religious critiques proved enormously influential; their traces are discernible in Immanuel Kant, Friedrich Schleiermacher, and Ludwig Feuerbach, and they earned Hume a permanent place in the philosophy of religion.

David Hume died on August 25, 1776, a few months after finishing the autobiographical “My Own Life” (a nod to his friend Benjamin Franklin’s autobiography, which Hume had read in manuscript). He was buried in the Old Calton Burial Ground in Edinburgh. In the last decade of his life, many philosophers engaged Hume’s ideas, including Thomas Reid and James Beattie (Essay on the Nature and Immutability of Truth, 1770). He once complained, “I cou’d cover the Floor of a large Room with Books and Pamphlets wrote against me,” yet his work is respected today both for its commitment to the tenets of empiricism and for its religious critiques. There is a growing dialogue, too, between Hume’s insights and contemporary philosophy and the cognitive sciences, notably in the work of W. V. O. Quine (Word and Object, 1960) and Richard Rorty.


Bibliography

Aiken, Henry D. Introduction. Dialogues Concerning Natural Religion. By David Hume. New York: Hafner Publishing Co., 1969.
Fieser, James. “David Hume (1711-1776): Writings on Religion.” The Internet Encyclopedia of Philosophy. 1 Aug. 2003.
---. “David Hume (1711-1776): Metaphysical and Epistemological Theories.” The Internet Encyclopedia of Philosophy. 1 Aug. 2003.
Hume, David. An Enquiry Concerning Human Understanding. Ed. Tom L. Beauchamp. Oxford: Oxford University Press, 1999.
Humer, James M. “Hume.” A Companion to the Philosophers. Ed. Robert L. Arrington. Oxford: Blackwell Publishers, Ltd., 1999. 309-318.
Kenny, Anthony. “British Philosophy in the Eighteenth Century.” A Brief History of Western Philosophy. Oxford: Blackwell Publishers, Ltd., 1998. 230-243.
Morris, William Edward. “David Hume.” 2003. Stanford Encyclopedia of Philosophy. 15 July 2003. Stanford University.
Mossner, Ernest Campbell. The Life of David Hume. Oxford: Oxford University Press, 1980.
Norton, David Fate, ed. The Cambridge Companion to Hume. Cambridge: Cambridge University Press, 1993.
Noxon, J. Hume’s Philosophical Development. Oxford: Oxford University Press, 1973.
Smith, T.V. and Marjorie Grene. Philosophers Speak For Themselves: From Descartes to Locke. Phoenix Books P17. Chicago: University of Chicago Press, 1967.

(c) 2004 Thom Chittom
Barnes and Noble Classics