Thursday, June 15, 2006

Taylor on Modernity

[NOTE: I made a few changes to this since posting it yesterday; a few things just didn't feel quite right. Mostly an issue of terminology that didn't convey the right ideas and might have implied a few wrong ones. (KW, 6-16-06 9:10am)]

As I was surfing the net, I found an article by Charles Taylor titled "Two Theories of Modernity." There Taylor distinguishes two conceptions of modernity: a "cultural" one, which sees it as the rise of a new civilization among others, and an "acultural" one, which sees it as the demise of "traditional" myths and the emergence of "modern" discoveries. The cultural understanding rests on "the picture of a plurality of human cultures, each of which has a language and a set of practices that define specific understandings of personhood, social relations, states of mind/soul, goods and bads, virtues and vices, and the like" (1). The second understanding, which he terms "acultural," sees the change "in terms of some culture-neutral operation," such as "the growth of reason...the growth of scientific consciousness, or the development of a secular outlook, or the rise of instrumental rationality, or an ever-clearer distinction between fact-finding and evaluation" (2). On this view the change to modernity is a result of increased technology, ease of living, mobility, etc. These, it is thought, are indifferent to our notions of personhood, the Good, or society, and could occur in any culture regardless of its norms and practices.

When it comes to explanations in terms of "rationality", this is seen as the exercise of a general capacity that was only awaiting its proper conditions to unfold. Under certain conditions, human beings will just come to see that scientific thinking is valid, that instrumental rationality pays off, that religious beliefs involve unwarranted leaps, that facts and values are separate. These transformations may be facilitated by our having certain values and understandings, just as they are hampered by the dominance of others; but they are not defined as the espousal of some such constellation. They are defined rather by something that we come to see concerning the whole context in which values and understandings are espoused. (3)
Generally speaking, most explanations of modernity are of the acultural type, describing it as the rise of reason against Romantic irrationalism. Even those who see modernity as a negative development explain it in acultural terms: by "the loss of the horizon; by a loss of roots; by the hubris that denies human limits and denies our dependence on history or God, which places unlimited confidence in the powers of frail human reason; by a trivializing self-indulgence which has no stomach for the heroic dimension of life, and so on" (4). On either account, then, the move to modernity is understood as a clash of worldviews, of beliefs, of propositional affirmations. Taylor disagrees with these assessments and even thinks that they are dangerous: by treating the changes as culture-neutral, merely institutional, they ignore the ground from which modernity sprang, namely a change in values and norms, a change in culture.

The appeal of the acultural approach, Taylor muses, probably lies in the partisan nature of the debate: we simply grasp at whatever is readily available to make things understandable.

Now nothing stamps the changes as more unproblematically right than the account that we have "come to see" through certain falsehoods, just as the explanations that we have come to forget important truths brands it as unquestionably wrong. (6)
Given the dominance of Christendom at the beginning of modernity, it is little surprise that a cultural approach to the question wasn't taken--Christendom was the only cultural game in town, as it were, and all other civilizations were "barbarians, or infidels, or savages" (6; this description applies equally to the views of most civilizations before Christendom). Given the attempts to squash pluralism from the 16th century onward (Taylor goes into this in his contribution to Heidegger, Coping, and Cognitive Science), searching for cultural causes would have been unthinkable. Similarly, for those who wish to decry modernity, the greatest self-condemnation would be to realize that the vilifiers themselves take part in the culture of modernity, even if they do not accept its theoretical conclusions (more on this later). Taylor also attributes the acultural approach to the paradigm of "materialism": it seems better to account for social changes in terms of more concrete developments, such as industrialization or increased mobility, than in terms of some vague realm of spirituality or values/morality.

There are two particular problems with the acultural approach. First, modernity appears to have arisen more from a change in moral outlook than from our "coming to see" certain truths. "[S]cience itself has grown in the West in close symbiosis with a certain culture in the sense I am using that term here, namely, a constellation of understandings of person, nature, society, and the good" (8). Seen against this constellation of values, modernity appears in a different light--not as a doctrine, but as a culture. Hence, one can misunderstand modernity in one of two ways: one can characterize the change to modernity as the "product of unproblematic discovery of the ineluctable consequence of some social change" (as the proponents of modernity do), or one can ignore important aspects of the change in modern civilization, including notions of individualism and "the affirmation of ordinary life" (8; as the opponents of modernity do). We assume that others in history held views similar to ours (though perhaps undeveloped), not seeing the central role that such distinctions as the "inward/outward" have had in shaping post-Augustinian Western culture/modernity.

The second, and more damaging, problem with the acultural approach is how it affects our understanding of other cultures. Reducing modernity to some "universally applicable operation" that is indifferent to culture, such as reason or technology, "imposes a falsely uniform pattern on the multiple encounters of non-Western cultures with the exigencies of science, technology, and industrialization" (8-9). Since these are taken to be the primary causes of modernity, we assume that a newly industrialized culture must now "come to see" certain truths and, hence, make some cultural changes, "such as the 'secularization' or the growth of atomistic forms of self-identification" (9). As I've said elsewhere, this further accentuates the tie between technology and ontology--we think that having the former will instrumentally change the latter, that we "come to see" the latter in the 'correct' way once we accept the former. The acultural understanding, then, levels off all cultures, making them simply 'less advanced' versions of ourselves who will eventually "come to see" as we do once they witness the success of instrumental rationality and technology.

In short, exclusive reliance on an acultural theory unfits us for what is perhaps the most important task of social sciences in our day: understanding the full gamut of alternative modernities in the making in different parts of the world. It locks us into an ethnocentric prison, condemned to project our own forms onto everyone else and blissfully unaware of what we are doing. (9)
For those unaware of his work, this is the basic thesis put forward in his magnum opus, Sources of the Self: our view of "the Good" integrally shapes our sense of self. There he examines how people have viewed the self throughout history, tying it into their/our moral outlook. Most relevant to the above is his examination of the rise of modern individualism and the valuation of the 'common life,' finding their roots in Augustinian Christianity and the Reformation, respectively. Prior to Augustine's ruminations on the self, individualism--and the reflexive approach itself--was practically non-existent; what mattered was one's tie to one's culture, one's traditions/history, one's family/clan. But with Augustine, and later with Descartes, the self was atomized and the detached observer became "the Good"; the self was no longer essentially tied to its history and culture, but only contingently so. Through the exercise of rationality one could transcend one's particular context, finding universal truths divorced from such contingencies as history.

Subjectivity was strongly denounced, and approaching the world as a valueless mass of objects in motion (the 'outer') to which the individual mind attributed meaning (the 'inner') was seen as valuable, as 'the Good' that one should strive to achieve. To put it in terms that we moderns can understand: taking a subjective approach to the world of objects is "intellectually dishonest," it "violates logic"--charges that apply what are obviously moral judgments to supposedly amoral/non-contextual reason. God himself, as Heidegger and I have both argued, is the great 'meaning giver' in a meaningless cosmos, the Being that is merely contingently attached to the world, the mind that is purely rational (recall how God alone could save Descartes from his doubt, could allow him to rationally affirm meaning). The affirmation of the common life allowed for the development of democracy, of the universal rights of all mankind, and of the overthrow of the 'upper class.' Unfortunately, it also (perhaps paradoxically) led to capitalism and the currently large (and growing) gap between socio-economic classes--the poor getting poorer and the rich getting richer. These are the goods that partially inform modernity, and, as such, many forms of modern Christianity are just as implicated in modernity as their atheist counterparts. In fact, it was Christianity's own development that set the stage for atheistic modernity, without which the latter probably would not have arisen.

I think there is much merit in Taylor's conception of modernity. Here are some preliminary musings on the idea. If modernity is best understood through a cultural analysis, this should have important ramifications for how one views postmodernity. I need to be careful here, as Taylor himself rejects postmodernity and attributes to it the views that many others do, relativism being the prime example. I should also say that I think the term itself is now meaningless and should just be rejected (but see how I continue to use it?). Either way, I would disagree with Taylor on his interpretations of some so-called postmoderns (such as Derrida), as do a number of his peers (Dreyfus being one that immediately comes to mind).

What values, then, does postmodernism espouse? First, the self as culturally embedded and spatially/bodily concerned with its world; in other words, postmodernism values culture over reason, seeing in the former the grounds for the latter. The disengaged self of Augustinian/Cartesian thought, with its inherent valuation of the disengaged perspective, is not a good, but is in fact destructive on many levels (though not all; there is still something to be said for it). Rather, the individual and their thinking are at least partially (but essentially) constituted by their historical, cultural, spatial, embodied, and socio-economic situatedness and finitude. This is not the self that attributes meaning to a meaningless cosmos, but one who grows up in a world of meanings and values of various kinds, who cannot help but find meaning in even the most trivial event. This "cannot help" is not meant in a negative way, but in terms of an excess: human beings, in their embeddedness with other humans and being/beings and their necessary concern for their being, are inherently meaningful and can no more 'attribute' meaning to a meaningless cosmos than the cosmos can be meaningless in the traditional sense. Perhaps we can say that human beings are meaning par excellence.

Second, and related to the above, postmoderns appreciate and attempt to understand the historically contingent nature of their culture, including that of modernity; one need not accept modernity's evaluative judgments, nor must one "come to see" rationality as basic by affirming logic or technology. The world is driven by more than calculative reason, as seen in Taylor's cultural approach: our values precede and are instantiated in our practices and propositional affirmations. Heidegger would state this in terms of our attunement to beings--it precedes every rationality, every reason, every formality, and in fact makes them possible. Taylor himself takes a Heideggerian approach here, as elsewhere. Someday I will have to bring up Tim Ingold's criticism of something like the acultural approach in anthropology, as it is similarly instructive.

This second point naturally leads to an openness to other cultures, to other ways of valuing the world. This does not mean the uncritical acceptance of other cultures, cultural practices, or religious beliefs, but it does mean that one should strive diligently to understand the Other and not simply reduce them to some version of ourselves. This requires a principle of charity--that we be charitable in our conversations, because it may be that we are simply not attuned to the Other's culture and thus may seriously misunderstand their practices (the Conquistadors in Mesoamerica come to mind). This principle of charity, of course, is part and parcel of hermeneutics--we assume that the other actually has something to say instead of assuming from the start that what they are saying is either meaningless or incomparable to what we already possess (much of modern Evangelical apologetics comes to mind).

Postmodernity is not a doctrine or dogma, but a culture, a stance on certain values. The usual squabbles over 'analytic' and 'continental' thought miss this point by focusing on the difficulty of prose, the affirmation of logic/reason, or the incompatibility of various claims. Even most proponents of continental thinkers miss this: they interpret postmodernity through an acultural lens and thus end up being little more than warmed-over modernists. Postmodernity can be defined (if it can be defined at all) not by referring to doctrines or beliefs, but by a more ambiguous attunement to the world. "Ambiguous" not only because it is hard to pin down, but also because it is what allows any 'pinning' to begin with and thus escapes that which it makes possible.
