
The Critical Religion Association

~ Critical Approaches to the Study of Religion


Tag Archives: culture

Modern Government, Sovereignty and the Category of Religion: Beyond the Post-Secular

24 Monday Feb 2014

Posted by tstack1970 in Critical Religion, University of Aberdeen

≈ 2 Comments

Tags

Critical Religion, culture, government, post-secular, religion, secular, sovereignty

Having participated in a number of Critical Religion events, including two workshops held at Stirling, I can report that Tim Fitzgerald, Naomi Goldenberg and I have submitted for review the manuscript of a volume with the title Modern Government, Sovereignty and the Category of Religion: Beyond the Post-Secular.

The core argument of the book is that religious-secular distinctions have been crucial to the way in which modern governments have marked out their sovereignty – as crucial as the territorial boundaries that they have drawn around nations. Our authors, selected from a host of contributors to seven workshops held between 2009 and 2012, bear out the argument through a range of disciplines including history, anthropology, moral philosophy, theology and religious studies, combining theory with the detailed empirical analysis of contexts as diverse as Japan, Mexico, the United States, Israel-Palestine, France and the United Kingdom. Taken together, the chapters provide a multi-dimensional picture of how the category of religion has served to define the sovereignty of modern government.

Edith Doron spoke at an Aberdeen workshop in 2009 of the Abrahamic hut that she constructed while working at the Brooklyn Children’s Museum. “Just tell me one thing,” asked the anxious director, “is this a cultural or a religious exhibit?” Similarly, employers and teachers alike find themselves deciding whether someone’s dress falls within the boundaries of acceptable “cultural” diversity or constitutes an inappropriate and perhaps illegal display of “religious” symbolism. Such ambiguity can lead to controversy, as when the UK Supreme Court in 2009 was asked to rule on whether the Jewish Free School (JFS) was deciding admissions on the basis of legitimate “cultural” criteria or was discriminating on the basis of “religion”. The distinction between “religious” and “cultural” is only one example of the range of ways in which we distinguish between religious and non-religious or secular. In the same year, The Times columnist Libby Purves complained of an Islington clerk who was allowed to refuse to perform same-sex civil partnership ceremonies on religious grounds. She should, wrote Purves, have “sighed, muttered a prayer, and found another job. The tribunal should never have rolled over as it did, agreeing to exempt a public servant from civic duty. Religion is religion, law is law”. Such controversies indicate that the distinction is anything but cut-and-dried. Only rarely, however, does the ambiguity lead us to question the categories themselves—to ask what we mean in the first place by religion and by culture or law or politics, and why we find ourselves moved to distinguish between them.

The volume is the fruit of several years of sustained debate in our workshops and conferences (including a major British Academy conference in 2010) about what happens when “religious” gets distinguished from “non-religious” or “secular”. Our debates have focused on the consequences of religious-secular distinctions. How and why do people – politicians, academics, peasants, managers, teachers, journalists, clergy, workers, lawyers – distinguish between “religious” and “non-religious” or “secular”? And what happens when they make such a distinction? Some of those consequences are very specific while others are general and far-reaching. The Brooklyn Children’s Museum director was clearly anxious about losing funding or visitors. Employees may have to think about what jewellery or clothing to wear to work. School pupils and their parents need to reflect on the uniform code, while Catholic or Jewish school boards may find themselves defending their admission policies. The Islington Civil Registry had to find another clerk to perform the ceremony, while taking back the clerk who was suspended. Those are specific consequences of particular ways in which “religion” gets distinguished from the “secular”. Our authors focus on the more general consequences of more structural or systematic religious-secular distinctions. The Islington Civil Registry was only applying a piece of legislation which defined the limits of “religious freedom” in administrative law, and which was itself likely modelled on legislation elsewhere. School uniform codes and admissions policies have a long and complex history; the JFS case was only one episode in it. Beyond the Brooklyn Children’s Museum, the broader history of museums has much to do with marking out a sphere outside “religion” in which objects can be displayed in ways that might otherwise prove sacrilegious.

In the volume, we concentrate mainly on the category of religion as it has crystallised in modern times, by which I mean the past three centuries, roughly, of Europe’s struggle to establish the world order under which we live. There are lively debates about the possible pre-modern roots of what we mean today by “religion”. My co-editor Naomi Goldenberg, for example, draws on Daniel Boyarin (2004), S.N. Balagangadhara and De Roover (2007) and other scholars who trace our modern concept of religion further back into the history of Christianity. Goldenberg even suggests that its roots lie in ancient Greek ways of classifying cults. I accept that our modern category of religion was already in gestation before the modern period. Indeed, I argued in a previous publication (2012) that the Jesuit missionary José de Acosta, writing in 1590 of the pre-Hispanic civilizations of the Americas, took an important step toward our contemporary sense of “religion” by distinguishing Aztec “religion” from other, implicitly “non-religious” things that Aztecs did. It was only a step, however, and I believe that the category continued to develop in the centuries that followed. Of course “modern” is a gross generalization for the period, especially if one takes a global perspective as we have done. Our authors make it abundantly clear that modernity has had no single history. But the historical processes of this period – state-building, colonialism, capitalism – have connected up places in a way that warrants generalization.

Our authors show that the category of religion has played a key role in this modern period. Religious-secular distinctions are written into the entire fabric of modern life, such that to figure out the category of religion is to figure out much of what passes for modernity. Or at least, to take religion for granted – to assume that religion is an object in the world, independent of the category – is to miss much of what modernity has meant. To take religion for granted is, to begin with, to mistake for history one of modernity’s foremost origin myths, the Wars of Religion story. Religion was the cause of the 17th-century European wars, we are told, and the wars ended when European powers decided to tolerate religion instead of fighting over it. William Cavanaugh (2009) and co-editor Tim Fitzgerald (2007) have argued that the modern idea of religion was itself a product of the wars; in my terms, that it was an important stage in the gestation of what we mean today by “religion”. They have shown, moreover, that the category of religion developed in tandem with many other categories, such as tolerance. It was not just that people decided to add “religion” to the list of things that they tolerated. The idea of “tolerance” was transformed in the process – religion was the first object of the modern idea of tolerance.

We converge on one key aspect of the modern history of “religion”: the role of modern government in shaping the category. “Modern government” is another gross generalization, but there are important similarities in how governments have gone about the business of governing in these past three centuries. Governments everywhere have been caught up in colonialism and capitalism, which have pushed them to develop the extraordinary power that they now have at their disposal. Governments have not only appropriated functions that were exercised by other institutions, such as schools and law courts, but they have created an array of functions that did not exist previously, such as healthcare. Scholars have observed that governments exercising such functions have classified populations by gender, race, class and region and treated them accordingly. Less attention has been paid to the way in which governments classify institutions, practices and persons as “religious” and “non-religious”. Just as with gender, race, class and region, it is not that governments have simply applied religious-secular distinctions; governments have, our authors contend, played a key role in developing religious-secular distinctions in the first place. A 2007 volume edited by Fitzgerald highlighted the role of colonial encounters in shaping the modern idea of “religion”, and Fitzgerald went on to argue in his 2009 monograph that global capitalism is another arena in which “religion” has been forged. Our authors pay attention to the colonial and capitalist dimensions of modern government’s concern with the category of religion, while including other dimensions such as gender politics (Goldenberg and Finn), immigration policy (Nillson), church-state conflict (Stack), and peace-making (Israel).

Our authors treat many different aspects of government but emphasise how modern government has used the category of religion to stake its claim to sovereignty. Although Foucault (1980: 121) famously suggested that sovereignty was a thing of the past (“cutting off the King’s head”) and that modern government relies instead on disciplinary measures, it seems more reasonable to conclude that discipline and sovereignty have developed in tandem. The term “sovereignty” has been used in many ways historically and in recent years, but I mean it in the sense of the power to authorise. To have sovereignty is to decide what is and is not legitimate, as well as who can legitimately deviate from the general rule and when. Sovereignty in this sense is not new. Medieval European free cities, for example, were somewhat autonomous in their government, and burghers were allowed to trade in goods that were normally reserved for the Crown, yet the Crown claimed to authorise the activities of towns and cities by issuing them with royal charters. In the past three centuries, however, governments have extended the reach of their authority, as scholars such as Agamben (2005) have argued. They have appeared, on the one hand, to concede sovereignty by allowing for democratic representation. On the other hand, governments have created labyrinthine rules and regulations for any and every area of life, including health, education and the economy, while retaining the authority to decide what exceptions can be made.

We show in the volume that “religion” comes during this period to designate an extraordinary range of practices and institutions that government is unable to control directly, including some on which government is heavily dependent. Government is dependent on such institutions both for the business of governing (Owen and Taira list the roles performed by The Druid Network in prisons and hospitals) and for displaying its own sovereignty (Goldenberg and Finn point to the role of churches in state ceremonies). As such, these institutions pose a potential threat to governments’ sovereignty, but governments try to contain the threat by claiming to authorise them as “religion” and by policing the boundaries of that category. Thus government continues to rely on these institutions that it cannot control directly, while performing its sovereign ability to “authorise” these institutions in the first place. What government recognizes and thus admits as “religion” is subject, of course, to innumerable institutional struggles.

Trevor Stack

The Undergraduate Religious Studies Major as Preparation for a PhD in the Humanities and Social Sciences

13 Friday Sep 2013

Posted by Angela Sutton in Critical Religion, Vanderbilt University

≈ 2 Comments

Tags

Africa, capitalism, Critical Religion, culture, economic theory, gender, global, interdisciplinarity, undergraduates

Undergraduate students who are sold on the Religious Studies major are often promised that they will become better writers and critical thinkers, and that they will leave university with a mastery of oral communication and presentation skills. These skills serve them well in any job or other postgraduate endeavor. But a degree in Religious Studies confers so much more than that. As Religious Studies encompasses every facet of the human experience, the scholar of religion by necessity becomes fluent in the humanities and social sciences as a whole. The interdisciplinary degree prepares students for postgraduate work in any of the humanities and social sciences in a way that enriches the student’s background and allows them the lateral thinking necessary to figure out the best approaches for their proposed project.

I did not understand this when I started my undergraduate degree in History and Religious Studies at the University of Stirling, Scotland, back in 2002. I quickly learned that my history courses all followed a similar format consisting largely of textual analysis, historiography, and that certain way historians are taught to think and write. My Religious Studies courses, on the other hand, were as different from one another as they were from the subject of history. These courses had very few methodologies in common. While in one class we depended on a wide variety of economic theory to analyze the role religion plays in global economies, in the next we spent the entire semester reading just a few dense texts very closely to uncover the gendered philosophies behind major world religions.

At the time I couldn’t see how this degree could help me on my quest to become a historian. Our classes read some histories, sure, and discussed historiography when we read theorists and philosophers in the order of publication, but there was so much other, well, stuff.

Instead of comparing religions, classes consisted of thorough exposure to the foundations of theoretical work in the humanities. They were hard-hitting and emphasized thematic, interdisciplinary study. Now that I am in my final year of the PhD at Vanderbilt University in the United States, I can see just how much time this other stuff has saved me. My dissertation project uses the written sources of mainly seventeenth-century European slave traders (the British, Dutch, Prussian, and Swedish) to investigate how coastal West Africans asserted influence in the mercantile culture of the Atlantic slave trade. This will uncover their role in contributing to the early modern capitalist economy. Like Religious Studies, it too is by necessity interdisciplinary.

The work I had done as an undergraduate in the Religious Studies program introduced me to the fields of inquiry I need to be familiar with in order to complete this project. For example, in a course on religion and postcolonialism, our class pored over the works of Homi Bhabha, Edward Said, and Gayatri Spivak, which introduced me to the trajectories of the developing world, and the role Europeans played in this. Reading Karl Marx and Adam Smith in the religion and economy course introduced me to economic theory, and piqued my interest in the fascinating debate on the connections between slavery and capitalism, at the center of which stands Eric Williams. Exegesis of religious texts like the Quran sharpened my skills in close readings of primary sources. This skill is essential for my project, as studying the history of Africans through European documents requires the most critical eye.

In addition to this, the language of many great philosophers of religion was German, and reading these texts in the original language (which was optional, of course – my professors at Stirling were not sadists) improved my language skills and my readiness to learn further languages, such as Dutch and Swedish, for my project. Not to mention that all the theory we read (Freud, Kristeva, Foucault, to name just a few) as part of larger writing projects in the Religious Studies department showed me how to apply theory, and how to know when to apply it (and, more importantly, when not to). In my honors year, writing an ethnography for my Religious Studies undergraduate dissertation conferred familiarity with the discipline-specific language of anthropologists and archaeologists, which I now make use of to get at historical issues of pre-colonial West Africa about which the Eurocentric texts are silent.

This is but one example of how the interdisciplinary nature of the Religious Studies degree at the undergraduate level readies students to branch out to challenging PhD projects in virtually any area of the humanities and social sciences. The very cutting edge of the field is increasingly concerned with matters of interdisciplinary inquiry, and some departments are changing their name to “Religion” in recognition of this shift. The critical study of Religion, with a capital “R,” gave me the confidence to tackle a complex project that draws on multiple methodologies, and I can’t recommend this type of critical program enough to any undergraduates who wish to continue in academia.

The elephant in the room – religion and the peace process in Northern Ireland

10 Monday Jun 2013

Posted by Francis Stewart in Critical Religion, University of Stirling

≈ Comments Off on The elephant in the room – religion and the peace process in Northern Ireland

Tags

Catholic, Critical Religion, culture, Northern Ireland, peace, Protestant, religion, violence

Since its legal inception in 1921, Northern Ireland has been plagued with violence and dispute. This blog does not provide the forum, or indeed the space, to fully explore the myriad causes of the violence. Suffice it to say that it is a conglomeration of perceived imperial action by the UK through both military and political means, a monochromatic entrenchment of the past, cultural clashes and a severe identity crisis. Conspicuously absent from that list is the question of ‘religion’.

Certainly when the Good Friday Agreement was created, and subsequently approved in referenda in both the North and the South of Ireland, religion was not a factor in what was to be the proposed peace process. In fact, religion is mentioned only once, in the third section of the lengthy agreement under the heading of ‘human rights’. The salient part of the sentence confirms ‘the right to freedom and expression of religion’, frankly so vague as to be virtually useless given the situation it was linked with and intended to move beyond.

Inferences from this statement would seem to indicate two things: first, that religion has no part to play within the peace process or indeed any lasting peace within the province, and second, that religion is a homogenous construct and practice within Northern Ireland. Both are significantly problematic and both will, ultimately, ensure that peace remains nothing but a cracked glass waiting for the final knock to shatter it (see also Susan McKay, Northern Protestants: An Unsettled People (Blackstaff, 2006)).

Removing religion from a peace process that has in part, certainly within the media, been explained (away) as a violent, religiously-motivated conflict, seems naïve at best and disingenuous at worst. Paramilitaries on both sides of the divide frame their actions, and indeed their perspective, through their religious understandings. Republican paramilitaries who died were given the last rites where possible and a Catholic funeral (with military overtones). Loyalist paramilitaries often utilised mottos such as ‘For God and Ulster’.

John Brewer, David Mitchell and Gerard Leavey have just released a book (Ex-combatants, Religion and Peace in Northern Ireland, London: Palgrave, 2013) that documents the role religion played in the lives of paramilitaries from both sides of the divide.  It focuses on religion as a motivating factor in choosing to pick up a weapon or join a paramilitary organisation, religious experiences of imprisoned paramilitaries, and the relationship between paramilitary members and the churches.

Various religious ideologies are wrapped up in the conflict: loyalist wall murals quote Old Testament scripture, whilst Republican murals espouse New Testament ideas such as that of laying down one’s life for another in connection with the hunger strikers. The interviews by Brewer et al reveal the range of religious ideologies and personal faith that existed for paramilitary members (chapter 4 in particular). Ignoring the role of religion and the religious dimension in the conflict prevents a full understanding of what actually happened and why. By extension it prevents the development of a useful model to understand the ongoing concern of extremism within certain interpretations of Islamism.

So why is religion, and its varying interpretations, not discussed more substantially in the Good Friday Agreement, nor addressed seriously – or even included – within the ongoing peace process? Brewer posits that the answer may lie in the source of the structure of the agreement itself, that is, within the field and purpose of transitional justice, which is often not amenable to religion playing a role (p. 160) as it interferes with the American cold-war triumphalism in which it was created (p. vii). I don’t disagree, but it does not provide enough of an answer.

Let’s push the idea further; perhaps its exclusion is also based on the possibility that including religion within both the reality of the conflict and the peace process would necessitate an acknowledgement that religion is significantly more important to identity construction and defence than is perhaps comfortable. Religion is intangible, difficult to understand and virtually impossible to define. Other factors that cause or contribute to conflict are significantly easier to categorise, and easier to develop pathways to re-route, correct, or legislate for or against.

A uniform concept of ‘religion’ is in itself problematic and erroneously assumes a common understanding and agreement as to what constitutes religion. Fitzgerald has argued that the term ‘religion’ is a Western construct with a particular agenda that includes exclusionary aspects regarding what is and is not ‘religion’. It has the potential to exclude those who hold strong opinions on both ethical matters and issues of faith yet would not self-identify as ‘religious’. In other words the term ‘religion’ is both constructed and constrictive, and in a situation such as that in Northern Ireland it is arguable that the problematic nature of the term is a contributing factor to the conflict.

The tantalising question arises: if we allow for a less constrictive understanding of religion within a situation such as Northern Ireland, what possibilities for reconciliation emerge? To answer that, even partially, requires a clear framework on which to set about addressing the question of religion. Refusing any boundaries or encouraging a general relativism is just as damaging and problematic as assuming too narrow an understanding.

A framework which enables a broad critique of religion and a variety of religious understandings and approaches is a necessity. Understandings of religion in Northern Ireland are so intrinsically linked to the character of the people and the very landscape itself (not just the murals but also how cities such as Belfast are physically carved up by permanent peace lines) that it is possible to overlook the place of religion outside the institutions that were so vocal and prominent during the conflict. A study of grassroots, organic approaches to peace is called for, but it cannot be limited to one framework or approach; it must draw on a variety of different approaches and ideas. In Northern Ireland we have a saying, ‘grasp the nettle’, which means do what needs to be done – a very apt approach that studies of the peace process need to take on board with regard to the place of religion.

The Squaring of Zero, Part II

09 Monday Jul 2012

Posted by Andrew W. Hass in Critical Religion, University of Stirling

≈ Comments Off on The Squaring of Zero, Part II

Tags

concept of zero, crisis, culture, negation

(This is Part II of a blog entry from last month. Comments are welcome below on both Parts.)

So where then does the symbol of zero enter our Western world? If we turn to the etymology of the word “zero” we will find a telling trajectory of its history. And the origins in fact turn out to be not from the West at all, but from the East. This perhaps should not surprise us, since we know that both Hinduism and Buddhism are much more embracing of the notion of nothingness or the void. The notion is built into the very roots of their thinking, since all reality first stems from and then returns to the void. We might even say that coming to terms with this void is the heart and soul of these systems of thought and practice, even in all their variations. Take for instance the Atman, the supreme principle of the universe in Hindu belief. This principle, as a total and all-encompassing infinity, is in effect identical with a pure nothing, since it is everywhere and nowhere at the same time. In coming to terms with this nothing one comes to terms with both self and universe.

In India, the Sanskrit word for “empty” or “blank” is sunya. This sunya is transliterated, within the Indian system of numerology, as the idea of zero and indeed the symbol “0” as we know it today. If we think about the round circle, it suddenly takes on an appropriateness to the notion of nothing, even pictographically. For at the centre of its circumference is a blank, a void, an abyss. It is as if we are peering into an empty chasm, brought into greater relief by the circumference, but of course a relief that is an inverse relief, with an infinite inversion.

This symbol and its idea then begin to move West. Sunya is transliterated in Arabic as çifr. The Islamic world picked up the zero, in the form of “0”, when it conquered India in the 8th century, and from there passed it on to the West. This development, one might argue, is one of the most essential and primary dividing lines between the Western and Arabic worlds, but one that is rarely if ever understood or acknowledged. For in accepting and adopting the concept of nothingness from their contact with India, the Arabic people, and the Islam they espoused, were in effect rejecting the Greek heritage. They were gainsaying the idea of logos and its conceptual tradition built up by the august Greeks, and gainsaying what came to be the ruling Aristotelian cosmological view, which had rejected any possibility of the void (even if, ironically, it was through medieval Arabic scholarship that Aristotle was re-introduced to the West). Islam could reconcile the idea of the nothing with the Abrahamic notion of void as it is presented in the first creation story of Genesis (the Elohimic tradition), without having to accept the Logos tradition that Christianity later appropriated from the Greeks, as in John’s reworking of Elohim’s void in John 1.1: “In the beginning was the Logos”. In permitting the void conceptually, there was thus little resistance to its use as a written symbol, and hence the zero entered into the Arabic system of numerical notation. This is the system the West inherited to replace the Roman numeral system, and still uses today. But the inheritance was not without its misgivings: originally zero, as “0”, was called the “infidel symbol”, since it admitted a concept that defied Christian orthodoxy. It was only after accounting systems required more sophisticated notation – and the rise of capitalism is extremely significant in this regard – that Western Christian resistance to the “0” eventually broke down.

Finally, in its etymological development, çifr gives way to the Latin cifra or ciphra, from which we get our word “cipher”. From cipher we get zefiro or zephiro, which in turn, through cognate Latinate languages (French, Italian), becomes “zero”. (Connected to cifra is also the French word chiffre, which means “digit”.) Nothing then becomes official, at least in terms of accounting. And it becomes acceptable, at least in terms of a workable, if still dangerous, concept.

So from both the symbol and the word, we can see that zero is not something indigenous to the Hellenised West. Moreover, the passage back to its Eastern roots is one often fraught with tension and unease, or even, as we continue to see in today’s geo-political and geo-theological world, with division and conflict.

 

(To follow up in greater detail on the idea and history of zero, there are four key texts, all of which have helped to inform the discussion here: Brian Rotman, Signifying Nothing: The Semiotics of Zero (Macmillan Press, 1987); Robert Kaplan, The Nothing That Is: A Natural History of Zero (London: Oxford University Press, 1999); Charles Seife, Zero: The Biography of a Dangerous Idea (London: Souvenir Press, 2000); and John D. Barrow, The Book of Nothing (London: Vintage, 2001).)

Note that due to holidays, it may take time for comments to be approved and responded to, but it WILL happen!

 

The Squaring of Zero, Part I

13 Wednesday Jun 2012

Posted by Andrew W. Hass in Critical Religion, University of Stirling

≈ Comments Off on The Squaring of Zero, Part I

Tags

concept of zero, crisis, culture, negation

(This is Part I of a two-part posting. Part II will appear early next month, when the opportunity for comments will be made available.)

We have been thinking in past blogs about the nature of negation, and how it has ascended into the imagination of our culture and society not necessarily as something to be scorned or regretted, but as something with which to be, in some cultural, philosophical, or even religious form, reconciled. Of course its primary symbol, in terms of production, is the figure of zero. But before we can understand how this figure might work its way into and through our present world, we need first to ask, whence zero? For its history is by no means one we might expect.

If we go back to the beginnings of scripted language and numerology, zero was not necessarily there at the outset. The ancient Egyptians developed a system of accounting based on a pictography – notation in pictures. Of course with pictographic language, a positive referent is needed to which one can point in the world. But when it comes to an understanding of nothing, pictography is ill-suited. For how does one picture nothing? The whole point of nothing is that it cannot be seen. To envision it, it must be turned into something abstract, like a concept, beyond pictures. Now we know the ancient Egyptian civilization was famed for mathematics – their pyramids proved their excellence at geometry, the configuration of shapes through mathematical precision. And yet in all this excellence, they never required zero in their computations, and therefore never developed any corresponding symbol. This says as much about their cosmological and theological understanding as it does about their mathematical acumen. For from the Book of the Dead we learn that death was not about returning to an abyssal place of nothing. Significantly, the ferryman who transported the dead soul across the river to the netherworld denied passage to anyone “who does not know the number of his fingers”. This showed the importance of accounting: as accounting was important for the Pharaohs who exacted some form of taxation upon their people, so too in death it is important to know how to account for oneself. (One must be counted, it appears, even in the afterlife.) And so there was a deliberate avoidance of nothing, because nothing troubles the system of accounting, whether financial, philosophical or religious. It is therefore not surprising that the Egyptians developed such a sophisticated technique of bodily preservation upon death. Mummification, we might say, is a gesture against the void, or it is a gesture of containment and preservation against that which negates us. The pyramids, we remember, functioned as tombs. So it is that the shape of O, as zero, figures neither in the pyramidical shape nor in the afterlife. Zero would be a perilous ticket for the ferryman.

The ancient Greeks too did not have a symbol for zero. This might seem even more incredible, since they had a distinct predilection for conceptualising. But as early as the Presocratics, those philosophers who preceded Socrates and Plato, there was a general repulsion to the concept of nothing. Parmenides, for example, talked much about the concept of a changeless One, but was adamant about the impossibility for “what is not” to exist, or even to be thought of. He therefore instructs us not to think on it. And for the most part the Greeks heeded his instruction, and shunned thinking about the nothing altogether. If we consider Greek thinking from the Presocratics onwards, we know that so much emphasis is placed on ratio, on ordering things in relation to one another. This is inherent in their term “logos”, which is accompanied by the notions of rationality and proportionality. (Ratio is part of the rational.) Reality then, underwritten as it is by logos, must remain accountable, or countable. The Pythagoreans were extreme in championing countability, to the point where reality in fact becomes number. But zero does not figure in this reality. In Greek logic (the logic of logos) zero cannot be a number as such. For the “0” introduces a void, and voids, by definition, cannot be counted. It is void of all quantification. If the cosmos is structured upon the logos, even a quasi-divinised Logos, which allows us to think rationally about it, to speak of it and (ac)count for it, it must remain positive. The idea of the nothing or of the negative cannot be part of the equation or the calculation. Thus like the Egyptians, the Greeks also did not develop any symbol for the naught in their numerology.

Nor did the Romans. Having been Hellenised by the Greeks, the Romans developed their numeral system conspicuously without any figure for zero. And this from an empire that took accounting, and indeed taxation, to new and perfected heights across an extraordinary range of geography and peoples. This absence is felt throughout Roman culture, even in something as functional as their clocks: the Roman sundials were without a zero point, which means time was always positive – a god, in fact, like the Greeks’ Chronos. This despite the fact that the sundial’s circular path outlined an “O”, the figure used elsewhere for the sign of nothing – a sign of the times to come, we might say, when the Roman numeral system proved inadequate, and the West had to turn and face its own nothing.

The Role of the University Amplified

21 Tuesday Jun 2011

Posted by Andrew W. Hass in Critical Religion, University of Stirling

≈ 1 Comment

Tags

A C Grayling, crisis, Critical Religion, culture, education, funding, government, higher education, humanities, liberal education, managerialism, politics, university

I return to the topic of the role of the University, addressed in my first blog (31 January 2011), because of several recent events. The first gave me reason for great applause: the 2011 Gifford Lecture (31st May), in the form of a one-off public seminar entitled “The Role of the University in the 21st Century”. The second gave me reason for great pause: last week’s announcement of A.C. Grayling’s new private university in London.

The first, made up of a panel of five speakers within the academy, finally began to address and debate the fundamental question of the University’s identity in our present culture and economic climate, precisely the question I had been calling for. Since others have given a synopsis of this event (see, for example, http://www.ekklesia.co.uk/node/14887), I will not go into further detail here. But it was clear in talking to colleagues and panel members afterwards that this was only a start. No solutions were proffered, no blueprints for the future drafted. This was simply an opportunity to get the central issues, beyond just the headline tag lines of cutbacks and pending HE white papers from governments, out on the table for scrutiny. And I was delighted to see such strong and passionate discussion in the form of a much-needed diagnostic.

The second, Grayling’s announcement of his New College of the Humanities, an independent, elite, for-profit university, employing high-profile lecturers across a select range of disciplines and charging fees (£18,000) double the highest rates to be charged in England under the coalition government’s recent tuition fee ceiling rise, has provoked an intense reaction from those within and without academia, and not least from those at Grayling’s own institution, Birkbeck College, University of London. There is much one could say about the reaction alone, and Grayling’s own defence, as chronicled in the Guardian. But the principle of moving towards the wholly private university here in the UK does raise some concern. The idea of an independent university is not inherently wrong; one can see many good reasons for wanting to get out of reliance on public funding and government control, especially with the growing attitudes we’ve seen in Westminster over the last several governments (regardless of party). But the long-term consequences, as we can see from the American model, would be significant: the idea of the world-renowned British university education, which has maintained some relative degree of consistency, would give way to a great disparity in HE offering, far more than what is being threatened with current coalition policy. The elite institutions would become more elite, and infinitely more expensive, while the lesser institutions would become more parochial, and more interest-driven. In America this has led to a vast institutional difference in quality between degrees with the same name, but here in Britain it would also lead to a further classism. The quality of one’s education would be so much more dependent on the money one has before a degree is even started. As much as Grayling’s new model tries to encourage equality through competitive means-tested scholarships, we all know how these work, especially in a for-profit structure: privilege begets privilege, and means-testing becomes so quickly adjusted to the higher scale of those who have gained the competitive edge through previously having more than others. Grayling’s elite college will simply become an independent Oxbridge, a Harvard or Princeton that only the wealthiest can afford. This may be what Grayling wants: a place to produce the cultural elite. But if we exclude Oxbridge, the cultural elite is not what the publicly-funded British university system was ever intended for. Its strength, at least until recently, has relied precisely on the fact that it provided a more equitable opportunity for all its citizens to be grounded in some form of tertiary education. And nowhere more so than in Scotland, where undergraduate education is still offered free of charge.

Of course, as I suggested in my January comments, the democratisation of HE on an economic model – the university understood primarily as an engine of the economy – has become self-defeating. If the State wants to invest in universities because they are seen as the chief provider of the workforce for a knowledge-based economy, then it will naturally demand more control of their output, and impose greater and greater pressure to corporatize and managerialize their systems. And by doing this, it quantifies education: in operational terms, accountability becomes predicated upon (fiscal) efficiency, while in pedagogical terms, learning and teaching become predicated upon professional ends alone, particularly towards the attainment of a salary sufficient (£21,000, under the government’s new regulations) to begin paying off the massive student debt accrued while gaining a degree. Here, economisation begets economisation: a student has no choice but to think of her or his education solely in terms of the market. But if everyone is doing this, then a simple undergraduate degree, in supply and demand logic, will begin to mean very little. The system implodes upon market saturation. And we are back asking the question: what good is a university degree for? And more fundamentally, what good is a university for?

We need to get beyond the paradigm of the university and its degrees solely as an economic good. But I am not convinced privatisation is the way forward, especially in Britain, where classism requires much less excuse to recrudesce, and would rub its hands at the thought of more private elite academies. How might the governments of the British Isles continue to think about universities in terms of publicly-funded institutions, without burdening them further with the task of chief contributor to economic development and sustainability? How might governments justify funding the HE sector, without requiring corporate accountability that necessitates fiscal streamlining and only economically viable subject areas? How might governments give back the university its historical autonomy, while still being convinced that such autonomy is a good, sound, even if not immediately quantifiable, investment?

I want here briefly to suggest four ways in which governments and academics alike might rethink their view of the university’s role, towards a more robust understanding of what overall purpose tertiary education might serve in today’s (Western) world. Each of these ways has an analogue in government thinking and policy that already exists, but thinking and policy not directly intended to maximise national economic interests. If governments were willing to place the university under these analogous policy approaches, we might extricate ourselves from the self-defeating path the present policies on HE are doomed to follow.

The first is heritage. The university has long been a place, and creation, of heritage, of preserving what has been passed on to us, and what is valuable in and of its own right. Just as the monasteries, from the 6th C onwards, and out of which the idea of a medieval university eventually grew, were the preservers of ancient texts, and the developers of skills and practices that not only aided in that preservation, but allowed the old to be appropriated in new contexts, so too our universities have been the preservers of much of our most cherished knowledge, whether textual or otherwise, and have gone out of their way to allow the old to be appropriated in the new. What if governments looked at the universities as heritage sites? The British governments fund and support heritage sites around the UK not because they produce economic wealth (though income generated from tourism is not negligible), but because they have intrinsic value that goes deep into what it means to be British (Scottish, English, Welsh, or Irish), and what it means to have a rich and unique culture. What if governments took UNESCO’s World Heritage Convention mandate – “nature conservation and the preservation of cultural properties” – and applied it to universities? Here both the sciences (natural and social) and the humanities (along with the arts) would be seen as having intrinsic worth for their own cultural sake, and not because they necessarily add to economic prosperity.

The second is cultivation. The analogue to agriculture is obvious: every nation is highly invested in developing, sustaining and renewing its natural resources, primarily to furnish its own people with the necessities for living – food, clothing and shelter – but also to bolster its own GDP through exports. In the turn towards knowledge-based economies, governments have increasingly seen the mind as a natural resource, cultivated in the classrooms of primary, secondary and tertiary education. And the mind is certainly something to be cultivated, whether for professional means or otherwise. But with growing ecological concerns, development is now having to be balanced with sustainability and renewability. Nature, we have come to realise, is not a place for pillaging or exploiting without some serious deleterious consequences. Neither is the mind. Its development needs to be balanced with ideas and skills that are not strictly for instrumental and economic ends. Think of climate change: governments invest a lot of time and money fashioning and signing treaties to limit factors seen to damage our environment, at some cost to their GDPs and GNPs. The mind, too, needs to be seen with such balance. It is not just about cultivating a task-oriented faculty, employable only in prescribed contexts with quantifiable output. It is also about cultivating an intellect and an imagination, renewable in different contexts, perhaps even at the cost of immediate quantification and utility. The Germans, those masters of instrumental engineering, but to whom we also owe the invention of the modern university, have a wonderful word for this kind of comprehensive cultivation: Bildung. It can mean not only education, but a cultivation of an inner sense of what it means to be a human being physically, psychologically, morally, and spiritually, and a social sense of how that human being should engage with the world. It links cultivation and culture through creating, shaping, maturation and harmonization. The university needs to be seen once again as a ground for this kind of cultivation, now with a certain “intellectual ecology” in place.

The third is critique. This is perhaps the least expected way to conceive of the university, but in many ways the most immediately imperative. The university needs to remain a place of critical reflection on the ways we are told reality has been in the past, reality presently is today, and reality ought to be in the future. To do this, it must retain a strong degree of autonomy or “liberation”, i.e. freedom from control by the state, business and any other extrinsic seats of authority (church, international organisations, etc.). In this sense, we need to be able to speak of the “liberal sciences” as much as the “liberal arts”. If we relinquish this autonomy, as we are being forced to do under the economisation model, what space is left to challenge the very assumptions that are being imposed upon us, that we are expected to take for granted, including the assumption that the principle role of the university is to be an engine of the economy? The site of this very blog, Critical Religion, is a good example of attempting academic critical exploration: it is not a matter of exorcizing religion as an out-moded way of thinking or practice, but on the contrary, of exercising our very conceptions of religion to see how certain thoughts and practices, which may have once been seen as exclusively religious, are entwined with other modes of thinking and practice in today’s complex world. The analogue here to government might seem difficult to ascertain, for what government invites constant critique of its own operations? But, outside of dictatorships, most governments operate with precisely such mechanisms in place. In our own parliamentary system we have an official opposition party, who sits directly opposite the government to call its thinking and policy to account. The best governments, we know, are those not with an unrestrained mandate to do whatever they wish, but those held in check by strong and responsible opposition. What, then, if governments saw the universities as a kind of shadow cabinet on world affairs, past and present? Such a cabinet may not, and perhaps should not, have direct control over those affairs, but it should have much to say about the state of their health, and should influence them accordingly.

The fourth is creativity. Here the analogue is straightforward: governments invest much in national arts organisations. And at least here in Britain, governments do not expect to have direct, or even indirect, influence on the creative processes of those organisations. What if Westminster dictated to the National Theatre exactly what kind of plays it must commission or mount each season, or restricted BBC television to shows that in no way challenged or satirised the ruling culture? We are not naïve to think there is no influence whatsoever with state-run arts in the UK. But its governments know that in granting their funding they must also grant a great deal of autonomy to each organisation, if they are to survive the market. For the creative world is not about legislation and order. It is about allowing the artist’s voice to come forward in whatever creative form he or she feels most relevant, most powerful, most penetrating. The university has always been a place of immense creativity, not only within the arts, but within all manner of disciplinary enquiry. Scientists tell us some of the greatest breakthroughs in research come through creative moments that are not hypothesised or predicted. The arts are continually reliant upon people educated in humanities subjects that have no direct utilitarian purpose, other than to expose one to aesthetic or philosophical traditions (among others) and to then encourage the development of new creative traditions, or expressions, or ways of thinking. All governments know the arts are a crucial part of the cultural fabric of any society, and British governments especially are willing to take a loss, as it were, to ensure such fabric remains rich and variegated. What if the universities were seen as part of this same cultural fabric? They might generate certain “industries” with economic benefit; but their real benefit lies in the on-going creative energy and spirit that contribute to a much wider cultivation we spoke of above. As others have said, “That capability that leads to economically significant outcomes is derivative from a deeper creativity.”♦ The sooner governments can understand and accept this, the sooner the university can function to the full extent inherent in its very name: a universe undergoing constant re-creation.

This fourfold way of rethinking the university and its purpose cannot, by any means, be exhaustive. But perhaps it might be a start for those in offices of power, and who control funding from the public purse, to understand the university beyond the restrictive, and ultimately self-defeating, parameters set by the economic and business paradigms. After all, their own governmental structures and policies allow for interests well outside the immediate generation of measurable wealth. The university needs to be part of these interests. The poets, the theologians, the philosophers, even the pure mathematicians, all keep telling us there are some things that cannot be measured. We need to safeguard, as our public duty, and not merely as our private privilege, the place where such voices can still be heard, studied, and inflected.

 

(♦ Geoffrey Boulton and Colin Lucas, “What are Universities For?” (September 2008). After I had written my January 2011 blog with an almost identical title, someone pointed this article out to me; it was written two and a half years earlier, under the auspices of LERU, the League of European Research Universities. The authors are from the University of Edinburgh and the University of Oxford respectively.)

Churches, marriage and same-sex relationships…

22 Sunday May 2011

Posted by Bashir S. in Critical Religion, University of Stirling

≈ Comments Off on Churches, marriage and same-sex relationships…

Tags

Biblical criticism, crisis, Critical Religion, culture, gender

This week, the Church of Scotland will be discussing a specially commissioned report on Same Sex Relationships and the Ministry at its General Assembly in Edinburgh. Essentially, it will be seeking to reconcile the unavoidable fact that a number of its clergy live in gay relationships they’d prefer to acknowledge openly, with its public and theological position on sexuality.

The Churches face a problem of course. Whilst our civil institutions become ever more scrupulous about anything that could constitute an obstacle to the legitimate aspirations of gay people, they remain guardians of a tradition steeped in patriarchal structures and heteronormative metaphors that raise – for those they marginalise – deeply painful issues concerning authority, identity and belonging.

In the context of much larger questions concerning the global capitalist exploitation of our environment or our failure to eradicate material poverty or even to ensure everyone has access to clean water, it is perhaps not surprising to find many people – both outside and inside the Christian community – impatient with such a ‘non-issue’. The question of whether it is right to ordain a man or woman who seeks to live openly in a stable, supportive same-sex relationship seems irrelevant to the big questions. But, of course, it is a significant point, touching as it does on the ordering of human relationships; a fundamental question of great moment in any society. In the United Kingdom and large parts of the western world, Christianity has provided the framework for domestic and sexual relationships for hundreds of years in such a way that, until very recently, people have really not had to give it much thought. Though critics from Harriet Taylor and J S Mill in the 19th century onwards have called marriage a form of female slavery, it has remained the default domestic position. More recently, legislation has loosened the bonds of women, taken away male prerogatives and allowed for an increase in non-Church weddings, contenting itself with the more neutral territory of registration; but, until now, civil society has not suggested anything substantially different from what the Church has itself prescribed. Recently, attending a lovely family wedding at a registry office in London, I was struck by how far this wedding followed the pattern of the Church weddings I’ve attended – it was a life-long, exclusive partnership in which reference was made to having and raising children. There were rings, bouquets, bridesmaids, a best man and photographers.

Yet in spite of the ritual similarities between registry office weddings and Church weddings, there are differences of course. Churches refer to ‘holy matrimony’ and seek to give significance to heterosexual relationships in very particular ways, claiming, for example, that it has been ‘instituted of God’ (Canon 31:1 of the Scottish Episcopal Church), or set up ‘for a remedy against sin’ (Book of Common Prayer, 1662). It is, in the words of the canons of the Church of England, “…according to our Lord’s teaching … a union permanent and lifelong, for better for worse, till death them do part, of one man with one woman, to the exclusion of all others on either side, for the procreation and nurture of children, for the hallowing and right direction of the natural instincts and affections, and for the mutual society, help and comfort which the one ought to have of the other, both in prosperity and adversity.” (Canon B 30).

Arguably, then, marriage as it exists across most of the western world today is still thoroughly bound up in a specific vision of social relations that might or might not be exclusively Christian in origins but which have been thoroughly Christianised. This prescribed form of human relating brings together sex, property and children under a heading of heterosexual – and thus, historically at least, hierarchical – partnership, and promotes this as the premier form of mutual human support. Other potentially supportive relations, including same-sex partnerships are bracketed off as, at best, insignificant and at worst, a matter for shame and guilt.

Yet Christian churches clearly can change as new priorities emerge. In Sweden, for example, a proposal, first brought forward in 2003, that marriage should be open to same-sex couples was initially rejected by the Central Board of the Lutheran Church of Sweden on the traditional grounds that it could only denote a relationship between a man and a woman. In 2009, however, the Theological Committee of the Church changed its view and recommended that gay couples should be allowed to marry and that priests of the Lutheran Church of Sweden could perform such weddings in their churches (see Svenskakyrkan Church Synod Liturgy Committee report 2009:2, Wedding and Marriage).

The Lutheran Church of Sweden was, of course, responding to pressure – to the changing legal position in Sweden on marriage as a civil institution. It courts criticism from Christians who believe there is a deeper or eternal order existing beyond the realm of changeable human beings – beyond changes implemented by a secular government in response to its secular concerns – to which biblical language and the traditions of the Church point. Yet Christian theology and Church order have been marked from the beginning by manifestly human heteropatriarchal social structures, inherited from the cultural milieu of the early Christian Church. Moreover, in taking such a radical step the Swedish Church has arguably put itself in a good position to act as a positive force in society, underpinning and supporting trusting relationships rather than undermining them. This too is surely something that could be aligned with the Gospel – perhaps with its refusal to make idols out of conventional family ties and responsibilities.

The murder of Osama bin Laden – the end of the beginning of the clash of civilisations?

02 Monday May 2011

Posted by Michael Marten in Critical Religion, University of Stirling

≈ 14 Comments

Tags

Africa, al Qaida, Christian, civilisation, clash of civilisations, Critical Religion, culture, global, hybridity, Muslim, Osama bin Laden, religion, South East Asia

This morning I awoke to the news that Osama bin Laden was dead, murdered by the United States of America in what appears to have been a heavily fortified compound in Pakistan; more precise details will no doubt emerge over time. The news is currently being presented in such a way as to suggest that capture, not death, was the objective, though whether that was in any way realistic is open to serious debate: surely resistance was expected, and so the statement that bin Laden ‘did resist the assault force’ should come as no great surprise.

Although bin Laden was regarded as significant in many western policy circles, serving as a very useful oppositional figure (and one we will no doubt see replaced in a short time), he was not highly regarded by most Muslims, who saw his understanding of Islam as no less abhorrent than many Christians find Hitler’s understanding of Christianity. His significance lay in substantial measure in his elevation to a position as ‘super-terrorist’ by US Presidents Clinton, Bush (the Lesser) and Obama on the one hand, and by every self-serving dictator claiming to be an ally of the USA-led actions against ‘international terror’ on the other: indeed, one might reasonably argue that bin Laden was emboldened by all the attention he received.

In substantial part this way of thinking about bin Laden arose from a racist strand of thought articulated in American neoconservative think tanks, represented most publicly in two different though related books: Francis Fukuyama’s The End of History and the Last Man and Samuel Huntington’s The Clash of Civilizations and the Remaking of World Order (Fukuyama has since distanced himself a little from his thesis, though he is still firmly in the neoconservative camp). Huntington’s book in particular has been influential well beyond its literary or intellectual merit. His thesis of distinct civilisational or religious blocs – one of them being Islam – that were in competition or even at war with one another dominated Bush’s administration, in particular as it suited his own simplistic dualism of good and evil struggling against one another. Although strenuously denied by Obama and especially by his immediate supporters, this kind of thinking has continued, albeit in more nuanced form, as the ‘drone war’ amply illustrates.

This thinking is not confined to conservative think tanks and policy-makers, however, as the cheering crowds outside the White House celebrating bin Laden’s murder demonstrate. There is clearly no understanding of bin Laden’s significance or otherwise beyond American (and to a lesser extent, European) interests, and the conflation of his thinking into ‘fundamentalist Islam’ (as Tony Blair and others called it) simply highlights the paucity of intelligent reflection and comment (for a better assessment, the Independent’s Robert Fisk offers careful engagement with bin Laden and his changing thought in The Great War for Civilisation: The Conquest of the Middle East). In fact, bin Laden’s death is largely irrelevant to most Muslims in the Middle East and South East Asia, beyond perhaps removing a stigma that had become attached to the idea of Islam – this is how we can read the Egyptian Muslim Brotherhood’s statement that bin Laden’s death has removed one of the causes of violence in the world. Bin Laden was not a cleric, had no formal training in Islamic law, spoke for no government and no substantial movement, and had few followers: it is hard to overstate his irrelevance to most Muslims, who might have agreed with his assessment of the causes of the problems faced by Muslims, but disagreed with his proposed methodology for dealing with those problems, as Tony Karon has argued. In so far as localised movements used or use the al-Qaida name, whether in Iraq, in the Arabian Peninsula or elsewhere, it was and is always as part of a nationalist or irredentist movement, riding on the coat-tails of a wealthy supporter of attacks against a perceived enemy of Islam. As the name itself suggests (it translates simply as ‘the base’), people don’t really ‘join’ al-Qaida; they simply adopt the name if it suits them at that particular moment in time.

And that is a key issue: these nationalist movements will not go away unless some meaningful compromise or agreement can be reached on the issues they are addressing. We might not sympathise with their modes of engagement, but their causes are often at least partially legitimate. None of this is about what we might think of as ‘religion’ in the sense of Islam being a key issue: these are struggles over land, rights, political engagement, freedom and the like, though they may be presented by some as being about Islam. Even bin Laden saw nationalist struggles as significant: one of his most important early demands was the removal of American troops from Saudi Arabia (he saw their presence as a violation of the land of Mecca and Medina, the two foremost holy cities in Islam), and his aim of defeating America in the same way (he claimed) he had defeated the Soviet Union was at least in part about liberating Muslims from American influence.

So if Americans and Europeans now think that they can begin to relax over the prospect of ‘international terror’, they are very much mistaken. US policy in particular is catastrophically misaligned in the Middle East, Africa and South East Asia (where the majority of the world’s Muslims live), proclaiming democracy whilst propping up regimes that clearly serve only US interests rather than the interests of the people of these countries. For those who hitherto refused to see this reality, it has been made very clear over the last year, with two key factors playing a role: the first is Wikileaks and the unprecedented insight into US policy-making that it offers, and the second is the ‘Arab spring’, as al-Jazeera elegantly calls the uprisings across the Middle East. Bin Laden was a minor, irrelevant issue in this context: he had not commented significantly on any of the current issues and had not engaged in any noticeable way with the rebellions, and so his murder, whilst perhaps a satisfying act of violent revenge for Americans, serves no useful or meaningful purpose in resolving these wider global conflicts.

After all, US and European policies towards Muslim-dominated countries in the Middle East and South East Asia are unlikely to change simply because bin Laden is now dead, and so rather than this really being the end, this is more likely to be the end of the beginning. So long as Americans and Europeans continue to think in simple dichotomies of good (us) and evil (them), advanced (us) and primitive (them), having rights (us) and threatening our rights (them), and so on, the ‘clash of civilisations’ will continue. Huntington thought he was describing a reality, when in fact he was describing a choice – in classic Marxist/Leninist terms we can see this as an ideologically-driven reversal of cause and effect designed to preserve existing systems of dominance. When viewed through a Fukuyama/Huntington lens, religion, culture and civilisations all become more important categories of analysis than they deserve to be in the wider struggle for rights, self-determination and freedom. If US and European policy continues to follow a doctrinaire view of the world as split into competing or warring blocs based on misappropriated understandings of religions, civilisations and cultures – note the plurals – rather than understanding the hybridity and connectedness underpinning our world, continuing conflict and equivalent resistance are assured. Sometimes that resistance will take the form of so-called acts of terror. Whether the tears of an Afghan mother or father mourning the death of a child in a drone attack ‘defending American freedom’ are worth the same as the tears of an American mother or father mourning the death of a child in an attack on ‘imperialist invaders’ is an active choice we make. We can make that choice and we can vote for governments that make that choice, but if we choose to prioritise our needs and our understanding of culture, religion or civilisation, then we must always expect that others will contest that. Murdering bin Laden does not help with these choices; rather it is simply more of the same: unless we make choices that subvert the dominant paradigm propagated by those who determine our countries’ foreign policy, this might just be the end of the beginning, rather than the beginning of the end of the clash of civilisations.

Sport, Politics and Religion

18 Monday Apr 2011

Posted by CRA Editor in Critical Religion, University of Stirling

≈ Comments Off on Sport, Politics and Religion

Tags

Africa, body, Critical Religion, culture, gender, Muslim, performance, politics, religion, sport, women

This blog posting comes from Colette Gilhooley, who is writing her MLitt in Postcolonial Studies under Professor David Murphy.

A combination of International Women’s Day and the anticipation of the Olympics may make this an opportune time to look at issues facing female athletes which have come to my attention recently. It has been said that Pierre de Coubertin ‘revived the Olympic Games as an instrument of reconciliation, [yet] his successors as president of the International Committee have been tireless in their insistence that ‘politics’ should not interfere with sport’ (Guttmann, 2003: 372). The Olympic Games are an opportunity for people to demonstrate their sporting abilities, to represent their countries on an international stage, and to express their identities as part of that culture, which may, I would argue, include politics. Allen Guttmann has called attention to the link made by writers between economic systems and modern sports, suggesting that ‘modern sports are an example of Weberian instrumental rationality, a subtle means of social control’ (Guttmann, 2003: 374). If this is the case, then perhaps it is not surprising that some women’s sports have been given less coverage than others, reflecting how traditionally women have had fewer economic opportunities than their male counterparts. ‘Sports are the mirror image of – rather than an emancipatory alternative to – the repressive, exploitative, achievement-orientated world of work’ (Guttmann, 2003: 374). While one can acknowledge that sports are part of a cultural and economic system which could be argued to be ‘repressive’, I would like to suggest that the work of Florence Ayisi offers an alternative to this idea.

In 2007 Florence Ayisi made a film called Zanzibar Soccer Queens, a documentary following a group of female footballers who are ‘a team of strong-willed women determined to better their lives and define new identities through playing football’. In the interviews in the film, some of the men expressed their concerns regarding the tension between the football strips the women wear and the traditions of women’s dress code within a predominantly Islamic culture. ‘The problem with women wearing shorts and exposing their bodies is that when men are watching they can be tempted,’ explains Abdallah Mzee, a Koran school teacher. The problem seems to be the male gaze and the association of football and certain sports as being predominantly male.

Allen Guttmann (2003) states that in the sexual politics of modern sports, ‘women have refused to be content with conventionally feminine sports (like tennis) and have ‘intruded’ into traditionally male sports (like rugby)’ (Guttmann, 2003: 370). He further suggests that if male sports have traditionally been an area in which to demonstrate masculine ‘physical prowess’, then women taking part in these sports should, ideally, also have the opportunity to demonstrate their physical prowess; Guttmann notes, however, that this is not the case (Guttmann, 2003: 370).

Guttmann argues that the ‘sexual politics’ of modern sport is, among other things, about the transition between the sports conventionally played by each gender and women breaking these traditional boundaries (Guttmann, 2003: 370). Mr Msoma, Chairman of the Sports Council of Zanzibar, states that certain attitudes, seemingly rooted in psychological issues and misunderstood ideas, continue to bar women’s participation in sports, and that the authorities in Zanzibar are struggling to deal with them. Playing football gives the women the opportunity to transcend the traditional gender boundaries of their culture and to redefine their identities. Warda, a midfielder in the football team, has contrasted religion and football, demonstrating the importance of both influences in her life: ‘When playing football you can say anything, but when praying you have to say what you have been told by God’. By contrasting religion and football, Warda is able to demonstrate the freedom she feels as an individual on the soccer pitch, where she is able to speak for herself, compared to the set performative practices which are part of her religion. Although some women have been discouraged from playing football, many of them see football as a therapeutic influence which has helped them to deal with the traumas in their lives. Furthermore, it has provided them with positive opportunities, including the chance to travel and learn, which will help them to break free from the oppressive patriarchal influence inherent in their culture: ‘Unveiling their soccer dreams is evidence of social change and personal development, emancipation and empowerment through sports’.

While sport can be empowering, it is not without its dangers, particularly when there is an association between sports and cultural identity. Eudy Simelane was captain and midfielder of South Africa’s women’s soccer team, Banyana Banyana. Simelane was a lesbian feminist activist who was raped and killed in 2008 by members of her town because of her sexuality. At the time the state did not recognise the practice of ‘corrective rape’ (an attempt to punish and change somebody’s sexuality through rape) or rapes that were the result of hate crimes against the homosexual community. Through her work, Simelane sought to combine politics and sport and to raise awareness of women’s rights as the first openly lesbian football player in South Africa.

Many of the reasons given in the interviews against homosexuality seem to be religious or cultural, including the threat to the traditional cultural understanding of genders and the performative roles that go with them. Homosexuality has been described as being ‘Unafrican’ and not part of South African culture; however, this raises questions about what ‘Culture’ consists of and who has the authority to decide.

Jody Kollapen, former Chair of the South African Human Rights Commission, has described culture as being ‘dynamic, our cultures have evolved over thousands of years and therefore culture has to keep up to date’. Sport and culture are indeed very closely linked, and I think it would be a missed opportunity for the Olympic Games and sport not to engage with the political aspects of culture. Sport is a platform for bringing attention to cultural issues, such as the case of Eudy Simelane, and to the very real concerns facing female athletes’ ability to realise and perform their identities through sports.

(Guttmann, Allen (2003). ‘Sport, Politics and the Engaged Historian’, Journal of Contemporary History, vol. 38, no. 3, pp. 363–375.)


The Critical Religion Association…

... an international scholarly association pioneering intellectual engagement with questions on 'religion' and related categories.

About this site

This site is mostly maintained by Dr R Nadadur Kannan. Please contact us with any queries.
You can keep in touch with our work on Twitter, on Facebook, and through our mailing list.

About the blog

The Critical Religion blog is a shared (multi-author) blog.
The views represented are the personal views of individual authors and do not represent the position of the Critical Religion Association on any particular issue.

Copyright and Funding

Please note that all text and images on this site are protected by copyright law. Blog postings and profile texts are the copyright of their respective authors. We warmly welcome links to our site: each page/blog entry includes a variety of convenient sharing tools to help with this. For more information, see the note at the bottom of this page. Please do not reproduce texts in emails or on your own site unless you have express written permission to do so (if in doubt, please contact us). Thank you.

For a note about funding, see the information at the bottom of this page.

The CRA and the CRRG

The Critical Religion Association (this website) emerged from the work of the University of Stirling's Critical Religion Research Group, created in early 2011. Interest in the CRRG grew beyond all expectations, and the staff at Stirling sought to address requests for involvement beyond Stirling by creating the CRA as an international scholarly association in November 2012. The CRRG passed on the blog and other key content to the CRA, and it is being developed further here.
The CRRG website is now devoted exclusively to the scholarly work of the staff at the University of Stirling.

Critical Religion online

Apart from this website, the Critical Religion Research Group also has accounts elsewhere online:
  • we are on Twitter;
  • we are on Facebook;
  • we have audio on Audioboo.
We will soon also offer video.
