The Spiritual Menu: An Alternative Solution to the World Religions Paradigm

Over the weekend I came across this image on the internet:

[Image: spiritual menu]

It comes from the Hotel Preston in Nashville, Tennessee.  According to a number of sources, the menu on the right (though the pillow menu looks pretty nice too) is the brainchild of Howard Jacobs, the chief operating officer of Provenance Hotels, the owner of the Hotel Preston.

Alongside the spiritual and pillow menus, the hotel’s amenities also include a pet goldfish, as well as a ‘pet spiritual menu:’

[Image: pet spiritual menu]

While this is a very clever (and more than likely quite successful) marketing scheme, the Spiritual Menu might be helpful in another way, particularly concerning how we approach and study ‘religion.’  My intention with it, then, will be to use it as an alternative methodological approach to researching and teaching ‘religion’ beyond the limits of the normative ‘World Religions Paradigm.’  To do this, however, I need to first provide some background on the latter.

The World Religions Paradigm 

When I decided to ‘return’ to school after a few years working full time, one of the first courses I took was an ‘Intro to Religion.’  Though it would become the subject to which I would devote my scholarly energies from that point on, I was a bit anxious about this course.  I had a fairly poor experience the first time I tried to attend university, and one of the first courses I took then was also on the ‘World’s Religions.’  In that version, the instructor spent most of his time showing pictures of himself standing in front of Buddhist temples.  I ended up failing the class because I stopped attending.  My second experience was much better.  In fact, I might even go so far as to blame this course for the route that my academic interests would take.  For this class we were assigned a single text: Huston Smith’s The World’s Religions.  I loved this book (and still do).  I was enthralled by Smith’s narrative, by the way he introduced ‘religion’ via stories, summarising millennia of beliefs and practices into short and practical explanations.

The text is simple: a somewhat reflexive introduction, a chapter each on Hinduism, Buddhism, Confucianism, Taoism, Islam, Judaism, Christianity, and Primal Religions, and a conclusion about the interrelatedness of religious belief, told through narratives and stories spanning thousands of years of human cultural development.

For me, the text’s simplicity was ideal as an introduction to religion.  Here were the ‘world’s religions,’ in simple prose, presented as they occurred in the real world, almost progressively, like an evolutionary system of socio-cultural belief leading toward some sort of conclusion.

A few years later, when I was working on my first Master’s degree in Religious Studies, I was introduced to Ninian Smart’s Dimensions of the Sacred, which also introduced me to his own The World’s Religions.  While the former introduced me to a theoretical world of functionalist approaches to the ‘meaning’ of religion, the latter seemed a rather more complex version of Smith’s World’s Religions.  I didn’t think much of it, but it did indeed assist me in growing my knowledge about the subject.

In fact, the trend of presenting ‘religion’ in a ‘world’s religions’ category has carried on for some time, the most recent addition being the Norton Anthology of World Religions, edited by Jack Miles, with contributive ‘chapters’ by Wendy Doniger, Donald S. Lopez, Jr., James Robson, David Biale, Lawrence S. Cunningham, and Jane Dammen McAuliffe.  The anthology itself is split into two books, with sections devoted to Hinduism, Buddhism, and Daoism (Volume One), and Judaism, Christianity, and Islam (Volume Two).  As described by the publisher, this is a “landmark work in which the six major, living, international world religions speak to readers in their own words.”

Again, this seems like a fairly straightforward text, offering primary source ‘voices’ with which to tell the story of these ‘religions.’  However, as I too came to realise over my years of studying religion, this is not without its faults.  For example, while it makes the job of teaching about religion slightly easier (if not more marketable), it also quite simplistically isolates the concept of ‘religion’ into a particular six-to-seven-part typology.  Likewise, this presents the issue of a normative or ‘western-centric’ perspective, so that ‘religion’ is defined here by our isolating it to these particular cases.  This becomes even more problematic when we begin to study ‘religious beliefs and practices’ that might not fit into these typologies, such as Scientology or ‘New Age.’  This, we might argue, moves us outside the realm of strict objectivity from the outset, underscoring our intentions with preconceived notions about what ‘religion means’ before we’ve even had the chance to discuss it.

This argument is made much better by others.  For example, Suzanne Owen published an article a few years back that I think quite nicely addresses the issues inherent in using the World Religions Paradigm.  First, her description:

For comparative purposes, scholars have placed the different manifestations of religion into various categories separated according to criteria chosen beforehand. The divisions could be decided along historical lines, e.g. ‘primitive, classical, living’, or geographically. The most popular typology divides World Religions from other religions. The World Religions generally include Buddhism, Christianity, Hinduism, Islam and Judaism, with many lists including Sikhism and also Zoroastrianism and Baha’i, organized first geographically and then historically in textbooks and most modules covering them. Other religions include various New Religious Movements and the indigenous traditions of Africa, North America, China, Japan and so on.  [Suzanne Owen, “The World Religions Paradigm: Time for a Change” (Arts and Humanities in Higher Education, Vol. 10, No. 3, 2011), 254.]

To support this description, she cites Suthren Hirst and Zavos (2005):

This model conceptualises religious ideas and practice as being configured by a series of major religious systems that can be clearly identified as having discrete characteristics. These systems are seen as existing alongside each other in a common space in the global fields of cultural, social and political life. They apparently compete, have dialogue with each other, regenerate themselves or degenerate within this space; a series of systems, then, with their own historical agency.  [J. Suthren Hirst and J. Zavos, “Riding a tiger? South Asia and the problem of ‘religion’” (Contemporary South Asia, Vol. 14, No. 1, 2005), 5.]

As she then points out, despite the number of criticisms the World Religions Paradigm has received over the last few years (decades, even), it is still thriving (exemplified by the Norton Anthology).  Here are some great examples of the criticism and discussion about it that I think are worth a listen:

A podcast interview with Jim Cox, a renowned phenomenologist discussing the use and issues of the paradigm: http://www.religiousstudiesproject.com/podcast/podcast-james-cox-on-the-world-religions-paradigm/

A roundtable discussion about the paradigm including Suzanne Owen and Jim Cox, alongside a number of academics who are quite critical of its usage: http://www.religiousstudiesproject.com/podcast/podcast-after-the-world-religion-paradigm/

Another roundtable, though perhaps not as ‘professional’ as the one above, where a group of us met a few years back to discuss it from the perspective of those in the process of working toward the PhD: http://www.religiousstudiesproject.com/podcast/roundtable-on-religious-studies-and-academic-credibility/

As well, the brilliant minds behind the Religious Studies Project (David G. Robertson and Christopher R. Cotter) have a forthcoming text on the subject set to be released in the very near future.

To conclude here, then, and thus move on to my use of the Spiritual Menu, I return once more to Suzanne Owen’s conclusion, as I think it might do a more concise job of summarising both the departmental and discursive issues in using the paradigm to teach religion:

On the whole, religious studies departments are still constrained by the World Religions paradigm for various reasons, such as the expectations of students and institutional concerns. This affects recruitment, as they continue to advertise posts for specialists in a particular religion rather than for someone who is a specialist in the study of religion. University undergraduate courses continue to teach descriptions of particular religions in turn, divided according to historical and geographical criteria. However, departments these days cannot afford to have a specialist in each of the World Religions, especially if they have to share the department with a dozen theologians and biblical scholars. Several departments are trying to find alternative approaches, but the World Religions paradigm is still growing vigorously in primary and secondary schools and thus continues to inform the non-specialists who inhabit the media and political arenas.  [Owen, “The World Religions Paradigm: Time for a Change,” 266.]

As she states here, not only does using this paradigm cause issues concerning the way that ‘religion’ is presented in the classroom, and is thus perceived by students (such as myself), it also affects the discourse beyond the classroom.  In the British case (the context within which she is writing), this translates into a public perception that further normativizes the notion that ‘religion’ is something that consists of an ‘us vs. them’ binary.  What this further produces is a somewhat inherent bias that not only raises certain ‘religious beliefs and practices’ above others, but equally denigrates those that don’t fit into this sort of typology.

The Spiritual Menu

While the Spiritual Menu appears to be yet another example of the World Religions Paradigm, I think it also provides an outlet from the issues that arise in using it.  Here’s a quick description of my argument:

While on the surface it appears to divide ‘the spiritual,’ which we might translate as ‘religion,’ along lines similar to the World Religions Paradigm’s promotion of ‘popular religions’ (Judaism, Christianity, Islam, Taoism, Hinduism, Buddhism, Scientology), it does so by means of textual narratives, such as we see in the Norton Anthology.

This is similar as well to the use of narratives in Smith’s and Smart’s World Religions.

In this way, ‘religion’ is presented via narrative representations, much like Smart’s dimensions in regard to the ‘mythology’ or ‘doctrine’ that underscores a definitive aspect of religious belief and practice.

Said otherwise, these are presented via particular discourses.

Thus, rather than seeing the texts offered in the Menu as furthering the notion that the best means of presenting ‘religion’ is through a typology divided by the World Religions Paradigm, we can instead see them as reflective of particular discourses pertaining to how individuals might define themselves ‘discursively’ via myth and doctrine.

What I might also argue from this line of thinking is that where the contributors of the Norton Anthology ‘went astray,’ beyond the idea that the religions they present have the bizarre ability to “speak to readers in their own words,” is not so much in their using discourse as a means of allowing the ‘subject’ to speak for itself, but in their isolating this discourse within a paradigm at all.

The ‘Menu’ is thus nothing more than a discursive sampler: texts that represent, on one end, the discourses we might see as ‘underscoring’ a ‘religion,’ and that, on the other, are used by individuals identifying with that ‘religion.’  In the same way, these texts are not the religion itself.  The Bhagavad Gita is not Hinduism.  What is Scientology is not Scientology.  Rather, they are narrative representations filled with language used by individuals in their processes of identity construction.  Therefore, unlike the Norton Anthology, which uses similar ‘primary sources’ to describe how a religion might ‘speak for itself,’ the use of the menu here gives us a much clearer and more nuanced look at how individuals might use a similar source in order to shape the language they use to describe themselves ‘religiously.’  In other words: a Scientologist might use What is Scientology to describe him or herself as a Scientologist; the book is a discursive tool, not the discourse itself.

Thus, again, while the means by which those who use the World Religions Paradigm present ‘religion’ is not inherently problematic, their doing so within the confines of a paradigm that provides a normative and biased position on the meaning of ‘religion’ confuses their intentions by turning their attention to the religion describing itself, rather than to religious individuals describing themselves from within the context of that religion.  This is, as well, quite contrary to the objectivity necessary for religious scholarship.

Conclusion

To conclude here, I will borrow and amend an insightful statement made by Niki Leondakis, the chief operating officer of the Kimpton Hotel chain based in San Francisco, which has also adopted the Spiritual Menu: “offering a menu that includes as many philosophies and beliefs and spiritual perspectives was much more in keeping with the culture of our company.”

Or, as I might argue: by translating the mythological and doctrinal narratives that are used by individuals in the process of their ‘religious identity construction’ as a ‘menu,’ through which they isolate their own discursive understandings of ‘religion,’ we can form a much more complex and varied person-to-person perspective on how individuals use, and thus define, the concept for their own intentions.  Which, I believe, seems much more in keeping with the culture of religious studies.

Based on ‘Real Life’

In mid-July of this year, we, as a collective human society, will have one more book to read by the novelist Harper Lee.  Written before her famous To Kill a Mockingbird, Go Set a Watchman is a ‘sequel’ to the former, set twenty years later.  While the publication of this book has brought with it a renewed interest in her writing, it has also inspired a bit of scepticism about the legality, even the morality, of publishing it (considered problematic given Lee’s presumed health issues).  Within the former category, a recent article in the New York Times caught my attention, particularly in how the author, Laura Tavares, makes use of To Kill a Mockingbird in a way that elevates it above the restrictions of mere aesthetic media.


As a contribution to the New York Times’ “The Learning Network Blog,” under the category of ‘Text to Text,’ a cross-textual discussion that links similar textual entities via shared interests, Tavares’s article associates Chapter 15 of Lee’s novel with an article on the recent Equal Justice Initiative report on lynching in America: “History of Lynchings in the South Documents Nearly 4,000 Names.”

Her intention, as described at the beginning, is to elucidate for instructors (who might be reading the article for the sake of using it in their classrooms) how such a novel might ‘speak’ to ‘real life:’

To encourage students to make these important connections, we’ve chosen to pair an excerpt from Chapter 15 of the novel with The Times’s article on the Equal Justice Initiative report, “History of Lynchings in the South Documents Nearly 4,000 Names,” with the goal of helping students more deeply understand “Mockingbird,” the world of the novel, and our own world.


Here’s a ready example.  In his “The Author as Anthropologist: Some West Indian Lessons about the Relevance of Fiction for Anthropology,” in Eduardo P. Archetti’s Exploring the Written: Anthropology and the Multiplicity of Writing, Thomas Hylland Eriksen distinguishes between two ways in which the novel might function ‘as an ethnography:’

First, novels may serve as ethnographic sources and may to this effect rank with informant’s statements. At this level, the author—whether he is a Mittelholzer or a Naipaul—more or less unwittingly reveals aspects of his society. As Bakhtin and many others have reminded us, the author is a prisoner of his own time. The author, known through the novel, is here seen as an aspect of the production of society. 

[…]

Second, novels may be read as ethnographic descriptions; that is, the information conveyed may be taken more or less at its face value, as a kind of ethnographic documentation. (191)

In this way, he continues, the novel and the ethnography are ‘relevant’ to each other, but they are not the same thing.  To further delineate his meaning here, he states:

[Novels] cannot be used as plain ethnography since they do not profess to represent the truth and because their relationship to social reality is ultimately uncertain. Besides, if they are to be exploited as ethnographic sources (and not as evidence), the reader must be familiar with the society at the outset of the reading. They cannot, therefore, replace the ethnographic footwork either. It therefore seems a paradox that some of the best anthropological writings extant on Trinidad are works of fiction (cf. Melhuus, infra, for a Mexican parallel). In order to assess their validity, a reader must have first-hand experience of the society. (190)

However, though his distinction here between the ‘source’ and the ‘description’ is a useful one in determining the ways fiction might ‘function’ beyond its existence as an aesthetic piece of entertainment, I would argue that he is incorrect in his strict separation between the ethnography and the novel.  This is especially the case with his opening remarks about the ‘simple distinction’ between the two forms of writing:

Fictional accounts, then, present persons and events which have been invented by the writer. Anthropological texts try to present a few aspects of social reality as accurately as possible, taking account of the limitations entailed by fieldwork, ‘cultural translation’ (or, if one prefers, cultural reduction) and attempts at linguistic representations of society.  Lies and deliberate misrepresentations are banished from anthropological scholarship, which should additionally—unlike fictional writing—try to present empirical material systematically and comprehensively and distinguish between description and analysis so that the reader may draw his or her own theoretical conclusions. (168-169)

I would further argue that he is quite mistaken here, particularly concerning the difference between ‘fictional’ and ‘anthropological’ accounts.  Both are artifice, meaning both are designed and dictated by choice.  Likewise, both are the result of a textual process, a ‘storytelling’ wherein the author has tried to re-create a discourse in a way that represents his or her subject in a manner ‘true’ to his or her interpretation.  In fact, I would agree in many ways with Clifford (1986) that ethnography is, in fact, a type of ‘fiction:’

To call ethnographies fictions may raise empiricist hackles. But the word as commonly used in recent textual theory has lost its connotation of falsehood, of something merely opposed to truth. It suggests the partiality of cultural and historical truths, the ways they are systematic and exclusive. Ethnographic writings can properly be called fictions in the sense of ‘something made or fashioned,’ the principal burden of the word’s Latin root, fingere. But it is important to preserve the meaning not merely of making, but also of making up, of inventing things not actually real [emphasis in original]. (Clifford, “Introduction,” Writing Culture, 1986, 6)

Beyond mere etymological determination, I think Clifford is correct here mainly because I think any and all textual representations are ‘fictional’ by their inherent ‘artificial nature.’  Eriksen can argue all he wants that fiction represents ‘lies’ or ‘deliberate misrepresentations,’ but I would again contend that this is equally a problem for the ethnographer for no other reason than the fact that he or she is, as Malinowski stated, ‘creating’ or ‘describing’ his or her subjects.  As intermediaries between subject and reader, the ethnographer is just as much an author of ‘fiction’ as the novelist inventing his or her own subjects.


Which brings me back to Tavares and Eriksen.  In my opinion, the former’s use of Lee’s novel and the latter’s differentiation between the novel as a source or a description of ethnographic ‘truth’ share the same DNA.  In fact, I’d even go so far as to state that they are both siblings of the parentage between ethnography (texts designed to present a cultural or historical representation of a certain people, time, and place) and the novel (a text designed to present a fictional creation of an author intent on representing a particular individual or individuals in a certain time and place).

However, this also brings forth an issue that I believe is perfectly exemplified by the image I used for this week’s ‘feature image:’

[Image: feature image]

While I am quite willing to blatantly claim that all textual representations are fiction by means of their ‘artifice-ness,’ this of course brings us into a discourse where, like the notion of ‘everything is fiction,’ we get somewhat distracted by what might be ‘based on real life’ and what might be a story assumed by some to be the same.  This is not equal, however, to a declaration that the story of Noah, which might be defined as both, either, or neither myth and truth, is definitively one of these things.  Rather, my point in having it here, and the point of this post in general, is a reminder that when we declare ‘everything’ as fiction because of the role that artifice plays in the creation and presentation of interpreted ‘things,’ a movie about Noah and a movie about William Wallace are equally ‘based on real life.’  In other words, the distinction between what is ‘fact’ (quantitative data about lynchings in the US) and what is ‘fictional’ (Lee’s To Kill a Mockingbird) might blur into a perception where they become equal representations of some type of ‘truth.’

I, for one, am ok with this.

Cheaters never prosper. Well, that’s not true. Sometimes they do.

I like distractions.

As I’ve mentioned earlier, I’m an ‘emotional writer,’ meaning that I have, like many I know, a certain method to my madness.  One necessity that I require is a good distraction.  Too much time, effort, and focus on one thing makes, in my opinion at least, for too myopic of a perspective.  Distractions are fun, and they break up the monotony of doing research.  It’s helpful, and I think healthy, to look away from one’s work from time to time.

Right now, of course, distractions are the last thing I need (or want, for that matter).

However, I came across something recently that I needed to discuss, even if only briefly.


One of my random sources of distraction is the inane and ridiculous website, Reddit.com.  It’s not an easy thing to describe this website, what ‘up-voting’ means, or how it affects your ‘karma.’  So I won’t here.  The best way to understand what it is would be to just go there and make sense of it in your own way.

While browsing through the ‘all’ section, I came across this ‘meme:’

[Image: confession bear meme]

To explain this for the uninitiated, this is what is called a ‘confession bear.’  When you want to confess something, such as was done here, you use this meme to do so.  I should also add that reddit is, if you want it to be, an anonymous website.

This meme also came with a discussion, like a forum.  For anyone interested in its context and contents, see here: http://www.reddit.com/r/AdviceAnimals/comments/32seka/my_mom_was_so_proud_she_made_me_oreo_pancakes_for/

In essence, this individual was admitting to having their PhD Thesis ‘ghostwritten’ for them.  While this is an interesting statement in itself, a good friend of mine sent me a link to an article (https://chronicle.com/article/The-Shadow-Scholar/125329/) written a few years ago by a gentleman who did this very thing for a living.

Published in The Chronicle of Higher Education in November of 2010, it soon became one of, if not the, most famous articles ever published by The Chronicle.  In it, the author, who refers to himself as ‘Ed Dante,’ tells us about his job, citing the number of pages he tends to write during any regular work week, how he prepares his ‘research,’ and shares some personal insights about how he came to be a plagiarist for hire.  As well, throughout the article he repeatedly refers to his most recent client, who has had the unfortunate circumstance of having an abstract he wrote for her accepted, and who now needs him to write the dissertation.  The article is an engaging read, and will likely inspire (as it did) quite a flurry of emotional responses in those who read it.  It was so successful that news agencies picked it up (like the ABC News clip below), and it was turned into a book, this time under the name ‘Dave Tomar.’  Dave, if that is his real name, is now a ‘legitimate’ author.


My interest in Dave/Ed’s story is of course piqued by the notion that any sort of writing is ‘fictional’ in its ‘made-from-ness,’ as well as whether we should consider anything he says to be ‘true,’ because, like writers of both fiction and non-fiction, he lies for a living (“all constructed truths are made possible by powerful ‘lies’ of exclusion and rhetoric […] even the best ethnographic texts—seriously true fictions—are systems, or economies, of truth”[1]).  However, I’m also intrigued by the very nature of Dave/Ed’s description, and it’s here where I think I’ll conclude this brief distraction.

I think his story speaks to an amended version of what a colleague at last year’s British Association for the Study of Religions Conference called ‘indentured academia.’  Rather than an indentured perspective on ‘being’ an academic, I think it better speaks to the notion of ‘indebted academia,’ the idea that, because academia is becoming a ‘business,’ students have no choice but to treat their education like a transaction.

In this way, it begins to seem, at times at least, all about the money.  Alongside rising tuition costs, textbook costs, etc., Dave/Ed’s story appears very much like a product of that.  Of course, while I would immediately respond to my own argument here with the counter statement that even with this more ‘financial-based’ perspective, ‘cheating’ should never be an option worth considering, it’s still a discursive influence we can’t just ignore.

We might ask, then, as costs rise, and as academia becomes more and more of a business, does the ‘value’ somehow go down?  For me, that hasn’t been the case, and cheating isn’t anything new.  Yet, even if Dave/Ed’s story is a singular example, and even if it doesn’t account for the wider meaning of ‘academia,’ it is still something that I think must be considered when we begin to examine how ‘academia’ and ‘academics’ are perceived discursively within and without the context of ‘higher education.’

[1] James Clifford, “Introduction: Partial Truths” in James Clifford and George E. Marcus, eds., Writing Culture: The Poetics and Politics of Ethnography (Berkeley: University of California Press, 1986), 7.

***I should mention that a good friend of mine, and pretty clever guy, Jonathan Tuckett just successfully defended his Thesis, and I swear this post is neither inspired by, nor is in reflection of, his achievement.  I swear.  Really.  Check him out, he writes fiction (the made-up kind, as well as the made-from)!***

Whose Story is it Anyway?

Ok.  So up at the wall Jon Snow is dealing with this election thing to decide who will be the new Lord Commander, cause the guy who was in charge before was killed at Craster’s and then Jon had to kill all those guys and get his wolf back.  Anyway, so there’s an election, and it’s between this one guy who’s been there for like eighty years or something, and this other guy who was sort of in charge and was really mean to Jon Snow, and kept making sure everyone knew he was a bastard and hated him cause he knew Jon was a better fighter.  Just as they’re about to do the election, Samwell Tarly steps up and points out all the great things that Jon has done and all the other Night’s Watch guys cheer and agree so they add Jon’s name to the election.  When they finally do vote, it comes down to a tie between Jon and the mean guy who hates him, but it’s decided by the old blind guy that used to be a Targaryen prince or something and he votes for Jon so he becomes the new Lord Commander.  But, see now he has to deal with the fact that one of the people who thinks they’re the King, Stannis, is there, and he wants Jon to help him lead an army south to take back Winterfell, but Jon wants to stay at the wall.  And then there’s the Boltons who have moved into Winterfell, and in the book Ramsay, the son of the guy who killed Robb Stark, marries this girl who they disguise as Jon’s sister Arya, in order to create some sort of political claim to their ownership of Winterfell, but Arya is actually in Braavos learning to become an assassin.  But in the TV show they change that to Sansa to keep the actress in the show, cause the books haven’t gotten to that point.  So that’s why they’re changing all the plot-lines and stuff.


The above is a paraphrased answer that I gave to someone yesterday who began our conversation with: “I’ve heard there’s a lot of changes between the books and the TV show, what’s a good example?”

Regrettably, for her, my explanation kept going for quite some time.

For those entirely ignorant of current popular culture, the television show Game of Thrones premiered the first episode of its fifth season on Sunday (or, for those who don’t mind pirating, the first four episodes, which were leaked online).  This season, as admitted by those producing and writing the series, differs more in content from the story-line in the books than the previous four seasons did.  This means that a number of characters have been dropped, and that certain story-lines have been amended or altered, including the ‘Sansa’ details mentioned above.  This has, expectedly, raised the hackles of a number of fans, to the point that names like Lady Stoneheart and Coldhands have become signifiers of anger and rage-filled disappointment.

[Images: Lady Stoneheart and Coldhands]

Here’s a good example: http://gotgifsandmusings.tumblr.com/post/115991793402/unabashed-book-snobbery-gots-10-worst

While I am not a devoted adherent to the idea that an adaptation must remain as accurate as possible to its source material, the conversations I’ve been having with friends and colleagues (including the illustrious Beth Singler, who quite helpfully pointed me in the direction of some of these sources) have indeed piqued my interest concerning the precarious notion of who gains ownership over stories when those stories get told and re-told by different people in different ways.

This likewise brings me to a passage from Clifford Geertz’s Works and Lives: The Anthropologist as Author, that I think might make some sense of this issue:

But perhaps the most intense objection, coming from all quarters, and indeed rather general to intellectual life these days, is that concentrating our gaze on the ways in which knowledge claims are advanced undermines our capacity to take any of those claims seriously.  Somehow, attention to such matters as imagery, metaphor, phraseology, or voice is supposed to lead to a corrosive relativism in which everything is but a more or less clever expression of opinion.  Ethnography becomes, it is said, a mere game of words, as poems and novels are supposed to be.  Exposing how the thing is done is to suggest that, like the lady sawed in half, it isn’t done at all. (Geertz, Works and Lives, 2).

While Geertz’s argument here is pointed at the issues we might find ourselves confronted with were we to consider the ‘literary aspects’ of ethnographic construction (everything is fiction?), I think his statement also speaks to the issues some people are having with the choices being made by the creators of the TV show.

The show is an adaptation, which also means that it is an artifice of an artifice.  It’s the interpretation of two individuals, designed for the purpose of presenting a story through an entirely different perspective.  Yet, this is not something unique to just the differences between the show and the books.  In fact, because each and every individual reading of Martin’s novels is in itself an adaptation, and since no two minds are mirror images of each other, every reading of the texts (or viewing of the show) produces yet another adaptation.  This is demonstrated by my description above.  While in my mind I can see the episode, and remember the way the texts are designed, when I précis this into a description, I have adapted the story to suit my own story-telling purposes.

I would argue that this is like revealing the ‘magic’ of the magic trick involved in any sort of story-telling, from ethnography, to fiction, to the stories we tell each other about our day-to-day existence.  Seeing how the lady is sawed in half, or rather, seeing how the illusion makes it look as if she might be sawed in half, is the same as realising that all stories, by their inherently ‘artificial’ nature, are adaptations.  In this way, there is never, nor can there ever be, a genuine ‘truth,’ an original ‘source,’ or a ‘right’ way of telling a story.  This is perhaps even more apparent when a story is an adaptation of a story.

For these reasons, I would further argue that the adaptation provided by the TV show should be seen as nothing more than just another adaptation, and therefore should not be understood as different from our individual readings of the novels.  The TV show is just another way of trying to tell Martin’s story, which is itself an adaptation of the story within his own mind.  While the TV show might look different from the novels, the novels likely look different from what’s in his mind, which is something we will never see.

This is a reminder.  It’s something we can relate to when we consider the stories we hear from others about themselves, about others, the ones we tell about ourselves, and the ones we tell about them.  In a world where everything is fiction, or rather, where everything is artifice, the notion that an adaptation is telling a story incorrectly is rather moot.  Even when the ‘original’ author might agree.  In the end, all stories are adaptations, even when they are initially told.  This also means that all stories, like a word whose meaning lies in the discourse around it rather than in its definition, are neither right nor wrong, by the mere fact that all stories are nothing more than re-tellings of a story none of us will ever see.

Thank God for Book Reviews

Other than as an assignment for courses taken long ago, I had never written a book review.  Or rather, I had never written a review for the purposes of publication.  So when I volunteered my services for the Journal of Secularism and Nonreligion, I wasn’t entirely sure what the experience, or outcome, would be.  This post is a short story about that, with a specific emphasis on three aspects of that process that stand out in my memory.


Editing

I am no stranger to editing, and I hold no envy for those who do it.

I am also, by my own admission, what I call an ’emotional writer.’  This doesn’t mean that I get ’emotionally attached’ to my writing, or that my feelings get hurt when my writing is evaluated or edited.  Rather, my writing is ’emotional’ in the sense that for me the time and place when and where the writing gets done play a large part in how I ‘do’ the writing itself.

In this way, I’ve always been keenly interested in how writers write.  I love hearing about the process, how they establish a place to write, how they do it, whether they type or write by hand, what bizarre and personal little rituals they do.  I love that kind of stuff.  I also think it tells us something quite unique and specific about the character (perhaps even identity) of that person.

For example, Hemingway was notorious for writing while standing, as well as designing the writing process in such a way as to be inspired or influenced by his surroundings.

hemingway-standing-desk

Legend tells us that a number of his novels, such as The Sun Also Rises and For Whom the Bell Tolls, were written in sections, in different countries, to convey a certain mood.

Likewise, my Thesis has been focused on certain novels by Ian McEwan, and I found myself giddily excited a few years back to find this video of him describing his writing process (with, interestingly, an embarrassed curiosity as to why people would be interested in that sort of thing).

See also this description:

When I wrote my review of Nick Spencer’s Atheists: The Origin of the Species, the writing process was divided into two parts: reading and writing.  It took a week or so to read the book, make notes, re-read sections, and formulate the structure of the review.  I made a list of important passages, as well as compiled an outline of the text itself, isolating what I thought was Spencer’s lead argument, and the basic criticisms and compliments I thought I should point out.  When I wrote the review, I created a number of drafts, making sure to return to the text to ensure my assessment was well grounded.

A few weeks after submitting the draft I received the first round of edits and suggested changes.  This was an interesting experience.  Aside from my supervisor’s interaction with the Thesis, as well as suggestions and critiques made by lecturers over the years, I’d yet to have any sort of editorial suggestions made about something I had written for publication.

At first I found myself feeling defensive about the suggestions.  ‘Why,’ I thought arrogantly, ‘would there be suggestions?!’  ‘It’s perfect!’  I then reminded myself to grow up a bit.  In fact, and in retrospect, the editorial process was quite rewarding.  The individuals involved made very distinct arguments about structure and style, and in the end I think they truly helped in making the final draft feel much more coherent. However, there was one suggestion that kept appearing that I thought interesting, and it leads to my next aspect.


Capitalisation

For whatever reason, I have found myself over the years Capitalising words or terms that really don’t need it.  This occurs most often with research fields, like ‘Religious Studies’ or ‘Ethical Criticism.’  I’m usually quite open to amending this in my writing.  However, where I will stand fast on capitalisation is in the title of things.

Throughout my research, and even throughout this blog, I have, and will, capitalise the terms ‘Atheism’ and ‘Atheist.’  As well, depending on the context, I will do the same with ‘Theism’ or ‘Theist.’  While the latter is done in direct reference to the former, it has become something that comes up time and again when people evaluate my writing.  My reasoning for capitalising the ‘A’ in Atheism is quite simple to explain.  In my research of the concept itself, I have adopted a particular methodology in order to study Atheism.  While I will likely discuss this in vivid detail in the near future, I can summarise this methodology here as follows:

rather than contribute to the present discourse on defining the term, and in that way avoid the precarious notion of stipulating what Atheism might mean to those individuals who identify themselves as ‘Atheists,’ I approach the term in a discursive manner.  What this means is that I am more interested in how individuals use the term, how it is constructed, what ‘agency’ they give to it, and how that then dictates the way it is given meaning.  I think of the term as an ’empty signifier,’ that is then ‘defined’ by the individual filling it with their particular meaning.  What this also means is that the term itself transmutes from a ‘defined thing’ into an ‘identity.’  In this way, just as we might capitalise terms like ‘Christian,’ ‘British,’ ‘American,’ or ‘Buddhist,’ so ‘Atheist’ receives the same treatment.  This likewise removes it from the category of ‘descriptive terminology’ like ‘blonde’ or ‘short.’  This does not mean, however, that I use the term in an apologetic or promotional manner.  That is, for me, capitalising the term ‘Atheism’ does not mean that I am making the argument that it is equal to ‘Christian’ in that ‘Atheist’ signifies the title of an individual who belongs to the ‘religion’ Atheism.  While that is an extremely interesting conversation I might take up (and likely will at some point), it is not my justification here.          


Copy Editing 

This brings me to my final aspect.  With the final draft submitted, and with my use of the capitalised ‘A’ in ‘Atheism’ accepted, I awaited final approval from the copy editors.

Now, as I have stated, the editorial process was a very rewarding experience, and I am truly indebted to those individuals involved.  The copy-edited alterations are another thing entirely.  Interestingly, a colleague was going through a similar experience around the same time.  For her, the final draft that she had submitted for a chapter in an encyclopaedia came back with a number of ‘re-written’ sections, including her lead argument, thus altogether changing what she had intended to say.  While my experience was in no way this drastic, it did offer an intriguing insight into the process itself.

For me, the changes that I found were mostly structure-based.  Sentences were re-written, and arguments were restructured.  Nothing was as drastic as what my friend had found.  Still, it was a bit jarring to see something I had worked on re-designed.  A similar thing happened years ago on a group project I participated in for a course about American politics in the 1960s.  The four of us involved had each elected to write about a thousand words of a group essay, which we then sent off to our group leader, who compiled it all together.  After we got the paper back a few weeks later, we all noticed that our group leader had re-written each of our contributions.  While the grade we received was not as high as we had hoped, my greatest issue with this was that the work that was evaluated under my name was not, at that point, ‘my work.’

I felt something similar with the copy-editor’s re-writes.  While the editing process at the start had humbled me about the benefits of others’ suggestions on my writing, this seemed different.  After all, since I was being critical of Spencer’s work, I felt it should be my writing, and wholly my writing, that did that.  Otherwise, I thought, it wouldn’t be fair to him.  Fortunately, when I returned the final draft with my original writing, there was no argument and the published version appeared as I had wished.  Which brings me to a conclusive point.


Conclusion

Writing this book review came at a very useful time for me.  I am quickly approaching the point where I need to submit the Thesis, and after roughly four years of working on one piece of writing, it was good to have a bit of a distraction (even though the topic was still on Atheism).  However, writing this review was not just a distraction from the Thesis, it was also a healthy reminder of some important things.

  • Now that I am reaching the end of the writing process, it is proving, perhaps for no other reason than anxiety, more and more difficult to accept criticisms about the writing.  My experience with editing the review helped with that.  It reminded me that another perspective is not only useful, but important.
  • Likewise, defending my capitalisation of ‘Atheism’ was a reminder of the methodology I had adopted for the Thesis, and seeing it written out as simplistically as possible in a brief defence helped me clarify my reasoning within the Thesis.
  • Lastly, seeing the copy-editor’s re-writes, and defending my original draft, was a reminder that the Thesis is my work.  While there have been a number of individuals who have played a major and important role in helping me get it done, when I defend it, it will be my writing and no one else’s.  Defending it as such, I would argue, is quite important.

In the end, then, writing this review helped me in a number of important ways, from distracting me from the anxieties of finishing and submitting the Thesis, to reminding me of the importance of taking advice, clarifying my argument, and defending my finished product.  For these reasons, I think it is perfectly fair to say: ‘thank God for that.’

‘Statistics can prove anything’ (and other fictions used by New Atheists)

As I mentioned in my previous post, I’ve been spending my time recently trying to find video clips that might represent a ‘New Atheist discourse.’  This week, I selected a few examples in order to demonstrate how one might do bad scholarship.  In fact, this post is perhaps just a continuation of my previous one on locating New Atheism within a discursive boundary demarcated by an ‘asshole’ mentality.  Does this mean I think ‘bad scholars’ are ‘assholes?’  Maybe.  It varies from day to day, and from scholar to scholar.


Let’s begin with Richard Dawkins.  In the following clip Professor Dawkins is listing off a number of ‘religious things’ that offend him.  Or, rather, that should offend us.  While we might all ‘generally agree’ that these are certainly things we should be offended by, at least when judged against our own contextual boundaries of ‘ethics’ and ‘morality,’ it is the way that he shapes his argument that shines through for me.

In the clip, he displays horrible atrocity after horrible atrocity while deftly (or, as he says it, ‘logically’) associating these things with ‘religious thinking.’  Then, he casually moves his argument toward a quote from Martin Amis, a rhetorical question about what a ‘secularist’ would shout when ‘cutting off an infidel’s head.’  To answer this question, he cites a ‘critic’ of Amis’ book, who responds: “they shout ‘Heil Hitler.'”  Now, his argument is about Hitler, focused on whether or not he was an ‘Atheist,’ to which he easily declares that he was a Catholic.  Or, at least his soldiers were.  Likewise, he adds, even if he was an Atheist, that shouldn’t matter.  He was also a vegetarian.  He asks: “Does that suggest that vegetarians have a special tendency to be murderous, bigoted racists?”  To which the audience giggles at such audacity.

From here, he easily slides into his conclusive point:

The point is, that there is a logical pathway leading from religion to the committing of atrocities.  It’s perfectly logical, if you believe that your religion is the right one, you believe that your God is the only God, and you believe that your God has ordered you through a priest or through a Holy Book, to kill somebody, to blow somebody up, to fly a plane into a skyscraper, then you are doing a righteous act.  You’re a good person.  You’re following your religious morality.  There is no such logical pathway leading from Atheism and secularism to any such atrocious act.  It just doesn’t follow.  

Based on the evidence he provides, his argument appears sound.  Of course, were we to test his argument against historical facts, not only would much of what he says be logically unsound, it would also be questionable on a vast number of theoretical points. For these reasons, we might ask, why did he not supply more evidence?  Or conflicting evidence?  Why would he use Hitler as an example, if the association of Hitler to ‘secularism’ is a “truly outrageous thing to say?”  As well, could we not simply take his argument about the ‘logical pathway’ leading from ‘religion’ to the committing of atrocities and relate it to another evidential example, such as the State Atheism (Communism) of Albania, China, and Cuba, or the ‘secular revolutions’ of France and Mexico?  Of course, to do this, we might be forced to define, via discursive and lexical examples, how those ‘Atheisms’ might represent some sort of ‘Atheism’ that we could then relate to that found in Britain or the United States.  Then again, this takes work, to which, for logical reasons, I doubt Professor Dawkins is willing to commit.  When it comes to this subject, he is a bad scholar.


I would argue that this is another trait of the ‘New Atheism’ presented by Dawkins, Harris, and a few others, such as Bill Maher.  While Maher has been an Atheist advocate for some time now, his discursive alignment with the Atheism representative of Dawkins, et al. is established not only by their shared argumentation, but by their bad scholarship as well.

Here’s a handy example.

In an interview with Charlie Rose in 2014, Maher presents his position on religion (‘they’re all stupid’), and then turns the conversation (with the help of Charlie) toward religious violence.  The essential point I will be focusing on here is a citation he makes to support his statement that Muslims, in general, ‘condone violence.’ When asked by Charlie, “how do you know that,” he states:

There’s a Pew Poll of Egypt done a few years ago, 82 percent I think it was, said, uh, stoning was the appropriate punishment for adultery.  Over 80 percent thought, uh, death was the appropriate punishment for leaving the Muslim religion.  

This statement is made at the 2:23 mark.

This citation returns a bit later on his own show on HBO, Real Time with Bill Maher.  His guests for this episode are Sam Harris, Ben Affleck, Michael Steele, and Nicholas Kristof.  Again, the discussion is on religious violence, predominantly about ISIS, and the way that ‘liberals’ are failing to control this sort of ‘fundamentalism.’  The section of this clip that I will focus on here comes toward the end of the debate, after Ben Affleck attempts to convey his own argument that Maher and Harris are simply being racially insensitive, and stretching their ideologies beyond reasonable limits.  In reaction, Maher states:

No it’s not.  It’s based on facts.  I can show you a Pew Poll of Egyptians–they are not outliers in the Muslim world–that say like 90 percent of them believe that death is the appropriate response to leaving the religion.  If 90 percent of Brazilians thought that death was the appropriate response to leaving Catholicism, you would think it was a bigger deal.  

The specific statement occurs at the 8:02 mark.

While many of the rejoinders to Maher’s and Harris’ statements here about their ‘caricatures’ of religious individuals, as well as connections of this sort of language to ‘white racism against blacks’ in the United States, are indeed poignant responses, it is the statistics that Maher is using to defend his position that I think stand out the most.

On two occasions, when arguing that Muslims (or ‘Islam,’ since he tends to refer to the religion, rather than to religious individuals) are inherently violent, he has done so based on ‘facts.’  This is, of course, a good means of argumentation.  It gives credence to one’s position, and grounds the statements made in a foundation of ‘truth.’

Of course, this only tends to work when those ‘facts’ remain details without any further elucidation.

With this in mind, and in considering that he has based his argument that ‘Islam’ is violent because a Pew Poll stated that a vast majority felt a certain way about using violence, let’s turn now to those facts themselves.

The poll cited was a part of a Pew Forum report on The World’s Muslims: Religion, Politics, and Society conducted in 2011.  The section Maher is referencing is Chapter 1: Beliefs about Sharia.  In the first clip, he references the results about stoning as a punishment for adultery.  Here is an image of that data:

Screen Shot 2015-03-31 at 22.09.34

In the first and the second clip, he refers as well to the punishment that should be given for leaving the Muslim faith.  Here is that data:

Screen Shot 2015-03-31 at 22.11.07

While he is incorrect about the exact percentage, he is still fairly close to the actual numbers.  Nevertheless, here is quantitative data, ‘fact,’ that he can use to support his argument.  From this, we might agree that, yes, it seems that up to 80 or 90 percent of Egyptians believe the death penalty is the appropriate punishment for adultery or leaving the faith.  He would be correct, then, in saying that, as an example, this supports the idea that ‘Islam’ is violent.

However, his statement begins to turn into a ‘caricature’ when we look at the actual numbers polled.  Here is an image taken of Appendix C: Survey Method that gives the specifics about the data itself:

Screen Shot 2015-03-31 at 22.15.17

Likewise, here is an image of the sample sizes that make up the data:

Screen Shot 2015-03-31 at 22.16.11

As we can see, the data represents only a sample, meaning it should not be used as a representation of all people in Egypt, let alone all Muslims.  Furthermore, this is data based on face-to-face interactions over a single month.  It would be impossible in that time to actually poll the roughly 79,000,000 people who lived in Egypt in 2011.  As well, the two responses concerning the death penalty for adultery and leaving the faith are based on individuals who ‘say sharia should be the law of the land.’  These details do not support his argument in the way that he has made it.

I should add here that this is not meant as a critique of this sort of data, nor of the utility in gathering and using such quantitative information.  Rather, I’d argue it provides a bit of a caveat about misusing it.  This is evinced best, I think, by how Maher does exactly that.  That is, if he were to re-word his statement, the resulting conversation might be a bit more effective.  Perhaps something like this:

Of the Egyptian people polled in 2011 who thought Sharia should be the law of the land, 80-90 percent of them stated the death penalty was an appropriate response to actions we here in America might find unethical because of our differing political context.  While this is merely a sample size and should not be considered a representation of the 79 million people living in Egypt, let alone the billion Muslims worldwide, I still think it provides an interesting entry point to a discussion we could have on the way this sort of discourse might influence how Islam is perceived, both in and out of the context in which these answers are given.   

Instead, he chose a different approach:

It’s the only religion that acts like the mafia, that will fucking kill you if you say the wrong thing, draw the wrong picture, or write the wrong book (6:40 in the third clip).

Again, this is bad scholarship.


The caricatures created by both Dawkins and Maher in these examples reflect a certain type of discourse.  This is, I’d argue, a result of the way they use ‘data’ and ‘facts’ to support their argument, rather than the other way around, which additionally damages what they have to say.  Instead, they come across as equally fundamentalist in their thinking as the people they are arguing against, using bad scholarship to support their opinions.  In this same way they are telling a particular story about themselves, and about how they construct their discourse, much in the same way the individuals who responded to the Egyptian poll have provided a certain story that we might, were we so inclined, use to interpret them in an equally general (and incorrect) manner.

Comedic Criticism: A Discursive Source of Atheism

In our tutorials for Atheism in Debate this last week we discussed Feuerbach.  The week before that was Strauss, and before that was Hegel.  Understandably, it’s usually around this point where the energy of the course begins to wane.  In order to try and remedy this, I tend to use video clips, usually of one of the four ‘New Atheists,’ to break up the monotony of just talking about the reading.  For this round of clips I tried to find ways to connect the ‘anthropomorphism’ of Feuerbach’s deconstructive theory about religion being ‘human nature reflected, mirrored in itself,’ with the way Dawkins, Harris, and Hitchens diminish religion to infantile self-creations.  For those interested, these are the clips that I chose:

As I was searching for these I came across this interesting video:

Here was a listicle of ‘Generation Xero Film’s’ “Top Ten Anti-Religion Comedy Routines.”  This got me thinking.  What is the difference between these ‘comedy routines’ and the statements being made by the ‘New Atheists?’  Are they not equally ‘scripted’ critiques of religion?  Do they not function the same way as the rhetorical use of the ‘Atheist discourse‘ being presented by Dawkins, Harris, and Hitchens?

I thought I’d look into this a bit more.

I came across the work of Patrick McKearney at the University of Cambridge who, for a few years, was the ‘Atheist comedy guy.’  Aside from the four conference presentations he gave on the subject (“Public Belief and Civil Society: A Case-Study of Contemporary Anti-Religious Stand-Up Comedy;” “The Ridicule of Religion in Contemporary British and Irish Stand-Up Comedy;” “‘What are you laughing at?’ The Role of Ridicule in Non-religious Identity Formation;” “Methods for Investigating Non-religiosity in Stand-up Comedy”), he also participated in a BBC 4 discussion on Comedy and Religion, and published two articles on the subject in The Guardian (“Heard the One about the Pope?”) and Varsity, the independent student newspaper for the University of Cambridge (“Slap in the Faith“).  The latter is focused on issues of comedic criticism and the reactions we might see in fundamentalist religion striking back (such as we saw with the attacks against Charlie Hebdo a few months back).

Likewise, my good friend Katie Aston deals with this a little bit in her Doctoral Thesis.

So how might these comedic criticisms present a useful example of an Atheist discourse?  I believe the answer lies in some specificity.  For pragmatic reasons, then, I will be using two methodological points made by Norman Fairclough in his Analysing Discourse (2003).

First, in consideration of the utility of discourse analysis in the study of texts, let’s broaden our conception of that term itself:

“written and printed texts such as shopping lists and newspaper articles are ‘texts’, but so also are transcripts of (spoken) conversations and interviews, as well as television programmes and web-pages” (Fairclough, 2003, 4).  

In this way, these video clips, as edited versions of the stand-up comedian’s routine, are texts, filled with, and exemplary of, particular ‘language in use.’  In other words: ‘discourse.’

Second, let’s specify how we might more directly consider these texts via a three-part interpretation:

“the production of the text, the text itself, and the reception of the text” (Fairclough, 2003, 10) 

In this way, we can be a bit more specific about the discourse being used, as well as establish a contextual boundary within which it emerged, was presented, and subsequently received.

These things established, let’s look at three examples, two of which were also on ‘Generation Xero Film’s’ “Top Ten Anti-Religion Comedy Routines.”

The first comes from Ricky Gervais, and focuses on a critical analysis of the Biblical story of Noah’s Ark:

The second comes from Bill Maher, and focuses on examples of religion ‘doing harm:’

The third, and perhaps most famous, comes from George Carlin, and focuses on religion as ‘bullshit:’

From out of a cursory analysis of these three clips as ‘texts,’ we can establish a number of discursive specifics:

  • Each is reactionary, and thus presents a criticism directed at a particular subject.
    • The first (Gervais) presents a critical assessment of the fictionality and inherent unbelievability of a Biblical myth through the lens of modernity.
    • The second (Maher) is directed at issues of morality, and the fact, as he sees it, that ‘religion’ is harmful and immoral.
    • The third (Carlin), like Maher, presents a critical assessment of the harmful and equally immoral dangers of religion/religious belief (though with the caveat that his ‘Sun Worship’ (not ‘prayer-to’) is still practical).
  • The ‘religion’ of their collective criticisms is somewhat vague, though we can presume via their language they are reacting against a particular monotheism, likely Christianity (though Maher intermixes this with critiques of Islam).
  • While seemingly problematic, these differences tell us a great deal about their contextual discursive language use.  Gervais’ routine was given in 2010, the same year as Maher’s.  Carlin’s routine comes from 1999.  So, we might concede that Gervais’ and Maher’s routines stem from a ‘New Atheist,’ or post-September 11th discourse, though that might be presuming a bit much.
  • However, simply as ‘texts,’ they do not tell us much about their ‘Atheisms.’  While we might assume (or presume) that they are being inherently ‘Atheist’ by means of their criticisms, this is not as specific as, say, an informant telling us about his or her ‘Atheist identity,’ and how he or she has constructed that identity in a specific way.

So how might we use them as textual discursive sources?  By taking up Fairclough’s three-part interpretive method, we can begin to shift them from mere textual examples to more direct discursive ones.

  1. Learning about how they were produced (written), we can learn a great deal about the individuals doing the writing, the context in which that writing took place, the type of Atheism they themselves identify with, and the influences that shaped their texts based on that type of Atheism.
  2. Then, our cursory analysis (such as above) becomes a bit more nuanced.
  3. Finally, we can look at how they are received by individuals (audience or viewers) who equally identify as ‘Atheist,’ while equally deciphering how these texts assist these individuals in their own identity constructions.

By weaving these together, we begin to form a much clearer (in my opinion, at least) conception of ‘Atheism,’ such as we might use to better understand the discursive elements that influence the New Atheist clips presented above.  While this isn’t a better means of approach than conducting interviews and ethnographically shaping a textual representation, as a means of understanding the discourse that might underscore or influence the identities that make up such an ethnographic textual representation, this seems quite beneficial.  Likewise, I believe this works much better than merely speculating or theoretically stipulating what we think these sorts of things (like Atheism) mean, and is therefore a much more useful (and, to be honest, more enjoyable) means of researching precarious concepts such as ‘religion’ or ‘Atheism.’


Norman Fairclough, Analysing Discourse: Textual Analysis for Social Research (London: Routledge, 2003).

*As an extra bonus, here is an animated version of Louis CK (who is not an Atheist) talking about ‘God as a shitty girlfriend,’ and the oddity of ‘saying Jesus Christ with a shitty attitude.’

Tell me about your mother, Mr. Hubbard.

For some time now, I’ve maintained the ridiculous notion (fiction?) that I’ve never been, nor ever will be, a ‘good academic’ because I tend to approach my subject with a bit more creativity or invention than one might presume of the common or proper academic.  I refer to this as ‘ridiculous’ because many of the researchers I’ve come to know over the years are just as creative.  For example, consider the digital anthropology of Beth Singler, the conspiracist demythologisation of David Robertson, Venetia Robertson‘s research on human-animal relations, the ‘invented religions’ of Carole Cusack, and Vivian Asimos‘ research on folklore, myth, and video games.

I am by no means unique in my creativity (even when I argue that ethnography and novels are equally ‘fictitious‘ by means of their shared literary qualities). However, this does not mean that my creative approach hasn’t produced a few truly bizarre interpretations (such as my recent takes on New Atheism or the Secularization Thesis).

In fact, while I was recently thinking about my attempts to push the boundaries of academic creativity, I suddenly remembered a paper I submitted years ago for a course on Method and Theory in the Study of Religion.  This was, in memory, a rather unorthodox paper, and the more I thought about it, the more I asked myself if it was as crazy as I was remembering it.  Sure, the memory of the topic seemed sort of ridiculous, but could it have been that bad?  I passed the course, after all.  It must have had some merit to it, or, if nothing else, I must have been at least marginally successful in touching on the important points required of the assignment.

After some digging around, I found the paper and read through it.  Not only was I appalled by the horrific work that I had submitted for a grade, but I was equally dumbfounded that I had actually passed.  The ‘creativity’ was, though based in good intentions, less creative than it was inane.

Yet, a part of my subconscious came to my own defence.  It argued proudly for the logic behind the correlations I was making, so much so that it began to win.  I read through the paper again.  Suddenly, somewhere within the conclusion, it struck me: this paper isn’t as bad as I had originally thought.  Rather, when I wrote it, I was simply unconscious of the role stories play in determining how we might perceive things wholly separate from each other as connected by a narrative bridge.  This post is a short defence of that notion.

BEFORE CONTINUING HERE, A BRIEF, YET ESSENTIAL CAVEAT.

I’M NOT WEALTHY, NOR DO I HAVE CHILDREN TO BARTER.

PLEASE, PLEASE, DON’T SUE ME.

Our assignment was to discuss two of the scholars of religion we had been studying throughout the semester in the context of their notions about the origins, and/or uses, of a particular religious system.  For my paper I chose Freud and Durkheim.

However, rather than simply do the assignment as asked, my ‘creative’ approach tried to use Freud’s psychoanalysis to make sense of Durkheim’s sociology (though I’ve come to realise in hindsight that this would have worked much better with Weber and his father issues).  The religious system I chose was Scientology.

While the paper I eventually wrote did a fairly decent job breaking down how the theories espoused by Freud and Durkheim tried to ‘make sense’ of religious belief, my approach to Scientology (the majority of which leaned more on the Freudian perspective than the Durkheimian one) was a bit more specific.  In essence, I tried to relate Freud’s ‘Psychic Apparatus’ (Id, Ego, and Superego) to L. Ron Hubbard’s equally tripartite notion of Body, Mind, and Thetan.

In brief, here is a précis of each:

Id: the unconscious, functions on instinct, and seeks pleasure regardless of any sort of consequences

Ego: an intermediary between the Id and the rational world, it translates reality in the Id’s search for pleasure; like a man (ego) on horseback (id), the ego “has to hold in check the superior strength of the horse” (Freud, The Ego and the Id, 15).

Superego: Constructed by one’s social context, it regulates the impulses of the Id; consisting of the ‘conscience’ and the ‘ideal self,’ it dictates and causes guilt, as well as establishes goals and aspirations.  

Body: The body is the organized physical composition or substance of Man, whether living or dead. It is not the being himself.

Mind: the mind, which consists essentially of pictures.

Thetan: Of the three parts of Man, the thetan is, obviously, most important. Without the thetan, there would be no mind or animation in the body. While without a body or a mind, there is still animation and life in the thetan.

(These definitions come from Scientology.org, and the video they provide about these three elements does a much better job at describing this than I do.)

If there was a correlation to be made here, I argued, it might perhaps be found somewhere between Hubbard’s description of the Thetan’s ability to use its mind “as a control system between itself and the physical universe” (‘Thetan,’ Scientology.org), and Freud’s notion that the Ego and Superego function as two different sorts of filters through which the individual might maintain the desires and impulses of the Id.  Or, perhaps there might be a connection between the Body and Mind of Hubbard’s system and the unconscious of the Id and conscience of the Superego.  Or perhaps a myriad of other correlations.

This is where my paper seemed to get a bit sidetracked, mostly because (as the person writing this would tell the person who wrote that) finding correlations like this is not really all that necessary.

Rather, what I’d likely tell myself, were I to talk to that individual, is that though these systems might seem relatable in a number of ways, and though the criticism Scientologists direct toward ‘psychology’ would definitely present an ironic sort of correlation (“Psychology, for instance, had worked itself into a dead end. Having no concept of the existence of an animating factor to life, it had degenerated into a practice devoted solely to the creation of an effect on living forms.”), they might be better understood were ‘we’ to merely see them as two similar sorts of stories used by two men in their attempts at making sense of their worlds.

That is, while it might be ‘creative’ or even ‘clever’ to interpret Hubbard’s tripartite system via Freud’s, the outcome would likely only provide a solution inherently built upon an opinion of either man’s system.  Which, to me, doesn’t really seem very creative at all.  Instead, as narrative devices, as stories that tell us something about how these men interpreted their world, and thus in turn tell us something about them personally, they function on an entirely different spectrum of criticism.  Thus, rather than merely trying to connect dots that might creatively lead us to some sort of conclusion, using these narratives to make sense of the individuals who told them, as well as the individuals who use them, becomes that much more useful than even the most pragmatic attempts at comparing like with like.

The Zombie Apocalypse Secularization Thesis

In 1967, Peter Berger published The Sacred Canopy: Elements of a Sociological Theory of Religion.  In 1968, George A. Romero released his film, Night of the Living Dead.  While these two pieces of cultural insight might not seem linked in any sort of comparative way, an argument can be made that they do, in fact, share similar discursive perspectives concerning the theory of secularisation.  The intent of this post is to discuss, as well as defend, those similarities.

The Secularisation Thesis (in brief)

The idea that religion would eventually wane in social importance as humanity moved closer and closer to ‘modernity’ has its roots in the theoretical conclusions of Freud, Weber, and Durkheim.  Arguing that religion would eventually lose its social significance by means of modernisation, there eventually arose a Theory (or, rather, Thesis) concerning secularisation.

Heavily determined by the notion of ‘differentiation,’ of the divorce between religious/state cohabitation, this thesis is perhaps best summarized by Casanova (1994), who defines it thus:

the conceptualization of the process of societal modernization as a process of functional differentiation and emancipation of the secular spheres—primarily the state, the economy, and science—from the religious sphere and the concomitant differentiation and the specialization of religion within its own newly found religious sphere [Casanova, 1994, 20.]  

Adding to this definition the equally essential sub-theses of ‘decline’ and ‘privatization’—the idea that ‘secularization’ would eventually lead to a ‘progressive shrinkage’ of public religion until it ultimately disappeared—Casanova’s is merely a contribution to this particular discourse, linked back to a mid-twentieth-century re-conceptualization.

As Berger (1990) states, its original significance was to signify the “removal of territory or property from the control of ecclesiastical authorities” (Berger, 1990, 106), such as in the wake of the Wars of Religion in the sixteenth and seventeenth centuries, or more specifically, the “return to the ‘world’ of a person in orders” under Roman canon law (Ibid.).  Under the influence of academic criticism, the term has come to embody a number of differing ‘types’ of secularization, all inherently anchored to a ‘modernized’ resolution of societal differentiation from religious authority.

As a central voice in this discussion, Berger’s (1967) early definition, “the process by which sectors of society and culture are removed from the domination of religious institutions and symbols” (Berger, 1967, 107), has often been considered the foundation for the thesis in general, and has thus been amended on a number of occasions.

For instance, by citing the term’s reliance on ‘modernism,’ Bruce (2002) defends the thesis’ validity by pointing out the modern value bestowed upon certain ‘non-religious roles and institutions,’ which he sees as equally signaling the ‘declining importance’ and ‘social standing’ of religion, particularly in the extent to which people ‘engage in’ or ‘display’ their beliefs publicly (Bruce, 2002, 3).  Additionally, and in localizing this discourse into a more specific ‘Christian’ context, Smith (2010) and Martin ([1969] 1978) remove the concept of ‘secularization’ from the more general framework of ‘religion.’  Smith offers a more theologically centered insistence, to the point of stating that secularization is nothing more than the “latest expression of the Christian religion” (Smith, 2010, 2), so that what we perceive as the differentiation or decline of ‘religion’ is merely a representation of what he calls the ‘fluid,’ and ‘evolving’ identity of “Christian ethics shorn of its doctrine” (Ibid. 7).

Martin, whose conception might be considered just as ‘foundational’ as Berger’s, offers a more general theory, framed within what he refers to as a ‘Christian ambit’ (Martin, 1978, 2).  Built upon the results of certain antecedent processes or ‘crucial events’—the English Civil War (1642-60), the American Revolution (1776), the French Revolution (1789), and the Russian Revolution (1917) (Ibid. 4-5)—this ambit forms a type of ‘continua’ that, with respect to individualism, pluralism, and modernity, he marks as being uniquely influential on later thought systems “most central for the secularization process” (Ibid. 8-9).

Taken up in later discussions pertaining to cultural issues beyond just significant ‘shifts in mood’ (see Davie, 1994, 4) concerning the place of religion in people’s lives, such as gender-specific discursive changes about the roles played by men and women in relation to ‘piety’ (Brown, 2001, 9-10), or the inherent ‘double burden’ between in-, and out-of-home female labor (Woodhead, 2008, 189), this discourse has substantial social scientific roots as well.

Zuckerman (2007) aptly summarizes this influence, which I believe is worth citing in full:

Norris and Inglehart (2004) found that 39 percent of those in Britain do not believe in God. According to a 2004 survey commissioned by the BBC, 44 percent of the British do not believe in God. According to Greeley (2003), 31 percent self-identify as ‘atheist.’ According to Bruce (2002), 10 percent of the British self-identify as an ‘agnostic person’ and 8 percent as a ‘convinced atheist,’ with an additional 21 percent choosing ‘not a religious person.’ According to Froese (2001), 32 percent of the British are atheist or agnostic. According to Gallup and Lindsay (1992:121), 29 percent of the British do not believe in God or a ‘Higher Power’ (Zuckerman, 2004, 49)[3]

Secularisation is thus perhaps best described by Martin (2005) as a “semantically rich, contradictory, and paradoxical” concept (Martin, 2005, 58).

For my rather playful use of it here, let us pretend that the Secularisation Thesis is a means to decipher a discursive shift from the ‘religious’ to the ‘secular,’ which, in this particular context, signifies a transition from ‘myth’ to ‘science.’  Now, to the zombies.

Zombie Narratives (films)

My reference to Romero’s Night of the Living Dead above is no accident.  First, as a film it marks the sort of narrative I have chosen to focus on herein.  This is not meant to ‘dismiss’ any other sort of narrative example, such as novels or other media, but reflects a conscious choice toward specificity.  Second, because this film was released in the late 1960s, it equally demonstrates a liminal stage in the transition I have chosen to focus on in using zombie narratives as representatives of the Secularisation Thesis.

Prior to this film’s release, zombie films were quite ambiguous about the term itself, to the point that we can isolate two specific categories:

Alien-based:

Creature with the Atom Brain (1955)

Plan 9 From Outer Space (1959)

Invisible Invaders (1959)

Astro-Zombies (1968)

Voodoo-based:

White Zombie (1932)

King of the Zombies (1941)

Voodoo Man (1944)

Voodoo Island (1957)

Zombies of Mora Tau (1958)

The Woman Eater (1958)

I Eat Your Skin (1964)

While the former category appears to be the result of discursive influences concerning UFOs and Aliens (see David G. Robertson’s work for everything UFO based), the latter seems heavily influenced by an almost orientalist interest in the ‘unknown’ beliefs and practices of Haitian, Creole, and Caribbean religion.  For more on this, check out Sheller’s Consuming the Caribbean.

What stands out, then, with Romero’s Night of the Living Dead is that the zombies are never exactly explained (though we might also add here that they are ‘living dead,’ rather than merely under some form of mind control).  That is, even though in the film an emergency broadcast tells us that there are a number of assumptions about their origins, including a theory concerning radiation from outer space, it is never made apparent.  In fact, this lack of explanation carries on throughout Romero’s ‘Living Dead Series:’ Dawn of the Dead (1978), Day of the Dead (1985), Land of the Dead (2005), Diary of the Dead (2007), and Survival of the Dead (2010).

These films are liminal examples, representing a point between the myth-based films wherein the zombies are the result of either alien or religious practices, and those that have been produced more recently.  Moreover, we might even trace this transition to a single origin, out of which has spawned a whole new (secularised) genre characteristic.

In 2002, Danny Boyle’s 28 Days Later re-invigorated the zombie genre, while at the same time infecting the narrative with an explanation of zombiism in distinctly medical terms.  While not a ‘zombie film’ in the sense that the creatures hunting the ‘living’ are infected with a significantly violent strain of rabies, rather than being ‘undead,’ the use of a viral infection as the source of the quasi-zombiism shifts the discourse from mythical to scientific.

Now, terms like ‘plague’ or ‘outbreak’ are used to describe the ‘zombie apocalypse,’ and epidemiology is often used as a narrative device as survivors search for a cure.  After 28 Days Later, this discourse pervades nearly every zombie film: Resident Evil (2002) and its many sequels, though it was based on the video game first released in 1996; Shaun of the Dead (2004); REC (2007), as well as its American re-make Quarantine (2008); and World War Z (2013), which was based on the phenomenal book of the same name.

In fact, even remakes of older films adopt this discourse: I am Legend (2007), which was based on the book by Richard Matheson and the films The Last Man on Earth (1964) and The Omega Man (1971), finds Will Smith’s Robert Neville, a lone virologist, trying to ‘cure’ the disease plaguing the mutated, seemingly zombified, remnants of humankind; the 2008 ‘loose’ remake of Romero’s Day of the Dead, in which a viral outbreak has created ‘zombie-like’ creatures; and, perhaps most importantly, the incredibly popular The Walking Dead series, based on the graphic novel of the same name.

The latter is an important addition here, not just because of its massive popularity, but because of its longevity.  Unlike the films cited here, which are given a limited amount of time to tell their stories, The Walking Dead has had five seasons, and is scheduled for a good deal more.  It even has a spin-off, set in Los Angeles, that has begun shooting.  What this equally means is that the show, which quite occasionally veers away from the source material, has taken the opportunity to explain, in detail, just where the ‘zombie disease’ came from, and how it affects us all.  In the first season finale, “TS-19,” the zombie virus is revealed to us via an explanation from a lone CDC medical technician working tirelessly to cure the disease, to no avail.  He gives us vivid details about it, even showing us how it works, using scanned footage of his wife’s brain as she died, and then re-animated.  The name of the virus gives the episode its title.  The website Nerdist.com has even provided a scientific explanation beyond the limits of the show’s dialogue, which can be viewed here:

https://www.youtube.com/embed/HG9BbvW2tco?feature=player_detailpage

With these examples we can trace a very distinct discursive shift, from mythological to empirical, a shift that, via zombie narratives, provides for us a unique perspective on the manner with which secularisation might manifest itself.

Conclusion

Perhaps more than anything else, I chose the playful nature of this post as a symbol of the utility of discursive perceptions.  Just as we might perceive the influences that shaped the Alien and Voodoo-based categories prior to Night of the Living Dead as stemming from very specific sources, the shift from mythological to scientific in the zombie narratives after that latter film gives us an equal lens through which we might make sense of the secularisation thesis’ notion about the waning of religion in the face of modernity.  This is perhaps not a perfect example, but it is apt.

To summarise: zombie narratives, once predominantly based in mythical discourse, shifted to more scientific and medical language in reflection of cultural modernity.  Now, though the idea of a horde of dead humans roaming the earth and feeding on the living is scientifically preposterous, the origins of such an apocalyptic vision are no longer justified by myth.  Rather, science has stepped in, a symbol of our cultural shift from mythical explanations to empirical ones.  These zombie narratives reflect our own shift, fiction representing our fact.

Which brings us back to ‘everything is fiction.’  Using fictional texts as discursive sources balances itself quite precariously on the nexus between what we might determine as fictional (entertaining and made-up) and factual (designed and made-from).  Yet it also gives us freedom to grow, experiment, and translate our ideologies via different sorts of thematic languages.  Zombie narratives and the Secularisation Thesis might not, on the surface, seem like relatable subjects, but the discourses that shape them are nonetheless malleable when we try hard enough.

 

[3] Norris and Inglehart, Sacred and Secular, 2004, 186-191; Greeley, Religion in Europe and the End of the Second Millennium, 2003, 190 and 205; Bruce (2002), 192-194; Froese, “Hungary for Religion,” 2001, 251-268; http://news.bbc.co.uk/1/hi/programmes/wtwtgod/3518375.stm (accessed 24 February 2015); and http://www.peace.ca/gallupmillenniumsurvey.htm (accessed 24 February 2015).

Peter Berger, The Sacred Canopy: Elements of a Sociological Theory of Religion (New York: Anchor Books, 1967).

Peter Berger, “The Desecularization of the World: A Global Overview” in Peter Berger, ed., The Desecularization of the World: Resurgent Religion and World Politics (Grand Rapids: William B. Eerdmans Publishing Company, 1999).

Callum G. Brown, The Death of Christian Britain: Understanding Secularisation 1800-2000 (New York: Routledge, 2001).

Steve Bruce, God is Dead: Secularization in the West (Oxford: Blackwell Publishers Ltd., 2002).

Jose Casanova, Public Religions in the Modern World (Chicago: University of Chicago Press, 1994).

Grace Davie, Religion in Britain Since 1945: Believing without Belonging (Oxford: Blackwell 1994).

Graeme Smith, A Short History of Secularism (London: I.B. Tauris, 2010).

David Martin, “Notes for a General Theory of Secularization” (European Journal of Sociology, Vol. 10, Iss. 2, 1969).

David Martin, A General Theory of Secularization (Oxford: Basil Blackwell, 1978).

David Martin, On Secularization: Towards a Revised General Theory (Aldershot: Ashgate, 2005).

Linda Woodhead, “Gendering Secularization Theory” (Social Compass, Vol. 55, No. 2, 2008).

When Trolling Religion Becomes Religion; Or, Why it’s All Bertrand Russell’s Fault

In our course on New Atheism this week, we discussed Bertrand Russell, who could likely provide enough philosophical material to span the entirety of a course of his own.  His early twentieth-century arguments establish quite a foundational platform upon which much of modern/contemporary/New Atheism has been built.  So, when it came time to discuss his ‘Atheism,’ which would orbit around a debate on ‘The Existence of God’ between Russell and F.C. Copleston in 1948, we had much to talk about.  For those interested, an audio recording of the debate can be found below.

https://www.youtube.com/embed/hXPdpEJk78E?feature=player_detailpage

At the time of the debate, and even identified as such, Russell argues from the point of agnosticism, a position of pragmatic and expressed ‘lack of knowledge,’ derived both from the etymological foundation of the term (the alpha privative ‘a-’ combined with ‘γνῶσις:’ ‘without knowledge’), and from the sense in which Huxley coined it, as a ‘method’ rather than a ‘creed.’  For the latter, we might benefit from a more direct and primary description.  For example, let us consider the story of how Huxley coined his term, from the man himself:

When I reached intellectual maturity, and began to ask myself whether I was an atheist, a theist, or a pantheist; a materialist or an idealist; a Christian or a freethinker, I found that the more I learned and reflected, the less ready was the answer; until at last I came to the conclusion that I had neither art nor part with any of these denominations, except the last. The one thing in which most of these good people were agreed was the one thing in which I differed from them. They were quite sure that they had attained a certain “gnosis”–had more or less successfully solved the problem of existence; while I was quite sure I had not, and had a pretty strong conviction that the problem was insoluble.

[…]

Agnosticism, in fact, is not a creed, but a method, the essence of which lies in the rigorous application of a single principle. That principle is of great antiquity; it is as old as Socrates; as old as the writer who said, ‘Try all things, hold fast by that which is good’; it is the foundation of the Reformation, which simply illustrated the axiom that every man should be able to give a reason for the faith that is in him, it is the great principle of Descartes; it is the fundamental axiom of modern science. Positively the principle may be expressed: In matters of the intellect, follow your reason as far as it will take you, without regard to any other consideration. And negatively: In matters of the intellect, do not pretend that conclusions are certain which are not demonstrated or demonstrable. That I take to be the agnostic faith, which if a man keep whole and undefiled, he shall not be ashamed to look the universe in the face, whatever the future may have in store for him (Huxley, Agnosticism, 1889).

This, we might concede, is a different sort of position than the more devout Atheism that we find in the arguments of Harris, Dawkins, Dennett, and Hitchens.  In fact, we might even concede that the Atheisms discursively represented by these four individuals are, themselves, different from one another.  We might further conclude that, definitively, each of these voices offers a different type of Atheism.  Which would be lexically, and thus contextually, incorrect.  For this reason, when we discuss these individuals in our tutorials, we look less for ways to use these discourses as a means to construct a definition.  Instead, we use them to try to understand how each of these individuals discursively contributes his own interpretation of the larger notion of ‘Atheism,’ from within his own context and specialised usage, and in order to shape his own particular identity.  It’s a fine line, but it’s an important one.


Russell eventually shifted his own position from agnostic to Atheist, a shift that provides us with an interesting insight into how the leading argument that inspired this change came to influence a truly interesting type of discursive Atheism.

In 1952 he wrote (though it was not published until later) a short piece titled “Is There a God?,” in which he put forth the following argument:

Many orthodox people speak as though it were the business of sceptics to disprove received dogmas rather than of dogmatists to prove them. This is, of course, a mistake. If I were to suggest that between the Earth and Mars there is a china teapot revolving about the sun in an elliptical orbit, nobody would be able to disprove my assertion provided I were careful to add that the teapot is too small to be revealed even by our most powerful telescopes. But if I were to go on to say that, since my assertion cannot be disproved, it is intolerable presumption on the part of human reason to doubt it, I should rightly be thought to be talking nonsense. If, however, the existence of such a teapot were affirmed in ancient books, taught as the sacred truth every Sunday, and instilled into the minds of children at school, hesitation to believe in its existence would become a mark of eccentricity and entitle the doubter to the attentions of the psychiatrist in an enlightened age or of the Inquisitor in an earlier time (Russell, “Is There a God,” 1952).

This argument pushed Russell from agnostic to Atheist, or, as he himself stated later:

I ought to call myself an agnostic; but, for all practical purposes, I am an atheist. I do not think the existence of the Christian God any more probable than the existence of the Gods of Olympus or Valhalla. To take another illustration: nobody can prove that there is not between the Earth and Mars a china teapot revolving in an elliptical orbit, but nobody thinks this sufficiently likely to be taken into account in practice. I think the Christian God just as unlikely [Bertrand Russell, “Letter to Mr Major,” in Dear Bertrand Russell: A Selection of his Correspondence with the General Public, 1950 – 1968 (London: Allen & Unwin, 1969)].


Not only has Russell’s teapot inspired an entire discourse of Atheism, but the logic behind it has equally led to a position with which to structure this discourse: what we might call ‘the argument from fictionalisation.’

First, we see very distinct influences in later arguments, such as Antony Flew’s “Presumption of Atheism:”


What I want to examine is the contention that the debate about the existence of God should properly begin from the presumption of atheism, that the onus of proof must lie upon the theist. […] In this interpretation an atheist becomes: not someone who positively asserts the non-existence of God; but someone who is simply not a theist. Let us, for future ready reference, introduce the labels ‘positive atheist’ for the former and ‘negative atheist’ for the latter [Antony Flew, The Presumption of Atheism (London: Elek Books, Ltd., 1976), 13-14].

This, then, seems to infect the arguments made by individuals like Jack David Eller, who argues not only that Atheism is humankind’s inherent position, but that it is our ‘natural’ starting point:


Humans are natural atheists—not in the sense of attacking god(s) but in the sense of lacking god(s).

[…]

What would happen if a child were never told a word about any of these religious concepts? It is unlikely that he or she would spontaneously invent his or her own religious concepts, and astronomically unlikely that he or she would reinvent Burmese village Buddhism or Lakota religion or Christianity. No human is born a theist. Humans are born without any god-concepts. Humans are natural atheists.

[…]

There are two fates that a natural atheist can follow. If she is never exposed to the idea of god(s), never urged to ‘believe’ in any god(s), she will retain her natural atheism—even if it is tainted with other religious but nontheistic notions. […] But under the pressures of a theistic milieu, the great majority of natural atheists will have their natural atheism replaced with an acquired theism, that is, they will be turned into or converted into theists. Some of these learned-theists will, for various reasons, come to question, ‘doubt,’ and ultimately reject the theism thrust on them and will ‘deconvert’ into ‘recovered atheists’ [Jack David Eller, “Chapter 1: What is Atheism?” in Phil Zuckerman, ed. Atheism and Secularity–Volume 1: Issues, Concepts and Definitions (Santa Barbara: Praeger, 2010), 4-5].

We see this same sort of argumentation in Baggini’s Very Short Introduction on Atheism, albeit told through a humorous metaphor:


However, some people believe that the loch contains a strange creature, known as the Loch Ness Monster. Many claim to have seen it, although no firm evidence of its existence has ever been presented. So far our story is simple fact. Now imagine how the story could develop.

[…]

The number of believers in the monster starts to grow. Soon, a word is coined to describe them: they are part-mockingly called ‘Nessies.’ (Many names of religions started as mocking nicknames: Methodist, Quaker, and even Christian all started out this way.) However, the number of Nessies continues to increase and the name ceases to become a joke. Despite the fact that the evidence for the monster’s existence is still lacking, soon being a Nessie is the norm and it is the people previously thought of as normal who are in the minority. They soon get their own name, ‘Anessies’—those who don’t believe in the monster.

[…]

Is it true to say that the beliefs of Anessies are parasitic on those of the Nessies? That can’t be true, because the Anessies’ beliefs predate those of the Nessies. The key point is not of chronology, however.

[…]

The key is that the Anessies would believe exactly the same as they do now even if Nessies had never existed. What the rise of the Nessies did was to give a name to a set of beliefs that had always existed but which was considered so unexceptional that it required no special label [Julian Baggini, Atheism: A Very Short Introduction (Oxford: Oxford University Press, 2003), 8].

Likewise, we might also consider Carl Sagan’s famous construction, the invisible fire-breathing dragon that lives in his garage:


“A fire-breathing dragon lives in my garage”

Suppose (I’m following a group therapy approach by the psychologist Richard Franklin) I seriously make such an assertion to you. Surely you’d want to check it out, see for yourself. There have been innumerable stories of dragons over the centuries, but no real evidence. What an opportunity!

“Show me,” you say. I lead you to my garage. You look inside and see a ladder, empty paint cans, an old tricycle–but no dragon.

“Where’s the dragon?” you ask.

“Oh, she’s right here,” I reply, waving vaguely. “I neglected to mention that she’s an invisible dragon.”

You propose spreading flour on the floor of the garage to capture the dragon’s footprints.

“Good idea,” I say, “but this dragon floats in the air.”

Then you’ll use an infrared sensor to detect the invisible fire.

“Good idea, but the invisible fire is also heatless.”

You’ll spray-paint the dragon and make her visible.

“Good idea, but she’s an incorporeal dragon and the paint won’t stick.”

And so on. I counter every physical test you propose with a special explanation of why it won’t work.

Now, what’s the difference between an invisible, incorporeal, floating dragon who spits heatless fire and no dragon at all? If there’s no way to disprove my contention, no conceivable experiment that would count against it, what does it mean to say that my dragon exists? Your inability to invalidate my hypothesis is not at all the same thing as proving it true. Claims that cannot be tested, assertions immune to disproof are veridically worthless, whatever value they may have in inspiring us or in exciting our sense of wonder. What I’m asking you to do comes down to believing, in the absence of evidence, on my say-so [Carl Sagan, “The Dragon in My Garage,” in The Demon-Haunted World: Science as a Candle in the Dark (New York: Ballantine, 1996), 171].

This leads us to a truly intriguing sort of argumentation, an attack on a certain kind of ad hoc hypothesising, and to what I shall henceforth refer to as ‘Troll Religions.’


Religions constructed for the sole purpose of satirical criticism, which have also been called ‘Parody Religions’ or ‘Invented Religions,’ are beginning to get the academic attention they deserve.  For those truly interested, see the work of Beth Singler (University of Cambridge) and Carole Cusack (University of Sydney).  I shall refer to a certain subset of these herein as ‘Troll Religions’ in order to demarcate a boundary between those constructed for Atheistic purposes (and thus for reasons of criticism) and those constructed by individuals who identify with these religious constructions for their own personal benefit (inward, rather than outward, usage).  As representatives of the latter group, consider Jediism or Dudeism.

As representatives of the former, we might consider those who worship the Invisible Pink Unicorn or the Church of the Flying Spaghetti Monster.

The Invisible Pink Unicorn, a paradoxical goddess who embodies both invisibility and colour, acts, like Russell’s teapot, as a device for questioning, as well as discrediting, the idea that God maintains the same sort of essence.  The IPU even stands in for God in arguments about the inherent ridiculousness of Theistic belief.  For instance, by replacing the word ‘God’ or ‘Lord’ in Biblical accounts with ‘Invisible Pink Unicorn,’ the sacred nature of these texts transmutes into farce:

I remain confident of this: I will see the goodness of the [Invisible Pink Unicorn] in the land of the living. Wait for the [Invisible Pink Unicorn]; be strong and take heart and wait for the [Invisible Pink Unicorn]. (Psalm 27:13-14)

This, adherents argue, reveals not only the nature of religious belief, but also the way such belief might misguide individuals into holding nonsensical (and empirically unfalsifiable) ideas.

Originating in internet discussions (alt.atheism) in the early-to-mid-1990s, the IPU, as defined by Steve Eley (who refers to himself as the religion’s ‘Chief Advocate and Spokesguy’), exists through the same sort of belief that gives meaning to ‘God’ or other deities:

Invisible Pink Unicorns are beings of great spiritual power. We know this because they are capable of being invisible and pink at the same time. Like all religions, the Faith of the Invisible Pink Unicorns is based upon both logic and faith. We have faith that they are pink; we logically know that they are invisible because we can’t see them (Steve Eley, cited in The Quotable Atheist, ed. Jack Huberman).

The IPU thus exists as much as an Atheist device as a satirical one: a discursive signifier used to establish, argue for, and defend an Atheistic position.

In similar fashion, the Church of the Flying Spaghetti Monster, whose adherents are called ‘Pastafarians,’ consists of analogous satirical language.  Perhaps much more ‘religious’ than the religion that orbits around the IPU, the CFSM has come to embody ritual components.  This likely stems from the political basis of its origination.  The creation of Bobby Henderson, the FSM was conceived in order to argue against a decision being considered by the Kansas State Board of Education concerning the teaching of Intelligent Design as an alternative to evolution in science classes.  In an ‘open letter‘ to the Board, Henderson made his case for the equal acknowledgement of the FSM’s role in creating the universe.  He later published a sacred text.

The ritual aspects of the church involve not only wedding ceremonies (usually overseen by an individual in Pirate regalia, as Pirates are considered ‘absolutely divine’ and the first ‘Pastafarians’), but also religious garments.  The latter consist of colanders, which equally represent the political origins of Henderson’s argument, in that they tend to be worn to defend the idea that if one religious permission (such as the Muslim Hijab or the Jewish Yarmulke) is granted, then all religious permissions should be.  Some examples include:

Christopher Schaeffer, newly elected to the Pomfret Town Board in New York.

Obi Canuel, an ordained minister in the CFSM, who fought to keep his driver’s licence, colander and all.

Niko Alm, an Austrian Atheist who was permitted to wear the colander for his licence.

Jessica Steinhauser, formerly known as adult film star Asia Carrera, also wearing the colander.

All of these elements, including the satirical mockery of Christian prayers and invocations (akin to the IPU), combine into a ‘religion for Atheists.’

Bertrand Russell’s hand in all of this can be found in the fingerprints smudging the edges of the logical arguments each of these examples provides.  Moreover, each of them contributes an intriguing insight, not just on how we might use them to make sense of the identification going on ‘under the surface,’ but on how they have been influenced by related, but altogether different, sorts of discursive sources.  I might conclude here, then, with the notion that understanding the who, how, and why concerning these discursive examples is inextricably linked to the logical arguments that came before.  This is not, of course, to say that Russell’s teapot is identical to the Flying Spaghetti Monster or the Invisible Pink Unicorn.  Rather, it is more akin to locating the roots of the former reinforcing the beliefs and practices of the latter.  We might also consider how these representations might equally alter our conceptual understandings of religion and religious identity.  When Troll Religions become religions, how might we make sense of these identities when they are Atheistic, and thus antithetical to our normative ideas about what constitutes ‘religion’ or the ‘religious’?