This week is graduation, and since it’s the only ceremony of this type I have allowed myself to be forced to attend, my family graciously came to visit.
Part of the fun of family coming to visit is that you get to see the city through the eyes of first-timers to Edinburgh. Suddenly, all the places that eventually blended into the background of your mundane day-to-day have regained the romance they had when you first arrived.
Our sudden (or maybe longstanding) interest in all things Scottish tartan came with a reason. It was perhaps quite convenient that just before my family arrived, a relative of ours discovered the following information about our Scottish heritage (on my father’s side):
The most relevant part of this new info is this:
In 1988, while researching an ancestor with Scottish lineage, I discovered that Maldred [my ancestral grandfather, d. 1045] was the younger brother of Duncan I, King of Scotland. With this discovery, twenty additional generations were added to the previous documented 29 generations, resulting in 49 documented generations in this family.
So, knowing now that we are descendants of Scottish Royalty, this last trip, with the whole family, felt extra special.
Of course, anyone slightly familiar with the content of this blog would know that I would simply write this off as a type of ‘fiction.’ In this case, however, and ever so briefly, I’ll let it slide. I mean, I do in fact look a bit like Fassbender’s Macbeth, right?
So, all hail me, Dr. Ethan G. Quillen, Scottish Royalty.
On a less ridiculous note, my family’s new info, and thus our further interest in all things Scottish tartan, got me thinking. In fact, while waiting out the long list of names called at the ceremony today, and perhaps as one last chance to consider changing my Thesis topic, I threw together this idea. I will present it here as a brief abstract because, given the celebratory frivolity of this afternoon and evening’s events, I simply don’t have the time to expand.
It’s All Relative: An Ethnographic Analysis of American-Scottish Identity Constructions
Every day in Edinburgh, visitors from America come to the numerous ‘Scottish Heritage’ shops conveniently placed on the Royal Mile. These individuals are, in our contemporary context, a new type of pilgrim. They are in search of a connecting thread, a symbolic link to an ancient past. They spend hundreds and hundreds of pounds purchasing clan information booklets, kilts, scarves, and clothing fashioned from a particular woollen tartan, their tartan, a physical embodiment of their ancestral lineage. Why do they do this? This analysis will attempt to answer this simple question with four case studies, while at the same time both establishing a link between these pilgrims’ constructions of Scottish ancestry and the notion that they further an intercontinental sense of imagined community, and challenging the perception that one’s heritage is nothing more than a type of identity artifice, of fiction.
***One last funny anecdote from this week***
When my parents arrived, a day ahead of my brother and his family, we took them to the Christmas market. While standing at the bar in St. Andrews Square, I caught the attention of a rather sullied and drunken gentleman chatting up a young woman. He looked at me, caught her attention, and announced to all in earshot:
“Look eh this chap, ‘ere. This is a Scotsman!”
Then to me, he said:
“I bet your name is Robert Robertson from the highest highlands!”
Back to the young woman:
“Look at him. He’s the most Scottish I’ve ever seen!”
Taking a moment to let his declaration sink in, as well as to build a rather long pregnant pause, I responded, in my most Southern California accent:
“Sorry to disappoint you, but I’m just an American.”
My fellow drinkers found it rather humorous, and the drunken fellow happily hugged me.
One of the benefits of using WordPress to host this blog is that it provides some rather amusing data.
For example, it keeps tabs on where each post has been viewed. This provides the excitement of knowing there are people in Nepal who’ve read something that I wrote. As well, it lets me keep track of how many Americans have viewed my blog versus Brits.
Another thing it does is provide me alerts, such as the one I received last week, wishing me a ‘Happy Anniversary!’
So, apparently, it’s been a year.
To celebrate, I thought I’d put together a little year in review. Which then got me thinking: a blog like this, with weekly updates, is like a diary, an on-line cache not only of my obscure thoughts, but of the things that have inspired those thoughts throughout the year. Or rather, it’s like an auto-ethnographic discursive source, where I am both anthropologist and subject, so that in equal measure, the text, this text, is like a fieldwork account, a window onto my own unique cultural perspective.
With this in mind, this review is something of a look back, not just for my dear readers, but for myself as well: a short trip back in time to see not only what it is that I have done, but how those things have shaped my way of thinking about my surrounding world.
This first post was nothing more than an introduction, a foundation on which to build the theme of the whole blog. I wrote it when I had somehow deluded myself into thinking I’d be done with the Thesis by Christmas. As such, it is heavily influenced by my Conclusion, particularly the quotes I provide concerning the use of the term ‘fiction’ and how it challenges any sort of normative understanding we might have about texts that are considered ‘true’ or ‘authentic.’
The second post came by accident. For our tutorials that week on a course called ‘Modern Religious and Ethical Debates in Contemporary Fiction,’ we had read the last of the Harry Potter novels. Our discussion for the week was on the religious implications in the book (Harry’s ‘Christian world’ in contrast to the wizarding world in which he now found himself), as well as the public’s perception of ‘witchcraft’ via a popular medium. One of the students in my tutorial chose to shape their presentation of the novel around an on-line ‘fan fiction’ called “Hogwarts School of Prayer and Miracles,” by a Grace Ann Parsons under the name ‘aproudhousewife.’ While we enjoyed a nice conversation about how this fan fiction represented a type of religious identity, via the language used by the author in her argument against J.K. Rowling’s own fictional representation of witchcraft, we had an even more fun chat about the precariousness of using fiction when examining identity, as I revealed that, though popular, Grace Ann Parsons’ fan fiction was not real. It was, in fact, an example of something called ‘Poe’s Law:’ no matter how ridiculous or humorous a fundamentalist argument might be, and even though it might be fake, someone will inevitably mistake it for the real thing, precisely because he or she perceives fundamentalism itself as ridiculous or humorous. Tread lightly, was my conclusion, as all writing, whether a novel or an ethnography, is ‘fiction,’ because it is inherently artificial.
This post, like the previous one, was inspired by a tutorial. This time, we had just read Howard Jacobson’s The Finkler Question, and were discussing the use of a novel as a source for ‘Jewishness.’ The Finkler Question is perfect for this, as the story it tells is of a gentile, obsessed with Jewish culture, struggling to identify himself as being ersatz Jewish. The post I wrote not only discussed the use of stereotypes within written accounts, it also dealt with ones I might have considered when I visited Israel in 2014, as well as a few humorous examples from popular media such as Seinfeld, Frasier, and Mel Brooks’ History of the World, Part One. Both culturally insensitive and representative, these sources proved rather useful as we delved deeper into the use of fiction as an ethnographic source.
A few years back I graciously accepted an offer, from a friend whose research focuses (in part) on conspiracy theories, to present a paper at the Sociology of Religion Study Group (SOCREL) Conference held at the University of Chester. I didn’t really know all that much about conspiracy theories, aside from the few things I remembered from my degree on New Religious Movements, nor did I have a clue how to combine that lack of knowledge with my research on Atheism. So, I cobbled together some related details, out of which emerged this theory of Atheism. In short, when we combine Donald Rumsfeld’s famous tautological reasoning for declaring war on Iraq (known knowns, known unknowns, and unknown unknowns) with the philosophical foundation of the dichotomy between Theism and Atheism, we find some interesting similarities: known knowns (Theism and Atheism), known unknowns (practical agnosticism), and unknown unknowns (complete ignorance of both Theism and Atheism).
When I attended the Non-Religion and Secularity Research Network’s conference in 2012, I wanted to present my criticism of the term ‘non-religion’ in a way that was both memorable and humorous. My thinking was: if I was going to be utterly critical of the term’s usage, I might as well do so in a way that was rather funny. I focused my presentation, then, on the nominal battle over whether or not the Brontosaurus ever actually existed, as its name came from a mis-named larger specimen of the Apatosaurus. The essence of my argument was as follows: in his attempt at beating his rival by discovering and labelling more specimens, Othniel Charles Marsh called the Apatosaurus something else, sort of like referring to Atheism as non-religion.
For a Christmas break, my traveling companion and I spent a few days in Bruges, Belgium. Aside from biking around the city, eating waffles, and drinking delicious ales, we also went to the Church of Our Lady to see Michelangelo’s Madonna and Child. Whilst there, I noticed an oddity about the baby Jesus: he was uncircumcised. Surely, I thought out loud, in a church, a Jewish boy would be circumcised after eight days. Why isn’t this Jesus snipped? I was then reminded that Michelangelo’s David, in Florence, is likewise ‘intact.’ This blog post was about the use of foreskin as a symbol of an artist’s own influence over the factual accuracies of his or her representation. In other words, Jesus and David weren’t snipped, because Michelangelo wasn’t. Food for thought.
After it premiered, Ridley Scott’s epic, Exodus: Gods and Kings, was banned by the country of Egypt. While it was, for me, a rather innocuous action film, for many around the world its parting ways with the Biblical narrative came across not only as blasphemous, but as threatening as well. Thus, it was banned. This got me thinking. Did this banning have anything to do with anxieties felt by some that a re-telling such as this was harmful to the source material, by reminding the audience that both are nothing more than stories? That is, if Ridley Scott can re-write the story so easily, then is the original nothing more than a template, a plastic and bendable thing able to be re-created, and thus void of what we might perceive as some sort of ‘sacred’ something? To conclude, by way of an answer, I posed the curious question: when the critically disliked and epic-looking Troy came out a few years back, with its predominantly white cast and highly adapted re-telling of the Trojan War (which ‘historically’ took place around the same time as the Exodus out of Egypt), why was it not banned for its inaccuracies or insults to history? Is it because we now think of the Trojan War as nothing more than a myth?
When I first came to Edinburgh, even before I started my degree, I had the great privilege of meeting Chris Cotter and David Robertson, the two minds behind the Religious Studies Project. Since that first meeting, I’ve had the wonderful opportunity to be a part of a number of roundtable discussions, one of which took place at the University of Chester on the topic of narrative and reflexivity in the study of religion. This was, in my five years in Britain, one of the most fun and rewarding conversations I’ve had on the topic of fiction, ethnography, and the use of discourse and narrative in the study of both. Below is the link to the recording, which I encourage everyone to enjoy.
When the former Seventh-Day Adventist minister, Ryan Bell, concluded his ‘year without God’ with the announcement that he no longer believed in the existence of God, it caught my attention. This post was about his conversion, and how I thought it related to the story of the Cardiff Giant, a hoax believed, and defended, by a curious and credulous audience. Here’s the essence of my argument: if we are intent on understanding how beliefs become solidified, such as the way a hoax is marketed and devoured by a demanding audience, or, in Bell’s case, how identity becomes constructed, is this not the ideal set of data to study? That is, though it might look, through a certain lens, to be something designed or formed in such a way as to inspire criticism, is it not still something worth examining? Or, is all of this once again a reminder that no matter how cautious or critical we are, there’s never really a sure way of knowing if something is a hoax (such as discourse observed), so that we must continually remind ourselves that in the study of ‘others,’ and regardless of objectivity, we might be nothing but ‘suckers’?
Perhaps the most popular of my posts, this one was once again inspired by a tutorial, or rather, by a course for which I tutored: ‘Atheism in Debate.’ While I have had my criticisms of this course, particularly concerning the fact that it was designed to bring in students interested in reading the four ‘New Atheist texts,’ only to ‘trick’ them into reading nineteenth-century theological apologetics, maybe my biggest point of critical discussion over the three years in which I led tutorials was the manner in which we compare the New Atheism with the Old. How, we were often asked, do these two groups differ? My simple answer: the New Atheists are assholes. This is an oversimplification. To make my argument, I used the ‘theory of the asshole’ as defined by the philosopher Aaron James, who describes an asshole as such: “a person counts as an asshole when, and only when, he systematically allows himself to enjoy special advantages in interpersonal relations out of an entrenched sense of entitlement that immunises him against the complaints of other people.” (4-5) In order to apply this to New Atheism, I took examples where the New Atheists did just that, and thus determined them to be ‘assholes.’ Which, I concluded, also made them different from their ‘older’ counterparts. It’s not a perfect description, but it is fun.
During a trip to Bergen, Norway, my travel companion and I took the tram out to the re-constructed Fantoft Stavkirk, which was famously burned in 1992. Though not convicted for this particular arson, the Satanist Varg Vikernes was found to have been connected to it when he was convicted for similar crimes, as well as murder, in 1994. In response to this information, I recalled thinking, ‘at least he was a Satanist, and not an Atheist,’ a statement I assigned to a ‘friend’ within the post itself. Aside from providing some simple background on the rise and fall of Satanism in Norway, I used this post to present a theory that, in fact, the ‘Satan’ of the Bible was the first skeptic, a foundational precursor to modern Atheism. I justified this via Biblical references where ‘שָׂטָן’ was associated with doubt, skepticism, or an adversarial position, such as Numbers 22:32, 1 Samuel 29:4, 2 Samuel 19:35, 1 Kings 5:4, 1 Kings 11:15, 1 Kings 11:23 and 11:25, Matthew 4:1-11, Mark 1:12-13, Luke 4:1-13, Luke 22:3, and John 13:27. Or, to put it differently, examples of ‘Satan’ as a ‘Devil’s Advocate,’ such as we famously find in the Book of Job: ‘Skin for skin!’ Satan replied. ‘A man will give all he has for his own life. But now stretch out your hand and strike his flesh and bones, and he will surely curse you to your face.’ (Job 2:1-7). I thus concluded: “in combining the lexical process of being deemed an ἄθεος (scepticism, doubt, critical debate) with the doubt, opposition, and adversarial nature of Satan (שָׂטָן; διὰβολος) we might comfortably conclude here that Satan is, in fact, a representative sort of Atheism.”
Presenting at conferences is something that I have found, as an academic, to be a rather rewarding experience. Not only does it necessitate travel, it also gives one the opportunity to receive feedback from people about his or her work. At the same time, though, it also leads to the notion that one’s work is his or her property. This feeling presented itself when I met Liam Frasier and Chris Cotter briefly to discuss a roundtable we were planning for the students of our course on ‘Atheism in Debate.’ As we introduced ourselves to each other, giving the ‘elevator pitch’ of our research, I came to realise that the thing I study is a large part of my identity. In this way, I ‘own’ it, in that it is my perspective, my interpretation, my product. This sense of ‘ownership’ always brings me back to Malinowski, who himself saw his presentation of the Trobriands, and thus the Trobriands themselves, as his ‘property:’ “Joy: I hear the “Kiriwina” [another name for the Trobriands; more strictly the northern province of Boyowa]. I get ready; little gray, pinkish huts. Photos. Feeling of ownership: It is I who will describe them or create them.” (Malinowski, A Diary in the Strict Sense of the Term, 1967, 140).
Another post inspired by a film, this time in response to the debate taking place about the ‘accuracy’ of Clint Eastwood’s Oscar-nominated film, American Sniper. While many of the ‘facts’ about Chris Kyle’s life have become exaggerated myths, the exaggeration is, as I argued, not unlike the ‘bumper sticker arguments’ we might see attached to people’s cars. While these might reflect opinions that are as affixed as the stickers themselves, they also represent a type of narrative statement: a story, told with few words, that reflects a facet of the individual who places them. As a conclusion, I made this argument: “when we see these sorts of images, perhaps we might be better off simply understanding that they represent a narrative, a means with which certain individuals define themselves, either for or against the statements made. Whether we want to simply believe them as true, research the facts within, or work to disprove them, they will always be stories. After all, Chris Kyle now lives solely in legend, but only because he now exists solely as a character within a story; a fate that awaits us all in time. For pragmatic reasons, then, the stories others tell, the stories we tell about them, and the stories we tell of ourselves, work as identifiers, assisting us in making sense of life in our determined search for meaningful fictions.”
For a period of time, I put off writing a post for the Non Religion and Secularity Research Network’s blog: http://blog.nsrn.net. This wasn’t because I didn’t want to, but because I was worried I didn’t really know what to write. While I’ve been critical of their term usage, doing that again seemed like overkill. Instead, I decided to write a post that presented the discursive approach I adopted for my Thesis. To get to that, though, I felt like I should tell the story of how I got involved in the study of Atheism itself. This post was a companion, then, an informative and personal precursor to the one published on their blog: http://blog.nsrn.net/2015/02/13/discourse-analysis-and-the-study-of-atheism-definitions-discourse-and-ethnographic-criticism/
While the workshop was a truly great experience, because my time in Spain, and especially Barcelona, was short, I spent most of it going to ‘tourist attractions,’ such as the Sagrada Familia and the Barcelona Cathedral. This got me thinking about ‘sacred spaces,’ and the focus of this post is the interesting balance we perform between sacred and profane when we visit such ‘tourist trap sacred spaces,’ as I’ve done in Jerusalem, the Vatican, and even outside Waco, Texas.
In my efforts to discursively examine Atheism, one of the routes I’ve taken is following specific lines of influence from one argument to another. As such, this post presented one of these discursive threads, from Bertrand Russell, to the New Atheism. More specifically, it followed a particular philosophical discourse that originated with Russell’s argument against the belief in the existence of God, by comparing it to the belief that there exists a ‘china teapot’ in an elliptical orbit between the Earth and Mars. By fictionalising a belief that is equally impossible to prove or disprove, Russell inaugurated what I coined the ‘argument from fictionalisation,’ a position promoted by a number of individuals across the last fifty years. Specifically, I then compared the same usage of this fictionalisation across three ‘invented religions,’ religious organisations designed on the premise that their deity is as provable and disprovable (the Flying Spaghetti Monster, the Invisible Pink Unicorn, and the Church of Bacon) as ‘God,’ which in turn provides for us an interesting discursive type of Atheism.
Zombies are very popular right now. Then again, zombies have been popular for quite some time. As entities that have infiltrated a number of popular culture media, from movies and television shows to graphic novels, the living dead have been a part of ‘horror’ for at least a century. What this also means is that the discursive influences on the writing of these outlets have changed just as much as the culture within which they’ve thrived. Do these films/shows/graphic novels tell us something about that culture? This post answered ‘yes,’ with a specific focus on the role of religion in the twentieth century. By looking at how the origin of the living dead turned from alien- and ‘Voodoo’-inspired, to George Romero’s liminal ‘they simply exist’ era, which was then replaced with the discourse of disease (plague/epidemiology), the Zombie narrative provides for us an insight into the influence of secularisation: a transition from a discourse based on either mythical or mysterious influences, to that of the scientific and empirical.
Back in my younger, and perhaps more naive, days, I wrote a paper where I analysed the philosophy, and thus foundation, of L. Ron Hubbard’s ideas that turned into Scientology, via the deconstructive lens of Freudian psychology. In essence, I took Freud’s three-part ‘Psychic Apparatus’ and compared it to Hubbard’s notion of Body, Mind, and Thetan. While the result might have led to a criticism of one against the other, the focus of my post was more narrative-driven. As I concluded: “as narrative devices, as stories that tell us something about how these men interpreted their world, and thus in turn tell us something about them personally, they function on an entirely different spectrum of criticism. Thus, rather than merely trying to connect dots that might creatively lead us to some sort of conclusion, using these narratives to make sense of the individuals who told them, as well as the individuals who use them, becomes that much more useful than even the most pragmatic attempts at comparing like with like.”
In another attempt at discursively examining Atheism, this post utilised comedy as a medium. In this example, however, I got a bit more specific about exactly how I was going to use these discourses to examine Atheism. By adopting three ‘anti-religious comedy routines’ (Ricky Gervais, Bill Maher, and George Carlin) as ‘texts,’ I then analysed them via Norman Fairclough’s ‘three analytically separable elements:’ “the production of the text, the text itself, and the reception of the text” (10). By closely examining the language used by each individual in the three examples, I produced a number of correlative discursive elements across each. These, then, contributed to a discursive understanding of the Atheism presented within each account. When compared to three clips of the New Atheists promoting their Atheism, this sort of analysis assists us in better understanding how such relatable philosophical arguments are disseminated via different venues (genres), as well as how they are received by a larger ‘public.’ In the end, I made the following conclusion: “this works much better than merely speculating or theoretically stipulating what we think these sorts of things (like Atheism) mean, and is therefore a much more useful (and, to be honest, more enjoyable) means of researching precarious concepts such as ‘religion’ or ‘Atheism.’”
This is perhaps more of a ‘reactionary’ post. Toward the end of the semester, and thus the end of my final tutorial period on Atheism in Debate, I was looking for specific ways to approach New Atheism in a manner that presented their arguments from an unbiased and objective, yet also critical, perspective. For this week, I chose their bad scholarship, which, regardless of whether or not someone agrees with them, is something that should infuriate everyone. Three of them have PhDs, after all. That doesn’t mean I assume anyone with a PhD should automatically be considered smart; but come on. Getting a PhD means you’ve been trained in how to do good research, how to build your arguments on solid data, and how to avoid becoming a caricature of bias and opinion. Nevertheless, what the New Atheists do is bad research, of which they are then arrogantly proud. It’s sort of embarrassing. The purpose of this post, however, was not just to point out their poor scholarship. Instead, it was designed to help my students understand that when they do good scholarship, the arguments they make can be even more meaningful than those presented with the passion one might associate with the rhetoric of a zealot.
Book reviews are often considered ‘easy’ publications, something much less strenuous than an article, monograph, or edited volume. On the surface, they seem rather simple: read a book, summarise it, tell the reader what was good and bad about it. With my first experience writing a review (on which this post was focused), I quickly learned this is not the case. In fact, the three-part story that I present in this post functions as a personal account of the lessons learned through writing this review: how the editing process works in the writer’s favour, how defending your argument (such as my capitalisation of the ‘A’ in Atheism) reminds you of why you are doing the research, and how copy-editing, or having someone re-write your work, and then arguing in favour of your version, builds a sense of confidence in one’s writing. My review itself can be found here: http://www.secularismandnonreligion.org/articles/10.5334/snr.au/
This post was inspired by the premiere of the fifth season of HBO’s hit series, Game of Thrones. Specifically, it was inspired by the furore expressed by many fans that, unlike the previous four seasons, this one would be mostly the product of original writing, rather than adaptation, as the series had thus far exhausted almost all of the story published in book form by George R.R. Martin. I took this as an opportunity to discuss the perhaps disappointing fact that all writing, regardless of whether it is original work or an adaptation, is still an artifice, and is thus entirely made-up. There is, then, no such thing as an ‘original source.’ I used a quote from Clifford Geertz’s Works and Lives, wherein he states that acknowledging ethnographic writing as being just as creative and literary as fictional writing is the same as realising a magic trick is, in fact, not really magic. Thus, my conclusion stated: “In a world where everything is fiction, or rather, where everything is artifice, the notion that an adaptation is telling a story incorrectly is rather moot. Even when the ‘original’ author might agree. In the end, all stories are adaptations, even when they are initially told. Which also means that all stories, just like looking at the discourse that gives meaning to a word, rather than just defining it, are neither right, nor wrong, by the mere fact that all stories are nothing more than re-tellings of a story none of us will ever see.”
Around the end of April, I began putting together what would eventually become the final draft of the Thesis. I still had a few months to go, but I didn’t know that at this point, and any full draft, once finished, felt like the final thing. As well, a colleague and close friend, Jonathan Tuckett, successfully defended his, adding to the anxiety. For these reasons, I found myself leaning (perhaps a bit too much) on distractions to break up the stress that comes with finally being finished. One of those came in the story on which I based this post: an individual on the website Reddit posted a meme in which they admitted to submitting a thesis, and thus earning a PhD, that was entirely ghost-written. As I sank deeper into the anxiety of polishing off one more draft, this story hit rather close to home, so I decided to investigate a bit more into the world of academic plagiarism. What I found was both interesting and disheartening. The conclusion I put together was a criticism not just of plagiarism, but of the academy itself, an argument that the more the academic world becomes like a business, the more profit-gaining opportunities (like plagiarism) begin to make sense.
This post dealt exclusively with two examples wherein fiction and ethnography appeared to merge into a category of similar, yet distinct, types of fictional writing. The first was a New York Times article by Laura Tavares on the use of Harper Lee’s To Kill a Mockingbird as a source of cultural insight into the American South, and the struggle of racial violence. The second was a related discussion, by Thomas Hylland Eriksen, of the subtle differences between a novel as an ethnographic source and an ethnographic description. Together, these two examinations worked to further blur the line between ‘fiction’ and ‘non-fiction,’ particularly concerning how both are products of creative artifice. As I concluded, in regard to the post’s cover image of the film Noah as ‘based on real life’: “While I am quite willing to blatantly claim that all textual representations are fiction by means of their ‘artifice-ness,’ this of course brings us into a discourse where, like the notion of ‘everything is fiction,’ we get somewhat distracted by what might be ‘based on real life’ and what might be a story assumed by some as the same. This is not equal, however, to a declaration that the story of Noah, which might be defined as both, either, or neither a myth and truth, is definitively one of these things. Rather, my point of having it here, and the point of this post in general, is a reminder that when we declare ‘everything’ as fiction because of the role that artifice plays in the creation and presentation of interpreted ‘things,’ a movie about Noah and a movie about William Wallace are equally ‘based on real life.’ In other words, the distinction between what is ‘fact’ (quantitative data about lynchings in the US) and what is ‘fictional’ (Lee’s To Kill a Mockingbird) might blur into a perception where they become equal representations of some type of ‘truth.’ I, for one, am ok with this.”
The Hotel Preston, in Nashville, Tennessee, lists among its in-room amenities an intriguing option: a Spiritual Menu. What this means is, at any time, day or night, a guest of the hotel may request that a religious text be brought to his or her room for some quiet reflection. For me, this led to a discussion of the World Religions Paradigm, and its limitations on the broader study of religion. While, on the surface, the options provided by the Preston Hotel’s menu seem to simply further support the WRP, I argued that it is, in fact, providing narrative sources, rather than theoretical interpretations. Or, as I stated in my conclusion: “by translating the mythological and doctrinal narratives that are used by individuals in the process of their ‘religious identity construction’ as a ‘menu,’ through which they isolate their own discursive understandings of ‘religion,’ we can form a much more complex and varied person-to-person perspective on how individuals use, and thus define, the concept for their own intentions. Which, I believe, seems much more in keeping with the culture of religious studies.”
Based on Bruno Latour’s Gifford Lecture series here in Edinburgh, and particularly on his notion of the ‘Anthropocene,’ this post presented my argument that we are, currently, living in what I call the ‘Profitable Age.’ I based this argument on evidence such as the ‘trilogy’ that Peter Jackson created out of Tolkien’s short novel, The Hobbit, as well as the similar film franchises based on the Fast and Furious, Harry Potter, the Hunger Games, and the Marvel Cinematic Universe. Politically, this is evinced by Supreme Court decisions such as Citizens United v. Federal Election Commission. Academically, this is found not only within the rising costs of education, but in the rise of ‘for-profit’ education centres (and the debt they contribute). Each of these equally contributed to my theory. Or, as I stated in my conclusion: “This is a Profitable Age. Whether that is defined by film or novel franchises, by political developments, or the business of academia, it seems more often than not that the world in which we are living is dictated by profit. How this then dictates the way we move forward, and whether that might mean a diminishment of value, is something we will have to wait and see.”
At this point in my story, I had finally finished the Thesis and turned it in. So, for this week’s post, I turned that into a story itself. More specifically, I told a story about how, on certain occasions, life looks like the plot of a novel, with characters and sets and plots designed for some ultimate purpose. This story, the story about my thesis, was one of those examples. Mixing in my turn to fiction for the PhD, and focusing on the journey I have taken to get the Thesis written, I also took the opportunity to weave in my theory that all writing is fictional, which then led to the argument that a thesis is as much a novel as those texts on which my research was focused. We, then, are equally novelists, as the theses we write tell, in their own way, a part of our story.
In honour of Memorial Day in the United States, this post focused on the religious diversity found within the U.S. National Cemeteries, especially Arlington. At the top of each headstone is a religious symbol, chosen by the deceased or his or her family, to symbolise the religious beliefs of the individual interred. While originally these symbols were limited to a few Christian and Jewish types, the symbols permitted (and thus accepted) by the United States Department of Veterans Affairs have grown to encompass a wide spectrum of religious beliefs, including Atheism, Wicca, and Odinism. In fact, the list of permitted and accepted symbols keeps growing, a testament, as I argued, to the religious diversity (and freedom) found within the United States.
In late May I was given the opportunity to present at the Old Religion and New Spirituality conference, at the University of Tartu. Unfortunately, our plane out of Edinburgh was delayed en route to Amsterdam due to weather, and we subsequently missed our flight to Estonia. In order to make the best of a bad situation, we chose to stay in Amsterdam for a few nights. On our last day, we had a few hours before our afternoon flight and, given the rain, decided to waste the time in a movie theatre. We saw Mad Max: Fury Road.
While the film has proven quite successful, both with critics and at the box office, I decided to use it as the basis for what I called a ‘dream course,’ one of those classes we hear about where the instructor has created a connection between some subject and a popular medium. Here is the course I created:
Title: Fictional Anthropology: Making Sense of Observation by Looking through an Imaginary Lens
Description: When we read Malinowski’s seminal work, Argonauts of the Western Pacific, the famed anthropologist describes for us the requirements necessary for a truly objective cultural observation. While this gives us a useful means of observing, recording, and writing about an other’s culture, it sometimes leaves us without a practical description of how that might be done. With this course we will apply the methodology of anthropological observation to a more ‘hands-on’ experience by making sense of a ‘fictional culture.’ What this will entail is a detailed observation of the world created by George Miller for his film “Mad Max: Fury Road.” Alongside reading Malinowski’s Argonauts, we will try to determine the ‘imponderabilia’ of the culture within the film. We will take field notes, compare insights, and even construct short ethnographic representations, both empirically objective and reflexively subjective, in order to make sense of the methodological requirements demanded of an anthropologist’s job in the field. As an introductory course, it requires no prior knowledge of anthropology; students of all levels are warmly invited.
By June, I had reached the post-thesis malaise. This is that time between submission and defence where you find yourself wanting to separate as much as possible from the damned thing, while also realising you need to prepare as much as possible for the viva. So, for this post, I wrote about it: a cathartic attempt at vocalising the anxiety that comes from suddenly being finished with the Thesis, but still having it there, in your mind. It’s like having an obsession that haunts your thoughts, whilst you sit on the cusp of the end of it all. As I stated: “To conclude, the malaise that I associate here with the post-submission mindset is in its own way indicative of a ‘crisis,’ not only in our confidence of what it is we have written, but in the loss of the obsession that is writing a thesis. It is a malaise defined by this double loss, a horrific perfect storm bolstered by a separation from that which has defined us for years, and the ultimate concern that the typo on page 137 will be the deciding factor in our inevitable failure.”
By mid-June, commencement ceremonies in both the US and Britain are in full swing, which also means individuals of merit are providing audiences with commencement addresses. One such address came from Ian McEwan, given at Dickinson College, in Carlisle, Pennsylvania. While I have gained a great deal of respect for McEwan over the last five years, given that my PhD research focused on two of his novels, I found myself in sincere disagreement with the theme of his address: the defence of free speech and an admonition of those unwilling to stand beside the authors of the recently attacked Charlie Hebdo. That is, though I do agree with him that free speech is a human right for which we must always fight, I could not help but find an almost hypocritical argument in his lack of empathy for those who take such offence at certain examples that they respond with violence. Empathy, I argued, is the ability to understand another’s perspective, so that even when we despise their reaction, we still understand why they might have reacted that way. Ironically, this thinking comes across in his fiction, though it was sadly missing from his commencement address.
For a couple weeks over the summer, news outlets were seemingly obsessed with Rachel Dolezal, who had resigned as President of the Spokane, Washington branch of the NAACP after it was revealed that, though she had been presenting herself as ‘black,’ she was, in fact, biologically Caucasian. As ‘identity’ was a major part of my research, this of course caught my attention. When her story is used as data, her ‘creative non-fiction’ in identifying as ‘African-American’ becomes an intriguing insight into the difference between self-identification and normative categories. As a tie-in to my previous post using stand-up comedy as a discursive source on Atheism, I presented this conclusion: “Issues of racial identity are likely to arise within nations (such as the US) wherein the ethnic and racial identities of the citizens that make up that nation’s culture come from a myriad of different origins. In response to this, comedians have attempted to address this in an equal number of ways. As I perceive it, perhaps the three best, if not most memorable, are the links below. I place them here as a supplement to my own opinion, a translation, if you will, of a heavily serious topic, textually transformed into a comedic response.”
This post came about a week prior to my viva, so I felt the best way to deal with that anxiety would be to fictionalise a normal day in my life, embellishing it for the benefit of the story. In this post I described a regular day, as experienced, by an American, in Scotland. I designed the post to read like an ethnographic field report, partly to support my theory that ‘everything is fiction,’ and partly because it was fun to do.
This post came five days after I successfully defended my Thesis. I used this as an opportunity not only to share my experiences (and the idea that it was, in the end, nothing to be afraid of), but to discuss the anxieties writers feel when writing. To do this, I used a quote from Hemingway that I think best encapsulates the perfect advice for a writer of any type of fiction:
For Christ sake write and don’t worry about what the boys will say nor whether it will be a masterpiece nor what. I write one page of masterpiece to ninety one pages of shit. I try to put the shit in the wastebasket. You feel you have to publish crap to make money to live and let live. All write but if you write enough and as well as you can there will be the same amount of masterpiece material (as we say at Yale). You can’t think well enough to sit down and write a deliberate masterpiece and if you could get rid of Seldes and those guys that nearly ruined you and turn them out as well as you can and let the spectators yell when it is good and hoot when it is not you would be all right. (Re-printed via www.lettersofnote.com)
In mid-July, Harper Lee’s second novel, Go Set a Watchman, was published, to mixed reviews. In particular, much of the disappointment felt by reviewers and readers alike was with Lee’s description of Atticus Finch, the beloved father of Scout and proud advocate for racial equality in To Kill a Mockingbird, as an old racist. This disappointment stemmed from a shock, as if suddenly this literary hero’s true self had been revealed. I took this disappointment and applied it to a similar comparison between Malinowski’s seminal Argonauts of the Western Pacific (which has been seen as the paragon of textualized participant observation), and the diary he kept, published some twenty-five years after his death. Like this new Atticus, Malinowski’s diary revealed a betrayal of sorts, showing readers his true opinions, his bias, and his more human (subjective) side. Setting these two cases side by side, I concluded that such comparisons will only ever lead to disappointment.
My two favourite paintings at the Scottish National Gallery are these two by William Dyce:
I enjoy these paintings because they represent a different perspective, a contextualization of two Biblical figures (David and Jesus) into Dyce’s setting: the Scottish highlands. This post, however, only used these as entry points into a larger discussion about the ‘expert level’ people adopt when they gain knowledge about something. Likewise, this presented the opportunity to further discuss the reality that people with PhDs, whilst knowledgeable on single subjects, are not knowledgeable about everything. I used some images by Matt Might to represent this, particularly this one:
This post was, as the title suggested, a shameless self-promotion. Earlier in the year I submitted an article for consideration to a special volume of the journal Science, Religion, and Culture. It was, to my surprise and delight, accepted. So, I used this post to promote it, as well as the other articles included. It’s really that simple. Here’s my article: “Discourse Analysis and the Definition of Atheism.”
Last July, the UK Home Secretary, Theresa May, announced that there would be major changes to the visa permissions and restrictions for non-European individuals coming to study in the UK. At first, I figured this wouldn’t affect us, and focused this post on the ‘little differences’ that make big waves for people living within cultures not their own. Part of my discussion included a Guardian article by Adam Trettel, who made many points about feeling ‘unwanted’ or ‘different’ while studying here in the UK. Though I wouldn’t know it for some time, these visa changes would indeed affect us, and cause us to feel our own sense of ‘difference’ and ‘unwantedness.’ A post on this will be coming soon.
In early August, we took our last trip to Paris, a city that we have come to love, not only through multiple trips, but also after I lived there for two separate months to learn French at L’Institut Catholique de Paris. This post was a brief description of this last trip, and of how, when we (people) visit foreign countries, we tend to ‘create’ selves that we wish to be seen by others. In turn, those others also ‘present’ the ‘selves’ they want us to see, a never-ending cycle of performances that contributes to an interactional back-and-forth between entities developing an identity of ‘humanity.’
This post took its focus from the idea that, without us giving things meaning, nothing means anything. I presented a number of examples where this might prove valid: a controversial Kouros statue at the J. Paul Getty Museum in Southern California; the Fälschermuseum, in Vienna, Austria, and the Museo Del Falso at the University of Salerno’s Center for the Study of Forgery; the Hitler Diaries, forged documents created and sold by Konrad Kujau to the German magazine Stern, the UK’s Sunday Times, and the American magazine Newsweek; the exact replica of the Lascaux Cave, created to preserve the delicate art of the original, which can no longer be seen in person; and the genuineness of the items available for purchase on the website Screenbid from the AMC hit show, Mad Men. By comparing these examples, I concluded that there is no difference between something that is ‘authentic’ and something that is not. The only difference is the one we give to the former, in our attempts at differentiating the two. In this way, nothing is actually real.
This post was the first of a series of ‘live from’ posts, where I was writing from a particular conference. This one came from the XXI Quinquennial Congress of the IAHR. This was an amazing conference, and something that I had been looking forward to since I first arrived in Edinburgh. The post itself was about the panel on which I presented, and how these sorts of gatherings remind us just how small the world is, via our theoretical and methodological differences. Also, side note: I wrote this with a cracking hangover.
After the XXI Quinquennial Congress of the IAHR, we all went our separate ways. My way home was via Berlin. This post told that story, specifically about a simple insult that I received from a fellow passenger just prior to boarding the flight, from which it takes its title: vanilla english. While it was a comment made in passing by an Englishman drinking beer at an Irish bar in Berlin, it also reminded me of the diversity of the IAHR (and particularly the panel in which I presented), and the incorrect labels we sometimes assign to things that might seem ‘bland’ from our particular perspective.
Shortly after the IAHR, a few of us attended the annual conference of the British Association for the Study of Religions at the University of Kent, in Canterbury. Rather than doing what I had done before, using the post to just describe my presentation, I took this opportunity to tell the story of the BASR conference’s role in my life in Edinburgh. Writing on the train south from Edinburgh, I described how I came to associate the BASR conference with the beginning of each new semester, and how this one, perhaps my last, would once again mark a beginning: the start of my last September in Edinburgh, and the last conference I would attend in Britain.
With the start of Fall, I decided to talk about stereotypes, particularly the one that associates Starbucks’ Pumpkin Spice Lattes with ‘basic white girls.’ While this, in its own way, provided a humorous route into the use of stereotypes as discursive data, it also proved rather telling about the way we develop these stereotypes and what they tell us about ourselves, and about those who create them. As I concluded: “These are stereotypes, and stereotypes are interesting things. Sure, they can tell us a lot about ‘other’ people, about their customs and culture, and about the way they define themselves. In this way, they even represent a type of discourse: language used by individuals that we perceive in a particular way, and thus the language we use to describe those ‘others’ in a way that makes sense for both their context, as well as for our description itself. Yet, they also tell us a lot about ourselves as well, not just in how we perceive those ‘others,’ but in how we might thus be stereotyping ourselves in the process. After all, if identity construction is all about projecting an image we want to be seen by others, which is then validated by an external entity (that other person), and vice versa ad nauseam, then aren’t we constantly being stereotyped as we stereotype others. This is something we should all consider, particularly concerning the type of terminology not only being used in Europe at the moment concerning the difference between a ‘refugee’ and an ‘immigrant,’ but about how we perceive others on a day-to-day basis in our interactions and conversations with other human beings.”
In September, the redditor /u/FaithMilitant posed the following simple question to the subreddit /r/AskReddit:
PhD’s of Reddit. What is a dumbed down summary of your thesis?
Based on this, I focused this post on a discursive bias concerning the perception people have about what constitutes a ‘PhD.’ In essence, my argument stated that individuals tend to associate the PhD with research in the sciences, rather than the humanities, based on the higher-ranked responses to /u/FaithMilitant’s question. In fact, this was my thesis:
This discussion represents a particular bias, or rather, a particular discursive perception of the concept ‘PhD,’ and how the public might perceive of that concept as something more predominately associated with the sciences, rather than the humanities.
In relation to the notion that there currently exists a ‘crisis’ in the humanities, or that the humanities is a dying art, this was my conclusion: “more than anything, perhaps it reminds us that though there are differences between these two fields, the level of importance between a thesis that tests the accuracy, or even existence, of a Higgs-Boson, and a thesis that argues that all writing, from ethnography to a novel, is fictional by means of its ‘artificial’ nature, is in itself a fictional differentiation established by our discursive perceptions, and perpetuated by the language of random sample data. Understanding how that works will largely influence both the future of the humanities, as well as the future of education worldwide. After all, how can we be expected to promote and describe our research, if we can’t even control how those descriptions fit into the discourse on what it means to have a ‘PhD?’”
This post came about after my usual museum companion and I failed to see The Amazing World of M.C. Escher exhibition at the Scottish National Gallery of Modern Art. When we arrived, two days before the exhibition closed, we found a line of people that stretched all the way from the entrance to the street. It was, by our estimation, at least an hour’s wait before we would get in. As we sat and watched these people queuing up to see a few etchings by the famous artist, I started to think about how rituals change over time, and how they are influenced by new additions and generational gaps. For instance, is seeing Escher’s work ‘in person’ the reason we stand in line to see artwork we can view, in perhaps better detail, on the internet? That is, is it the ritual of seeing the work, not actually seeing the work, that becomes the important part of the experience? If so, then are sacred rituals (going to church, etc.) the same: is doing the ritual more important than what the ritual is venerating? Or, in other words, is worshiping God more important than the belief that God exists?
This post was short, because it didn’t really require much more commentary than what I added alongside this clip, which I think best describes my ridiculous thoughts about things.
In this post, I took on relational terms, and in particular the term ‘nones.’ I haven’t, as evidenced by a few of the posts listed here, been much of a fan of relational terms. I feel like they don’t quite represent the individuals we study well enough, and leave far too much room for ambiguity. For this criticism, then, I chose to take on the ‘none’ category, and concluded with a useful (albeit likely disappointing) comparison: “The term ‘nones,’ and with that any terminology that has adopted the prefix ‘no’ or ‘non,’ thanks in great part by sociologists attempting to embody how individuals define themselves in relation to the religions of others ‘broadly conceived,’ seems like an attempt at defining what we can only presume is a large and growing group of like-minded individuals by simply describing them by what they aren’t. Which, in my opinion, seems incredibly unfair. After all, I’m not ‘non-British,’ I’m an ‘American.’ To call myself the former is just silly, and, really, doesn’t seem all that useful.” For this reason I offered this solution, a ‘write-in’ option that I think will help assuage some of the ambiguity about this topic:
“In the space provided, please describe how you identify religiously, using the terms you prefer.”
How do we define ‘religion?’ Do we focus on rituals? On belief? Do we use dimensional or categorical means to define what we think religion might be? In this post, I used a conversation between colleagues, where we considered these questions, to point out the means of defining religion that I thought worked best: I’ll know it when I see it. This, sadly, is not my original thought. I, in fact, borrowed it from Justice Potter Stewart’s exact expression, “I know it when I see it,” which came from his concurring opinion in the 1964 Supreme Court case, Jacobellis v. Ohio. The case itself dealt with free speech and the difference between pornographic material and ‘artistic expression.’ Though the Court found that the film in question (Louis Malle’s The Lovers) did not represent pornographic material, they were unable to define what constituted ‘pornography.’ In his attempt to address this, Stewart stated, simply: “I know it when I see it.”
So, by applying this to the question of ‘how do we define religion,’ my response was the same: I know it when I see it. Of course, while I do (and did) accept that this might lean “perhaps a bit too precariously toward the substantive side of the debate, essentially arguing that what I think is religious is defined as such for no other reason than my own convictions,” I also feel (felt) that it’s equally “rather clarifying in its simplicity.” After all, as I concluded: “just like how I might be able to determine something as ‘religious’ when I see it, this methodological approach seems to me that much better than the theoretical discourse of the last century, merely because I know it is.”
This was my last ‘live from’ post of this year, this time from the 4th Annual Graduate Conference on Religion at Harvard Divinity School, ‘Ways of Knowing.’ While I chose not to write about the presentation I gave (I did that the next week), I did write about stereotypes, and in particular, those about Boston. I used these three movie trailers:
Not surprisingly, the Boston that we found was quite different from the one portrayed in these films. Once again, stereotypes tried to influence my perception, which, anthropologically, led to this thesis:
Our depictions of culture, either fictional or ethnographic, are isolated representations that, though we may emphatically defend them as authentic, are unique to our own perceptions, and thus can never truly be so. That is, even when we try to ensure that our representations honour our subjects with as much authenticity as possible, we can never truly grasp the reality of a place and its people because, no matter how hard we try, our representations are, by their inherent nature, the products of artifice.
This post was a proposal, a detailed description of my post-thesis research. It describes a number of details about how I will be using select ‘Atheist gospels’ to discursively analyse the Atheist arguments and philosophies found within each author’s shared re-write of the gospel narrative. Within my description I provided a chapter outline, research proposal, research program, and bibliography. Hopefully no one steals the idea. Or, if nothing else, hopefully this will act as a kind of ‘copyright’ so if they do, I’ll be able to claim ownership.
My response, as I plotted out in this post, was, in essence, this:
when viewed as a cultural unit, in the same way we would objectively assess the subjects of an anthropological examination, the polyvocality of this discursive field becomes a collective of individual identities conforming into a group one. Thus, rather than the result being the “frustrating morass of contradictions and cross-purposes” (13) that Bullivant predicts, our different theoretical approaches to Atheism/non-religion/un-belief/ir-religion become a useful cultural unit with which we might, from a third-level perspective, make sense of the field itself. That is, if we step back and look at ourselves just as objectively as we look at our subjects, our differences transform from an atonal mess of scholastic disagreements into a more discursively valuable cultural system.
This brings us, then, to this post, which I have listed here to further the notion (as I’ve done throughout this year) that everything, even this blog, is fictional by the fact that it is designed, constructed, created, and imagined via my intentions. As discursive data, however, it also provides an interesting insight into those things that influenced my thoughts. Here’s to another year.
In an attempt to avoid the rain the other day, I ducked into the Scottish National Gallery here in Edinburgh. It’s a rather lovely gallery, neither too large, nor too small, with some rather impressive pieces. Two of my favourites are “The Man of Sorrows” (1860) and “David in the Wilderness” (1860), both by the Victorian painter, William Dyce.
I enjoy these paintings because they represent a change of setting, a perspective of the artist that contradicts the ‘historical record,’ wherein his subjects (Jesus and David) have migrated from the realm of the Biblical Holy Land to Dyce’s own: the Scottish Highlands. I especially enjoy what these paintings tell us about an artist’s perception, about how a narrative might be adopted and amended to suit one’s own context. Or rather, how, as a Christian, Dyce has placed these individuals into his own geographical context, shifting them out of legend and into something more attainable. He has, in essence, made his religion ‘Scottish.’ To me, this seems strikingly similar to the way in which religious beliefs shift and translate, how they become nationalised and tied in with the civil religion of a central location, their discourses homogenised into something entirely new.
Dyce is also known, perhaps more famously, for his “Pegwell Bay–A Recollection of October 5th 1858” (1858-1860).
Renowned for its association with the genre of ‘Atheist Aesthetics,’ “Pegwell Bay” depicts a discursive shift, a narrative ‘sea change’ wherein the once predominant use of ‘religious’ imagery has been replaced with that of science. Here, families gather shells and fossils on the low-tide shore as Donati’s Comet soars overhead. Pegwell Bay was a site frequented by novice and professional fossil hunters, as well as notable theorists like Darwin, and Dyce’s use of it as a setting allows the image to speak on his behalf, revealing a discursive commentary about the ebbing tide of religious belief and the reality of a more science-minded perspective on life, the universe, and everything.
In his Faith and Its Critics, David Fergusson contends that this painting depicts a type of ‘wistful’ and ‘nostalgic’ Atheism, a longing for days gone by, which matches in tone the basis of certain theoretical definitions of Atheism by scholars such as Hyman and Buckley: ‘Modern Atheism’ (that which arose out of and within the Enlightenment) appears as a ‘re-emergence’ of the classical ‘rational-naturalism’ that defines our notion of ‘Ancient Atheism.’ Likewise, this is an Atheist discourse that is equally expressed in textual examples, such as Thomas Hardy’s “God’s Funeral,” or Matthew Arnold’s “Dover Beach.” The latter even evokes a sense of tidal retreat, a poetic mimicry of “Pegwell Bay” via signifying terms like the ‘long withdrawing roar’ of the ‘sea of faith:’
The Sea of Faith
Was once, too, at the full, and round earth’s shore
Lay like the folds of a bright girdle furled.
But now I only hear
Its melancholy, long, withdrawing roar,
Retreating, to the breath
Of the night-wind, down the vast edges drear
And naked shingles of the world.
While this is all very interesting, and is definitely worth a bit more discussion, my point with this post is actually about something else entirely, inspired by a humorous exchange that I witnessed in the ‘Impressionist’ room of the Scottish National Gallery.
I was enjoying one of Monet’s ‘Haystacks,’ standing off to the side, and a ways back. Two gentlemen, perhaps in their late fifties or early sixties, approached the painting. The one on the right, the taller of the two, drew his companion’s attention to the canvas.
“See this brushstroke here,” he said, “that’s indicative of the impressionist’s style, that heavy use of paint, and the way he dragged it up, and to the left.”
“Yeah, I see that,” his companion replied, a slight hint of angst in his voice.
“He had a remarkable eye for colour, and for distinguishing simple tones within the palette, most notably for his use of blue. You should see his ‘Nymphéas’ at L’Orangerie, in Paris,” the first man said, his voice adopting a velveteen accent.
The companion smirked slightly, then responded, as if pulling a sandwich out of his pocket and presenting it as evidence:
“I have a minor in Art History. I’ve seen it.”
The expert is an odd character, mostly because he or she can appear anywhere. We are all experts at one thing or another, from the utmost banal and prosaic to the select and specific. Likewise, the expert might appear not only at the most unexpected times and places, but from the oddest of origins.
One of the great myths of the PhD is that achieving one will make you an expert. Even I fell into this trap years ago when I stated I wanted to be ‘a world’s expert’ on Atheism and Ian McEwan. In retrospect, I now think of that as a rather silly goal. This is especially the case now that I’ve learned that, after years of isolated study on a particular topic, what you really become an expert on is the realisation that you’ll never actually know everything there is to know about that topic. Or, in more colloquial terms: the more you learn, the less you know.
There’s a useful ‘illustrated guide’ for what I mean here, that I’ll happily steal from Matt Might:
When we imagine all of human knowledge as a circle, by the time we finish our Bachelor’s Degree, we’ve accumulated a rather slight ‘specialty.’ That looks like this:
With a Master’s Degree, that specialty grows a bit:
By the time we’ve reached the PhD, that specialty begins to push against the boundaries of known human knowledge. This creates a darling little bump:
So now, given that the circle on which our little bump has protruded represents all human knowledge, it’s important to acknowledge where our expertise exists within this context:
While Matt Might’s illustration here is rather useful (it’s also available for purchase, for those interested), it also quite poignantly illustrates the oddities of the ‘expert.’
Moreover, it serves to remind us, just like my story of the two ‘experts’ staring at Monet’s canvas, that if we’re all experts, then perhaps none of us are. Once we realise that the more we know about something, the less we actually know in general, and accept that the expertise we’ve accumulated isn’t a substitute for the world’s knowledge, then we’re all rather ignorant. Does this mean we’re in denial, or that our attempts at proving our expertise to people who seem to have similar expertise are a means of pacification? Are we trying to claim ownership?
Perhaps.
Or maybe not. After all, clearly I’m not ignorant about Atheist discourse. Just look at what I said above.
That is to say: last Thursday, I sat for my Viva, an oral examination to defend the thesis that I’ve been working on for the last four years. To refer to this as ‘defending my life’ is a bit of a mistranslation, as ‘viva voce’ really means ‘oral examination,’ or more specifically: ‘with a living voice.’
Yet, still, I think the idea of this being a defence of one’s life isn’t all that imprecise. After all, writing the thesis has been my life for the last four years, and especially as it has brought on an entirely new sort of life within a ‘foreign’ country, the thesis has been the central point around which my life has orbited in that time.
However, this was by no means a ‘trial’ of any sort. At least not like I thought it would be. I blame this solely on my examiners.
In fact, were I to describe my experience with the Viva, this close to the aftermath, and in a single word, it would be: demythologised.
Here’s what I mean.
Being a PhD candidate is in a particular way like being a young student, in your late teens, still in High School but nearing the end, when an older sibling/cousin/friend/acquaintance who has come to visit shares with you the painful realities of their experiences in the ‘real world.’ They’ll describe that world in realistic terms, painting a picture that reflects back a harshness where the ease and simplicity of youth is quickly replaced with taxes, insurance, rent, jobs, paychecks, medical bills, student loans, etc. You might listen to their sage wisdom and take some of it to heart, but it won’t really ‘sink in.’
Then, and with just a hint of irony, you might find yourself, years later, sharing similar wisdom with your own younger siblings/cousins/friends/acquaintances.
This is not unlike the advice you might receive from colleagues who have passed through the viva stage. However, in this iteration, the message seems a bit more constructive than the ‘dose of reality’ you might get from the previous description.
In fact, one of the predominant advisements I’ve received over the years about the viva was: It’s nothing to be afraid of.
That’s nonsense, I’d proclaim. How could it not be something to be afraid of? Here is the culmination not just of all your time working on this one particular project, a close and critical examination of 100,000 words wherein typos are bound to happen and arguments might seem less developed than someone might want, but, perhaps more frightening, here is the culmination of years, a decade, maybe more, of research and studying and moving from university to university, city to city, country to country. Here is the last and final defence you must make to prove that you are worthy to join that extremely elite club of individuals and thus earn the title ‘Dr.’ Is this not, of all the things one must do within this movement up the academic ladder, the quintessential thing to be afraid of?
Someone once put it to me this way: as a kid playing video games, which was scarier, getting through the first level, which might seem hard at the time, or spending hours/days/weeks/months playing through a game, reaching the final boss, and realising that if you lose here, everything that came before would be in vain?
That, to me, seems like something worth being afraid of.
Of course, I was wrong. This is the mythology I built up, the narrative I convinced myself was real, fed by a discourse composed of things like this:
When I’d hear the ‘it’s nothing to be afraid of’ line I thought it simply did not pertain to me. It was something one merely said in the euphoria that followed the viva, usually spoken by individuals who appeared physically and emotionally exhausted, driven half-mad by that final battle, a piece of their soul left somewhere behind.
These were the type of thoughts that preceded my own experiences with the viva.
As I blithely stated above, I blame much of the demythologisation here on my two examiners. I somehow got quite lucky in having, as my examiners, the two individuals I had personally chosen, who were at the top of my list, to read, examine, critique, and discuss the thesis with me. Not only did I select these individuals because I thought their backgrounds fitted the topic and contents of the thesis, but because I respect them above all others as exemplary experts in their fields. What this produced was an examination less like a defence, and more like a discussion, as if somehow we had each colluded to transform the final hurdle in my race to doctorship into an engaging and quite beneficial supervisory meeting.
Thus, even when we disagreed on points, there was a congeniality underscoring the criticisms, each suggestion filtered through a respectful means of assisting me in making the thesis the absolute best, and thus the clearest and most definitive, text it could possibly be.
Though the final verdict requires corrections to be made within the text, and though this means I did not ‘ace’ the thesis (those in my inner circle here have taken to calling this the ‘Whitney,’ after a genius colleague’s stellar viva result), I have come away from the experience not just invigorated about addressing these corrections, but with a newfound admiration for the topic itself. In other words, because my examiners did such an impeccable job, I’m spending these immediate days after the viva neither exhausted, nor wishing to remove myself as far from the thesis as possible, but excited about the prospect of making it all that much better.
I will conclude here with two observations:
First, as I was preparing myself last Thursday morning, I was trying to remember a quote from Hemingway about the fear one might have of writing. For whatever reason it kept popping up as I read my Introduction and Conclusion for the fourth time, like a half-memory that my brain kept trying to link to my argument.
I eventually found it:
“I write one page of masterpiece to ninety one pages of shit. I try to put the shit in the wastebasket.”
To briefly put this into context, it comes from a letter that Hemingway sent to F. Scott Fitzgerald after the latter requested the former’s opinion on his recently published Tender is the Night. Just as much as Hemingway is mythologically notorious for an incessant need to prove himself truly masculine, Fitzgerald is just as mythologised for being at the opposite end of the spectrum. Consider, for humorous example, this passage from Hemingway’s A Moveable Feast:
‘Zelda said that the way I was built I could never make any woman happy and that was what upset her originally. She said it was a matter of measurements. I have never felt the same since she said that and I have to know truly.’
‘Come out to the office,’ I said.
‘Where is the office?’
‘Le water,’ I said.
We came back into the room and sat down at the table.
‘You’re perfectly fine,’ I said. ‘You are O.K. There’s nothing wrong with you. You look at yourself from above and you look foreshortened. Go over to the Louvre and look at the people in the statues and then go home and look at yourself in the mirror in profile.’
‘Those statues may not be accurate.’
‘They are pretty good. Most people would settle for them.’
‘But why would she say it?’
‘To put you out of business. That’s the oldest way in the world of putting people out of business. Scott, you asked me to tell you the truth and I can tell you a lot more but this is the absolute truth and all you need.’ (Pg. 163)
My quote above stems from another advisement from Hemingway, a criticism not so much of Fitzgerald’s writing, but of his lack of confidence as a writer. Here it is within the context of the letter:
For Christ sake write and don’t worry about what the boys will say nor whether it will be a masterpiece nor what. I write one page of masterpiece to ninety one pages of shit. I try to put the shit in the wastebasket. You feel you have to publish crap to make money to live and let live. All write but if you write enough and as well as you can there will be the same amount of masterpiece material (as we say at Yale). You can’t think well enough to sit down and write a deliberate masterpiece and if you could get rid of Seldes and those guys that nearly ruined you and turn them out as well as you can and let the spectators yell when it is good and hoot when it is not you would be all right. (Re-printed via www.lettersofnote.com)
Hemingway is telling his friend, in his own way, to ‘forget his personal tragedy,’ to move beyond those things that might have caused his anxiety and diminished his confidence, to let go of the myth that he has anything to be afraid of.
He tells him: “Go on and write.”
Which brings me to my second conclusive observation.
Leading up to the viva I was told a number of times that the experience was nothing to be afraid of, that I would ‘do fine.’ While this advice did, in fact, come true, it is not something I find myself able to adopt.
I suppose, like all those times in my youth when someone gave me sage advice about the realities of the real world, I am once again dismissive. Not in a rude or negative sense, mind you, but in a practical way. Yes, my viva experience was wonderful, better even than anything I had imagined or day-dreamed in order to pacify my anxiety, but I also think the mythology of it was necessary too. Much like how Fitzgerald’s myth about the poorness of his writing forced him to ensure it was always clean and detailed and perfect, were it not for my fear of the viva, perhaps I would not have been as prepared as I was, or, more importantly, perhaps it would not have been such a rewarding experience, simply because that reward came from the demythologisation of it.
In other, and final, words: were a colleague approaching their own viva to ask me what to expect, then, because my fear of it proved so useful to the outcome itself, “it’s nothing to be afraid of” is something I’m afraid I just couldn’t say.
***One last thing.***
Because I turned to it often when accomplishing the great milestones of my thesis, here’s another discursive example that I think nicely puts into perspective the myth of the viva:
The following is the ethnographic translation that you asked for of the field notes below, taken during one of my days of observation here in Edinburgh, the capital city of Scotland. It is presented here as requested, incomplete, but I do hope that you find the details as riveting and nuanced as I do. I also sincerely hope that my conclusions will be enough to justify further monetary support.
08:02 AM
I was applauded this morning whilst crossing the street by a group of women dressed in ornate pink and red costumes. One was wearing a tiara, and I presumed she was the leader and/or of some lesser-ranked royal class. I was embarrassed to have been a part of their group, and slightly disappointed that I might have too closely become a participant. When I reached the opposite side of the street, I moved past the women and hid myself behind a tree so as to better view their actions without too destructively intervening. As I did this, the lower ranked of the two smiled at me with what I can only presume was either a gesture of greeting, or dominance.
The two women were then themselves greeted by an individual dressed in a dark green coat, long in the sleeves and reaching down below her knees. I thought this choice of garment was odd, as it was neither raining, nor cloudy. I have periodically found myself considering the oddity that is the dress habits of these natives, as they tend to adorn themselves in often drab woollen accoutrements based on an assumption (either via lived experience or prophetic divination) about what type of weather might actually occur, rather than for what is actually occurring.
The green woman handed the leader of the pink and red women a paper cup filled with a yellow-tinted libation.
This exchange, as well as the ceremony that followed, is worth noting in detail:
The crowned woman drank first. She sipped lightly at the libation, then handed the cup to her subordinate, who equally sipped lightly. They both seemed to have found the contents pleasing as they happily thanked the green woman. Then, they each drank again in turn. The crowned woman made an intriguing gesture, tipping the cup toward the green woman, who nodded her head forward in response. The red and pink women then continued walking, sharing the cup back and forth until it was empty. I followed, cautiously. At this point, the subordinate woman crushed the cup in her left (dominant, it seemed) hand and dropped it against the wall of a merchant shop that specialised in a local delicacy the natives affectionately refer to as ‘chippy.’ (It is an acquired taste). The two women then began to move quicker, laughing to one another. When I examined the cup I noted the contents to be sweet and slightly chemical, like alcohol. I tried in earnest not to disturb the cup, as I did not wish to interrupt, and thus become a part of, the ritual.
Given the age of the two women, the colour of their garments, and their body shapes, I believe an educated hypothesis about this ritual might conclude that this was a type of fertility act. In fact, from previous observations, I am confident in the assumption that these women were performing a liminal transformation, akin to a removal of oneself into the wilderness, only to return an acknowledged member of the tribe. As I myself returned to my original position this hypothesis became even more valid as a large group of similarly dressed red and pink women appeared on the opposite end of the street. I noted in my journal that the green woman excitedly began preparing more libations from a glass bottle with the label scratched off and a box of what looked to be some kind of juice. As there is indeed much more that I could describe of the interactions and dialogues that occurred once this group of women crossed the street, I will leave here further confident in my impression that I inadvertently discovered a festival devoted entirely to a fertility act.
More on this to follow.
13:45
A few notes on the customs that I have witnessed in my time here. I have broken these into ‘rules,’ because, as you know, it feels easier for me to delineate the imponderabilia of these people in particular categories.
Rule 1: There is always a hill.
Edinburgh is a city of hills. There is always a hill. Even when one climbs to the apex of one of the ritually sacred hills (I have counted at least four thus far), there still seems to be a hill to further climb. I have noted that a number of the visitors who come here to explore the mysterious culture of the Scotspeople take to wearing clothing suitable to such a geographical landscape. I have counted an immeasurable number of hiking boots and trousers and jackets to match. They seem to have all invested in large-brimmed, thin, flopping hats. I even once saw a man using two walking canes. Such is the terrain of this environment.
The locals, of course, seem not dissuaded by the hills here. Even to the point of stubbornness. I have taken to using many of the transport options available to the native and visitor alike to make my way through the city, but the natives insist on walking.
An interesting point to support this: in the last few years, the tribal elders gathered and financed the construction and installation of a tram system. The natives have insistently not taken to using it, even protesting it at times. Perhaps this is due, as I have decided, to the fact that it only leads in a single direction, and thus seems rather pointless. Either way, their familiarity with the hills of this city seems ingrained within their genes. It is indeed an intriguing aspect of their cultural identity.
Rule 2: Someone is always behind you.
I walk quite often here, making useful observations of the natives and their interactions with each other and the visitors alike. However, I have found myself, repeatedly, and without fail, being followed. This does not, of course, mean that I am actually being followed, but that there is always someone walking behind me.
I’ve found this to be such a frequent occurrence that I have named it the ‘Edinburgh phenomenon.’ I will look more into this through the remainder of my research here.
Rule 3: The heaters are always on.
It is no secret that the weather here can be a bit rain-soaked and blustery. The natives have a number of terms for these weather patterns, ‘dreich’ and ‘haar’ being the ones I have heard most often thus far. The latter is a name given to a low and choking fog that rolls in and blankets everything in sight. The former is difficult to translate. One of my informants tried to roughly define it as: “the weather is terrible and cold and it is raining, I think I’ll have a lie in today.”
Given this type of weather, the natives tend to always have their heaters on. They likewise will usually be adorned in woollen garments. This makes it difficult for an outsider such as myself to acclimate to the sweating. Even on days when the sun is out and it is warm, without fail, the heaters will be on. While I have tried to guard myself from too subjectively being influenced by this, it has proven to be the hardest part of my observations.
I am always sweating.
22:00
This evening I took a bus to one of the city’s secular sanctuaries. It is called ‘Usher Hall,’ but I believe the natives pronounce it ‘Oosher Heel.’ I’m not entirely certain, and will look into this more.
On the bus I noticed two oddities.
An elderly woman was obsessively engaged in picking out the blue embroidery from a white towel. She was using a pair of scissors and cutting it loose, then depositing the blue thread on the floor of the bus. On closer inspection, I believe it was a hotel towel.
An elderly man came onto the bus, muttering to himself in a native dialect that is difficult to make out, even with my extensive language study. I’ve been told it’s what is called ‘Leither,’ but have yet to source from where this originates. He remained standing during the entire bus ride, busying himself by the bus’s entrance. I moved seats to better observe him and found that he was removing the discarded bus tickets out of a red plastic bucket, flattening them in his hands, and then eating them one by one.
At Oosher Heel I sat amongst a few of my informants who had invited me to hear a reading from a fellow American, a humorist named David Sedaris. They were quite fond of him, and I took this as a compliment based on their views of my own culture. Overall I felt this experience definitely bonded me with them, and I look forward to the cultural observations I will achieve via this friendship.
During the reading a remarkable realisation came to me that I will transcribe in full:
At one point David Sedaris was telling a story about how he likes to pick up trash in the village in which he lives in England (apparently he is conducting his own research). His deeds were so welcomed by the natives in his area that he was invited to the Queen’s Palace for lunch. While the story he told was quite humorous, and while I do not intend to bastardise it here, what stood out to me was the reaction of a number of the natives in the audience. When Sedaris said ‘the Queen,’ people booed in a critical tone. I had been warned that these natives were no fans of the Queen of England, but I was not expecting them to be so vocal about it. It then occurred to me that they were doing more than just booing: they were being supportive of their own queen, whom I had likely observed earlier this morning, the one in the tiara. This realisation has altered my perception of this morning’s fertility ritual. I will thusly be re-focusing my research on locating this queen, and will alert you further on my successes.
Best from the field,
–E.G. Quillen
PS: please ensure the grant proposal goes through, I am indeed sure that I have stumbled upon an essential aspect of this culture and fully intend to further explicate its meaning.
I’ve been asked in recent weeks what my life is like now that I’ve submitted the Thesis. I myself asked this very question of colleagues and friends as they too entered the stage between submission and the impending viva. One answer that seems to always come up, and one with which I, again, have agreed, is that my life is now defined by an odd sense of ‘malaise.’ While others might not agree with my wording here, I think this term perfectly sums up this stage for me.
Here’s why.
First, the term’s lexical definition, the definition you might find in a dictionary, seems to fit this stage quite nicely:
1 : an indefinite feeling of debility or lack of health often indicative of or accompanying the onset of an illness
2 : a vague sense of mental or moral ill-being <a malaise of cynicism and despair>
Spending years obsessed with writing a long paper takes its toll on a person. That’s years of feeling guilty for ‘taking the afternoon off,’ or, as a good friend was once advised to do, ‘take the full weekend.’ That’s years of thinking about the weakness at the end of chapter three, how the conclusion needs to be a bit more nuanced, how you should ‘unpack’ your term usage throughout. That’s years of feeling like everything you write is terrible, that your ideas are too simplistic, that you aren’t saying anything truly unique or different. Then, finally, there’s that feeling that someone, somewhere, will point out how you didn’t read that one obscure text related to your subject, and, of course, that person will be one of your examiners.
This sort of life is a disease in itself, so the malaise that follows is very much a side-effect of replacing these symptoms with those associated with the equally obsessive curiosity about how what you have written is being read. This is a very special kind of malaise, like a bizarre liminal stage, just this side of the threshold that defines us as ‘finished.’ Which also means it is a different sort of stage from that which defines the post-viva mindset. This, again, is why I think this term is perfect. The viva is like the impending ‘illness,’ so that the malaise felt at this stage is like the ‘lack of health’ indicative of the onset of that illness.
Second, because other people have used this phrase to point out (even metaphorically) similar issues, my usage seems like a good comparative adaptation.
Of those ‘other individuals,’ Jimmy Carter is perhaps the most memorable person associated with ‘malaise.’ Thirty-six years ago this week, and in regard to the looming energy crisis, he took to the airwaves with his ‘Crisis of Confidence’ speech. In this address, he pointed out and discussed what he referred to as a “fundamental threat to American democracy,” an erosion of the nation’s confidence in itself:
I do not mean our political and civil liberties. They will endure. And I do not refer to the outward strength of America, a nation that is at peace tonight everywhere in the world, with unmatched economic power and military might.
The threat is nearly invisible in ordinary ways. It is a crisis of confidence. It is a crisis that strikes at the very heart and soul and spirit of our national will. We can see this crisis in the growing doubt about the meaning of our own lives and in the loss of a unity of purpose for our nation.
The erosion of our confidence in the future is threatening to destroy the social and the political fabric of America.
He went on to describe what he felt were the precursors to this crisis: the assassinations of President Kennedy, his brother Bobby, and Martin Luther King, Jr.; the violence and defeat in Vietnam; the distrust that resulted from the Watergate scandal; and the decreased value of the American dollar during a long and arduous period of inflation. He described much of this as symptomatic of “paralysis and stagnation and drift.”
Here’s a video of the speech, for those interested:
This address became known as the ‘malaise speech,’ a critical association because it eventually came to negatively affect his presidency, ultimately leading to his re-election loss in 1980. Moreover, the term was associated with what he said because, as many critics argued, it merely pointed out Carter’s own criticism of the American people’s mood, his notion of a ‘crisis’ based on his own perception of the despair, ill-feeling, and cynicism emanating from the nation’s public.
While there is much to debate here about Carter’s language use and how it influenced, and was influenced by, the discourse of the American public at this time in history, the terminology is still quite poignant, especially in its association with the ‘crisis’ we might feel in our post-submission confidence. Which leads me back to my own usage.
To conclude, the malaise that I associate here with the post-submission mindset is in its own way indicative of a ‘crisis,’ not only in our confidence of what it is we have written, but in the loss of the obsession that is writing a thesis. It is a malaise defined by this double loss, a horrific perfect storm bolstered by a separation from that which has defined us for years, and the ultimate concern that the typo on page 137 will be the deciding factor in our inevitable failure.
So, in answer to the question, ‘what is life like after the submission,’ perhaps the best response is: not much, emotionally at least. Which is also why I felt it might be useful to write about this malaise, not only for myself, but for others who might have equally experienced this same sort of emotional tempest. That, and because the malaise has taken quite a strong hold on my current perception of the world, and created for me a distinct crisis of confidence in my own work, I really had no idea what to write about this week.
Last week I was sitting in the bar at the Cameo Cinema in Edinburgh, enjoying a glass of wine, when my drinking companion pointed out that she suddenly felt special. On every table in the bar was a small glass vase, inside of which was a yellow gerber daisy. The vase on our table had two.
It was a relatively warm day, and since spring tends to come later than usual in Edinburgh, these little flowers were a nice touch.
The bar itself was fairly crowded for a weekday afternoon. The King and I, starring Yul Brynner, was playing on the television. The sound was turned off.
I commented to my companion that on certain occasions, like this one, I like to imagine the world (everything) as part of some great novel. In moments like these it almost feels like you can see the crossover between fact and fiction, as if the world suddenly seems to have been quite specifically designed, ‘put together’ for metaphorical effect. On the table behind us, against the wall, two men were engaged in a heated conversation. It wasn’t an argument taking place between them, but rather one they were sharing about something else. They were both sitting forward, arms resting on knees, their language peppered with indignant terminology. “It’s not fair,” said one. “It’s criminal,” said the other.
Their conversation carried on like this for a while, until their raised voices began to melt into the background. However, something interesting about them stood out to me, something that prompted my seeing our surroundings through the novelist’s eye. Everything about the scene within which they were acting appeared to complement their shared mood. Their body language was rigid and combative, the language in their discourse atonally violent. They seemed to be heating the room, and I got the sense, when looking at them, that they were somehow marginalised to the back corner, isolated in their misery by their own choosing. They looked like characters, constructed for a purpose, like representatives of discomfort or despair or hardship. They were almost cliché, like action figures in a novelist’s tableau.
On their table, the yellow gerber daisy was wilted and dead.
This week I submitted my doctoral Thesis.
Roughly 90,000 words of four years’ research. After I submitted the necessary copies to the School of Divinity, I passed around my own copy at the bar. At dinner that night, it sat on the table at a spare place setting, like an empty plate waiting to be bussed away. This felt like another one of those occasions where my life felt like a novel, and it reminded me of how I found myself writing about fiction.
At some point in the early stages of my PhD I suddenly decided I wanted to do something with fiction. As I had come to Edinburgh to study Atheism, combining these two interests seemed like a fun idea that made no real sense. It took a few years to figure it out.
Originally, I had come to do an ethnographic study of the Humanist Society of Scotland, but soon my interest in this subject waned as I began to plot out how I might turn that into a Thesis. As well, because Atheism is such a new topic of interest in the world of academia, I found that there wasn’t much of a foundational base in the ‘anthropology of Atheism’ on which to build my own. Defining the term was bad enough, but defining how we might study individuals who use ‘Atheist terminology’ in a broad and abstract manner was even worse. So, instead of following what some might consider the ‘academic route,’ I turned toward fiction.
On so many occasions over the years I’ve found myself either taking or tutoring courses that use a novel as a source for the subject being taught. I once took an American history class where every text was a novel. We read Baum’s The Wonderful Wizard of Oz to discuss the political and civic alterations taking place at the turn of the century; Steinbeck’s The Grapes of Wrath to discuss the Great Depression; Yates’s Revolutionary Road to discuss the lost sense of identity discovered by many during the mid to late 1950s; and O’Brien’s The Things They Carried to discuss the harsh realities of the war in Vietnam. Here in Edinburgh, we’ve had some great success with our course on Religious and Ethical Debates in Contemporary Fiction, which I have written about a bit already.
Not only did I want to use fiction in this same way, I wanted to elucidate a deeper meaning about why we might use the novel in this capacity, as well as how that might be accomplished.
I took as my focus the use of a novel (two, actually) as an ethnographic text. This required a number of establishing details, which I separated into three ‘pillars:’ a discussion on how ethnographic texts are constructed, and the literary focus adopted into that methodology in the 1980s via the sorts of theoretical arguments started by Clifford and Marcus’ Writing Culture; a discussion about how novels are critiqued within the context of specialised fields in Literary Theory; and the manner with which we approach concepts, especially when they are attached to religious identity construction, within the context of an anthropological analysis. By marrying the first two into a methodological approach, that is then theoretically supported by the third pillar’s focus on the concept ‘Atheism,’ I created a means with which we might read a novel ‘ethnographically.’ I called this ‘Ethnographic Criticism.’ I’ll likely write much more about this, and how it is done, in future posts.
While my use of Ethnographic Criticism seemed rather successful in regard to using fiction as an ‘ethnographic source’ of British Atheist identity (though any certainty about that must wait until after the Viva), there arose in the process a rather precarious defect. This has to do with the role of the author in the construction of an ethnographic text, and how our acknowledging that role shapes the way in which we read his or her ethnography.
For example, if an ethnography is written in an omniscient voice, adopting the strict objectivity we find in classic texts like Malinowski’s Argonauts of the Western Pacific, the culture we read there is presented in a manner quite different from an ethnography written from a reflexive first-person perspective, such as we find in the ‘ethnographic novels’ of Michael Jackson, Timothy Knab, and Richard and Sally Price. In the latter this is greatly determined by what Geertz (Works and Lives, 1988, 8-9) refers to as the ‘signature of the author,’ or ‘author-function,’ which he borrows from Foucault. Recognising the author’s role in writing the text, and thus in also recognising that the text has been ‘authored,’ means that our perception of the culture within the text is dictated by our acknowledgement that it is an artifice, designed and structured by an intermediary. Thus, when it comes to reading a novel ‘as ethnography,’ our perception of the culture being represented is equally dictated by our acknowledging the author’s role in writing the text in the first place. Where we might simply look at how the author (let’s say Ian McEwan) designed his novel (let’s say Enduring Love) with an intention based on his own discourse, context, and opinions, this is much less simple when we consider the ‘author’s signature’ in a novel written from a first-person perspective (such as in Enduring Love).
In other words, where reading a novel as ethnography would require our acceptance and comprehension of McEwan’s role in shaping it from his imagination, when written from a first-person perspective, that novel ceases being written by McEwan at all. That is, it becomes something more akin to the first-person ethnographic novels cited above. In this way, it is no longer a novel. It becomes an ‘auto-ethnography,’ a text written by an individual within the context being depicted, who is writing about his or her own culture. Thus, McEwan ceases to exist. The text we are reading is now a text written by an individual who might now be perceived as ‘actually’ existing. Which also means that the novel equally transmutes out of the realm of ‘fiction.’ This has repercussions on a number of levels.
If the novel ceases being a novel, then Ethnographic Criticism isn’t actually reading a novel ‘ethnographically.’ Likewise, if McEwan ceases being the novelist who created the text, then the lead character narrating his or her story is now suddenly ‘real.’ Brought together, these two issues determine even larger ones concerning how we perceive texts (like ethnographies) as representing fiction, non-fiction, or something somewhere in between. In this way, we might actually start to believe, because a ‘text’ in each and every instance is something both made-from (designed via research and data) and made-up (invented from one’s imagination), that everything is fiction.
This is, in essence, part of the focus of my Thesis. Again, I’ll discuss more of this later.
Next week two friends of ours will be defending their Theses.
Some time ago, one of them shared this Tumblr page with me, with this particular post:
I love the idea of a ‘secret novel,’ as if writing fiction is something we’re just not meant to do. We’re academics. We write facts, not fiction. Our work is empirical and objective. It isn’t just ‘made-up.’
I find myself disagreeing with this now. Four years of reading and writing and thinking about the thin pragmatic line we maintain between fiction and non-fiction has brought me to a somewhat vague conclusion. Everything, I’d argue, is fiction. This doesn’t mean that everything is the product of imagination, solely made-up and invented with no connection to what is ‘real,’ but it also doesn’t mean that imagination is entirely left out of it. When we sit down to write a wholly objective text, we are still imagining how it will take shape. It’s still designed. It is still an artifice, no matter how empirical we are about its creation.
A thesis is no different.
A thesis is, just as much as McEwan’s Enduring Love, a novel. We’re all novelists, merely by the fact that we are writers. Our novels are a particular genre. So, the idea of a secret novel is not that secretive at all.
Here’s an example. A good friend of mine (Jonathan Tuckett) just successfully defended his thesis and passed his viva. His focus was on phenomenology, and the text he produced was very ‘academic.’ Yet, it was designed to tell a particular story. On one level it told the story of his argument, a narrative driven by a plot that came to a particular conclusion. On another, it told the story of his research, of his efforts in proving his knowledge and expertise on the subject.
Whilst he was writing this novel, he was writing another (and likely others, he’s quite the wordsmith). That second novel was published recently. It is the first of a saga, titled: The Mystery of Farholt. You can read more about this, and his other ‘fictions’ here: https://johnstonewilson.wordpress.com
I use him as an example because his writing works well for my argument. Both of the texts produced by Jonathan are novels, designed and created by an individual employing the art of writing to tell a particular story.
As well, the fact that each represents a plotted narrative designed for a particular purpose furthers my argument that everything is fiction. How we determine the meaning of that term in relation to the ‘fiction’ of Jonathan’s thesis and The Mystery of Farholt is merely a difference of genre distinction. Therefore, I will conclude here by once again arguing that, because we are writers, and because what we are writing is fictitious in its being written, we are all novelists. There’s nothing really secretive about that. In fact, when you look around, there’s usually good evidence for the idea that not only is everything fiction, but that we are all of us living within novels of our own devising.
A few years back, the University of Edinburgh hosted Bruno Latour for its Gifford Lecture. Latour’s lecture series was titled: Facing Gaia: A New Enquiry into Natural Religion. You can find more about his six lectures, as well as view them, by clicking the link there. Also, my good friend David Robertson put a lot of work and effort into interviewing Latour for the Religious Studies Project. That’s worth looking at as well. Here’s Part One and Part Two.
At one of the lectures, titled “The Anthropocene and the Destruction of the Image of the Globe,” a lecturer from New College, the University of Edinburgh’s School of Divinity, which also houses the Religious Studies department and which has been my home for the last four years, questioned Latour’s use of the term ‘Anthropocene’ to describe the current age in which we now live. Instead, he argued, we aren’t living in a ‘time of man,’ but rather the ‘Christocene,’ an era defined by the influence that Christianity has collectively given to the ‘modern world.’ It shouldn’t be surprising that this lecturer is a theologian, and while his suggestion was really just a way for him (as many academics tend to do) to promote his own ideas on top of Latour’s, it’s an interesting take.
For my intentions here, I will do the same. I will build atop these distinctions with my own. This does not mean that I think they are wrong, but rather that I see the age in which we currently live as defined by something else as well.
We are currently living in a Profitable Age.
Here’s some evidence for this, based on data that I retrieved from perhaps the most accessible source: popular culture.
Depending on which edition you look at, J.R.R. Tolkien’s novel, The Hobbit, is roughly 300 pages in length. The remarkable success of Peter Jackson’s film trilogy of The Lord of the Rings not only garnered 17 Academy Awards, including Best Picture for The Return of the King, it also accumulated a financial largess roughly equal to 3 billion dollars. This might explain his reasoning for stretching the limits of The Hobbit into three films, each over two hours in length. Worldwide, the films have earned almost a billion dollars in revenue.
This isn’t limited to just this series of films. In the last few years we’ve been provided with a number of entertaining films in serial form: Harry Potter, The Hunger Games, Twilight, Fast and Furious 1-7. Likewise, in print form, it appears that publishers have been earnestly trying to locate the next Harry Potter, the next Twilight, the next The Hunger Games, the next Hobbit. Fifty Shades of Grey (the books) has sold over 100 million copies. The first film of the trilogy based on the novels, which premiered this year, has grossed roughly $600,000,000. It is third on the list of the most profitable films this year. The second is Avengers: Age of Ultron, the eleventh film in the Marvel Cinematic Universe, which has films planned up to 2019, with 22 films so far either already released or in development. The highest grossing film so far for 2015 is Furious 7.
Franchises are proving quite profitable.
Beyond popular culture, we might also find evidence for our living in The Profitable Age in politics, particularly with examples such as the United States Supreme Court’s decision in Citizens United vs. The Federal Elections Commission (2010). The case originally dealt with issues pertaining to Michael Moore’s Fahrenheit 9/11 (2004) and Citizens United’s Hillary: The Movie (2008), and the role these films played in deciding what constitutes a political statement, and who might be considered the ‘voice’ of that statement. In brief, the decision was, in essence, a First Amendment issue: in their argument, the Supreme Court declared that restricting or denying Citizens United from releasing the film would be an unconstitutional restriction of free speech. Though colloquially understood as removing any sort of restriction on what a corporation or union could give to, or spend on, a particular candidate, or as somehow meaning that ‘corporations are people,’ the decision really means that corporations were free to directly advocate for a candidate. Whether this entails financial advocacy is a matter of debate. For a better description, here’s a New Yorker article by Jeffrey Toobin that explains the case in much better detail: http://www.newyorker.com/magazine/2012/05/21/money-unlimited
Again, this seems like another exemplary addition to the idea that this is a Profitable Age.
One final example.
A week or so back I wrote something here about the ‘business’ of academia, and last week, in my discussion of the World Religions Paradigm, I touched on this a bit as well, concerning the way our ‘product’ is being presented in a manner that will guarantee financial success.
Tuition costs, for example, are on the rise. In the United States, the cost of a four-year undergraduate education has risen exponentially in the last thirty years. There’s a useful graph of this based on data provided by Collegeboard.org, as well as the unabridged data itself; notice that these prices have been amended to ‘2014’ dollars in order to address inflation. At the same time, ‘for-profit’ universities have been on the rise as well. Rather than my terrible description, here’s John Oliver talking about student debt, and for-profit education:
Humour aside, Oliver touches on a number of essential issues, particularly concerning the rise in the cost of an education, and the means with which students are driven to pay for these costs with loans. Granted, taking out a loan is one’s choice, a choice that each and every student makes in full knowledge of the amount and the difficulty there might be in paying it back. Of course, in many instances, an education is a necessary requirement for employment, so it is, in many ways, a double-edged sword, and an inescapable quagmire. As far as ‘for-profit’ education is concerned, here’s an interesting take on it from The Atlantic about the University of Phoenix (the largest for-profit education system in the US) by Bourree Lam, from April of this year. While this article addresses many of the issues facing for-profit education, such as the lack of relevant skills that they offer to students, as well as their low graduation rates, I think this quote is rather poignant:
People who graduate from these schools don’t seem to be getting jobs sufficient for paying off the costs of attending them.
I think this, perhaps more than anything, nicely sums up the Profitable Age.
Let me conclude here.
Here I have offered three examples. This is, I admit, not really enough to support my argument, but that should not dissuade the reader from appreciating my theory. This is a Profitable Age. Whether that is defined by film or novel franchises, by political developments, or by the business of academia, it seems, more often than not, that the world we are living in is dictated by profit. How this dictates the way we move forward, and whether that might mean a diminishment of value, is something we will have to wait and see.
In mid July of this year, we, as a collective human society, will have one more book to read by the novelist Harper Lee. Written before her famous To Kill A Mockingbird, Go Set a Watchman is a ‘sequel’ to it, set twenty years later. While the publication of this book has brought with it a renewed interest in her writing, it has also inspired a bit of scepticism about the legality, even the morality, of publishing it (considered problematic given Lee’s presumed health issues). Within the former category, a recent article in the New York Times caught my attention, particularly in how the author, Laura Tavares, makes use of To Kill a Mockingbird in a way that elevates it above the restrictions of mere aesthetic media.
Her intention, as described at the beginning, is to elucidate for instructors (who might be reading the article for the sake of using it in their classrooms) how such a novel might ‘speak’ to ‘real life:’
To encourage students to make these important connections, we’ve chosen to pair an excerpt from Chapter 15 of the novel with The Times’s article on the Equal Justice Initiative report, “History of Lynchings in the South Documents Nearly 4,000 Names,” with the goal of helping students more deeply understand “Mockingbird,” the world of the novel, and our own world.
Here’s a ready example. In his “The Author as Anthropologist: Some West Indian Lessons about the Relevance of Fiction for Anthropology,” in Eduardo P. Archetti’s Exploring the Written: Anthropology and the Multiplicity of Writing, Thomas Hylland Eriksen distinguishes between two ways in which the novel might function ‘as an ethnography:’
First, novels may serve as ethnographic sources and may to this effect rank with informant’s statements. At this level, the author—whether he is a Mittelholzer or a Naipaul—more or less unwittingly reveals aspects of his society. As Bakhtin and many others have reminded us, the author is a prisoner of his own time. The author, known through the novel, is here seen as an aspect of the production of society.
[…]
Second, novels may be read as ethnographic descriptions; that is, the information conveyed may be taken more or less at its face value, as a kind of ethnographic documentation. (191)
In this way, he continues, the novel and the ethnography are ‘relevant’ to each other, but they are not the same thing. To further delineate his meaning here, he states:
[Novels] cannot be used as plain ethnography since they do not profess to represent the truth and because their relationship to social reality is ultimately uncertain. Besides, if they are to be exploited as ethnographic sources (and not as evidence), the reader must be familiar with the society at the outset of the reading. They cannot, therefore, replace the ethnographic footwork either. It therefore seems a paradox that some of the best anthropological writings extant on Trinidad are works of fiction (cf. Melhuus, infra, for a Mexican parallel). In order to assess their validity, a reader must have first-hand experience of the society. (190)
However, though his distinction here between the ‘source’ and the ‘description’ is a useful one for determining how fiction might ‘function’ in a way exclusive of its existence as an aesthetic piece of entertainment, I would argue that he is incorrect in his strict separation of the ethnography from the novel. This is especially the case with his opening remarks about the ‘simple distinction’ between the two forms of writing:
Fictional accounts, then, present persons and events which have been invented by the writer. Anthropological texts try to present a few aspects of social reality as accurately as possible, taking account of the limitations entailed by fieldwork, ‘cultural translation’ (or, if one prefers, cultural reduction) and attempts at linguistic representations of society. Lies and deliberate misrepresentations are banished from anthropological scholarship, which should additionally—unlike fictional writing—try to present empirical material systematically and comprehensively and distinguish between description and analysis so that the reader may draw his or her own theoretical conclusions. (168-169)
He is, I would further argue, quite mistaken here, particularly concerning the supposed difference between ‘fictional’ and ‘anthropological’ accounts. Both are artifice, meaning both are designed and dictated by choice. Likewise, both are the result of a textual process, a ‘storytelling’ wherein the author has tried to re-create a discourse in a way that represents his or her subject in a manner ‘true’ to his or her interpretation. Indeed, I would agree in many ways with Clifford (1986) that ethnography is itself a type of ‘fiction:’
To call ethnographies fictions may raise empiricist hackles. But the word as commonly used in recent textual theory has lost its connotation of falsehood, of something merely opposed to truth. It suggests the partiality of cultural and historical truths, the ways they are systematic and exclusive. Ethnographic writings can properly be called fictions in the sense of ‘something made or fashioned,’ the principal burden of the word’s Latin root, fingere. But it is important to preserve the meaning not merely of making, but also of making up, of inventing things not actually real [emphasis in original]. (Clifford, “Introduction,” Writing Culture, 1986, 6)
Beyond mere etymological determination, I think Clifford is correct here, mainly because any and all textual representations are ‘fictional’ by their inherent ‘artificial nature.’ Eriksen can argue all he wants that fiction represents ‘lies’ or ‘deliberate misrepresentations,’ but I would again contend that this is equally a problem for the ethnographer, if only because he or she is, as Malinowski stated, ‘creating’ or ‘describing’ his or her subjects. As an intermediary between subject and reader, the ethnographer is just as much an author of ‘fiction’ as the novelist inventing his or her own subjects.
Which brings me back to Tavares and Eriksen. In my opinion, the former’s use of Lee’s novel and the latter’s differentiation between the novel as a source or a description of ethnographic ‘truth’ share the same DNA. In fact, I’d even go so far as to state that they are both offspring of the same parentage: the ethnography (a text designed to present a cultural or historical representation of a certain people, time, and place) and the novel (a text designed to present the fictional creation of an author intent on representing a particular individual or individuals in a certain time and place).
However, this also brings forth an issue that I believe is perfectly exemplified by the image I used for this week’s ‘feature image:’
While I am quite willing to blatantly claim that all textual representations are fiction by means of their ‘artifice-ness,’ this of course brings us into a discourse where, like the notion that ‘everything is fiction,’ we get somewhat distracted trying to sort out what might be ‘based on real life’ from what might be a story merely assumed by some to be the same. This is not, however, equal to a declaration that the story of Noah, which might be defined as myth, as truth, as both, or as neither, is definitively any one of these things. Rather, my point in having it here, and the point of this post in general, is a reminder that when we declare ‘everything’ as fiction because of the role artifice plays in the creation and presentation of interpreted ‘things,’ a movie about Noah and a movie about William Wallace become equally ‘based on real life.’ In other words, the distinction between what is ‘fact’ (quantitative data about lynchings in the US) and what is ‘fictional’ (Lee’s To Kill a Mockingbird) might blur into a perception where they become equal representations of some type of ‘truth.’
As I’ve mentioned earlier, I’m an ‘emotional writer,’ meaning that I have, like many writers I know, a certain method to my madness. One necessity I require is a good distraction. Too much time, effort, and focus on one thing makes, in my opinion at least, for too myopic a perspective. Distractions are fun, and they break up the monotony of doing research. It’s helpful, and I think healthy, to look away from one’s work from time to time.
Right now, of course, distractions are the last thing I need (or want, for that matter).
However, I came across something recently that I needed to discuss, even if only briefly.
One of my random sources of distraction is the inane and ridiculous website, Reddit.com. It’s not easy to describe the website, what ‘up-voting’ means, or how it affects your ‘karma,’ so I won’t try here. The best way to understand it is to just go there and make sense of it in your own way.
While browsing through the ‘all’ section, I came across this ‘meme:’
To explain this for the uninitiated, this is what is called a ‘confession bear.’ When you want to confess something, as was done here, you use this meme to do so. I should also add that Reddit is, if you want it to be, an anonymous website.
In essence, this individual was admitting to having had their PhD Thesis ‘ghostwritten’ for them. This is an interesting confession in itself; as it happens, a good friend of mine then sent me a link to an article (https://chronicle.com/article/The-Shadow-Scholar/125329/) written a few years ago by a gentleman who did this very thing for a living.
Published in The Chronicle of Higher Education in November of 2010, it soon became one of the most famous articles, if not the most famous, the magazine has ever run. In it, the author, who refers to himself as ‘Ed Dante,’ tells us about his job: the number of pages he tends to write in any regular work week, how he prepares his ‘research,’ and some personal insights about how he came to be a plagiarist for hire. Throughout the article he repeatedly refers to his most recent client, who has had the unfortunate circumstance of having an abstract he wrote for her accepted, and who now needs him to write the dissertation itself. The article is an engaging read, and will likely inspire (as it did in me) quite a flurry of emotional responses in those who read it. It was so successful that news agencies picked it up (like the ABC News clip below), and it was later expanded into a book, this time under the name ‘Dave Tomar.’ Dave, if that is his real name, is now a ‘legitimate’ author.
My interest in Dave/Ed’s story is of course piqued by the notion that any sort of writing is ‘fictional’ in its ‘made-from-ness,’ as well as by the question of whether we should consider anything he says to be ‘true,’ because, like writers of both fiction and non-fiction, he lies for a living (“all constructed truths are made possible by powerful ‘lies’ of exclusion and rhetoric […] even the best ethnographic texts—seriously true fictions—are systems, or economies, of truth”[1]). However, I’m also intrigued by the very nature of Dave/Ed’s self-description, and it is here that I’ll conclude this brief distraction.
I think his story speaks to an amended version of what a colleague at last year’s British Association for the Study of Religions conference called ‘indentured academia.’ Rather than an indentured perspective on ‘being’ an academic, I think it better speaks to the notion of ‘indebted academia’: the idea that, because academia is becoming a ‘business,’ students have no choice but to treat their education like a transaction.
In this way, it begins to seem, at times at least, to be all about the money. Alongside rising tuition costs, textbook costs, and the like, Dave/Ed’s story appears very much like a product of that transactional turn. Of course, I would immediately counter my own argument here by saying that, even from this more ‘financial’ perspective, ‘cheating’ should never be an option worth considering; still, it is a discursive influence we can’t simply ignore.
We might ask, then: as costs rise, and as academia becomes more and more of a business, does its ‘value’ somehow go down? For me, that hasn’t been the case, and cheating isn’t anything new. Yet even if Dave/Ed’s story is a singular example, and even if it doesn’t account for the wider meaning of ‘academia,’ it is still something that must be considered when we examine how ‘academia’ and ‘academics’ are perceived discursively within and without the context of ‘higher education.’
***I should mention that a good friend of mine, and a pretty clever guy, Jonathan Tuckett, just successfully defended his Thesis, and I swear this post is neither inspired by, nor a reflection of, his achievement. I swear. Really. Check him out; he writes fiction (the made-up kind, as well as the made-from)!***