Shameless Self-Promotion

My initial intention with this week’s post was to discuss the little differences we might imagine between cultural and national identities, particularly concerning the notion of ‘community’ in regard to the odd liminality felt by the ‘foreign’ PhD student.  That will be next week.

Today, a special issue of the journal Science, Religion, and Culture on Atheism, Secularity, and Science was published, to which I contributed.  Not one to let the iron cool before striking, I thought it might be useful to use this week’s post as a blatant and entirely shameless plug, not only for my own article, but for the others that accompany it as well.


I first came to learn of this special issue through Tommy Coleman, a colleague at the Religious Studies Project, who has made quite a name for himself in the field of the Psychology of Religion and its influence on the study of Atheism, secularity, etc.  He is quite the prolific scholar, and his work on these subjects is well worth seeking out.

As the co-editor of this special issue along with John R. Shook and Ralph W. Hood Jr., Tommy has played an integral role in communicating and assisting throughout the process.  For that, I am quite grateful.


The issue itself is, as I perceive it, an attempt at tackling the ever-growing identity crisis within the field of Atheist Studies, particularly given that there are some (perhaps many) who would likely disagree with my notion that this field should, in any way, be referred to as such.  In fact, this is rather well said in the issue’s Introduction:

Where it was once typical to begin a research article, introduction to a book volume, or special journal issue such as this one, by the researcher lamenting their particular field of study for neglecting such topics, this kind of pleading is no longer tenable (Bullivant and Lee, 2012). Nonetheless, as researchers we cannot afford to rest on our laurels for very long. While studies on atheism and secularity now exist across disciplines ranging from psychology, cognitive science, sociology, religious studies, philosophy, anthropology, and many others, this provides only a theoretical and methodological starting point from which to explore the given topic. Importantly, within each of these disciplines lay multiple competing frameworks, field-specific conceptualizations, and inter-disciplinary scuffles as to precisely what secularity is, and how to study it. Typically, pre-existing frameworks developed for use in religious believing populations are modified to fit nonbelievers, as nonbelief is often presumed to be the dark shadow of whatever belief or religiosity is (Coleman and Arrowood, 2015; Silver, Coleman, Hood, and Holcombe, 2014). How far this approach will go toward answering whatever questions the scholar is interested in is an open one.

While there are points and theoretical positions within the articles published here with which I find myself in disagreement (such as Jonathan Jong’s “On (not) defining (non)religion”), the issue itself makes a number of quite useful strides toward establishing some sense of academic identity, which is no small endeavour.

For years now I have been referring to this area of interest as a ‘Flying Dutchman,’ cast about on a sea of opposed approaches and interests, without a distinct port-of-call.  Where before I might have lamented this fact, as we might when presented with the myriad ways in which the very terms we use are defined, my position has shifted a bit.

No longer do I think our ‘Flying Dutchman’ status is detrimental to our cause.  After all, while many voices proclaiming different things might strike some as an atonal din, to others they might sound like a chorus.

Or, said otherwise, and as I argued in my article: rather than dismiss this discourse because it reflects many voices saying different things, why not embrace it and simply allow people to say what it is they think and believe?

It is my opinion that this special issue does just that.

For this reason, not only was I quite happy to have been considered for this publication, I am also hopeful that it might be perceived as an example of how our theoretical and methodological hodgepodge might also prove ultimately beneficial to the academy’s larger understanding of Atheism and its many cognate terms.

For the benefit of the reader, then, I’ve provided the following links:

“An Introduction to Atheism, Secularity, and Science,” by Thomas J. Coleman III, Ralph W. Hood Jr., and John R. Shook

“On (not) defining (non)religion,” by Jonathan Jong

“Discourse Analysis and the Definition of Atheism,” by Ethan G. Quillen

“The NonReligious-NonSpiritual Scale (NRNSS): Measuring Everyone from Atheists to Zionists,” by Ryan T. Cragun, Joseph H. Hammer, and Michael Nielsen

“Atheism, Wellbeing, and the Wager: Why Not Believing in God (With Others) is Good for You,” by Luke Galen

“Atheism Looking In: On the Goals and Strategies of Organized Nonbelief,” by Joseph Langston, Joseph Hammer, and Ryan T. Cragun

“Explaining the Secularity of Academics: Historical Questions and Psychological Findings,” by Benjamin Beit-Hallahmi

“The God of Nonbelievers: Characteristics of a Hypothetical God,” by David F. Bradley, Julie J. Exline, and Alex Uzdavines

“When Rabbis Lose Faith: Twelve Rabbis Tell their Stories about their Loss of Belief in God,” by Rabbi Paul Shrell-Fox

Research note: “A Profile of the Members of the British Humanist Association,” by Gareth Longden

Research note: “Simple Markov Model for Estimating the Growth of Nonreligion in the United States,” by John Stinespring and Ryan T. Cragun

Book Review: Trent Dougherty, The Problem of Animal Pain: A Theodicy for All Creatures Great and Small, by Liz Goodnick

Book Review: The New Atheist Novel: Fiction, Philosophy, and Polemic after 9/11, by Marcus Mann

Book Review: Living the Secular Life: New Answers to Old Questions, by Amanda Schutz

The Expert in the Room

In an attempt to avoid the rain the other day, I ducked into the Scottish National Gallery here in Edinburgh.  It’s a rather lovely gallery, neither too large nor too small, with some rather impressive pieces.  Two of my favourites are “The Man of Sorrows” (1860) and “David in the Wilderness” (1860), both by the Victorian painter William Dyce.

[Images: William Dyce, “The Man of Sorrows” and “David in the Wilderness”]

I enjoy these paintings because they represent a change of setting, a perspective of the artist that contradicts the ‘historical record,’ wherein his subjects (Jesus and David) have migrated from the realm of the Biblical Holy Land to Dyce’s own: the Scottish Highlands.  I especially enjoy what these paintings tell us about an artist’s perception, about how a narrative might be adopted and amended to suit one’s own context.  Or rather, how, as a Christian, Dyce has placed these individuals into his own geographical context, shifting them out of legend and into something more attainable.  He has, in essence, made his religion ‘Scottish.’  To me, this seems strikingly similar to the way in which religious beliefs shift and translate, how they become nationalised and tied in with the civil religion of a central location, their discourses homogenised into something entirely new.

Dyce is also known, perhaps more famously, for his “Pegwell Bay–A Recollection of October 5th 1858” (1858-1860):

[Image: William Dyce, “Pegwell Bay, Kent - a Recollection of October 5th 1858” (?1858-60), Tate, purchased 1894: http://www.tate.org.uk/art/work/N01407]

Renowned for its association with the genre of ‘Atheist Aesthetics,’ “Pegwell Bay” depicts a discursive shift, a narrative ‘sea change’ wherein the once predominant use of ‘religious’ imagery has been replaced with that of science.  Here, families gather shells and fossils on the low-tide shore as Donati’s Comet soars overhead.  Pegwell Bay was a site frequented by novice and professional fossil hunters, as well as notable theorists like Darwin, and Dyce’s use of it as a setting allows the image to speak on his behalf, revealing a discursive commentary about the ebbing tide of religious belief and the reality of a more science-minded perspective on life, the universe, and everything.

In his Faith and Its Critics, David Fergusson contends that this painting depicts a type of ‘wistful’ and ‘nostalgic’ Atheism, a longing for days gone by, which matches in tone the basis of certain theoretical definitions of Atheism by scholars such as Hyman and Buckley: ‘Modern Atheism’ (that which arose out of and within the Enlightenment) appears as a ‘re-emergence’ of the classical ‘rational-naturalism’ that defines our notion of ‘Ancient Atheism.’  Likewise, this is an Atheist discourse equally expressed in textual examples, such as Thomas Hardy’s “God’s Funeral,” or Matthew Arnold’s “Dover Beach.”  The latter even evokes a sense of tidal retreat, a poetic mimicry of “Pegwell Bay” via signifying terms like the ‘long withdrawing roar’ of the ‘sea of faith’:

The Sea of Faith
Was once, too, at the full, and round earth’s shore
Lay like the folds of a bright girdle furled.
But now I only hear
Its melancholy, long, withdrawing roar,
Retreating, to the breath
Of the night-wind, down the vast edges drear
And naked shingles of the world.

While this is all very interesting, and is definitely worth a bit more discussion, my point with this post is actually about something else entirely, inspired by a humorous exchange that I witnessed in the ‘Impressionist’ room of the Scottish National Gallery.

I was enjoying one of Monet’s ‘Haystacks,’ standing off to the side, and a ways back.  Two gentlemen, perhaps in their late fifties or early sixties, approached the painting.  The one on the right, the taller of the two, drew his companion’s attention to the canvas.

“See this brushstroke here,” he said, “that’s indicative of the impressionist’s style, that heavy use of paint, and the way he dragged it up, and to the left.”

“Yeah, I see that,” his companion replied, a slight hint of angst in his voice.

“He had a remarkable eye for colour, and for distinguishing simple tones within the palette, most notably for his use of blue.  You should see his ‘Nymphéas’ at L’Orangerie, in Paris,” the first man said, his voice adopting a velveteen accent.   

The companion smirked slightly, then responded, as if pulling a sandwich out of his pocket and presenting it as evidence:

“I have a minor in Art History.  I’ve seen it.”


The expert is an odd character, mostly because he or she can appear anywhere.  We are all experts at one thing or another, from the most banal and prosaic to the select and specific.  Likewise, the expert might appear not only at the most unexpected times and places, but from the oddest of origins.

One of the great myths of the PhD is that achieving one will make you an expert.  Even I fell into this trap years ago when I stated I wanted to be ‘a world’s expert’ on Atheism and Ian McEwan.  In retrospect, I now think of that as a rather silly goal.  This is especially the case now that I’ve learned that, after years of isolated study on a particular topic, what you really become an expert on is the realisation that you’ll never actually know everything there is to know about that topic.  Or, in more colloquial terms: the more you learn, the less you know.

There’s a useful ‘illustrated guide’ to what I mean here, which I’ll happily steal from Matt Might:

When we imagine all of human knowledge as a circle, by the time we finish our Bachelor’s Degree, we’ve accumulated a rather slight ‘specialty.’  That looks like this:

[Image: the BA specialty, a small dent in the circle of knowledge]
With a Master’s Degree, that specialty grows a bit:
[Image: the MA specialty, slightly larger]
By the time we’ve reached the PhD, that specialty begins to push against the boundaries of known human knowledge.  This creates a darling little bump:
[Image: the PhD specialty pushing at the boundary]

So now, given that the circle from which our little bump protrudes represents all human knowledge, it’s important to acknowledge where our expertise exists within this context:
[Image: the PhD bump against the whole circle of human knowledge]

While Matt Might’s illustration here is rather useful (it’s also available for purchase, for those interested), it also quite poignantly captures the oddities of the ‘expert.’

Moreover, it serves to remind us, just like my story of the two ‘experts’ staring at Monet’s canvas, that if we’re all experts, then perhaps none of us are.  Once we realise that the more we know about something, the less we actually know in general, and accept that the expertise we’ve accumulated is no substitute for the world’s knowledge, then we’re all rather ignorant.  Does this mean we’re in denial, or that our attempts at proving our expertise to people who seem to have similar expertise are a means of pacification?  Are we trying to claim ownership?

Perhaps.

Or maybe not.  After all, clearly I’m not ignorant about Atheist discourse.  Just look at what I said above.

Clearly I’m an expert.

In Comparison a Disappointment Dwells

The title of this post is stolen from J.Z. Smith, particularly from a chapter titled “In Comparison a Magic Dwells” in his Imagining Religion.  In it, in his uniquely erudite yet frightfully frustrating tangential style, Smith constructs the argument that ‘comparison’ is an endeavour that leads, inevitably, to theoretical disappointment.  As he states toward the end:

[…] comparison is, at base, never identity.  Comparison requires the postulate of difference, as the grounds of its being interesting (rather than tautological) and a methodological manipulation of difference, a playing across the ‘gap’ in the service of some useful end. (35)

In the contextual realm of identity construction, comparison becomes a necessary evil, the utilitarian acknowledgement of the way we identify ourselves in relation to others, how they recognise us as different to themselves, how that then dictates a two-pointed acknowledgement of opposed ‘selves,’ how we then recognise those ‘selves’ within groups, both similar and different, and in opposition to opposing groups, and vice versa, etc., ad nauseam.

Beyond this condition, however, difference becomes, as Smith points out, ‘problematic.’  In comparison, we find ourselves not only seeing the difference between things, but how that difference reveals a bias we might have inherently developed about something we might consider ‘established.’

Here’s a good example:

In 1960, Harper Lee’s To Kill a Mockingbird was published.  In the fifty-five years since then, it has become thoroughly ingrained in American discourse, a fictional representation of a darker and more racially sinister nation, told from the perspective of a naive young girl known as ‘Scout.’  Today, Lee’s second novel, Go Set a Watchman, will be released.  Written before To Kill a Mockingbird but held from publication, it is a ‘sequel’ in the sense that its characters are chronologically and philosophically developed versions of those represented in To Kill a Mockingbird; it is, however, a very different sort of novel.  One major difference, as pointed out by a number of seemingly disappointed critics, is the ‘racism’ and ‘bigotry’ of Scout’s father Atticus, whose stalwart and passionate fight for justice created the moral backbone of To Kill a Mockingbird.  In this iteration, he is apparently, and bluntly, a very different man.

To mark this difference, and via a clever combination of two popular narratives having been recently ‘re-written’ for a new audience, the New Yorker published this cartoon:

What perhaps intrigues me the most about the outcry over Atticus’ bigotry, revealed to Scout (now referred to by her real name, Jean Louise) via her sudden, almost shocked realisation that the moral compass against which she has shaped her own perception of the world is now a representative of the Southern bigotry of the 1950s, is that I don’t think it is surprising at all.

This is where comparison comes in.

Atticus is the construction of Lee’s imagination, meaning that though we might have collectively elevated the character to the level of a paragon, a representation of a ‘good man,’ perhaps this is the Atticus that has existed all along.  That is, as the creator of the text, Lee’s notion of Atticus is really the only ‘true’ description.  Beyond that, and as readers, all we are capable of doing is perceiving that individual via the text ‘as it is.’  In other words, the shocking revelation that Atticus is a bigot, which works a wonderful magic for us as we empathise with the narrative’s protagonist, shouldn’t be all that shocking unless we have ‘established’ this character’s description in a particular way, such as Jean Louise has done.  This reveals our inherent problem.  In our comparison of these two Atticuses (Attici?), we reveal our bias, our perception of an individual who, for the last fifty years, has meant ‘one specific thing,’ when in actuality he has always been this way, both in Lee’s imagination and in the fact that Go Set a Watchman was written prior to To Kill a Mockingbird.  We are ‘shocked’ because we have betrayed our bias.  Our use of comparison has exposed our collective opinion that this individual, this fictionalised exemplar, is no longer represented in the way we had unanimously decided.

In this way, comparison breeds disappointment, simply because one thing compared with another reveals the fact that we might have been ‘wrong’ in our crystallised notion that this character was supposed to be a certain way.


Here’s a comparative example:

In 1922, Bronislaw Malinowski’s classic text, Argonauts of the Western Pacific, was published.  This text so ideally described the ‘doing’ of anthropology, not just in regard to the proper process of fieldwork, but in how that fieldwork should be translated into a realist text representative of a whole culture via select detailed parts, that it became something of a primer.  Soon, anthropology in general was conducted via his method, and his style was replicated to the point of exact duplication.  Most pertinent, perhaps, was the strict objectivity he prescribed: a complete removal of one’s opinion and voice, the evacuation of subjective notions for the benefit of objective facts.

The post-Malinowski era represented a rigorous and strict methodological paradigm: the placement of the anthropologist amongst his or her subjects, removed from his or her own culture; an immersion that required learning those subjects’ language, beliefs, customs, and rituals; a participant observation wherein the ‘imponderabilia,’ the natives’ day-to-day minutiae, would become first-hand experienced knowledge.  Moreover, this would then inform the textual representation later constructed to illustrate the subjects’ culture, an omniscient and equally objective text focused with exact precision on providing the reader a vivid snapshot of another way of life.

In 1967, forty-five years after the publication of Argonauts, the anthropological advocates of this methodological precision would find themselves disappointed by comparison.

In a series of unfortunate circumstances, that year brought the publication of Malinowski’s personal diary, recorded during his time amongst the Trobriand natives.  He had died twenty-five years prior, and because we will never know whether he ever intended these personal reflections to become publicly accessible, their content could only ever be read ‘as it is,’ and thus without his personal commentary.

Thus, A Diary in the Strict Sense of the Term was quite shocking.  Not only did we learn of his bizarre medical and psychological eccentricities (including an odd obsession with reading fiction while simultaneously hating it), we also came to find that he, the originator of anthropological objectivity, held a number of rather disparaging opinions about his subjects.  On certain occasions this turned to an almost detailed hatred, not only of his subjects’ way of life, but of their strange customs and culture.  Here, for the first time ever, was a subjective perspective from the paragon of the ‘objective observer.’

The result of this publication proved effectively critical of the notion that the ‘doing’ of anthropology was confined to Malinowski’s prescribed method, to the point that, over the next thirty or so years, the strict objectivity of both observing and textually representing one’s subject gave way to a number of experimental products, including the fictionalisation of one’s fieldwork in the ‘ethnographic novel,’ and the reading of fiction ‘as ethnography,’ the latter of which I myself am guilty of exploiting.

Like my example above concerning Lee’s novels, the comparison here once again leads to disappointment, which then leads to a series of adaptations: shifts made not only to pacify the sense that something we once thought established has been ‘undone’ by a new perspective, but also to re-establish the biases that comparison has revealed.

With both these examples, Smith’s notion of comparison, and thus the larger notion of determining a ‘difference’ between two things, reveals not just a bias on the part of the comparer suddenly disappointed by that comparison, but an impractical approach to the study of two things interrelated by an inherent similarity (such as his issue of the terms ‘history’ and ‘religion’ in his own description of himself as an ‘historian of religion’).  Thus, by comparing these two issues of comparison, we are reminded that any sort of comparison inevitably leads to disappointment, sending us further toward the need to adapt, pacify, and even re-establish our perceptions of that which we study, and further complicating the process as a whole.


BUT WAIT, you might ask, isn’t what you’ve done here just another comparison?  Did you not just compare the disappointment of Atticus’ bigotry in Go Set a Watchman with the disappointment felt by anthropologists after Malinowski’s Diary was published?

Yes, I did.  And isn’t that disappointing?

Nothing to be Afraid Of

Last Thursday, I defended my life.

That is to say: last Thursday, I sat for my Viva, an oral examination to defend the thesis that I’ve been working on for the last four years.  To refer to this as ‘defending my life’ is a bit of a mistranslation, as ‘viva voce’ really means ‘oral examination,’ or more specifically, ‘with a living voice.’

Yet, still, I think the idea of this being a defence of one’s life isn’t all that imprecise.  After all, writing the thesis has been my life for the last four years, and especially as it has brought on an entirely new sort of life within a ‘foreign’ country, the thesis has been the central point around which my life has orbited in that time.

However, this was by no means a ‘trial’ of any sort.  At least not like I thought it would be.  I blame this solely on my examiners.

In fact, were I to describe my experience with the Viva, this close to the aftermath, and in a single word, it would be: demythologised.

Here’s what I mean.


Being a PhD candidate is in a particular way like being a young student in your late teens, still in high school but nearing the end, with an older sibling/cousin/friend/acquaintance who has come to visit and share with you the painful realities of their experiences in the ‘real world.’  They’ll describe that world in realistic terms, painting a picture that reflects back a harshness where the ease and simplicity of youth is quickly replaced with taxes, insurance, rent, jobs, pay cheques, medical bills, student loans, etc.  You might listen to their sage wisdom and take some of it to heart, but it won’t really ‘sink in.’

Then, and with just a hint of irony, you might find yourself, years later, sharing similar wisdom with your own younger siblings/cousins/friends/acquaintances.

This is not unlike the advice you might receive from colleagues who have passed through the viva stage.  However, in this iteration, the message seems a bit more constructive than the ‘dose of reality’ you might get from the previous description.

In fact, one of the predominant advisements I’ve received over the years about the viva was: It’s nothing to be afraid of.

That’s nonsense, I’d proclaim.  How could it not be something to be afraid of?  Here is the culmination not just of all your time working on this one particular project, a close and critical examination of 100,000 words wherein typos are bound to happen and arguments might seem less developed than someone might want, but, perhaps more frightening, here is the culmination of years, a decade, maybe more, of research and studying and moving from university to university, city to city, country to country.  Here is the last and final defence you must make to prove that you are worthy to join that extremely elite club of individuals and thus earn the title ‘Dr.’  Is this not, of all the things one must do within this movement up the academic ladder, the quintessential thing to be afraid of?

Someone once put it to me this way: as a kid, when you were playing video games, which was scarier, getting through the first level, which might have seemed hard at the time, or spending hours/days/weeks/months playing through a game, getting to the final boss, and realising that if you lose here, everything that came before will have been in vain?

That, to me, seems like something worth being afraid of.

Of course, I was wrong.  This is the mythology I built up, the narrative I convinced myself was real, fed by a discourse composed of things like this:

[Image: ‘thesis defense’ comic]

When I’d hear the ‘it’s nothing to be afraid of’ line I thought it simply did not pertain to me.  It was something one merely said in the euphoria that followed the viva, usually spoken by individuals who appeared physically and emotionally exhausted, driven half-mad by that final battle, a piece of their soul left somewhere behind.

These were the type of thoughts that preceded my own experiences with the viva.


As I blithely stated above, I blame much of the demythologisation here on my two examiners.  I somehow got quite lucky: the two individuals I had personally chosen, who were at the top of my list, were those who would read, examine, critique, and discuss the thesis with me.  Not only did I select these individuals because I thought their backgrounds fitted the topic and contents of the thesis, but because I respect them above all others as exemplary experts in their fields.  What this produced was an examination less like a defence and more like a discussion, as if somehow we had each colluded to transform the final hurdle in my race to the doctorate into an engaging and quite beneficial supervisory meeting.

Thus, even when we disagreed on points, there was a congeniality underscoring the criticisms, each suggestion offered as a respectful means of assisting me in making the thesis the absolute best, and thus the clearest and most definitive, text it could possibly be.

Though the final verdict requires corrections to be made within the text, and though this means I did not ‘ace’ the thesis (those in my inner circle here have taken to calling that outcome the ‘Whitney,’ after a genius colleague’s stellar viva result), I have come away from the experience not just invigorated about addressing these corrections, but with a newfound admiration for the topic itself.  In other words, because my examiners did such an impeccable job, I’m spending these immediate days after the viva neither exhausted, nor wishing to remove myself as far from the thesis as possible, but excited about the prospect of making it all that much better.


I will conclude here with two observations:

First, as I was preparing myself last Thursday morning I was trying to remember a quote from Hemingway about the fear one might have of writing.  For whatever reason it kept popping up as I read my Introduction and Conclusion for the fourth time, like a half-memory that my brain kept trying to link to my argument.

I eventually found it:

“I write one page of masterpiece to ninety one pages of shit. I try to put the shit in the wastebasket.”

To briefly put this into context, it comes from a letter that Hemingway sent to F. Scott Fitzgerald after the latter requested the former’s opinion on his recently published Tender is the Night.  Just as much as Hemingway is mythologically notorious for an incessant need to prove himself truly masculine, Fitzgerald is equally mythologised for being on the opposite end of the spectrum.  Consider, for humorous example, this passage from Hemingway’s A Moveable Feast:

‘Zelda said that the way I was built I could never make any woman happy and that was what upset her originally.  She said it was a matter of measurements.  I have never felt the same since she said that and I have to know truly.’

‘Come out to the office,’ I said.

‘Where is the office?’

Le water,’ I said.

We came back into the room and sat down at the table.

‘You’re perfectly fine,’ I said.  ‘You are O.K.  There’s nothing wrong with you.  You look at yourself from above and you look foreshortened.  Go over to the Louvre and look at the people in the statues and then go home and look at yourself in the mirror in profile.’

‘Those statues may not be accurate.’

‘They are pretty good.  Most people would settle for them.’

‘But why would she say it?’

‘To put you out of business.  That’s the oldest way in the world of putting people out of business.  Scott, you asked me to tell you the truth and I can tell you a lot more but this is the absolute truth and all you need.’ (Pg. 163) 

My quote above stems from another advisement from Hemingway, a criticism not so much of Fitzgerald’s writing, but of his lack of confidence as a writer.  Here it is within the context of the letter:

For Christ sake write and don’t worry about what the boys will say nor whether it will be a masterpiece nor what. I write one page of masterpiece to ninety one pages of shit. I try to put the shit in the wastebasket. You feel you have to publish crap to make money to live and let live. All write but if you write enough and as well as you can there will be the same amount of masterpiece material (as we say at Yale). You can’t think well enough to sit down and write a deliberate masterpiece and if you could get rid of Seldes and those guys that nearly ruined you and turn them out as well as you can and let the spectators yell when it is good and hoot when it is not you would be all right. (Re-printed via www.lettersofnote.com)

Hemingway is telling his friend, in his own way, to ‘forget his personal tragedy,’ to move beyond those things that might have caused his anxiety and diminished his confidence, to let go the myth that he has anything to be afraid of.

He tells him: “Go on and write.”

Which brings me to my second conclusive observation.

Leading up to the viva I was told a number of times that the experience was nothing to be afraid of, that I would ‘do fine.’  While this advice did, in fact, come true, it is not something I find myself able to adopt.

I suppose, like all those times when, as a young man, someone gave me sage advice about the realities of the real world, I am once again dismissive.  Not in a rude or negative sense, mind you, but in a practical way.  Yes, my viva experience was wonderful, better even than the versions I had imagined or day-dreamed in order to pacify my anxiety, but I also think the mythology of it was necessary too.  Much as Fitzgerald’s myth about the poorness of his writing forced him to ensure it was always clean and detailed and perfect, were it not for my fear of the viva, perhaps I would not have been as prepared as I was, or, more importantly, perhaps it would not have been such a rewarding experience, simply because that reward came from the demythologisation of it.

In other, and final words, were I asked what to expect from the viva by a colleague approaching their own, because my fear of it proved so useful to the outcome itself, “it’s nothing to be afraid of” is something I’m afraid I just couldn’t say.


***One last thing.***

Because I turned to it often when accomplishing the great milestones of my thesis, here’s another discursive example that I think nicely puts into perspective the myth of the viva: