Monthly Archives: October 2017

The Future of Objectivity (2)

Reading Bammer and Boetcher Joeres’ The Future of Scholarly Writing through the lens of “objectivity”, guided, in the first instance, by the book’s own index, reveals a somewhat embattled concept. (See part 1 here.) Objectivity is mentioned mainly as an illusory experience to be eschewed or an unreasonable demand to be rejected. In some cases (but not the one I will talk about in this post) it is granted a kind of limited dignity, but the general mood of the book’s contributors seems to be that objectivity is overrated. It has had its chance to shape our scholarship and it has been found wanting, even damaging. Some would prefer it went away altogether; others would at least open the literature to alternatives. (Of course, it has already been opened in this way in places.) For my part, I am happy to acknowledge not only that the pursuit of objectivity has costs but that some of those costs have, historically, been too high. At the end of the day, however, I hope the notion of objectivity retains its epistemological legitimacy. It is worth pursuing — perhaps not solely for its own sake, but for ends that are in fact quite valuable.

“I have no problem declaring,” says Leo Spitzer, “that I am not an ‘objective historian’ in the old sense” (193). Indeed, he “abhors” the traditional “neutrality” of historians and their “omniscient narratives.” He does not, like them, “claim to stand above the fray” but is, rather, an “engaged historian” who writes in a “personal voice”. Perhaps most tellingly, he refers to this as “so-called” academic writing.

He grounds his stance in his “feeling of empathy”, which he defines and specifies as “a predisposition to sympathize with the hopes, aspirations and frustrations of the subjugated and displaced” and traces back to his childhood among “German speaking, largely Jewish refugees” in Bolivia and, later, as an immigrant to the United States. But it is, perhaps surprisingly, not this personal background that he uses to justify his rejection of objectivity. Indeed, he seems, for a moment, apologetic about the limits of his empathy, his inability to cultivate it to the point of an objective understanding of the people he has studied. He seems to be saying that true objectivity requires us to be able to see the world exactly as someone else does, to “get inside their skins”. It is, in the first instance, because this is beyond the reach of even his empathy, that he cannot declare himself an “objective historian”.

I don’t share this view of objectivity. I don’t think we can or should try to be objective about the subjective, lived experience of other people. I don’t think failing on this score is a failure to be objective. Objectivity merely requires us to note those facts that are experienced in the same way by anyone who has been acculturated to experience them in that way. A fact can be experienced differently by different people. An “objective” fact is what everyone who is properly trained sees in it.

Spitzer rejects this kind of objectivity too, however. His goal, he says, “is not to provide a seamless impersonal narrative in which subjectivity and emotion are suppressed or left unacknowledged.” But that’s putting it rather strongly. After all, many claims of historians are perfectly objective acknowledgements of the subjectivity and emotion of the people they study. (My favorite example is when Daniel Defert declares that “Wage-earners liked having the right to find employment where they pleased.”) We can very definitely write a seamless impersonal narrative in which subjectivity and emotion are foregrounded and acknowledged.

For Spitzer, however, it is necessary to reject objectivity because historians are themselves implicated in the historical process. This is perhaps the best argument he offers, since it points to the very real reflexivity of all social science, not just history. We are in most cases members of the cultures we are studying, a fact that I myself have recently invoked for my own purposes. (Of course, my argument there was not against being objective in our scholarship, but against being objective about our scholarship. The distinction is an important one, but one I’ll leave for another occasion.) Here’s how Spitzer describes his perspective on history and historiography:

For me, the voice (or, perhaps more accurately, the voices) of the historian, however interested, as well as the multiple voices and memories of the participants–the stories they tell and how they tell those stories–are as much part of the fabric of history as are written records and other archival materials. To take into account and reveal subjectivity and affect–to consider what is remembered as well as what is forgotten, fears as well as imaginings, the apprehension and misapprehension of events–complicates and restores a measure of contingency to history. It deepens our historical understanding and helps us to resist interpretative closure.

There is little one wants to disagree with here except the suggestion that these goals are beyond the reach of “objective historians”. Indeed, one wants to say that in order to do justice to the many voices of the people who live and shape history one might usefully document them, not just in the first-hand personal account of the historian, who hears a story and tells it to the reader, but by corroborating those aspects of the stories that can be corroborated and thus made better able to stand up against the “contingency” of history. That word, after all, denotes the well-known possibility of “rewriting” history, to include those previously excluded and exclude those previously included, to redistribute the emphasis, to shift the blame, to declare the victor. Historians play a significant role in history for this very reason, but we do not want them to be merely another set of history’s actors, another group of participants. We want them, to use Spitzer’s own phrase, “to stand above the fray”, to cultivate an impartiality.

Surely a historian can acknowledge multiple stories, multiple accounts, multiple voices, and not declare any of them “objectively” true? The objective facts of history are that the stories are being told.

[Part 3]

Authors

I want to talk about something that has a bit of history and can get very complicated. Back in the 1960s Michel Foucault and Roland Barthes began to question whether “the author” had a future in a “postmodern” culture. (They didn’t call it that, but it’s what we have come to call the future they were talking about, which bears some resemblance to our present, of course.) Foucault asked us to consider what the function of the author in discourse has historically been and whether that function might not change. Indeed, is it not possible that “the author function” could be transformed entirely beyond recognition? This would in effect be what Barthes called “the death of the author”.

Like I say, this isn’t a new issue. It’s been a topic of discussion for fifty years now and I’m sure there are people who will take issue with my simple-minded statement of the problem even in the preceding paragraph. I want to take it up within the limited domain of “scholarly” discourse. Is the problem of authorship different for scholars? Does the “scholar function” mark a different set of problems?

In the natural sciences, this hasn’t really ever been an issue. Scientists do not consider themselves primarily writers but, precisely, scientists. They do their work in labs and observatories or out in the field. They worry more about coordinating their hands and eyes with the apparatus than they do about finding their “voice” in the “text”. There aren’t really “authors” in science, except in very special cases, usually when a scientist begins to “popularize”. Carl Sagan, for example, had an “author function” in discourse. But it must be noted that this function doesn’t really have a place in the technical, scientific literature. It’s in public discourse that his voice is heard.

Since the early twentieth century, the social sciences have–sometimes eagerly, sometimes reluctantly–tried to emulate the discourse of the natural sciences. While social theory was once organized around a few major figures–Marx, Weber, Habermas, to take one famous lineage–it is now increasingly organized into research projects with rather more anonymous contributors. Social scientists spend less of their time reading and critiquing their precursors these days and more of their time generating new “empirical results”. Even before Foucault, Martin Heidegger was noticing this more “incisive atmosphere” in academia, which was shaping minds of “a different stamp”. They spent less of their time reading, thinking and writing, and more of their time travelling, meeting and negotiating. Today we might add that they spend a great deal of time writing grant applications and research assessments. “The scholar disappears,” as Heidegger put it.

Maybe I’m a romantic, but I don’t think we should abandon the author function in scholarship. Foucault found the questions we usually ask about an author “tiresome”:

  • “Who is the real author?”
  • “Have we proof of his authenticity and originality?”
  • “What has he revealed of his most profound self in his language?”

He wanted to ask other questions:

  • “What are the modes of existence of this discourse?”
  • “Where does it come from; how is it circulated; who controls it?”
  • “What placements are determined for possible subjects?”
  • “Who can fulfill these diverse functions of the subject?”

It’s not that I don’t think Foucault’s questions are interesting to ask about major writers like Shakespeare and Flaubert. It’s just that I’m not sure they are very helpful in thinking about our own work and that of our peers. I always think of something Wayne Booth said in the preface to his Rhetoric of Irony:

“I have heard it said that the two standard tutorial questions at Oxford are “What does he mean?” and “How does he know?” I doubt the report—no university could be that good…”

I think these are good questions to replace Foucault’s “tiresome” ones, and better than the ones he proposes, since his threaten to do away with the author altogether. I don’t think we should always reduce “How does the author know?” to a question about the control and circulation of discourse. We shouldn’t always think that “authors” must account for their “authority” (and hence their authenticity). We should mean these questions in the ordinary sense of what justification the author has for believing the claims made in the text. Are those justifications also good enough for us?

I truly believe that our discourse needs to recover its author function. It should be possible to ask each other what we mean and how we know without raising profound questions of authenticity or dissolving them in “diverse functions of the subject” (collaborating with other subjects on a proliferating network of social media). I’m not saying an analysis like this can’t bring interesting features of discourse to light. I’m just saying we shouldn’t be embarrassed to presume the authority we need to speak our minds. And while I would never demand that authors reveal their “most profound selves” to me in their writing, I do expect to learn what is on their minds from reading them.

The Reader

“To know whom to write for is to know how to write.” (Virginia Woolf)

Your scholarly writing depends on the existence of about two dozen readers whose minds you want to change and who are qualified to change yours. You should know who they are and what they’re thinking about. The more accurately you know this, the more effective your writing will be.

In academia, please remember, there is no mystery about who your readers are. They are your peers. They know more or less what you know and live lives more or less like yours. If you are a scholar you are writing for other scholars in the same field. They have read the same work that you have read, including yours and theirs. You are, in principle, familiar with each other’s work. You share the experience of collecting and analyzing similar kinds of data, of framing these analyses with a similar set of concepts, and, importantly, of regularly teaching what you know to similar kinds of students–namely, university students.

If you are such a student, there is also no mystery. Your peers are your fellow students, the brightest and most diligent among them. You are writing for someone who has read the required course materials with interest and curiosity. They have participated in a highly engaged way in the classroom discussions that you, too, have attended. They have struggled as you have with difficult concepts and exotic facts. Like you, they have sometimes failed in that struggle and, like you, they have sometimes succeeded. They are familiar with some of your views and have taken a position on them, not always your position but at least one that is familiar to you. You can engage them in conversation and the conversation can be interesting.

You are free to construct your reader in a, let’s say, “aspirational” manner every now and then. If you are a student, you can try to imagine your teacher’s peers, other professional scholars, as your peers and attempt to write for them. But remember that this will require you to read more than an ordinary student does. You should not construct an image of the reader as vastly more erudite or intelligent than you. Don’t imagine your actual teacher as your reader, that is. Don’t imagine someone who understands your theories better than you and is much more familiar with the literature than you. But do try, every now and then, to become as familiar as your teacher with some small corner of the literature. This is actually possible and very much worth the effort.

As a student or scholar, never write for a reader who knows much more or much less than you about your subject. This confuses the issues. Have an awareness of where you know more and where you may know less. Don’t make too much depend on your reader making up for your ignorance. Make sure you yourself know whatever you expect your reader to know. If there is something you need your reader to learn, make sure to provide all the information they will need. The most common case of this has to do with your presentation of the data, of which the reader is understandably ignorant until you present it. Remember that they know as much as you about methodology, however. You are pitching your claims to someone who shares your research practices and quality standards. After reading you, the reader should know what you know about the relevant data points.

Don’t overthink this. Don’t make it harder than it is. From your first day at university, make a list of your peers and what you think is on their minds. This list will change and grow. The contents of your reader’s mind will change and grow. Don’t try to keep up as much as keep track of where you are. Don’t think that there is some ideal peer group that you must win the good graces of. Your intellectual peers are simply people with minds like yours. Address them in your writing from the center of your strength. Seek them out, too, and listen to what they have to say.

All along the way you are simply trying to open your mind to the input of minds that are, at the moment, similarly engaged in similar intellectual puzzles. If you keep at it, you will naturally find your place in the discourse. Some people find this place so familiar and enjoyable that they choose a life of scholarship. They find it satisfying to be “like-minded” in this way. Others don’t enjoy the company of their academic peers as much and long to get out into the “real world” and tackle life’s “real” problems. There’s no shame in either position.

The important thing is to realize that scholarly writing is not about “the loneliness that is the truth of things.” We owe that beautiful image to Virginia Woolf and it is, arguably, a description of what her novels were about. Fortunately, she was less poetic about the domain of non-fiction. Knowing how to write, she told us there, is simply knowing who you’re writing for.

The Future of Objectivity (1)

Does objectivity have a future? Do objects have a place in a post-factual world? I certainly hope so. But the more I read about the state of academic writing today, the more uncertain I grow. The emerging ethos of academic writing instruction seems poised to jettison objectivity from our scholarship altogether. Angelika Bammer and Ruth-Ellen Boetcher Joeres’s anthology The Future of Scholarly Writing (Palgrave, 2015) is an excellent case in point, and I’m going to devote a few posts to it in the weeks to come. I will structure my reading as an annotation of each of the indexed appearances of “objectivity” in the book’s contributions. I will start at its last appearance and work my way forward, taking issue with the authors’ various treatments of this famously “academic” notion as I go.

To begin, then, on page 206 Susan McClary explains the valorization of objectivity by way of the “dominance of the left hemisphere [of the brain]” in academic contexts. As “a product of the analytic predisposition [of the left hemisphere, the binarism of ‘subjective’ and ‘objective’] has the effect of acknowledging as valid only those observations that can be verified regardless of the researcher’s particular investments,” she tells us; “anything else is relegated to the scrapheap of the ‘merely’ subjective.”

I think this is putting the point somewhat too strongly. I don’t think the distinction between objective and subjective is a hard and fast binary. Most people will say that objectivity and subjectivity are relative notions and that any particular observation will invariably have both subjective and objective components or aspects. I personally think of objectivity as a “socially constructed” affair, always accomplished and, indeed, never more than approximated, through inter-subjective triangulation and negotiation. At the other end of the spectrum, it is hard to imagine that the position of extreme or “pure” subjectivity would yield any particular observation, and it is often instead identified with the radical passivity of the transcendental subject. Indeed, it is often a gesture at mystical forms of experience. In between, I like to think, and each from our own subjective points of view, we try to accomplish our objectivity as best we can.

This does indeed mean trying to present our views “regardless of [our] particular investments” in them. That is, we want to open our beliefs to criticism from people who may not be invested exactly as we are in the outcome. The financially inflected language here is telling since we would certainly treat medical researchers who had “particular investments” in the drug company whose medicine they are testing with some skepticism. But skepticism is not, I want to emphasize, tantamount to “scrapping” the relevant observations. Objectivity does not actually mean that we only acknowledge observations that have been completely divested (if you will) of a personal stake. It normally just means that we should declare this interest and accept that our contribution will be taken with a correlative amount of salt.

“In most academic disciplines,” McClary continues, “the premium put on objectivity has strangled not only prose style–the exclusive emphasis on documentation and a deliberately drab vocabulary–but also methods: the questions we may ask and the ways we go about trying to engage with those questions” (206-7). This, again, is some strong language and I must say I don’t recognize this picture of academia at all. (See also “Academic Discourse, Folk Psychology and Intelligent Cat Pictures”.)

I’ve never read a paper that confines itself exclusively to documentation, nor is there any shortage of papers and books that manage to present their ideas using lively and evocative language. Even texts that appear deliberately restrained in their prose are not always “strangled” by this effort. Indeed, we sometimes appreciate the admirable parsimony of a writer’s vocabulary as a breath of fresh air in a discourse that is too often overwrought in its terminology. Now, it is true that objectivity demands a certain (perhaps narrow) range of methods and that it can only be achieved in the pursuit of answers to a particular class of questions. But here, too, it’s hard to see researchers as “strangled”. Rather, it seems to me that their adherence to these approaches makes particular observations possible that otherwise wouldn’t be.

For the past 50 years, in any case, there has been a cultivation, not only of much inter-disciplinarity, but also of a methodological pluralism, which should have afforded almost any researcher an opportunity to ask and answer almost any question in pretty much any way they choose. When I look at the wide range of scholarship published since, say, May of 1968, I just can’t recognize the “dominance” of an objective ethos, nor of any particular hemisphere of the brain. Rather, I see a struggle for dominance by multiple scholarly discourses in which objectivity is an increasingly embattled notion.

I would much prefer that no one asserted dominance and that we instead let objectivity be one among several values to pursue. I enjoy an intensely subjective paragraph as much as a clear, objective one. Today, I’m afraid, if anything risks ending up on the scrapheap it is allegedly “drab” presentations of what were once called “facts”. Indeed, I have a feeling that Bammer and Boetcher Joeres are as concerned as I am about the turn that our public discourse has taken in the years after the publication of their book. I think we need to think about reasserting, and perhaps reclaiming, the virtue of writing that is anchored in an objective sense of reality. Perhaps it is time to give our analytic predisposition a little space (on the left?) in which to work?

[Part 2]

Being Open

“The point is to experience being there, in the sense that I, the human being, am the there, the openness of being for me, insofar as I undertake to preserve this openness, and in preserving it, to unfold it.” (Martin Heidegger)

Let me wax philosophical for a moment. To be really ‘present’, to be really ‘there’, is to be open to what is going on around you. Human existence, perhaps, is uniquely defined by this openness, this capacity to be present in the now, to “be there”. This is something Heidegger argued very forcefully for, and he included a social element; existence, he said, is always bound to the existence of others, to “them”. So being open is also a matter of “being there” for others.

Indeed, Heidegger distinguished the “logical conception of science”, according to which it is “an interconnection of true propositions”, from an “existential conception”, in which it is a mode of being with others and engaging with things of practical value. It’s not just a matter of being open to the facts, we might say, but a way of being open to what other people think of those facts and what we can do with them. I think this is enormously important to keep in mind, and much of the success of post-WWII “science studies” has come from pushing this awareness on people whose natural inclination is to stick to “the facts” alone.

Ironically, however, our awareness of the social conditions of “knowledge production” has at times made us less open to the idea that another person’s view of the world might be more valid than our own. Many of us are inclined to rely on the views of our closest peers, like-minded people who appreciate the value of what we are doing in our research. We are, though we are loath to admit it, a bit too eager to believe what is said in our own research community and we close ourselves off to input from people who might come at our problem from a completely different perspective. Though they come at it differently, they may well arrive at the same place you are. Here.

In a recent essay in the Chronicle, Alice Dreger has made a strong case for cultivating greater openness in our thinking to the ideas of people who disagree with us, even to ideas that outright offend us. In a key paragraph, she shares a formative experience from her grad school days.

Let us require our students to read difficult work and learn to respond to uncomfortable chalk by chalking back. Teach them histories of censorship and blacklisting on the right and the left. Require them to reflect upon their (and our) uncertainty. Teach reliable methodologies, not infallible ideologies. Let us always be implicitly asking what one graduate professor explicitly asked me when I was being an intellectually recalcitrant pig: If you haven’t changed your mind lately, how do you know it’s working?

A reliable methodology is one that opens you meaningfully to the world of facts. An infallible ideology, by contrast, closes us off even to the input of other knowledgeable people about those facts. It’s all well and good to be certain that racism or sexism is wrong. The problem arises when you are so certain that your interlocutor is an incorrigible racist or sexist that you close yourself off from their criticism of your views. They may be wrong. But so may you. Indeed, you may both be wrong and this encounter with another mind might ultimately only have made you see your error. That would have been good. For you.

I hope pigs won’t take offence at Dreger’s slur. As I said at the beginning, the open nature of our existence may be what makes us uniquely human. And our recalcitrance in the business of changing our minds is, indeed, often a little piggish, i.e., less than human. I wonder if it is too much of a philosophical inside joke to say that pigs live in a pen while human beings, as Heidegger suggested, live in the clearing. Out in the open.

Update: See also “The Place of Form” and “Academic Purpose”