The Reader

“To know whom to write for is to know how to write.” (Virginia Woolf)

Your scholarly writing depends on the existence of about two dozen readers whose minds you want to change and who are qualified to change yours. You should know who they are and what they’re thinking about. The more accurately you know this, the more effective your writing will be.

In academia, please remember, there is no mystery about who your readers are. They are your peers. They know more or less what you know and live lives more or less like yours. If you are a scholar you are writing for other scholars in the same field. They have read the same work that you have read, including yours and theirs. You are, in principle, familiar with each other’s work. You share the experience of collecting and analyzing similar kinds of data, of framing these analyses with similar sets of concepts, and, importantly, of regularly teaching what you know to similar kinds of students–namely, university students.

If you are such a student, there is also no mystery. Your peers are your fellow students, the brightest and most diligent among them. You are writing for someone who has read the required course materials with interest and curiosity. They have participated in a highly engaged way in the classroom discussions that you, too, have attended. They have struggled as you have with difficult concepts and exotic facts. Like you, they have sometimes failed in that struggle and, like you, they have sometimes succeeded. They are familiar with some of your views and have taken a position on them, not always your position but at least one that is familiar to you. You can engage them in conversation and the conversation can be interesting.

You are free to construct your reader in a, let’s say, “aspirational” manner every now and then. If you are a student, you can try to imagine your teacher’s peers, other professional scholars, as your peers and attempt to write for them. But remember that this will require you to read more than an ordinary student does. You should not construct an image of the reader as vastly more erudite or intelligent than you. Don’t imagine your actual teacher as your reader, that is. Don’t imagine someone who understands your theories better than you and is much more familiar with the literature than you. But do try, every now and then, to become as familiar as your teacher with some small corner of the literature. This is actually possible and very much worth the effort.

As a student or scholar, never write for a reader who knows much more or much less than you about your subject. This confuses the issues. Be aware of where you know more and where you may know less. Don’t make too much depend on your reader making up for your ignorance. Make sure you yourself know whatever you expect your reader to know. If there is something you need your reader to learn, make sure to provide all the information they will need. The most common case of this has to do with your presentation of the data, of which the reader is understandably ignorant until you present it. Remember, however, that they know as much as you about methodology. You are pitching your claims to someone who shares your research practices and quality standards. After reading you, the reader should know what you know about the relevant data points.

Don’t overthink this. Don’t make it harder than it is. From your first day at university, make a list of your peers and what you think is on their minds. This list will change and grow. The contents of your reader’s mind will change and grow. Don’t try to keep up as much as keep track of where you are. Don’t think that there is some ideal peer group that you must win the good graces of. Your intellectual peers are simply people with minds like yours. Address them in your writing from the center of your strength. Seek them out, too, and listen to what they have to say.

All along the way you are simply trying to open your mind to the input of minds that are, at the moment, similarly engaged in similar intellectual puzzles. If you keep at it, you will naturally find your place in the discourse. Some people find this place so familiar and enjoyable that they choose a life of scholarship. They find it satisfying to be “like-minded” in this way. Others don’t enjoy the company of their academic peers as much and long to get out into the “real world” and tackle life’s “real” problems. There’s no shame in either position.

The important thing is to realize that scholarly writing is not about “the loneliness that is the truth of things.” We owe that beautiful image to Virginia Woolf and it is, arguably, a description of what her novels were about. Fortunately, she was less poetic about the domain of non-fiction. Knowing how to write, she told us there, is simply knowing who you’re writing for.

The Future of Objectivity (1)

Does objectivity have a future? Do objects have a place in a post-factual world? I certainly hope so. But the more I read about the state of academic writing today, the more uncertain I grow. The emerging ethos of academic writing instruction seems poised to jettison objectivity from our scholarship altogether. Angelika Bammer and Ruth-Ellen Boetcher Joeres’s anthology The Future of Scholarly Writing (Palgrave, 2015) is an excellent case in point, and I’m going to devote a few posts to it in the weeks to come. I will structure my reading as an annotation of each of the indexed appearances of “objectivity” in the book’s contributions. I will start at its last appearance and work my way forward, taking issue with the authors’ various treatments of this famously “academic” notion as I go.

To begin, then, on page 206 Susan McClary explains the valorization of objectivity by way of the “dominance of the left hemisphere [of the brain]” in academic contexts. As “a product of the analytic predisposition [of the left hemisphere, the binarism of ‘subjective’ and ‘objective’] has the effect of acknowledging as valid only those observations that can be verified regardless of the researcher’s particular investments,” she tells us; “anything else is relegated to the scrapheap of the ‘merely’ subjective.”

I think this is putting the point somewhat too strongly. I don’t think the distinction between objective and subjective is a hard and fast binary. Most people will say that objectivity and subjectivity are relative notions and that any particular observation will inevitably have both subjective and objective components or aspects. I personally think of objectivity as a “socially constructed” affair, always accomplished and, indeed, never more than approximated, through inter-subjective triangulation and negotiation. At the other end of the spectrum, it is hard to imagine that a position of extreme or “pure” subjectivity would yield any particular observation at all; it is often instead identified with the radical passivity of the transcendental subject. Indeed, it is often a gesture at mystical forms of experience. In between, I like to think, and each from our own subjective points of view, we try to accomplish our objectivity as best we can.

This does indeed mean trying to present our views “regardless of [our] particular investments” in them. That is, we want to open our beliefs to criticism from people who may not be invested exactly as we are in the outcome. The financially inflected language here is telling, since we would certainly treat with some skepticism medical researchers who had “particular investments” in the drug company whose medicine they are testing. But skepticism is not, I want to emphasize, tantamount to “scrapping” the relevant observations. Objectivity does not actually mean that we acknowledge only observations that have been completely divested (if you will) of a personal stake. It normally just means that we should declare this interest and accept that our contribution will be taken with a correlative amount of salt.

“In most academic disciplines,” McClary continues, “the premium put on objectivity has strangled not only prose style–the exclusive emphasis on documentation and a deliberately drab vocabulary–but also methods: the questions we may ask and the ways we go about trying to engage with those questions” (206-7). This, again, is some strong language and I must say I don’t recognize this picture of academia at all.

I’ve never read a paper that confines itself exclusively to documentation, nor is there any shortage of papers and books that manage to present their ideas using lively and evocative language. Even texts that appear deliberately restrained in their prose are not always “strangled” by this effort. Indeed, we sometimes appreciate the admirable parsimony of a writer’s vocabulary as a breath of fresh air in a discourse that is too often overwrought in its terminology. Now, it is true that objectivity demands a certain (perhaps narrow) range of methods and that it can only be achieved in the pursuit of answers to a particular class of questions. But here, too, it’s hard to see researchers as “strangled”. Rather, it seems to me that their adherence to these approaches makes possible particular observations that otherwise wouldn’t be.

For the past 50 years, in any case, there has been a cultivation, not only of much inter-disciplinarity, but also of a methodological pluralism, which should have afforded almost any researcher an opportunity to ask and answer almost any question in pretty much any way they choose. When I look at the wide range of scholarship published since, say, May of 1968, I just can’t recognize the “dominance” of an objective ethos, nor of any particular hemisphere of the brain. Rather, I see a struggle for dominance by multiple scholarly discourses in which objectivity is an increasingly embattled notion.

I would much prefer that no one asserted dominance and that we instead let objectivity be one among several values to pursue. I enjoy an intensely subjective paragraph as much as a clear, objective one. Today, I’m afraid, if anything risks ending up on the scrapheap it is allegedly “drab” presentations of what were once called “facts”. Indeed, I have a feeling that Bammer and Boetcher Joeres are as concerned as I am about the turn that our public discourse has taken in the years after the publication of their book. I think we need to think about reasserting, and perhaps reclaiming, the virtue of writing that is anchored in an objective sense of reality. Perhaps it is time to give our analytic predisposition a little space (on the left?) in which to work?

Being Open

“The point is to experience being there, in the sense that I, the human being, am the there, the openness of being for me, insofar as I undertake to preserve this openness, and in preserving it, to unfold it.” (Martin Heidegger)

Let me wax philosophical for a moment. To be really ‘present’, to be really ‘there’, is to be open to what is going on around you. Human existence, perhaps, is uniquely defined by this openness, this capacity to be present in the now, to “be there”. This is something Heidegger argued very forcefully for, and he included a social element; existence, he said, is always bound to the existence of others, to “them”. So being open is also a matter of “being there” for others.

Indeed, Heidegger distinguished the “logical conception of science”, according to which it is “an interconnection of true propositions”, from an “existential conception”, in which it is a mode of being with others and engaging with things of practical value. It’s not just a matter of being open to the facts, we might say, but a way of being open to what other people think of those facts and what we can do with them. I think this is enormously important to keep in mind, and much of the success of post-WWII “science studies” has come from pushing this awareness on people whose natural inclination is to stick to “the facts” alone.

Ironically, however, our awareness of the social conditions of “knowledge production” has at times made us less open to the idea that another person’s view of the world might be more valid than our own. Many of us are inclined to rely on the views of our closest peers, like-minded people who appreciate the value of what we are doing in our research. We are, though we are loath to admit it, a bit too eager to believe what is said in our own research community, and we close ourselves off to input from people who might come at our problem from a completely different perspective. Though they come at it differently, however, they may well arrive at the same place we are. Here.

In a recent essay in the Chronicle, Alice Dreger has made a strong case for cultivating greater openness in our thinking to the ideas of people who disagree with us, even to ideas that outright offend us. In a key paragraph, she shares a formative experience from her grad school days.

Let us require our students to read difficult work and learn to respond to uncomfortable chalk by chalking back. Teach them histories of censorship and blacklisting on the right and the left. Require them to reflect upon their (and our) uncertainty. Teach reliable methodologies, not infallible ideologies. Let us always be implicitly asking what one graduate professor explicitly asked me when I was being an intellectually recalcitrant pig: If you haven’t changed your mind lately, how do you know it’s working?

A reliable methodology is one that opens you meaningfully to the world of facts. An infallible ideology, by contrast, closes us off even to the input of other people who are knowledgeable about those facts. It’s all well and good to be certain that racism or sexism is wrong. The problem arises when you are so certain that your interlocutor is an incorrigible racist or sexist that you close yourself off from their criticism of your views. They may be wrong. But so may you. Indeed, you may both be wrong, and this encounter with another mind might ultimately only have made you see your own error. That would have been good. For you.

I hope pigs won’t take offence at Dreger’s slur. As I said at the beginning, the open nature of our existence may be what makes us uniquely human. And our recalcitrance in the business of changing our minds is, indeed, often a little piggish, i.e., less than human. I wonder if it is too much of a philosophical inside joke to say that pigs live in a pen while human beings, as Heidegger suggested, live in the clearing. Out in the open.

Sustainable Discourse

Scholars are adept at forming their beliefs on the basis of what other people know. Think of a historian’s views about the rise of “scientific management” in the early twentieth century, for example. She will no doubt have done some research of her own, perhaps in the archives of the Bethlehem Steel Corporation, but she will have learned a great deal more about the subject by reading books and papers from her fellow historians. Indeed, she won’t have learned just about scientific management from these peers; she will have learned about the entire history of the world, going all the way back to her undergraduate studies. All this knowledge forms a frame around her specialization and a foundation beneath it.

When you think about it, this is a marvelous cultural achievement. We don’t just believe that Frederick Winslow Taylor had a profound influence on the organization and management of the modern corporation; we know this. Some of us know this in great detail and others only know the broad outlines. But these are not just “opinions” we hold; this is knowledge we have acquired. And we’ve been able to acquire this knowledge much more easily than the hardworking historians who have uncovered all the documentation and brought it together in their work. All we had to do was read what they had written. Then we knew.

But is that really all there is to it? Is this not almost a magical theory of literary meaning? All I have to do is pass my eyes over the pages of a book, it seems, and suddenly my mind is in a state of knowing! Well, no. As every student knows, it’s not that easy. You read the words and try to understand them. You struggle and you learn.

A very important part of this struggle comes in the confrontation of our own reading with that of others. After we have read a book or essay we discuss it with our peers–be they fellow professors or fellow students. Sometimes, we discuss it with “authorities” or “superiors”, i.e., experts outside our own field or teachers with a better understanding than us students. In those conversations, we find out how well we understand the book we were reading, how effective our struggle with those pages was.

What I want to emphasize is that we did, in fact, form beliefs while reading. We thought we knew something about scientific management after reading a chapter about it. But then, when we discuss it with other people who have also read that chapter, we come to see the matter from a different point of view. Sometimes we recognize that we had misunderstood what the book was saying. Sometimes we realize that, however well researched and argued the book may be, the author seems simply to have gotten the facts wrong. Our reading, that is, may turn out not to “hold up” under the pressure of another reader’s take on it.

This is something to be mindful of as you go about your scholarly work. It’s one thing to make up your mind about something; it is another to speak your mind to others. You want to become good at making a claim, i.e., saying (claiming) that something is true. You then want to observe what happens to that claim in a conversation with qualified peers–people who make similar claims about similar things for similar reasons. Does the claim survive the criticism of your peers? Is the claim sustainable in discourse?

The Presumption of Criticism

Scholars often make claims based on research done by other scholars. It is standard practice to rely on the work of others to support or frame your own work. This practice is justified by a set of presumptions that it is our obligation, as scholars, to make true. Doing so does not guarantee that everything you read in a peer-reviewed article is true, but it does justify the (measured) confidence with which we draw on such claims when conducting our research.

In a word, we presume that the claims made in the literature are subject to ongoing critical scrutiny by qualified peers. Suppose you read in a journal article from 2014 that “between 16% and 40% of expatriate managers return prematurely from their assignment” abroad. What impact should that fact have on your own research? Well, you could be happy to see that the subject you are interested in is, it seems, part of a big problem in the real world. Your ethnographic work on cross-cultural business appears much more relevant in that light. In your own introduction, then, you make this claim, duly citing the source that you found the figure in. You submit the paper for publication, your reviewers recommend publication, and the paper is published. Your claims, including the 16-40% expatriate failure rate, are now open to the aforementioned “critical scrutiny” of your peers. What happens next?

Well, the reason that you provided a source is that people want to be able to check your facts. Not all readers will do this, but some might. Suppose someone does. And suppose they find the claim embedded in a sentence like the following: “Previous research, reported on by Black and Mendenhall (1989), reveals that between 16% and 40% of expatriate managers return prematurely from their assignment.” Please understand how shocking that is. Your paper made it look like the rate was reported in 2014. We find here that this rate is almost thirty years old! But it gets worse than that. Checking Black and Mendenhall (1989), they will see that the figure is asserted, not on the basis of empirical evidence, but on the basis of still other studies, going back as far as 1971. Looking at those studies, finally, does not solve the mystery either. It’s simply not possible to track down anyone who provides evidence of the 16-40% range.

This is what’s not supposed to happen in scholarship. You should not have cited the rate at all: had you, too, tried to trace it to its source, you would have failed. You should then have written to the authors of the 2014 paper and pointed out their mistake. The journal should have issued a correction.

It’s only when we believe that such an error-correcting mechanism exists that we can trust the literature on a particular subject. Seeing something we think we can use in a journal article from four or five years ago, we go to the library and try to see if there’s been any published criticism of it. If not, we check the underlying sources (or evaluate the methods) of the paper in question. We decide that we trust this result and that our readers would trust it too. Then we include it in our own paper. Simply citing the first appearance of a convenient fact is not good enough.

I use the example of expatriate failure rates advisedly. Over twenty years ago, Anne-Wil Harzing discovered that her peers had not been as critical as they should have been when citing high reported rates of expatriate failure. As she put it in a follow-up paper in 2002, the paper she wrote as a PhD student about this problem “was borne out of sheer amazement and indignation that serious academics seemed to get away with something students at all levels were warned not to do.” (Indeed, my example wasn’t pulled out of thin air either, though I have left out the names to protect the guilty. Click here for a more detailed critique.)

It is hard to overstate the courage it takes to challenge your entire discipline in this way as a PhD student. Indeed, I’m not sure it’s even advisable, though Harzing’s hard work, also on other topics, has clearly paid off for her in the long run. What she did was “presumptuous” in a good way. She assumed that standards of scholarly rigor applied in her field even if many scholars seemed to be entirely innocent of them. She acted as though good research was a norm. That’s how we should all work.

Indeed, that’s how most people presume academia works. Mistakes are made but they don’t remain for long. They are caught by critically minded peers and eventually corrected. You can play your part. I highly recommend reading Harzing’s 2002 paper, which is organized around the rules you should be following and examples of how they are broken. Learn them the easy way now. The hard way is not pleasant to think about.