
Against “Reductionism”

Sometimes a draft gets longer than we’d like. Sometimes we are asked for a text that is shorter than the one we’re working on. We’re writing a paper for a journal with an 8000-word limit and before we know it we’ve written ten thousand words. Then we’re suddenly asked to submit an extended abstract on the same subject with a 1500-word limit. The problem, we tell ourselves, is to “reduce” what we’ve got to something shorter. I want to offer an argument against this way of thinking.

Remember that a text is a series of paragraphs of at least six sentences and at most 200 words that say one thing and support, elaborate or defend it. When planning or re-organizing a text, you should always use a key-sentence outline as your guide. That is, you should take the one sentence in each paragraph that states what the rest of the sentences merely support, elaborate or defend and copy it into a separate document. If you’ve got a 40-paragraph paper you’ll have 40 sentences in your key-sentence outline. These sentences should always make sense in sequence without the context of the paragraphs in which they will ultimately appear. A good paper will be a series of claims that indicate an argument independent of the basis you are providing for each claim.

Now, each paragraph will consist of between 100 and 200 words. A first draft of a paper with an 8000-word limit should consist of about 40 paragraphs, i.e., between 4000 and 8000 words altogether, which should leave you with plenty of space to add more paragraphs as needed in revision. Always think of the revision process as identifying (1) new paragraphs that need to be written, (2) existing paragraphs that need to be removed or (3) existing paragraphs that need to be rewritten. There’s nothing else that can be wrong with your paper.

When trying to imagine a shorter version of a longer paper, don’t imagine that the task is to “reduce” the bigger text to a smaller one. Don’t think of the job as removing words and sentences from the paper you have already written. Think of it as imagining a new text that makes fewer claims. You may have a 60-paragraph paper that is 9000 words long. Okay, imagine a 40-paragraph version of the same argument. You need to find 20 sentences in your key-sentence outline that you can do away with, perhaps while modifying some of the others. If you’ve got reasonably uniform paragraph lengths, you’ve just imagined a 6000-word paper. But don’t think you’ll arrive at this paper simply by “boiling” or “pruning” the longer text. That’s not how it works. Instead, write the new text following the new outline. It will take you 20 hours.

Or imagine “reducing” the text to a 1500-word extended abstract. You’ll now have to make do with 10 paragraphs at best. (I actually recommend dividing the word limit by 200, which will force you to write even more economically than necessary at first pass. You will probably then have room for an extra paragraph or two at the end.) What are the ten (or eight) things you want to say? Imagine a paper that says them. Then write it. It will take five hours, taken 27 minutes (of writing) plus 3 minutes (for a break) at a time.

That is, I’m urging you not to think of your longer draft as setting a material constraint on your shorter one. The challenge is not one of representing an existing longer text in an imagined shorter text that leaves something out. Rather, the longer text was an attempt to represent what you know about something in 8,000 words, or 10,000 words, or whatever. But there’s no ideal number of words in which to represent a body of knowledge. If you had 20,000 words you could do it even more justice. But that doesn’t mean that the 10,000-word text is somehow a deficient or “reduced” version of the “ideal” longer one. (The truly ideal text would, I guess, have no word limit at all? It would be infinitely long?) Rather, the enormous surplus of knowledge that the longer text demonstrates you have is a material resource for producing a different, shorter text.

You just have to represent that knowledge within the space of fewer paragraphs. In the main, think of a “shorter” text not in terms of the number of words but the number of paragraphs. Don’t try to remove words and sentences (except for the usual reason of keeping each paragraph below 200 words). Remove whole claims, i.e., key sentences, i.e., entire paragraphs. That said, I understand the appeal, for some purposes, of imagining a text with shorter paragraphs. Sometimes, especially in an abstract or a conference paper, it can be useful to define the paragraph as consisting of at least 4 sentences and at most 150 words. This gives you at least 10 paragraphs for a 1500-word text, which may make it easier to decide what to say. It may also bring the style more into line with the kind of text you are trying to write–more a synopsis of an argument than the argument itself.
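If it helps to see this arithmetic in one place, here is a minimal Python sketch of the paragraph-and-time budget described above. The function names and defaults are my own, purely for illustration; the only rules encoded are that a paragraph is a fixed number of words and that each one gets 27 minutes of writing plus a 3-minute break.

```python
# A back-of-the-envelope calculator for the paragraph arithmetic above.
# Nothing here is official; it simply divides a word limit by a paragraph
# length and multiplies the result by a 30-minute writing session.

def paragraph_budget(word_limit, words_per_paragraph=200):
    """Number of whole paragraphs a word limit allows."""
    return word_limit // words_per_paragraph

def writing_hours(paragraphs, writing_minutes=27, break_minutes=3):
    """Estimated total time: one timed session (plus break) per paragraph."""
    return paragraphs * (writing_minutes + break_minutes) / 60

if __name__ == "__main__":
    # An 8000-word paper in standard 200-word paragraphs:
    n = paragraph_budget(8000)
    print(n, writing_hours(n))        # 40 paragraphs, 20.0 hours
    # A 1500-word extended abstract, planned conservatively at 200 words:
    n = paragraph_budget(1500)
    print(n, writing_hours(n))        # 7 paragraphs, 3.5 hours
    # The same abstract with shorter, 150-word paragraphs:
    n = paragraph_budget(1500, words_per_paragraph=150)
    print(n, writing_hours(n))        # 10 paragraphs, 5.0 hours
```

Run as is, it reproduces the figures above: 40 paragraphs and 20 hours for a full paper, and either 7 or 10 paragraphs (the latter taking the five hours mentioned earlier) for a 1500-word abstract.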

But my point still holds: don’t try to reduce a longer text to a shorter one. Outline a new text with fewer claims. Then write the best possible paragraphs to support each one. You’re not boiling anything down. You’re not pruning branches off a tree. You’re not weeding a garden. You’re not forcing anything into a form. You are doing what you always do when you write, namely, making a series of claims, one paragraph at a time. Your word limit tells you only how many things you can say. Saying them well is the same old problem of writing, the familiar difficulty.

A Dim View of Criticism

There’s been a lot of great discussion over at Andrew Gelman’s blog in the wake of Susan Dominus’s piece in the NYTimes about Amy Cuddy and power posing. I wrote about it here when the story broke, and Andrew has since published a number of posts about criticism in science (see this one and this one in particular). It reminded me of a post I wrote six years ago while reading Michael Lewis’s The Big Short, which I want to re-purpose for this blog today.

Lewis’s book is about the Wall Street outsiders and oddballs who “shorted” (i.e., bet against) the subprime mortgage market and made a killing when it finally collapsed. Interestingly, after they had decided that the market was going to collapse, it was not, actually, a straightforward matter to bet against it. Had they thought that a company was going to go bust, there’d be a standard way of making money on that belief: they could borrow stock in the company, sell it, and then wait for its share price to crash. At that point, they could buy back the shares (cheap) and pay off their debt. But, as Lewis points out, things were very different with mortgage bonds:

To sell a stock or bond short you need to borrow it, and [the bonds they were interested in] were tiny and impossible to find. You could buy them or not buy them but you couldn’t bet explicitly against them; the market for subprime mortgages simply had no place for people in it who took a dim view of them. You might know with certainty that the entire mortgage bond market was doomed, but you could do nothing about it. (29)

I had a shock of recognition when I read that. Back in those days, I was working very hard to find a way to “bet against” a number of stories that have been told in the organization studies literature. I have now somewhat resigned myself to the fact that there’s no place in that literature for people who take a dim view of those stories. While some people say encouraging things to me in person about what I do, there isn’t really a genre (in the area of management studies) of papers that do nothing but point out errors in other people’s work. You have to make a “contribution” too. In a sense, you can buy the stories people are telling you or not buy them, but you can’t explicitly criticize them.

Back then, I thought about this in terms of the difference between faith and knowledge. Knowledge is a belief held in a critical environment, while faith is a belief held in an “evangelical” environment. The mortgage bond market was an evangelical environment in which to hold beliefs about housing prices, default rates, and credit ratings on CDOs. There was no simple way to critique the “good news”. So it took some dedicated outsiders to see what was really going on. These were people who insisted on looking at the basis of the mortgage bonds that were being pooled and traded on Wall Street in increasingly exotic ways.

One of these guys was Steve Eisman, who was a notoriously cantankerous personality. (He was fictionalized brilliantly by Steve Carell in the movie.) He recalls meeting Ken Lewis, the CEO of Bank of America. “[The CEOs on Wall Street] didn’t know their own balance sheet … I was sitting there listening to [Ken Lewis]. I had an epiphany. I said to myself, ‘Oh my God, he’s dumb!’ A lightbulb went off. The guy running one of the biggest banks in the world is dumb” (TBS, p. 174). Yes, or perhaps he was just working in an evangelical rather than critical environment. Here, “any old balance sheet” will do … as long as you think it’s bringing good news.

I think, sadly, the same thing can be said about various corners of the social sciences today. Amy Cuddy’s work is being defended by many as “good news”, and there is little room in the mainstream literature to publish critiques (and replications with null results) that suggest that power posing does not have the effects its proponents claim for it. As in the case of the housing bubble, these things can be more easily discussed now that there actually is a crisis, but we mustn’t forget the incredible amount of hard work that was done by Uri Simonsohn, Joe Simmons, Leif Nelson, Andrew Gelman and others to reach this point. It was and still is a somewhat thankless task and, unlike Burry and Eisman, they don’t stand to make a billion dollars on their bet. Fortunately, the work of the Amy Cuddys and Brian Wansinks of the world isn’t likely to bring the global economy to its knees either.

It is sad, however, that so many social scientists take such a dim view of criticism. Back in the mid-nineteenth century, the Danish philosopher Søren Kierkegaard–who was, incidentally, born in the year of a financial crisis–raised the question of the sense in which sin is simply ignorance. If so, he asked, is it

the state of someone who has not known and up until now has not been capable of knowing anything about truth, or is it a resultant, a later ignorance? If it is the latter, then sin must essentially lodge somewhere else than in ignorance. It must lodge in a person’s efforts to obscure his knowing. (The Sickness Unto Death)

Dominus tells the story of Amy Cuddy as someone who was following all the rules until the rules suddenly changed. That may be partly true. But a lot of what has gone wrong in the social sciences today, and the reason it has gathered itself into something like a full-blown crisis, is, I fear, that people have been making a real effort to obscure their knowing, as Kierkegaard put it. Or perhaps they’re just not, as Andrew somewhat charitably suggests, making the effort to do something difficult (statistics, scholarship) well. I hope that the social sciences will stop taking such a dim view of criticism going forward and give more space in the literature to people who take a dim view of underpowered studies with overblown publicity. Kierkegaard’s works are traditionally divided into “edifying” and “existential” discourses. Perhaps all of us need to be both evangelists of science and critics of it? Perhaps we need to be evangelists for criticism?

Clarity, Truth and Writing

If you haven’t already done so, I strongly recommend you read Francis-Noël Thomas and Mark Turner’s Clear and Simple as the Truth. In many ways, my approach to academic writing is a training regimen in the “classic style”. What I call the Writing Moment, in particular, embodies a core principle of this style, namely, that thought precedes speech. As Thomas and Turner point out, this principle runs counter to what a great many people have been taught (and go on to teach) about the role of writing in inquiry. Thomas and Turner do a good job of describing this influential and somewhat pernicious doctrine:

Records are understood as a sort of external memory, and memory as internal records. Writing is thinking on paper, and thought is writing in the mind. The author’s mind is an endless paper on which he writes, making mind internal writing; and the book he writes is external mind, the external form of that writing. The author is the self thinking. The self is the author writing the mind. (59)

Like Thomas and Turner, I caution against this view of yourself (your self) as a writer. They describe the alternative in compelling terms:

Thinking is not writing; even more important, writing is not thinking. This does not mean that in classic style all of the thinking precedes all of the writing, but rather that the classic writer does not write as he is thinking something out and does not think by writing something out. Between the period of one sentence and the beginning of the next, there is space for the flash of a perfect thought, which is all the classic writer needs. (59-60)

Notice that this space is one that the reader’s mind can occupy as well as the writer’s. Indeed, that’s the whole point of the writing, to instantiate in the reader’s mind the “flash” of what Descartes (the patron saint of classic style) called a “clear and distinct idea”, a “perfect thought”. Classic writers don’t make a big deal of their imperfections; they know that their own thoughts, and those of their readers, are often less perfect than they would like, but they don’t show this in their writing. Instead, they do the best they can to present only ideas that they have thought through, as clearly and truthfully as they can. Simply put, they try to say only things they know are true in their writing, and they make sure that their text leaves this space for real thought to flash before the reader.

If you want to train this ability–which is, you’ll notice, as much a training of your mind (to think) as your hands (to write)–I recommend trying my rules for a few weeks. End the day with a clear and distinct idea of what you’ll write in the morning. Articulate a thought in a key sentence and relax for the rest of the evening. Then, in the morning, spend 27 minutes composing that thought into at least six sentences and at most 200 words that present it to an intellectual peer. Imagine your reader’s mind to be as spacious and brilliant as yours.

This will not just make you a better writer. By uncluttering your mind of the multiple “drafts” of your “internal writing” and distilling it, if only for a moment, into an actual thought–one that can live independent of your text–you are strengthening a mental faculty that too many of us neglect. You are learning to put the writing where it belongs: on the surface. This will free you to explore the depths of your own mind. And that, friends, is where the truth is ultimately found.

The Future of Objectivity (3)

Some of the most successful challenges to the objectivity of scholarly writing have come from feminist thinkers. Amy Katz Kaminsky raises the issue briefly in her contribution to The Future of Scholarly Writing and suggests it represents a tension between “academic” and “ethical” principles. Fortunately, she asserts the importance of “reconciling the two” (184) and not, as others have, of abandoning objectivity altogether in pursuit of some higher aim. As in my previous engagements with this book (see parts 1 and 2 in this series), please remember that I’m taking Kaminsky’s views on objectivity out of their larger context, both of the chapter they appear in and the book that it, in turn, is a contribution to. I will eventually read this book from front to back like we normally do.

Kaminsky begins (183-4) by questioning traditional standards of “mastery” and “authority” in scholarly work. Women’s Studies, as she points out, is a relatively young field and many of its practitioners were therefore trained in other disciplines. But she notes that authority can be problematic in any case, when, for example (I imagine), a scholar of Latin American literature who does not have Spanish as a first language proposes to teach, say, Latin American students. One is always, in this sense, “between cultures”, she suggests. Moreover, the “stark” history of the relationship between the United States and Latin America makes this cultural encounter even more difficult to navigate.

The notion of objectivity comes up when she turns her attention to the legitimacy of feminist scholarship in the academy. The problem, she says, has been one of “carving out a space for situated knowledge … in a realm where objectivity and neutrality have been key values” (184). She argues that the “neutrality” that is invoked is often simply the “generic masculinity” of the “dominant group”. This defines a “norm” and maintains the “status quo” that it is the goal of feminist scholarship to change. Presumably, “situated knowledge” is neither objective nor neutral because it involves something like Susan McClary’s “particular investments” in political and ethical projects of various kinds, which, the argument might go, are inexorably partisan and subjective.  The challenge, then, is to bring about a transformation of dominant group commitments (shades of Kuhn) without losing the legitimacy that adhering to those commitments confers. This is arguably the dilemma of all social change projects.

It is not entirely clear in this passage what the endgame is, only that Kaminsky does not wish to maintain the status quo. I can’t tell whether she wants to maintain a semblance of objectivity and neutrality only long enough to do away with it, so that the future of scholarly writing will be liberated from the “high seriousness of academic standards” and be free to pursue more “situated” concerns, or whether she wants merely to challenge the “masculinity” of the current norms and achieve a new kind of neutrality (gender neutrality?) with its own kind of seriousness even after “the foundations of those very standards” have been challenged. I do know that some of the conversations about the current replication crisis have turned on whether traditional criticism, which involves directly pointing out the errors in the work of other scholars, is actually a distinctly male form of bullying. I hope the pursuit of objective truth is not destined to be seen as a “generic masculine” form of harassment.

Like I say, I am entirely encouraged by Kaminsky’s suggestion that we must find a way to reconcile traditional norms of objectivity and neutrality with the increasingly political and engaged desires of modern academics, who, as if to adopt Karl Marx’s famous slogan, are not content to interpret the world, but hope also to change it. I’m not sure that this tension is as gendered as some people seem to think it is (I’m not ready to say how gendered Kaminsky thinks it is)–Marx, after all, was very much a man, and revolution has always, it seems to me, had a certain machismo about it. But I will admit that, at this moment, I am more concerned about preserving, and even conserving, the objectivity and neutrality of our scholarship in the face of the “post-factual” dystopia that seems to be looming, than I am about finding room for the “situated knowledge” of any number of political projects that seek the authority of “academic” work.

[Part 4]

A Rejection Plan

Most scholars have a publication plan. For a given research project, they have a list of planned “deliverables”, specifying some number of articles to be published in specific journals. Collaborative research projects, too, have such a plan, distributing responsibility for authoring and co-authoring a series of papers among members of the research group. On a still larger scale and over a longer term, departments and whole universities have goals defined in terms of a certain number of publications in a certain set of journals. Researchers internalize these goals and make a plan for themselves for the coming year, five years, and so on. All of this is perfectly reasonable (or at least rational) at a time when publication has such a decisive influence on the course of one’s career.

But a publication plan has the very important drawback that it is almost inevitably an overview of one’s coming failures. The most attractive journals are characterized by very high rejection rates. One cannot simply plan to be published in a good journal, just as one cannot just plan to make an important scientific discovery. One can hope to do these things, to be sure, but one cannot simply undertake to get oneself published. It’s not the sort of goal that a deliberate plan can help one to accomplish. Success is almost entirely out of one’s own hands.

For many years, therefore, I have argued that one should plan to submit work, not to publish it. Indeed, when I talk to department heads and university administrators I encourage them not to keep asking their faculty members what they have published, but what they have submitted. In this regard, I’ve compared myself to Moneyball‘s Billy Beane. A researcher who is regularly submitting work for publication is worth more to a department than one who occasionally publishes in a top journal.  (I’m happy to discuss exceptions here, but do note that the willingness to discuss what one knows is an important part of being a scholar. Those who rarely submit work for peer review are not really demonstrating this willingness.) A submission plan, moreover, is one you control yourself. While there are all manner of barriers to publication, no one can prevent you from submitting a paper when you have done your work as planned.

I recently had a conversation with an author that suggested an even more striking, perhaps even jarring, image. Make a rejection plan. That is, plan to have your paper rejected by three or four journals before it is published. Normalize the experience of rejection as something to be expected. Write your paper and submit it to the best journal you think it is suitable for. But make sure you have a list of three or four other journals that it is also suitable for. When you get rejected, incorporate whatever criticism the rejection involved and send the paper on to another journal on the list. Don’t give up until the list is exhausted, but perhaps make sure that there’s always some kind of published end-game, even if it is merely making the paper available in an institutional repository. As Brad Pitt says in Moneyball, “It’s a process.”
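Since the whole point is that this is a mechanical process, here is a toy Python sketch of a rejection plan, entirely my own illustration; the journal names are placeholders and the fallback is the repository end-game just mentioned.

```python
# A toy model of a rejection plan: an ordered list of venues worked through
# one rejection at a time, with a guaranteed end-game. The venue names are
# placeholders, not recommendations.

def rejection_plan(venues, fallback="the institutional repository"):
    """Yield the next step to take after each rejection, in order."""
    for venue in venues:
        yield f"Revise in light of any reviews so far and submit to {venue}."
    yield f"List exhausted: deposit the paper in {fallback}."

if __name__ == "__main__":
    for step in rejection_plan(["Journal A", "Journal B", "Journal C", "Journal D"]):
        print(step)
```

The point of writing it down this way is only to make the attitude concrete: the next step after a rejection is decided before the rejection ever arrives.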

Obviously, there will be exceptions. If a reviewer convinces you that your study is fundamentally flawed, you might decide not to waste anyone else’s time with it. But most people retain some confidence in their work, even after a reviewer has found shortcomings in it or an editor has deemed it a poor “match” for the journal. Our general attitude is that errors can be fixed and there are other fish in the sea (or perhaps other seas in which to swim).  It is rare that we learn from any one experience with a journal that our research is altogether worthless. In fact, I would argue that to take this as the lesson of any one rejection is always a mistake.

Here’s another interesting feature of this plan: when you get a “revise and resubmit”, you can decide whether the suggested revision is worth the effort when compared to making just a few minor changes and sending it to the next journal on your list. It lets you gauge the amount of effort you are willing to put into the next step.

But the most important reason to think in terms of a series of predictable rejections, rather than planning for publication in a particular journal, is that it forces you to write your paper for an audience that is more general than a particular journal’s reviewers and editors. In fact, it gets you to think in terms of your actual readers rather than the “gatekeepers” that stand in your way of reaching them. You will have to write a paper that, with minor adjustments, is relevant to one of several journals, all of which your peers (the members of your discipline) read and cite in the course of their own work. Perhaps ironically, writing the paper in this spirit–with no one particular journal in mind–will produce prose that is more authoritative, more objective, more “classic”. It is altogether likely that your reviewers will prefer a paper that wasn’t mainly written to “get past” the particular filter they happen to represent. It will have been written to accomplish something on the other side of it.

The author and I quickly agreed that this was a refreshing way to look at the publication problem. It recognizes that the most likely result of submitting to a journal (in which you’d like to be published) is rejection. It is altogether sanguine about this prospect. It increases the odds of publication by planning the next step after a rejection before that situation even arises, a situation that can’t be called “unfortunate” because it is so very probable. This both tempers the disappointment of rejection and increases the joy of acceptance. Unlike a publication plan, a rejection plan is not a list of planned failures. It is a plan for how to move forward after an entirely imaginable outcome occurs.