Writing Matters
Jan Feld, Corinna Lines, Libby Ross*
[March 15, 2022]
Abstract
For papers to have scientific impact, they need to impress our peers
in their roles as referees, journal editors, and members of conference
committees. Does better writing help our papers make it past these
gatekeepers? In this study, we estimate the effect of writing quality
by comparing how 30 economists judge the quality of papers written
by PhD students in economics. Each economist judged five papers
in their original version and five different papers that had been
language edited. No economist saw both versions of the same paper.
Our results show that writing matters. Compared to the original
versions, economists judge edited versions as higher quality; they
are more likely to accept edited versions for a conference; and they
believe that edited versions have a better chance of being accepted
at a good journal.

1. Introduction
Across the sciences, we have heard calls for better academic writing (e.g. Pinker 2015; Sword
2012; Salant 1969). In economics, McCloskey has argued since the 1980s that economists should
write better, and has demonstrated how (McCloskey 1992, 1987, 1985). Her book Economical
Writing has since become a standard part of the curriculum for economics PhD students
(McCloskey 2000).
Many economists and other scientists follow these calls and try to write better. They spend days toiling over their introductions, writing and rewriting, and spend their research budgets on language editing. Yet we do not know if our writing matters. If we write better, do our peers, in their roles as referees, journal editors, and members of conference committees, perceive our papers as better?
The answer to this question has implications for who gets to contribute to the scientific
discourse. In most disciplines, written English has become the language of modern science. This
dominance of English makes it harder for scientists from non-English-speaking countries to
publish in leading academic journals — especially if they cannot afford language editing. As a
result, we may be losing important contributions from scientists who struggle to write well in
English.
The academic literature does not provide a satisfactory answer to how much writing quality matters. Many papers have investigated the correlation between writing quality and scientific impact, measured, for example, by numbers of citations. Depending on the discipline and on the measures of writing quality and scientific impact, these papers find that better-written papers have more impact, similar impact, or less impact (e.g. Didegah and Thelwall 2013; Dowling,
Hammami, and Zreik 2018; Laband and Taylor 1992; Fages 2020; Hartley, Trueman, and
Meadows 1988). It does not take a PhD in economics to recognize the limitations of these
papers. Correlation is not causation. Papers that are well written likely differ on several
dimensions from papers that are not. To find the causal effect of academic writing, we need to
compare well-written papers with poorly written papers that are otherwise identical. This is what
we do in this study.

We estimate the causal effect of writing quality by comparing how experts judge the
quality of 30 papers originally written by PhD students in economics. We had two versions of
each paper: one original and one that had been language-edited. The language editing was done
by two professional editors, who aimed to make the papers easier to read and understand. We
then asked 18 writing experts and 30 economists to judge some of the original and edited papers.
Each of these experts judged five papers in their original versions and five papers in their edited versions, spending around five minutes per paper. None of the experts saw both versions of the same
paper. None of the experts knew that some of the papers were edited. The writing experts judged
the writing quality and the economists judged the academic quality of the papers. All economists
in our sample have PhDs in economics and their academic positions range from postdoc to full
professor; four of them are editors of academic journals; and all of them are regularly involved in
judging the quality of academic papers as referees or members of conference committees. We
estimate the effect of language editing on perceived writing quality and perceived academic
paper quality by comparing the average judgements of the original and edited papers.
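In notation, this comparison amounts to regressing judgements on an editing indicator. A minimal sketch, in which the variable names and the paper fixed effect are our illustrative assumptions rather than the study's stated specification, is

$$ y_{ij} = \beta \, \text{Edited}_{ij} + \alpha_j + \varepsilon_{ij}, $$

where $y_{ij}$ is expert $i$'s judgement of paper $j$, $\text{Edited}_{ij}$ equals one if expert $i$ saw the edited version of paper $j$, $\alpha_j$ is a paper fixed effect, and $\beta$ captures the effect of language editing on judgements.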
Our results show that writing matters. Writing experts judged the edited papers as
0.6 standard deviations (SD) better written overall (1.22 points on an 11-point scale). They
further judged the language-edited papers as allowing the reader to find the key message more
easily (0.58 SD), having fewer mistakes (0.67 SD), being easier to read (0.53 SD), and being
more concise (0.50 SD). These large improvements in writing quality translated into smaller but still substantial effects on economists’ evaluations. Economists evaluated the edited versions as being
0.2 SD better overall (0.4 points on an 11-point scale). They were also 8.4 percentage points
more likely to accept the paper for a conference, and were 4.1 percentage points more likely to
believe that the paper would get published in a good economics journal. Our heterogeneity
analysis shows that the effects of language editing on writing quality and perceived academic
quality are particularly large if the original versions were poorly written.
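As a back-of-the-envelope check (our own arithmetic from the figures above, not a number reported in the study), the ratios of raw points to standard deviations imply that one SD of the overall ratings, for both the writing experts and the economists, corresponds to roughly two points on the 11-point scale:

$$ \frac{1.22\ \text{points}}{0.6\ \text{SD}} \approx 2.0\ \text{points per SD}, \qquad \frac{0.4\ \text{points}}{0.2\ \text{SD}} = 2.0\ \text{points per SD}. $$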
The approach of manipulating text to estimate the effect of writing has been used in
several contexts, for example, for legal documents (Mindlin 2005) and financial reports (Tan,
Wang, and Yoo 2019). However, only one other paper has investigated how the writing quality
of academic papers affects the evaluations of scientists. Armstrong (1980) altered the writing
quality of the conclusion section in four management papers. In contrast to our findings, his
results suggest that improving the writing causes experts to evaluate papers less favorably. We
improve upon Armstrong’s approach by having a larger sample, as well as a more rigorous study
design and empirical analysis. For example, our study includes 30 papers (instead of four) and
the language editing was done by professional editors (instead of by Armstrong himself, who is a good writer but not an expert editor). By asking scientists to evaluate whole papers instead of a single section, we also bring the context of our study closer to how peer review is conducted in practice. Finally, since Armstrong conducted his study, attitudes towards writing have changed.
We have seen the birth of the plain language movement, and several countries now require
government and other agencies to write according to plain language principles. For example, the
United States passed the Plain Writing Act of 2010, requiring federal agencies to write in plain language (Office of the Federal Register 2010). And the European Union requires
companies to communicate clearly with customers about how they use their data (European
Union 2022). Our study therefore provides the best available answer to the question of whether the quality of academic writing matters today.