book reviews

Bias in Science and Communication. A Field Guide. By Matthew Welsh. IOP Publishing, 2018. Pp. 177. ISBN 978-0-7503-1312-4.

John R. Helliwell, Department of Chemistry, University of Manchester, Manchester M13 9PL, UK
*Correspondence e-mail: john.helliwell@manchester.ac.uk

I was reading my monthly Physics World magazine when an advertisement from IOP Publishing immediately attracted my interest. Why did it catch my interest? The book Bias in Science and Communication. A Field Guide is advertised to `introduce the concept of biases arising from cognitive limitations and tendencies with a focus on the implications of this for scientists in particular'. There is real concern about the irreproducibility of science, eloquently described most recently by Bishop (2019). As I have been an editor assessing research articles for some 25 years, acting as a gatekeeper to the publication of over 1000 article submissions, the study of irreproducibility is of more than passing interest to me. Deliberate fraud is extremely rare but not zero, even in crystal structure analyses. Mistakes can be avoided by careful scrutiny of articles along with their underpinning data, and by the checking of reports by editors and referees. But beyond these two categories, what about bias and how we communicate science?

This book does not stimulate happy memories. I recall the author who cited more than 20 of their own articles out of a reference list of 25, an obvious case of bias borne out by a detailed look at those references. After rejection of the article in question, and of two appeals, the hate e-mails from the disgruntled author came my way, as I was the Editor-in-Chief dealing with the final appeal. So, bias can be spotted, and dealing with it can be unpleasant. Another situation I recalled was a science advisory committee meeting where one role was to advise the management team on how to deal with a funding agency panel who would come `baring teeth'. Is that bias? How to communicate the facts was our main antidote to the panel's teeth, but was our committee's advice also a type of bias? The biggest example of bias in science is of course the lack of gender balance and the underrepresentation of ethnic minorities in academic faculties; that this exists at all is widely attributed to `unconscious bias'. This book, then, covers all these issues in depth, and much more besides. According to the author's web site, he is `a psychological scientist working in the Australian School of Petroleum's Centre for Improved Business Performance'.

The book's contents are as follows:

1. Pop quiz: a battery of decision-making questions for self-assessment and reflection

2. Anchors aweigh: an introduction to the book, decision making and decision biases

3. On message: reasons for and types of communication

4. Improbable interpretations: misunderstanding statistics and probability

5. Truth seeking? Biases in search strategies

6. Same but different: unexpected effects of format changes

7. I'm confident, you're biased: accuracy and calibration of predictions

8. Sub-total recall: nature of memory processes, their limitations and resultant biases

9. Angels and demons: biases from categorisation and fluency

10. Us and them: scientists versus lay-people and individual differences in decision bias

11. Warp and weft: publication bias example to weave it all together

12. Felicitous elicitation: reducing biases through better elicitation processes

13. A river in Egypt: denial, scepticism and debunking false beliefs

14. The field guide: general conclusion and spotters guide to biases

The book has rather a lot of typographical errors, as if the proofs had not been read. That said, the typos were easily spotted and did not cause me confusion. The chapters have detailed subheadings and sub-subheadings. These are useful, but I still think the book needs a subject index. The whole book is engaging and readable. Each chapter has a helpful and clear conclusions section.

There are themes running through the book, notably how statistics in the presence of uncertainty should be presented for optimal clarity, and the pitfalls to be avoided. Also, how people's memories work, and do not work, is described in detail. Thirdly, on the subject of how to make a decision, two methods are contrasted: intuitive thinking versus logical reasoning. The book's title declares it to be a field guide; this aspect is to do with `identifying biases in the wild', and the final chapter (The field guide: general conclusion and spotters guide to biases) is an extensive tabulation and excellent summary of the diverse information presented in the book.

I did take exception to one specific, strongly made, point in Chapter 3, that `…people receiving communications from scientists can interpret the additional information and clarity of expression required in precise scientific communication as undermining the strength of any statement'. As counter-examples I would mention that a member of the public who is lost and asking for directions does want precise, clear information, and that, on hearing a politician's evasiveness in response to a question, people in fact prefer clarity. So, I would disagree with that piece of advice. The context of one's discourse with a member of the public would, of course, be all important.

An issue highlighted by both Bishop (2019) and this book is the phenomenon of HARKing, namely making a hypothesis after the research outcome is known. This arises from the pressure to get published: scientists do their best to write up a rational account with as stunning a conclusion as possible, but one which may have little to do with the starting hypothesis. The suggested antidote is to require a research proposal to be judged by a journal in advance; the journal would then state whether, at the end of the research, it could accept an article submission or not. This would obviously be a major change in the research process. Some mention of the fact that unexpected discoveries do happen would have added balance to the description of this topic.

In keeping with its title and mission, the book seeks to root out bias and false communication, but in so doing it overlooks all the positives of science and its discoveries and applications. Max Perutz (1991) asked Is science necessary? and answered, with specific examples, a very strong yes: science is necessary, be it science and food production, science and health, or science and energy. One can worry over issues of truth, certainty and bias, but science delivers, as is obvious from our computers, smartphones and medicines, to name a few examples from daily life.

Overall, though, I did like the book. I learnt a lot about the various pitfalls in science communication in particular. I especially liked the description of trained `elicitors of science who elicit the facts from scientists', i.e. from scientists who may be biased in some way, unconsciously or consciously, and who may stray into advocacy.

References

Bishop, D. (2019). Nature, 568, 435.
Perutz, M. (1991). Is Science Necessary? Oxford University Press.

